Topics in the world of web development and other technologies we find interesting.

Blue Valley Technologies - TechTalk

We want to give back! We have learned much of what we use every day from others who took the time to post that information to the web. Here we will post things that we hope will, in turn, be useful to others as we explore technologies in web development and any other topics we find interesting.
You are free to use any of the code snippets or techniques posted here for any purpose, personal or commercial. However, where we credit another individual or group, please visit their website for information about their usage terms.

"Error [60] SSL Certificate problem: unable to get local issuer certificate)." when running PHP under Windows IIS


We ran into this problem recently while setting up a CloudFront CDN for a WordPress site using the W3 Total Cache plugin.

The server was running Windows IIS and PHP 7.2, although the solution is the same for other versions of PHP as well.

Once the CloudFront distribution was set up and configured in the W3TC plugin, clicking the "Test CloudFront distribution" button resulted in the following error message:

 "Error: Unable to list distributions (S3::listDistributions(): [60] SSL certificate problem: unable to get local issuer certificate)."

Why the Problem Happens

This error occurs because the plugin uses the PHP cURL extension to make an HTTPS (SSL) request to the Amazon CDN. On Windows, PHP's cURL extension ships without a configured certificate authority (CA) bundle, so it cannot verify the server's certificate and throws the error.

The Solution

The solution is to configure the PHP cURL extension with a proper certificate authority bundle so that it can verify the SSL certificate.

You will need to verify that you have a proper certificate bundle in your PHP folder. We used the cacert.pem bundle of Mozilla's trusted root certificates, which the curl project distributes.

Place the file in your PHP folder located here: C:\Program Files\PHP\v7.2\extras\ssl\cacert.pem 

The PHP version / path may be different for your setup. Older versions are sometimes located in C:\Program Files (x86)\PHP\v5.6\ for example.

Once you have verified the CA bundle is there, you will need to edit your php.ini to tell cURL where to find it.

Edit your php.ini (ours was located at C:\Program Files\PHP\v7.2\php.ini) and use Ctrl+F to search for the following line:

;curl.cainfo =

Change this line to contain the path to your CA bundle, like so:

curl.cainfo = "C:\Program Files\PHP\v7.2\extras\ssl\cacert.pem"

Don't forget to un-comment the line (remove the ';' at the beginning) and make sure you use quotes around the path if it contains spaces.
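While you are in php.ini, the related openssl.cafile directive does the same job for HTTPS requests made through PHP's OpenSSL stream wrappers (file_get_contents, fopen) rather than cURL. Setting both to the same bundle is a common companion change; a sketch, assuming the same path as above:

```ini
; cURL requests
curl.cainfo = "C:\Program Files\PHP\v7.2\extras\ssl\cacert.pem"
; non-cURL HTTPS streams (file_get_contents, fopen wrappers)
openssl.cafile = "C:\Program Files\PHP\v7.2\extras\ssl\cacert.pem"
```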

Save the changes to php.ini and retest. Depending on your configuration, you may also need to restart PHP or reset IIS for the changes to take effect.


Posted at 10:39

WordPress Pretty Permalink Settings Results in 404 Error on Windows IIS

If you enable any of the alternate permalink settings in WordPress (located under Settings -> Permalinks), you may run into the common issue of an ugly "404 - File or directory not found" error when trying to view a post. Don't worry: Windows IIS does support this ability, it just needs one extra step.

The Solution

According to the WordPress documentation, if you are running IIS 7+, you will need to also have the URL Rewrite 1.1+ module installed as well. This is a pretty standard module and is often already installed. 

What they don't mention, however, is that you will also need to enable this functionality in your web.config file. Look for this file in the root folder of your WordPress site; if it does not yet exist, just create it (you can use Notepad, but make sure to save the file as web.config, not with the typical .txt extension).

You will need to add the following content:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer><rewrite><rules>
    <rule name="wordpress" patternSyntax="Wildcard">
      <match url="*"/>
      <conditions>
        <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true"/>
        <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true"/>
      </conditions>
      <action type="Rewrite" url="index.php"/>
    </rule>
  </rules></rewrite></system.webServer>
</configuration>

If you already had a web.config file, just add the elements that are missing, without deleting any of the existing settings.

Posted by Mark at 16:19

How to Back Up a Webserver

It is very important that any production webserver have a robust backup plan. This is generally accomplished by combining a few strategies, each protecting your server from a different type of problem.

File Backups

A good backup plan should first provide the ability to save backups of all the important files on the server to an offsite location. This allows you to recover files accidentally deleted or modified by users, or lost to file corruption, malware, and so on. We will share some scripts we have used in another post to roll your own scheduled file backups to S3 storage. Files should be backed up at least daily, and for many sites more frequent copies may be desired.

Database Backups

Most modern website applications rely on databases as well, so you will need to ensure that those are also backed up. The database and the files should be backed up as closely together as possible, to avoid consistency issues between them. We will share some scripts we have used in another post to roll your own scheduled database backups for both MySQL and MS-SQL. If you are having trouble deciding how often is often enough, consider how much data loss would be acceptable to your business. If your website accepts e-commerce orders and you lost all orders from the last 24 hours, would you have the information you need somewhere else, such as order confirmations in your email?
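As a taste of what such a script looks like, here is a minimal sketch of the dated-filename and retention scheme for a nightly dump on a Linux host (a Windows server would use an equivalent scheduled task). The database name, backup folder, and 14-day retention window are placeholder values, and the actual mysqldump call is left as a comment so the sketch runs anywhere:

```shell
# Nightly database backup sketch: dated filename plus simple retention.
BACKUP_DIR=./db-backups          # placeholder backup folder
mkdir -p "$BACKUP_DIR"
DUMPFILE="$BACKUP_DIR/wordpress-$(date +%Y-%m-%d).sql.gz"
# In production the next line would be something like:
#   mysqldump --single-transaction wordpress | gzip > "$DUMPFILE"
: | gzip > "$DUMPFILE"           # stand-in so the sketch is runnable as-is
# Prune dumps older than 14 days so the folder does not grow without bound.
find "$BACKUP_DIR" -name 'wordpress-*.sql.gz' -mtime +14 -delete
echo "wrote $DUMPFILE"
```

Pair this with an offsite copy (for example, uploading the dump file to S3) so a backup survives the loss of the server itself.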

Disaster Recovery Backups

The idea behind a disaster recovery backup is that there may come a time when you need to restore everything, including the server and its operating system. This can be necessary if the server is no longer functioning properly, or if it has been compromised by hackers and can no longer be trusted. Doing this manually can take hours, or sometimes even days, of installing software, configuring services, and finally restoring the data itself. With a disaster recovery backup, you can restore to a known working point, often in minutes, and then restore the files and data to the most recent point.

A disaster recovery backup does not need to be updated nearly as often, so long as the important file and database data is being backed up regularly. We typically update our disaster recovery backups every couple of months, or whenever we have made significant configuration changes to the server. If you use Amazon Web Services (AWS), Rackspace cloud hosting, or another cloud server offering, the disaster recovery backup is easily accomplished by imaging or snapshotting the server instance and its attached storage. If you are still using on-premise hardware for your webservers, you will need third-party software for this backup.

A Good Backup Plan Needs Testing

If your business could suffer major losses when your website goes down, a backup plan is not a good plan until it has been tested. Far too many system admins have learned the hard way, and too late, that a backup which reported it completed successfully was still missing crucial files, or just didn't restore as planned. If your site is mission critical, then it is worth the time to occasionally stage a full server recovery on test hardware. Not only does this ensure that your backups will work when you need them, it also teaches you how to restore them under pressure when it really counts.


Posted by Mark at 12:10

IIS 500 Errors when loading a static image in WordPress

I ran into this problem today, and it was a frustrating one to track down. The WordPress site was a very basic blog running on IIS with the friendly permalink settings enabled.

When loading a static image that had been uploaded in WordPress the browser would return a 500 error, and the IIS log would show: "GET /wp-content/uploads/2013/10/Sale.jpg - 80 - Mozilla/5.0+(Windows+NT+6.2;+WOW64;+rv:24.0)+Gecko/20100101+Firefox/24.0 500 50 5 46"

I finally found the solution on another blog, posted by iis_isz. I have copied it here so I can find it again in the future, and for anyone else who has the same issue!


Explanation of the Error

The image issue was a permissions problem, but simply setting permissions manually on the original image file or its parent folder is not enough. WordPress first writes the uploaded file, as the IUSR account, to a temporary directory defined in php.ini. That temp folder does not grant the IIS_IUSRS group access, and when Windows moves the file from the temp folder to its final home in the application's uploads folder, the file keeps the temp folder's restrictive permissions instead of inheriting them from its new parent folder, so IIS cannot serve it.

To fix this, there are two options:

1. Change the permissions on the temp folder giving IIS_IUSRS write/modify.

2. Change the path of the temp folder in the PHP.ini file to a folder that does have IIS_IUSRS write/modify permission.

Here is a good source detailing the problem:

I chose to point the temp folder in my php.ini to C:\inetpub\temp\uploads and grant that folder the permissions.
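For reference, the php.ini directive involved is upload_tmp_dir; a sketch matching the path above:

```ini
; php.ini - move PHP's upload temp folder somewhere IIS_IUSRS can be
; granted write/modify access
upload_tmp_dir = "C:\inetpub\temp\uploads"
```

After changing it, grant IIS_IUSRS write/modify on that folder (via the folder's Security tab or the icacls command-line tool) and restart IIS so PHP picks up the change.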

After uploading an image in wp-admin, I was able to access the image (the original, not resized) from a browser without the 500.50 error.



Posted by Mark at 12:26

Creating a Rackspace Ubuntu Cloud Server

Rackspace Cloud servers are ultra-reliable and robust cloud-based servers that can be configured with a variety of base OS images. This walkthrough will take you step by step through creating and setting up a base install of the popular Ubuntu 10.04 LTS (Lucid Lynx). Later walkthroughs will pick up where this one leaves off, setting up common webserver platforms and frameworks.

We begin by logging into our Rackspace Cloud Server control panel.

  1. Add new server instance.
  2. Select Linux tab, then Ubuntu 10.04 LTS (Lucid).
  3. Type a unique name to identify the new server.
  4. Select the desired hardware. I started with the least expensive; it is easy to scale up later if needed.
  5. The Rackspace panel will give you a pre-generated password. Save this for later reference.
  6. While the server is building you will want to install an SSH Client. If you are on Windows we recommend Bitvise Tunnelier (free for individual use).
  7. Once the server build is complete, you will get an email with the public IP address and the root/admin password for your new Linux server.
  8. Open your SSH Client and connect to the new server with the public IP address as the Host, Default Port 22, root for the username, and the auto-generated password from earlier for the password.  On successful connection you should get a console command window.
  9. The first thing we should do is change the root password. Since the auto-generated one was sent unencrypted via email, it can no longer be considered safe. At the console prompt, type the command passwd and hit enter. You will be prompted for the new password twice. Be sure to use a unique password that is fairly long and contains a mix of letters, numbers, and at least one punctuation character. Save the new password in a secure location.

At this point you will have a working base install of Ubuntu 10.04 LTS (Lucid) running on a Rackspace Cloud Server. The next walkthrough will take you step by step through installing the LAMP framework stack.

Posted by Mark at 13:27

