How to Use HTTPS Source URLs With Google’s Feedburner
With the recent announcement from Google regarding the use of HTTPS/SSL as a ranking signal, many webmasters – including me – bought an SSL certificate and moved their websites to HTTPS.
But migrating to HTTPS can be quite complicated, as you will come across several problems along the way. One of the issues I had was with Google’s Feedburner. After migrating to HTTPS, I assumed that Feedburner would automatically be redirected from HTTP to HTTPS and that my RSS feed would be fine. However, that was not the case. Once I noticed that my feed was no longer updating, I tried to set the feed source URL to HTTPS:
Unfortunately, Feedburner rejected it with the error message “Received HTTP error code 400 while fetching source feed.”:
It seems that while Google pushes webmasters toward HTTPS, Feedburner doesn’t support it yet. From a technical perspective, the easy way out would be to switch to another feed management service such as FeedBlitz or FeedPress. However, if you don’t want to lose your feed subscribers, here’s an easy way to work around Feedburner’s limitation.
Workaround
The first thing you need to do is set up a subdomain using your hosting control panel:
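If you manage the server yourself rather than through a control panel, the same thing can be done with a plain virtual host definition. Here is a minimal sketch assuming Apache and a hypothetical feed.example.com subdomain; adjust the names and paths to your own setup:

# Hypothetical virtual host for the feed subdomain.
# Replace the ServerName and DocumentRoot with your own values.
<VirtualHost *:80>
    ServerName feed.example.com
    DocumentRoot /var/www/feed.example.com
</VirtualHost>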
Next, create a file named “index.php” with the following content and upload it to the root folder of your subdomain. Of course, you need to replace the RSS feed URL with your own:
<?php
// Fetch the HTTPS feed server-side and re-serve it over plain HTTP.
// The User-Agent header mimics Feedburner's own fetcher.
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "User-Agent: FeedBurner/1.0\r\n"
    )
);
$context = stream_context_create($opts);

// Replace this URL with your own HTTPS feed.
$feed = file_get_contents("https://webhostinghero.org/feed", false, $context);
echo $feed;
?>
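One optional refinement, which is my own suggestion rather than something Feedburner requires: the script above sends no Content-Type header, so some browsers and feed readers may treat the output as plain HTML. A variation that explicitly declares the response as a feed:

<?php
// Same proxy as above, plus an explicit Content-Type header so
// clients treat the output as an RSS/XML feed rather than HTML.
$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "User-Agent: FeedBurner/1.0\r\n"
    )
);
$context = stream_context_create($opts);
$feed = file_get_contents("https://webhostinghero.org/feed", false, $context);
header('Content-Type: application/rss+xml; charset=utf-8');
echo $feed;
?>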
Now verify that the RSS feed is being served over plain HTTP by opening your subdomain’s URL in your web browser:
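Alternatively, a quick check from the command line works just as well (feed.example.com stands in for your own subdomain):

curl -s http://feed.example.com/ | head -n 5

You should see your feed’s XML declaration and opening tags rather than an error page.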
Finally, change the feed source to your subdomain’s URL in your Feedburner settings:
While Feedburner refreshes its feeds every 30 minutes, you can also ping it manually. Since the ping tool will choke on your HTTPS website URL, enter your Feedburner feed URL instead:
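If you’d rather script the ping, here is a minimal PHP sketch using the standard weblogUpdates.ping XML-RPC call. The endpoint http://ping.feedburner.com and the exact parameters are assumptions based on Feedburner’s historical ping service, so verify them before relying on this; the site name and feed URL are placeholders:

<?php
// Minimal weblogUpdates.ping request, built by hand so no XML-RPC
// extension is needed. Replace the site name and URL with your own.
$request = '<?xml version="1.0"?>
<methodCall>
  <methodName>weblogUpdates.ping</methodName>
  <params>
    <param><value><string>My Blog</string></value></param>
    <param><value><string>http://feeds.feedburner.com/MyFeed</string></value></param>
  </params>
</methodCall>';

$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: text/xml\r\n",
        'content' => $request
    )
));

// Assumed endpoint: Feedburner's historical XML-RPC ping service.
echo file_get_contents('http://ping.feedburner.com', false, $context);
?>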
If you want to prevent search engines from crawling and indexing your HTTP feed, you can upload a file named “robots.txt” to the root of your subdomain with the following content:
User-agent: *
Disallow: /
If you’ve found a better way to do this, let me know in the comments.