The Google crawler looks for a robots.txt file when it accesses your site. My first thought was to modify my existing robots.txt to block crawling of the https version of the site. In researching how to do that, I learned:
- Google treats http and https as two separate sites
- You need a separate robots.txt file for each of them, as the example URLs below illustrate.
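For example, with a site at example.com (a placeholder domain), the crawler fetches two distinct files, one per protocol:

http://example.com/robots.txt
https://example.com/robots.txt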
I created a file called robots-ssl.txt to hold the directives for the https site. To block crawlers from fetching any URL on it, the file contains the following:

Contents of robots-ssl.txt:
User-agent: *
Disallow: /
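Crawlers only ever request /robots.txt, so the server must be configured to answer https requests for /robots.txt with the contents of robots-ssl.txt. One common way is a rewrite rule; the sketch below assumes Apache with mod_rewrite and an .htaccess file in the web root (the original setup's server isn't specified, so treat this as one possible configuration):

RewriteEngine On
# Serve robots-ssl.txt whenever /robots.txt is requested over https
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots-ssl.txt [L]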
Place the robots-ssl.txt file in the root folder of the website, alongside the rewrite rule described above. That's it, you're done!
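To confirm it works, fetch both versions (example.com again standing in for your domain) and check that only the https response contains the Disallow: / rule:

curl http://example.com/robots.txt
curl https://example.com/robots.txt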