URL Blocking Bots

I've now fixed this issue. It was indeed related to the robots.txt file, as I had initially thought. The reason the fix didn't work at first is that the old file was still cached, so bots were being blocked across the whole site.
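For anyone unsure whether their robots.txt is the culprit, one quick way to test a rule set locally is Python's built-in urllib.robotparser. The rules below are just an illustration of a file that blocks everything, not my actual file:

```python
from urllib import robotparser

# An illustrative robots.txt that blocks all crawlers from the entire site.
blocking_rules = """\
User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(blocking_rules.splitlines())

# Under these rules, every bot is denied every URL on the site.
print(parser.can_fetch("Googlebot", "https://example.com/"))          # False
print(parser.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```

If can_fetch returns False for the crawlers you care about, the robots.txt rules (or a stale cached copy of them) are what's blocking you.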

I added the following directives to my .htaccess file to force cached responses to be revalidated:

# Expire cached responses one second after access (A1)
ExpiresActive On
ExpiresDefault A1
# Tell caches they must revalidate stale responses
Header append Cache-Control "must-revalidate"

After doing this, and with my new robots.txt file in place, I was able to install the new SSL certificate on the website, which is now working! I hope this helps anyone having the same issue.
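For reference, a minimal robots.txt that allows all crawlers looks like this (an empty Disallow value means nothing is blocked; adjust the rules to suit your site):

```
User-agent: *
Disallow:
```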

Just remember to remove the code above afterwards, as it effectively disables caching for your site while it's in place.