Setting up Magento Staging Environment with Restricted Access

Our primary means of locking down (most) staging environments is HTTP Basic authentication. But we also have measures in place to keep them from being discovered by search engines should a link show up on a public website (this never happens, of course), and to keep them from being indexed by Google.

I've set up a rule in /etc/httpd/conf.d/robots.conf with the following alias:

Alias /robots.txt /<path_to_public_html>/robots.txt
<Location /robots.txt>
  Satisfy any
</Location>

The alias routes every request for /robots.txt to a single locked-down file. This means it doesn't matter what is in the robots.txt file in the Magento staging root; the server will always serve up the following rules:

User-agent: *
Disallow: /

The <Location> block with Satisfy any allows the robots.txt file to be served to anyone regardless of the Basic authentication: since we do not have any global Deny from rules in place, the host-based check passes for everyone and no password is required for that one file.

For the password authentication, I've set the rules up so that I can temporarily open up a single site by adding Satisfy any to that site's .htaccess file (we run multiple staging sites on the same dedicated internal staging server). It also allows adding Allow from rules alongside Satisfy any to drop the password requirement for specific IP addresses while keeping it in place for everyone else (if I really need to).
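
As a rough sketch of how those pieces fit together (the htpasswd path and the IP address below are placeholders, not our actual values):

# Per-site password protection (the htpasswd path is a placeholder)
AuthType Basic
AuthName "Staging"
AuthUserFile /etc/httpd/.htpasswd-staging
Require valid-user

# Because there is no "Deny from all" anywhere, dropping this single line
# into one site's .htaccess lifts the password prompt for everyone:
#
#   Satisfy any
#
# And this variant lifts it only for one known address, while everyone
# else still gets the Basic auth challenge:
#
#   Order allow,deny
#   Allow from 203.0.113.25
#   Satisfy any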

The reason I do not like IP-based whitelisting across the board (i.e. with no password-based authentication) is that clients' IP addresses are not always static. That would mean updating their IPs to restore access on a (potentially) daily or weekly basis, depending on how long their ISP's DHCP server holds the lease.

For DNS, we use a wildcard record so that DNS crawlers will not pick up all of the staging site hostnames that need public DNS. Google will actually pick a site up from DNS records alone; the wildcard prevents that, so the only way for a crawler to find a staging site is if someone leaves a link lying around somewhere. And forcing the robots file to serve a disallow rule will stop them from indexing it even if they do find a link.

Were I in the place of a merchant running a staging site for the company website, I would do things a bit differently and just block all traffic to the staging box unless it came from known IP addresses. Anyone working on the site remotely (in-house) who did not have a static IP I could whitelist would be required to connect to a company VPN to get access.
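
A minimal sketch of that kind of lockdown at the web-server level, assuming Apache and placeholder addresses for the office network and the VPN endpoint (the same rules could just as easily live in the firewall):

# Refuse anything that does not come from a known address -- unlike the
# staging setup above, there is no password fallback here
# (the office range and the VPN address are placeholders)
<Location />
  Order deny,allow
  Deny from all
  Allow from 198.51.100.0/24
  Allow from 203.0.113.5
</Location>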

Having public DNS is a must if you need to test things like payment processor integrations where the processor, unlike most gateways, must make callbacks to the server to complete the payment process.


A few suggestions - some are built-in!

- Developer IP restriction is built in under System Config > Developer. This doesn't actually restrict IP access, though; it only controls which client IPs the developer settings (template path hints and the like) apply to. Move along.

- IP restriction is tough and I prefer to handle this at the firewall, personally. iptables is also a candidate, as is an .htaccess restriction or a check on $_SERVER['REMOTE_ADDR'] in index.php (a rough sketch of the latter follows this list).

- Update the default per-page robots meta tag to NOINDEX, NOFOLLOW while in staging, under System Config > Design > HTML Head.

- In the same config area there is the ability to display a demo store notice.

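For the index.php approach, a minimal sketch might look like the following, dropped near the top of Magento's index.php before Mage::run() is called (the allowed addresses are placeholders):

// Hypothetical allow-list of trusted addresses (office, VPN, etc.)
$allowedIps = array('203.0.113.25', '198.51.100.14');

// Turn away any visitor not on the list before Magento even boots
if (!in_array($_SERVER['REMOTE_ADDR'], $allowedIps, true)) {
    header('HTTP/1.1 403 Forbidden');
    echo 'Access restricted.';
    exit;
}

Keep in mind that behind a load balancer or reverse proxy, REMOTE_ADDR will be the proxy's address rather than the visitor's, so in that case the check would have to look at the forwarded headers (or be done at the proxy itself).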


I have developed a module for enabling a maintenance mode which can be used to the same effect, blocking users from accessing the frontend (but not the admin, which can be limited with Magento's native IP blocking feature).

You can in fact allow some IPs to access the frontend even with maintenance mode enabled.

Maybe you can give it a try; I hope it helps. It's free and open source: https://github.com/aleron75/Webgriffe_Maintenance
