How do I allow Google to index login-required parts of my site?

This is frowned upon.

  1. Your ranking suffers if Google sees you cloaking (google "Experts Exchange Google").
  2. People can still read your content through the Google Cache.
  3. You have to take every other search engine into account too, since you'll be relying on user-agent sniffing.

What you'd do is sniff for the Googlebot/2.1 (+http://www.google.com/bot.html) and Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp) user agents (at least), and deliver altogether different content to those clients.
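
For completeness, the sniffing itself is just a substring check on the User-Agent header. A minimal sketch (the helper names are mine, and the header is trivially spoofed, which is one reason this approach is fragile):

```python
# Crude user-agent sniffing: serve full content to known crawlers,
# a login wall to everyone else. The tokens below appear in the real
# crawler user-agent strings, but the header is trivially spoofed.
CRAWLER_TOKENS = ("Googlebot", "Yahoo! Slurp")

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent header names a known crawler."""
    return any(token in user_agent for token in CRAWLER_TOKENS)

def choose_content(user_agent: str, full_page: str, login_wall: str) -> str:
    # Deliver the full article to crawlers and the login wall to humans;
    # this is exactly the cloaking that gets sites penalized.
    return full_page if is_crawler(user_agent) else login_wall
```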

Another option is to do what Experts Exchange does, and have all the information buried deep down on the page. That way you have your cake and eat it too: users don't completely hate you, your ranking doesn't suffer, and you still give a better experience to registered users.


Google supports the concept of flexible sampling, where you show users who arrive from Google search results only a limited amount of content while allowing Googlebot to crawl all of it.

Google allows you to either show users a limited number of articles (metering) or show a portion of each article (lead-in).

From a technical standpoint you need to:

  • Allow Googlebot to access all your content, either by user agent or by validating Googlebot's IP addresses (see the verification sketch after this list).
  • Mark up your pages with structured data that indicates they contain paywalled content (an example follows the sketch).
  • Implement showing only a limited amount of content to users who come to your site from Google (a metering sketch closes this answer).
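
For the first bullet, Google recommends verifying a claimed Googlebot with a reverse DNS lookup followed by a confirming forward lookup, rather than trusting the user agent alone. A minimal sketch using only the standard library:

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot IP the way Google documents it:
    reverse DNS, check the domain, then confirm with forward DNS."""
    try:
        # e.g. "crawl-66-249-66-1.googlebot.com"
        host, _, _ = socket.gethostbyaddr(ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # The forward lookup must round-trip to the same address.
        return socket.gethostbyname(host) == ip
    except (socket.herror, socket.gaierror):
        return False
```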

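For the structured-data bullet, Google's paywalled-content markup hinges on `isAccessibleForFree` plus a `hasPart` element whose `cssSelector` points at the gated section. A sketch that builds the JSON-LD in Python (the headline and the `.paywall` selector are placeholders for your own values):

```python
import json

# JSON-LD for a paywalled article, following schema.org / Google's
# paywalled-content markup. ".paywall" stands in for whatever CSS
# class wraps the gated part of your page.
article_markup = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "isAccessibleForFree": "False",
    "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": "False",
        "cssSelector": ".paywall",
    },
}

# Embed the output in the page head inside
# <script type="application/ld+json"> ... </script>
print(json.dumps(article_markup, indent=2))
```
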
This replaces Google's older "First Click Free" policy, under which you had to show a full page of content every time somebody clicked through to your site from a Google search result.
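
As for the metering flavor of flexible sampling, the core is just a per-visitor counter checked against a free quota. A sketch, assuming the count lives in a signed cookie or server-side session and using a made-up quota of three free articles per month:

```python
FREE_ARTICLES_PER_MONTH = 3  # quota is illustrative; pick your own

def render_article(articles_read: int, referrer: str,
                   full_page: str, lead_in: str) -> tuple[str, int]:
    """Return (page to serve, updated counter).

    `articles_read` would come from a signed cookie or session;
    `referrer` from the request's Referer header.
    """
    from_google = referrer.startswith("https://www.google.")
    if from_google and articles_read < FREE_ARTICLES_PER_MONTH:
        # Within the free quota: show the full article and count it.
        return full_page, articles_read + 1
    # Over quota (or direct traffic): show only the lead-in.
    return lead_in, articles_read
```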