Robots.txt for multiple domains
A robots.txt file can only point search engines to sitemaps for its own domain, so when a crawler reads that domain's robots.txt, that is the only sitemap it honors. If all three domains map to the same website and share one robots.txt, however, the search engines will still effectively find each sitemap.
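For example (a sketch only; the sitemap URLs are hypothetical), a single robots.txt shared by the three domains could list one Sitemap line per domain:

User-agent: *
Disallow:
# One Sitemap entry per domain (hypothetical URLs); a crawler picks up the one
# belonging to the host whose robots.txt it is reading
Sitemap: https://abc.com/sitemap.xml
Sitemap: https://abc.se/sitemap.xml
Sitemap: https://abc.de/sitemap.xml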
I'm using the following solution in .htaccess, after all domain redirects and the www to non-www redirection; a minimal placement sketch follows the rule below.
# Rewrite URL for robots.txt
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
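For placement, a minimal sketch of the .htaccess (assuming mod_rewrite is enabled; the redirect block is only indicated by a comment):

# Minimal placement sketch
RewriteEngine On
# ... domain redirects and www to non-www redirection go here ...
# Rewrite URL for robots.txt
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]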
Create a new directory in your root called robots. Then, for every domain, create a text file in it holding that domain's specific robots directives (an example file is sketched after this list):
- /robots/abc.com.txt
- /robots/abc.se.txt
- /robots/abc.de.txt
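As an illustration (the Disallow path and the sitemap URL are hypothetical), /robots/abc.se.txt might contain:

# Served whenever the site is requested as abc.se
User-agent: *
Disallow: /admin/
# Hypothetical sitemap URL for this domain
Sitemap: https://abc.se/sitemap.xml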
Based on Hans2103's answer, I wrote this version, which should be safe to include in just about every web project:
# URL rewrite solution for robots.txt for multiple domains on a single docroot
# NB: Apache only allows comments on their own lines, never after a directive
# not an existing dir
RewriteCond %{REQUEST_FILENAME} !-d
# not an existing file
RewriteCond %{REQUEST_FILENAME} !-f
# and the domain-specific robots file exists (assumes robots/ sits in the docroot)
RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
These rewrite conditions simply serve the normal robots.txt if one is present, and otherwise look in the robots/ directory for the domain-specific file robots/<domain.tld>.txt.
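As a further sketch (assuming mod_rewrite may not be present on every server the block is copied to), it can be wrapped in an IfModule check so it is skipped instead of triggering a 500 error when the module is missing:

<IfModule mod_rewrite.c>
    RewriteEngine On
    # not an existing dir or file
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME} !-f
    # and the domain-specific robots file exists in the docroot
    RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
    RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
</IfModule>

Without the wrapper, Apache returns a 500 error for every request when .htaccess contains Rewrite directives but mod_rewrite is not loaded.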