Sitemap for multiple domains of the same site

You should use server-side logic to serve the correct sitemap for requests to /sitemap.xml, based on the domain name of the request.
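For example, with Apache's mod_rewrite (the same technique used for /robots.txt below) this could be a minimal sketch; the sitemaps/<hostname>.xml naming under the docroot is an assumption, not something your setup necessarily uses:

# Hypothetical sketch: serve a per-domain sitemap for /sitemap.xml requests
# (assumes sitemaps are stored as sitemaps/<hostname>.xml under the docroot)
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/sitemaps/%{HTTP_HOST}.xml -f
RewriteRule ^sitemap\.xml$ sitemaps/%{HTTP_HOST}.xml [L]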


Apache rewrite rules for /robots.txt requests

If you're using Apache as a webserver, you can create a directory called robots and put a robots.txt file in it for each website you run on that VHOST, then use rewrite rules in your .htaccess file like this:

# URL rewrite solution for robots.txt for multiple domains on a single docroot
RewriteEngine On
# not an existing directory
RewriteCond %{REQUEST_FILENAME} !-d
# not an existing file
RewriteCond %{REQUEST_FILENAME} !-f
# and the domain-specific robots file exists
RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
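With these rules in place, the document root would look something like this (the hostnames are examples, and the sitemaps directory is an assumption based on the sitemap URLs used later in this answer):

docroot/
    .htaccess
    robots/
        yourdomain1.tld.txt
        yourdomain2.tld.txt
    sitemaps/
        yourdomain1.tld.xml
        yourdomain2.tld.xml

A request for https://yourdomain1.tld/robots.txt is then internally rewritten to robots/yourdomain1.tld.txt.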

Nginx mapping for /robots.txt requests

When using Nginx as a webserver (taking yourdomain1.tld and yourdomain2.tld as example domains), you can achieve the same goal as above with the following map block (place it in the http context, outside your server block):

map $host $robots_file {
    default /robots/default.txt;
    yourdomain1.tld /robots/yourdomain1.tld.txt;
    yourdomain2.tld /robots/yourdomain2.tld.txt;
}

This variable can then be used in a try_files statement inside your server block:

location = /robots.txt {
    try_files $robots_file =404;
}
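Put together, a minimal placement sketch looks like this (the listen port, server_name values, and root path are placeholders for your own setup):

# Minimal sketch: the map lives in the http context,
# the location inside the server block.
events {}

http {
    map $host $robots_file {
        default         /robots/default.txt;
        yourdomain1.tld /robots/yourdomain1.tld.txt;
        yourdomain2.tld /robots/yourdomain2.tld.txt;
    }

    server {
        listen      80;
        server_name yourdomain1.tld yourdomain2.tld;
        root        /var/www/docroot;  # placeholder document root

        location = /robots.txt {
            try_files $robots_file =404;
        }
    }
}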

Content of /robots/*.txt

After setting up the aliases to the domain-specific robots.txt files, add a Sitemap directive to each robots file (e.g. /robots/yourdomain1.tld.txt) using this syntax at the bottom of the file:

# Sitemap for this specific domain
Sitemap: https://yourdomain1.tld/sitemaps/yourdomain1.tld.xml
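A complete per-domain robots file might then look something like this (the User-agent and Disallow lines are placeholders; only the Sitemap line is needed for this setup):

# /robots/yourdomain1.tld.txt
User-agent: *
Disallow:

# Sitemap for this specific domain
Sitemap: https://yourdomain1.tld/sitemaps/yourdomain1.tld.xml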

Do this for all domains you have, and you'll be set!