How can I serve robots.txt on an SPA using React with Firebase hosting?
In my /public directory, I created a robots.txt file.
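For reference, the file itself can be as simple as the following (an illustrative allow-everything policy, not the actual contents of my file):

# public/robots.txt - deployed as-is and served at /robots.txt
User-agent: *
Allow: /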
In my /src directory, I did the following.

I created /src/index.js:
import React from 'react'
import ReactDOM from 'react-dom'
import {TopApp} from './TopApp'
import registerServiceWorker from './registerServiceWorker'
import {BrowserRouter} from 'react-router-dom'
ReactDOM.render(
  <BrowserRouter>
    <TopApp/>
  </BrowserRouter>,
  document.getElementById('react-render-root')
)
registerServiceWorker()
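Note that the id passed to document.getElementById must match an element in the deployed HTML template. A minimal sketch of the relevant part of public/index.html (only the id react-render-root is taken from the code above; the rest is an assumed Create React App style template):

<!-- public/index.html: the router-driven app mounts into this element -->
<body>
  <noscript>You need to enable JavaScript to run this app.</noscript>
  <div id="react-render-root"></div>
</body>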
Then I created /src/TopApp.js:
import React from 'react'
import {
  Switch,
  Route
} from 'react-router-dom'
import {ComingSoon} from './ComingSoon'
import {App} from './App'
export class TopApp extends React.Component {
  render() {
    return (
      <div className="TopApp">
        <Switch>
          <Route path='/MyStuff' component={App}/>
          <Route exact path='/' component={ComingSoon}/>
        </Switch>
      </div>
    )
  }
}
Because the path /robots.txt is not covered by any of the router paths above, Firebase served the file straight from my public directory, and robots.txt was published as desired.
The same could be done for sitemap.xml.
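For example, a minimal /public/sitemap.xml could look like this (the URL and date are placeholders, not taken from the original project):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>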
Just add the following rules to the "rewrites" section in firebase.json:
"rewrites": [
{
"source": "/robots.txt",
"destination": "/robots.txt"
},
{
"source": "**",
"destination": "/index.html"
}
]
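For context, the surrounding firebase.json might look like the following (a sketch only: the "public": "build" value and the "ignore" list assume a default Create React App / firebase init setup and may differ in your project):

{
  "hosting": {
    "public": "build",
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**"],
    "rewrites": [
      {
        "source": "/robots.txt",
        "destination": "/robots.txt"
      },
      {
        "source": "**",
        "destination": "/index.html"
      }
    ]
  }
}

The rules are matched in order, so the explicit /robots.txt entry comes before the "**" catch-all that sends every other path to index.html for React Router to handle.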