This should fix the issues we have with search engines.
Since we have search for 4.0+ and it will keep working going forward, I'd say this is the only way.
Please chime in.
To test it locally, set HUGO_ENV=production. The Netlify preview won't show this, but the result should be the following:
# www.robotstxt.org
# Allow crawling of all content
User-agent: *
Sitemap: http://localhost:9001//sitemap.xml
Disallow:
Disallow: /1.0.0/
Disallow: /1.1.0/
Disallow: /1.1.1/
Disallow: /1.2.0/
Disallow: /1.3.0/
Disallow: /1.4.0/
Disallow: /2.0.0/
Disallow: /2.0.1/
Disallow: /2.0.2/
Disallow: /2.0.3/
Disallow: /2.0.4/
Disallow: /2.1.0/
Disallow: /2.1.1/
Disallow: /2.2.0/
Disallow: /2.2.1/
Disallow: /2.2.2/
Disallow: /2.3.0/
Disallow: /2.3.1/
Disallow: /2.3.2/
Disallow: /docs/3.3/
Disallow: /docs/3.4/
Disallow: /docs/4.0/
Disallow: /docs/4.1/
Disallow: /docs/4.2/
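
For reference, a Hugo template along these lines could generate that output. This is only a rough sketch, not necessarily what this PR ships: the site param name docs_versions and the exact list of disallowed paths are assumptions, and it relies on enableRobotsTXT being turned on in the Hugo config.

{{/* layouts/robots.txt; assumes enableRobotsTXT = true in the site config */}}
# www.robotstxt.org
# Allow crawling of all content
User-agent: *
Sitemap: {{ "sitemap.xml" | absURL }}
Disallow:
{{- if eq (getenv "HUGO_ENV") "production" }}
{{- /* hypothetical site param listing the old versioned doc paths to hide from crawlers */}}
{{- range .Site.Params.docs_versions }}
Disallow: /{{ . }}/
{{- end }}
{{- end }}

With something like that in place, running the docs build locally with HUGO_ENV=production set (e.g. HUGO_ENV=production hugo, or whatever this repo's docs build command is) should produce output along the lines of the listing above.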