How to add a sitemap to a Sphinx project?
A sitemap is an essential part of making your website more visible to search engines. It is usually a file named sitemap.xml that lists the URLs of all website pages, translations of pages into alternative languages, etc. The sphinx-sitemap extension can easily generate a sitemap for your Sphinx documentation project.
As an example, have a look at this website's sitemap.xml.
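For illustration, a sitemap follows the Sitemaps XML protocol; a file generated by sphinx-sitemap has this general shape (the URLs below are made up for the example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per documentation page (example URLs) -->
  <url>
    <loc>https://techwriter.documatt.com/index.html</loc>
  </url>
  <url>
    <loc>https://techwriter.documatt.com/about.html</loc>
  </url>
</urlset>
```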
Add and configure sphinx-sitemap
All the hard work is done by the excellent sphinx-sitemap extension. Install it first:

pip install sphinx-sitemap

Add the extensions option to conf.py, or append 'sphinx_sitemap' to it if it already exists:

extensions = [
    # ...
    'sphinx_sitemap',
]
If you haven't set it already, set html_baseurl to your documentation's public URL, e.g.:

html_baseurl = 'https://techwriter.documatt.com'
Build the docs! The output directory will contain an automatically generated sitemap.xml file.
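To sanity-check the result, you can parse the generated sitemap with Python's standard library. This is only a sketch: it parses an inline string of the shape sphinx-sitemap produces (with made-up URLs); in practice you would read sitemap.xml from your build output directory instead.

```python
import xml.etree.ElementTree as ET

# A sitemap of the general shape sphinx-sitemap generates (example URLs).
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://techwriter.documatt.com/index.html</loc></url>
  <url><loc>https://techwriter.documatt.com/blog.html</loc></url>
</urlset>"""

# Collect every <loc> URL, honouring the sitemap XML namespace.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```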
robots.txt is somewhat similar to a sitemap. They both talk to search engine crawlers. A sitemap is a list of pages to index, while robots.txt is used to exclude (not index) some pages. robots.txt is expected at the root of your website, e.g., https://techwriter.documatt.com/robots.txt.
One way to "announce" the sitemap to search engines is to reference it in robots.txt:
Create or update robots.txt in the project root (the folder with conf.py). If you don't have pages to exclude, it may look like this:

User-agent: *
Sitemap: https://techwriter.documatt.com/sitemap.xml
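If, on the other hand, you do want to hide some pages from crawlers, robots.txt combines Disallow rules with the Sitemap line. A hypothetical example excluding a drafts/ folder:

```text
User-agent: *
Disallow: /drafts/
Sitemap: https://techwriter.documatt.com/sitemap.xml
```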
Add the html_extra_path = ["robots.txt"] option to conf.py, or append "robots.txt" if this option already exists:

html_extra_path = ["robots.txt"]

html_extra_path is a list of paths to be copied to the root of the built documentation.
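Putting it all together, the sitemap-related lines in conf.py from the steps above might read as follows (the baseurl is this site's; yours will differ):

```python
# conf.py — sitemap-related settings (a sketch combining the steps above)
extensions = [
    # ...other extensions...
    'sphinx_sitemap',
]

# Public URL of the deployed documentation
html_baseurl = 'https://techwriter.documatt.com'

# Copy robots.txt (kept next to conf.py) into the output root
html_extra_path = ['robots.txt']
```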