Many robots.txt files contain errors that are easy for a human reader to miss.
Run your website’s robots.txt file through the tool below to check for them.
Like all good robot checkers, this tool validates /robots.txt files according to the Robots Exclusion Protocol, the long-standing de-facto standard that was formalized as RFC 9309 in 2022.
The robots.txt validator checks the syntax and structure of the document,
spots common typos, and can test whether a specific crawler is permitted to access a given URL.
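You can reproduce that last check yourself: Python’s standard-library `urllib.robotparser` applies the same allow/disallow matching that crawlers use. A minimal sketch, using a hypothetical rule set and example.com URLs for illustration:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules: block /private/ for every crawler.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ask whether a specific crawler may fetch a given URL.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```

In practice you would point `RobotFileParser` at your live file with `set_url(...)` and `read()` instead of parsing an inline string.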
If no ERROR is listed for your robots.txt file, you’re good to go! Otherwise, the tool outlines each ERROR you need to fix.