
Crawl problem

Comments

  • rbairwell

    I assume you are using a third-party website to create the sitemap.xml file instead of an "on server" solution (such as a WordPress plugin, if you are using WordPress).

    There are many things that could cause this issue with a remote/third-party service. On their side: a misconfiguration, a server problem, your site being on a denylist, or overload. In between: connectivity problems between their servers and yours. On your end: a robots.txt file blocking crawlers, noindex or similar meta tags on pages, the admin/server having blocked the crawler for abuse (it may make a large number of requests per second, which the server may interpret as an attack), a mistyped website address, or web pages returning an error code.

    Does it give any more information than "Crawl error" (such as a three-digit HTTP status code or a more detailed message)? Do you see any accesses in your website's access log or error log?
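    One of the causes above, a robots.txt file blocking the crawler, is easy to check yourself with Python's standard library. This is a minimal sketch: the robots.txt content and the user-agent name "SitemapBot" are invented for illustration; substitute your site's actual robots.txt and the agent string the sitemap service really sends.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a sitemap crawler site-wide.
# "SitemapBot" is an invented user-agent name for this example.
robots_txt = """\
User-agent: SitemapBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The sitemap crawler is blocked everywhere; a generic crawler is
# only blocked under /private/.
print(parser.can_fetch("SitemapBot", "https://example.com/"))            # False
print(parser.can_fetch("OtherBot", "https://example.com/"))              # True
print(parser.can_fetch("OtherBot", "https://example.com/private/page"))  # False
```

    If the service's user agent comes back False for the pages in your sitemap, that alone would explain a crawl error.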

