Network unreachable: robots.txt unreachable and Network unreachable: robots.txt unreachable
Hi there,
I've started getting the errors in the subject line in Google Search Console for my sitemap.xml file.
The first error contains the following information:
We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.
The second error contains no further information. However, both have the following in the description of the error:
We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.
This only started happening towards the end of last week. Prior to that, no issues were experienced at all. I've checked with my domain name provider and everything looks fine from their end, and I've even logged a support request on the Google Webmasters forums, but as yet haven't received any useful info.
Any assistance that can be provided on here would be greatly appreciated.
It looks like you have a few pages blocked in your robots.txt file. You'll want to check Settings > SEO and make sure "hide on search engines" is turned off. Also, under Pages, click on the page on the left, open its SEO settings, and make sure the "hide from search engines" checkbox at the bottom isn't checked.
Disallow: /under-11s.html
Disallow: /juniorpresday.html
Disallow: /under-10s.html
Disallow: /under-9s.html
Disallow: /under-8s.html
Disallow: /under-17s.html
Disallow: /under-7s.html
Disallow: /under-16s.html
Disallow: /under-6s.html
Disallow: /under-15s.html
Disallow: /under-14s.html
Disallow: /under-13s.html
Disallow: /under-12s.html
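If you want to double-check what the live robots.txt actually permits, Python's built-in robotparser can evaluate it the same way a crawler would. Here's a minimal sketch, with example.com standing in for the real domain and paths taken from the Disallow list above:

```python
# Check which URLs the live robots.txt allows for Googlebot.
# "example.com" is a placeholder -- substitute the real domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()  # download and parse the live robots.txt

# /sitemap.xml plus one of the paths from the Disallow list above
for path in ("/sitemap.xml", "/under-11s.html"):
    url = "https://example.com" + path
    print(url, "allowed for Googlebot:", rp.can_fetch("Googlebot", url))
```

If the sitemap itself shows as allowed, the Disallow rules aren't what's blocking the crawl.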
SEO definitely does not have "hide in search engines" ticked, so that's all good.
As for the pages that are hidden, these are purposefully hidden as they are not yet ready for consumption by visitors to the site. These have also been blocked in the robots.txt file for a number of weeks without any issue. This particular problem only appeared late last week.
I don't see any reason why Google would have trouble accessing either file; as you mentioned, your DNS is fine, and I can view both files myself without any trouble. It's possible it was just some kind of temporary issue that prevented Google from viewing them, and it hasn't tried again since.
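If you want to rule out basic reachability yourself, you can simply request both files and look at the HTTP status codes. A rough sketch, with example.com standing in for your domain:

```python
# Quick reachability check for robots.txt and sitemap.xml.
# "example.com" is a placeholder -- substitute the real domain.
import urllib.request

for path in ("/robots.txt", "/sitemap.xml"):
    url = "https://example.com" + path
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.status)  # expect 200 for both
    except Exception as exc:
        print(url, "-> FAILED:", exc)
```

If both return 200 from your own connection, the problem is more likely on Google's side or in how the property is configured in Search Console.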
Hi Adam,
I've tried resubmitting sitemap.xml in Google Search Console and continue to get the same errors. For a temporary issue, it has now been ongoing for at least 5 days.
Any other thoughts you may have on possible causes? The interwebs seem very dark on this particular problem...
Hello OlorinFiresky!
It sounds like you may just be waiting on a recrawl, for Google's systems to check the site again. That can take up to a few weeks, though it's hard to say more from here; the files all look in order and accessible from this end.
Hi Queso,
If it were as simple as a recrawl, then surely a Fetch as Google should succeed, since I'm forcing the issue. Yet I get the following error, and have done since this problem first appeared...
What is the date when the fetch error was detected? I've found that Google will keep an error listed with the other server errors even when it's months old. You might want to use the checkboxes to select and delete any old server errors.
Did you ever get an actionable answer to your question? I'm having the same problem, and it seems to be caused by Weebly directing the bots to editmysite as a resource instead of to my actual domain (I don't have a free site, so it should not be trying to crawl the Weebly parent site).
I resolved this myself in the end. I had multiple domain variants listed, and that was causing the issue: https://domainname.com, https://www.domainname.com, http://domainname.com, and http://www.domainname.com. Once I removed all but http://www.domainname.com, the problem went away.
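For anyone else tidying this up, it's worth confirming that every variant of the domain ends up at the one canonical address. A quick sketch, using the placeholder domainname.com from the post above:

```python
# Fetch each domain variant and print the final URL after redirects.
# "domainname.com" is the placeholder used in this thread.
import urllib.request

variants = (
    "http://domainname.com",
    "http://www.domainname.com",
    "https://domainname.com",
    "https://www.domainname.com",
)

for url in variants:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, "->", resp.geturl())  # final address after redirects
    except Exception as exc:
        print(url, "-> FAILED:", exc)
```

If any variant fails or lands somewhere other than the canonical host, that's the one to remove or redirect.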