Hello,
Somehow, some text was added to my robots.txt file, and I'm unable to edit it. The text prevents certain crawlers from crawling my site:
User-agent: NerdyBot
Disallow: /
I need to remove this. Unfortunately, support keeps sending me canned messages, so that's no use.
Thanks
No, Google is the one that rejected the sitemap! We can't get our pages indexed in Google, plus we can't add Schemas or Twitter cards. Why does Weebly always make things harder!!
Hi there. Do you have a screenshot you can provide of the Google errors?
Weebly automatically generates the robots.txt file for you, and it's not possible to edit it. Note that changes you make in the Weebly editor determine what gets added to the robots.txt file. For example, on the Pages tab in the Advanced section, if you check the box for "Hide this page from search engines" it will add that page to the bottom of the list.
Weebly has made the decision to block NerdyBot, and it's not possible to change that.
So, you should see something like:
Sitemap: https://www.your-domain.com/sitemap.xml
User-agent: NerdyBot
Disallow: /
User-agent: *
Disallow: /ajax/
Disallow: /apps/
If there is nothing after Disallow: /apps/ it means that nothing else on your site is being blocked.
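If you want to verify how those rules behave, here is a minimal sketch using Python's built-in urllib.robotparser. The file contents mirror the example above; "your-domain.com" is a placeholder for your own domain.

```python
import urllib.robotparser

# The robots.txt that Weebly generates, per the example above.
ROBOTS_TXT = """\
Sitemap: https://www.your-domain.com/sitemap.xml
User-agent: NerdyBot
Disallow: /

User-agent: *
Disallow: /ajax/
Disallow: /apps/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# NerdyBot is blocked from everything.
print(parser.can_fetch("NerdyBot", "https://www.your-domain.com/"))         # False
# Every other crawler can fetch regular pages...
print(parser.can_fetch("Googlebot", "https://www.your-domain.com/about"))   # True
# ...but not the backend /ajax/ and /apps/ paths.
print(parser.can_fetch("Googlebot", "https://www.your-domain.com/ajax/x"))  # False
```

This confirms the point above: only NerdyBot and the two backend paths are blocked; everything else is crawlable.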
Thanks for this information, Jeffrey. I've recently become aware that FreeFind is now not able to search my site here (SeattleYASS.weebly.com). It worked fine for years, but some time before May 2016 it became blocked.
FreeFind tells me I should put the following lines in robots.txt:
user-agent: FreeFind
allow: /
Can something like this be done, or is FreeFind blocked for some reason? Is there anything I can do about this?
You won't be able to directly edit your site, though you can do this:
1. Go to Settings > SEO
2. Disable the option to hide your site from search engines
3. Re-publish
That will update your robots.txt file so it doesn't block search engines in general (including FreeFind).
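To illustrate why step 2 matters, here's a small sketch. It assumes (the thread doesn't show the exact output) that with the hide option enabled, Weebly emits a blanket Disallow that blocks FreeFind along with everyone else, and that after re-publishing only the backend paths stay blocked.

```python
import urllib.robotparser

def freefind_allowed(robots_lines):
    """Return True if FreeFind may crawl the site root under these rules."""
    p = urllib.robotparser.RobotFileParser()
    p.parse(robots_lines)
    return p.can_fetch("FreeFind", "https://example.weebly.com/")

# Assumed output while "hide from search engines" is on: block everything.
hidden = ["User-agent: *", "Disallow: /"]
# After step 3 (re-publish): only backend paths remain blocked.
visible = ["User-agent: *", "Disallow: /ajax/", "Disallow: /apps/"]

print(freefind_allowed(hidden))   # False
print(freefind_allowed(visible))  # True
```

Since FreeFind has no dedicated entry, it falls under "User-agent: *", which is why the site-wide hide option was blocking it.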
Thanks, Adam, that did it. I don't know how "hide from search engines" got turned on -- I certainly didn't do it! When I turn that off FreeFind can do its indexing and all is well.
Best regards,
Gerrit
I'm not sure either, but I'm glad to hear it's working for you!
Hi Adam,
I'm having the exact same issue. I've republished after making sure that "Hide this page from search engines" is not checked, but Google Search Console continues to find the following errors, and my site has disappeared from Google Search.
Which URLs are being blocked? There will also be a few blocked URLs which are just backend resources (anything in /ajax, for example). This is normal, though, and wouldn't affect the indexing of your site pages and so on.
I use Google Merchant as well as Weebly, and Google Merchant is telling me it cannot crawl my images. How do I fix that? Thanks in advance for your help.
Your robots.txt file and domain DNS records look to be set up OK, @autoprofishopUS. What error specifically is Google Merchant giving, and what URL are you using?
Hi, Adam-
In replying to your message, I just realized what I think I did wrong....
I may have used the wrong image URLs. I listed the URLs which have the weebly domain associated (which is what you get when you right-click and copy the image location from WITHIN the "edit site" mode):
https://www.weebly.com/uploads/7/4/3/1/74316665/s644759774199853307_p1_i22_w320.png
What I should have done is close "edit site," return to my site's shopping page, and copy the image link, which would then be associated with MY URL instead of weebly's:
http://www.autoprofishop.us/uploads/7/4/3/1/74316665/s644759774199853307_p1_i22_w450.png
(This may be a different product, but it's part of the same batch of images).
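The URL swap described above can be sketched with Python's urllib.parse. The helper name `rehost` is hypothetical, and note that in the poster's example the published site served a different size variant (_w450 vs _w320), so the path may also differ slightly in practice.

```python
from urllib.parse import urlsplit, urlunsplit

def rehost(editor_url, site_host):
    """Swap the editor's weebly.com host for the site's own domain,
    keeping the /uploads/... path intact."""
    parts = urlsplit(editor_url)
    return urlunsplit((parts.scheme, site_host, parts.path, parts.query, parts.fragment))

# URL copied from inside the Weebly editor (weebly.com host).
src = "https://www.weebly.com/uploads/7/4/3/1/74316665/s644759774199853307_p1_i22_w320.png"
print(rehost(src, "www.autoprofishop.us"))
# → https://www.autoprofishop.us/uploads/7/4/3/1/74316665/s644759774199853307_p1_i22_w320.png
```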
YOU ARE A GENIUS! THANK YOU! LOL (that was kinda easy, eh?)
I'm no genius, but I'm glad to hear that you got it sorted!
Why is it not possible to change what the robots.txt is blocking if it is not a parameter that you have intentionally blocked yourself?
Is it limiting my pages from being crawled? I am still not showing up on the first page of Google search results... not sure where I am going wrong, and thought this might be responsible.
This is what mine looks like:
Sitemap: http://www.wellness4paws.ca/sitemap.xml
User-agent: NerdyBot
Disallow: /
User-agent: *
Disallow: /ajax/
Disallow: /apps/
Please help!
- Melanie
The only items blocked by default are backend resources that you wouldn't want indexed. "User-agent: *" will apply to all search engines, and you can see that other than the stuff I just mentioned there's nothing else blocked. If you check the option for a page to hide it from search engines, it'll add it there after publishing.
Hello
I'm having a huge issue with Google Search Console telling me that robots.txt is blocking my mobile webpages, even though I'm able to view them on mobile. They also say 10 of my pages have content too close together and content too wide for mobile users. None of this is correct; I even have pictures showing contradictory results. This is affecting my SEO rank, which is really not good. I've paid a lot of people to help and got nowhere, so any help would be outstanding.
Here's my website; it's also happening on other pages besides this one.
Thanks for posting, @iFixmarbella. Your robots.txt file looks normal to me - I don't see anything in it which should prevent Google from seeing the site on mobile devices. Can you provide an example of a page which they say has content too close together?
But what if there is? I thought I read here or somewhere that Weebly was aware of the NerdyBot problem and had blocked it. But MY robots.txt is almost ALL disallows, with no NerdyBot block. My errors when trying to submit the sitemap say that robots.txt is blocking Google elements, causing missing items in uploads. Mobile too.
What is the address of the site where you are seeing that, @options1?