
NerdyBot etc.

I received the following when trying to get my site indexed and crawled:

User-agent: NerdyBot
Disallow: /

User-agent: dotbot
Crawl-delay: 10

User-agent: *
Disallow: /ajax/
Disallow: /apps/

 

I have read elsewhere on the site that it is not necessary to be concerned. However, that does not tell me how to overcome the fact that my site is not being fetched. I have been verified as the site owner, but after two weeks of trying I just cannot get Google to co-operate. Any help very welcome, please.

Square Champion

Hi @theoldandgrey6.  Question.  Is your site a Square/Weebly hosted site?  If so, then this whole disallowing of NerdyBot (which is an AI bot crawler) is new to me.  I didn’t know Square was doing that yet, though I’m happy that they are, if so.

 

A quick perusal of StackOverflow.com tells me that sometimes Google’s bots can decide to ignore a site completely when they see a disallow directive in the robots.txt file. So, if you are able to edit that txt file (and Square isn’t doing this for you), edit it and add the following line at the end:

 

User-agent: Googlebot
Allow: /

 

No one seems to know what is up with Google here, but this seems to work for most folks.
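If you want to sanity-check what the edited file will do before waiting on Google, Python’s built-in urllib.robotparser can evaluate robots.txt rules locally. This is just a sketch using the directives quoted above, with the Googlebot line appended at the end:

```python
# Check robots.txt rules locally with Python's standard-library parser.
from urllib.robotparser import RobotFileParser

# The rules from the original file, plus the suggested Googlebot allow.
rules = """\
User-agent: NerdyBot
Disallow: /

User-agent: dotbot
Crawl-delay: 10

User-agent: *
Disallow: /ajax/
Disallow: /apps/

User-agent: Googlebot
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is explicitly allowed site-wide.
print(parser.can_fetch("Googlebot", "/"))
# NerdyBot is blocked everywhere.
print(parser.can_fetch("NerdyBot", "/"))
# Any other bot falls under the * group and is blocked from /ajax/.
print(parser.can_fetch("SomeOtherBot", "/ajax/page"))
```

If the first check comes back True, the file itself isn’t what is keeping Googlebot out.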

 

Also, keep in mind that it can take Google days, or even weeks, for its bots to find new sites.  So patience is about all you have on your side while you wait for this to happen.

 

Let me know if you have any other questions.

Chip

If my answer resolves your issue, please take a minute to mark it as Best Answer. That helps people who find this thread in the future.

Piper’s Ice Cream Bar, Covington KY USA

What is, is.