
Forum Moderators: Robert Charlton & goodroi


Big reductions in crawl-to-index limits on Google Fetch tool

     
1:49 am on Mar 29, 2018 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12311
votes: 396


There have been numerous questions and complaints on the web recently, including several on this forum, about Google's slowness in indexing pages submitted via the Fetch tool. This has to do, it turns out, with spammers' misuse of the tool to crawl and index pages.

Search Engine Roundtable has been covering this since mid-February. Here is probably Barry's most comprehensive article of the three or four I've seen....

Google Sets Aggressive Limits On Request Indexing Fetch Tool
Mar 14, 2018 - by Barry Schwartz
https://www.seroundtable.com/google-aggressive-limits-on-request-indexing-25397.html [seroundtable.com]

As many of you know, Google has set new quotas for the Fetch as Google submit to index feature because of spam and abuse. But it still seems there is a cat and mouse game going on between Google and spammers with that tool.

John Mueller of Google responded to the number of complaints about the tool spitting out error messages, saying that Google has set "pretty aggressive limits there at the moment." He said on Twitter, "I suspect that'll settle down again over time, but in general, I'd recommend focusing on non-manual methods for normal crawling & indexing (like sitemap files)."

I also suggest taking a look at the Google help page for the tool....

Ask Google to recrawl your URLs
[support.google.com...]

If you’ve recently added or made changes to a page on your site, you can ask Google to (re)index it using the Fetch as Google tool.

The "Request indexing" feature on Fetch as Google is a convenience method for easily requesting indexing for a few URLs; if you have a large number of URLs to submit, it is easier to submit a sitemap. instead. Both methods are about the same in terms of response times....

Note also item #5 on the page...

5 - Recrawling is not immediate or guaranteed. It typically takes several days for a successful request to be granted. Also, understand that we can't guarantee that Google will index all your changes, as Google relies on a complex algorithm to update indexed materials.
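
For anyone moving from manual Fetch submissions to the sitemap route Google recommends above, here's a minimal sketch of what generating a basic sitemap file might look like. This is illustration only, not anything Google-specific; the URLs and output filename below are placeholders, not examples from this thread.

```python
# Minimal sketch: build a basic sitemap.xml for a few URLs.
# URLs and output path are placeholders for illustration only.
from datetime import date
import xml.etree.ElementTree as ET

urls = [
    "https://www.example.com/",
    "https://www.example.com/new-blog-post/",
]

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for u in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u
    ET.SubElement(url_el, "lastmod").text = date.today().isoformat()

# Write the sitemap with an XML declaration so crawlers parse it cleanly.
ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8",
                             xml_declaration=True)
```

Once the file is in place, you can submit it in Search Console or reference it from robots.txt with a Sitemap: line, and Google will pick up new URLs through its normal crawling.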


5:59 am on Mar 29, 2018 (gmt 0)

New User

joined:Mar 28, 2018
posts:2
votes: 0


If Google allows 10 individual URLs per day, why is a single blog post I submitted via Fetch as Google still not indexed after 2 days?
5:03 am on Apr 5, 2018 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12311
votes: 396


why is a single blog post I submitted via Fetch as Google still not indexed after 2 days?
What makes you believe that you should expect to be indexed in 2 days? Perhaps you have done this before and the tool behaved differently?

What has been your prior history with using this tool for submissions? All of the recent SE Roundtable articles on this topic have suggested that Google wants to discourage manual Fetch tool submissions for routine updating.

As I noted above, John Mueller recommends "focusing on non-manual methods for normal crawling & indexing."

Google has also said very explicitly in the support page I reference above that they make no guarantees. I'll quote item #5 again....
5 - Recrawling is not immediate or guaranteed. It typically takes several days for a successful request to be granted. Also, understand that we can't guarantee that Google will index all your changes, as Google relies on a complex algorithm to update indexed materials.

Chances are that if you were previously a frequent user of the Fetch tool, Google might consider that a reason to discourage your use of it. Google also notes that the algorithm they're using to evaluate submissions is "complex", and in this case I believe them.

My guess is that the factors they're looking at include...
- your prior history of use...
- their evaluation of what you and possibly related sites have submitted via the Fetch tool...
- their evaluation of the quality of what you're submitting now...
- and whether they think you might be trying to reverse-engineer their algorithm.

This last point is particularly interesting, I think, as there's a Google patent that this forum discussed back in 2012, which I'm conjecturing might relate to this....

Google's Rank Modifying Patent for Spam Detection
Aug 18, 2012
https://www.webmasterworld.com/google/4486158.htm [webmasterworld.com]

While there's no guarantee that Google uses everything it patents, based on site changes I've seen that I felt caused sites to lose ranking, I believe that this patent, or something similar, has been used.

I can well imagine that Google's monitoring of Fetch tool usage might supplement or substitute for parts of the Rank Modifying algorithm, and that this might save Google computing time overall.

I see, btw, that it's been about a week since you posted your question. Has the blog post you're asking about been indexed, and have you made any further observations about the behavior of the Fetch tool?

4:46 am on July 18, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Apr 6, 2016
posts:154
votes: 21


< moved from another location >


@Robert Charlton, whenever I use the Fetch as Google tool, the URL gets indexed instantly, within seconds. The only problem is that it's limited to 10 URLs per day, making it impossible to get all my new posts indexed, because my site generates over 10 posts daily. I wish they would just increase the limit to 20.


[edited by: Robert_Charlton at 6:34 am (utc) on Jul 18, 2018]

8:19 am on July 18, 2018 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12311
votes: 396


@Robert Charlton, whenever I use the Fetch as Google tool, the URL gets indexed instantly, within seconds. The only problem is that it's limited to 10 URLs per day, making it impossible to get all my new posts indexed, because my site generates over 10 posts daily. I wish they would just increase the limit to 20.
Halaspike, I've moved your question from the July Updates and SERPs thread, where it would be off-topic, to discuss it here.

As noted above, Google has put severe limits on the use of the Fetch Tool for submissions because they want to discourage its use by spammers, yet they do want to keep it available "as a convenience" for those who need it. I'll quote again what I quoted before, with emphasis added...

The "Request indexing" feature on Fetch as Google is a convenience method for easily requesting indexing for a few URLs; if you have a large number of URLs to submit, it is easier to submit a sitemap. instead. Both methods are about the same in terms of response times..

WebmasterWorld is not connected with Google in any way, and I'm not privy to Google's algos, though I've paid a lot of attention to how they might work... so all of the following is conjecture only....

It's likely that if your URLs submitted via the Fetch tool get indexed within seconds, Google doesn't consider you a spammer. I'm thinking that Google doesn't know that, though, until it has spidered and evaluated your submissions via their self-described "complex" submission algorithms, which most likely incorporate some of the factors I've suggested above. That requires a special layer of infrastructure. If Google doubled the limit on Fetch submissions for inclusion, they'd probably put a noticeable additional load on that extra layer.

Additionally, since Google is going to be crawling naturally and via sitemaps in any event, that algo and infrastructure layer is to a degree redundant. But it's an extra layer of infrastructure that Google would rather not have to support more than is necessary, particularly since it adds to Google's potential vulnerability. It feels obvious to me that the Fetch tool, if used at high volume, could be used to reverse engineer some of the algo.

It may also be that if Google were to double the submission limits, they'd no longer be able to index ten of your pages within seconds. I gather there were also manpower problems supporting user complaints because the Fetch tool wasn't indexing some sites as fast as some submitters wanted. It had been subject to considerable abuse, and ten is the number they're going with for now.
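
If your site really does produce more than ten new posts a day, the sitemap route quoted above can also be automated. As a rough sketch only (the sitemap URL below is a placeholder, and I'm assuming the sitemap ping endpoint Google documents at the time of this writing), something like this could run after each batch of new posts:

```python
# Rough sketch: tell Google a sitemap has been updated via the sitemap
# ping endpoint documented at the time of this thread. The sitemap URL
# is a placeholder for illustration only.
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
ping_url = "https://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe="")

with urlopen(ping_url) as resp:
    print("Ping response:", resp.status)
```

A 200 response only means the ping was received; as with the Fetch tool, there's no guarantee about when, or whether, any given URL gets indexed.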

3:57 pm on July 18, 2018 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:9913
votes: 972


No matter how fast anything is done, it is never fast enough for some. Patience appears to be in short supply in the race to be "first".

G will index when it gets around to it. Most times it is pretty quick, and sites that update regularly are visited fairly often. That has been my experience, and I've not seen any reason to use Fetch as Google for new URLs.
5:01 am on July 26, 2018 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12311
votes: 396


It appears that Google continued to have too many manual submissions, and has had to drop its public submission feature. Here's the announcement on Twitter...
[twitter.com...]
Google Webmasters
@googlewmc
We've had to drop the public submission feature, but we continue to welcome your submissions using the usual tool in Search Console and through sitemaps directly.
3:25 AM - Jul 25, 2018

Also announced by Barry on Search Engine Roundtable...
Google Drops Public Submit To Index Tool
Jul 25, 2018 - by Barry Schwartz
[seroundtable.com...]

5:07 am on July 26, 2018 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12311
votes: 396


Mod's note: Locking above ^^^ .

Discussion on public submission to Google continues at new thread....

Google has to drop public submission on Fetch / Inspection Tool
July 25, 2018
https://www.webmasterworld.com/google/4912575.htm [webmasterworld.com]