
Forum Moderators: Robert Charlton & aakk9999 & andy langton & goodroi


Submit URLs to Google with Fetch as Googlebot

     
4:51 pm on Aug 3, 2011 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1280
votes: 64


[googlewebmastercentral.blogspot.com...]
The Fetch as Googlebot feature in Webmaster Tools now provides a way to submit new and updated URLs to Google for indexing. After you fetch a URL as Googlebot, if the fetch is successful, you'll now see the option to submit that URL to our index. When you submit a URL in this way Googlebot will crawl the URL, usually within a day. We'll then consider it for inclusion in our index. Note that we don't guarantee that every URL submitted in this way will be indexed; we'll still use our regular processes - the same ones we use on URLs discovered in any other way - to evaluate whether a URL belongs in our index...

When submitting individual URLs, we have a maximum limit of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month.

[edited by: Robert_Charlton at 5:14 pm (utc) on Aug 3, 2011]
[edit reason] adjusted link [/edit]

7:08 pm on Aug 3, 2011 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:13218
votes: 348


When you submit a URL in this way Googlebot will crawl the URL, usually within a day.

I love this. The "Fetch as Googlebot" feature doesn't use the real googlebot, just an understudy who has learned all its lines. So they have to crawl the page all over again for it to count.

Today's interesting lesson: When you (that is, you, not me) make a post in Windows-Latin-1 using curly quotes and em dashes, my browser decides it is in "Japanese (Shift JIS)" and turns those six non-Latin-1 characters into Kanji.
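The mangling described above is easy to reproduce: the Windows-1252 bytes for curly punctuation happen to be valid Shift JIS lead bytes, so a curly apostrophe followed by an ASCII letter decodes as a single kanji. A minimal sketch of the round trip (using Python's stdlib codecs; the two examples match the garbling visible in the quoted post):

```python
# Reproduce the mojibake: Windows-1252 (cp1252) bytes for a curly
# apostrophe (0x92) combine with the following ASCII letter into one
# Shift JIS double-byte character, so "you'll" -> "you値l" and
# "don't" -> "don稚".
for text in ("you\u2019ll", "don\u2019t"):
    mangled = text.encode("cp1252").decode("shift_jis")
    print(text, "->", mangled)
```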
5:24 pm on Aug 4, 2011 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 14, 2007
posts:136
votes: 0


if the fetch is successful

curious wording

So, what would define a successful fetch vs. an unsuccessful fetch?
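The thread never pins down Google's criteria, but a plausible reading is that a "successful" fetch means the URL was reachable (an HTTP 2xx response) and not blocked by robots.txt. A sketch of that guess; the label strings and status groupings here are assumptions, not the tool's documented wording:

```python
# Hypothetical classification of a fetch result. The thresholds and
# labels are assumptions about what "successful" likely means, not
# Google's documented behaviour.
def fetch_outcome(http_status: int, allowed_by_robots: bool) -> str:
    if not allowed_by_robots:
        return "Denied by robots.txt"   # the fetch cannot happen at all
    if 200 <= http_status < 300:
        return "Success"                # presumably eligible for submission
    if http_status in (301, 302, 307, 308):
        return "Redirected"
    return "Error"                      # 4xx/5xx: nothing to submit

print(fetch_outcome(200, True))
print(fetch_outcome(200, False))
```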
5:35 pm on Aug 4, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 9, 2001
posts:1307
votes: 0


I started this a while back ~ it's a very useful GWT feature, but be aware that you are limited in how many URLs you can fetch per week, so be sure to prioritize and do the most important URLs first.

...............................
7:53 pm on Aug 4, 2011 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Apr 29, 2005
posts:1937
votes: 62


In what circumstances would this be useful? Googlebot gets round to most pages reasonably quickly, doesn't it?
8:49 pm on Aug 4, 2011 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 2002
posts:742
votes: 8


Is this 'better' than a sitemap ping? Seems like more work doing onesey-twosey (Fetch as Googlebot) rather than a single submission of a sitemap file with multiple URLs.

What am I missing?
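For comparison, the sitemap route mentioned above amounted to a single HTTP GET against Google's ping endpoint, which existed at the time of this thread. A minimal sketch of building that request (the ping only tells Google where the sitemap lives; it does not force crawling or indexing):

```python
# Build the URL for a Google sitemap ping; fetching it (e.g. with
# urllib.request.urlopen) notifies Google that the sitemap has changed.
from urllib.parse import urlencode

def sitemap_ping_url(sitemap_url: str) -> str:
    # urlencode percent-escapes the sitemap URL for the query string
    return "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(sitemap_ping_url("http://example.com/sitemap.xml"))
```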
11:06 pm on Aug 4, 2011 (gmt 0)

Moderator

WebmasterWorld Administrator webwork is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 2, 2003
posts:7877
votes: 27


Is this now a way for G to sniff out original material versus scraped and/or duplicated content? Perhaps even a way for G et al. to suppress duplicated, re-purposed, revised, rewritten versions of the same material?

So, maybe feed the bot before feeding the rss feed?
12:46 am on Aug 5, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 9, 2001
posts:1307
votes: 0


>In what circumstances would this be useful?

After the Panda virus decimated my sites, I started wondering if my most important pages were even visible any more to Googlebot, so to put those questions to rest, I used this tool. When each page came back with "Success", I could at least rest assured that I had not fallen into the Googlevoid. For that reason alone it was time well spent one night.

.....................
12:54 am on Aug 5, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member

joined:Apr 14, 2010
posts:3169
votes: 0


Be careful with this. It's of no danger to most websites and webmasters, but when you announce to Google that "I DEFINITIVELY OWN THIS SITE" you're signing up for the unknown. Google may or may not place guides and restrictions based on you and/or your history, the contents of which are unknown to you. It's a game of roulette, though most likely safe for the majority of webmasters.

If, however, you've been penalized on another site and/or are working through ranking issues of some kind, you may be 'infecting' your new site right out of the gate. Make sure your portfolio is in the clear before you use this for a new site, imo.
1:06 am on Aug 5, 2011 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:13218
votes: 348


I started wondering if my most important pages were even visible any more to Googlebot

Wouldn't a sitemap do the same thing? One category of Crawl Errors is "On Sitemap", presumably meaning the sitemap is the only way they know of a page's existence. If you've got a trick for convincing g### that a given page doesn't exist, once they've decided it does, I think everyone would like to hear it.
1:56 am on Aug 5, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 9, 2001
posts:1307
votes: 0


>Wouldn't a sitemap do the same thing?

Yes, as would my raw logs. But I was in panic mode and thus grasping at straws (none of which are apparently attached to anything). I ran the tool until it came back with the message that I had maxed out, then did it again a week later, so not a lot of time was lost, and with the exception of Sgt's post above, I had not heard of a downside...

..................................
5:07 am on Aug 5, 2011 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2004
posts:1939
votes: 0


Not understanding how this is in any way helpful to anyone.

Sounds like the old Google site submission tool, which was already antiquated years back...
12:45 pm on Aug 5, 2011 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 21, 2001
posts:1281
votes: 0


Is it just Google heroin to give addicts more to obsess about?