
Submit URLs to Google with Fetch as Googlebot

     

Shaddows

4:51 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



[googlewebmastercentral.blogspot.com...]
The Fetch as Googlebot feature in Webmaster Tools now provides a way to submit new and updated URLs to Google for indexing. After you fetch a URL as Googlebot, if the fetch is successful, you'll now see the option to submit that URL to our index. When you submit a URL in this way Googlebot will crawl the URL, usually within a day. We'll then consider it for inclusion in our index. Note that we don't guarantee that every URL submitted in this way will be indexed; we'll still use our regular processes—the same ones we use on URLs discovered in any other way—to evaluate whether a URL belongs in our index...

When submitting individual URLs, we have a maximum limit of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month.

[edited by: Robert_Charlton at 5:14 pm (utc) on Aug 3, 2011]
[edit reason] adjusted link [/edit]

lucy24

7:08 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



When you submit a URL in this way Googlebot will crawl the URL, usually within a day.

I love this. The "Fetch as Googlebot" feature doesn't use the real googlebot, just an understudy who has learned all its lines. So they have to crawl the page all over again for it to count.

Today's interesting lesson: When you (that is, you, not me) make a post in Windows-Latin-1 using curly quotes and em dashes, my browser decides it is in "Japanese (Shift JIS)" and turns those six non-Latin-1 characters into Kanji.
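For anyone curious why the kanji lucy24 mentions appear at all: Windows-1252 stores curly apostrophes and em dashes as single bytes in the 0x80-0x9F range, and Shift JIS treats those same bytes as the first half of a two-byte kanji, so each one swallows the letter that follows it. A minimal Python sketch of the effect (the sample strings are taken from the announcement text):

```python
# Reproduce the Windows-1252 -> Shift JIS mojibake lucy24 describes:
# text typed with "smart" punctuation, then decoded under the wrong charset.
for text in ["you\u2019ll", "don\u2019t", "way\u2014to"]:
    raw = text.encode("cp1252")               # e.g. b"you\x92ll"
    print(text, "->", raw.decode("shift_jis"))
# The curly apostrophe (0x92) and em dash (0x97) become Shift JIS lead bytes,
# yielding the same garbling lucy24's browser showed: you値l, don稚, way葉o.
```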

bw100

5:24 pm on Aug 4, 2011 (gmt 0)

5+ Year Member



if the fetch is successful

curious wording

So, what would define a successful fetch vs. an unsuccessful fetch?

Reno

5:35 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I started doing this a while back ~ it's a very useful GWT feature, but be aware that you are limited in how many URLs you can fetch per week, so be sure to prioritize and do the most important URLs first.

...............................
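Reno's advice about prioritizing comes down to simple bookkeeping, since Fetch as Googlebot is a manual Webmaster Tools feature with no submission API. A purely illustrative sketch, using the 50-per-week individual-URL limit quoted in the announcement:

```python
# Illustrative only: track how much of the weekly individual-URL quota is left
# and take the most important URLs first. The 50/week figure is Google's
# published limit for individual submissions; the rest is a made-up example.
WEEKLY_LIMIT = 50

def urls_to_submit(prioritized_urls, used_this_week):
    """Return the highest-priority URLs that still fit in this week's quota."""
    remaining = max(WEEKLY_LIMIT - used_this_week, 0)
    return prioritized_urls[:remaining]

# Example: 47 submissions already used this week, so only 3 slots remain.
queue = ["/", "/widgets/", "/blog/latest-post", "/about/", "/contact/"]
print(urls_to_submit(queue, used_this_week=47))
```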

nomis5

7:53 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



In what circumstances would this be useful? Googlebot gets round to most pages reasonably quickly, doesn't it?

Hoople

8:49 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is this 'better' than a sitemap ping? It seems like more work doing them onesey-twosey (Fetch as Googlebot) rather than making a single submission of a sitemap file with multiple URLs.

What am I missing?
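For comparison, the sitemap ping Hoople mentions is a single HTTP GET against the endpoint Google documented at the time; one request covers every URL in the sitemap, versus one Fetch as Googlebot submission per page. A minimal sketch (the sitemap address is a placeholder):

```python
# Ping Google with the location of an updated sitemap (single request,
# no Webmaster Tools UI involved). www.example.com is a placeholder.
import urllib.parse
import urllib.request

sitemap_url = "http://www.example.com/sitemap.xml"
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping) as resp:
    # 200 only means the ping was received; it says nothing about indexing.
    print(resp.status)
```

As with the fetch tool, a successful ping is only a crawl hint, not a guarantee of inclusion.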

Webwork

11:06 pm on Aug 4, 2011 (gmt 0)

WebmasterWorld Administrator webwork is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Is this now a way for G to sniff out original material versus scraped and/or duplicated content? Perhaps even a way for G et al. to suppress duplicated, re-purposed, revised, rewritten versions of the same material?

So, maybe feed the bot before feeding the rss feed?

Reno

12:46 am on Aug 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>In what circumstances would this be useful?

After the Panda virus decimated my sites, I started wondering if my most important pages were even visible any more to Googlebot, so to put those questions to rest, I used this tool. When each page came back with "Success", I could at least rest assured that I had not fallen into the Googlevoid. For that reason alone it was time well spent one night.

.....................

Sgt_Kickaxe

12:54 am on Aug 5, 2011 (gmt 0)

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Be careful with this. It's of no danger to most websites and webmasters, but when you announce to Google that "I DEFINITIVELY OWN THIS SITE" you're signing up for the unknown. Google may or may not place guides and restrictions based on you and/or your history, the contents of which are unknown to you. It's a game of roulette, though most likely safe for the majority of webmasters.

If, however, you've been penalized on another site and/or are working through ranking issues of some kind, you may be 'infecting' your new site right out of the gate. Make sure your portfolio is in clear-sailing mode before you use this for a new site, imo.

lucy24

1:06 am on Aug 5, 2011 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



I started wondering if my most important pages were even visible any more to Googlebot

Wouldn't a sitemap do the same thing? One category of Crawl Errors is "On Sitemap", presumably meaning the sitemap is the only way they know of a page's existence. If you've got a trick for convincing g### that a given page doesn't exist, once they've decided it does, I think everyone would like to hear it.

Reno

1:56 am on Aug 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Wouldn't a sitemap do the same thing?

Yes, as would my raw logs. But I was in panic mode and thus grasping at straws (none of which are apparently attached to anything). I ran the tool until it came back with the message that I had maxed out, then did it again a week later, so not a lot of time was lost; with the exception of Sgt's post above, I had not heard of a downside...

..................................
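Reno's raw logs are indeed the cheapest way to answer the visibility question: if Googlebot is requesting a page, it can see it. A rough sketch that counts Googlebot requests per URL from a standard combined-format access log (the access.log path and log layout are assumptions; adjust for your server):

```python
# Count Googlebot requests per URL from an Apache/nginx combined-format log.
import re
from collections import Counter

hits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        request = re.search(r'"[A-Z]+ (\S+) HTTP/', line)   # "GET /path HTTP/1.1"
        if request:
            hits[request.group(1)] += 1

for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")
```

Keep in mind the user-agent string can be spoofed; a reverse DNS lookup on the requesting IP is the stronger check that a hit really came from Google.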

CainIV

5:07 am on Aug 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not understanding how this is in any way helpful to anyone.

Sounds like the old Google site submission tool, which was already antiquated years back...

kapow

12:45 pm on Aug 5, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is it just Google heroin to give addicts more to obsess about?
 
