
Google SEO News and Discussion Forum

    
Submit URLs to Google with Fetch as Googlebot
Shaddows
msg:4347254 - 4:51 pm on Aug 3, 2011 (gmt 0)

[googlewebmastercentral.blogspot.com...]
The Fetch as Googlebot feature in Webmaster Tools now provides a way to submit new and updated URLs to Google for indexing. After you fetch a URL as Googlebot, if the fetch is successful, you'll now see the option to submit that URL to our index. When you submit a URL in this way Googlebot will crawl the URL, usually within a day. We'll then consider it for inclusion in our index. Note that we don't guarantee that every URL submitted in this way will be indexed; we'll still use our regular processes (the same ones we use on URLs discovered in any other way) to evaluate whether a URL belongs in our index...

When submitting individual URLs, we have a maximum limit of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month.

[edited by: Robert_Charlton at 5:14 pm (utc) on Aug 3, 2011]
[edit reason] adjusted link [/edit]
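
Those two limits are easy to lose track of. Here is a minimal, hypothetical bookkeeping sketch in Python that does nothing but count your own submissions locally; the file name and both helpers are invented for illustration, and there is no Google API involved:

    # Hypothetical local bookkeeping -- not a Google API. It only counts
    # your own submissions so you stay under the quotas quoted above:
    # 50 individual URLs per week, 10 "URL and all linked pages" per month.
    import json
    import time
    from pathlib import Path

    LOG = Path("submission_log.json")   # made-up file name
    WINDOWS = {
        "url": (7 * 24 * 3600, 50),              # individual URL submissions
        "url_and_linked": (30 * 24 * 3600, 10),  # "all linked pages" submissions
    }

    def can_submit(mode: str) -> bool:
        """True if one more submission of this mode fits in its quota."""
        window, limit = WINDOWS[mode]
        log = json.loads(LOG.read_text()) if LOG.exists() else []
        now = time.time()
        recent = [e for e in log if e["mode"] == mode and now - e["ts"] < window]
        return len(recent) < limit

    def record_submission(mode: str) -> None:
        log = json.loads(LOG.read_text()) if LOG.exists() else []
        log.append({"mode": mode, "ts": time.time()})
        LOG.write_text(json.dumps(log))

    # Check before pasting the URL into Webmaster Tools:
    if can_submit("url"):
        record_submission("url")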

 

lucy24
msg:4347339 - 7:08 pm on Aug 3, 2011 (gmt 0)

When you submit a URL in this way Googlebot will crawl the URL, usually within a day.

I love this. The "Fetch as Googlebot" feature doesn't use the real googlebot, just an understudy who has learned all its lines. So they have to crawl the page all over again for it to count.
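
On telling the real crawler from a stand-in: the verification Google documents for log entries is a reverse DNS lookup followed by a forward confirmation. A minimal sketch; the sample IP is just a placeholder:

    # Verify a claimed Googlebot IP: reverse-DNS it, check the hostname,
    # then forward-resolve the hostname and confirm it maps back to the IP.
    import socket

    def is_real_googlebot(ip: str) -> bool:
        try:
            host = socket.gethostbyaddr(ip)[0]  # e.g. crawl-66-249-66-1.googlebot.com
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            return ip in socket.gethostbyname_ex(host)[2]  # forward lookup must match
        except socket.gaierror:
            return False

    print(is_real_googlebot("66.249.66.1"))   # sample IP only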

Today's interesting lesson: When you (that is, you, not me) make a post in Windows-Latin-1 using curly quotes and em dashes, my browser decides it is in "Japanese (Shift JIS)" and turns those six non-Latin-1 characters into Kanji.
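
That conversion is reproducible: the Windows-1252 bytes for a curly apostrophe (0x92) and an em dash (0x97) are Shift JIS lead bytes, so each one pairs with the letter that follows it to form a Kanji. A short demonstration with Python's built-in codecs:

    # Windows-1252 text mis-decoded as Shift JIS: 0x92 (curly apostrophe)
    # and 0x97 (em dash) swallow the next letter and come out as Kanji.
    for s in ("you’ll", "don’t", "processes—the"):
        print(s.encode("cp1252").decode("shift_jis"))
    # -> you値l / don稚 / processes葉he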

bw100
msg:4347833 - 5:24 pm on Aug 4, 2011 (gmt 0)

if the fetch is successful

curious wording

So, what would define a successful fetch vs. an unsuccessful fetch?
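
The announcement never spells the criteria out, but the tool's own status labels suggest roughly: the URL resolves, is not blocked by robots.txt, and returns a normal HTTP response. A rough local approximation, assuming that reading of "successful" (a sketch, not what Google actually runs):

    # A guess at what "successful" likely means: DNS resolves, the URL is
    # not blocked by robots.txt, and the server returns a normal response.
    from urllib import error, request, robotparser
    from urllib.parse import urlsplit

    def classify_fetch(url: str, ua: str = "Googlebot") -> str:
        parts = urlsplit(url)
        rp = robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
        try:
            rp.read()
            if not rp.can_fetch(ua, url):
                return "Denied by robots.txt"
        except Exception:
            pass  # robots.txt unreachable; skip that check in this sketch
        try:
            req = request.Request(url, headers={"User-Agent": ua})
            with request.urlopen(req) as resp:
                return "Success" if resp.status == 200 else f"HTTP {resp.status}"
        except error.HTTPError as e:
            return f"HTTP {e.code}"   # e.g. a 404 here -> "Not found" in the tool
        except error.URLError:
            return "Unreachable"

    print(classify_fetch("http://www.example.com/"))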

Reno
msg:4347842 - 5:35 pm on Aug 4, 2011 (gmt 0)

I started using this a while back ~ it's a very useful GWT feature, but be aware that you are limited in how many URLs you can fetch per week, so prioritize and submit the most important URLs first.

...............................

nomis5
msg:4347914 - 7:53 pm on Aug 4, 2011 (gmt 0)

In what circumstances would this be useful? Googlebot gets round to most pages reasonably quickly, doesn't it?

Hoople
msg:4347944 - 8:49 pm on Aug 4, 2011 (gmt 0)

Is this 'better' than a sitemap ping? Seems like more work doing them onesey-twosey (Fetch as Googlebot) rather than a single submission of a sitemap file with multiple URLs.

What am I missing?
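
For reference, the ping is a single GET against Google's documented ping endpoint, with the sitemap's own URL as the (percent-encoded) query parameter, so one request covers every URL in the file. A minimal sketch; example.com is a placeholder:

    # A sitemap ping, as Google documented it at the time: one GET with
    # the sitemap's URL (percent-encoded) as the query parameter.
    from urllib.parse import quote
    from urllib.request import urlopen

    SITEMAP = "http://www.example.com/sitemap.xml"   # placeholder

    ping = "http://www.google.com/ping?sitemap=" + quote(SITEMAP, safe="")
    with urlopen(ping) as resp:
        print(resp.status)   # 200 meant the ping was accepted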

Webwork
msg:4348001 - 11:06 pm on Aug 4, 2011 (gmt 0)

Is this now a way for G to sniff out original material versus scraped and/or duplicated content? Perhaps even a way for G et al. to suppress duplicated, re-purposed, revised, or rewritten versions of the same material?

So, maybe feed the bot before feeding the rss feed?

Reno
msg:4348028 - 12:46 am on Aug 5, 2011 (gmt 0)

>In what circumstances would this be useful?

After the Panda virus decimated my sites, I started wondering if my most important pages were even visible any more to Googlebot, so to put those questions to rest, I used this tool. When each page came back with "Success", I could at least rest assured that I had not fallen into the Googlevoid. For that reason alone it was time well spent one night.

.....................

Sgt_Kickaxe
msg:4348040 - 12:54 am on Aug 5, 2011 (gmt 0)

Be careful with this. It's of no danger to most websites and webmasters, but when you announce to Google that "I DEFINITIVELY OWN THIS SITE" you're signing up for the unknown. Google may or may not apply guidelines and restrictions based on you and/or your history, the contents of which are unknown to you. It's a game of roulette, though most likely a safe one for the majority of webmasters.

If, however, you've been penalized on another site and/or are working through ranking issues of some kind, you may be 'infecting' your new site right out of the gate. Make sure your portfolio is in clear-sailing mode before you use this for a new site, imo.

lucy24
msg:4348046 - 1:06 am on Aug 5, 2011 (gmt 0)

I started wondering if my most important pages were even visible any more to Googlebot

Wouldn't a sitemap do the same thing? One category of Crawl Errors is "On Sitemap", presumably meaning the sitemap is the only way they know of a page's existence. If you've got a trick for convincing g### that a given page doesn't exist, once they've decided it does, I think everyone would like to hear it.

Reno
msg:4348070 - 1:56 am on Aug 5, 2011 (gmt 0)

>Wouldn't a sitemap do the same thing?

Yes, as would my raw logs. But I was in panic mode and thus grasping at straws (none of which, apparently, are attached to anything). I ran the tool until it came back with the message that I had maxed out, then did it again a week later, so not a lot of time was lost, and, with the exception of Sgt's post above, I had not heard of a downside...

..................................

CainIV
msg:4348103 - 5:07 am on Aug 5, 2011 (gmt 0)

I'm not understanding how this is in any way helpful to anyone.

Sounds like the old Google site submission tool, which was antiquated years back...

kapow
msg:4348202 - 12:45 pm on Aug 5, 2011 (gmt 0)

Is it just Google heroin to give addicts more to obsess about?
