
Google Desktop Tools and Google Labs Projects Forum

Recon on site, search via toolbar, site freshed next day
Lisa is correct

 2:23 pm on May 11, 2002 (gmt 0)

Continuing from Lisa's observation that using the toolbar (with advanced features enabled) prompts Googlebot to come and check out the site you clicked on.

Well, I made some site modifications, then entered terms into the toolbar and clicked on my site, and the next day it had a Fresh tag beside it!

The site is too small and specialised to warrant daily or weekly visits, and is usually visited in the normal Googlebot crawl cycle (3-5 continuous days out of the month). Hence Lisa's theory gets a thumbs up from me.



 4:26 pm on May 11, 2002 (gmt 0)

There are over a million toolbar users out there.

Are we REALLY going to believe that google is adding fresh tags because of the sites they visit? This would require millions of visits (thousands a day just for me). They would be doing nothing but visiting sites that toolbar users visit.

The site in your profile has a PR 5. It was last modified on the 10th according to your server. Google doesn't care how big your site is - I have seen one page sites with this tag.

Google added the fresh tag, because it visited the page on the 10th and it has changed and is therefore "fresh".

Google is still experimenting with the fresh stuff.


 5:16 pm on May 11, 2002 (gmt 0)

>There are over a million toolbar users out there.

You have a source for that Chris?


 5:34 pm on May 11, 2002 (gmt 0)

You mean having Chris_R next to it isn't enough :)


7th bullet


 5:52 pm on May 11, 2002 (gmt 0)

rcjordan mentioned 3.2 million recently in this thread [webmasterworld.com]. Also, Chris_R said it was over a million in this thread [webmasterworld.com], so it must be true ;).


 6:02 pm on May 11, 2002 (gmt 0)

Being pedantic but not argumentative: 1,000,000 downloads doesn't mean 1,000,000 users. Subtract those who lost it after disk drive crashes, removed it, never installed it even though they downloaded it, got bored with it, have it installed but don't display the toolbar, or downloaded it several times, and we have much less!

I see these claims about "so many downloads" everywhere - e.g. Gator, the Wayback guys, toolbars, software - but the reality, I'm guessing, is that the number of users is a small proportion. If I had to guess, a single-digit percentage.


 7:45 pm on May 11, 2002 (gmt 0)

True to some extent, but Google is actually useful - while those other ones are jarring and forced upon users.

Google is the only toolbar that I haven't uninstalled within 24 hours of installing it.

The point is that no matter how many toolbars are out there - you are dealing with millions of websites being visited every day (assuming some reasonable number of users).

It doesn't make any sense that google would use this for the fresh listings.

Google does claim millions of USERS:


Of course, they may count anyone who downloads it as a user.

The toolbar is the future.


 1:12 pm on May 12, 2002 (gmt 0)


I agree in essence with what you're saying about the number of users, but how many of them deliberately use the toolbar and click on a modified site (post crawl)?

Those millions would suddenly become a very small number.

I'm not saying that what I observed is a certainty, but I will keep an eye on this on future mods and builds.


 1:46 pm on May 12, 2002 (gmt 0)

You're talking about a recrawl of pages that were already present in the Google index, right? This has no connection at all with Lisa's experiment, and neither proves nor disproves it. In fact, the only thing your example proves is that Google will recrawl previously indexed pages once in a while, which is not exactly breaking news... ;)

Lisa was talking about the detection of new pages that Google has no other way than a toolbar transaction to even know about.


 2:46 pm on May 12, 2002 (gmt 0)

>I agree in essence with what you're saying about the number of users, but how many of them deliberately use the toolbar and click on a modified site (post crawl)?

Sorry, I just don't get it - whatever the users' intentions are, they are irrelevant.

Google would have to either visit all the sites that were clicked on by users or some of them at random.

ONE person visiting a site isn't going to get Googlebot coming.

Your page was updated on the 10th. Google listed it as fresh. It worked. The toolbar has nothing to do with it.

It is just a waste of resources to visit every site, and then also visit millions of sites that people are visiting via the toolbar.

There is no way ANYONE would ever design a search engine like this. There is no way for Google to know if a site has been updated until it visits it.

Google is the most efficient crawler out there. It was very well designed.

Google has been experimenting with fresh. Toolbar data is useless. I can't imagine why they would even CONSIDER using it.

They have come up with a way of predicting which sites are more likely to update than others, and they are improving it. Their crawler works 24/7 and can only visit so many sites. Adding sites from the toolbar would only force it to visit more sites more than once - or do more processing to get them off the list.
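
To make the recrawl-prediction point concrete, here is a toy scheduler I sketched myself - nothing here reflects Google's actual algorithm, and the fields (`change_rate`, `last_crawl`) are invented for illustration:

```python
def recrawl_order(pages, now):
    """Rank pages for recrawl: a page scores higher the more often it
    has changed historically (change_rate, changes per day) and the
    longer it has gone unvisited (now - last_crawl, in days)."""
    scored = [(page["change_rate"] * (now - page["last_crawl"]), page["url"])
              for page in pages]
    scored.sort(reverse=True)  # highest expected-change score first
    return [url for _, url in scored]
```

Under a scheme like this, a frequently-changing news page that hasn't been seen in days jumps the queue, while a static brochure page waits - which matches the observation that small, rarely-updated sites only get hit during the monthly deep crawl.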


 3:26 pm on May 12, 2002 (gmt 0)

I see what you mean, Chris - just wishful thinking, I suppose.


 3:37 pm on May 12, 2002 (gmt 0)

>Toolbar data is useless. I can't imagine why they would even CONSIDER using it.

Chris_R, finally something I can disagree with you about.

One of the shortcomings of Google is that it only considers sites or pages for its index that are officially submitted and/or linked to from pages already in the Google index.

It could therefore (at least temporarily) potentially miss pages that could be very topical, up-to-date, but more importantly, very popular. This is or was certainly the case with Google's crawling and reindexing sequence.

I am no computer expert, but I believe it should be easy to run a check at the end of the day in which Google compares the URLs that were visited (from Google toolbar data, possibly even topped up with Alexa toolbar data) with the URLs in the index.

It would be good if Google then spidered these pages (at least the ones visited more frequently by different toolbar users) and indexed them as soon as their system allows. (Even though these pages would probably rank low due to (possibly) no incoming links.)

Google's PageRank linking concept works very well in indexing and ranking the WWW, but it fails to see which popular (non-indexed) pages it misses. The toolbar data would help in doing that.

I am not saying Google is doing this at the moment, but they should, at least as soon as the number of toolbar users passes a critical amount.
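
The end-of-day check described above could look something like this sketch (pure speculation on my part - the function, data shapes, and threshold are invented for illustration, not anything Google has published):

```python
from collections import defaultdict

def toolbar_candidates(indexed_urls, toolbar_hits, min_users=3):
    """Return URLs seen in toolbar logs but missing from the index,
    keeping only those visited by at least `min_users` distinct users,
    most popular first.

    toolbar_hits: iterable of (url, user_id) pairs from a day's logs."""
    users_by_url = defaultdict(set)
    for url, user in toolbar_hits:
        if url not in indexed_urls:
            users_by_url[url].add(user)
    popular = [(len(users), url) for url, users in users_by_url.items()
               if len(users) >= min_users]
    popular.sort(key=lambda pair: (-pair[0], pair[1]))  # most users first
    return [url for _, url in popular]
```

Counting distinct users rather than raw hits is the key point: it keeps one person reloading their own page from triggering a crawl, which answers the "ONE person visiting a site" objection raised earlier.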


 5:49 pm on May 12, 2002 (gmt 0)

The theory is OK, but Google - as you point out - would bury the listings.

SERPs are not for webmasters; they are for users. Users do not type in every word of the title just to see their page come up.

Any even remotely competitive phrase gives thousands of results:

squirrel hunting hats 2,850
my poor uncle went downtown 15,800
google cheats at card games 3,740
george bush rides a merry go round 1,600
mothers day should be banned 31,000
brett loves french fries 1,890

Any result with PR0 (which is what a new page would get) would be buried under the results.

This thread was more about freshness than Lisa's.

It is true, of course, that many pages are missed and not included in the index. Google has never claimed anything different. Google has never tried to include all pages and has said more or less that you need links for pages to count.

In a perfect world - what you are suggesting would work. Google is under time constraints - they aren't going to add pages that would get buried - it would be a waste of time. 99% of the time no one would see it. They aren't going to change the ranking system - as it would allow all kinds of useless pages to float to the top.

They need to make efficient use of their time. They know these pages will be buried. They couldn't even go back and revisit many pages when a site was down during the last crawl. What would Google rather spend time on: revisiting a site that is down but already has PR, or visiting for the first time a site that isn't even going to be seen by most users? That page will get in as soon as someone links to it.

They may do this in the future when they go to custom SERPs, but even then - it will take more than one toolbar user for this to occur.

What you mention is a shortcoming of all engines. There isn't much to be done about it - except to either randomly insert new sites up higher to track click and popularity data or...

My idea would be once the toolbar/custom serps become a reality - that google would put a tab on the page - something like "my google".

In there would be a section called "new sites". They could then match your habits against others (you visit webmaster world, papa john's, victoria's secret, and star wars website).

Based on other people's habits that are similar to yours - they could do what you are suggesting. Right now - I just don't think they can. They need more computers. 10,000 + sometimes just isn't enough.

Just my 2 cents.
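
The "new sites" tab idea above is essentially what's now called collaborative filtering. A toy illustration (my own sketch with made-up site names; Jaccard similarity is just one possible measure of "habits similar to yours"):

```python
def suggest_sites(my_sites, other_histories):
    """Suggest sites I haven't visited, weighted by how similar each
    other user's browsing history is to mine (Jaccard similarity)."""
    def jaccard(a, b):
        union = a | b
        return len(a & b) / len(union) if union else 0.0

    scores = {}
    for sites in other_histories.values():
        sim = jaccard(my_sites, sites)
        for site in sites - my_sites:  # only sites new to me
            scores[site] = scores.get(site, 0.0) + sim
    return sorted(scores, key=lambda site: -scores[site])
```

So if you and another user both visit WebmasterWorld and a Star Wars site, the other sites in that user's history float to the top of your "new sites" list, while sites from dissimilar users contribute little.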


 8:22 pm on May 12, 2002 (gmt 0)


Chiyo mentions in this thread:


that the links on Fresh! pages do get indexed for a couple of days. I will try it out. I wonder if these pages get any PR/anchortext credit in their ranking. What's your experience?

If it works it would cover part of the problems about new pages not appearing in Google.

You are right about the crawl-revisit problems; however, I would speculate that it has more to do with their crawling algo needing an update (as just recently happened) than only with computer resources (or did they just dramatically increase their server numbers?).

Also, I believe that the "toolbar-popular" unindexed new pages would make up only a small number of extra crawls.

For some sites unindexed in Google, it would still help being in there, even at buried rankings. I still use Copernic/Ixquick to find certain companies that have websites but no incoming links. You would still find them by typing in their company name plus some level of address/product if Google included them using their toolbar data.

Your tab idea is an interesting one and I guess that feature would soon appear - similar to visiting music sites and seeing what others with the same taste bought or downloaded.


 8:42 pm on May 12, 2002 (gmt 0)

>They could then match your habits against others...Based on other people's habits that are similar to yours
>>Your tab idea is an interesting one and I guess that feature would soon appear

This is sidetracking, but I'd like to draw your attention to this thread
[webmasterworld.com]
on a new engine working with the Google API, proposing a concept very much like what you are describing above.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved