Cut and paste it into the browser and you will find that the toolbar gets PageRank from an XML file that contains far more information about the website.
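Out of curiosity, here is a minimal sketch of what parsing such a response might look like. The element names, attributes, and values below are invented for illustration; the actual XML Google returns to the toolbar is not documented here, so treat everything in this snippet as an assumption:

```python
import xml.etree.ElementTree as ET

# Hypothetical shape of the toolbar's XML response (illustrative only;
# the real element names Google uses are not confirmed in this thread).
sample_response = """<?xml version="1.0"?>
<page url="http://example.com/">
    <pagerank>6</pagerank>
    <lastcrawl>2002-05-14</lastcrawl>
</page>"""

root = ET.fromstring(sample_response)
print(root.get("url"))            # http://example.com/
print(root.findtext("pagerank"))  # 6
print(root.findtext("lastcrawl"))  # 2002-05-14
```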
For some websites there is additional info, such as the date of the last crawl, but not for all of them.
Does somebody know if there is a hidden feature to display additional info in the toolbar?
Also, the voting function in the beta toolbar doesn't use HTTP to send the vote to Google; does anybody know how it works?
For all the people who like Google very much and always want to know more about it and experiment, don't forget to read [google.com...].
I do this already.
The thing is, I haven't tried messing with IE cookies beyond this because they are apparently indexed somehow. Yes, each one is in a separate file. It's mostly plain text, and it can probably be edited safely to spit out different data. But there's also something unconventional happening. For example, on a Netscape cookie, in cookies.txt, you can plainly see the number 2147368520 (any number higher than 2147483647 will not work) on your Google cookie. This is the number of seconds after Jan 1, 1970 that the cookie expires. I don't see this in the IE cookie for Google. But the expiration date has to exist somewhere because it's part of the cookie standard, so that makes me nervous.
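For anyone who wants to check that Netscape expiry number themselves, converting it is straightforward. The value 2147368520 from the post above really does land just short of the signed 32-bit limit:

```python
from datetime import datetime, timezone

# The number from a Netscape cookies.txt entry: seconds since Jan 1, 1970 (UTC).
expiry = 2147368520

# 2**31 - 1 = 2147483647 is the largest value a signed 32-bit field can
# hold, which is why anything above it won't work -- that ceiling falls
# in January 2038.
assert expiry <= 2**31 - 1

print(datetime.fromtimestamp(expiry, tz=timezone.utc))
```

Run it and you'll see the cookie is set to expire in mid-January 2038, i.e. as far in the future as a 32-bit timestamp allows.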
With Netscape, once the browser is exited, all the cookies are in one plain text file called cookies.txt in your profile directory. You can delete these selectively with any text editor. I do most of my surfing in Netscape. Certain cookies I don't delete, because they are my favorite sites and I need them for site configuration. Google's cookies bite the dust several times a day, via a program that is part of another program that I use for a different purpose several times a day.
I'd be nervous about selectively editing out particular cookies in IE, the way I do with Netscape. With IE I just wipe out all *.txt and *.dat files instead of picking and choosing. I don't know what would happen in IE -- it might mess up their cookie index. Netscape doesn't use any sort of index or indirect access.
I find it hard to believe that Google would keep a database that tracks the sites that every user visits. If you do the math on how quickly such a file would grow (size-wise), you see that it quickly becomes unmanageable. Assuming that:
1. There are a reasonably large number of Google Toolbar users reporting their data,
2. Users who use the toolbar are "power users" who would do a lot of surfing
The amount of data required to store all of the URLs would stretch into the terabytes. This thing would be less scalable than Gnutella. :)
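To put a rough number on that scaling argument, here's a back-of-envelope calculation. The user count, pages per day, and URL length below are made-up assumptions for illustration, not measured figures:

```python
# All figures below are invented for illustration of the growth argument.
users = 5_000_000      # assumed number of active toolbar users
pages_per_day = 100    # assumed pages a "power user" visits daily
bytes_per_url = 80     # assumed average stored URL length

daily = users * pages_per_day * bytes_per_url
yearly = daily * 365

print(f"{daily / 2**30:.1f} GB per day")    # ~37 GB/day
print(f"{yearly / 2**40:.1f} TB per year")  # ~13 TB/year
```

Even with these modest assumptions, a full URL log crosses into terabytes within a year.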
There is a parameter in Google cookies called TM, which I'd guess is short for Time Mark, or simply TiMe; it looks a lot like a UNIX timestamp to me. My guess is that running it through Perl's localtime function would render the date the first access was made with that particular cookie. It certainly renders a valid date, but I don't know when that was. Probably the time you were Marked.
Another timestamp, this one called LM, is usually higher than TM. Seems to be Last Mark. LM minus TM probably gives Google a good idea of your EnthusiasmForGoogleRank ;).
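The same decoding the post suggests with Perl's localtime can be sketched in Python. The TM/LM parameter names come from the thread, but the two values below are made up; the interpretation (first-seen vs. last-updated) is the poster's guess, not a confirmed fact:

```python
import time

# Hypothetical TM/LM values of the kind found in a Google cookie.
# The names come from the thread; these particular numbers are invented.
tm = 1020000000  # first access with this cookie? (poster's guess)
lm = 1021500000  # most recent update? (poster's guess)

# Equivalent of running the value through Perl's localtime:
print(time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(tm)))

# If the guesses are right, LM - TM is how long the cookie has been active.
print(f"cookie active for ~{(lm - tm) / 86400:.0f} days")
```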
There is another parameter called 'ID', made up of 8 bytes, or two double words. My sixth sense tells me this is some sort of "ID" (Dr. Evil quotes intended). Tough guess. But it could also be used to mislead high-IQ SEOs like the members of this board into believing this is actually our "ID". Hmmm, looks like an ID, weighs like an ID... let's call it our "ID".
Adding it up, we get 8 bytes + 4 bytes + 4 bytes, which is, uh...16. 16 bytes is what an ID costs the great Google cookie jar.
So for every KByte of memory they store 64 IDs. For every MByte, they store 65,536 IDs. With a Gig they get to store 67,108,864 of them, and with 150 Gigabytes they could tag 10 billion people, or earth's population for the next few years, I guess up until 2038? That is, you can buy 150 Gigs for around US$750, maybe a little more. The Web's population is not even close to the U.S. population, which could be tagged with around 3 Gigs.
Just to clarify: with a Terabyte of data you'd tag nearly 70 billion [not necessarily different] people.
Added later: to be useful they'd probably want to store other things, like an extra 4 bytes for your IP and maybe a 64-byte buffer for the user agent. Still, my point is that cheap storage nowadays allows this kind of user tracking at very low cost. IPs and user agents are also available in common logs, so maybe the 16 bytes is all they need after all?
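The cookie-jar arithmetic from the last few posts checks out, with one small correction (a MByte holds 65,536 sixteen-byte records, not 65,535). Here it is worked through, including the extended record with IP and user-agent fields:

```python
# Bare record per the thread: 8-byte ID + 4-byte TM + 4-byte LM.
record = 8 + 4 + 4  # 16 bytes

kb, mb, gb, tb = 2**10, 2**20, 2**30, 2**40

print(kb // record)        # 64 IDs per KByte
print(mb // record)        # 65,536 IDs per MByte (not 65,535)
print(gb // record)        # 67,108,864 IDs per Gig
print(150 * gb // record)  # just over 10 billion IDs in 150 GB
print(tb // record)        # nearly 70 billion IDs per Terabyte

# Extended record: +4 bytes for an IP, +64 bytes for a user-agent buffer.
extended = record + 4 + 64  # 84 bytes
print(tb // extended)      # still around 13 billion per Terabyte
```

Even the fatter 84-byte record barely dents the capacity of commodity disks, which is the thread's point about cheap storage.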
Sure, Google could keep a log of IPs, a unique Google ID and other values like the User Agent or the frequency with which their product is used.
Personally, if people knew that information about me, I wouldn't care. As it stands, it is difficult to associate IPs with people's names - especially for the majority of people who access the internet via dialup. How does that information violate your privacy?
To display PageRank, the Google Toolbar sends back to Google the URL of the page you are viewing. It also sends the keywords currently displayed in the Google Toolbar.
So does this mean that if I create a new site, type some keywords into the bar, then visit my site, Google gets a report of my address along with the keywords for it? Could this be an "Add URL" button of sorts, where you get to influence the keywords?
That means that each time you visit a different page after a search, those pages are in some way "tagged" with the keywords you were searching for.
I think that they could use it to find "related" pages.
I mean, you start your search at Google, but at some point you start browsing pages that are 2, 3, 4 clicks away from the SERP. You still have the keyword displayed in the toolbar, and it is sent to Google's backend when you have PageRank display turned on. So now Google may look at the page and see if there is a match for the keyword. It could be used in various ways: giving more importance to this page for this keyword, or finding related pages.
In fact, recording the clickstream alone lets you reconstruct browsing sessions,
but recording clickstream plus search keyword lets you track "searching sessions", and that is more interesting to Google.
Am I right GoogleGuy ?
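The "searching sessions" idea above can be sketched as a simple keyword-to-pages index. Everything here is invented to illustrate the speculation; nothing in the thread confirms Google builds such a structure:

```python
from collections import defaultdict

# Minimal sketch: pages visited while a query is still shown in the
# toolbar get associated with that query. All data below is invented.
observed = [
    ("google toolbar cookies", "http://example.com/privacy"),
    ("google toolbar cookies", "http://example.com/cookies-explained"),
    ("pagerank explained",     "http://example.org/pagerank"),
]

pages_for_keyword = defaultdict(list)
for keyword, url in observed:
    pages_for_keyword[keyword].append(url)

# Pages sharing a keyword become candidate "related" pages for that query:
print(pages_for_keyword["google toolbar cookies"])
```

Aggregated over many users, an index like this would link pages several clicks past the SERP back to the query that led there, which is exactly what the posts above describe.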