[edited by: tedster at 9:46 pm (utc) on Apr 1, 2010]
I don't get it. SERPs change daily, so I don't see the point of a big SERP-change thread.
It's Thursday and Google traffic is slowing down, and this usually continues until the weekend. I've noticed this pattern for the last few weeks (since February or March). Anyone else experiencing this?
My site is up in Caffeine, and I think Google switches Caffeine off on the weekend.
How are you supposed to tell the difference between the two?
We stopped working on one site last year (May) and it gradually dropped in the rankings (very slowly), but now it is appearing back on top!
I have also noticed increased power of older domains/URLs in Caffeine. I can't tell if it's due to the age of the backlinks or the age of the URLs. Any thoughts?

Most of my sites suffered a huge Google traffic loss one month ago. But the only two sites that have not been affected are both more than 8 years old.
I am seeing .edu pages ranking higher with increasing prevalence in the SERPs.
But Google should realize that .edu can be manipulated for commercial purposes just as easily as any other TLD, and .edu pages shouldn't be held in such high regard.
Additionally, looking at the pages Googlebot visits, I see it trying to read pages that have not existed since 2008 and have since been 301-redirected to new URLs (on the same site).
Googlebot is still looking for URLs that were 301ed a long time ago.
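For anyone wanting to double-check what Googlebot actually receives on those stale URLs, here is a minimal Python sketch of the server-side behavior being described: a permanent-redirect mapping from retired paths to their replacements. The paths and mapping are made up for illustration.

```python
# Hypothetical 301 mapping: URLs that no longer exist should answer
# with a permanent redirect, which is what Googlebot sees when it
# revisits them years later.
REDIRECTS = {
    "/old-page.html": "/new-page/",
}

def respond(path):
    """Return (status, location) the way the site would answer a crawler."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(respond("/old-page.html"))  # -> (301, '/new-page/')
print(respond("/new-page/"))      # -> (200, '/new-page/')
```

As long as the old URL keeps answering 301 to the same destination, Googlebot re-requesting it is normal; it periodically re-verifies redirects rather than forgetting the old URL outright.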
We've developed an interesting trick that speeds up the first step: instead of storing the entire index on one very powerful computer, Google uses hundreds of computers to do the job. Because the task is divided among many machines, the answer can be found much faster.
To illustrate, let's suppose an index for a book was 30 pages long. If one person had to search for several pieces of information in the index, it would take at least several seconds for each search. But what if you gave each page of the index to a different person? Thirty people could search their portions of the index much more quickly than one person could search the entire index alone. Similarly, Google splits its data between many machines to find matching documents faster.
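To make that divided-index idea concrete, here is a toy Python sketch of the same scheme: the index is split into shards, each "machine" searches only its own slice, and the hits are merged. The shards, terms, and document ids are all invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy inverted index split into shards: term -> list of document ids.
# Each shard stands in for one machine holding a slice of the index.
SHARDS = [
    {"caffeine": [1, 4], "google": [1, 2]},
    {"caffeine": [7], "serp": [5, 7]},
    {"google": [9], "serp": [9]},
]

def search_shard(shard, term):
    """Each 'machine' searches only its portion of the index."""
    return shard.get(term, [])

def search(term):
    """Fan the query out to all shards in parallel and merge the hits."""
    with ThreadPoolExecutor(max_workers=len(SHARDS)) as pool:
        results = pool.map(lambda s: search_shard(s, term), SHARDS)
        return sorted(doc for hits in results for doc in hits)

print(search("caffeine"))  # -> [1, 4, 7]
```

The speedup in the book analogy comes from the fan-out: each worker scans a slice a fraction of the full size, so wall-clock time is roughly divided by the number of shards (plus merge overhead).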
One query can use in the area of 500 different servers - and there are thousands of servers that hold various versions of that data. So yes, you can easily be seeing the results of different data sets a lot of the time.
And with regard to the time stamp: yes, the data for every URL is stored at various stages of its evolution, and each version has a time stamp. Not all data-sets contain every time-stamped version of a given URL.
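A small Python sketch of what that time-stamped storage implies for what you see in the SERPs. The URL, dates, and revision labels are hypothetical; the point is that two data-sets refreshed on different dates can serve different versions of the same URL.

```python
from datetime import date

# Hypothetical store: each URL kept at several stages of its
# evolution, each stamped with the date it was captured.
VERSIONS = {
    "example.com/page": [
        (date(2010, 1, 5), "rev A"),
        (date(2010, 2, 20), "rev B"),
        (date(2010, 3, 28), "rev C"),
    ],
}

def latest_before(url, cutoff):
    """Newest stored version no later than `cutoff` - roughly what a
    data-set last refreshed on that date would serve."""
    candidates = [(ts, rev) for ts, rev in VERSIONS[url] if ts <= cutoff]
    return max(candidates)[1] if candidates else None

# Two data-centers refreshed on different dates answer differently:
print(latest_before("example.com/page", date(2010, 2, 25)))  # rev B
print(latest_before("example.com/page", date(2010, 4, 1)))   # rev C
```

This is why the same query can flip between result sets from hour to hour: you are not necessarily watching a ranking change, just hitting machines holding differently refreshed data.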
Watching Google move data around is a pretty intensive sport - it can either confuse things or clarify things for you, depending on how accurately you picture what they're doing.