| 2:48 am on Jul 1, 2006 (gmt 0)|
This won’t help much tedster, but I can tell you that from a Canadian IP, “Show IP” reports only 2 IPs; however, from my US-based server, it does report 3 IPs.
Guess it’s like a hotel room, why does it take 4 towels to do what 1 does at home?
| 7:43 am on Jul 1, 2006 (gmt 0)|
|Does anyone else have this three IP thing? I just get one Google IP address |
At the moment in the UK, I am getting for BOTH Google.com & Google.co.uk
And I get DIFFERENT results for the same query.
| 7:58 am on Jul 1, 2006 (gmt 0)|
And FWIW, the .co.uk queries use a filter that is not apparent on ANY of those 4 Data Centres.
I used a UK type of query, "cornwall", and I am on a UK ISP, BT.
If you look at the results, some sites are higher in the serps on the .co.uk query than they are in any of the 4 data centres.
| 4:20 pm on Jul 3, 2006 (gmt 0)|
For the first time, when I run the site: operator, I see on all DCs that my homepage title is that of DMOZ. Don't know whether I should be happy or just start a new thread; Damn You, Google, You are killing my site!
My homepage though shows on top of listings, at least ;-)
| 4:46 pm on Jul 3, 2006 (gmt 0)|
|For the first time, when I run the site: operator, I see on all DCs that my homepage title is that of DMOZ. Don't know whether I should be happy or just start a new thread; Damn You, Google, You are killing my site! |
That happened to me for a long, long time. I tried to find a live DMOZ editor but couldn't find one who would take any responsibility for my category. Then Google started to use my description again. Then I changed my description and put the top traffic terms in it. Now when someone does a search for one of the high traffic terms their search is emboldened in my listing, and I think I get more clicks per SERP.
Keep your fingers crossed.
| 4:58 pm on Jul 3, 2006 (gmt 0)|
I have suspected for some time that using google.com or google.co.uk returns an entirely different result set than a single IP starting point.
In fact it is impossible to match google.com or google.co.uk results with any single IP address serp.
I mentioned this some time ago in another thread.
For this reason it seems entirely logical that Google is using a combination of datacentres to produce the result sets from the main search homepages.
This way it is far more difficult for webmasters to use McDar or other tools to accurately "Data Centre Watch".
In my opinion DC watching may still be interesting but looking at individual Google IP search pages is not going to help in accurate assessment of current or future SERPs.
Also, "random" results are a good way of ensuring that results to the searcher are a fresh mix of several (possibly random) data centres, as we all know these DCs can be a little different at times due to the nature of the Google indexing system.
I also believe that the google.co.uk etc homepages introduce further filters including server location, domain extension etc that are not introduced to direct IP searches even with the country string manually added in the address window. The results are quite different.
Again I have posted this before but it went un-noticed except by Tedster.
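The "combination of datacentres" idea floated above could be sketched like this: each DC returns its own ranked list and the front end merges them. This is pure speculation about Google's internals; the merge-by-average-rank rule and all URLs below are invented for illustration.

```python
# Hypothetical sketch of "blended datacentre" results: each DC supplies its
# own ranked list; the front end orders URLs by their average rank across
# the DCs that returned them. Not Google's actual method.

def blend_serps(dc_results):
    """dc_results: list of ranked URL lists, one list per datacentre."""
    ranks = {}
    for serp in dc_results:
        for rank, url in enumerate(serp, start=1):
            ranks.setdefault(url, []).append(rank)
    # Sort by average rank; a URL missing from a DC simply contributes nothing.
    return sorted(ranks, key=lambda u: sum(ranks[u]) / len(ranks[u]))

dc_a = ["example.com/a", "example.com/b", "example.com/c"]
dc_b = ["example.com/b", "example.com/a", "example.com/c"]
blended = blend_serps([dc_a, dc_b])
```

A blend like this would explain why no single IP's serp ever matches what google.com or google.co.uk shows.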
| 7:55 pm on Jul 3, 2006 (gmt 0)|
I have no evidence whatsoever to back this up, but it has occurred to me of late that maybe G are holding cached searches in memory on some of their servers for a certain amount of time. If the query is close enough to the original then they return a cached result rather than doing a full search. Maybe rather than holding indexes in memory or hitting the hard drives to perform a query, some of the time potted results are sent back to similar enough (maybe the n% most common) searches.
There is certainly something funny going on with google serps and cookies...
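The cached-searches guess above, if true, would look something like a TTL cache in front of the index. A minimal sketch, assuming "close enough" just means the same query after case/whitespace normalisation (all names here are invented):

```python
import time

# Speculative sketch of holding recent search results in memory with a
# time-to-live, serving the cached copy for a "similar enough" repeat query
# instead of running a full search.

class QueryCache:
    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # normalised query -> (timestamp, results)

    @staticmethod
    def _normalise(query):
        # Treat queries differing only in case/whitespace as the same search.
        return " ".join(query.lower().split())

    def get(self, query):
        entry = self._store.get(self._normalise(query))
        if entry and time.time() - entry[0] < self.ttl:
            return entry[1]  # cache hit: skip the full search
        return None          # miss or expired: caller runs the real search

    def put(self, query, results):
        self._store[self._normalise(query)] = (time.time(), results)
```

A cache like this would also explain seeing one result set this minute and another the next, as entries expire at different times on different servers.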
| 8:57 pm on Jul 3, 2006 (gmt 0)|
I'm curious if anyone whose site was hammered on June 27th has seen it reappear since then. Our three main sites went to 10% of long-term traditional Google referrals and are still there.
Also, has anyone distilled the various evidence-based theories of what's going on and created a summary post? I've read all of the messages in the threads on the subject, and haven't come across one.
| 9:22 am on Jul 4, 2006 (gmt 0)|
> For the first time, when run site: operator, I see on all DCs that my homepage title is that of DMOZ
Yep, similar for me: there is one KW I sporadically monitor, because I "own" ALL relevant two-word combinations, except a search on this KW alone. It used to be on spot 28-40 for years, but in the past months moved up to page one on more and more of the DCs, except my default one. In the last couple of days it was shown on page one this minute, gone again the next.
Yesterday (July 3rd) was the first day I seem to have occupied page one with it, and ALL McDar results report the same.
If you ask me: Big Daddy has settled.
Indeed, time to open a new thread.
| 10:37 am on Jul 4, 2006 (gmt 0)|
Well, if this is the future of Google and they think this is an improvement to search, then they have clearly lost the plot.
Some great sites have been hit hard since this infrastructure was introduced. They still have canonical issues with a great number of sites that they are doing nothing about, and whilst it's annoying if your own sites lose position, it's even more annoying when you see the spam and junk sites that rank above you!
On one of our sites, the bot is clearly active on the site, yet the data must be going into a big black hole or something, because nothing is changing in the serps and the serps still bring up out-of-date cached pages for the site, and pages without the www prefix despite a 301 being in place.
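The non-www vs www situation described above comes down to one redirect rule. As a plain function (the real fix would live in server config, and the hostnames here are examples only), the intended 301 behaviour is:

```python
# Sketch of the canonical non-www -> www 301 rule mentioned above,
# written as a testable function rather than server config.

def canonical_redirect(host, path):
    """Return (status, location): 301 to the www host, or 200 if already canonical."""
    if not host.startswith("www."):
        return 301, "http://www." + host + path
    return 200, None
```

If the 301 really is in place and Google still lists the non-www pages, the redirect is working but the index simply hasn't refreshed those URLs yet.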
Also, there is absolutely no logic to some SERPs whatsoever. It's almost like for some two or three word strings they have picked the top 40 sites that match the nearest and mixed them all up together.
Page Rank has little effect now by the looks of it, other than affecting how deep Googlebot goes on your site.
Age of a page remains an important factor, in so much that if you have a stale out of date page that nothing has changed on for two years chances are it will rank high.
I think the serps are the worst ever; others who have come out tops in all this will no doubt disagree.
| 11:12 am on Jul 4, 2006 (gmt 0)|
|I have no evidence whatsoever to back this up, but it has occurred to me of late that maybe G are holding cached searches in memory on some of their servers for a certain amount of time. |
If I had capacity issues that is what I would do.
Am I the only one who thinks that the results on these DCs are just plain unadulterated garbage?
Or are they just suffering from a McNameless filter?
| 11:36 am on Jul 4, 2006 (gmt 0)|
I'm noticing garbage sites with numbered URLs on previously stable(!) DCs and default UK. Definitely still some tinkering.
Keyword monitoring seeing a lot of fluctuation day to day too.
Also noticing a lot of scraper sites serving up Google cache details. Also numbers in the domain - pure spam - no content at all.
Are sites that link to cache results in Google being given some sort of preferential ranking?
| 11:56 am on Jul 4, 2006 (gmt 0)|
I totally agree, all garbage pops up on the industries I watch.
Seems like aggressive inanchor link building and 'VERY low end' SEOs get another chance of survival there.
I saw the worst SERP there in a long time.
It seems like it is where Google is heading though... if that's the case, they had better review their on-site algo for good, because the likes of "we at KW1 company provide KW2 services for KW1 customers. Our KW2 services are the best for KW1..." are amazingly ridiculous on page 1 (medium-competition type SERPs in the similar examples I saw).
| 4:06 pm on Jul 4, 2006 (gmt 0)|
|Am I the only one who thinks that the results on these DCs are just plain unadulterated garbage? |
188.8.131.52 etc etc
No, you're definitely not the only one, dear
On this first dc (too lazy to look at the specific rest but I know the results are similar everywhere), in my sector:
1) "Informational" site with tons of "Added Reviews" pages containing (usually) just a few lines of "new text". The featuring home page only contains links to these pages with a fraction of text taken from these inner pages.
2) Products site with tons of images and 2-3 words describing each product
3) A .gov site with regulations, reports and warnings and links to the white house etc
4) A page with a "Click a letter below" list of LINKS ONLY - must be around 100 usually one-word-links - NO OTHER CONTENT whatsoever
5) Another .gov site, an EXACT COPY of the previously mentioned .gov site (featuring at position 3), but on a different domain (of course!)
6) A decent site
8) A good site
9) A low quality site containing a few lines of trivial content and links to all kinds of sites, even casino
10) Yahoo directory
On the second page I see ANOTHER Amazon page, wikipedia and other jewels
I have NEVER seen worse results than this before!
| 4:11 pm on Jul 4, 2006 (gmt 0)|
I don't believe in the "results pieced together from several datacentres" stuff. I recently asked why ShowIP lists three IP addresses for Google and the answer was something like this:
Sites like Google have multiple IP addresses for one domain name in their DNS listing (called DNS Round Robin). You cannot say for sure which one Firefox is using for displaying the site. It may even use different ones for the main body and images. Therefore ShowIP just shows one address and indicates by '3 more' that 3 more IPs exist for the domain.
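The round-robin behaviour described above can be pictured as the client cycling through the A records in the DNS answer. A toy sketch (the addresses are placeholders, not real Google IPs, and real clients pick records less predictably than this):

```python
from itertools import cycle

# DNS round robin, roughly: the DNS answer carries several A records for one
# name, and successive connections can land on different addresses. Here a
# simple cycling picker stands in for the client's choice.

a_records = ["66.102.9.99", "66.102.9.104", "66.102.9.147"]  # placeholder IPs
picker = cycle(a_records)

# Three "connections" in a row hit three different addresses.
first_three = [next(picker) for _ in range(3)]
```

This is why an extension like ShowIP can truthfully report "3 more" IPs even though your browser only talked to one of them for any given page load.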
Interestingly, I have tried accessing google.com then google.au, then google.fr and so on, and I get a completely separate set of results each time - however the three IPs in one result are always from the same Class-C block as each other.
The next search will bring three new IP addresses, all from the same Class-C block as each other, but from a different Class-C block to that used last time.
I think that there is some results filtering going on depending on the IP that you are located at yourself, and from which you access Google's servers, combined with the actual Google domain name (the .com or country specific) that you requested the data from.
Accessing a direct IP address forces the "generic" Google English. Maybe there are some parameters that can be added to the search URL that then makes it return the "specific" results based on those filters. I suspect that the &hl= parameter does have at least a small role to play here.
| 4:18 pm on Jul 4, 2006 (gmt 0)|
One of the reasons why you are seeing pages with lists of links on them, imo, is simply that Google has done something with spidering in relation to PageRank which has strangled its own results!
I.e. it won't now spider off a page unless it's over a certain PageRank.
For example, a page on one of our sites acts as a site map for a section about widgets, detailing all the specific widget pages off it within the widget section of the site.
The home page is a PR7, the widget section page a PR5, the site map/widget index page a PR3; the specific widget page off this is not indexed - three deep, too many now under the algo.
When you search for specific widget information, rather than deliver the page about that specific widget, Google will deliver the page above it that acts as the site map listing the widget pages off it.
Multiply this by loads of other sites with the same problem and G delivers serps that look and are garbage.
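The spidering behaviour guessed at above can be sketched as a crawl that indexes every page it reaches but only follows links out of pages at or above a PR threshold. The site graph and PR values below mirror the widget example (PR7 home, PR5 section, PR3 index) and are entirely invented; the threshold of 4 is an assumption for illustration.

```python
# Toy model of PR-gated spidering: pages are indexed when reached, but links
# are only followed from pages whose PageRank meets the threshold.

def crawl(links, pagerank, start, min_pr_to_follow=4):
    indexed, frontier = set(), [start]
    while frontier:
        page = frontier.pop()
        if page in indexed:
            continue
        indexed.add(page)
        # Only spider outward from pages above the PR cutoff.
        if pagerank.get(page, 0) >= min_pr_to_follow:
            frontier.extend(links.get(page, []))
    return indexed

links = {
    "home": ["section"],
    "section": ["widget-index"],
    "widget-index": ["widget-page"],
}
pagerank = {"home": 7, "section": 5, "widget-index": 3, "widget-page": 2}
reached = crawl(links, pagerank, "home")
```

Under this model the PR3 widget index gets indexed but never followed, so the specific widget page stays out of the index and the link-list page is all the serps can return.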
| 4:24 pm on Jul 4, 2006 (gmt 0)|
I can't believe how spammy or content poor the sites at the top are since the 27th, it is like age, content and page rank are being completely ignored.
I found one site in the top three for a popular one word term with the title "Under Construction" and virtually nothing on the under construction page. But, the site's url, for example only, term.org matches the actual term itself exactly.
My own sites with good content and page rank were knocked way back on the 27th but are slowly working their way back up the ladder. I am attributing this to visitor popularity which they have always had in the past.
My theory is that they are experimenting with different algorithms and then watching the results as to how popular they are with google visitors. If this is true it would mean a potential upcoming change in policy.
| 4:51 pm on Jul 4, 2006 (gmt 0)|
To prove your point that Google almost discounts page rank: a popular search term, "Blue Widgets", with 11 million results, has the following pages listed in its SERPs:
1 & 2 - Same site dual listing, PR5 page - garbage (in sector, low quality)
3 - PR5 (in sector with dedicated page but not specific)
4 - PR5 (Dedicated site, reasonable listing)
5 - PR4 - Directory Site
6 - PR5 - Directory site (only info on it about the search term is link to our site which is position 8!)
7 - PR5 - Directory again
8 - PR5 - (our site with dedicated sector about term)
9 - PR4 - Directory
10 - PR3 - (Dedicated site but poor page)
11 - PR8 - Authority site dedicated to the subject (should be 1)
12 - PR5 - In Sector, with dedicated page
13 - PR4 - In sector, not relevant
14 - PR3 - In sector with dedicated page
15 - PR4 - In sector, not relevant
16 - PR0 - Directory
17 - PR5 - Dedicated site
18 - PR0 - Directory
19 - PR5 - In sector
20 - PR3 - Dedicated
Conclusion - 6 out of 20 are directory sites. The true authority is at position 11. Only 4 of the 20 are dedicated to the subject matter and 5 have sections about it. The remaining 5 are garbage.
A directory with a link to our site and nothing else can outrank our own site that contains the information.
PR5 pages can be beaten by PR0s and PR3s.
Now someone tell me Google has improved the serps since the roll-out of Big Daddy - frankly, the serps are a joke imo.
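One quick way to put a number on the "PR barely matters" claim is to correlate SERP position with toolbar PR for the list above. The PR values are copied from the post; everything else is a back-of-the-envelope check, not how Google weighs anything. If PR dominated ranking, r would sit near -1 (better positions paired with higher PR).

```python
# Pearson correlation between SERP position (1-20) and toolbar PR for the
# "Blue Widgets" listing quoted above. A strong PR->rank link would push r
# towards -1; a weak one leaves it near 0.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

positions = list(range(1, 21))
pr = [5, 5, 5, 5, 4, 5, 5, 5, 4, 3, 8, 5, 4, 3, 4, 0, 5, 0, 5, 3]
r = pearson(positions, pr)  # comes out weak-to-moderate for these numbers
```

Toolbar PR is a coarse 0-10 integer, so this is only a rough gauge, but it at least makes "PR0s beating PR5s" measurable rather than anecdotal.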
< continued here: [webmasterworld.com...] >
[edited by: tedster at 1:58 am (utc) on July 7, 2006]