Forum Moderators: Robert Charlton & goodroi
What I notice is that the Google data centre serving me in the UK when I type in Google.co.uk is 66.102.9.104, and I see one set of SERP results.
Yet if I type 66.102.9.104 into the browser, I get different results. Why is this?
[edited by: tedster at 10:37 pm (utc) on July 28, 2006]
I posted on this some time ago.
You simply cannot replicate .co.uk results using an actual IP address. Much testing has proved this.
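A likely mechanical reason, sketched here as an assumption rather than anything Google has confirmed (the hostname and IP are just the ones from this thread): an HTTP request carries a Host header derived from whatever you typed in the address bar, so fetching by raw IP tells the front end nothing about which Google property you wanted.

```python
# Compare the Host header sent when fetching by name vs. by raw IP.
# No network traffic is needed; urllib derives the host from the URL.
from urllib.request import Request

by_name = Request("http://www.google.co.uk/search?q=blue+widgets")
by_ip = Request("http://66.102.9.104/search?q=blue+widgets")

print(by_name.host)  # www.google.co.uk
print(by_ip.host)    # 66.102.9.104
```

With only an IP to go on, the server cannot apply any .co.uk-specific handling, which would explain why the two result sets never match.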
Reseller: I would expect the results across one Class-C block to be the same. My guess is that you caught that block in the middle of the data being updated, and that the next time you look the results will all be the same as each other, just within that block.
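For anyone wanting to check this themselves, here is a small standard-library sketch (the IPs are the ones mentioned in this thread) of testing whether two datacenter addresses sit in the same Class-C (/24) block:

```python
import ipaddress

def same_class_c(ip_a: str, ip_b: str) -> bool:
    """True if both addresses fall inside the same /24 ('Class-C') block."""
    block = ipaddress.ip_network(f"{ip_a}/24", strict=False)
    return ipaddress.ip_address(ip_b) in block

print(same_class_c("66.102.9.104", "66.102.9.99"))   # True
print(same_class_c("66.102.9.104", "72.14.207.99"))  # False
```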
"RichTC: What results do you get if, rather than using 66.102.9.104, you use gfe-lm.google.com instead?"
I see the 66.102.9.104 results exactly as I would if I typed the data centre IP in directly.
Once I go to Google as www.google.co.uk, the results are different from the 66.102.9.104 ones, and much better IMO.
The conclusion is that ellio is bang on. An extra filter is applied to the results. Perhaps, and quite rightly IMO, Google gives a bit more weight to sites that operate within the area the data centre is serving?
I've deleted all cookies/files etc. and run this test three times now; I'm seeing exactly what ellio discovered.
The question is: is this the same in reverse? Are the .com results I see from the UK different from the .com results seen locally in different regions of the USA and Europe?
Very interesting
Other sites I have did not budge; they had:
ODP listings for Title in Google
exchanged relevant links
No duplicate content.
All of the sites are roughly the same age. The only element which is different is the % of duplicate content on the site that got hit hard, which is around 40%.
Any thoughts, discussions on this element?
"Reseller: I would expect the results across one Class-C block to be the same. My guess is that you caught that block in the middle of the data being updated, and that the next time you look the results will all be the same as each other, just within that block."
You are right. This morning the results are the same within that block... vanished :-)
However, I'm still #2 for that particular key phrase on the majority of DCs. The question now is: in which direction are the DCs going to move?
I think duplicate content within a site is very likely in many sites that are authorities on their subject. Google's trying to be far too clever for its own good, IMO, if it's trying to hit sites because they have too much similar content.
It's a question of where you draw the line on what counts as duplicate.
For example, if your website is about cooking, you are likely to have many pages that mention eggs, flour etc.; even the actions around the ingredients may be similar on many pages.
I think Google should drop the whole on-site duplicate and word-semantics issue and just deliver, plain and simple, the pages from the site most relevant to the exact search term. If a site has 1000 pages that contain duplicate keywords, Google should deliver the one page dedicated to those keywords that is the most relevant.
I'm seeing far too many results in Google currently that are meaningless, simply because the page contains a link to another section of the webmaster's site and Google has failed to list the correct page.
I mean that the other 9 sites showing on the 1st page for "Blue Widgets" were not affected by this filter, but I was (of those 9 sites, one is an affiliate site adding no additional content, and the remaining 8 are very good and respectable competitors). I still get the same good ranking if I search "Blue Widgets" with the quotation marks, as all of you do.
This sounds like Florida all over again. Have you tried searching for the singular "Blue Widget" and other stems of the words? I'm finding that for some singular two-word terms that I'm not showing up for, if I search for the term with one word changed to the plural, I come back in.
Sid
Forgot to say that the plurals thing does not work on the 72.14.207.* group. That group is showing different results to the majority. I'm very confident that what we are seeing has something to do with the semantics element of the algo.
[edited by: Hissingsid at 8:19 am (utc) on July 27, 2006]
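To make the singular/plural checks above less tedious, a naive variant generator can be sketched like this (pure illustration; real stemming is far more involved than adding or dropping a trailing "s"):

```python
def variants(term: str) -> list[str]:
    """Return the term plus versions with each word toggled singular/plural."""
    words = term.split()
    out = {term}
    for i, w in enumerate(words):
        alt = w[:-1] if w.endswith("s") else w + "s"
        out.add(" ".join(words[:i] + [alt] + words[i + 1:]))
    return sorted(out)

print(variants("blue widgets"))  # ['blue widget', 'blue widgets', 'blues widgets']
```

Each variant would then be searched by hand to see which ones the filter hits.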
At the beginning of this Yo-Yo dance, all the versions were giving "nowhere" results:
1- Blue Widgets
2- Blue Widget
3- The Blue Widgets
Now the behaviour is exactly the same, except for the 2nd and 3rd terms.
Term nr. 1 (Blue Widgets) is either 1st or nowhere on all datacenters, and dancing (within the same DC it varies from 1st to nowhere).
Terms 2 and 3 are either "a bit worse ranking than before, still acceptable" or "almost disappeared, around 3xxth position and not showing with the main URL".
The substance of it doesn't change, IMO.
On the DCs where I'm 1st for the 1st keyword, I'm also "not too bad" with the other 2 keywords.
On the DCs where I've vanished for my 1st keyword, I'm also "much worse, and not showing for the main URL" with the other 2 keywords.
Therefore I don't think this can be a semantic problem with singular/plural versions of a certain expression...
With my website that vanished, there was only one change made: a run-of-site link pointing directly back to the home page. This was done as a test to see the impact of run-of-site links. One day after this change, the site dropped out of the listings for high-money terms. When the link was changed back to normal and Google crawled us, we were back in the listings where we were before. It must be something to do with run-of-site links or on-page factors, as this was the only site the test was done on; all the other sites I deal with didn't move at all.
Hope this helps some of you.
Pete
And this is not the explanation: we have never had this happen in our SEO life.
Also, we run a lot of similar websites that have not been affected by this filter. Therefore it cannot be related to how we are linked and how we link, as we adopt exactly the same policies everywhere (white hat 110%, even more "legal" than Google suggests).
We have been meaning to run this test for a while. If your site is bluewidgets.com, then in the navigation menu that runs through the whole site, instead of a link with the anchor text "home", you change "home" to, say, "Blue Widgets Homepage", but only the words "Blue Widgets" are linked, with the hyperlink pointing to bluewidgets.com. So throughout the site you see "Blue Widgets Homepage", with "Blue Widgets" as the anchor text.
This test was done to see whether the new "data refresh" values or devalues site-wide links with anchor text. This test proved that site-wide links with anchor text relevant to the website are frowned upon by Google and should not be used, as they will affect your listings.
I have another site that this test will be done on, just to make sure. I just thought it might be useful for some people, in case their site had the same sort of problem, maybe with site-wide links to their own site.
Pete
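For clarity, the navigation change Pete describes can be sketched as a tiny template helper (hypothetical names; the point is that only the money phrase sits inside the link, while " Homepage" stays as plain text):

```python
def home_link(url: str, phrase: str) -> str:
    """Build the nav entry: only the phrase is hyperlinked; ' Homepage' is plain text."""
    return f'<a href="{url}">{phrase}</a> Homepage'

print(home_link("http://bluewidgets.com", "Blue Widgets"))
# <a href="http://bluewidgets.com">Blue Widgets</a> Homepage
```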
I see opposite trends in my industry.
My competitor, always super-top #1 (at least since Jagger) for a very highly competitive keyword, is using the technique you are suggesting and describing as "punished by G".
I also agree that this could be seen (and should be seen) as a malicious technique, and in fact I have always used "Home" instead of "Blue Widgets Homepage" to link my homepage from every page of my sites.
BUT: those smart guys are 1st for all those very competitive keywords.
ALSO: I made a test myself, using "WidgetsWorld.Com" in the navigation bars (instead of "Homepage", "WidgetsWorld.Com" being the address of my homepage) on one of our tens of websites, and this has worked beautifully: pre- and post-Jagger always first, including these days.
But I don't dare adopt this technique on all the sites we manage, as for me this is a black-hat technique that I refuse to use. And Google should do the same, punishing those sites instead of rewarding them.
[edited by: giuliorapetti at 11:08 am (utc) on July 27, 2006]
[edited by: soapystar at 11:13 am (utc) on July 27, 2006]
Thanks for the responses. IMHO I do class this as a black-hat technique; I was just doing it as a test, not to implement on the site. I have seen numerous sites adopt this technique and wanted to know whether this new update/data push targeted it and affected the sites that had it. I think in my case it did, and that is all I was looking to establish. I don't believe this is the best way to go about getting listings, but I still think the above comments are right: you can no longer build a site purely for the visitors; you have to consider Google in everything that you put onto the site.
Pete
And by the way, I am one of the least technically qualified webmasters. My pages are simple (and may have many errors for all I know), and some of the discussion here seems like rocket science to me. In other words, if content is good, Google may be very forgiving (as it has always been to us).
Friends, if you have a good website and have done nothing wrong, hang in there (do nothing). You will be back. We suffered a 90% drop in traffic and income and it was painful. PS: We are based in the US.