In my experience, when using the actual .co.uk address, additional filters are applied: if you compare the results with those from the IP address plus the UK part of the data string (from a .co.uk search), you get different results.
I posted on this some time ago.
You simply cannot replicate .co.uk results using an actual IP address. Much testing has proved this.
RichTC: What results do you get if, rather than using 188.8.131.52, you use gfe-lm.google.com instead?
Reseller: I would expect the results across one Class-C block to be the same. My guess is that you caught that block in the middle of the data being updated, and that the next time you look the results will all be the same as each other, just within that block.
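g1smd's point can be checked mechanically: collect the top results from each datacenter IP you test, group the IPs by /24 (Class C) block, and see whether every IP in a block agrees. A rough sketch in Python; the IPs and result lists you feed it are whatever you recorded yourself (the structure here is an assumption, not real datacenter data):

```python
import ipaddress

def group_by_class_c(results_by_ip):
    """Group per-datacenter result snapshots by their /24 (Class C) block.

    results_by_ip: dict mapping an IP string to the ordered list of
    result URLs seen when querying that datacenter IP directly.
    """
    blocks = {}
    for ip, results in results_by_ip.items():
        block = ipaddress.ip_network(f"{ip}/24", strict=False)
        blocks.setdefault(str(block), []).append(tuple(results))
    return blocks

def blocks_in_sync(results_by_ip):
    """For each /24 block, True if every member IP shows identical results."""
    return {
        block: len(set(snapshots)) == 1
        for block, snapshots in group_by_class_c(results_by_ip).items()
    }
```

If a block reports False, you may have caught it mid-update, as suggested above; check it again later and see if it settles.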
"RichTC: What results do you get if, rather than using 184.108.40.206, you use gfe-lm.google.com instead?"
I see the 220.127.116.11 results exactly as I would if I typed the data centre number in.
Once I go to Google as www.google.co.uk, the 18.104.22.168 results are different, and much better IMO.
My conclusion is that ellio is bang on: an extra filter is applied to the results. Perhaps, and quite rightly IMO, Google gives a bit more weight to sites that operate within the area the data centre is serving?
I've deleted all cookies, files etc. and run this test three times now, and I'm seeing exactly the same thing ellio discovered.
The question is: does this work in reverse too? Are the .com results I see from the UK different from the .com results seen locally in different regions of the USA and Europe?
My site just reappeared in almost all of the DCs. Anyone seeing the same? I am just hoping it sticks.
I'm back also. From nowhere to the first 12 in hundreds of terms at about 7pm Wednesday. Looks (from my limited viewpoint) like a total rollback to pre-June 27... I'm seeing the same results I saw then.
Not seeing anything here for what was hit in the past few days.
Bad results on most DCs I see. One site I own lost out on many top keyword searches; it simply vanished overnight on 'the dreaded night'.
Other sites I have did not budge and had:
ODP listings for Title in Google
exchanged relevant links
No duplicate content.
All of the sites are roughly the same age. The only element which is different is the % of duplicate content, which is around 40% on the site that got hit hard.
Any thoughts, discussions on this element?
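For anyone wanting to put a number on "% of duplicate content" for their own site, one common way to estimate it (an assumption for illustration only, not necessarily anything like what Google does) is shingling: break each page's text into overlapping word n-grams and measure the Jaccard overlap between pages.

```python
def shingles(text, n=4):
    """Overlapping word n-grams ('shingles') of a page's visible text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def similarity(text_a, text_b, n=4):
    """Jaccard similarity of two pages' shingle sets, from 0.0 to 1.0."""
    a, b = shingles(text_a, n), shingles(text_b, n)
    union = a | b
    return len(a & b) / len(union) if union else 0.0
```

Running every pair of pages through `similarity` and averaging (or counting pairs above some threshold) gives a rough duplicate-content figure to compare across sites.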
Good morning g1smd
"Reseller: I would expect the results across one Class-C block to be the same. My guess is that you caught that block in the middle of the data being updated, and that the next time you look the results will all be the same as each other, just within that block."
You are right. This morning I see the results are the same within that block... vanished :-)
However, I'm still #2 for that particular key phrase on the majority of DCs. The question now is: in which direction are the DCs going to move?
Good morning Folks
IMO, whatever started yesterday morning is still going at full power.
In the good old days, we would have called what's happening a Pre-Update stage.
Today... Matt Cutts will call it a Data Refresh, while Adam will insist that it's just a Bad Data Push :-)
I am back in almost all DCs (for my particular site that got hit on 26/7).
I posted this as well at
I believe the new data refresh, as MC mentioned, is on its way. Fasten your seat belts, cross your fingers and wait. Good luck, reseller and fellow webmasters. I strongly believe Google will fix things for white-hat sites sooner or later. Many things are not final yet: the Google Directory PR update and the BL PR update are not complete. After those are finalized, we will probably see something like Florida or Allegra, IMHO.
I see some pages back, but not others. For the main one I track, the supplemental result that had been at the top of the site: search is now the fourth listed page, and the correct main index page is at the top. Ranking is down slightly from where it used to be. In other words, it's now only very slightly impacted by the weak supplemental, rather than totally dead.
I think duplicate content within a site is very likely in many sites that are authorities on their subject. Google is trying to be far too clever for its own good, IMO, if it is hitting sites because they have too much similar content.
It's a question of where you draw the line on what counts as duplicate.
For example, if your website is about cooking, you are likely to have many pages that mention eggs, flour etc.; even the actions around the ingredients may be similar on many pages.
I think Google should drop the whole on-site duplicate and word-semantics issue and simply deliver the pages from the site most relevant to the exact search term. If a site has 1000 pages that contain duplicate keywords, Google should deliver the one that is dedicated to those keywords and is the most relevant.
I'm seeing far too many results in Google currently that are meaningless, simply because the page contains a link to another section of the webmaster's site and Google has failed to list the correct page.
| I mean to say that the other 9 sites showing on the 1st page for "Blue Widgets" were not affected by this filter, but mine was (of those 9 sites, one is an affiliate site, i.e. adding no additional content, and the remaining 8 are very good and respectable competitors). |
Anyway, I'm still getting the same good ranking if I search "Blue Widgets" with the quotation marks. As are all of you.
This sounds like Florida all over again. Have you tried searching for the singular Blue Widget and other stems of the words? I'm finding that for some singular two-word terms that I'm not showing up for, if I search for the term with one word changed to the plural, I come back in.
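A quick way to run Hissingsid's check systematically is to generate every variant of a term with one word's number toggled, then search each variant and compare rankings. A naive sketch (real English morphology is much messier; this only handles simple "-s" plurals and ignores irregulars):

```python
def toggle_plural(word):
    """Naively toggle a simple English '-s' plural (ignores irregulars)."""
    return word[:-1] if word.endswith("s") else word + "s"

def term_variants(term):
    """All variants of a multi-word term with exactly one word toggled."""
    words = term.split()
    variants = []
    for i, word in enumerate(words):
        toggled = words[:i] + [toggle_plural(word)] + words[i + 1:]
        variants.append(" ".join(toggled))
    return variants
```

For example, `term_variants("blue widgets")` yields `["blues widgets", "blue widget"]`; only the sensible variants are worth actually searching, of course.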
Forgot to say that the plurals thing does not work on the 72.14.207.* group. That group is showing different results from the majority. I'm very confident that what we are seeing has something to do with the semantics element of the algo.
[edited by: Hissingsid at 8:19 am (utc) on July 27, 2006]
It looks strange, but I'm having the same problems as all of you on ONLY two websites out of 13. And the only thing different between them and the others is the Google Sitemap placed on the two that disappeared...
[edited by: Alex70 at 8:34 am (utc) on July 27, 2006]
At the beginning of this Yo-Yo dance, all the versions were giving "nowhere" results:
1- Blue Widgets
2- Blue Widget
3- The Blue Widgets
Now the behaviour is exactly the same, except for the 2nd and 3rd terms.
Term no. 1 (Blue Widgets) is either 1st or nowhere on all datacenters, and dancing (within the same DC it varies from 1st to nowhere).
Terms 2 and 3 are either "a bit worse in ranking than before, but still acceptable" or "almost disappeared, at around the 3xxth position, and not showing with the main URL".
The substance of it doesn't change, IMO.
On the DCs where I'm 1st for the 1st keyword, I'm also "not too bad" with the other two keywords.
On the DCs where I've vanished for my 1st keyword, I'm also "much worse, and not showing with the main URL" for the other two keywords.
Therefore I don't think this can be a semantic problem with singular/plural versions of a certain expression...
I am now seeing all my listings returned, with higher positions. There is also still an increase of about 20 million listings. Has everyone else's returned?
The "not ranking for a specific search" thing has been around for years, but the quotes thing would never have made the pages appear before, and obviously the phenomenon is far more widespread now than previously.
I'm not seeing any changes
home page still MIA
other pages AOK
results "from UK" still OK
With my website that vanished, there was only one change made: a run-of-site link that linked directly back to the home page. This was done as a test to see the impact of run-of-site links. One day after the change, the site dropped out of the listings for high-money terms. When the link was changed back to normal and Google crawled us, we were back in the listings we were in before. It must be something to do with run-of-site links or on-page factors, as the test was done on this site only; all the other sites I deal with didn't move at all.
Hope this helps some of you.
Please explain: what is a run-of-site link? Is it a sitemap?
Run-of-site link: you get or purchase a link from someone, but the inbound link appears on all pages of the site that is linking to you.
And this is not the explanation for us: we have never had these in our SEO life.
Also, we run a lot of similar websites that have not been affected by this filter. Therefore, it cannot be related to how we are linked and how we link, as we adopt exactly the same policies everywhere (110% white hat, even more "legal" than Google suggests).
Similar, but this was a link that ran through my own site. I'll explain.
We had been meaning to run this test for a while. Say your site is bluewidgets.com. In the navigation menu that runs through the whole site, instead of a link with the anchor text "Home", you change "Home" to, say, "Blue Widgets Homepage", but only "Blue Widgets" is linked, and that hyperlink points to bluewidgets.com. So throughout the whole site you see "Blue Widgets Homepage", with "Blue Widgets" as the anchor text.
This test was done to see whether the new "data refresh" values or devalues site-wide links with anchor text. To me the test suggests that site-wide links with anchor text relevant to the website are frowned upon by Google and should not be used, as they will affect your listings.
I have another site this test will be done on, just to make sure. I thought it might be useful for some people in case their site had the same sort of problem, perhaps with site-wide links to their own site.
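To make the markup in that test concrete, here is a sketch of the two nav-link versions being compared (example.com is a placeholder, and the helper is purely illustrative):

```python
def nav_link(home_url, anchor_text):
    """Build the sitewide 'back to home' link used in the test."""
    return f'<a href="{home_url}">{anchor_text}</a>'

# Plain version: neutral anchor text on every page of the site.
plain = nav_link("http://www.example.com/", "Home")

# Test version: the keyword itself as sitewide anchor text, with the
# trailing word "Homepage" left outside the link, as described above.
keyword = nav_link("http://www.example.com/", "Blue Widgets") + " Homepage"
```

The only difference between the two site states is which of these strings appears in the shared navigation template.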
I see opposite trends in my industry.
My super-top, always-no.-1 competitor (at least since Jagger) for a very highly competitive keyword is using the very technique you are outlining as "punished by G".
I also agree that this could (and should) be seen as a malicious technique, and in fact I have always used "Home" instead of "Blue Widgets Homepage" to link to my homepage from every page of my sites.
BUT: those smart guys are 1st for all those very competitive keywords.
ALSO: I ran a test myself, using "WidgetsWorld.Com" in the navigation bars (instead of "Homepage"; "WidgetsWorld.Com" being the address of my homepage) on one of our tens of websites, and it has worked beautifully: always first, pre- and post-Jagger, and also during these days.
But I don't dare adopt this technique on all the sites we manage, as for me it is a black-hat technique that I refuse to use. And Google should do the same: punish those sites instead of rewarding them.
[edited by: giuliorapetti at 11:08 am (utc) on July 27, 2006]
IMHO you might be suffering from an unnatural increase in links with the same anchor text, and not from linking back to the index with keywords in the anchor.
But you can look at the way Google has gone over the last 12 months, and the trend is clear: do not use keywords on the site that match the subject of your site; not in the meta title, H1 tags, or anchor text, and not too many times on the page itself. Instead, concentrate on optimising the external elements of the site. It's true that it is now no longer possible to build a site just for the user if you hope to get Google traffic for any sector they have targeted.
[edited by: soapystar at 11:13 am (utc) on July 27, 2006]
| IMHO you might be suffering from an unnatural increase in links with the same anchor text |
So what you are saying is that even your own internal navigation must be designed for Google and not for the site visitor...
10-year-old site kicked out of Google's results once again! I'd been back for a year, where I deserved to be... now replaced by news articles? And my site nowhere to be seen. More Google madness, it seems.
Thanks for the responses. IMHO I do class this as a black-hat technique; I was just doing this as a test, not to implement on the site. I have seen numerous sites adopt this technique and wanted to know whether this new update/data push targeted it and affected the sites that used it. I think in my case it did, and that is all I was looking to establish. I don't believe this is the best way to go about getting listings, but I still think the above comments are right: you can no longer build a site purely for the visitors; you have to consider Google in everything that you put onto the site.
Almost all of the damage from June 27 is now over for us. We did nothing at all, just updated the website as we always do. I do see a minor change (plus or minus 10%) in our rankings, but other than that it seems that nothing has changed. And by the way, because we did not like the description from ODP, we added the tag, and it all worked out great: no impact on rankings.
And by the way, I am one of the least technically qualified webmasters. My pages are simple (and may have many errors for all I know), and some of the discussion here seems like rocket science to me. In other words, if the content is good, Google may be very forgiving (as it has always been to us).
Friends, if you have a good website and have done nothing wrong, hang in there (do nothing). You will be back. We suffered a 90% drop in traffic and income and it was painful. PS: We are based in the US.