OK, so after feeling really good about getting our site back in order after the 27th of June "data refresh", I come back from holiday and find it's all gone Pete Tong again.
The site: tool is NOT SHOWING OUR HOME PAGE FIRST. AGAIN!
Consequently, the home page is not showing in the serps. Again. Grrrrr....
Another data refresh required please Google. You're killing me.
[edited by: tedster at 6:17 am (utc) on Aug. 30, 2006]
Interesting. We started with a domain 11 years ago and built a lot of content on it for our clients. As domains began to be used more frequently for individual sites, our clients migrated from www.OldDomain/client to their own domains. After they moved their hosting to other providers, we eventually redirected (301) these old client pages back to a new domain of ours where we built all our new content.
For example, we had one client with a site at something like:
Today, for searches on "badger golf course" (it's actually not badger, so don't bother trying that ;-) our site comes up in the top three even though we have zero to do with either badgers or golf. Amazing but true.
The point here is that I noticed when I do link:www.NewDomain.com none of the 301 redirected old pages show up as links AND THEY USED TO. I still get the benefit of the keywords -- I just don't get to see the linked pages.
So, someone tell me why a page which has been gone for more than five years still helps me with SEO, and what, if anything, it means that the links no longer show up. Oh, amazingly, www.OldDomain.com, with a PR of 6, has a link on it to www.NewDomain.com, but that doesn't show up using the link:www.NewDomain.com syntax either. Another strange one!
My site has a name that no one else would ever legitimately have to use. On all other DCs, apart from 72.*, a search for just my site name (i.e. without .co.uk added) provides 12,800 returns, with me at the top. On the 72.* DCs I'm half way down the page, and sites that have given me non-reciprocal links are at the top. They (72.*) all seem to be acting the same way.
I predict big changes for the last week of this month. 3 months from the June 27th updates.
p.s. I'm also seeing the same changes on the following DC's
Correct. For URLs that are marked as Supplemental and which return a "200 OK" status, the current content of the page will be indexed and appear as a normal result, while any searches based on older versions of the page will appear as a Supplemental Result. The supplemental result allows searchers to find content that they looked at recently but which is no longer on that page.
>> These are mostly all 302 pages. <<
There is almost no valid reason to be using 302 redirects these days. I would look very carefully at your site architecture before you hit any snags.
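If the old URLs need to point somewhere permanently, the 301 is usually configured at the server. A minimal sketch for Apache with mod_alias (the path and domain here are just the placeholder names from this thread, not anyone's real setup):

```apache
# .htaccess — permanent (301) redirect from a retired client path
# to the new domain. Path and domain are hypothetical examples.
Redirect 301 /client http://www.NewDomain.com/
```

A 301 tells Google the move is permanent, so link credit should follow it; a 302 says "temporary" and leaves the old URL in play.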
>> Of these 444, about half of them are printable version pages which last year we had told Googlebot not to index (via robots.txt). <<
Yes. Using robots.txt to keep duplicates out of the index is a very good idea. I just fixed a 50 000 page site that was exposing 750 000 URLs in just this manner: [webmasterworld.com...]
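For anyone wondering what that exclusion looks like: assuming the printable versions all live under one path (the /print/ directory here is a made-up example), the robots.txt entry is just:

```
# robots.txt — keep duplicate "printable" pages out of the index.
# The /print/ path is a hypothetical example.
User-agent: *
Disallow: /print/
```

Everything under that path is then off-limits to all compliant crawlers, which is what keeps the duplicates from competing with the real pages.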
>> After the recent changes in the algo, I figured I'd reopen them (trying anything here). <<
If by "re-open" you mean "show them to Google", then that would be a very big mistake. Google takes a year to drop a Supplemental Result from the index when it represents a page that is now 404, redirects, or is excluded from indexing after it has been indexed. Exclude those URLs again, and make that permanent.
>> The other half are pages which were for paid listings which are now expired. We redirect them to another page until they're paid again. I'm thinking this probably isn't the best idea. <<
Make sure that the redirect is a 301 redirect (never a 302 redirect), OR make sure that those URLs take you to a custom 404 page (one that really does return a 404 HTTP status code) that has basic site navigation on it to get the visitor headed in the right direction.
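The classic mistake with custom 404 pages on Apache is pointing ErrorDocument at a full URL, which makes the server redirect to the error page and serve it with "200 OK" — a soft 404 that Google keeps indexing. A local path preserves the real status code (the filename is hypothetical):

```apache
# .htaccess — serve a custom 404 page WITHOUT losing the 404 status.
# Use a local path: "ErrorDocument 404 http://example.com/notfound.html"
# would trigger a redirect and the page would return "200 OK" instead.
ErrorDocument 404 /notfound.html
```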
>> I just tried the supplemental search and came up with only 6 pages that really are supplemental. I guess all I can do now is to ignore them. <<
YES! If the Supplemental Result is for a URL that is now a 301 redirect or is now returning a 404 error, then you can safely ignore that result. Google will drop it from view after one year. You cannot control that action. In the meantime the 301 redirect, or your custom 404 page should be feeding the visitor through to the correct part of your site anyway.
If the supplemental result is for a URL that still returns "200 OK", then that indicates a problem: usually duplicate content of some sort (URLs with multiple different parameters that show the same content, www vs. non-www, multiple domains, http vs. https, etc) or pseudo-duplicate content (multiple pages with same title and/or meta description, or page content that is too similar).
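The www vs. non-www case is the easiest of those to fix: pick one hostname as canonical and 301 the other to it. A sketch for Apache with mod_rewrite, using this thread's placeholder domain (adapt the hostname to your own):

```apache
# .htaccess — canonicalise non-www to www with a 301,
# so only one hostname gets indexed. Domain is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^NewDomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.NewDomain.com/$1 [R=301,L]
```

The same idea works in reverse if you prefer the bare domain; the point is that every page should answer at exactly one URL.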
>> With Google, it takes a perfectly unique page of content, with unique title, etc, and makes it supplemental. Then, they show up as non supplemental for a few days, and then go back to supplemental. <<
For live pages (those that return "200 OK"), whether a result shows as Supplemental or not depends on the search query.
Additionally, it depends on the datacentre. If you look at gfe-gv and then at gfe-eh you will see big differences. The data at gfe-eh had a big change made to supplemental results a few weeks ago. That change is now also on many other datacentres, but is still far from being on all of them.
See also: [webmasterworld.com...]
[edited by: g1smd at 12:47 pm (utc) on Sep. 12, 2006]
FYI everyone: my backlinks count seemed to have dropped off right around the August 17th date. Any chance Google instituted some form of filter/penalty for sites on the same IP with reciprocal (or even standard) links?
Lastly, when I search on:
I get a weird site which has 302'd to my home page. Weird. Anyone else find anything like that?
One site has Google adsense, the other doesn't. The two websites are on different servers! Other websites on their server seem unaffected.
Does anyone know what's going on? Anyone else have this happen?
Both sites are well over 2 years old. I haven't changed anything either
Does anyone know what's going on? Anyone else have this happen?
I see your 2 years - and raise you to 7 years! Yes one of my 7-year old sites has dropped from the index.
As for what's going on - this is the nature of Google. It is still crawling the site but it is no longer in the index as of the start of this week.
Nothing has changed... perhaps a new filter has been added to rid the world of cheesy sites?
Seriously - I am going to hold fire for a few weeks and see if it pops back. They sometimes do.. but with Google nobody ever knows what is going on.
By disappear do you mean that a site: command shows nothing?
If you're still indexed and are not ranking, do you come back by appending &filter=0 into the URL string?
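For anyone who hasn't tried it: run your search, then tack the parameter onto the end of the results URL in the address bar, something like (the query is just a placeholder):

```
http://www.google.com/search?q=your+search+phrase&filter=0
```

If your page reappears with &filter=0, a duplicate/similarity filter is suppressing it rather than the page being out of the index.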
Knowing that might help to diagnose what is wrong and figure out a way to solve whatever problem is causing you to blip out of existence.
I know everyone says "don't rely on google traffic" but take my example.
We sell widgets.
In Spring we needed to order the new stocks of widgets for Autumn ahead of time. Clearly we analyse sales and predict stocks for Autumn. Sales of course come from many sources, but with google as the dominant supplier of natural leads, they also make up a large percentage of sales. I cannot change that. Now I cannot do my stock purchasing predictions by saying "Heck, we had better not order, just in case google pulls the plug". I need to order based on the current situation.
Of course, on June 27th google pulled the plug.
I am now faced with cancelling the Autumn stock - quite a problem with suppliers. I am also now discussing shutting down that business, as it is not worth this aggro. It is not economic to run with zero natural traffic.
So what is the lesson to be learnt? What should I have done?
a) Block google spiders and traffic, so that my websites are not dependent on that source of leads?
Actually I have always had a kinda strategy - I run 3-4 internet businesses, in different market sectors, and the other two are NOT optimised for google. I get no organic traffic from google for them, I expect none, and so it is consistent (but I do use AdWords).
Personally I think if you have a profitable business that relies on good google natural listings - flog the damn thing - as you are constantly living with the risk that your business is worthless tomorrow.
I like google, but I wish they only had like 30% of the marketplace when they do stuff like this.
I didn't think to check all the datacenters, and now it's back, so I'm not sure.
By gone, I mean totally out of the index, not just out of the SERPs - I tried a site: search, a search for my URL, nothing - they totally vanished from Google existence.
So here's some weirdness:
If I search Google on a phrase where I should be #2 & 3, I'm not there. If I use a tool to check then I'm in the 900's.
So I run the following from a command prompt:
Tracing route to www.l.google.com [188.8.131.52]
If I put that IP into my web browser and do a search I find that my listings are exactly where I would expect them. NOW THAT'S WEIRD!
This doesn't always work -- it depends on the IP I use. There are just two points here:
1. Google changes the IP I get throughout the day and even from computer to computer in my office. So when I search I get different results all the time.
2. Checking an IP (datacenter) directly does not always show me the same as if I use www to search Google.
My guess is that when I use WWW it gives me results from some random index based on load sharing rather than the datacenter I traceroute. Then again, maybe each datacenter has multiple indexes running right now!
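That rotation is easy to see by resolving the hostname yourself: Google's www name sits behind round-robin DNS, so repeated lookups can hand back the IP list in a different order. A quick sketch (resolving "localhost" in the demo line so it runs without network access - swap in www.google.com to see the real list):

```python
import socket

def resolve_all(hostname):
    """Return every IPv4 address the resolver currently gives for a hostname."""
    name, aliases, ips = socket.gethostbyname_ex(hostname)
    return ips

# Round-robin DNS means repeated lookups can return the addresses in a
# different order, so a plain browser search may land on a different
# datacenter from one request to the next.
print(resolve_all("localhost"))
```

That would explain point 1 above: your browser is bounced between datacenters by DNS, while a traceroute pins down just one of them.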
I just searched:
Hit refresh a few times and it gave me different results. In one result set I was there, after a few refreshes I was gone from the listings.
I tried it earlier and no amount of refreshes worked -- then I tried again and it worked.
So Google is bouncing users around to different indexes on the same datacenter. I guess checking datacenters is no longer accurate.