Google SEO News and Discussion Forum

17 Aug - Supplemental again
Northstar
msg:3049887 · 1:29 pm on Aug 17, 2006 (gmt 0)

My site went back to being all supplemental again, just like during the June 27 to July 27 mess. My traffic also dropped back down. Is anyone else having this problem again?

 

zap995
msg:3049947 · 2:07 pm on Aug 17, 2006 (gmt 0)

I know this is a newb question, but for the life of me I can't figure out how to determine the number of supplemental vs. regular pages. How do you do this?

pgrote
msg:3049948 · 2:08 pm on Aug 17, 2006 (gmt 0)

I can confirm that it happened to our sites ... again. I wonder how long it's going to take to fix this one.

At least Yahoo tells you when something is going on. I haven't seen any communication from Google that something has happened.

mbucks
msg:3049959 · 2:14 pm on Aug 17, 2006 (gmt 0)

Yep, exactly the same here. Only the home page is not supplemental.

Datacentres all over the place again.

Serps for our most popular keywords are generally rubbish.

dtcoates
msg:3049969 · 2:20 pm on Aug 17, 2006 (gmt 0)

Hi, Guys,

This is my first post, so hello there.

We have been having a nightmare too - it all ties in with the dates mentioned, and our pages were mostly non-supplemental until today. Then, wham, most pages gone and our visitor numbers devastated.

One of our competitors was affected the first time around, but has stayed rock solid ever since - I wish we were so lucky. I can't see any real difference, apart from the fact that they have a PR4 and we are PR3. I've been fixing things like 404 server responses, uploaded a new sitemap, and removed old 301 redirects. I can't think of anything else to fix.

I've been searching for months for solutions to this but, like most people, have had no real luck. Any meaningful advice gratefully accepted.

Regards,

Darren

g1smd
msg:3049972 · 2:22 pm on Aug 17, 2006 (gmt 0)

The datacentre at [gfe-eh.google.com ] has results that are very different as of today...

I see another big clean-up in Supplemental Results there.

JoeSinkwitz
msg:3049978 · 2:25 pm on Aug 17, 2006 (gmt 0)

g1smd,

I actually see some sites entirely supplemental on gfe-eh that would normally show at least some non-supplemental pages. The SERPs are a bit more tweaked there too... more emphasis on title or something.

Cygnus

Bewenched
msg:3049992 · 2:33 pm on Aug 17, 2006 (gmt 0)

I'm seeing that we are gaining non-supplemental pages, and today we lost about 100k of the previously supplemental ones. We did implement a 301 redirect to force www., and set our sitemaps to the www.domain as well. We also made sure that the site could not be spidered under SSL... that was a killer for us this past spring.
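For illustration, a minimal way to verify that kind of setup is a short script that requests each URL variant without following redirects and confirms it answers with a 301 to the canonical www URL. This is a sketch only, using Python's standard library; example.com stands in for the real domain.

import http.client
from urllib.parse import urlsplit

# The one URL everything should resolve to (placeholder domain).
CANONICAL = "http://www.example.com/"

# Variants that should all answer with a 301 pointing at CANONICAL.
VARIANTS = [
    "http://example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

def check_redirect(url, expected):
    """Request a URL without following redirects and report its status."""
    parts = urlsplit(url)
    conn_class = (http.client.HTTPSConnection if parts.scheme == "https"
                  else http.client.HTTPConnection)
    conn = conn_class(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    verdict = "OK" if resp.status == 301 and location == expected else "CHECK"
    print(f"{url}: {resp.status} -> {location or '(none)'} [{verdict}]")
    conn.close()

for variant in VARIANTS:
    check_redirect(variant, CANONICAL)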

night707
msg:3049997 · 2:42 pm on Aug 17, 2006 (gmt 0)

Same problem here with a high-quality content site. Traffic has dropped again by 90%, just like on June 27. There had been a perfect recovery for a few weeks, but now it's back to the nightmare. Bad for users and deadly for a publisher.

g1smd
msg:3050014 · 2:49 pm on Aug 17, 2006 (gmt 0)

What I see on gfe-eh.google.com is that Supplemental Results from 2005 are now all gone. Previously there were many going back as far as June 2005.

I see many recently created new Supplemental Results representing content from just a few months ago. These are where the content has been edited very recently (and searches for the new content point to a non-Supplemental Result), or where on-site duplicate content existed but the alternative (non-Supplemental) URLs have now been filtered out (in one case the filtering was done with robots directives to get certain URL formats delisted).

skibum
msg:3050023 · 2:56 pm on Aug 17, 2006 (gmt 0)

Not going supplemental much but rankings are all over the place again, mostly down. Looks like a shakeup last night.


wanderingmind
msg:3050031 · 3:00 pm on Aug 17, 2006 (gmt 0)

This flip-flopping is maddening. For my main site, rankings have dropped to nowhere again. This is the third time this year: they have vanished, come back, gone again, come back... and now they're gone again.

And how long is this going to last - until the 27th of this month, or six months?!

petehall
msg:3050034 · 3:02 pm on Aug 17, 2006 (gmt 0)

I think something has gone seriously wrong.

So many clean sites have been punished; it just doesn't seem right. It must be an error.

trinorthlighting
msg:3050035 · 3:03 pm on Aug 17, 2006 (gmt 0)

There has been some data crossing over on certain data centers in the past 24 hours; just hang on.

yayagogo
msg:3050040 · 3:04 pm on Aug 17, 2006 (gmt 0)

Our established six-year-old site (PR5) was affected for the first time overnight. We were not affected during the June/July tweaks to G's algo. A search for our business name still comes up #1, but most (previously stable) interior pages are now completely buried in the search results for their keywords, under non-relevant and spammy pages. We have not done anything to the site's pages recently.

I did notice when running a site: command that very OLD pages that no longer exist were being displayed in G's search results. It is as if they were bringing in results from a year ago.

Northstar
msg:3050053 · 3:21 pm on Aug 17, 2006 (gmt 0)

There has to be something major going on at Google this year. Our site has remained fairly steady in Google for four years. Now, this year, it is like a roller coaster. All the other major search engines that we get traffic from have remained steady.

HuhuFruFru
msg:3050056 · 3:23 pm on Aug 17, 2006 (gmt 0)

Unfortunately it has also hit my site here in Germany very hard. For the last two years everything was very stable, with minor changes here and there, but this change since this morning has caused a huge traffic drop.

The site: query shows all 120,000 pages, but the main page is not in the first position.

On 72.14.207.104 the site: query now shows only 56,000 pages, and ranking is very bad.

I was not affected during the June 27th problem.

And AdSense earnings are of course way down :(


Marval
msg:3050059 · 3:24 pm on Aug 17, 2006 (gmt 0)

g1smd,

I noted the changes on that datacenter. However, I also noted that the top ten results for a bunch of "money phrases" are all subdomains of the same host (doorway pages), at least for the first 100 phrases I looked at, and it seems to be limited to two-word phrases (don't know why that's important). I have seen changes on that datacenter in the last 4 hours, as if they are bringing in more results, but they are not getting the scripted doorways out of the way.

netmeg
msg:3050071 · 3:38 pm on Aug 17, 2006 (gmt 0)

We somehow ADDED 38,000 pages to one client overnight, mostly recent supplementals, as g1smd outlined. Weird. I haven't determined whether it has affected ranking or traffic yet. (The site doesn't have anywhere near 38,000 pages; it's closer to 12,000.)

colin_h
msg:3050114 · 4:05 pm on Aug 17, 2006 (gmt 0)

G'day All,

I think it's probably a good thing that we're seeing all this flip-flopping activity from Google at the moment. I have a hunch that they will aim for a bank holiday weekend for an update, and (although it's a long shot) we in England have our August bank holiday starting Saturday 26th August. The mad scientists at Google are obviously planning something; the supplementals have shot up from the recent low of a few days ago... a sign, I think, that changes are coming soon.

On my sitemap diagnostics, I have been watching regular trawling, and the 404 report keeps going up and down. I need a complete site visit from Googlebot, and I'm hoping that next weekend will be the time. From my rather rusty memory, past visits started mid-week, with the odd bit of flux showing up on the Thursday / Friday.

Of course, the sting in the tail is that if you are not helped by the changes... it could be a long wait before the next one.

I don't know... it could all be a load of rubbish; it just makes me feel good thinking ahead.

All the Best

Col :-)

Halfdeck
msg:3050182 · 4:59 pm on Aug 17, 2006 (gmt 0)

The good thing is that at least I can't find year-old supplementals on the DC I checked, but it looks like Google is completely ignoring 301s. I removed subdomains back around March/April. The cached page seems to be of the new version in a subfolder, so Google is apparently following the 301s but filing the page under the old URL, not the new one.

Also, site:www.domain.com on gfe-eh.google.com only displays 33 pages out of 1,690 for one of my sites.

Two steps forward, one step back.


g1smd
msg:3050200 · 5:07 pm on Aug 17, 2006 (gmt 0)

The .co.uk has been redirected to the .com for more than a year, with all of the .co.uk pages listed as Supplemental for all of that time. The .com pages were all fully listed.

Today, all the .co.uk listings disappeared on gfe-eh; and the .com listings remain indexed and ranking as before.

This did not happen on previous updates; but other sites redirected earlier did update back then. It looks like Google wants to hang on to redirected URLs for at least a year for various purposes, before dropping them.

So, measure the effectiveness of your redirects not by whether the redirected URLs still appear in the SERPs, but by how well the domain that you redirected to is faring: are all pages listed, are they fully indexed, are they NOT URL-only or Supplemental, and does most of the site appear in a site:domain.com search before the "click here to see omitted results" message appears?

mbucks
msg:3049510 · 8:51 am on Aug 17, 2006 (gmt 0)


System: The following 6 messages were spliced onto this thread from [webmasterworld.com/google/3049508.htm] by tedster - 2:00 pm on Aug. 17, 2006 (EDT -4)


All 2,000 pages of our site, apart from the homepage, went supplemental.

Matt Cutts suggested deep links etc. should help the problem, and many others had similar theories, so rather than sit on my hands I had a dabble at something that I knew couldn't hurt.

I fired out a press release (we had some large announcements anyhow) linking to five pages on our site. Not expecting much short term, I was surprised to see all five pages back, fully indexed, with an up-to-date version of each page (Aug 11).

An incredible result that seemed to prove that more links would solve it all.

EXCEPT: this morning I checked, and all the pages have gone back to supplemental with a cached date of 29th May!

Well, frustration reigns yet again. How poor is this for the users? It's pretty pathetic, Google. The profits you guys make are, in my humble opinion, hugely disproportionate to the quality of service you provide.

Internet users deserve better results than you're serving up.

You're fortunate you're still operating in an immature industry, but you will be found out. I hope that before long a player with sufficient resources gets involved, because there's relatively easy money to be had.

Far easier than running an airline.

texasville
msg:3050057 · 3:23 pm on Aug 17, 2006 (gmt 0)

I strongly suspect that what you did was a short-term fix. I think the IBLs (inbound links) to the five pages helped and got them out of the supps, until Google matched up the press releases and downgraded them as duplicate content, thus decreasing the value of those links and putting the pages back into the supps.
To get pages out of the supps permanently, I strongly suspect it will take good inbound links that remain static and not based on articles or press releases that will be deemed dup content.

g1smd
msg:3050076 · 3:39 pm on Aug 17, 2006 (gmt 0)

Check out what is going on at [gfe-eh.google.com ] - it is getting close to the time when what happens there is going to be important.

I would expect Matt Cutts to start making comments on it in just a few weeks' time, judging from the recent hints on his blog.

mbucks
msg:3050225 · 5:22 pm on Aug 17, 2006 (gmt 0)

"To get pages out of the supps permanently, I strongly suspect it will take good inbound links that remain static and not based on articles or press releases that will be deemed dup content."

But the point is: why is it in the supps in the first place? That's the million-dollar question. At the end of the day, it's unique content that's relevant to potential searches.

Why not just index pages and give them a fighting chance in the index, putting what it deems to be the most relevant up top?

Is it not possible that Google can't actually cope at the minute, and that it's not the be-all and end-all of search engine technology?

g1smd
msg:3050237 · 5:27 pm on Aug 17, 2006 (gmt 0)

Remember that there are several types of Supplemental Results.

For a page that goes 404, or whose domain expires, Google keeps a copy of the very last version of the page that it saw as a Supplemental Result, and shows it in the index when the number of other pages returned is low. The cached copy will be quite old.

For a normal site, the current version of the page should be in the normal index, and the previous version of the page is held in the supplemental index.

If you use search terms that match the current content, then you see that current content in the title and snippet, in the cache, and on the live page.

If you search for terms that were only on the old version of the page, then you see those old search terms in the title and snippet, even though they are not in the cache, nor found on the live page. That result will be marked as supplemental.

There are also Supplemental Results where the result is for duplicate content of whatever Google considers to be the "main" site. These results seemingly hang around forever, with an old cache that often no longer reflects what is really on the page right now. Usually there is no "normal" result for that duplicate URL - just the old supplemental, based on the old data. On the other hand, the "main" URL will usually have both a normal result and a supplemental result (but not always).

If you have multiple URLs leading to the same content, "duplicate content", some of the URLs will appear as normal results and some will appear as Supplemental Results. The Supplemental Results will hang around for a long time, even if the page is edited or is deleted. Google might filter out some of the duplicates, removing them from their index: in that case what is left might just be a URL that is a Supplemental Result.

The fix for this is to make sure that every page has only one URL that can access it; make sure that any alternatives cannot be indexed. Run Xenu LinkSleuth over the site and make sure that you fix every problem found. Additionally do make sure that you have a site-wide 301 redirect from non-www to www as that is another form of duplicate content waiting to cause you trouble.

Also, make sure that every page has a unique title tag and a unique meta description, as failing to do so is another problem that can hurt a site.
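That last check is straightforward to automate. The following is a minimal sketch, standard library only, assuming you supply your own list of page URLs (the two below are placeholders): it fetches each page, extracts the title and meta description, and flags any value shared by more than one URL.

from collections import defaultdict
from html.parser import HTMLParser
import urllib.request

class HeadParser(HTMLParser):
    """Pull the <title> text and meta description out of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Pages to audit (placeholders, not a real site list).
urls = [
    "http://www.example.com/",
    "http://www.example.com/widgets.html",
]

titles = defaultdict(list)
descriptions = defaultdict(list)
for url in urls:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    parser = HeadParser()
    parser.feed(html)
    titles[parser.title.strip()].append(url)
    descriptions[parser.description.strip()].append(url)

for text, pages in list(titles.items()) + list(descriptions.items()):
    if len(pages) > 1:
        print(f"Shared by {len(pages)} pages: {text!r}")
        for page in pages:
            print(f"  {page}")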

SuddenlySara
msg:3050263 · 5:40 pm on Aug 17, 2006 (gmt 0)

Is there a program out there as good as Xenu that can ignore image files altogether when checking a site for errors, orphans, and such?

mbucks
msg:3050347 · 6:27 pm on Aug 17, 2006 (gmt 0)

"If you have multiple URLs leading to the same content ... make sure that every page has a unique title tag and a unique meta description."

Many thanks for that thorough reply, a nice summary of possible problems.

What do you suggest if none of these issues applies? All I can think of is that when we launched we weren't aware of the www / non-www problem, but we put this right with a 301 redirect months ago.

I may be over-simplifying the whole issue, but this supplemental business seems very over-complicated, and it's not surprising that Google's indexing is in such a mess.

Why not index everything, and rank what it deems more relevant above the less relevant for the search phrase entered? It seems a massive waste of resources to cache different versions of the pages and then revert to old versions.

Let's say that Google deems a page duplicate enough to make it supplemental (wrongly, in some cases), and therefore punishes the content of that page and doesn't return it for very specific search phrases. What if that page were the only page on the web that contained information on that search term?

The user isn't getting the best result possible.

g1smd
msg:3050396 · 6:46 pm on Aug 17, 2006 (gmt 0)

If I make a search this week and see a particular result, and then next week that site goes offline or the page content changes, Google holds that result for many months, and keeps a cache of it as well, just in case I look for the site again.

They do this whether the page is 404, the domain expired, or the page content has been edited and updated: they keep an old copy for many months as a Supplemental Result.

Every URL for an active site has the new content in the normal index, and the old content in the Supplemental index.

For a non-active site/URL, the last known content is moved to the supplemental index a few weeks after the site/URL is no longer active.

When duplicate content is also dropped into the equation, things become very messy. Some URLs for an active site will already be Supplemental, and others will be filtered out as duplicates. This happens in several very specific circumstances: when a domain has content showing as "200 OK" at both www and non-www URLs; when a site has several domains (perhaps .com and .co.uk) all pointing to the same content with "200 OK" status for all URL variations; and when a "page" has several valid URLs that can reach it (like the ten or more variations for each thread in a forum like phpBB or vBulletin, or the navigation by various searches in shopping carts, where the path followed builds the URL and each product has dozens of different paths, and hence URLs, that reach it).

Having one canonical URL for every piece of content (especially for dynamic sites like forums and carts), 301 redirects from non-www to www, and excluding URLs that can only deliver proper content to users that are logged in (like "post new thread" and "reply" and "send PM" URLs in a forum) is key.
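As a sketch of what that means in application code (illustrative only; the parameter names are invented, not taken from any particular forum or cart package), the idea is to compute one canonical form for every requested URL and issue a 301 whenever the request does not already match it:

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that vary the path to the content rather than the
# content itself (hypothetical names; real packages use their own).
NAVIGATION_PARAMS = {"sid", "sessionid", "sort", "highlight", "ref"}

def canonical_url(url):
    """Collapse URL variants onto one canonical form: force the www
    hostname, drop session/navigation parameters, strip fragments."""
    scheme, netloc, path, query, _ = urlsplit(url)
    if not netloc.startswith("www."):
        netloc = "www." + netloc
    kept = [(k, v) for k, v in parse_qsl(query) if k not in NAVIGATION_PARAMS]
    return urlunsplit((scheme, netloc, path or "/", urlencode(kept), ""))

def redirect_target(request_url):
    """Return the 301 target if the requested URL is not canonical."""
    target = canonical_url(request_url)
    return target if target != request_url else None

# Two forum-style variants of the same thread collapse to one URL:
print(redirect_target("http://example.com/thread.php?t=42&sid=abc123"))
# -> http://www.example.com/thread.php?t=42
print(redirect_target("http://www.example.com/thread.php?t=42"))
# -> None (already canonical)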

Matt Cutts was right all along; it's just that some of the Google-speak was too cryptic to understand what the long-term implications of some things really were. I can see that there are certain types of spam that these actions can severely cripple, as well as legitimate sites where the owner does not take enough care with the site architecture, or cannot interpret the symptoms of what is going wrong.

Just to say that these searches are very important (comparing the counts from the inurl:www and -inurl:www pairs quickly shows whether both hostname versions of your site are indexed):

site:domain.com
site:domain.com inurl:www
site:domain.com -inurl:www
site:www.domain.com
site:www.domain.com inurl:www
site:www.domain.com -inurl:www
