
Google SEO News and Discussion Forum

June 27th, August 17th, What's happening?
Pages are vanishing and reappearing -- even whole sites
DeROK

5+ Year Member



 
Msg#: 3055209 posted 3:15 am on Aug 22, 2006 (gmt 0)

< a continued discussion from these threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...] >

-----------------------------------------------

Hey everyone,

I've been reading through the June 27th/August 17th threads, and I was wondering if somebody could clarify what's actually going on.

Like many of you on the board, my site got trashed in the SERPs on June 27th only to recover a month later. At the time, I thought I had incurred a penalty and went to painstaking lengths to remove even the most minute possible violations. I thought that correcting those problems was the reason I recovered.

So needless to say, I was pretty upset when I got trashed again around the 17th when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I was not the only one who has been experiencing this type of problem.

Here are my questions. If any of you can shed some light on these, I would really appreciate it.

1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?

2. Can I expect a recovery similar to the one I had in July?

3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?

Thanks for your time!

[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]

 

schalk

10+ Year Member



 
Msg#: 3055209 posted 9:08 am on Aug 24, 2006 (gmt 0)

I am asking myself the same question: what is happening?

We have an e-commerce site with thousands of pages. In the April hit we lost everything to supplemental. We then recovered after a few weeks, but still kept some pages in supplemental. Ever since then we have slowly been losing more pages, and we seemingly took a major hit on August 17th.

We have a lot of product pages that have very little content, though some have massive content. I can't see any pattern in which pages are going supplemental, since we are losing some good pages with loads of content and keeping bad ones with little content.

The worry is that we do repeat some paragraphs on common pages, for shipping details and the returns policy. Could I be triggering a duplicate content filter by repeating this sort of thing? On this basis I don't really know whether I have truly been hit because of duplicate content. It feels as though we have, but I can't see a definite pattern.

Question 1:

I could really do with someone confirming what is truly classed as Duplicate Content. There must be a lot of people out there running template product pages, faced with the same problem.

Question 2:

Am I really seeing a drop in pages, or has the way the site: command counts pages changed? I think I read from Matt Cutts that they were possibly modifying this count to make it more accurate. My feeling is I am seeing both: more pages going supplemental, and a more accurate count of pages when using the site: command.

Question 3:

How do I get these pages out of supplemental? If I make changes, can I expect them to reappear?

mbucks

5+ Year Member



 
Msg#: 3055209 posted 9:56 am on Aug 24, 2006 (gmt 0)

Hi Schalk.

Those are the golden questions that I've been trying to get answers to for weeks: good content being lost to supplemental.

And in answer to a previous post, we have many PR 4 pages in supplemental too. So this appears to be unrelated to PageRank.

schalk

10+ Year Member



 
Msg#: 3055209 posted 10:45 am on Aug 24, 2006 (gmt 0)

It would be nice to see some feedback from GoogleGuy on this subject; I'm not sure if he has posted recently.

cleanup

10+ Year Member



 
Msg#: 3055209 posted 10:56 am on Aug 24, 2006 (gmt 0)

I too have lost pages and took a big hit on another site August 17th.

No apparent reason as far as I can see; I have checked links, content, metas, etc. ad infinitum...

Just one comment, though, on missing pages and Google problems in general.

Whilst everyone is running around like headless chickens looking for on-page and off-page factors on their sites, I have a question.

Remember that a year ago Google alluded to its database storage problems.

Big Daddy was supposed to cure "infrastructure problems"

So do we now all have 100% confidence that Google has fixed all of ITS problems?

I for one do not. I am not wasting any more time looking for site problems when I can see some of my pages getting hit and some not, in what appears to be a random fashion, coming and going on the so-called refreshes every month or so.

Once again, I think folks here are giving Google too much credit, trying to look for logic behind what seems to be just an ongoing firefighting exercise on the part of Google's technical department.

IMHO of course..

[edited by: cleanup at 11:00 am (utc) on Aug. 24, 2006]

Tomseys

10+ Year Member



 
Msg#: 3055209 posted 10:58 am on Aug 24, 2006 (gmt 0)

Has anyone had their sites de-indexed over the past couple of days? I am missing some sites and have written to Google, and I have gotten no response.

[edited by: Tomseys at 11:25 am (utc) on Aug. 24, 2006]

Tomseys

10+ Year Member



 
Msg#: 3055209 posted 11:18 am on Aug 24, 2006 (gmt 0)

I just noticed that Google dropped Matt Cutts' site too. So I guess this is some sort of issue G is having?

zeus

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 11:42 am on Aug 24, 2006 (gmt 0)

I also have a lot of pages gone supplemental. As a test, I have now removed them from the server, so they return a 404, but they are still in the index.

I also have another site which is all supplemental apart from the index page. The stupid thing is that EVERY page is unique -- maybe 5% is the same content, and the rest is unique on each page. Then I read that sometimes if you write a different description on each page you should come back out of supplemental, which I have now done on this site, but nothing.

Google, drop this supplemental and omitted stuff and get a bigger hard disc :)

Northstar

5+ Year Member



 
Msg#: 3055209 posted 11:53 am on Aug 24, 2006 (gmt 0)

My active pages aren't supplemental, just buried under thousands of old 404 supplemental pages. During the "fix" from July 27th to Aug. 17th, my active pages were all listed at the top, followed by the thousands of 404 supplementals. Is this what the rest of you are seeing?

montefin

5+ Year Member



 
Msg#: 3055209 posted 12:07 pm on Aug 24, 2006 (gmt 0)

It would be nice to see some feedback from GoogleGuy on this subject, not sure if he has posted recently?

schalk,

I, like others, have seen my site battered like a pinata by Google since April. As a native New Yorker, I can buy into the old ConEdison line "Dig we must for a better New York" -- for a while.

But at this point, if GoogleGuy is anybody below Sergey Brin or Larry Page, I'm not sure his/her posting would be meaningful.

For my part, I'll hold on till the end of Summer -- which as we all know is September 22nd.

Only 29 days and counting.

petehall

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3055209 posted 12:14 pm on Aug 24, 2006 (gmt 0)

I have a new site to launch soon and am too scared to do it thanks to the current mess.

This is ridiculous, to say the least.

If we didn't care about Google, it could go live without any trouble whatsoever.

dtcoates

5+ Year Member



 
Msg#: 3055209 posted 12:15 pm on Aug 24, 2006 (gmt 0)

schalk,

Your situation EXACTLY matches mine. Here's what I have changed, if it helps:

1) Removed ANY possibility of search engines finding pages for URLs other than the ones I want (non-WWW, HTTPS versions, other domains pointing to the same webspace). I checked using Xenu and it found 18,000 links on my site; after the fixes, only the links I wanted showed up. Did this using .htaccess and robots.txt, plus absolute links and rel="nofollow" to hide the secure checkout

2) Removed all our test sites from the server (some with valid robots.txt files, but still in Google) that may have duplicate content

3) Used the URL removal tool to attempt to remove ANY potential duplicate content from 1) and 2) - and there was plenty!

4) Made sure every page returns the correct server header (200, 301, 404 as applicable) and removed a load of redundant 301s (we had several thousand after a site move in April) - see the quick header-checking sketch after this list

5) Used Copyscape to find sites that have stolen our content - I found one that copied about half of our site and have reported them to Google. They even left our links in place - they go straight back to our site! If Google will remove them, I'm considering legal action

6) We're considering revising generic product descriptions as we've found many other sites that are using the same text as us - you know how it is, you just use the text from the brochure for speed, but that could now be a problem

7) Completely randomised the meta keywords for products in the same category, so that although they use a common "pool", they are all different

8) Looked at repetition and density of certain words, like VAT - on one page it was there 104 times so I've changed the code to cut it down

9) Removed some of the repeated stuff - we had "Useful info" text that could appear on many similar pages, so we've minimised it

10) Stepped up link building efforts (articles, blogs etc.)

11) Submitted regular Sitemaps - although some say they could be a bad thing; don't know

12) Also considering cutting down or mixing up the content of alt / title tags on images and buy buttons - currently they're all a bit similar and I'm worried about "over optimisation"
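
Re point 4: here is a rough Python 3 sketch (stdlib only) of the kind of header spot-check I mean. The URLs are made-up placeholders, so substitute pages from your own site; it just verifies that canonical pages return 200, that the non-www variant sends a 301, and that removed pages send a 404, without following redirects.

import urllib.error
import urllib.request

# (url, expected status) - placeholder URLs, swap in your own
CHECKS = [
    ("http://www.example.com/", 200),
    ("http://example.com/", 301),                   # non-www should 301 to the www version
    ("http://www.example.com/removed-page/", 404),  # deleted pages should really 404
]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Suppress redirect following so we see the raw 301/302 status."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def fetch_status(url):
    opener = urllib.request.build_opener(NoRedirect())
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as err:
        return err.code   # 301 / 404 etc. arrive here when redirects are not followed

for url, expected in CHECKS:
    actual = fetch_status(url)
    flag = "OK " if actual == expected else "FIX"
    print(flag, url, "->", actual, "(expected", str(expected) + ")")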

I have wasted so much time on this for the last four months, reading forum post after forum post, wondering what the hell I've done. Just hoping that it'll all be over, but not too confident.

Darren

trinorthlighting

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3055209 posted 12:17 pm on Aug 24, 2006 (gmt 0)

I would say this all has to do with the preferred domain setting. They must not have properly tested it before they rolled it out.

If you log on to Sitemaps, you will see the function is not working.

night707

5+ Year Member



 
Msg#: 3055209 posted 12:33 pm on Aug 24, 2006 (gmt 0)

... and in answer to a previous post, we have many PR 4 pages in supplemental too ...

For my site, utterly unimportant frames had been featured at Google, resulting in "unimportant" ratings inside Google Analytics for that site, whilst its PR 4 and 5 pages went supplemental and traffic from google.com went to ZERO on Aug 17.

Before that, several pages with totally different content had been at Google's No. 1 for particular KWs. Now ALL Google.com traffic is gone in one day, just like on June 27 and after Bourbon in 2005.

night707

5+ Year Member



 
Msg#: 3055209 posted 12:39 pm on Aug 24, 2006 (gmt 0)

Dead Elvis wrote:

>> I think some of you are barking up the wrong tree here ;)

I've got PR 5 & 6 sites that have been hit... lost traffic on June 27th, got it back July 27th, lost it again on August 17th.

These are both original content sites: they have unique meta keywords & descriptions on all pages, they validate, and they have plenty of organic links, i.e. they are exactly what Google tells us to build: quality sites.

Personally, I'm lost. <<

Hey, exactly the very same is happening to me. Same dates, same problem. Google engineers are doing real damage to publishers and user experience.

crobb305

WebmasterWorld Senior Member crobb305 is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 1:23 pm on Aug 24, 2006 (gmt 0)

My oh my, I am very disappointed by the amount of subdomain spam and redirect URLs that are sneaking into the index. Are these somehow getting past the "sandbox"? On one of my search queries, I see 4 of these in the top 10. The index has taken a drastic downturn in the past 2 weeks. I already notified Google of this by using the report-spam option inside Sitemaps. Nothing has happened yet.

I am very hopeful to see some changes real soon.

tedster

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 2:43 pm on Aug 24, 2006 (gmt 0)

Is there something in between - like they catch you, add a "4 weeks probation tag" and then let you back in - just to avoid too much tinkering?

I heard from Matt Cutts at a PubCon -- sadly, I have no pictures of me and "my good buddy" :) -- that he hoped to do something like this some day - algorithmically apply some penalties, and remove them algorithmically, but BY DEGREES, if the issue was fixed. And I certainly see behavior in today's SERPs that suggests this kind of thing is in place.

So that "automated penalty removal by degrees" would be something like your probation tag.

ScarlettR8

10+ Year Member



 
Msg#: 3055209 posted 5:14 pm on Aug 24, 2006 (gmt 0)

To extend this discussion into more avenues that we, as webmasters of legit sites, can take in this current search engine mayhem... is everyone doing their part to help Google battle spammy directories and MFA sites by using the Competitive Ad Filter to block their display in AdSense ads on our sites? If we all keep up with this, particularly those of us with high-ranking (high-traffic) sites and AdSense, won't that cut their income source (and help ours) and slow the proliferation? It may be simplistic, but it has a certain logic, and I've seen good results since I've become more conscientious about blocking them. Thoughts?

g1smd

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 5:22 pm on Aug 24, 2006 (gmt 0)

>> I also have a lot of pages gone supplemental, now I have removed them from the server, so it is a 404, but they are still in the index, this was a test. <<

I keep on saying this, and I will keep on repeating it.

1. You cannot remove Supplemental Results from the index. They stay there for a year, then Google drops them in their own time if the page no longer exists, or the page content is now vastly different to the Supplemental Result.

2. By counting supplemental results you are looking at the wrong thing.

3. You should make sure that for supplemental pages that are 404, the status code really is 404, and that for URLs that issue a redirect, it really is a 301 redirect. If that is the case, then Google will drop those supplemental results after a year. I have seen those sorts of results dropped in the last 3 "updates": late last year, earlier this year, and last week. The latest supplemental "update" has only occurred on gfe-eh and has not yet spread to other datacentres.

4. For other content pages, you should make sure that there is only one URL for each piece of content, and that each page has a unique title and a unique meta description (a rough checking sketch follows this list). If you have multiple URLs for the same content then you must use robots.txt or the meta robots noindex tag to get the duplicates out of the index. The duplicates will show as supplemental results for a year or more, but IGNORE them. Make sure that the Canonical URL is fully indexed.

5. Make sure that anyone that clicks any result that is supplemental does get to the correct content either via a 301 redirect, or via an error page that contains enough navigation to help the user on their way.

6. Stop counting supplemental results. Look at what you have that is fully indexed. Fix internal navigation. Reduce duplicate content. Maximise the number of pages that are fully indexed. Stop counting supplemental results. Google drops them when they want to, not on your schedule.
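
If you want to automate the title and description check in point 4, here is one rough way to do it with a small Python 3 script (stdlib only) run over a saved copy of your site. The "site_snapshot" folder name is just a placeholder for wherever you keep a crawl of your own pages.

import collections
from html.parser import HTMLParser
from pathlib import Path

class HeadExtractor(HTMLParser):
    """Pull the <title> text and the meta description out of one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = (attrs.get("content") or "").strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data.strip()

titles = collections.defaultdict(list)
descriptions = collections.defaultdict(list)

for path in Path("site_snapshot").glob("**/*.html"):   # placeholder folder of saved pages
    parser = HeadExtractor()
    parser.feed(path.read_text(errors="ignore"))
    titles[parser.title].append(path)
    descriptions[parser.description].append(path)

for label, index in (("title", titles), ("description", descriptions)):
    for text, pages in index.items():
        if len(pages) > 1:
            print("Duplicate", label + ":", repr(text), "used on", len(pages), "pages")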

g1smd

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 5:35 pm on Aug 24, 2006 (gmt 0)

dtcoates: Do you have the same troubles at gfe-eh.google.com or is that one different?

g1smd

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 5:39 pm on Aug 24, 2006 (gmt 0)

soapystar: Make sure that every page has a unique title tag, and a unique meta description. See comments by Matt Cutts [threadwatch.org] for more information.

dtcoates

5+ Year Member



 
Msg#: 3055209 posted 6:13 pm on Aug 24, 2006 (gmt 0)

g1smd,

Yes, at this point in time I am seeing almost identical results for some searches and even worse results for others (unfortunately).

For site:www.domain.co.uk, there are only 60 pages and 14 are supplemental. On a normal search, there are 172 pages, 136 of which are supplemental. We actually have over 1,000 pages, and on July 27th, 970 were non-supplemental.

Additionally, our supplemental pages aren't old ones that I want to see the back of - they are our current catalogue pages.

I am seriously worried about traffic if all the datacenters start to look like gfe-eh...

Darren

g1smd

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 6:36 pm on Aug 24, 2006 (gmt 0)

You have checked site:domain.com too? Without www.

Dead_Elvis

5+ Year Member



 
Msg#: 3055209 posted 6:39 pm on Aug 24, 2006 (gmt 0)

>> I keep on saying this, and I will keep on repeating it.

1. You cannot remove Supplemental Results from the index. <<

Yes you can, it's called the Page Removal tool, and it works quite well. Of course, you need to make sure you block Google from coming back to those pages in the future, but the tool does work.

I've kept my site clean of supplemental results this way, but it hasn't stopped me from being affected by this fiasco.

dtcoates

5+ Year Member



 
Msg#: 3055209 posted 6:52 pm on Aug 24, 2006 (gmt 0)

g1smd,

Yep, pretty much identical results from both www and non-www on both normal and gfe-eh datacenters.

But what does it all mean?

>> 4. For other content pages, you should make sure that there is only one URL for each piece of content, and that each page has a unique title and a unique meta description. If you have multiple URLs for the same content then you must use robots.txt or the meta robots noindex tag to get the duplicates out of the index. The duplicates will show as supplemental results for a year or more, but IGNORE them. Make sure that the Canonical URL is fully indexed.

5. Make sure that anyone that clicks any result that is supplemental does get to the correct content either via a 301 redirect, or via an error page that contains enough navigation to help the user on their way.

6. Stop counting supplemental results. Look at what you have that is fully indexed. Fix internal navigation. Reduce duplicate content. Maximise the number of pages that are fully indexed. Stop counting supplemental results. Google drops them when they want to, not on your schedule. <<

I've done all this, but I only completed it on Friday 19th August (unique links) and only definitely removed all duplicate content this week. I already had unique titles / descriptions, but fixed an issue with generic meta keywords this week as well. Obviously (hopefully) it's now a waiting game? What do you think?

My head is spinning Exorcist style with all this...

Thanks,

Darren

egomaniac

10+ Year Member



 
Msg#: 3055209 posted 8:08 pm on Aug 24, 2006 (gmt 0)

schalk said:

I could really do with someone confirming what is truly classed as Duplicate Content. There must be a lot of people out there running template product pages, faced with the same problem.

Simply putting a paragraph or two of copy that is identical across your site is not considered Duplicate Content.

From what I understand, you need to have near 100% duplicate page content to have a duplicate content problem.

Once I duplicated an optin page of mine at two places on my website without thinking about what I was doing. Both pages disappeared from the SERPS until I removed the duplicate one. Then the original recovered its former rankings.

Currently I have dozens of articles that rank extremely well in spite of the fact that I have also syndicated these articles for other publishers to reprint on their websites. These articles have been reprinted hundreds of times, yet they don't cause a duplicate content penalty.

From what I understand, duplicate content penalties require the body content AND the template to be nearly 100% identical.
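
Nobody outside Google knows the exact threshold, so treat this only as a rough sanity check, but if you want a feel for how similar two of your own pages are, a few lines of Python 3 (stdlib only; the file names are placeholders) will strip the markup and compare the visible text.

import difflib
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Collect just the visible text nodes from an HTML page."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def visible_text(filename):
    parser = TextOnly()
    with open(filename, errors="ignore") as handle:
        parser.feed(handle.read())
    return " ".join(parser.chunks)

# placeholder file names - point these at two saved pages from your own site
page_a = visible_text("product-a.html")
page_b = visible_text("product-b.html")

ratio = difflib.SequenceMatcher(None, page_a, page_b).ratio()
print("Rough text similarity: {:.0%}".format(ratio))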

thms

5+ Year Member



 
Msg#: 3055209 posted 8:30 pm on Aug 24, 2006 (gmt 0)


Once I duplicated an optin page of mine at two places on my website without thinking about what I was doing. Both pages disappeared from the SERPS until I removed the duplicate one. Then the original recovered its former rankings.

Why would both pages disappear because of duplicate content? Why isn't only the duplicate page affected?

If what you say is true, then I could link to my competitor's site and append a dummy query string to the URL, thus creating a duplicate content page on my competitor's site, like:

[competitor-domain.com...]

so the SEs will see a duplicate page and remove both index.php and index.php?a=1?

steveb

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3055209 posted 8:35 pm on Aug 24, 2006 (gmt 0)

"Yes you can, it's called the Page Removal tool, and it works quite well."

No, that does absolutely nothing. It just hides them. They come back in six months, but even when not displayed, they are there.

You can't get rid of supplementals, ever. You can just hide them.

schalk

10+ Year Member



 
Msg#: 3055209 posted 8:52 pm on Aug 24, 2006 (gmt 0)

Whilst trying to get to the bottom of my potential duplicate content issue, I discovered that a lot of our meta "description" tags had the same first 4 words or so. Every page had a different description, but the first 4 words were the same. So I then thought this was the problem and made sure that the first few words were unique. That was over 4 weeks ago and I have seen no improvement. I basically learned nothing by trying to tweak the site.
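
For anyone who wants to run the same kind of check, here is a tiny Python 3 sketch of the idea. The data below is made up; in practice, feed in your real URL-to-description list however you collect it (for example, with the head-scanning sketch earlier in the thread).

import collections

def opening_phrase(description, words=4):
    """Return the first few words of a description, normalised for comparison."""
    return " ".join(description.lower().split()[:words])

# placeholder data - replace with {url: meta description} gathered from your own site
descriptions = {
    "/widgets/red": "Buy quality widgets online. Red widget, 10mm.",
    "/widgets/blue": "Buy quality widgets online. Blue widget, 12mm.",
}

by_prefix = collections.defaultdict(list)
for url, desc in descriptions.items():
    by_prefix[opening_phrase(desc)].append(url)

for prefix, urls in by_prefix.items():
    if len(urls) > 1:
        print(len(urls), "pages open their description with:", repr(prefix))
        for url in urls:
            print("   ", url)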

If we are to live with pages going supplemental, I could cope with this, if I knew why they went supplemental in the first place.

For me what happened in April is key to the situation we are in now, which I presume is down to Big Daddy.

But I still don't know whether it's me at fault, or whether Google is just in one big mess.

selomelo

5+ Year Member



 
Msg#: 3055209 posted 9:40 pm on Aug 24, 2006 (gmt 0)

I discovered that a lot of our meta "description" tags had the same first 4 words or so.

I have a relevant observation here.

One of the most popular free dictionaries on the net uses a "template" description for every word/idiom/expression in its database. The template is as follows:

"Definition of KW in the Idioms Dictionary. KW phrase. What does KW expression mean? Definitions by the largest Idiom Dictionary."

The distinct dictionary entry (KW) is repeated 3 times in this short description. And it is the same for all entries, except for the entries themselves.

And this dictionary site dominates the first pages of the SERPs for literally thousands of words/phrases, especially if you add such keywords as "dictionary," "idioms," "expression," "mean," etc.

So, IMHO, "description" may not be a critical factor in duplicate content problems, at least when considered alone.

trinorthlighting

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3055209 posted 10:05 pm on Aug 24, 2006 (gmt 0)

Supplemental listings are not necessarily bad. A good example is repair parts.

trinorthlighting

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3055209 posted 10:52 pm on Aug 24, 2006 (gmt 0)

g1smd,

A factor you are forgetting is how often a keyword is searched on Google. That plays into the formula a lot.

