
Google SEO News and Discussion Forum

This 182 message thread spans 7 pages: < < 182 ( 1 2 3 [4] 5 6 7 > >     
June 27th, August 17th, What's happening?
Pages are vanishing and reappearing -- even whole sites
DeROK




msg:3055211
 3:15 am on Aug 22, 2006 (gmt 0)

< a continued discussion from these threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...] >

-----------------------------------------------

Hey everyone,

I've been reading through the June 27th/August 17th threads, and I was wondering if somebody could clarify what's actually going on.

Like many of you on the board, my site got trashed in the SERPs on June 27th only to recover a month later. At the time, I thought I had incurred a penalty and went to painstaking lengths to remove even the most minute possible violations. I thought that correcting those problems was the reason I recovered.

So needless to say, I was pretty upset when I got trashed again around the 17th when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I was not the only one who has been experiencing this type of problem.

Here are my questions. If any of you can shed some light on these, I would really appreciate it.

1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?

2. Can I expect a recovery similar to the one I had in July?

3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?

Thanks for your time!

[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]

 

fjpapaleo




msg:3057974
 10:43 pm on Aug 23, 2006 (gmt 0)

"I'd like someone to show me some supplemental pages with a half decent PR of say 3 or above"

This has nothing to do with it. I literally have thousands, some PR 4 and 5 as well.

gcc_llc




msg:3057979
 10:47 pm on Aug 23, 2006 (gmt 0)

" I would assume that it is NOT in [gfe-eh.google.com...] right? "

It's for ALL DCs, including that one.

" as retrieved on Aug 19, 2005"

Just checked.

g1smd




msg:3057982
 10:48 pm on Aug 23, 2006 (gmt 0)

>> when links become archived then the link strength decreases and the page goes supplemental <<

Not quite the same thing, but related to "link strength", a friend has a 30 page site, with some pages having only minimal text content. I warned for a year that having the same meta tag on multiple pages was going to cause trouble.

The pages had been listed normally for several years, but then just a few months ago I did a site:domain.com search one day and found that several changes had recently occurred.

Only two pages showed up before the "repeat the search with the omitted results included" message appeared.

On clicking that link, many of the pages now appeared but all apart from two had turned supplemental.

A few days later, just a few pages were listed after the message was clicked. Only two were shown as normal results, and there were a few more pages listed as supplemental. The rest had disappeared from the listings.

The two normally listed pages were the only two pages of the site that had any external incoming links: the homepage and one internal page.

Upon fixing the meta descriptions on all of the pages, it was only a matter of weeks before all the pages from the site were listed fully and normally again in a site: search.

[edited by: g1smd at 10:50 pm (utc) on Aug. 23, 2006]
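The duplicate-meta check g1smd describes can be done mechanically. A minimal Python sketch (the function names here are illustrative, not a tool from the thread), assuming you already have each page's HTML on hand:

```python
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pull the content of <meta name="description"> out of an HTML page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content")

def find_duplicate_descriptions(pages):
    """pages: {url: html_source}. Returns {description: [urls]} for any
    meta description shared by two or more pages."""
    seen = defaultdict(list)
    for url, html in pages.items():
        p = MetaDescriptionParser()
        p.feed(html)
        if p.description is not None:
            seen[p.description].append(url)
    return {d: urls for d, urls in seen.items() if len(urls) > 1}
```

Any description that comes back with more than one URL is a candidate for the problem described above.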

Halfdeck




msg:3057983
 10:48 pm on Aug 23, 2006 (gmt 0)

To be or not to be supplemental.. is a question that hinges on more than one factor alone. I believe PageRank is a factor, but a PR of 8 may not save you if the page is an identical copy of a page with a PR of 10.

Google's algo is not a single IF/ELSE statement.

g1smd




msg:3057985
 10:53 pm on Aug 23, 2006 (gmt 0)

Yes there are several types of supplemental results.

I made a lot of comments about that in: [webmasterworld.com...] especially on page 2 of that thread, onwards.

hvacdirect




msg:3057987
 10:58 pm on Aug 23, 2006 (gmt 0)

Would omitting the description entirely from several pages have the same result as having duplicate descriptions?

fjpapaleo




msg:3057989
 10:59 pm on Aug 23, 2006 (gmt 0)

"I believe PageRank is a factor"

It's only a factor in that most higher-PR sites have a higher trust rank due to age and incoming links. You can get away with just about anything with an older site that has high TR. Try the same tactic with a lower-TR site and you'll get banished to the Google dungeon forever.

g1smd




msg:3057991
 11:04 pm on Aug 23, 2006 (gmt 0)

An omitted meta description causes roughly the same problems as an identical meta description.

In the omitted case, Google uses the first words on the page: often from the nav bar, and therefore often identical too.
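The fallback g1smd describes can be illustrated with a short sketch: if a page has no meta description, take the first visible words, which on most templates come from the nav bar and are therefore identical across pages. This is a hypothetical model of the described behavior, not Google's actual code:

```python
from html.parser import HTMLParser

class SnippetFallback(HTMLParser):
    """Collects the meta description if present, plus the visible body text."""
    def __init__(self):
        super().__init__()
        self.description = None
        self.text = []
        self._skip = 0  # depth inside <script>/<style>, whose text is invisible

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content")
        elif tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

def snippet(html, words=20):
    """Return the meta description, or fall back to the first visible words."""
    p = SnippetFallback()
    p.feed(html)
    if p.description:
        return p.description
    return " ".join(" ".join(p.text).split()[:words])
```

Run against two pages that share a nav bar and lack descriptions, both fall back to the same opening words, which is exactly the duplicate-description situation.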

trinorthlighting




msg:3057997
 11:08 pm on Aug 23, 2006 (gmt 0)

We have not moved an inch in the SERPs, either up or down, during all of the "data refreshes". That leads me to believe that all the sites that are flexing a bit have a borderline penalty on them due to meta tags, title tags, or incorrect HTML.

Page rank is not a factor, because we have pages ranking top 10 in their respective keywords that have a PR of 0, 1, 2

trinorthlighting




msg:3058001
 11:17 pm on Aug 23, 2006 (gmt 0)

One other thought on the flux: if you are on a shared server, and say 90% of the other sites on that server are pure spam and banned, that could be hurting you. (Extreme example.)

It always helps to know your neighbors on a shared server. We always use hosting providers that strictly prohibit any type of spamming. Does your webhost do the same?

Halfdeck




msg:3058004
 11:24 pm on Aug 23, 2006 (gmt 0)

Page rank is not a factor, because we have pages ranking top 10 in their respective keywords that have a PR of 0, 1, 2

That's like saying being 300lb overweight isn't a factor when getting dates because I have three fat relatives that got married before they hit 30. =)

gcc_llc




msg:3058011
 11:35 pm on Aug 23, 2006 (gmt 0)

"Page rank is not a factor, because we have pages ranking top 10 in their respective keywords that have a PR of 0, 1, 2"

Then your keywords aren't very competitive.

thms




msg:3058021
 11:47 pm on Aug 23, 2006 (gmt 0)


An omitted meta description causes roughly the same problems as an identical meta description.

In the omitted case, Google uses the first words on the page: often from the nav bar, and therefore often identical too.

WordPress by default omits the meta keywords and meta description, doesn't it? Matt Cutts' blog also has no meta keywords or description.

petehall




msg:3058029
 11:57 pm on Aug 23, 2006 (gmt 0)

This has nothing to do with it. I literally have thousands, some PR 4 and 5 as well.

That I would love to see. I presume you are writing unique content.

Please sticky me?!

To be or not to be supplemental.. is a question that hinges on more than one factor alone. I believe PageRank is a factor, but a PR of 8 may not save you if the page is an identical copy of a page with a PR of 10.
Google's algo is not a single IF/ELSE statement.

Exactly - it's at least 3 (lol).

I'd say anyone whose listings went supplemental as a result of copying content from another site shouldn't be moaning about supplemental results in the first place.

However, this is unfortunate for companies writing quality, unique content that have little PR to aid retention of this content in the main index.

Surely if you take a close look at supplemental pages, they must be 90% the result of low PR / poor linking structure (which in turn leads to low PR).

AustrianOak




msg:3058038
 12:09 am on Aug 24, 2006 (gmt 0)

"That leads me to believe that all the sites that are flexing a bit have a borderline penalty to them due to meta tags, title tags, or incorrect html."

Wrong. We do not know what the causes of the "penalties" are, otherwise we'd fix them. My metas, titles and HTML are 100% fine. Also, let's not assume it's a valid penalty we are getting hit with; many of us have 100% legit sites that are caught up in a Google mess of some sort.

Dead_Elvis




msg:3058055
 12:31 am on Aug 24, 2006 (gmt 0)

I think some of you are barking up the wrong tree here ;)

I've got PR 5 & 6 sites that have been hit... lost traffic on June 27th, got it back July 27th, lost it again on August 17th.

These are both original-content sites, they have unique meta keywords & descriptions on all pages, they validate, and they have plenty of organic links; i.e., they are exactly what Google tells us to build: quality sites.

This is certainly not about PR, and while the meta issue may be affecting some, it certainly isn't a factor for me.

Personally, I'm lost.

petehall




msg:3058057
 12:33 am on Aug 24, 2006 (gmt 0)

Dead_Elvis - as I said, I think this is a temporary error (I hope).

Dead_Elvis




msg:3058064
 12:36 am on Aug 24, 2006 (gmt 0)

I'm with you I hope!

walkman




msg:3058075
 12:48 am on Aug 24, 2006 (gmt 0)

This is baffling. My domain does not even rank for "domain.com" any more. The index page has plenty of high-powered backlinks, it is indexed daily, and Google "gets" about 15-50% of the links each day. Yet only about 50% of the pages are in the index, and many are supplementals. Pages are VERY different and the META is unique as well.

Until two weeks or so ago, when doing site:domain.com the home page was listed either second or third. Now it is listed on top, but the rankings still suck. When I rank #1 in Google for "domain.com" I do extremely well, so I assume an automatic penalty has been assigned to my site. Anyone else having this problem?

petehall




msg:3058077
 12:52 am on Aug 24, 2006 (gmt 0)

Yeah... Google seems a total mess at the moment.

I'd say 75+% of people will be having problems - some extreme.

trinorthlighting




msg:3058088
 1:00 am on Aug 24, 2006 (gmt 0)

The penalties could be a lot of things; those were just a few. It could also be keyword density. Have you guys read this thread?

[webmasterworld.com...]

AustrianOak




msg:3058090
 1:04 am on Aug 24, 2006 (gmt 0)

trino, careful in your wording.. you claimed ALL SITES. That comment has no facts behind it unless you yourself have tested ALL SITES.

As you can see from some posts already, your theory doesn't hold up.

Anyone else?

[edited by: AustrianOak at 1:05 am (utc) on Aug. 24, 2006]

trinorthlighting




msg:3058105
 1:12 am on Aug 24, 2006 (gmt 0)

Oak,

I agree, a lot of the sites that are flexing a bit are good sites and are in competitive keywords.

How many of you flipped on the preferred domain in Sitemaps just to have it not working? Google could be shuffling due to this.
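Preferred-domain setting aside, whether the www and bare hostnames are properly canonicalized can be judged from the raw HTTP responses: exactly one variant should 301-redirect to the other. A small sketch (a hypothetical helper; it expects the statuses to have been fetched already, so it does no network I/O itself):

```python
def canonical_status(responses):
    """responses: {hostname: (http_status, location_header_or_None)} for the
    bare and www variants of a domain. Reports whether exactly one variant
    301-redirects to the other -- the usual sign canonicalization is set up."""
    redirects = {h: loc for h, (code, loc) in responses.items()
                 if code == 301 and loc}
    if len(redirects) == 1:
        src, target = next(iter(redirects.items()))
        return f"{src} redirects to {target}: canonicalized"
    if not redirects:
        return "both variants answer directly: duplicate hostnames, not canonicalized"
    return "both variants redirect: check for a redirect loop"
```

If both variants answer with 200, Google can index the same pages under two hostnames, which is one plausible source of the duplicate/supplemental trouble discussed in this thread.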

bwalters




msg:3058119
 1:28 am on Aug 24, 2006 (gmt 0)

For what it's worth--my sites have perfect HTML, minimal SEO, and no obvious flags (they do have unique metas and title, etc.) No dynamic pages and no pages supplemental ever; no more than 100 pages per site.

When they rank, all sites rank on the first page for the relevant keywords, and the main site at #2.

My main site has had the experience of a lot of people: lost traffic (about 95%) on June 27th, got it back July 27th (about 150% of previous), lost it again on August 17th (about 80% from baseline).

Four other sites, each with less traffic and different topics, experienced the exact opposite--so in a sense it balances out. All sites have AdSense, Google Analytics and Google Sitemaps.

Dead_Elvis




msg:3058195
 3:23 am on Aug 24, 2006 (gmt 0)

That reminds me: unlike many others, my pages have not gone supplemental; they've just dropped from the SERPs.

AustrianOak




msg:3058206
 3:34 am on Aug 24, 2006 (gmt 0)

"How many of you flipped on the preferred domain in Sitemaps just to have it not working? Google could be shuffling due to this."

Yes, I just noticed this today. I had selected the www.domain.com format, only to find that the function is no longer working.

What I (we) would give to get a glimpse inside Google's logic/theory/etc... it's scary to think how bad things may be, just seeing all the chaos from the outside!

[edited by: AustrianOak at 3:34 am (utc) on Aug. 24, 2006]

Bewenched




msg:3058218
 3:57 am on Aug 24, 2006 (gmt 0)

How many of you flipped on the preferred domain in Sitemaps just to have it not working? Google could be shuffling due to this.

I turned on the preferred domain about 4 days ago. Nothing seemed different in the total of non-supplemental pages. Until yesterday... I about fell out of my chair.

Then 90% of what was in their system went supplemental. I'm not saying it's a totally bad thing, since the cache dates were so old... well over a year old on those.

Google has been spidering our pages a lot since I turned on the preferred domain, but our site is so big it will probably take them a year at this rate to get most of them back in.

Strange thing is... the ones that are supplemental all had unique content, unique titles, and for the most part unique descriptions.

Some pages are climbing out of supplemental status today.

I will say this... since things have been slow I've taken advantage of the extra time to do some code cleanup, optimization and made more use of style sheets to get the page weight down.

[edited by: Bewenched at 3:58 am (utc) on Aug. 24, 2006]

the_nerd




msg:3058378
 7:44 am on Aug 24, 2006 (gmt 0)

What is interesting about this update is that it has impacted sites that had zero content changes over the past several months. This leads me to believe that whatever change is happening isn't related to on-page factors.

Not necessarily: if they change their "gusto", this could hit pages that haven't changed in years.

the_nerd




msg:3058393
 7:58 am on Aug 24, 2006 (gmt 0)

tedster,

the webmaster backed off on those keyword links and saw upward movement within a few days. Not back to the first page, but a solid jump.

Something I've wanted to know for a long time, but didn't dare ask ;-)

We know penalties: you're out, and if you try, you might be allowed back in later.

And we know filters: you run into one, change something (like you mentioned), and you're back.

Is there something in between? Like they catch you, add a "4 weeks probation" tag, and then let you back in, just to avoid too much tinkering?

nerd

soapystar




msg:3058413
 8:35 am on Aug 24, 2006 (gmt 0)

>>>g1smd

Can you clarify? When you speak of identical metas, do you mean that any one meta tag matching another could cause a problem, i.e. two pages might have the same keywords but the other metas differ?

Is there a % threshold at which they appear different?

Is no meta better than identical metas?

Thanks!

schalk




msg:3058437
 9:08 am on Aug 24, 2006 (gmt 0)

I am asking myself the same question: what is happening?

We have an e-commerce site with thousands of pages. In the April hit we lost everything to supplemental. We then recovered after a few weeks, but still kept some pages in supplemental. Ever since then we have slowly been losing more pages, and we seemingly took a major hit on August 17th.

We have a lot of product pages that have very little content, but some have massive content. I can't see any pattern as to which pages go supplemental, since we are losing some good pages with loads of content and keeping bad ones with little content.

The worry is that we do repeat some paragraphs on common pages, for shipping details and returns policy. Could I be triggering the duplicate content filter by repeating this sort of thing? On this basis I don't really know whether I am truly being hit because of duplicate content. It feels as though we are, but I can't see a definite pattern.

Question 1:

I could really do with someone confirming what is truly classed as Duplicate Content. There must be a lot of people out there running template product pages, faced with the same problem.

Question 2:

Am I really seeing a drop in pages, or has the way the site: command counts pages changed? (I think I read from Matt Cutts that they were possibly modifying this count to make it more accurate.) My feeling is I am seeing both: more pages going supplemental, and a more accurate count of pages when using the site: command.

Question 3:

How do I get these pages out of supplemental? If I make changes, can I expect them to reappear?
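On Question 1: there is no published threshold, but a common rough gauge of near-duplicate text is word-shingle overlap (Jaccard similarity of k-word windows). A hypothetical sketch for comparing, say, a product page's full text against another page that repeats the same shipping/returns paragraphs:

```python
def shingles(text, k=5):
    """k-word shingles: a standard rough fingerprint for near-duplicate text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(a, b, k=5):
    """Jaccard similarity of two pages' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A thin product page whose only substantial text is the shared shipping/returns boilerplate will score high against its siblings, while a page with a large unique description will score low even if it repeats the same paragraphs. That is one plausible reading of why short pages go supplemental while long ones survive; whether Google's filter works this way is, of course, speculation.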
