

June 27th, August 17th, What's happening?

Pages are vanishing and reappearing -- even whole sites

     
3:15 am on Aug 22, 2006 (gmt 0)

New User

10+ Year Member

joined:Feb 10, 2006
posts:27
votes: 0


< a continued discussion from these threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...] >

-----------------------------------------------

Hey everyone,

I've been reading through the June 27th/August 17th threads, and I was wondering if somebody could clarify what's actually going on?

Like many of you on the board, my site got trashed in the SERPs on June 27th, only to recover a month later. At the time, I thought I had incurred a penalty, and I went to painstaking lengths to remove even the most minute possible violations. I thought that correcting those problems was the reason I recovered.

So needless to say, I was pretty upset when I got trashed again around the 17th when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I was not the only one who has been experiencing this type of problem.

Here are my questions. If any of you can shed some light on these, I would really appreciate it.

1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?

2. Can I expect a recovery similar to the one I had in July?

3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?

Thanks for your time!

[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]

10:43 pm on Aug 23, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 20, 2004
posts:109
votes: 0


"I'd like someone to show me some supplemental pages with a half decent PR of say 3 or above"

This has nothing to do with it. I literally have thousands, some PR 4 and 5 as well.

10:47 pm on Aug 23, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:Mar 8, 2006
posts:75
votes: 0


" I would assume that it is NOT in [gfe-eh.google.com...] right? "

It's for ALL DCs, including that one.

" as retrieved on Aug 19, 2005"

Just checked.

10:48 pm on Aug 23, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


>> when links become archived then the link strength decreases and the page goes supplemental <<

Not quite the same thing, but related to "link strength", a friend has a 30 page site, with some pages having only minimal text content. I warned for a year that having the same meta tag on multiple pages was going to cause trouble.

The pages had been listed normally for several years, but then just a few months ago I did a site:domain.com search one day and found several changes had recently occurred.

Only two pages showed up before the "repeat the search with the omitted results included" message appeared.

On clicking that link, many of the pages now appeared but all apart from two had turned supplemental.

A few days later, just a few pages were listed after the message was clicked. Only two were shown as normal results, and there were a few more pages listed as supplemental. The rest had disappeared from the listings.

The two normally listed pages were the only two pages of the site that had any external incoming links: the homepage and one internal page.

Upon fixing the meta descriptions on all of the pages, it was only a matter of weeks before all the pages from the site were listed fully and normally again in a site: search.

[edited by: g1smd at 10:50 pm (utc) on Aug. 23, 2006]

10:48 pm on Aug 23, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Nov 10, 2005
posts:240
votes: 0


To be or not to be supplemental... is a question that hinges on more than one factor. I believe PageRank is a factor, but a PR of 8 may not save you if the page is an identical copy of a page with a PR of 10.

Google's algo is not a single IF/ELSE statement.
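To illustrate that point only, here is a toy sketch of a decision that hinges on several weighted signals at once. Every factor name, weight, and threshold below is invented for illustration and has nothing to do with Google's real algorithm.

```python
# Toy model only: invented factors and weights, not Google's algorithm.
def supplemental_score(page):
    """Blend several signals into one score; no single factor decides."""
    score = 0.0
    score += 3.0 * page.get("duplicate_content", 0.0)      # near-copy of another page
    score += 2.0 * page.get("duplicate_meta", 0.0)         # shared meta description/title
    score -= 1.5 * page.get("pagerank", 0.0) / 10.0        # strong PR pulls the other way
    score -= 1.0 * page.get("external_links", 0.0) / 5.0   # so do incoming links
    return score

def is_supplemental(page, threshold=1.0):
    return supplemental_score(page) > threshold

# A PR 8 identical copy of a PR 10 page can still lose:
copy_page = {"duplicate_content": 1.0, "pagerank": 8.0}
print(is_supplemental(copy_page))  # True (3.0 - 1.2 = 1.8 > 1.0)
```

The only point of the sketch is that one strong negative term can outweigh a healthy PR term, which is exactly why a high-PR copy of a higher-PR page is not safe.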

10:53 pm on Aug 23, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


Yes, there are several types of supplemental results.

I made a lot of comments about that in: [webmasterworld.com...] especially on page 2 of that thread, onwards.

10:58 pm on Aug 23, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 25, 2006
posts:106
votes: 0


Would not having a description at all (completely omitted from the pages) on several pages have the same result as having duplicate descriptions?
10:59 pm on Aug 23, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 20, 2004
posts:109
votes: 0


"I believe PageRank is a factor"

It's only a factor in that most higher-PR sites have a higher trust rank due to age and incoming links. You can get away with just about anything with an older site that has high TR. Try the same tactic with a lower-TR site and you'll get banned to the Google dungeon forever.

11:04 pm on Aug 23, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


An omitted meta description causes roughly the same problems as an identical meta description.

In the omitted case, Google uses the first words on the page: often from the nav bar, and therefore often identical too.
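A minimal sketch (Python standard library; the `audit` helper and its input format are my own invention) of how you might check a set of pages for exactly this missing/duplicate meta description problem. In practice you would fetch each URL with urllib first and feed the HTML in.

```python
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pull the content of <meta name="description"> out of an HTML page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = (attrs.get("content") or "").strip()

def audit(pages):
    """pages: dict of url -> HTML. Returns (missing urls, duplicate url groups)."""
    by_description = defaultdict(list)
    missing = []
    for url, html in pages.items():
        parser = MetaDescriptionParser()
        parser.feed(html)
        if not parser.description:
            missing.append(url)   # Google will fall back to on-page text here
        else:
            by_description[parser.description].append(url)
    duplicates = [urls for urls in by_description.values() if len(urls) > 1]
    return missing, duplicates

pages = {
    "/a": '<meta name="description" content="Widgets for sale">',
    "/b": '<meta name="description" content="Widgets for sale">',
    "/c": "<head><title>No description at all</title></head>",
}
missing, duplicates = audit(pages)
print(missing)     # ['/c']
print(duplicates)  # [['/a', '/b']]
```

Pages flagged in either list would get the same treatment described above: write a unique description for each.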

11:08 pm on Aug 23, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 5, 2006
posts:2094
votes: 2


We have not moved an inch in the SERPs, either up or down, during all of the "data refreshes". That leads me to believe that all the sites that are flexing a bit have a borderline penalty due to meta tags, title tags, or incorrect HTML.

PageRank is not a factor, because we have pages ranking in the top 10 for their respective keywords with a PR of 0, 1, or 2.

11:17 pm on Aug 23, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 5, 2006
posts:2094
votes: 2


One other thought on the flux: if you are on a shared server and, say, 90% of the other sites on that server are pure, banned spam, that could be hurting you. (Extreme example.)

It always helps to know your neighbors on a shared server. We always use hosting providers that strictly prohibit any type of spamming. Does your webhost do the same?

11:24 pm on Aug 23, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Nov 10, 2005
posts:240
votes: 0


PageRank is not a factor, because we have pages ranking in the top 10 for their respective keywords with a PR of 0, 1, or 2.

That's like saying being 300lb overweight isn't a factor when getting dates because I have three fat relatives that got married before they hit 30. =)

11:35 pm on Aug 23, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:Mar 8, 2006
posts:75
votes: 0


"PageRank is not a factor, because we have pages ranking in the top 10 for their respective keywords with a PR of 0, 1, or 2."

Then your keywords aren't very competitive.

11:47 pm on Aug 23, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:June 15, 2006
posts:69
votes: 11



An omitted meta description causes roughly the same problems as an identical meta description.

In the omitted case, Google uses the first words on the page: often from the nav bar, and therefore often identical too.

WordPress by default omits the meta keywords and meta description, doesn't it? Matt Cutts' blog also has no meta keywords or description.

11:57 pm on Aug 23, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2003
posts:859
votes: 3


This has nothing to do with it. I literally have thousands, some PR 4 and 5 as well.

That I would love to see. I presume you are writing unique content.

Please sticky me?!

To be or not to be supplemental.. is a question that hinges on more than one factor alone. I believe PageRank is a factor, but a PR of 8 may not save you if the page is an identical copy of a page with a PR of 10.
Google's algo is not a single IF/ELSE statement.

Exactly - it's at least 3 (lol).

I'd say anyone whose listings went supplemental as a result of copying content from another site shouldn't be moaning about supplemental results in the first place.

However, this is unfortunate for companies writing quality, unique content while having little PR to aid retention of that content in the main index.

Surely if you take a close look at supplemental pages, 90% of them must be a result of low PR / poor linking structure (which in turn leads to a lack of PR).

12:09 am on Aug 24, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Jan 1, 2003
posts:302
votes: 0


"That leads me to believe that all the sites that are flexing a bit have a borderline penalty to them due to meta tags, title tags, or incorrect html."

Wrong. We do not know what the causes of the "penalties" are; otherwise we'd fix them. My meta, title, and HTML are 100% fine. Also, let's not assume it's a valid penalty we are being hit with; many of us have 100% legit sites that are caught up in some kind of Google mess.

12:31 am on Aug 24, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 17, 2006
posts:61
votes: 0


I think some of you are barking up the wrong tree here ;)

I've got PR 5 & 6 sites that have been hit... lost traffic on June 27th, got it back July 27th, lost it again on August 17th.

These are both original-content sites; they have unique meta keywords and descriptions on all pages, they validate, and they have plenty of organic links, i.e., they are exactly what Google tells us to build: quality sites.

This is certainly not about PR, and while the meta issue may be affecting some, it certainly isn't a factor for me.

Personally, I'm lost.

12:33 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2003
posts:859
votes: 3


Dead_Elvis - as I said, I think this is a temporary error (I hope).
12:36 am on Aug 24, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 17, 2006
posts:61
votes: 0


I'm with you I hope!
12:48 am on Aug 24, 2006 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


This is baffling. My domain does not even rank for "domain.com" any more. The index page has plenty of high-powered backlinks, it is indexed daily, and Google "gets" about 15-50% of the links each day. Yet only about 50% of the pages are in the index, and many are supplementals. The pages are VERY different, and the META is unique as well.

Until two weeks or so ago, when doing site:domain.com the home page was listed either second or third. Now it is listed on top, but the rankings still suck. When I rank #1 in Google for "domain.com" I do extremely well, so I assume an automatic penalty has been assigned to my site. Anyone else having this problem?

12:52 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2003
posts:859
votes: 3


Yeah... Google seems a total mess at the moment.

I'd say 75+% of people will be having problems - some extreme.

1:00 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 5, 2006
posts:2094
votes: 2


The penalties could be a lot of things; those were just a few. It could also be keyword density. Have you guys read this thread?

[webmasterworld.com...]

1:04 am on Aug 24, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Jan 1, 2003
posts:302
votes: 0


trino, careful with your wording... you claimed ALL SITES. That comment has no facts behind it unless you yourself have tested ALL SITES.

As you can see from some posts already, your theory has failed.

Anyone else?

[edited by: AustrianOak at 1:05 am (utc) on Aug. 24, 2006]

1:12 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 5, 2006
posts:2094
votes: 2


Oak,

I agree, a lot of the sites that are flexing a bit are good sites and are in competitive keywords.

How many of you flipped on the preferred domain setting in Sitemaps just to find it not working? Google could be shuffling due to this.
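For what it's worth, you don't have to wait on the Sitemaps setting for this: the preferred hostname can also be enforced server-side. A common Apache .htaccess sketch, assuming mod_rewrite is enabled and www is your preferred form (example.com is a placeholder for your own domain):

```apache
# 301-redirect non-www requests to the preferred www host,
# so Google only ever sees one canonical version of each URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The permanent (301) redirect is the important part; a 302 would not consolidate the two hostnames.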

1:28 am on Aug 24, 2006 (gmt 0)

New User

5+ Year Member

joined:July 29, 2006
posts:3
votes: 0


For what it's worth--my sites have perfect HTML, minimal SEO, and no obvious flags (they do have unique metas and title, etc.) No dynamic pages and no pages supplemental ever; no more than 100 pages per site.

When they rank, all of the sites appear on the first page (the main site at #2) for their relevant keywords.

My main site has had the same experience as a lot of people: lost traffic (about 95%) on June 27th, got it back July 27th (about 150% of previous), lost it again on August 17th (down about 80% from baseline).

Four other sites, each with less traffic and different topics, experienced the exact opposite--so in a sense it balances out. All sites have AdSense, Google Analytics and Google Sitemaps.

3:23 am on Aug 24, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 17, 2006
posts:61
votes: 0


That reminds me: unlike many others, my pages have not gone supplemental; they've just dropped from the SERPs.
3:34 am on Aug 24, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Jan 1, 2003
posts:302
votes: 0


"How many of you flipped on the preferred domain setting in Sitemaps just to find it not working? Google could be shuffling due to this."

Yes, I just noticed this today. I had selected the www.domain.com format, only to find that the function is no longer working.

What I (we) would give for a glimpse inside Google's logic! It's scary to think how bad things may be, judging by all the chaos visible from the outside.

[edited by: AustrianOak at 3:34 am (utc) on Aug. 24, 2006]

3:57 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:July 26, 2006
posts:1619
votes: 0


How many of you flipped on the preferred domain setting in Sitemaps just to find it not working? Google could be shuffling due to this.

I turned on the preferred domain about 4 days ago. Nothing seemed different in the total of non-supplemental pages. Until yesterday... I about fell out of my chair.

Then 90% of what was in their system went supplemental. I'm not saying it's a totally bad thing, since the cache dates were so old... well over a year old on those.

Google has been spidering our pages a lot since I turned on the preferred domain, but our site is so big it will probably take them a year at this rate to get most of them back in.

Strange thing is... the ones that went supplemental all had unique content, unique titles, and for the most part unique descriptions.

Some pages are climbing out of supplemental status today.

I will say this... since things have been slow I've taken advantage of the extra time to do some code cleanup, optimization and made more use of style sheets to get the page weight down.

[edited by: Bewenched at 3:58 am (utc) on Aug. 24, 2006]

7:44 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 29, 2003
posts:790
votes: 0


"What is interesting about this update is that it has impacted sites that had zero content changes over the past several months. This leads me to believe that whatever change is happening isn't related to on-page factors."

Not necessarily. If they change their "gusto", this could hit pages that haven't changed in years.

7:58 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 29, 2003
posts:790
votes: 0


tedster,

the webmaster backed off on those keyword links and saw upward movement within a few days. Not back to the first page, but a solid jump.

Something I wanted to know for a long time, but didn't dare to ask ;-)

We know penalties: you're out, and if you try, you might be allowed back in later.

And we know filters: you run into one, change something (like you mentioned), and back you are.

Is there something in between? Like they catch you, add a "4 weeks probation" tag, and then let you back in, just to avoid too much tinkering?

nerd

8:35 am on Aug 24, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 19, 2002
posts:1945
votes: 0


>>>g1smd

Can you clarify? When you speak of identical metas, do you mean that any one of the meta tags matching another could cause a problem? I.e., two pages might have the same keywords while the other metas differ?

Is there a % threshold at which they appear different?

Is no meta better than identical metas?

Thanks!

This 182 message thread spans 7 pages.