
Google SEO News and Discussion Forum

June 27th, August 17th, What's happening?
Pages are vanishing and reappearing -- even whole sites
DeROK (5+ Year Member)
Msg#: 3055209 posted 3:15 am on Aug 22, 2006 (gmt 0)

< a continued discussion from these threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...] >

-----------------------------------------------

Hey everyone,

I've been reading through the June 27th/August 17th threads, and I was wondering if somebody could clarify what's actually going on?

Like many of you on the board, my site got trashed in the SERPs on June 27th only to recover a month later. At the time, I thought I had incurred a penalty and went through the site in painstaking detail to remove even the most minute possible violations. I thought that correcting those problems was the reason I recovered.

So needless to say, I was pretty upset when I got trashed again around the 17th when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I was not the only one who has been experiencing this type of problem.

Here are my questions. If any of you can shed some light on these, I would really appreciate it.

1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?

2. Can I expect a recovery similar to the one I had in July?

3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?

Thanks for your time!

[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]

 

leeds1 (5+ Year Member)
Msg#: 3055209 posted 9:27 am on Aug 23, 2006 (gmt 0)


I've made no changes.

I don't have Google Sitemaps.

Homepage MIA for the singular term, OK for the plural.

(This happened before, earlier this year.)

Tomseys (10+ Year Member)
Msg#: 3055209 posted 9:54 am on Aug 23, 2006 (gmt 0)

Is there any way to contact G regarding sites being dropped?

mattg3 (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 3055209 posted 10:19 am on Aug 23, 2006 (gmt 0)

The point is simple: even with a sound business plan, it's a huge risk to put all your eggs in one basket. Especially if the eggs are going into something you have zero control over, as many of us are finding out with the "refreshes".

If you assume for a minute we have all heard about diversification.... ;)

Anyway, in Europe I can't really see a way around Google. What we do to diversify traffic sources is make videos and suggest links to Wikipedia, where most stay. Still, this has to be done sensibly so it's not spam.

Of course I should have become a builder or something to make money ... but well ... too old now .. sigh ..

Tizak (5+ Year Member)
Msg#: 3055209 posted 11:06 am on Aug 23, 2006 (gmt 0)

Google supplemental hell index:

I have a personal website to display my photography, and I've been trying to keep the design minimal and clean without much text content. The problem I'm having is that I'm continually in the Google supplemental index and I am getting no traffic. I would prefer to keep the interface uncluttered and text free; however, the conclusion I am coming to is that I will have to add more explanatory text.

Being a newbie, I attempted to build the site using just CSS, so it could be a formatting issue. I also share an IP address, so I'm not sure whether that could be an issue as well.

I've tried to make the titles, keywords, and image alt tags as descriptive as possible where there is no text content on the page. I tweak individual pages, I have a Google sitemap, I'm crawled continually by the major search robots, and I notify Google Sitemaps whenever the site is updated. But none of the above has solved my problem. Please have a look at my site: www.dig-i-tal.com

gcc_llc (5+ Year Member)
Msg#: 3055209 posted 12:17 pm on Aug 23, 2006 (gmt 0)

Once again, we're back to "as retrieved on Aug 19, 2005"

This is just ridiculous. Are they EVER going to get rid of this crap? At this point it's just a joke.

First we hear about the new preferred domain option in Sitemaps, and that hasn't been fixed yet. Then we hear that all supplemental results are going to be only 2-3 months old, and Matt himself says everyone should have results from March, and now that isn't even correct.

It's going from frustration to pure anger at this point.

gcc_llc (5+ Year Member)
Msg#: 3055209 posted 1:03 pm on Aug 23, 2006 (gmt 0)

"In the first case I mentioned above -- the footer links -- the webmaster backed off on those keyword links and saw upward movement within a few days."

Tedster,

I did a friendly search this morning for "increasing traffic adwords" and the first page was so filled with keywords at the bottom that I was a bit shocked. Google fighting spam? Doesn't seem so to me. They even outrank Google. This is a prime example of what people tell you NOT to do, yet this page ranks number 1 and beats Google.

DeROK (5+ Year Member)
Msg#: 3055209 posted 1:24 pm on Aug 23, 2006 (gmt 0)

What I don't understand is why they try to filter spam using an algorithm. It will never work. People will always try to cheat the system. Why don't they just hire people who can actually think and have them manually apply penalties to blatant cheaters?

Would it be so hard to add a spam button to the Google toolbar so people could quickly flag junk sites? Then once a site is flagged, somebody at Google reviews the site and makes a call as to whether a site is in violation or not?

AustrianOak (10+ Year Member)
Msg#: 3055209 posted 1:52 pm on Aug 23, 2006 (gmt 0)

"Another observation - it appears for all our major keywords we cannot get above position 30 on the rankings. We see some increases up to postion 31, and then they bounce back to 50 +. Start increasing up to 31, and then drop back again. Anyone else seeing this effect?"

YES! This has been the case for most new people getting hit either in April, June, July or recently August. It's seems to be the "page 4,5,6+ penalty". It is clearly some penalty since it's always in that 4,5,6 results page area.

Thank God for the trickle of Yahoo and MSN rankings that keeps my site at around 20% of normal traffic. Since this latest August disaster I have ZERO Google traffic.

Long live spam. :S

longen (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 3055209 posted 1:55 pm on Aug 23, 2006 (gmt 0)

Would it be so hard to add a spam button to the Google toolbar so people could quickly flag junk sites?

The spammers would use it to report quality sites as spam, by the thousands.

KenB (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 3055209 posted 1:57 pm on Aug 23, 2006 (gmt 0)

YES! This has been the case for most people newly hit in April, June, July, or most recently August. It seems to be the "page 4,5,6+ penalty". It is clearly some penalty since it's always in that page 4, 5, 6 results area.

The search phrases I checked that got dinged with the July 27th update all fell to around page 6 from page one. Some have recovered as of Aug. 17th, but not many.

SEOcritique (5+ Year Member)
Msg#: 3055209 posted 5:38 pm on Aug 23, 2006 (gmt 0)

What I don't understand is why they try to filter spam using an algorithm.

For the word "the" there are 22,280,000,000 indexed documents. That's over 22 and a quarter billion documents. Even if you used an algorithm to isolate spam suspects, the massive number of documents would make human review impractical from time, human resource, and financial perspectives.
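A quick back-of-envelope check of that claim (a minimal sketch in Python; the per-reviewer rate and reviewer headcount are illustrative assumptions, not known figures):

# Back-of-envelope: how long would human review of the full index take?
# The 22,280,000,000 figure is from the post above; the review rate and
# headcount are assumptions for illustration only.
indexed_docs = 22_280_000_000
docs_per_reviewer_per_day = 500   # assumed review rate
reviewers = 10_000                # assumed headcount

days = indexed_docs / (docs_per_reviewer_per_day * reviewers)
print(f"{days:,.0f} days, about {days / 365:.0f} years")  # ~4,456 days, ~12 years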

That does not mean that Google's algorithms are not human based.

The accounts I have studied report that Google uses human reviewers to analyze a sampling of the index. The data generated (the graded documents) is then reviewed for statistically relevant factors, and the results are used to create and to tweak Google's algorithms. (A set of guidelines for document reviewers was actually leaked a few years ago.)

There is additional surmising, albeit less supported, that Google uses the data from its book scanning and other non-web data collection to further refine its natural text analysis. The intuitive reasoning behind this deduction is that books do not contain spam, so they offer Google a more reliable data set.

incrediBILL (WebmasterWorld Administrator, Top Contributor of All Time, 5+ Year Member, Top Contributor of the Month)
Msg#: 3055209 posted 6:35 pm on Aug 23, 2006 (gmt 0)

"Sites" do not go supplemental: individual URLs do, or don't.

I can show you a few complete sites that are 99.9999999% supplemental except the home page.

According to something Matt Cutts said recently, it appears that having the same meta description on every page causes Google to assume the pages are all about the same thing and lump them into one big heap of supplemental dung.

Sure enough, when I checked those "supplemental sites" they all had the same meta description on every page.

Luckily, those weren't MY sites, they were competitors' sites ;)

[edited by: incrediBILL at 6:37 pm (utc) on Aug. 23, 2006]
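If you want to check a site for this pattern yourself, here is a minimal sketch (assuming Python with the third-party requests and beautifulsoup4 packages; the URL list is hypothetical):

# Sketch: group pages by meta description to spot site-wide duplicates.
# Requires: pip install requests beautifulsoup4
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [  # hypothetical pages to check
    "http://www.example.com/",
    "http://www.example.com/page-a.html",
    "http://www.example.com/page-b.html",
]

by_description = defaultdict(list)
for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    desc = tag["content"].strip() if tag and tag.has_attr("content") else "(none)"
    by_description[desc].append(url)

for desc, pages in by_description.items():
    if len(pages) > 1:
        print(f"{len(pages)} pages share the description {desc[:60]!r}:")
        for page in pages:
            print("   ", page)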

DeROK (5+ Year Member)
Msg#: 3055209 posted 6:50 pm on Aug 23, 2006 (gmt 0)

"According to something Matt Cutt's recently said it appears having the same meta description in every page causes Google to assume the pages are all about the same thing and lump them into one big heap of supplemental dung."

Can anybody confirm this? If this is what's causing the problem, it's not that hard of a fix. I don't have the same meta description on every single page, but a lot of my pages that cover the same topic have the same meta description as each other.

DeROK (5+ Year Member)
Msg#: 3055209 posted 6:54 pm on Aug 23, 2006 (gmt 0)

"The spammers would use it to report quality sites as spam, by the thousands."

If they reported quality sites as spam, it would do no good, because a human would review each flagged site and make the decision. So flagging an innocent site would accomplish nothing.

Secondly, if somebody was repeatedly flagging innocent sites, couldn't Google just block any further flags from that IP/user?

A system like this would go a long way toward clearing out the spam on Google. An algorithm will never get rid of spam. Just look at how much crap still gets into your email with "spam filtering" applied.
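As a sketch of the workflow being proposed here (purely illustrative; the threshold is an assumption, and nothing below reflects how Google actually handles spam reports):

# Sketch: user flags go into a review queue, a human makes the final
# call, and reporters whose flags keep getting rejected are ignored.
from collections import defaultdict, deque

REJECTION_LIMIT = 5  # assumed threshold before a reporter is ignored

review_queue = deque()
rejections = defaultdict(int)

def flag(reporter_id, site):
    """Accept a spam flag unless the reporter has a history of bad reports."""
    if rejections[reporter_id] >= REJECTION_LIMIT:
        return  # silently drop flags from repeat abusers
    review_queue.append((reporter_id, site))

def review_next(is_spam):
    """A human reviewer makes the final call on the oldest flag."""
    reporter_id, site = review_queue.popleft()
    if is_spam(site):
        print("penalize", site)
    else:
        rejections[reporter_id] += 1  # flagging innocent sites costs you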

tedster (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 3055209 posted 6:56 pm on Aug 23, 2006 (gmt 0)

having the same meta description on every page causes Google to assume the pages are all about the same thing

Yes, I can confirm this, and in quite a few cases now. And the fix has been as simple as adding unique, page-specific meta descriptions. (Or maybe not so simple, in some dynamic cases.)
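For those dynamic cases, one workable approach is to derive each description from the page's own opening text, so no two templated pages share the same tag. A minimal sketch in Python (the helper name and the 155-character trim are illustrative assumptions, not the exact fix used):

# Sketch: build a page-specific meta description from the page's own
# first paragraph.
import html

def meta_description(first_paragraph, max_len=155):
    """Trim the page's opening text to a description-sized snippet."""
    text = " ".join(first_paragraph.split())  # collapse whitespace
    if len(text) > max_len:
        text = text[:max_len].rsplit(" ", 1)[0] + "..."
    return '<meta name="description" content="%s">' % html.escape(text, quote=True)

print(meta_description("Hand-thrown stoneware mugs, fired in our wood kiln ..."))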

It didn't use to be this way, but I first stumbled onto the problem for one client site last fall (and fixed it) -- and then other members here started confirming it. It doesn't always mean a Supplemental tag -- sometimes it just shuffles everything off into the "Omitted Results" link.

There are definitely many other cases of "going supplemental" that do not involve identical meta descriptions and/or titles, however.

wanderingmind (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 3055209 posted 7:00 pm on Aug 23, 2006 (gmt 0)

One of the reasons, not the only one.

I haven't had similar descriptions on my site for a few years now. Still, data refreshes play havoc and supplementals come and go.

gcc_llc (5+ Year Member)
Msg#: 3055209 posted 7:59 pm on Aug 23, 2006 (gmt 0)

Has anyone else chosen a preferred domain in Sitemaps? I did as soon as the option was available, and it showed a date of Aug 5th, but I just checked it again and now I see no date, and the option to choose is available again.

EDIT: I saw Vanessa state they are working on sitemaps right now.

netmeg (WebmasterWorld Senior Member, Top Contributor of All Time, 5+ Year Member, Top Contributor of the Month)
Msg#: 3055209 posted 8:46 pm on Aug 23, 2006 (gmt 0)

I did on one site (one of my personal sites, as opposed to a client site). I haven't noticed anything different. For some reason the non-www version has better PR for the home page than the www version, and that's still the case. But there have been no ranking/inclusion/site: ramifications as far as I can tell.

Halfdeck (5+ Year Member)
Msg#: 3055209 posted 9:01 pm on Aug 23, 2006 (gmt 0)

Identical meta descriptions/titles are one major factor, but as others have said, not the only one. If they were, it would be easy to crawl out of the supplemental index.

The fact that once a page is tagged as supplemental you need the supplemental Googlebot to come around and recrawl your supplemental URLs doesn't help, since it used to come around only every six months (or so I've heard).

I also think that once a site goes heavily supplemental, you need to regain some trust with Google to get pages back into the main index (i.e. organic inbounds/PageRank). It would be one way Google guards itself against 100,000,000-page spam sites sitting in the supplemental index, preventing periodic on-page tweaks from reinjecting the site into the main index.

Also, Vanessa Fox recently commented on Google Groups (Crawling/indexing/ranking) regarding a directory type site with barely any text on the category pages:

"You should also take a look at your site and make sure it provides unique content. Most of your categories don't seem to have any content. You'll need your pages to have value in order to get them indexed."

steveb (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 3055209 posted 9:08 pm on Aug 23, 2006 (gmt 0)

You have got to be kidding me. December 2004 pages back AGAIN.

Is Google really this inept?

[edited by: steveb at 9:14 pm (utc) on Aug. 23, 2006]

twebdonny
Msg#: 3055209 posted 9:13 pm on Aug 23, 2006 (gmt 0)

>>>>December 2004 pages back AGAIN.<<<<

Would that be pre-Florida? That is the time the real demise at Google began.

petehall (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 3055209 posted 9:22 pm on Aug 23, 2006 (gmt 0)

Patterns in titles and metas result in omitted results, not supplemental.

Pages are 'pushed' into supplemental when there is not enough PageRank to keep them in the main index. (I'd like someone to show me some supplemental pages with a half-decent PR of, say, 3 or above.)

Quite often this is down to a poor linking structure. However, as far as I can see, the level of PR required to keep pages in the main index just went up, which has caused a whole host of problems, especially if you had links from many low-PR pages, as many of the sites I work with do.

Hopefully it's just a glitch or something but this is what it looks like to me.
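For anyone who wants to test this theory on their own site, the standard PageRank iteration is easy to run over an internal link graph. A minimal sketch in Python (the toy graph is hypothetical; toolbar PR is on a log scale, so the raw scores are not directly comparable, and dangling pages simply leak rank mass in this simplified version):

# Sketch: power-iteration PageRank over a small internal link graph,
# to surface pages that may be starved of PR.
links = {  # page -> pages it links to (hypothetical site structure)
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "deep-page"],
    "deep-page": [],  # dangling page; its rank mass leaks in this sketch
}

pages = list(links)
pr = {p: 1.0 / len(pages) for p in pages}
damping = 0.85

for _ in range(50):
    new_pr = {}
    for p in pages:
        inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
        new_pr[p] = (1 - damping) / len(pages) + damping * inbound
    pr = new_pr

for p, score in sorted(pr.items(), key=lambda kv: kv[1]):
    print(f"{score:.4f}  {p}")  # lowest-ranked pages print first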

petehall (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 3055209 posted 9:24 pm on Aug 23, 2006 (gmt 0)

The higher PR websites I work with are not experiencing any difficulties whatsoever.

g1smd (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 3055209 posted 10:14 pm on Aug 23, 2006 (gmt 0)

>> I've been trying to keep the design minimal and clean without much text content. <<

If there is not much to index on each page, and many of the same words are shared across multiple pages, you will have many pages deindexed or gone supplemental. You need to increase your text content to give the search engines something to index.
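One rough way to measure how much your pages share is to compare their visible text pairwise. A minimal sketch in Python using word-set Jaccard similarity (the sample pages and the 0.6 threshold are assumptions for illustration; Google has published no such number):

# Sketch: flag page pairs whose visible text overlaps heavily, a crude
# proxy for the shared-content problem described above.
from itertools import combinations

def words(text):
    return set(text.lower().split())

pages = {  # hypothetical page name -> extracted visible text
    "gallery1.html": "black and white landscape photography gallery",
    "gallery2.html": "black and white portrait photography gallery",
    "about.html": "about the photographer contact and print ordering details",
}

THRESHOLD = 0.6  # arbitrary; tune for your own site

for (a, ta), (b, tb) in combinations(pages.items(), 2):
    wa, wb = words(ta), words(tb)
    jaccard = len(wa & wb) / len(wa | wb)
    if jaccard >= THRESHOLD:
        print(f"{a} and {b} share {jaccard:.0%} of their words")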

g1smd (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 3055209 posted 10:18 pm on Aug 23, 2006 (gmt 0)


>> Once again, we're back to "as retrieved on Aug 19, 2005" <<

>> We hear that all supplemental results are going to be only 2-3 months old, and Matt himself says everyone should have results from March, and now that isn't even correct. <<

You are looking at [gfe-eh.google.com] aren't you? That one has been cleaned up a lot. Other datacentres will probably follow a month or so from now. Don't pay too much attention to the others for the moment.

Bewenched (WebmasterWorld Senior Member, 5+ Year Member)
Msg#: 3055209 posted 10:18 pm on Aug 23, 2006 (gmt 0)

All the supps for me are listed "as retrieved on 18 Aug 2005" or thereabouts. Just over a full year of roll-back.

No wonder we're having customers call about items that were discontinued. We've been answering so many calls... this does explain it. I'll be so glad to get rid of those. A lot were pages that got spidered under SSL, so I assume they got tossed into supplemental because of dupe content.
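One way to stop crawlers from indexing duplicate https copies is to 301 them back to the http canonical. A minimal sketch as Python WSGI middleware (the stack is an assumption; Apache mod_rewrite rules are the more common route, and a real shop would exempt its checkout paths from the redirect):

# Sketch: 301-redirect any request that arrived over SSL to the plain
# http canonical URL, so crawlers never index duplicate https copies.
def force_http(app):
    def middleware(environ, start_response):
        if environ.get("wsgi.url_scheme") == "https":
            location = "http://%s%s" % (
                environ.get("HTTP_HOST", "www.example.com"),  # hypothetical host
                environ.get("PATH_INFO", "/"),
            )
            start_response("301 Moved Permanently", [("Location", location)])
            return [b""]
        return app(environ, start_response)  # plain http passes through
    return middleware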

g1smd (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 3055209 posted 10:26 pm on Aug 23, 2006 (gmt 0)

>> "According to something Matt Cutt's recently said it appears having the same meta description in every page causes Google to assume the pages are all about the same thing and lump them into one big heap of supplemental" <<

I have been saying something about that for the past couple of years; and Matt Cutts confirmed it [threadwatch.org] a few days ago. It wasn't so much about being Supplemental, but instead was more about some results in a site: search being hidden away, and only appearing after the link in the "repeat the search with the omitted results included" message was clicked. In some cases it did also involve some supplemental pages too.

steveb (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 3055209 posted 10:32 pm on Aug 23, 2006 (gmt 0)

"(I'd like someone to show me some supplemental pages with a half decent PR of say 3 or above)."

I've got about ten PR4 or higher. Those are results with supplementals parallel to a full listing. Then also there are the examples of pages like www versus non-www with a 301 in place where the obsolete URL could be even PR6.

petehall (WebmasterWorld Senior Member, 10+ Year Member)
Msg#: 3055209 posted 10:38 pm on Aug 23, 2006 (gmt 0)

Any chance you could sticky me, Mr B?

g1smd (WebmasterWorld Senior Member, Top Contributor of All Time, 10+ Year Member)
Msg#: 3055209 posted 10:38 pm on Aug 23, 2006 (gmt 0)

>> You have got to be kidding me. December 2004 pages back AGAIN. <<

Where do you see that?

I would assume that it is NOT in [gfe-eh.google.com], right?

fjpapaleo (10+ Year Member)
Msg#: 3055209 posted 10:43 pm on Aug 23, 2006 (gmt 0)

"I'd like someone to show me some supplemental pages with a half decent PR of say 3 or above"

This has nothing to do with it. I literally have thousands, some PR 4 and 5 as well.
