Forum Moderators: Robert Charlton & goodroi
-----------------------------------------------
Hey everyone,
I've been reading through the June 27th/August 17th threads, and I was wondering if somebody could make it clearer what's actually going on?
Like many of you on the board, my site got trashed in the SERPs on June 27th only to recover a month later. At the time, I thought I had incurred a penalty and went through my site in painstaking detail to remove even the most minute possible violations. I thought that correcting those problems was the reason I recovered.
So needless to say, I was pretty upset when I got trashed again around the 17th when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I was not the only one who has been experiencing this type of problem.
Here are my questions. If any of you can shed some light on these, I would really appreciate it.
1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?
2. Can I expect a recovery similar to the one I had in July?
3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?
Thanks for your time!
[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]
The point is simple: it's a huge risk to put all your eggs in one basket if you want a sound business plan. Especially if the eggs are going into something you have zero control over, as many of us are finding out with the "refreshes".
If you assume for a minute we have all heard about diversification.... ;)
Anyway, in Europe I can't really see a way around Google. What we do to diversify traffic sources is make videos and suggest links on Wikipedia, where most stay. Still, this has to be done sensibly so it's not spam.
Of course I should have become a builder or something to make money ... but well ... too old now.. sigh ..
I have a personal website to display my photography, and I've been trying to keep the design minimal and clean without much text content. The problem I'm having is that I'm continually stuck in Google's supplemental index and getting no traffic. I would prefer to keep the interface uncluttered and text-free; however, the conclusion I'm coming to is that I will have to add more explanatory text.
Being a newbie, I built my site using just CSS, so it could be a formatting issue. I also share an IP address, so I'm not sure whether that could be an issue too.
I've tried to make the titles, meta keywords, and image alt attributes as descriptive as possible where there is no text content on the page. I tweak individual pages, have a Google Sitemap, am crawled continually by the major search robots, and notify Google Sitemaps whenever my site is updated. But none of the above has solved my problem. Please have a look at my site www.dig-i-tal.com .
This is just ridiculous. Are they EVER going to get rid of this crap? At this point it's just a joke.
First we hear about the new preferred-domain option in Sitemaps, and that hasn't been fixed yet. Second we hear that all supplemental results are going to be only 2-3 months old, and Matt himself says everyone should have results from March, and now even that isn't correct.
It's going from frustration to pure anger at this point.
Tedster,
I did a friendly search this morning for "increasing traffic adwords" and the first page was so filled with keywords at the bottom that I was a bit shocked. Google fighting spam? Doesn't seem so to me. They even outrank Google. This is a prime example of what people tell you NOT to do, yet this page ranks number 1 and beats Google.
Would it be so hard to add a spam button to the Google toolbar so people could quickly flag junk sites? Then once a site is flagged, somebody at Google reviews the site and makes a call as to whether a site is in violation or not?
YES! This has been the case for most of the people getting hit in April, June, July, or most recently August. It seems to be the "page 4, 5, 6+ penalty". It is clearly some kind of penalty, since it's always in that page 4-6 results area.
Thank god for the trickle of Yahoo and MSN rankings that keeps my site at around 20% of normal traffic. Since this last August disaster I have had ZERO Google traffic.
Long live spam. :S
What I don't understand is why they try to filter spam using an algorithm.
For the word "the" alone there are 22,280,000,000 indexed documents. That's over 22 and a quarter billion documents. Even if you used an algorithm to isolate spam suspects, the massive number of documents would make human review impractical from a time, human-resource, and financial perspective.
That does not mean that Google's algorithms are not human-based.
The accounts I have studied report that Google uses human reviewers to analyze a sampling of the index. The generated data, the graded documents, is further reviewed for statistically relevant factors, and the results are used to create and tweak Google's algorithms. (A set of guidelines for document reviewers was actually leaked a few years ago.)
There is additional surmising, albeit less supported, that Google uses the data from its book-scanning and other non-web data collection to further refine its natural-text analysis. The intuitive reasoning behind this deduction is that books do not contain spam, so they offer Google a more reliable data set.
"Sites" do not go supplemental: individual URLs do, or don't.
I can show you a few complete sites that are 99.9999999% supplemental except the home page.
According to something Matt Cutts recently said, it appears that having the same meta description on every page causes Google to assume the pages are all about the same thing and lump them into one big heap of supplemental dung.
Sure enough, when I checked those "supplemental sites" they all had the same meta description on every page.
Luckily, those weren't MY sites, they were competitors' sites ;)
[edited by: incrediBILL at 6:37 pm (utc) on Aug. 23, 2006]
Can anybody confirm this? If this is what's causing the problem, it's not that hard of a fix. I don't have the same meta description on every single page, but a lot of my pages that cover the same topic have the same meta description as each other.
If they reported quality sites as spam, it would do no good, because a human would review each flagged site and make a decision. So flagging an innocent site would do nothing.
Secondly, if somebody was repeatedly flagging innocent sites, couldn't Google just block any flags from that IP/user?
A system like this would go a long way in clearing out spam on Google. An algorithm will never get rid of spam. Just look at how much crap still gets into your email with "spam filtering" applied.
having the same meta description in every page causes Google to assume the pages are all about the same thing
Yes, I can confirm this, and in quite a few cases now. And the fix has been as simple as adding unique and page specific meta descriptions. (Or maybe not so simple, in some dynamic cases.)
It didn't used to be this way, but I first stumbled onto the problem for one client site last fall (and fixed it) -- and then other members here started confirming. It doesn't always mean a Supplemental tag -- sometimes it just shuffles everything off into the "Omitted Results" link.
There are definitely many other cases of "going supplemental" that do not involve identical meta descriptions and/or titles, however.
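If you want to audit your own site for the duplicate-description problem, a short stdlib-only script along these lines will surface pages that share an identical meta description. This is just a sketch; it assumes your pages are saved locally as .html files, and the directory layout is whatever you point it at:

```python
# Sketch: find pages on disk that share an identical meta description.
# Assumption: pages are stored locally as .html files under one root dir.
import os
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pulls the content of <meta name="description" ...> out of a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

def find_duplicate_descriptions(root):
    """Map each meta description to the files that use it; keep only dupes."""
    seen = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".html"):
                continue
            path = os.path.join(dirpath, name)
            parser = MetaDescriptionParser()
            with open(path, encoding="utf-8", errors="ignore") as f:
                parser.feed(f.read())
            if parser.description:
                seen[parser.description].append(path)
    # Only descriptions shared by more than one page are a problem.
    return {d: paths for d, paths in seen.items() if len(paths) > 1}
```

Anything this prints out is a candidate for getting a unique, page-specific description.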
The fact that once a page is tagged as supplemental, you need the supplemental Googlebot to come around and recrawl your supplemental URLs doesn't help, since it used to come around only every 6 months (or so I've heard).
I also think once a site goes heavily supplemental, you need to regain some trust with Google to get pages back into the main index (i.e. organic inbounds/PageRank). It would be one way Google guards itself against 100,000,000-page spam sites sitting in the supplemental index, and prevents periodic on-page tweaks from reinjecting such a site into the main index.
Also, Vanessa Fox recently commented on Google Groups (Crawling/indexing/ranking) regarding a directory type site with barely any text on the category pages:
You should also take a look at your site and make sure it provides unique content. Most of your categories don't seem to have any content. You'll need your pages to have value in order to get them indexed.
Would that be pre-Florida? That is the time the real demise at Google began.
Pages are 'pushed' into supplemental when there is not enough PageRank to keep them in the main index (I'd like someone to show me some supplemental pages with a half-decent PR of, say, 3 or above).
Quite often this is down to a poor linking structure. However, as far as I can see, the level of PR required to keep pages in the main index just went up, which has caused a whole host of problems, especially if you had links from many low-PR pages, as many of the sites I work with do.
Hopefully it's just a glitch or something but this is what it looks like to me.
If there is not much to index on each page, and it shares many words of the same content across multiple pages, you will have many pages deindexed or supplemental. You need to increase your text content to give search engines something to index.
>> We hear that all supp results are going to be only 2-3 months old and Matt himself says everyone should have results from March and now that isn't even correct. <<
You are looking at [gfe-eh.google.com] aren't you? That one has been cleaned up a lot. Other datacentres will probably follow a month or so from now. Don't pay too much attention to the others for the moment.
No wonder we're having customers call about items that were discontinued. We've been answering so many calls... this does explain it. I'll be so glad to get rid of those. A lot were pages that got spidered under SSL, so I assume they got tossed into supplemental because of dupe content.
I have been saying something about that for the past couple of years, and Matt Cutts confirmed it [threadwatch.org] a few days ago. It wasn't so much about being Supplemental; it was more about some results in a site: search being hidden away, only appearing after the "repeat the search with the omitted results included" link was clicked. In some cases it did also involve some supplemental pages.
I've got about ten at PR4 or higher. Those are results with supplementals parallel to a full listing. Then there are also the examples of pages like www versus non-www with a 301 in place, where the obsolete URL could be even PR6.
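For the www versus non-www case, it's easy to sanity-check that the duplicate hostname really returns a permanent redirect to the canonical one. A minimal sketch using only Python's standard library; the hostnames are placeholders for your own site, and it assumes the server answers HEAD requests:

```python
# Sketch: check that a duplicate hostname 301s to the canonical URL.
# "example.com" / "www.example.com" below are placeholder hostnames.
import http.client

def fetch_redirect(host, path="/", use_ssl=False):
    """Request a URL without following redirects; return (status, Location)."""
    conn_class = http.client.HTTPSConnection if use_ssl else http.client.HTTPConnection
    conn = conn_class(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    conn.close()
    return resp.status, location

def is_permanent_redirect(status, location, canonical):
    """True only for a 301 whose Location points at the canonical URL."""
    return status == 301 and location is not None and location.startswith(canonical)

# Usage (placeholder domain):
#   status, location = fetch_redirect("example.com")
#   print(is_permanent_redirect(status, location, "http://www.example.com"))
```

A 302 (or no redirect at all) here would leave the obsolete URL competing with the canonical one, which is exactly the situation being described.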
Where do you see that?
I would assume that it is NOT in [gfe-eh.google.com], right?