I've made no changes
I don't have google sitemaps
Homepage MIA for singular term
OK for plural.
(This happened before this year.)
Is there any way to contact Google regarding sites being dropped?
|The point is simple, it's a huge risk to put all your eggs into one basket if you have a sound business plan. Especially if the eggs are going in something that you have zero control over as many of us are finding out with the "refreshes". |
If you assume for a minute that we have all heard about diversification.... ;)
Anyway, in Europe I can't really see a way around Google. What we do to diversify traffic sources is make videos and suggest links to Wikipedia, where most visitors stay. Still, this has to be done sensibly so it's not spam.
Of course I should have become a builder or something to make money ... but well ... too old now.. sigh ..
Google supplemental hell index,
I have a personal website to display my photography, and I've been trying to keep the design minimal and clean without much text content. The problem I'm having is that I'm continually in the Google supplemental hell index and getting no traffic. I would prefer to keep the interface uncluttered and text free; however, the conclusion I'm coming to is that I will have to add more explanatory text.
Being a newbie, I attempted to build my site using just CSS, so it could be a formatting issue. I also share an IP address, so I'm not sure whether that could be an issue as well.
I've tried to make the title, keywords, and image alt tags as descriptive as possible where there is no text content on a page, tweaked individual pages, and set up a Google sitemap. I'm crawled continually by the major search robots, and I notify Google Sitemaps whenever my website has been updated. But none of the above has solved my problem. Please have a look at my site www.dig-i-tal.com .
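For reference, notifying Google Sitemaps of an update was just an HTTP GET to a ping URL at the time. A minimal sketch of building that request, assuming the classic ping endpoint and a made-up sitemap location:

```python
import urllib.parse

# Google's Sitemaps ping endpoint, as documented at the time.
PING_ENDPOINT = "http://www.google.com/webmasters/sitemaps/ping?sitemap="

def build_ping_url(sitemap_url):
    """Return the GET URL that tells Google the sitemap has changed."""
    # The sitemap URL must be fully percent-encoded as a query value.
    return PING_ENDPOINT + urllib.parse.quote(sitemap_url, safe="")

# Hypothetical sitemap location -- substitute your own. Fetching the
# resulting URL (e.g. with urllib.request.urlopen) submits the ping.
print(build_ping_url("http://www.example.com/sitemap.xml"))
```
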
Once again, we're back to "as retrieved on Aug 19, 2005"
This is just ridiculous. Are they EVER going to get rid of this crap? At this point it's just a joke.
First we heard about the new option in Sitemaps for the preferred domain issue, and that hasn't been fixed yet. Then we heard that all supplemental results are going to be only 2-3 months old, and Matt himself said everyone should have results from March, and now that isn't even correct.
It's going from frustration to pure anger at this point.
"In the first case I mentioned above -- the footer links -- the webmaster backed off on those keyword links and saw upward movement within a few days."
I did a friendly search this morning for "increasing traffic adwords" and the first result was a page so filled with keywords at the bottom that I was a bit shocked. Google fighting spam? Doesn't seem so to me. They even outrank Google. This is a prime example of what people tell you NOT to do, yet this page ranks number 1 and beats Google.
What I don't understand is why they try to filter spam using an algorithm. It will never work. People will always try to cheat the system. Why don't they just hire people who can actually think and have them manually apply penalties to blatant cheaters?
Would it be so hard to add a spam button to the Google toolbar so people could quickly flag junk sites? Then once a site is flagged, somebody at Google reviews the site and makes a call as to whether a site is in violation or not?
"Another observation - it appears for all our major keywords we cannot get above position 30 on the rankings. We see some increases up to postion 31, and then they bounce back to 50 +. Start increasing up to 31, and then drop back again. Anyone else seeing this effect?"
YES! This has been the case for most new people getting hit either in April, June, July or recently August. It seems to be the "page 4,5,6+ penalty". It is clearly some penalty, since it's always in that page 4,5,6 results area.
Thank god for the trickle of yahoo and msn rankings that keep my site at around 20% of normal traffic. Currently since this last August disaster I have ZERO google traffic.
Long live spam. :S
|Would it be so hard to add a spam button to the Google toolbar so people could quickly flag junk sites? |
The spammers would use it to report quality sites as spam, by the thousands.
|YES! This has been the case for most new people getting hit either in April, June, July or recently August. It seems to be the "page 4,5,6+ penalty". It is clearly some penalty, since it's always in that page 4,5,6 results area. |
The search phrases I checked that got dinged with the July 27th update all fell to around page 6 from page one. Some have recovered as of Aug. 17th, but not many.
|What I don't understand I why they try to filter spam using an algorithm. |
For the word "the" there are 22,280,000,000 indexed documents. That's over 22 and a quarter billion documents. Even if you used an algorithm to isolate spam suspects, the massive number of documents would make human review impractical from time, human-resource, and financial perspectives.
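The scale argument above can be made concrete with a back-of-envelope calculation. The review rate and work-year figures here are hypothetical round numbers for illustration only; the document count is the one from the post:

```python
# Why manual review of the whole index is impractical.
docs = 22_280_000_000    # indexed documents for the word "the" (from the post)
docs_per_hour = 60       # assume a reviewer judges one page per minute
hours_per_year = 2_000   # roughly a full-time work year

reviewer_years = docs / (docs_per_hour * hours_per_year)
print(f"{reviewer_years:,.0f} reviewer-years to see every page once")
```

Even with these generous assumptions, the job works out to on the order of 185,000 reviewer-years, which is why sampling plus algorithms is the only workable approach.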
That does not mean that Google's algorithms are not human-based.
The accounts I have studied report that Google uses human reviewers to analyze a sampling of the index. The data generated or the graded documents are further reviewed for statistically relevant factors and the results are used to create and to tweak Google’s algorithms. (A set of guidelines for document reviewers was actually leaked a few years ago.)
There is additional surmising, albeit less supported, that Google uses the data from its book scanning and other non-web data collection to further refine its natural-text analysis. The intuitive reasoning behind this deduction is that books do not contain spam, so they offer Google a more reliable data set.
|"Sites" do not go supplemental: individual URLs do, or don't. |
I can show you a few complete sites that are 99.9999999% supplemental except the home page.
According to something Matt Cutts recently said, it appears having the same meta description on every page causes Google to assume the pages are all about the same thing and lump them into one big heap of supplemental dung.
Sure enough, when I checked those "supplemental sites" they all had the same meta description on every page.
Luckily, those weren't MY site, they were competitors sites ;)
"According to something Matt Cutts recently said, it appears having the same meta description on every page causes Google to assume the pages are all about the same thing and lump them into one big heap of supplemental dung."
Can anybody confirm this? If this is what's causing the problem, it's not that hard of a fix. I don't have the same meta description on every single page, but a lot of my pages that cover the same topic have the same meta description as each other.
"The spammers would use it to report quality sites as spam, by the thousands."
If they reported quality sites as spam, it would do no good, because a human would review each flagged site and make a decision. So flagging an innocent site would do nothing.
Secondly, if somebody was repeatedly flagging innocent sites, couldn't Google just block any flags from that IP/user?
A system like this would go a long way in clearing out spam on Google. An algorithm will never get rid of spam. Just look at how much crap still gets into your email with "spam filtering" applied.
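The flag-then-human-review pipeline proposed above can be sketched in a few lines. Everything here is hypothetical (class name, thresholds, the idea of blocking abusive reporters) since no such toolbar feature existed; it just shows that the objection about spammers mass-flagging quality sites is addressable:

```python
from collections import defaultdict

class SpamFlagQueue:
    """Sketch of the proposed toolbar-flag -> human-review pipeline."""

    def __init__(self):
        self.flags = defaultdict(set)   # url -> set of reporter ids
        self.blocked = set()            # reporters whose flags are ignored

    def flag(self, url, reporter):
        """Record a spam flag, unless the reporter has been blocked."""
        if reporter not in self.blocked:
            self.flags[url].add(reporter)

    def block_reporter(self, reporter):
        """E.g. after a reviewer catches the reporter flagging innocent sites."""
        self.blocked.add(reporter)
        for reporters in self.flags.values():
            reporters.discard(reporter)

    def review_queue(self, min_flags=3):
        """URLs with enough independent flags to merit a human look."""
        return [url for url, r in self.flags.items() if len(r) >= min_flags]
```

The key point is that flags only queue a site for review; nothing is penalized until a human makes the call, so false flags cost the flagger, not the site.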
|having the same meta description in every page causes Google to assume the pages are all about the same thing |
Yes, I can confirm this, and in quite a few cases now. And the fix has been as simple as adding unique and page specific meta descriptions. (Or maybe not so simple, in some dynamic cases.)
It didn't used to be this way, but I first stumbled onto the problem for one client site last fall (and fixed it) -- and then other members here started confirming. It doesn't always mean a Supplemental tag -- sometimes it just shuffles everything off into the "Omitted Results" link.
There are definitely many other cases of "going supplemental" that do not involve identical meta descriptions and/or titles, however.
One of the reasons, not the only one.
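Since the reported fix is making descriptions unique and page-specific, a quick way to audit a site is to group its pages by meta description and list any description shared by more than one URL. A rough standard-library-only sketch; the page list and HTML passed in are hypothetical:

```python
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pull the content of <meta name="description"> out of a page."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

def meta_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description

def duplicate_descriptions(pages):
    """pages: dict of url -> html source. Returns description -> urls
    for every meta description shared by more than one page."""
    groups = defaultdict(list)
    for url, html in pages.items():
        desc = meta_description(html)
        if desc:
            groups[desc].append(url)
    return {d: urls for d, urls in groups.items() if len(urls) > 1}
```

Any group the last function returns is a candidate for rewriting, per the advice in this thread.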
I haven't had similar descriptions on my site for a few years now. Still, data refreshes play havoc and supplementals come and go.
Has anyone else chosen a preferred domain in Sitemaps? I did as soon as the option was available, and it showed a date of Aug 5th, but now I just checked it again and I see no date, and the option to choose is available again.
EDIT: I saw Vanessa state they are working on sitemaps right now.
I did on one site (one of my personal sites, as opposed to a client site). I haven't noticed anything different. For some reason the non-www version has a better PR for the home page than the www version, and that's still the case. But there have been no ranking/inclusion/site: ramifications as far as I can tell.
Identical meta description/title is one major factor, but as others said, that's not the only reason. If it was, it would be easy to crawl out of the supplemental index.
The fact that once a page is tagged as supplemental you need the supplemental Googlebot to come around and recrawl your supplemental URLs doesn't help, since it used to come around only every 6 months (or so I've heard).
I also think once a site goes heavily supplemental, you need to regain some trust with Google to get pages back into the main index (i.e. organic inbounds/PageRank). It would be one way Google guards itself against 100,000,000 page spam sites sitting in the supplemental index, and preventing any periodical on-page tweaks from reinjecting the site into the main index.
Also, Vanessa Fox recently commented on Google Groups (Crawling/indexing/ranking) regarding a directory type site with barely any text on the category pages:
|You should also take a look at your site and make sure it provides unique content. Most of your categories don't seem to have any content. You'll need your pages to have value in order to get them indexed. |
You have got to be kidding me. December 2004 pages back AGAIN.
Is Google really this inept?
>>>>December 2004 pages back AGAIN.<<<<
Would that be pre-Florida? That is the time the real demise at Google began.
Patterns in titles and meta descriptions result in omitted results, not supplemental.
Pages are 'pushed' into supplemental when there is not enough PageRank to keep them in the main index (I'd like someone to show me some supplemental pages with a half decent PR of say 3 or above).
Quite often this is down to a poor linking structure. However, as far as I can see, the level of PR required to keep pages in the main index just went up, which has caused a whole host of problems, especially if you had links from many low-PR pages, as with many of the sites I work with.
Hopefully it's just a glitch or something but this is what it looks like to me.
The higher-PR websites I work with are not experiencing any difficulties whatsoever.
>> I've been trying to keep my web site design minimal and clean without much text content. <<
If there is not much to index on each page, and it shares many words of the same content across multiple pages, you will have many pages deindexed or supplemental. You need to increase your text content to give search engines something to index.
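One rough way to check whether pages share too many of the same words is to strip the tags and measure how much of one page's vocabulary reappears on another. A crude sketch; the regex-based tag stripping and the sample pages are illustrative only:

```python
import re

def words(html):
    """Very rough: drop tags, lowercase, split into a set of words."""
    text = re.sub(r"<[^>]+>", " ", html)
    return set(re.findall(r"[a-z]+", text.lower()))

def shared_ratio(page_a, page_b):
    """Fraction of page_a's vocabulary that also appears on page_b."""
    a, b = words(page_a), words(page_b)
    return len(a & b) / len(a) if a else 0.0

# Hypothetical example: a boilerplate-heavy page and a near copy of it.
p1 = "<p>Fine art prints gallery home contact</p>"
p2 = "<p>Fine art prints gallery home contact about</p>"
print(f"{shared_ratio(p1, p2):.0%} of the first page's words reappear")
```

If most pairs of pages score near 100%, there is very little unique text per page for a search engine to index, which is the situation described above.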
>> Once again, we're back to "as retrieved on Aug 19, 2005" <<
>> We hear that all supp results are going to be only 2-3 months old and Matt himself says everyone should have results from March and now that isn't even correct. <<
You are looking at [gfe-eh.google.com ], aren't you? That one has been cleaned up a lot. Other datacentres will probably follow a month or so from now. Don't pay too much attention to the others for the moment.
All the supps for me are listed "as retrieved on 18 Aug 2005" or thereabouts. Just over a full year of roll-back.
No wonder we're having customers call about items that were discontinued. We've been answering so many calls... this does explain it. I'll be so glad to get rid of those. A lot were pages that got spidered under SSL, so I assume they got tossed into supplemental because of dupe content.
>> "According to something Matt Cutts recently said, it appears having the same meta description on every page causes Google to assume the pages are all about the same thing and lump them into one big heap of supplemental" <<
I have been saying something about that for the past couple of years; and Matt Cutts confirmed it [threadwatch.org] a few days ago. It wasn't so much about being Supplemental, but instead was more about some results in a site: search being hidden away, and only appearing after the link in the "repeat the search with the omitted results included" message was clicked. In some cases it did also involve some supplemental pages too.
"(I'd like someone to show me some supplemental pages with a half decent PR of say 3 or above)."
I've got about ten at PR4 or higher. Those are results with supplementals parallel to a full listing. Then there are also examples of pages like www versus non-www with a 301 in place, where the obsolete URL could be even PR6.
Any chance you could sticky me mr b?
>> You have got to be kidding me. December 2004 pages back AGAIN. <<
Where do you see that?
I would assume that it is NOT in [gfe-eh.google.com ] right?
"I'd like someone to show me some supplemental pages with a half decent PR of say 3 or above"
This has nothing to do with it. I literally have thousands, some PR 4 and 5 as well.