Forum Moderators: Robert Charlton & goodroi
What do we really know about this penalty, what causes it, and most important of all, how do we fix our sites to restore normal rankings?
An individual page all but disappears for its most obvious search term, like 'ancient widgets', yet if I pick something else from the page, like 'brass widgets', it ranks in the first 2 or 3 pages even though brass widgets get only a minor mention. It is as if the word 'ancient' is blocked, but only for that page. In fact, if you search 'ancient widgets', another page from my site comes up even though ancient widgets are barely mentioned on that page. Pages from other sites that have linked to my ancient widget page also come up in the first few pages of the SERPs.
So I think it might be a one-word penalty on that page only. Since this wasn't happening earlier, it could be an algo change. I'm wondering if it might be keyword density: in my topic it is often necessary to repeat a word frequently when writing an article. I have tried decreasing the density on one page, which made for some awkward writing. It is too early to tell if it will solve the problem.
I don't think affiliate links are the answer, as I have exactly the same affiliates on pages that haven't plunged in the SERPs. I also don't think it has anything to do with high-value keywords, as the words that are hurting my pages would only be of interest in my niche topic. My sites are not retail either. I have links from .gov and .edu sites as well, so that is no protection. This penalty also seems to have nothing to do with PR; the pages still have the same PR3 or 4 they had before.
It seems far more common among niche authority sites.
The trigger seems to be repetition of keywords in the page title and in headlines on the page.
I need to go back and see how often I use the two words together, like 'ancient widgets'. Maybe it is the words in combination.
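One quick way to run that audit is to count single-word density and exact-phrase occurrences separately. A minimal Python sketch, assuming the page copy has already been extracted to a plain string (the sample text and terms here are invented for illustration):

```python
import re

def keyword_stats(text, word, phrase):
    """Count single-word density and exact-phrase occurrences in page text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    total = len(tokens)
    word_count = tokens.count(word.lower())
    phrase_count = text.lower().count(phrase.lower())
    return {
        "total_words": total,
        "word_count": word_count,
        "density_pct": round(100.0 * word_count / total, 2) if total else 0.0,
        "phrase_count": phrase_count,
    }

# Made-up sample text standing in for a page's extracted copy.
page = ("Ancient widgets are prized by collectors. The best ancient "
        "widgets come from brass widget workshops. Ancient lore aside, "
        "widgets endure.")
stats = keyword_stats(page, "ancient", "ancient widgets")
# stats -> 20 words total, 'ancient' 3 times (15% density), phrase twice
```

Running this over the penalized page and a healthy page side by side would show whether the single word or the two-word combination is the outlier.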
I agree with Andy. It's insane that people with informational sites that have been around for years now have to pick through individual pages and change them so they won't carry this penalty. I've spent 3 days on this instead of writing an article that I've spent weeks researching. Does Google want unique and informative content added to the web or not? I'm just spouting off here. I think we are getting caught in an algo meant to catch scrapers, but it's catching a lot more pages than that.
I've made some changes, and I guess we'll see if it makes any difference. Now that I think of it, one of my pages that had this penalty a few months ago got its rankings restored shortly after I changed some titles.
title: Headline and Keywords
meta-keywords: Keywords
meta-description: Headline
bread crumb navigation: Keywords
H1: Headline
content: unique self written content
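For illustration, here is roughly what a db-driven template following that layout produces. Everything here is a hypothetical sketch, not anyone's actual code, but it shows how a single headline string and a single keyword string end up repeated across several of the "critical areas" at once:

```python
def render_head(headline, keywords):
    """Sketch of a db-driven template like the one described above
    (field names hypothetical): the same headline and keyword strings
    land in the title, metas, breadcrumb, and H1 simultaneously."""
    kw = ", ".join(keywords)
    return "\n".join([
        f"<title>{headline} - {kw}</title>",
        f'<meta name="keywords" content="{kw}">',
        f'<meta name="description" content="{headline}">',
        f'<div class="breadcrumb">Home &gt; {kw}</div>',
        f"<h1>{headline}</h1>",
        "<!-- unique self-written content follows -->",
    ])

html = render_head("Ancient Widget Collecting",
                   ["ancient widgets", "brass widgets"])
# The headline appears 3 times and the keyword string 3 times
# before a single word of unique content is reached.
```

If repetition across these areas really is the trigger, a template like this repeats every page's key terms in lockstep, which may be exactly what an algo tuned against scrapers would flag.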
Up to now, some pages went from #1 or #2 to #8 or #9. Others went from #2 down to #43, and some are lost somewhere in the Googleplex.
Are we seeing a war on sites with db generated templates or on sites that look like they have db generated templates?
not sure why this is a war G thinks is worth fighting, but anyway...
it puts those of us with those kinds of sites in a difficult position...not just little guys but giants, too, are affected by this...
i mean, i guess that we could all trash the existing cat > subcat > listing navigation which has built much of the web...but won't this just mean that everyone is scrambling to replace what already works with something less clear, less logical and less efficient...and frankly spammier?
Will we have to put fields in our dbs for some kind of handwritten descriptions, titles, and anchor text? Find new ways to manipulate our data to make titles that are sufficiently quirky and non-standardized to pass muster with G? Stop slicing data into little bits to feed to users, but give it all to them on a big plate?
I am really not sure what direction to take, where this is going or how users are better served, anyone have any ideas?
The only strategy that occurs to me is to tweak a bit and wait for Matt to turn down the dials. Oh, and maybe it's time to get back off the wagon, too.
[edited by: PhattusCattus at 11:21 pm (utc) on Jan. 20, 2007]
Hopefully adjustments will be made, or we can figure out for sure what it is that's causing the penalty. Part of the difficulty is that some of the people posting here may have some other sort of penalty. If you have lost all or most of your site, I suspect some other factor has caused it.
I did check the idea that it might be combinations of words being penalized. But whenever the one word that seems to be the problem appeared in the search phrase, the missing page was lost in the SERPs. I'm pretty certain it is one word that causes the problem for the page.
The thing that really pisses me off about (one type of) this stupid penalty is that I have been adding pages below the penalized directory index page, and in every case the new pages have been penalized too. So it's a certainty that it isn't a keyword or optimization penalty attached to the new pages. The penalty is hitting a directory, not really even a page.
I'm either too stubborn to add the pages in the wrong directory to avoid this penalty, or too delusionally faithful that somebody at the 'plex will FIX THEIR IDIOT SEARCH ENGINE one of these days, and then everything will be in the logical place.
It just doesn't identify the page properly.
Am I missing something? How do you do this without repeating the keywords in these critical areas?
Here's some additional information and my story, for what it is worth:
We have a site I started in 1998, with a PR 5/6 (seems to fluctuate recently).
We are still ranking at #1 for our two top-traffic terms, but we have dropped off of the radar (to the 80's or 90's) for almost everything else.
A couple of our important terms will still show us as #1 IF I put quotes around them.
For example, we currently rank at #88 for area widgets (no quotes), but #1 for "area widgets" (with quotes). Any ideas on this phenomenon?
I am also anxious to see if anyone is slowly showing improvement. Having weathered several of these updates, I am optimistic overall but I haven't seen this exact thing happen before with our site.
One site, which actually is listed below my pages without quotes, had one of their pages jump to 300-something, way ahead of my pages.
I've also noticed in my logs that people seem to be digging deeper into Google's results. I'm seeing a slight increase in referrals from Google, but the pages are listed at #111, #223, etc., in the SERPs. Could this mean people are having to go deeper into Google's SERPs to find what they want?
I'm sure it's not competition, there just isn't that much in this niche. If people are having to click that deep to find what they want, it doesn't bode well for Google.
For everyone, I also need to add that although the article pages that are all but missing from my site seem to be tied to a keyword, the index page linking to them is now in deep trouble too, and that page's problem doesn't seem to be related to any keyword at all.
So how can the average position be #9?
I'm beginning to think something at Google is seriously screwed up. Or, maybe it's just Webmaster Tools. It's all very strange.
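One possible (purely hypothetical) explanation for an average position of #9 that nobody ever observes: if the figure is a mean over many samples, a volatile page can average into the top 10 while rarely sitting there when you actually check. A toy illustration with invented numbers:

```python
def average_position(positions):
    """Mean of ranking positions sampled across queries or over time.
    A volatile page can average into the top 10 even if a searcher
    checking at any given moment rarely finds it there."""
    return sum(positions) / len(positions)

# Invented samples: mostly near the top, one deep drop.
samples = [2, 3, 4, 27]
avg = average_position(samples)  # 9.0
```

If Webmaster Tools averages over samples like this, a reported #9 and an observed #27 are not necessarily contradictory, just frustrating.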
[edited by: AndyA at 8:18 pm (utc) on Jan. 21, 2007]
If you were doing a search for "How to Make Widget Pie," you would expect to find a page with that exact subject in your results. If it was in the title, you as the user would know you were on the right track. If it was in the description as well, you would be doubly confident in clicking that link. If, when you click the link and open the page, you see a heading "How to Make Widget Pie," you know you have hit pay dirt. Now all you have to do is print out the article, buy the ingredients, come home, and bake that delicious pie.
That being said, there must have been some big misstep at Google that they will fix, because if everyone is finding results for making cherry, apple and blueberry pie instead of widget pie, they might begin using other search engines.
Any duplicate content found by performing searches with sentences in quotes?
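One way to run that check systematically is to pull the longer sentences from a page and turn each into a quoted search string to paste into the search box. A rough Python sketch; the word-count threshold and query limit are arbitrary choices, and the sample text is made up:

```python
import re

def quoted_queries(text, min_words=6, max_queries=5):
    """Turn a page's longer sentences into quoted search strings
    for hunting down scraped or duplicate copies elsewhere."""
    queries = []
    for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
        sentence = sentence.strip().rstrip(".!?")
        if len(sentence.split()) >= min_words:
            queries.append(f'"{sentence}"')
        if len(queries) == max_queries:
            break
    return queries

# Made-up page copy; short fragments are skipped automatically.
qs = quoted_queries("Ancient widgets were cast in brass by village smiths. "
                    "Short line. Each widget carried the maker's mark on its base.")
```

Searching each returned string in quotes should surface any site carrying the same sentences verbatim.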
I believe this may have something to do with a regional bug in Google that cannot determine the correct origin of the domain and misplaces it, so the filter is perhaps a regional misplacement filter.
For people who have niche sites, would you be able to estimate the number of RELATED phrases that are being used, of different lengths?
i mean, i guess that we could all trash the existing cat > subcat > listing navigation which has built much of the web...but won't this just mean that everyone is scrambling to replace what already works with something less clear, less logical and less efficient
That might be the way it's going. I've contributed on a separate thread on this topic: What you have to remember is that Google does not want structured points of reference on the web. These serve to undermine the importance of a search engine. What Google wants is to encourage pages to be published in the most unstructured manner possible, so that only search engine technology can be used to find content.
Google giving priority to meta tags?
My new company's official website, which has not yet been launched and still shows an "under construction" page, has been spidered by Google. I have not submitted the URL to Google or to any other search engine or website, and there are no links from other sites. How is this possible?
If you browse to the URL with the Google Toolbar's PageRank display enabled, the URL is transmitted to Google, and their bot can pick it up for indexing. I have read of cases where this was the only way Google could have learned about the site.
But if I search for "97 Wid" (Wid being an accepted and widely-used slang name for Widget) only one page on my site comes up, at #154.
In every example, the same site is at #1.
Doing these additional searches did turn up pages that didn't come up in other searches, so there is some relevancy, although in each case a lot of pages were included in the SERPs that had little to do with the specific search. They might have 0097 on them, and they might have had "Widget" on them, but not in the same sentence, and not used together at any point, e.g., 0097 Wimple or 2157 Widget. In that context, those pages have no relevancy to the search.
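The "same sentence" distinction drawn here is easy to script when auditing your own pages. A minimal sketch, with invented sample text echoing the model-number examples:

```python
import re

def cooccur_in_sentence(text, term_a, term_b):
    """True only if both terms appear together in at least one
    sentence -- a rough proxy for the relevancy distinction above."""
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        s = sentence.lower()
        if term_a.lower() in s and term_b.lower() in s:
            return True
    return False

# Invented sample text: only the last sentence pairs 0097 with Widget.
page = ("The 0097 Wimple sold well. A 2157 Widget followed. "
        "The 0097 Widget came last.")
```

Here `cooccur_in_sentence(page, "0097", "Widget")` is true, while `cooccur_in_sentence(page, "0097", "2157")` is false, matching the intuition that "0097 Wimple" pages shouldn't rank for "0097 Widget".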
Is anyone showing their search terms ranking in Webmaster Tools? I have several search terms ranking in the top 10-20 in Webmaster Tools, but when I click on the links, they are nowhere to be found. A few are also showing up as most-clicked queries, and again, when I look for them, they do not exist, or are buried at the bottom of the SERPs.
I'm wondering how Google can say a search term shows up at #4 or #9, yet it has never shown up in those spots over the past few weeks that I've been monitoring them.
Webmaster Tools issue? Or am I not seeing something that Google is seeing?