Forum Moderators: open
Kackle - can you explain the "dictionary" for me? And how I might benefit from it - I'm reading your posts hard but don't see where you're coming from.
Sure. But you have to act quickly. Google will fix this one just like they fixed the hyphen.
1. Google is demoting pages/sites that are over-optimized for certain keywords or keyword combinations. It does this by looking up search terms in a dictionary of target keywords or keyword pairs that it has compiled. This dictionary is top secret, because if you knew what was in it, you could avoid those words in your optimization efforts.
2. If the search term or terms hit on a dictionary entry, the search results for that user's search are flagged. This means that before the results are delivered, the order of the links, or even the inclusion of links, is adjusted so as to penalize pages that have over-optimized for those terms. Most likely the title, headlines, links, and anchor text are examined. It's possible that external anchor text pointing to the page has also been pre-collected and is available for scanning, but this is much less likely. (Besides, external links are not something within your immediate control, so don't worry about them right now.)
3. You want to find out which of the keywords relevant to your site are in Google's dictionary. Compile as many relevant keywords as you can think of that searchers might use to find your site. Now take these words singly and in pairs, according to how users might search. Run two searches for each combination and compare the results.
4. If the results are strikingly different for the pre-filter and the post-filter search on a particular term or combination of terms, it means that some variation of those terms has been flagged because something was found in Google's dictionary.
5. Do lots of searches and you can come up with a list of "sensitive" words that you'll want to avoid when you re-optimize your pages.
It's a nice weekend project.
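The comparison in steps 3-5 is easy to script once you have the two result lists in hand. A minimal sketch in Python: `normal` and `bypassed` are made-up result lists standing in for the two searches, and the 0.3 threshold is an arbitrary guess on my part, not anything Google has published.

```python
def overlap_ratio(results_a, results_b):
    """Jaccard overlap: shared URLs over all URLs seen in either list."""
    set_a, set_b = set(results_a), set(results_b)
    if not set_a or not set_b:
        return 0.0
    return len(set_a & set_b) / len(set_a | set_b)

def looks_flagged(filtered_results, unfiltered_results, threshold=0.3):
    """If the two result sets barely overlap, the term may be in the 'dictionary'."""
    return overlap_ratio(filtered_results, unfiltered_results) < threshold

# Example with made-up result lists for the same keyword pair:
normal = ["a.com", "b.com", "c.com", "d.com"]
bypassed = ["x.com", "y.com", "z.com", "a.com"]
print(looks_flagged(normal, bypassed))  # very different results -> True
```

Run that over your whole keyword list and the terms that come back flagged are your candidates for the "sensitive" list in step 5.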
These sites had about 300-400 pages of excellent content related to "city keyword". No spam techniques were used. I can say that my competitors had a pretty similar setup, and even though they are my competitors, I must be honest and say they also had excellent content and didn't spam.
Now that these sites are removed, the first two pages of SERPs are dominated by sites unrelated to the industry: mainly library, newspaper, university, or radio sites that may mention a "city keyword1" site once or link to one. This could imply that PR is becoming a dominant ranking element.
I personally believe Google is filtering optimized pages. The problem I see with doing this is that optimized sites are generally very relevant and usually have excellent related content. SEO is not a bad thing; we (SEOs and the search engines) are working toward the same goals. It is a shame that Google does this so close to Christmas. I imagine thousands of site owners will be affected.
I'm seeing exactly the opposite.
That's what's so confusing about this update: every theory can be disproved by another set of keywords.
For my main keyword phrase, PR1s (I never knew there was such a thing) are in the top 20 of a very competitive category that used to be dominated by PR5-6s and an occasional PR4, but never below PR4. Now, in the top 20 results there are 2 PR1s, 2 PR2s, 4 PR3s, 10 PR4s, 1 PR5, and 1 PR6.
If the cause is 'over optimisation' - well, we're smart people - smart enough to de-optimise!
I think maybe it is a second update - and we're just going through the nonsense stage again.
<edit: clarification: by 'nonsense' I don't mean the contents of this thread - I mean Google throwing up nonsense! >
[edited by: superscript at 3:26 pm (utc) on Nov. 22, 2003]
All our sites are full of content, hundreds of pages each, all different content and different categories. Our thought was that -va was in fact using a content algo of some sort, since we and other competitors ranked well if the site contained pages of content.
Another thought: Google obviously monitors this forum, so in future, before going live, why not stage the update on one particular datacenter and use our feedback first? Isn't that called brainstorming, using many more people than you could fit in a room? With anything on this scale, some factors get missed or not thought of, and with the diversity of site builds and SEO techniques, some algo factors, as can be seen, do not work when applied so generally.
Also, there is the idea of rules. Up till now, Google has had defined rules, filters, etc. that have been pretty much deciphered by the SEO community. If you view this update as not having rules, it certainly seems to make more sense.
Google, give me back my "money keyword" backlinks that I paid dearly for, or stop encouraging SEO people to get inbound links.
The nice thing about this being a filter is all Google has to do is flip a switch and everything goes back to normal.
One of my 3 index pages has improved its ranking somewhat after getting hammered last weekend by Florida. Up until last week, I was regularly ranked at #3, 4, or 5 for my index page's target keyword (for 9+ months). For most of the last week, since Florida started, my page has been down around 460 or so. This morning it is about 250, and the top 10 looked very different from the past few days.
Another index page, for a subdomain, is showing similar results. Last week, #3. Most of this past week since Florida, around 250; today it is about 180, and the top 10 are very different.
If Google is judging a keyword combo by the entire site and not by a single page, that may explain why so many ecomm sites are dropping like a rock.
Look around, then. Our main site with around 200 pages was hammered. A smaller site with around 15 kept its position.
You know what replaced our main site? A one-page site with the search term in its title and H1, and no content but AdSense ads!
I am not convinced that we are looking at a dictionary (it lacks the elegance that Google likes so much), but there is certainly something. Great discussion now that all the whining has gone away.
WBF
One thing I have seen that is consistent for many different searches is that the results exactly match the directory results if there are any for that given term. This has been mentioned but not really talked about.
If this is the new way they are blending the algo with the directory, then think of the ramifications. They have only dumped the DMOZ results a couple of times a year lately. So now it can take six months or longer to get a site listed, if you can get it into DMOZ at all.
Yahoo does the same thing now: when searching there, they replace the title tag with your directory listing if you're in it. I feel this will play a much bigger role once Ink is incorporated, thus giving value back to paying to be in their directory.
If Google has decided to put their future in DMOZ, it is a slippery slope. DMOZ is rife with corrupt editors. When you get spam from a DMOZ editor telling you their position there can help yours in Google, then there is a huge problem.
I can only hope this is some sort of algo test for Google and they don’t go solely on their directory listings. If they do, they will not be as strong a player as they have been in the past.
That’s my one update thread post. Back to lurking.
We removed keyword1 keyword2 from the title and from the H1, and reduced the keyword density significantly, repeating this phrase only once on the index page (<5%). Googlebot visited us, and today we are showing fresh results with NO improvement in SERPs.
We have only 3 external links to the site that use keyword1 keyword2 in them. The index page has a PR4.
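For anyone wanting to measure their own pages the way described above, keyword-phrase density is simple to compute: phrase occurrences times phrase length, divided by total words. A minimal sketch in Python; the <5% figure is just the number mentioned in this post, not a confirmed Google threshold, and the sample text is made up.

```python
import re

def phrase_density(text, phrase):
    """Fraction of the page's words consumed by occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Slide a window of the phrase's length over the word list.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return (hits * n) / len(words)

text = "keyword1 keyword2 widgets and more keyword1 keyword2 deals here"
print(f"{phrase_density(text, 'keyword1 keyword2'):.1%}")  # 44.4%
```

Run it on your index page's visible text and you can see whether a de-optimization pass actually brought the phrase under whatever target you picked.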
And for the first time in a year, from a strictly seo point of view, the thing to do now is not get more and more links links links, the thing to do is to get more and more content on pages.
It's all perspective, but this does not seem to be the case across the board. Adding content has always been an SEO mantra around here.
I get the gist of what you are stating. I can see 'informational' sites getting a boost for more generic, broad phrases like keyword1 keyword2, but [b]not[/b] for something like 'industry name company'. <-- That has happened in my little neck of the woods. Luckily the OV CTR has been excellent.
WB
[uweb.superlink.net...]
If you need more info, you can PM me as I don't keep tabs on threads like these.
I think agent10 was right to say that Google should use this forum's voice before making this type of index live. This new index will kill many legitimate sites and has made the SERPs for the most part irrelevant.
If Google isn't finished updating, then it shouldn't make the results live. Test, test, and test again should be their method. I believe the current situation will hurt Google. People have little patience and will not tolerate such useless search results.
I'm starting to think giving out more links might be a good idea. Directories seem to be ruling the day for "area widgets" at least. Great time to shop for niche directories!
>Look around then. Our main site with around 200 pages was
>hammered. A smaller site with around 15 kept its position
One of our 6000+ pages sites has dropped 3 pages (to page 5 of SERPs) and our one page site has moved from #11 to #3.
Lots of our 10-15 page sites have not moved at all and some have moved up. Several 25-100 page sites have moved up or down or stayed the same.
Size does not appear to matter by itself.
It does cause confounding effects.
For example:
widget widget <country of the widget> (widget being something every site needs unless you do it yourself)
In this example what seems to be prevalent is lists & pay for inclusion lists of widget companies. (Interestingly the customers of these pfi directories are faring better in updateFlorida)
So once you have the authority sites for your keyword, do sites mentioned on the authority site (it doesn't have to be a link) rank better?
Tip, you can preface your authority search with "resources for....."
[Hey - if my site goes from #3 to #800 they must be silly results ;-) ]
=======================
If this were the case, then what about the overly optimized sites/pages with "less popular" keyword phrases? With less popular and non-generic phrases, it does not appear that the sites or pages are dumped by Google. IMHO, Google does not impose any filter based upon "on-page" factors alone. If the keywords are popular and competitive, then a filter is applied based upon "on-site" factors.
You aren't looking very hard then. If the above was true, all brand.com sites would get penalized for "brand".
>Look around then. Our main site with around 200 pages was
>hammered. A smaller site with around 15 kept its position
>> One of our 6000+ pages sites has dropped 3 pages (to page 5 of SERPs) and our one page site has moved from #11 to #3.
================
I second that...
>>> Lots of our 10-15 page sites have not moved at all and some have moved up. Several 25-100 page sites have moved up or down or stayed the same.
>>> Size does not appear to matter by itself.
================
Are your targeted keywords generic and super competitive?
The general trend right now does not seem to support that.
We all put hard work and time into our sites to make them relevant and good resources for people (yes, so we can profit, but that's fair). And remember, Google also relies on our sites to attract traffic to its search engine. Yet with the snap of a finger they can take it all away from us overnight.
What is the point? Were their search results not relevant?
Now we are all going to have to take the time (invest more money) to rework our sites so that Google likes them again.
And in the end it will be the same relevant sites at the top that have always been there.
I don't understand.
Should I support a company that does this?
I don't want to have to re-write every word on my sites. I can't afford to. Then again, I can't afford not to.
Yes my sites are relevant for certain searches, but now all traffic is gone. Why? Just because Google suddenly thinks sites that don't use these keywords are somehow more relevant?
I don't understand.
As far as my own personal search engine queries go, I hate to do it, but I have already begun searching at alltheweb.com, where I know I can get results similar to what I was getting at Google a week ago.
I know some of you are going to say I should not complain because the traffic I was getting from Google was free.
But ask yourself was the traffic really free?
I mean, consider the time and effort that goes into making the sites all nice and pretty for search engines.
Everyone with the work they do on their sites helps to make Google what it is. Without our sites, Google would have no results to show.
But when one player like Google controls so much (with their results also at Yahoo) I think it is questionable for them to be able to just flick a switch that turns out the lights like this across the board.
Many of us commercial sites have employees we pay based upon the traffic level we were expecting. But now the traffic is gone, and we still have to pay these people.
Am I supposed to freak out and start putting in time to frantically rearrange my site? Or am I supposed to hope that this is just temporary?
It just seems at a time like this that a company like Google has way too much power. If they are going to dramatically change results they perhaps should be regulated in some fashion. In other words, maybe they should only be allowed to change the results for a percentage of the search queries being conducted gradually over time, so as to not cripple various businesses instantly overnight by taking away traffic. You know, phase in a new algo over time. Because these search result changes are so drastic and so is the effect of it all.
Sorry for the venting. Good luck everybody.
don't panic, don't change anything - the results look silly - the sensible ones will be back.
Why would they throw away everything they know about determining the topic of a page? If they're going after "over-optimized" pages, that would be swell, but the way to attack that problem is to put a bell curve on keyword density or something. I believe they already stop counting "hits" after a certain number anyway.
I've seen some of our content drop in SERPs, some of it rise, and a big drop in referrals for a lot of search terms we didn't really deserve to show up for. Overall, Google referrals are down about 15%, but we still show up where we ought to.
It's a bit early to tell, but it seems we're getting broader overall coverage, with more search terms appearing in our logs. Is anyone else seeing "more search terms?"