Forum Moderators: open
Kackle - can you explain the "dictionary" for me? And how I might benefit from it - I'm reading your posts hard but don't see where you're coming from.
Sure. But you have to act quickly. Google will fix this one just like they fixed the hyphen.
1. Google is demoting pages/sites that are over-optimized for certain keywords or keyword combinations. It does this by looking up search terms in a dictionary of target keywords or keyword pairs that it has compiled. This dictionary is Top Secret, because if you knew what was in it, you could avoid those words in your optimization efforts.
2. If the search term or terms hit on a dictionary entry, the search results for that user's search are flagged. This means that before the results are delivered, the order of the links, or even the inclusion of links, is adjusted so as to penalize pages that have over-optimized for those terms. Most likely the title, headlines, links and anchor text are examined. It's possible that external anchor text pointing to the page has also been pre-collected and is available for scanning, but this is much less likely. (Besides, external links are not something within your immediate control, so don't worry about them right now.)
3. You want to find out which keywords relevant to your site are in Google's dictionary. Compile as many relevant keywords as you can think of that searchers might use to find your site. Now take these words singly and in pairs, according to how users might search. Run two searches for each combination - one plain, and one with a nonsense exclusion term appended (which appears to bypass the filter) - and compare the results.
4. If the results are strikingly different for the pre-filter and the post-filter search on a particular term or combination of terms, it means that some variation of those terms has been flagged because something was found in Google's dictionary.
5. Do lots of searches and you can come up with a list of "sensitive" words that you'll want to avoid when you re-optimize your pages.
It's a nice weekend project.
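If you'd rather script step 4 than eyeball it, here is a rough Python sketch of the comparison. It assumes you have already collected the top URLs yourself for a plain search and for the same search with a nonsense exclusion word added (something like -owfho, which seems to bypass the filter); the URLs below are made up purely for illustration.

def filter_overlap(plain_results, bypassed_results):
    # fraction of the plain (post-filter) results that also appear in the
    # bypassed (pre-filter) results; a low value suggests the term hit the
    # dictionary and the results were reshuffled
    common = set(plain_results) & set(bypassed_results)
    return len(common) / max(len(plain_results), 1)

# made-up URLs for illustration; paste in the real top 10-20 you collected
plain = ["siteA.com", "siteB.com", "siteC.com", "siteD.com"]
bypassed = ["siteX.com", "siteA.com", "siteY.com", "siteZ.com"]

score = filter_overlap(plain, bypassed)
print("overlap: {:.0%} -> {}".format(score, "likely filtered" if score < 0.5 else "looks normal"))

Anything well under 50% overlap for a term you care about is probably worth a closer look.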
I highly doubt that the filter is being applied to the SERPS directly ("on the fly"), because I noticed that different sites are being nuked on different days even though they all ranked well for the same keywords.
I think the keyword penalty is assigned to the specific page and then it sticks like glue, and perhaps it is a permanent penalty which cannot be released until the next major spam filter is run in a few months...
I believe this because one site of mine was nuked for a certain keyword phrase last week, so the next day I lowered the keyword density a bit (and google re-spidered the page quickly). The rankings for this page just went up nicely for two similar keyword phrases that differ by only one word from the phrase I was nuked for. However, this particular page won't rank at all for the keyword phrase it was flagged for last week -- it won't even rank in the top 500 for that phrase anymore, even after being reindexed and receiving a ranking boost for other very similar keywords.
So I think that this page has a keyword penalty which sticks to it like Elmer's glue (for those who don't know, Elmer's glue is the strongest glue in the universe;)).
In fact, I think this keyword penalty is not only assigned to each page, but I think that it also cannot be released until a major spam filter is run again (next year or whenever).
Or, maybe the penalty is a permanent penalty since Google seems to be in a "shoot first, ask questions later" frame of mind.
By the way, the keyword density for my page was 1.4% last week, and this page had very little anchor text for the specific keyword phrase it was flagged for (keep in mind that 1.4% was the density BEFORE I lowered it to less than 1%).
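For anyone counting differently: by keyword density I just mean occurrences of the keyword divided by the total words on the page. A quick Python sketch of that simple count (nothing official, and people measure it in slightly different ways):

import re

def keyword_density(page_text, keyword):
    # occurrences of the keyword divided by total words -- the simple count
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    return words.count(keyword.lower()) / max(len(words), 1)

# toy example: a 70-word page that uses the keyword once comes out around 1.4%
text = "widgets " + "filler " * 69
print("{:.1%}".format(keyword_density(text, "widgets")))  # prints 1.4%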
If web pages are flagged as spam for using a KW density of 1.4% then god help the future of the Internet;), because pretty soon the google-police will flag any site that uses a keyword more than ONCE on a whole page:)
Who knows, maybe in 5 years it will be illegal to use keywords even once on a page (in 5 years only STOP WORDS will be allowed in order to reduce keyword spam) ;)
In fact, in 10 years from now a person might wake up one morning to the sound of the google-police crashing down their front door with a battering ram, all because they used a keyword at more than .0000001% density on their web page:) Of course, the jail time would be reduced if the person agrees to go to a rehab center for people who are addicted to using keywords at more than .000000001% density on their web pages:)
After all, the reduction of spam seems to be the most important part of search results now, with the concept of "relevant results" running a 'close second' to heading off those pesky spammers who won't stop using keywords at more than 1.4% on their web pages ;)
Markis00, I would also like to know the answer to that question you asked.
I can deal with the losses and I can deal with being labeled as a spammer (for my 1.4% density;)), but before I make any serious changes I want to verify that the update is done.
Anybody? I don't know how to interpret the Datacenters so I would like to know if the Florida massacre is done.
In answer to Brenda_J, I think the filter is run on the fly as part of the query. However, the data used to satisfy the over-SEO part of the query is probably updated as part of the ongoing monthly update cycle. So any changes you make now probably won't be incorporated for at least a month, and the effect is that the penalty is a semi-permanent one. Patience will be required - remove a few obvious tricks and wait to see what the effect is in a month or so.
I'm sure Google will be expecting smart webmasters to remove the obvious tricks that are now being picked up by the filter.
There are two scenarios here. One scenario is that what we are seeing is a new Google filter/algo addition that is coming around and completely nuking/penalizing things that are blatantly optimized for keywords (my homepage disappeared from the SERPS after using keyword 1 in the title tag, a header tag, and at a keyword density of around 20%; the site was new, with around 20-30 backlinks, and had been ranking 200-300 but rising quickly for its optimized keywords).
I have also looked at some of the sites that a firm I used to work for has. All of their sites have dropped from top rankings for optimized keywords using the normal optimization procedures.
So, either what we're seeing here is a new filter that changes the way we will optimize for a long time, or the update isn't done and everyone's pages that were dropped completely from the SERPS are somehow safe and being slowly re-evaluated while the update continues.
I really hope for the latter, but the update was supposed to be finished Wednesday, and a theme is emerging here... that theme is that people who have optimized for keywords are getting dropped left, right, and center, while people who haven't optimized but just happen to have the keyword in their text are getting top rankings.
If this is the new method of optimization, we're all in big doodoo.
By the way, go do a search for sex right now on google. You won't find a single dirty site in the top 10 - only safe sites that talk about it. AND, the keyword is used in a sentence. Perhaps the new method of optimization is using keywords in sentences, not using stop words or just the keyword in your title tag...it's worth investigating.
Anyway, I really hope the update is still continuing, or that some kind of new filter testing is going on, because if this is the finished update and this is the new filter that will be in place for a long time, our entire view of search engine optimization for the google search engine is going to change.
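To make the on-page side of that concrete: nobody outside the plex knows what the filter actually checks, but the signals people in this thread keep mentioning (keyword in the title, in a header, and a high body density) are easy to audit on your own pages. A purely illustrative Python sketch, not a claim about Google's real checks:

import re
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    # collects the title, h1-h3 text, and all words from a page's HTML
    def __init__(self):
        super().__init__()
        self.title, self.headers, self.words = "", [], []
        self._open = []  # stack of currently open tags (good enough for a sketch)

    def handle_starttag(self, tag, attrs):
        self._open.append(tag)

    def handle_endtag(self, tag):
        if self._open and self._open[-1] == tag:
            self._open.pop()

    def handle_data(self, data):
        tag = self._open[-1] if self._open else ""
        if tag == "title":
            self.title += data
        elif tag in ("h1", "h2", "h3"):
            self.headers.append(data)
        self.words += re.findall(r"[a-z0-9']+", data.lower())

def audit(page_html, keyword):
    parser = PageAudit()
    parser.feed(page_html)
    kw = keyword.lower()
    return {
        "keyword_in_title": kw in parser.title.lower(),
        "keyword_in_header": any(kw in h.lower() for h in parser.headers),
        "density": parser.words.count(kw) / max(len(parser.words), 1),
    }

page = ("<html><head><title>Cheap widgets</title></head>"
        "<body><h1>Widgets</h1><p>widgets, widgets and more widgets here</p></body></html>")
print(audit(page, "widgets"))

Whether any of these signals matter, or at what threshold, is exactly what nobody knows right now.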
>1044: markis00> Is the update done?
Compared to 2PM (GMT) yesterday, things still seem to be in flux. I shall take another shot in three hours.
I've got some terms that have been first out of 300,000+ throughout; terms that were well placed and have dropped into the abyss; and one term has come out of the abyss and is now third this morning (it was at 14 yesterday PM).
These are for a client where I have been doing extensive changes over the last weeks and months, so that kills the theory that was around at one stage that only old static/stable pages were being included.
Bearbrian
And jesus christ, all this crap just in time for Christmas too.
You should never have stopped!
aspdesigner - very good :)
Until today I had dismissed this "-owfho theory", but now I am not sure at all.
All my sites except one are German sites: German language and, of course, German keywords.
Nothing happened with them; they stayed at the positions they had before Florida. Nothing changed.
But this morning I checked my sole English-language page. It used to be listed at #50 for a "money keyword", and indeed it is gone. When I add -egfkihef to the keyword, it is back at its usual position.
Thank god most of my keywords are German ones. I hope nobody at the plex speaks German! ;-)
But to get back to the topic:
The search results for this English keyword I talked about look VERY VERY bad. Hardly any index pages on the 1st result page, and many results looking like www.mydirectory.com/search.php?string=kw1%20kw2
If results stay that bad people will stop using google, and if google really intends to make webmasters use more adwords it will gain them nothing, because users will not use a search engine where only adwords bring up good results.
greg
I would say the update is done and has been since at least tuesday.
What do you mean? Are you saying the results haven't been changing since Tuesday? They've been going haywire!
The directory results have been the same as the main search results for more than 72 hours.
Let me explain.
When I search for a popular term and obtain mediocre results and then click "Directory" on the page's upper menu, the directory results include sites that aren't included in the Google directory.
If I click "Directory", I would expect to see only sites that are included in one of the categories at [directory.google.com...] . However, Google shows sites retrieved by its search engine mixed with directory sites.
powdork,
- me too thinks it's been cooked for a long time... Some salt and pepper still needs to be added, though.
Now, this -asdfd and double -asfds thing.. Please consider this:
Write just one keyword:
You obviously don't know what you are looking for, or you are looking for a very specific thing. It could be widgets, but it could also be gadgets or gizmos. G put the best matches on top before; now it mixes in a bit of broader results so that you get something to choose from.
Write two keywords:
Okay, so you're not content with widgets, they also have to be blue. Or is it that you're not content with blue, it must also be widgets? Or is it really blue gadgets you want, but you don't know the name? Before, this would get closer matches; now the broad match kicks in.
Write three keywords:
Now, for each of these three you could in fact be wanting something like it, but not the exact phrase. Cheap blue widgets could really be affordable turquoise gizmos.
Write any number of keywords in quotes:
You want pages matching the exact phrase and that is what you get. Pages optimised for that exact phrase will of course show.
Write three keywords and "-something":
Ah, that was an advanced command. Now there's something you don't want. That means it must be clear to you what you do want, e.g. cheap blue gizmos that are not from Arizona. Do an "Exact phrase" search and filter out the excluded word.
Write three keywords and two times "-something":
Double advanced command. So the cheap blue gizmos should not be from Arizona, and they should not be the cool kind either. Obviously you are missing something in the search results. Do an "Exact phrase" search, but do it on an expanded set of data that includes some broad-match terms for the keywords you did not put a minus sign in front of.
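Put in code terms, the pattern logic above might look something like the Python sketch below - only an illustration of the reasoning, not a claim about what Google actually runs:

import re

def classify_query(query):
    # split a query into quoted phrases, excluded (-word) terms and plain
    # terms, then guess which matching behaviour applies
    phrases = re.findall(r'"([^"]+)"', query)
    rest = re.sub(r'"[^"]+"', " ", query)
    excluded = [t[1:] for t in rest.split() if t.startswith("-")]
    terms = [t for t in rest.split() if not t.startswith("-")]

    if phrases:
        return "exact phrase match"
    if len(excluded) >= 2:
        return "exact-style match over an expanded, broad-matched set"
    if len(excluded) == 1:
        return "exact-style match, minus the excluded word"
    if len(terms) >= 3:
        return "close match with some broad/synonym results mixed in"
    return "broad match mixed in with the best matches"

for q in ['widgets', 'blue widgets', 'cheap blue widgets',
          '"cheap blue widgets"', 'cheap blue widgets -arizona',
          'cheap blue widgets -arizona -cool']:
    print(q, "->", classify_query(q))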
There is no such thing as a commercial filter. There is an understanding of search patterns.
/claus