Forum Moderators: open
New keywords - missing targets
I do not recommend chasing this algo, as it is still bad: too many irrelevant sites showing. Get with it, G!
The algorithm now skips phrases and looks for keywords like:
"keyword1 JUNKWORD keyword2" = "keyword1 keyword2".
rather than "keyword1 keyword2" = "keyword1 keyword2"
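The difference described above can be sketched in a few lines. This is my own toy illustration, not Google's actual code: exact-phrase matching versus in-order keyword matching that ignores intervening junk words.

```python
def exact_phrase_match(query_words, page_words):
    """True only if the query words appear consecutively, in order."""
    n = len(query_words)
    return any(page_words[i:i + n] == query_words
               for i in range(len(page_words) - n + 1))

def keyword_match(query_words, page_words):
    """True if the query words all appear, in order, with anything between."""
    it = iter(page_words)
    # Membership tests consume the iterator, so order is enforced.
    return all(word in it for word in query_words)

page = "keyword1 JUNKWORD keyword2".split()
query = "keyword1 keyword2".split()

print(exact_phrase_match(query, page))   # old behaviour: False
print(keyword_match(query, page))        # new behaviour: True
```

Under the old behaviour only the literal phrase matched; under the new one the junk word in the middle no longer blocks the match.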
The exception seems to be sites paying for PR (more companies are being successful with that approach) or sites with over 100 different backlinks from 100 different sites. (Good optimization techniques still work!)
For all I know Google may even have a randomness component in their SERPS that makes us all look like fools for trying to ascertain a pattern...
I never thought about that, but it could make sense, in a game theory sort of way. I mean, for many searches there is no significant difference in quality of the site at number 5 and number 15. So Google would not do a disservice to searchers with some randomization. And if Google just wants webmasters to concentrate on quality content instead of figuring out the system, maybe they should randomize.
Now we are showing up for the singular version of our keyword phrase, which we never came up in this search prior to Florida.
A search for this keyword phrase shows us bouncing around the datacenters. On most datacenters we are in the 40s but today we are #384 in -fi.
After we reappeared, I played with changing the title of the page. One version was only our company name. In the second version I added the keyword phrase. Oddly enough, neither affected our placement one way or the other.
We showed up for another keyword phrase for two days but are now totally gone again.
Frankly, I do not believe that anything that I am doing is having any influence whatsoever. I think it is all Google making adjustments.
I am also starting to believe that Google is working feverishly to make a new algo work. Perhaps it is due to replacing pagerank. I have seen absolutely NO stability from one day to the next since Nov 15th in any aspect of Google from the bot visits to the results.
The Google we all knew and loved is dead and gone and what we are trying to figure out now is a "work in progress" of the next generation of Google.
IMHO
I had some good luck removing a few keywords on one page, but I also had good luck just waiting on another. They were both penalized and now they are both back.
There are four potential reasons:
- de-optimization
- links have now 'aged' and are clearly not spam
- adding links pushed me over a certain threshold for the filter to let me back in
- Google tweaked the filter/algo
(note: deoptimization, adding links, and doing nothing - done in varying degrees to different pages)
The fourth seems the most likely reason for any return of sites. Google is doing a darn good job of keeping us guessing.
[edited by: jcoronella at 4:32 pm (utc) on Jan. 6, 2004]
In the thick of Florida there were a bunch of folks rushing out to de-optimize their websites.
I've been re-optimizing in the thick and thin of Florida... while folks are talking about "waiting until google settles", seven weeks have passed... that's 13% of a year's sales -- just how much of a year are others willing to give up while waiting for google to "settle"?
Anyway. I've had a good bit of luck reoptimizing one site, next to none re-optimizing another.
Trust me, I feel your pain. But what if Google NEVER settles? Where my wife used to work, they saw "continual change" as an end in itself. I know that my website is continually changing... why should Google not be continually tweaking things?
For instance their algorithm might be adaptive. It looks at the data it has then sets the parameters based on the data. So at every update the parameters might change. How could you optimize for that scenario? You cannot (easily).
So in some ways it comes down to changing business practices:
(1) making sure your site is as good as it can be. Basically attract customers, and make them repeat customers etc (the usual stuff)
(2) making sure your site performs across the range of search engines. Better to be in all of them in a lesser position than to be in one at a premium position. Spreads your risk out.
(3) Basically (1) again but more than just the web angle. Do good business and you'll get good business.
>has only led to rampant speculation... For all I know Google may even have a...
The speculation is pretty rampant there. :)
For those who haven't noticed yet, the algo has settled. Recap: since Florida we've had a couple of knob tweaks, a PR update, a backlink update, and a pseudo-update named Ginger.
Now, let's please get back on topic. The topic of this thread is "You De-optimized your website. Are you happy now?"
There were many people saying they were going to de-optimize in response to someone's theory. It's kind of unsettling that those who advocated de-optimization have thus far been silent. Let's hear from a couple of you, please.
[edited by: martinibuster at 5:01 pm (utc) on Jan. 6, 2004]
This was a number of months ago, I believe two updates back: someone showed me an example involving a peculiar test site that he had set up to test Google. From what I saw, I concluded that there *was* a randomness component in Google. One problem with this conclusion is that it is logically possible that the randomness I thought I saw was due to some Google bug rather than intentional design. If it was just a bug, it might have been a temporary state of affairs, as Google would presumably eventually find and fix it.
I had independently thought up this idea of a randomness component to confound SEOs a long time back. The trick in adding such a component to an algo is that it would have to be significant enough to cause SEOs problems, while being subtle enough that the typical searcher wouldn't see a material degradation of the SERPs. For example, the randomness component could cause a site to rarely appear more than 2 places higher or lower on a SERP than it would without it. If the site that would have been #3 shoots up one month to #1, or conversely falls to #5, it is still easy enough for the searcher to find. However, if the site that should be #3 got kicked down to #23, one of the most relevant sites would be buried so far down that few would ever find it.
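A bounded-randomness component like the one speculated about above could be sketched like this. To be clear, this is purely hypothetical, my own illustration rather than anything known about Google: each result's rank gets a small random offset before a re-sort, so a site rarely moves more than a couple of places.

```python
import random

def jitter_rankings(ranked_urls, max_shift=2, seed=None):
    """Re-sort results after adding bounded noise to each rank.

    With max_shift=2 a result usually lands within 2 places of its
    deterministic position (and can never move more than 3), so the
    searcher still finds the most relevant sites near the top while an
    SEO sees rankings that refuse to hold still from day to day.
    """
    rng = random.Random(seed)
    noisy = [(i + rng.uniform(-max_shift, max_shift), url)
             for i, url in enumerate(ranked_urls)]
    return [url for _, url in sorted(noisy)]
```

With `max_shift=0` the function degenerates to the deterministic ranking, which is one way to think about the pre-Florida behaviour people are comparing against.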
Moving back to your topic, if they are seeing the same old, same old, it may mean that it will take another major update for them to see substantial improvements. However, if they deoptimized and fell much further down the SERPs than they were, this is significant.
Judging by some remarks people almost felt entitled to the top positions by virtue of their "optimizations" which amounted to a laundry list of "tricks" to apply upon a website. In another thread I remember calling those websites "silicone sites" because of their unnatural nature.
In a salon.com article [bmedia.org] last June in which I was interviewed, the writer described my approach to Google as "zen-like" because I stated that the harder you try to "trick" Google, the more Google can get away from you.
In the gold-rush to "optimize" I think that some folks forgot about good-old clarity. It would be interesting to hear from more people who have deoptimized.
What potential? Google isn't rolling back.
>Pre-Florida I saw a disturbing tendency to seize on html elements as Google hacking tools. There were many threads about H1 tags and multiple repetitions in title tags, etc.
>Judging by some remarks people almost felt entitled to the top positions by virtue of their "optimizations" which amounted to a laundry list of "tricks" to apply upon a website. In another thread I remember calling those websites "silicone sites" because of their unnatural nature.
Excellent points, martini.
Back OT, all I did to my html site (with all of the title, h1, h2 tags, etc with appropriate keywords) was to dump the extensive crosslinking with a few other industry related PR5 and PR6 sites. We had placed links from every page to the home pages of the other sites and vice-versa. I did this during the first week of Florida after I fell 100+ positions.
I bounced back by the 1st of December and have remained top 5 since, even though during this last PR update, I dropped from PR6 to PR5 and Google went from showing 150+ links to 35.
As far as de-optimizing goes, you should really have a good understanding of what you intend to achieve. I've seen sites by other members of ww and have to say that you can hardly tell there's anything going on, but there is.
Kirby, that's a good strategy that's worked for others as well.
[edited by: martinibuster at 7:06 pm (utc) on Jan. 6, 2004]
Many of the pre-Florida search terms which had pulled up that URL on the first or second page of Google SERPS have consistently brought up news articles (and the odd random page) for the last six weeks.
So I did the obvious thing. I divided the index (rather long at 106kB) into three manageable sections and crosslinked them, so that - to the user, at least - there's not much difference between navigating the index by jumping up and down the page or by jumping from page to page.
This left me with three pages at just over 20kB each. (Bear in mind these pages are almost all text...)
Then I researched and added my own feature articles to the top of each page - to compete with the news articles in the SERPS - and reconfigured the CSS, so that the page heading and content comes right at the top of the code.
Finally to make this section of my site more appealing (and to ease the compilation of the new feature articles, which I will rewrite probably once a month) I added a fourth page, summarising the news in that field on a daily basis and a fifth page summarising consumer research on a daily basis. (Maintaining the summaries merely involves jotting down what I have to research on a daily basis anyway for the index).
In order to minimise the on-page keyphrase frequency, I wrote out a list of synonyms beforehand and made sure that the three feature articles contained as little repetition as possible - not always easy on such a niche topic!
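For anyone wanting to check the effect of a synonym pass like that, here is a small helper (mine, not the poster's) that counts how often a keyphrase repeats on a page and what share of the words it accounts for:

```python
import re

def keyphrase_density(text, phrase):
    """Count exact keyphrase occurrences and their share of page words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(words[i:i + n] == phrase_words
               for i in range(len(words) - n + 1))
    density = hits * n / len(words) if words else 0.0
    return hits, density

hits, density = keyphrase_density(
    "Red widgets are great. Buy red widgets here. Our red widgets last.",
    "red widgets")
print(hits, density)  # 3 occurrences, 0.5 of the words
```

Re-running it before and after swapping in synonyms gives a concrete measure of how much repetition was actually removed.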
Result: Nothing until today. Suddenly I find the new features (all three pages) have been spidered and are springing up on the first page for all sorts of terms (although, who knows, they might be gone again tomorrow?)
The most obvious conclusion to draw is that the original index rose so high in the SERPS on the basis of page title, page heading and keyphrase repetition. The new pages are doing well but not on the basis of these factors... more likely because they contain a lot of information on a consistent theme without an overwhelming amount of repetition.
redwidgets.com = widgets.com/red/index.html
bluewidgets.com = widgets.com/blue/index.html
yellowwidgets.com = widgets.com/yellow/index.html
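Taken literally, the mapping above amounts to canonicalising three satellite domains onto internal paths. A hedged sketch (the domains and paths come from the post; the helper function itself is hypothetical) of rewriting navbar links to the internal URLs:

```python
# Satellite domain -> internal path, as listed in the post.
CANONICAL = {
    "redwidgets.com":    "widgets.com/red/index.html",
    "bluewidgets.com":   "widgets.com/blue/index.html",
    "yellowwidgets.com": "widgets.com/yellow/index.html",
}

def canonical_url(url):
    """Rewrite a satellite-domain link to its internal widgets.com URL."""
    for domain, path in CANONICAL.items():
        if url.rstrip("/").endswith(domain):
            return "http://" + path
    return url  # already canonical, leave it alone
```

Pointing every internal link at one canonical URL per page is the change the poster credits with getting the secondary index pages back into Google's index.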
Although my main widgets.com index page remained #1 in Google for its major keyphrase after Florida, the red, blue, and yellow listings disappeared.
I then changed my navbar links so they pointed to the internal URLs instead of the separate domains, and the three secondary index.html pages were back in Google's index for their major keyphrases last week. (In fact, one of those secondary index.html pages is #1, as it was before Florida.)
Otherwise, I didn't make any changes. (There was no need to, since my other 3,500+ pages did just fine in the Florida update.)
In any case, I don't think link text has been penalised by Google at all, I think it just doesn't have as much influence as it used to - I would only change your link text (or change it back) if the changes make the links and/or the page design more intuitive for readers.
<added>what an appropriate typo</added>
When the filter was loosened a few weeks later, some of the rankings came back naturally, but about 50% of the sites did not.
A week before Christmas I finally started to de-optimize and shortly there after, rankings returned. Most of the rankings are not back to #1, but they are back in the top 10.
I didn't do anything that hasn't already been mentioned. Changed my title tags, changed header tags, removed some of the internal anchor text, changed some hyperlinking, and added some stemming words.
NOTE: I also removed all outbound links that were a part of a link exchange program, but I doubt that had much to do with my rankings returning.
Everyday a few more rankings come back and a few new clients get ranked in Google for the first time. This new algo isn’t that hard to crack once you start seeing the results.
I haven't seen too much of a change except for newer content moving up in the ranks and increasing PR. The older content is holding where it is.
If I could get over the PR6 hump I'd be happy but it might take an update or two.
In the meantime I'm just plugging away adding new content and optimizing what I have now. It worked in the past, it didn't hurt me during and after Florida so there is no reason for me to change my strategy now.
I just keep adding content day after day. After Florida I noticed a BIG increase in postings here! That was/is great for WW (another 1000+ pages added for free) but bad for those wasting their time doing so. IMO they should have used all that writing to add more content to their own site.
I did several things to make the site more user oriented rather than Google oriented. Something I should have done from the start.
The result? From 1000+ to #5, then to #2, for the money keyword that was turned upside down.
P.S. My site is an affiliate site.