Forum Moderators: open
De-optimizing is a fast way to get ranked better. Try it; you will be pleased.
I have little doubt that what you did helped the site. You apparently saw what you thought were some excessive "optimization" tactics. You wisely scaled them back, and now you're reaping the rewards. Well done.
It's possible to over-eat, over-indulge in alcohol, become over-sexed, and over-optimize Web sites (i.e., optimize too aggressively). In almost all things (well, I'm not sure about sex), cutting back on excessive behavior generally produces positive results.
OTOH, I'm skeptical that steveb's neighbor's cat had much to do with the increase in his sitemap's performance. Still, you never know.
;-)
I don't agree with this deoptimisation concept. I don't agree with half the penalties I hear about, or Hilltop, or the numerous other theories bounced about. Far too much is based upon limited experience and reading too much into the situation.
I have a PR4 site ranked number 1 worldwide out of 9.5 million results, and it uses alt tags and keywords in directories/filenames. Does it help? I don't know. Does it hurt? Seems not - I went up in the last update.
IMHO
Suggy
No one is arguing against keywords in filenames here. Optimizing keywords, filenames, Hx headings, page titles, etc., is not only wise but essential these days.
It's *how* you do it that makes or breaks the site.
Come on now. Google doesn't get suspicious. It doesn't get anything. It's just a series of dumb machines running an algo. Never forget that. Assume it has human intelligence and you'll be in all sorts of trouble.
The algorithms may not have human intelligence, but they're designed by humans with intelligence. It certainly can't be that hard for Google's very intelligent humans to design algorithms that take multiple factors into account when determining how much weight to give anchor text, titles, keyphrases within the body text, etc.
Many Webmasters worry about getting into trouble with Google because of this or that or the other thing. In reality, it's likely that this and that and the other thing are taken into account when applying spam filters or compensating for "aggressive" SEO techniques.
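To put some flesh on that, here is a minimal sketch of a multi-factor scorer. Everything in it is invented - the factor names, the weights, the cap - it is only meant to show how weighing several signals together lets a scorer discount a signal that has been pushed too far instead of rewarding it:

```python
# Toy illustration only -- factor names and weights are hypothetical,
# not Google's actual signals. It shows how a scorer that weighs several
# on-page signals together can cap any one of them, so pushing a single
# factor to an extreme is discounted rather than rewarded.

FACTOR_WEIGHTS = {"anchor_text": 0.4, "title": 0.3, "body_keyphrases": 0.3}

def page_score(signals):
    """Combine per-factor scores (each meant to be 0..1) into one total."""
    total = 0.0
    for factor, weight in FACTOR_WEIGHTS.items():
        raw = signals.get(factor, 0.0)
        total += weight * min(raw, 1.0)  # cap: no credit beyond "natural" levels
    return total

# An over-stuffed page gains nothing over a balanced one:
print(page_score({"anchor_text": 5.0, "title": 0.2, "body_keyphrases": 0.2}))  # ~0.52
print(page_score({"anchor_text": 1.0, "title": 0.8, "body_keyphrases": 0.8}))  # ~0.88
```

The exact shape of the discount doesn't matter; the point is that once multiple factors are in play, maxing out any single one stops paying.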
Also, this last backlink update was only an exclusion update: it removed the earlier links and added the lesser-value ones. It appears to be a way to get at PR-selling sites - just show the lower-PR links, and then no one knows what counts.
However (again, in my view), Google did relax their criteria a couple of months after the over-optimization penalty (OOP) began, and many sites returned without tweaking.
The fact that many webmasters still complain about an inability to improve their positions hints that the OOP still exists, and hence there should be some benefit to watching keyword proximity on your site.
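If you want to actually measure proximity rather than eyeball it, something like the rough script below will do. To be clear, this is my own toy illustration - the idea of a "too close" threshold is pure guesswork, since nobody outside Google knows whether or how proximity is scored:

```python
import re

def keyword_proximity(text, kw1, kw2):
    """Smallest distance, in words, between two keywords in the text.

    Returns None if either keyword is absent. A rough way to 'watch
    keyword proximity': if the target terms sit unnaturally close
    together everywhere on the page, it may read as stuffed.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    pos1 = [i for i, w in enumerate(words) if w == kw1.lower()]
    pos2 = [i for i, w in enumerate(words) if w == kw2.lower()]
    if not pos1 or not pos2:
        return None
    return min(abs(i - j) for i in pos1 for j in pos2)

sample = "Cheap widgets here. Buy cheap widgets. Cheap widgets for everyone."
print(keyword_proximity(sample, "cheap", "widgets"))  # 1 -- always adjacent
```

Run it over your pages and you at least get a number to watch instead of a hunch.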
The second concept is much more important, and it is what renders the first post here even more non-useful. Google has announced that they are introducing some number of new algorithms all the time. (I don't remember the number or the period of time, but it was a bunch of different algorithms.) The fundamental fact of the short-term future, so far as Google is concerned, is that on as little as a day-to-day basis the results will differ, because the algorithms are valuing things differently.
What Marcia said above - "isn't working too well any more" - doesn't go far enough. Literally, some things that work well today on fifteen datacenters might not work nearly as well tomorrow on twenty other datacenters.
Optimizing today means considering a "target" of maybe a half dozen different algorithms operating at the same time. They are obviously similar, but they have key differences. At least in my niche, the update a few days ago seemed to favor raw number of links over mid- or high-quality ones... meaning 1,000 pages of PR5, PR4 and PR3 links began to exert less influence compared to five or ten thousand PR0 or PR1 links from message boards. Most datacenters seemed to reflect that, but a few seemed to be marching to a different drummer (66.102.9.99 for one).
Suppose the reverse phenomenon hit, and the quality links again exerted more influence than the volume of links. A person who just removed 5000 message board links but added one from a decent competitor might see a ranking increase, and wrongly conclude the removing of the links was what caused it, when in fact removing the links was ignored, and the adding of the single link rewarded.
There could be hundreds of combinations like this, but the main thing is that even if you do nothing, you are going to be judged by different algorithms that ALL have a different definition of what "optimized" means for THAT algorithm.
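If it helps, that situation can be boiled down to a toy sketch. The weights below are invented, but they show how the same two sites can swap places between a quality-favoring algorithm and a volume-favoring one without either site changing a thing:

```python
# Two hypothetical "algorithms" scoring the same two sites with different
# link weightings. All numbers are invented purely to illustrate the point.

pages = {
    # (high-PR links, low-PR links)
    "quality_site": (1000, 0),      # 1,000 PR3-PR5 links
    "volume_site": (0, 10000),      # 10,000 PR0-PR1 message-board links
}

def rank(hi_weight, lo_weight):
    scores = {name: hi * hi_weight + lo * lo_weight
              for name, (hi, lo) in pages.items()}
    return sorted(scores, key=scores.get, reverse=True)

print(rank(1.0, 0.01))  # quality-favoring algo: ['quality_site', 'volume_site']
print(rank(1.0, 0.50))  # volume-favoring algo:  ['volume_site', 'quality_site']
```

Do nothing, get judged by the other weight vector tomorrow, and your "ranking change" appears out of thin air.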
It was useful to me and many others.
These sites didn't just move up or down a few pages, they reappeared from oblivion. I doubt that is due to the frequent algorithm tweaks at Google. The fact is, nobody knows for sure. If you did, then you could say it wasn't useful.
funandgames, thanks for starting this thread. It's been worth reading.
Uh, no. I have plenty of these. Either you didn't read what I wrote, or you didn't understand what you wrote, because I explained it exactly. Pages going completely haywire and AWOL temporarily can occur for several reasons, but one common one these days is that they are temporarily judged as without merit by the current algorithm, or they are given some major algorithmic demerit. The next batch of algorithms comes along, and they get resurrected. Obviously that has nothing to do with some overoptimization idea, as doing absolutely nothing can bring them back.
Sure, there have been many reports of sites coming back with no changes made. Not so with the case being discussed in this thread. They didn't come back UNTIL something was done. Coincidence? I doubt it.
I guess you didn't read what I wrote, or you didn't understand what you wrote.
If you read what I wrote, you'll see I'm pointing out a different consideration. What you call a "coincidence" could in fact be primarily the result of a significantly different weighting of ranking factors. The point, again, is that even if a webmaster does nothing, Google is making changes on a daily basis.
Gone are the days of a one-size-fits-all idea of optimization, or of a stable idea of what Google "likes".
If you are getting hit over the head by a two by four, you could move out of the way, or in another scenario you could do nothing and the person could stop. The fact you are not being hit anymore is not a "coincidence".
At least in my niche, the update a few days ago seemed to favor raw number of links over mid- or high-quality ones
Good point steveb, I tend to agree.
After carefully monitoring about 10 different sites since the Florida update, some of which were "deoptimized" some of which were not, I've come to the conclusion that time heals all wounds.
I think it's a matter of Google slowly reintroducing sites that got hit during the Florida hurricane because the current algorithms deem those sites to be once again relevant.
I find no evidence to support the theory that deoptimization works.
While there will surely be cases of webmasters deoptimizing their sites and subsequently finding their sites reappear, there are many more cases of webmasters deoptimizing their sites where the sites did NOT reappear.
There will always be coincidences.
Don't forget, no one knows what goes on in the box. We can only surmise. Scientists did this for thousands of years, and boy did they get some stuff wrong!
Sug
It's of course probable that if a site was way over-optimized, some tuning back could serve it well, but these days one of our sites ranks anywhere from 1 to 25 for 3 different 11,000,000-result terms depending on which day/hour you search. The DCs and algos that present your results are (IMO) rotating very frequently. G is always testing, just like we are.
We pick up on (perceived) differences and make alterations that we feel will be effective, and G does the same thing.
I think it all has to do with the algo they are running at the current time.
wellzy
Aside from that and on another note, I personally have never been able to come up with any justification for anyone belittling others because they don't agree with them.
I think it all has to do with the algo they are running at the current time.
Agreed. In nearly all cases, changes in ranking come from one of three things: a change in the algo, a change in text/content, or, of course, backlinks.
Phrases in file names and alt text are unlikely to be the cause of poor ranking. OTOH, Google has done some pretty silly things in the past year :)
If a site is dead and buried for a long time, it needs resurrection. I'm not a believer in blind faith. If you *know* what the problems are and you *fix* exactly those problems and it works - and other sites have those exact problems, aren't fixed, and stay in Google purgatory - then it's asinine not to apply what's been seen to work if it's plainly identifiable.
So how come the ones with the same problems that weren't tended to have stayed buried if it's all up to blind faith in the changing winds of the algo?
So how come the ones with the same problems that weren't tended to have stayed buried if it's all up to blind faith in the changing winds of the algo?
The "problems" were alt text and file names? This is the part that I find hard to believe, not that making any changes can have a positive effect.
How come sites/pages that make no changes at all often pop up 300-500 positions after being buried? IMO, it is very, very difficult to isolate a few variables when there is an enormous number of factors that may contribute to the "tipping point".
Personally, the only optimisation I do is through backlinks and proper titles. Of course, I make sure that the proper keywords are on the pages as well.
Although there are probably quite a few factors in the major SE algos, it seems that backlinks account for nearly everything in scoring (assuming the text is on the page and in the title), so the obvious place to look when one's ranking changes is backlinks. Of course, as steveb suggests, they often change the way they interpret/score backlinks as well - more often than most people realize.
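For anyone curious why backlinks can dominate, the published core of Google's link scoring is PageRank: a page's score comes from the scores of the pages linking to it, recursively. Here is a stripped-down power-iteration sketch on a made-up three-page graph (it ignores real-world details such as dangling pages), just to show the mechanics:

```python
# Simplified PageRank power iteration on a tiny invented link graph.
# A page's value is accumulated from the value of pages linking to it.

DAMPING = 0.85

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # every page starts with the small "teleport" share...
        new = {p: (1 - DAMPING) / n for p in pages}
        # ...then receives an equal split of each inbound linker's rank
        for p, outs in links.items():
            if not outs:
                continue
            share = DAMPING * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

graph = {"a": ["c"], "b": ["c"], "c": ["a"]}  # two pages link to c
print(pagerank(graph))  # c ends up with the highest rank
```

Change how those inbound scores are computed or damped and every ranking downstream moves - which fits the observation that rankings shift even when nobody touched their pages.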
Optimize: to make as perfect or effective as possible.
There is no such thing as "over-optimizing" of course, as it makes no sense, and "de-optimising" would simply mean to make less optimal or less effective, but that is a different topic entirely.
IOW, did the original pages "come back" or did NEW pages simply get the FreshBot effect?