
De-optimizing pays off

De-optimized several customer sites and went from nothing to top 10

3:48 pm on Jul 22, 2004 (gmt 0)

10+ Year Member



Several people I met socializing at various piano bars and art events complained that their SEOed sites were not listed on Google at all. I decided to take a look and saw several 'unnecessary' tactics, such as keyword phrases in filenames, alt text, etc. I removed it all, and within TWO weeks every single page that was edited was at least top 20, most were top ten, and five hit number one for a two-word phrase. The sites already had great content and plenty of related incoming links.

De-optimizing is a fast way to get better ranked. Try it, you will be pleased.
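For anyone who wants to audit their own pages the same way, here is a minimal sketch of the idea (the keyword, threshold, and filenames below are hypothetical, and this is only a crude heuristic, not anything Google has confirmed): flag filenames or alt text that repeat a target phrase more than once.

```python
import re

# Crude over-optimization check: flag strings (filenames, alt text)
# that repeat a target phrase more times than a chosen threshold.
# The keyword and threshold are illustrative assumptions.
KEYWORD = "red widgets"
MAX_REPEATS = 1

def keyword_count(text: str, keyword: str) -> int:
    """Count occurrences of the phrase after normalizing separators."""
    normalized = re.sub(r"[-_/.]", " ", text.lower())
    return len(re.findall(re.escape(keyword), normalized))

def flag_stuffed(items: list[str]) -> list[str]:
    """Return items whose phrase count exceeds MAX_REPEATS."""
    return [s for s in items if keyword_count(s, KEYWORD) > MAX_REPEATS]

filenames = [
    "red-widgets.html",                          # one mention: fine
    "red-widgets/red-widgets-red-widgets.html",  # three mentions: flagged
]
print(flag_stuffed(filenames))
# ['red-widgets/red-widgets-red-widgets.html']
```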

5:02 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



funandgames, take pride in your post, and thanks for sharing! Besides, threads that stir conversation are a good thing! Without them, things can get boring around here. :-)

I have little doubt that what you did helped the site. You apparently saw what you thought were some excessive "optimization" tactics. You wisely scaled them back, and now you're reaping the rewards. Well done.

It's possible to overeat, overindulge in alcohol, become oversexed, and over-optimize Web sites (i.e., optimize too aggressively). In almost all things (well, I'm not sure about sex), cutting back on excessive behavior generally produces positive results.

OTOH, I'm skeptical that steveb's neighbor's cat had much to do with the increase in his sitemap's performance. Still, you never know.

;-)

5:19 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



I agree with him that de-optimizing pays off; the more optimized a site is, the more suspicious Google gets.
5:31 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



Come on now. Google doesn't get suspicious. It doesn't get anything. It's just a series of dumb machines running an algo. Never forget that. Assume it has human intelligence and you'll be in all sorts of trouble.

I don't agree with this de-optimisation concept. I don't agree with half the penalties I hear about, or Hilltop, or the numerous other theories bounced about. Far too much is based on limited experience and reading too much into the situation.

I have a PR4 site that is number 1 worldwide out of 9.5 million pages, and it uses alt text and keywords in directories/filenames. Does it help? I don't know. Does it hurt? Seems not - I went up in the last update.

IMHO

Suggy

5:34 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



If you changed the file names, the pages would lose their PageRank, and I don't think they would get it back in two weeks. Sounds like someone trying to make everyone else re-rank.
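(For what it's worth, the standard way to rename files without throwing away inbound links is a 301 permanent redirect from each old URL to its new one, which search engines generally treat as "this page has moved". A minimal sketch using only Python's standard library; the URL mapping is hypothetical:)

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping from old keyword-stuffed filenames to the
# cleaned-up ones that replaced them.
REDIRECTS = {
    "/new-red-widgets-widgets.html": "/widgets.html",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            self.send_response(301)  # permanent redirect: "page has moved"
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

How quickly (or whether) PageRank follows a 301 is exactly the open question here; the redirect at least keeps the old links resolving.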
5:43 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



There is a *major* difference between scaling back a site that went too far and thus tripped filters (which by now are very well understood by those paying attention), and de-optimizing a site that is already performing well. Funandgames obviously just reined in a site that had overstepped certain boundaries, and it's benefiting from it. Not rocket science.

No one here is arguing against keywords in filenames. Optimizing keywords, filenames, Hx headings, page titles, etc., is not only wise but essential these days.

It's *how* you do it that makes or breaks the site.

5:49 pm on Jul 23, 2004 (gmt 0)



Come on now. Google doesn't get suspicious. It doesn't get anything. It's just a series of dumb machines running an algo. Never forget that. Assume it has human intelligence and you'll be in all sorts of trouble.

The algorithms may not have human intelligence, but they're designed by humans with intelligence. It certainly can't be that hard for Google's very intelligent humans to design algorithms that take multiple factors into account when determining how much weight to give anchor text, titles, keyphrases within the body text, etc.

Many Webmasters worry about getting into trouble with Google because of this or that or the other thing. In reality, it's likely that this and that and the other thing are taken into account when applying spam filters or compensating for "aggressive" SEO techniques.
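As a toy illustration of how such compensation could work (the factors, weights, and saturation curve below are invented for illustration, not anything Google has disclosed), each signal can be passed through a diminishing-returns function, so hammering any one factor buys almost nothing past a point:

```python
import math

# Toy multi-factor relevance score. Each signal saturates via log1p,
# so repeating a phrase 10x more does not score 10x higher.
# Weights and factor names are invented assumptions.
WEIGHTS = {"anchor_text": 3.0, "title": 2.0, "body": 1.0}

def saturate(count: int) -> float:
    """Diminishing returns: 1 -> 0.69, 10 -> 2.40, 100 -> 4.62."""
    return math.log1p(count)

def score(signals: dict) -> float:
    return sum(WEIGHTS[k] * saturate(v) for k, v in signals.items())

modest  = {"anchor_text": 5, "title": 1, "body": 8}
stuffed = {"anchor_text": 5, "title": 9, "body": 80}
print(round(score(modest), 2), round(score(stuffed), 2))
# 8.96 14.37 - ten times the repetition, nowhere near ten times the score
```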

5:51 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member caveman is a WebmasterWorld Top Contributor of All Time 10+ Year Member



>>likely that this and that and the other thing...

Exactly.

5:55 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



Seems that nearly all of the penalties being handed out lately are based on optimizing....

Also, this last backlink update was only an exclusion update that removed the earlier links and added the lesser-value ones. It appears to be a way to get at PR-selling sites: just show the lower-PR links, and then no one knows what counts.

6:23 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



Hmmm...discount high-PR backlinks to screw up PR sellers. I did not realize my DMOZ and Google Directory links were paid PR links. Never mind, that is a whole new can of worms.
6:27 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



funandgames,

It is good that most people won't listen, so we can keep our listings high! If all of them listened, we might have too much competition!

Let_Them_Stay_Where_They_Are!

9:31 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Now, Hilltop was a probable thing. The details of it are still fuzzy, but there is something there. If you had any experience with Florida, it was quite obvious.
9:43 pm on Jul 23, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is no doubt in my mind that there was a so-called OOP (over-optimization penalty).

However, (again in my view) Google did relax their criteria a couple months after the OOP began and many sites returned without tweaking.

The fact that many webmasters still complain about an inability to improve their positions hints that the OOP still exists, and hence there should be some benefit to watching keyword proximity on your site.

11:50 pm on Jul 23, 2004 (gmt 0)

10+ Year Member



>>It's the derisive sarcasm that was uncalled for.

A kitten would not like to be in his place - ask the neighbour.

1:49 am on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



The $64 question: is it that over-optimization is being penalized, or that what's traditionally worked isn't working too well any more?
3:34 am on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



The arm-waving overoptimization idea misses two key concepts. The first is discussed fairly often. While red-widgets.com/new/ is generally good optimization, red-widgets.com/new/red-new-widgets/new-red-widgets-widgets/new_red_widgets.html is not "overoptimizing". It's just foolish, bad optimization.

The second concept is much more important, and what renders the first post here even more non-useful. Google has announced they are introducing some number of new algorithms all the time. (I don't remember the number or the period of time, but it was a bunch of different algorithms.) The fundamental fact of the short-term future, so far as Google is concerned, is that on as little as a day-to-day basis the results will differ, because the algorithms are valuing things differently.

What Marcia said above, "isn't working too well any more," doesn't go far enough. Literally, some things that work well today, on fifteen datacenters, might not work nearly as well tomorrow on twenty other datacenters.

Optimizing today means considering a "target" of maybe a half dozen different algorithms operating at the same time. They are obviously similar, but they have key differences. At least in my niche, the update a few days ago seemed to favor the raw number of links over mid- or high-quality ones... meaning 1,000 pages of PR5, PR4 and PR3 links began to exert less influence compared to five or ten thousand PR0 or PR1 links from message boards. Most datacenters seemed to reflect that, but a few seemed to be marching to a different drummer (66.102.9.99 for one).

Suppose the reverse phenomenon hit, and the quality links again exerted more influence than the volume of links. A person who just removed 5000 message board links but added one from a decent competitor might see a ranking increase, and wrongly conclude the removing of the links was what caused it, when in fact removing the links was ignored, and the adding of the single link rewarded.

There could be hundreds of combinations like this, but the main thing is that even if you do nothing, you are going to be judged by different algorithms that ALL have a different definition of what "optimized" means for THAT algorithm.
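To put invented numbers on that (purely hypothetical link profiles and weighting formulas, just to make the scenario concrete), the same two sites can rank in opposite orders under two algorithms that weight links differently:

```python
# Two invented weighting schemes: one rewards link quality (PR),
# one rewards raw link volume. Same sites, opposite winners.
site_a = {"links": 1000,  "avg_pr": 4.0}   # fewer mid/high-PR links
site_b = {"links": 10000, "avg_pr": 0.3}   # many low-PR message-board links

def quality_weighted(s):
    return s["links"] * s["avg_pr"]             # 4000 vs 3000: A wins

def count_weighted(s):
    return s["links"] * (1 + s["avg_pr"] / 10)  # 1400 vs 10300: B wins

for name, algo in [("quality-weighted", quality_weighted),
                   ("count-weighted", count_weighted)]:
    winner = "A" if algo(site_a) > algo(site_b) else "B"
    print(f"{name}: site {winner} ranks higher")
```

A webmaster who changed nothing would still see the rankings flip whenever Google rotated between the two.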

4:09 am on Jul 24, 2004 (gmt 0)

10+ Year Member



"what renders the first post here even more non-useful."

It was useful to me and many others.

These sites didn't just move up or down a few pages; they reappeared from oblivion. I doubt that is due to Google's frequent algorithm tweaks. The fact is, nobody knows for sure. If you did, then you could say it wasn't useful.

funandgames, thanks for starting this thread. It's been worth reading.

4:30 am on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



"What you miss steveb, is that these sites didn't just move up or down a few pages, they reappeared from oblivion."

Uh, no. I have plenty of these. Either you didn't read what I wrote, or you didn't understand what you wrote, because I explained it exactly. Pages temporarily going completely haywire/AWOL can occur for several reasons, but one common one these days is that they are temporarily judged as without merit by the current algorithm, or they are given some major algorithmic demerit. The next batch of algorithms comes along, and they get resurrected. Obviously that has nothing to do with some over-optimization idea, as doing absolutely nothing can bring them back.

5:05 am on Jul 24, 2004 (gmt 0)

10+ Year Member



"doing absolutely nothing can bring them back."

Sure, there have been many reports of sites coming back with no changes made. Not so with the case being discussed in this thread. They didn't come back UNTIL something was done. Coincidence? I doubt it.

I guess you didn't read what I wrote, or you didn't understand what you wrote.

7:32 am on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



"Coincidence? I doubt it."

If you read what I wrote, I'm pointing out a different consideration. What you call a "coincidence" could in fact be primarily the result of a significantly different weighting of ranking factors. The point, again, is that even if a webmaster does nothing, Google is making changes on a daily basis.

The days are gone of a one size fits all idea of optimization, or of a stable idea of what Google "likes".

If you are getting hit over the head with a two-by-four, you could move out of the way, or in another scenario you could do nothing and the person could stop. The fact that you are not being hit anymore is not a "coincidence".

8:34 am on Jul 24, 2004 (gmt 0)

10+ Year Member



At least in my niche, the update a few days ago seemed to favor the raw number of links over mid- or high-quality ones

Good point steveb, I tend to agree.

After carefully monitoring about 10 different sites since the Florida update, some of which were "de-optimized" and some of which were not, I've come to the conclusion that time heals all wounds.

I think it's a matter of Google slowly reintroducing sites that got hit during the Florida hurricane because the current algorithms deem those sites to be once again relevant.

I find no evidence to support the theory that de-optimization works.

While there will surely be cases of webmasters de-optimizing their sites and subsequently finding them reappear, there are many more cases of webmasters de-optimizing their sites where the sites did NOT reappear.

There will always be coincidences.

11:31 am on Jul 24, 2004 (gmt 0)

10+ Year Member



I'd like to see how those pages that were renamed fare in a few more weeks. In my experience, newness still has a certain cachet with Google. You produce a site, get a few links, and show the client it's on page two after only a month. Next month it disappears completely, and it's a long haul back. That's not an optimisation penalty (in my opinion); it's the loss of some newness rating - a bit like back in the days of the dance!

Don't forget, no one knows what goes on in the box. We can only surmise. Scientists did this for thousands of years, and boy did they get some stuff wrong!

Sug

11:36 am on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What probably really happened is that you significantly changed your content, enough that Google's engine said, hmm, let's completely re-index the site. :) And the problems with how you had it coded before were finally removed... heh
12:10 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



SteveB. Agree.

It's of course probable that if the site was way over-optimized, some tuning back could do it well, but these days one of our sites ranks anywhere from 1 to 25 for 3 different 11,000,000-result terms, depending on which day/hour you search. The DCs and algos that present your results are (IMO) rotating very frequently. G is always testing, just like we are.

We pick up (perceived) differences and make alterations that we feel will be effective, and G does the same thing.

12:13 pm on Jul 24, 2004 (gmt 0)

10+ Year Member



I tend to agree with steveb. I have sites that were hit hard in the rankings. Some I de-optimized, some I left alone. All 3 sites came back a few weeks later. The sites I left alone came back higher; the de-optimized sites came back, but not as high.

I think it all has to do with the algo they are running at the current time.

wellzy

12:21 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Just because some sites come back with nothing done does not make it a universal truth like a mandate from the burning bush. The plain fact is that there are sites that will *never* come back unless some changes are made. And the changes that one person makes may be just the ones that someone else needs to make.

Aside from that and on another note, I personally have never been able to come up with any justification for anyone belittling others because they don't agree with them.

1:24 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think it all has to do with the algo they are running at the current time.

Agreed. In nearly all cases, changes in ranking come from one of: a change in the algo, a change of text/content, or, of course, backlinks.

Phrases in file names and alt text are unlikely to be the cause of poor ranking. OTOH, Google has done some pretty silly things in the past year :)

1:48 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



steveb is no doubt right. On the other hand, if one's site has been buried for some time, it could pay off to make some changes even without a scientific basis for doing so, and if that seems to give a positive result, it's tempting to claim "cause and effect," as the original poster did. But given all the variables, it can hardly ever be truly scientific, and it would be much less fun if it were. Long live the "art of search engine optimisation" and the ensuing discussion.
1:52 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



>buried

If a site is dead and buried for a long time, it needs resurrection. I'm not a believer in blind fate. If you *know* what the problems are, you *fix* exactly those problems, and it works - and other sites have those exact problems, aren't fixed, and stay in Google purgatory - then it's asinine not to apply what's been seen to work, when it's plainly identifiable.

So how come the ones with the same problems that weren't tended to have stayed buried if it's all up to blind faith in the changing winds of the algo?

2:24 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So how come the ones with the same problems that weren't tended to have stayed buried if it's all up to blind faith in the changing winds of the algo?

The "problems" were alt text and file names? This is the part that I find hard to believe, not that making any changes can have a positive effect.

How come sites/pages that make no changes at all often pop up 300-500 positions after being buried? IMO, it is very, very difficult to isolate a few variables when there is an enormous number of factors that may contribute to the "tipping point".

Personally, the only optimisation I do is through backlinks and proper titles. Of course, I make sure that the proper keywords are on the pages as well.

Although there are probably quite a few factors in the major SE algos, it seems that backlinks account for nearly everything in scoring, assuming the text is on the page and in the title, so the obvious place to look when one's ranking changes is backlinks. Of course, as steveb suggests, they often change the way they interpret/score backlinks as well. More often than most people realize.

Optimize: to make as perfect or effective as possible.

There is no such thing as "over-optimizing," of course, as the term makes no sense, and "de-optimising" would simply mean making a site less optimal or less effective - but that is a different topic entirely.

3:18 pm on Jul 24, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Uhh...at the risk of stating the obvious: if the "de-optimization" process included removing keyphrases from file names, doesn't that mean the file names were CHANGED, and the pages were therefore viewed by Google as new pages, which would be subject to the well-known FreshBot effect?

IOW, did the original pages "come back" or did NEW pages simply get the FreshBot effect?
