| This 103 message thread spans 4 pages: < < 103 ( 1 2  4 ) > > |
|De-optimizing pays off|
De-optimized several customer sites and went from nothing to top 10
| 3:48 pm on Jul 22, 2004 (gmt 0)|
Several people I met socializing at various piano bars and art events complained that their SEOed sites were not listed on Google at all. I decided to take a look and saw several 'unnecessary' tactics such as keyword phrases in filenames, <alt> text, etc. I removed it all, and within TWO weeks every single page that was edited was at least top 20, most were top ten, with five number ones for a two-word phrase. The sites already had great content and plenty of related incoming links.
De-optimizing is a fast way to get ranked better. Try it; you will be pleased.
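The kind of cleanup described above can even be scripted. Below is a minimal, hypothetical sketch (standard library only; the markup, filename, and target phrase are all invented for illustration) that flags the two 'unnecessary' tactics mentioned: a keyword phrase baked into the filename and repeated across alt text.

```python
# Hypothetical sketch: flag the 'unnecessary' tactics described above --
# a keyword phrase baked into the filename and repeated in alt text.
# Standard library only; the markup, filename and phrase are made up.
from html.parser import HTMLParser

class AltCollector(HTMLParser):
    """Collect the alt text of every <img> tag on a page."""
    def __init__(self):
        super().__init__()
        self.alts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "alt" and value:
                    self.alts.append(value.lower())

def stuffing_report(filename, html, phrase):
    """Count how often a target phrase appears in the places a
    de-optimization pass might clean out."""
    phrase = phrase.lower()
    parser = AltCollector()
    parser.feed(html)
    return {
        "phrase_in_filename": phrase.replace(" ", "-") in filename.lower(),
        "alt_repeats": sum(phrase in alt for alt in parser.alts),
    }

page = '<img src="a.jpg" alt="blue widgets"><img src="b.jpg" alt="cheap blue widgets">'
print(stuffing_report("blue-widgets-sale.html", page, "blue widgets"))
# {'phrase_in_filename': True, 'alt_repeats': 2}
```

Whether removing such repetition actually moves rankings is, of course, exactly what the rest of this thread argues about.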
| 3:42 pm on Jul 24, 2004 (gmt 0)|
new URL = new page
Mostly, that is, as with 301s it's some intermediate form of old-to-new page. afaik,imho,fwiw, etc.
>> Optimizing today is to consider (...) different algorithms operating at the same time.
Wise words, steveb.
| 5:03 pm on Jul 24, 2004 (gmt 0)|
Steveb's argument is defeatist. Yes, of course data centres have different algos and things change, but there are sites out there that ride all these changes and remain on top. The reason is that they are not subject to these algo tweaks and variations; they have an overall optimisation which is pretty much bulletproof.
funandgames was smart, he did something and changed the situation. He may have got closer to having a site built of brick rather than sand. It sounds like his site is now less reliant on a few superficial tactics and the seo emphasis is more subtle and robust.
If you experiment and make the right changes, you find yourself never having to make changes again. If you fluctuate with algo tweaks, you ain't got it right.
| 5:26 pm on Jul 24, 2004 (gmt 0)|
If you have fallen off there is nothing wrong with starting over again. I just don't think it is a good idea to remove some op if you want to move from 30 to number one.
| 5:37 pm on Jul 24, 2004 (gmt 0)|
That is not a disaster, ogletree, though it could be a minor penalty. But when someone's fallen out of reach of the civilized world, that isn't normal fluctuation.
| 10:52 pm on Jul 24, 2004 (gmt 0)|
"Steveb's argument is defeatist."
How is it defeatist to be able to ride out all the algo tweaks and rank #1? Google's algorithm is multiple moving targets, but solid webmastering, business sense and seo will be able to handle them all over time, even if occasionally some stuff temporarily flips out.
Personally I think the "phrases in filenames" part probably meant renaming pages. Of course, if you delete a horribly performing page and put its content on a new page, the new page will perform like any "fresh" page. Any demerits assigned to the old page won't immediately transfer to the new page, and if the new page is on an established domain, it won't have too many "holding pen" problems compared to the ones new domains have.
So, if page1.htm became page2.htm, then there is nothing to talk about now. Months from now, maybe. I just moved a page that was ranking #1 for a competitive but not very profitable term. I've dropped over a hundred spots for that term now, and I know mostly why, but the point is that the drop has nothing at all to do with Google's algorithmic judgement of the value of that page. Good or bad, the algorithm will catch up eventually with the new page.
If you have a page performing terribly, deleting it and building a new one is in itself a massive change that would dwarf anything else you do concerning the page.
| 3:43 am on Jul 25, 2004 (gmt 0)|
I was saying that top 30 is not a penalty; it is something that a few backlinks can fix. I think that most people who talk about penalties are talking in the hundreds. I don't think anybody thinks 30 is a penalty. You can work with that.
| 4:36 am on Jul 25, 2004 (gmt 0)|
If you know something about the Google algorithms and have read the published information (the Google white papers, patents, etc.), if you have some experience of successfully putting generalised "SEO" into practice, if you actually understand the PR formula, and if you have carefully studied "cause and effect" in practice across a number of websites, then you might be able to predict with a reasonable degree of confidence what the effect of a change here or there will be. You might even become an expert (I'm not), and perhaps you are working scientifically and can make a page go up or down at will, according to things you know for a fact and can prove if you need to.
That aside, pages go up and down in the natural way of things and without any change to the page itself, which is not surprising as the algorithms seem to be in constant flux and there are the 5 billion other pages to take account of. When someone says they did this and their page did that (so it "must be true"), it reminds me of the TV show magician who says he will stop the watches of people whose name begins with "J", and of course people whose name begins with J start to phone in saying their watches have amazingly stopped, ignoring the fact that there are countless millions of other unrecorded Jeremies whose watches are still going.
I've found this thread interesting and informative, and it was worth hearing funandgames's experience. But unless we also hear from the countless millions of other webmasters who didn't do what funandgames did but whose pages did the same thing - or from people who did do that thing but saw the very opposite effect - very little can be concluded, especially when it concerns an action that (as far as I know) isn't referred to in any officially published documentation.
| 5:45 am on Jul 25, 2004 (gmt 0)|
If no one's sharing of their experience is worth anything unless there are MILLIONS agreeing, then why even bother having a public forum to share? Why not just wait 'til Krishna Bharat decides to write a paper for us all telling us exactly what the algo is? Or 'til a million webmasters share all their experiences and their secrets for all to see in a wiki? But even then, there would still always be some critical people throwing their negative poison around - it's just their nature, regardless.
There are people who DO know what they're doing and who HAVE read the papers, and when their sites fall off the face of the earth they would be flat-out stupid to sit and wait, never doing anything at all after a reasonable period of time has elapsed. Out of every thousand pages, which is all that's returned, there are 900 pages that are not even in the top 100. To suggest that the webmasters of those 900 pages sit and wait for a little tweak of the algo to put them in the top 10 or 20 is as ludicrous a thing as could possibly be suggested.
There are not millions of SEO's who will share all they know with millions of others. Not a chance.
Hey, when someone posts something meaning to be helpful, why even bother having a thread that's dozens of posts long? Why not sticky the mods to remove any posts that tell anything or share anything, in case some WFA happens not to agree with it?
How about not ever sharing anything at all, in case some WFA disagrees with it - and just spend all our time trying to guess when the next PR update will be?
| 5:54 am on Jul 25, 2004 (gmt 0)|
|If no one's sharing of their experience is worth anything unless there are MILLIONS agreeing... |
Not sure where you got this from, but it wasn't me. What I did say is it's been an interesting and informative thread, but that one anecdotal piece of feedback isn't really "evidence" of anything one way or the other if one thinks about it scientifically.
| 8:15 am on Jul 25, 2004 (gmt 0)|
"The days are gone of a one size fits all idea of optimization, or of a stable idea of what Google "likes"."
Maybe you didn't understand my post or don't understand what you wrote. The above is defeatist talk. There are plenty of sites that rank at the top month in and month out. They provide a stable idea of what google likes.
"How is it defeatist to be able to ride out all the algo tweaks and rank #1?"
Are there two steveb's? Or are you very confused about what you previously wrote?
Marcia is right. Don't sit back expecting your sick neighbour's cat to help you; instead, emulate the successful webmasters who no doubt secured their success by taking every observation they could find and putting a large part of the jigsaw together.
Hats off to funandgames, he's being proactive and making a difference. He may fall again next month, but that in itself will be valuable information and demonstrate that there are important pieces of the jigsaw still missing. The trick is to move on and make the right conclusions.
The conclusion that "The days are gone .... of a stable idea of what Google "likes"" would make my cat sick as well. My cat's a street fighter; he would never surrender to such weak aspirations, especially while others are proving this not to be correct.
| 9:34 am on Jul 25, 2004 (gmt 0)|
"The above is defeatist talk. There are plenty of sites that rank at the top month in and month out. They provide a stable idea of what google likes."
Maybe you don't understand what defeatism is since I've posted exactly the opposite. Or, more likely, you are jumping to an uninformed conclusion rather than perpetually seeking a correct answer. This is the thread in a nutshell really.
Some folks want a definitive, simplistic black and white answer that they can cling to forever. Too bad for them. A more sensible road is to understand what I said earlier. The days are gone of a one size fits all idea of optimization, or of a stable idea of what Google "likes". The search engine world now is ever changing, even on a day to day basis. To succeed in this world you have to constantly be on your toes, working to understand the wide variety of phenomena, some even contradictory, that are in play. MHes wants to give up trying to understand what is going on because it's just too darn hard to have to think about stuff all the time, but that is the nature of the game.
Blindly lurching about in the dark is no way to succeed. Analyzing, experimenting, conversing and trying your best to understand is the way to go, whether you like to do the work or not.
There are plenty of sites that rank well month after month, and they do it by doing their best to understand what they are up against, not blindly pretending there is no way of understanding what is in play, or worse, blindly declaring that they KNOW what works or what has paid off.
Thinking you know it all and are "done" makes you dead. Even if something worked yesterday, that doesn't mean it will work tomorrow, and MHes may just hate it, but you have to study and learn and adapt again tomorrow.
| 9:45 am on Jul 25, 2004 (gmt 0)|
Oh my goodness.... there are 3 Steveb's! Every time they post they say something totally different.
Substitute references to 'MHes' with 'Steveb' in your last post and argue with yourself. This last incarnation was what I would have said to you. :)
| 9:52 am on Jul 25, 2004 (gmt 0)|
Then read more closely.
Looking at it again, I suspect you took something I wrote that said "It is hard", and concluded that meant "give up."
No, the answer to "it is hard" is "work harder".
I'd also add one other thing. The basics of optimization do not change daily or month to month. The basics of building a content-rich site (with good page titles) that has authoritative merit and is valued by other quality sites stay the same. What changes is a myriad of smaller things that, taken together, can be very big but individually seem minor.
I was talking to a friend about his HTML yesterday. I wanted him to take out the unnecessary bloat. When I mentioned the first example, he asked if it really made any difference, and I told him that this one thing was trivial on its own, but he had twenty bits of this bloat on a page and thousands throughout his domain. It's the same with SEO.
| 10:25 am on Jul 25, 2004 (gmt 0)|
This whole thread is sprinkled with a couple of people criticizing and making insulting remarks to a member who was kind enough and cared enough to share with fellow members. It's been a long time since I signed up as a member here, but I don't seem to remember any radio buttons we click when we sign up where we volunteer for any of the following roles as part of our membership:
1. Critic. Duties: decide whether other people's posts have value and if they think not, publicly insult or ridicule them so that they'll stop posting.
2. WFA (aka "World's Foremost Authority"). Duties: make sure to make demeaning, belittling remarks to those members who in their estimation know less than they are *SURE* they know and make sure everyone knows how superior their knowledge is to everyone else's.
3. Carnac the Mind Reader. Duties: without having seen the sites in question, knows beyond a shadow of a doubt whether a post is correct or whether a person is talking off the top of their hat when they relate something about a site they've worked with, and is thereby qualified to correct everyone.
4. Self-appointed moderators. Duties: decide what should or shouldn't be posted based on their highly superior algo-cracking skills.
Funny thing is, the first post was "sharing" some tips and out of everyone who jumped on the member's case, none followed suit by also sharing tips about what might have been helpful to other members in improving rankings.
What I saw was egocentric, scathing sarcasm and criticism, with a few narcissistically and quite rudely placing themselves at the center of attention, showing themselves to be "experts," and taking the entire discussion off-topic, thereby leading other members to jump in, smooth over the despicable display of rude behavior, and try to save a fellow member's feelings.
A couple of people around here, who I might add happen to be chronic offenders when it comes to being insulting and negative and starting friction and dissension in discussions, need to forget about what great algo-crackers they are and take a little time to learn some common courtesy and develop better people skills.
| 11:23 am on Jul 25, 2004 (gmt 0)|
Funandgames did the right thing in my opinion, as long as the site in question had been 'missing' for at least a month. I believe Florida was the first application of hilltop, which proved to be too aggressive. They toned it back, hence some sites recovered with no changes. However, things have settled a bit since then, and slowly the hilltop effect is being increased in a more controlled way. One way they may be doing this is by only applying hilltop to the top 100 positions in an initial serp for a search term.

Many low quality sites in the past have gained high rankings by simplistic and crude optimisation. These sites are successfully being weeded out via hilltop, because their 'links in' just don't bear scrutiny. Thus, by default, the sites that initially ranked lower are replacing them. Ironically, by not appearing in the first 100 in the initial search, you are not subjected to hilltop, and after hilltop has been applied you can rise dramatically. If your site has crude optimisation and/or appears in the top 100 AND survives hilltop, you will still rank very well.

I can understand google observing that sites ranking 100+ for a search term in their initial search are often very relevant, but fail to rank well because of poor or absent seo. This addresses the problem by scrutinising and removing the sites in the top 100 that do not deserve to be there - the ones with clever seo stuff but no 'trusted links in'. This gives non-seo sites that have never ranked well in the past a chance to rank better, based not on what the webmaster does, but on removing the 'seo'd sites' that do not deserve to be there.
What Funandgames may have done is lower his rankings beneath the hilltop radar. His site may have been booted out because of a lack of good links in from relevant sites. Now he is appearing because his site is not being subjected to hilltop, but ranked on 'moderate seo'. The top sites will have a combination of seo and relevant links in. Many will have vanished, despite ranking well on seo factors, because they did not survive hilltop.
It strikes me as a very clever tactic by google. They know there are plenty of great sites which never rank well. They cannot tell the webmasters what to do to improve things, so they target the ones who know what they are doing and apply hilltop. By 'de-optimising' a site with poor 'relevant links in' you escape hilltop and the net result is a rise in rankings. However, the trick is to have maximum seo AND survive hilltop.
It's just a theory.... but hey, what do I know?
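For what it's worth, the theory above reads like a two-pass ranking, and it can be sketched in a few lines of toy code. Everything here is an assumption for illustration only: the depth of 100, the `has_trusted_links` test standing in for the hypothesized hilltop check, and the invented site names.

```python
# Toy rendering of the theory above: apply an extra 'trusted links'
# filter only to the top 100 of an initial ranking, and let the
# un-filtered lower results fill the vacated slots. All values are
# invented for illustration; this is not Google's actual algorithm.
def rerank_with_filter(initial, has_trusted_links, depth=100):
    """initial: list of site ids, best first.
    has_trusted_links: site id -> bool (the hypothetical hilltop test)."""
    scrutinized = initial[:depth]
    survivors = [s for s in scrutinized if has_trusted_links(s)]
    untouched = initial[depth:]  # never subjected to the filter
    return survivors + untouched

sites = [f"site{i}" for i in range(120)]
trusted = lambda s: int(s[4:]) % 3 == 0  # invented: every third site survives
final = rerank_with_filter(sites, trusted)
print(final[:5])
# ['site0', 'site3', 'site6', 'site9', 'site12']
```

Note how, in this sketch, a site that slips below position 100 in the initial pass escapes the filter entirely and ends up above every filtered-out competitor - which is exactly the "de-optimise beneath the radar" effect the post describes.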
| 12:01 pm on Jul 25, 2004 (gmt 0)|
I honestly don't think de-optimising is the way ahead (unless in the process you remove something genuinely 'dishonest' - whatever that means! - or unwittingly add something)
It's easy to get causality messed up when tinkering with sites.
IMO it's best to think in terms of what your site may be *missing* - not what it already contains. This is more in keeping with a semantic algo.
| 12:59 pm on Jul 25, 2004 (gmt 0)|
|It's just a theory.... but hey, what do I know? |
My experiences confirm the same phenomenon, MHes. I posted similar experiences in several other threads, but no one seemed interested.
Thank you, Marcia. Very well said.
| 1:39 pm on Jul 25, 2004 (gmt 0)|
"IMO it's best to think in terms of what your site may be *missing* - not what it already contains."
Yes, I agree. The perception of a 'penalty' is often a case of google choosing to ignore that bit of the page.... so make sure it always has something it likes... whatever the algo tweak.
| 6:10 pm on Jul 25, 2004 (gmt 0)|
I would be interested in seeing what happens if funandgames changed back to the last versions of the pages, i.e. take back the changes he made. This would be the best way to see if the changes in the ranking were caused by the changes of the pages or not. Although this wouldn't be a proof, it would give strong evidence.
Unfortunately, I doubt that he would do this just to (dis-) prove a theory.
Anyhow, I wouldn't call the process de-optimizing for reasons already mentioned. Also, I never understand why some people think that more use of a phrase should automatically yield a better ranking.
| 11:44 am on Jul 26, 2004 (gmt 0)|
After Florida, a friend and myself decided not to kneejerk, and looked at the semantic side instead.
I didn't 'deoptimise' in any way, although my friend did - slightly - but in both cases we actually added stuff.
Rankings for both our businesses were eventually back to normal. And with our new understanding we are now at the top of the rankings - rather than No 2/3 etc.
| 6:17 pm on Jul 26, 2004 (gmt 0)|
Marcia, as always, speaks with wisdom, and encourages balance. Words to live by. :-)
Unfortunately the OOP threads tend to become muddled when members hijack them by redefining what the thread was intended to be about (or, at least, that happened in this case).
*This* thread was intended to be about a site that had been too aggressively loaded with kw's:
|"I decided to take a look and saw several 'uneccessary' tactics such as phrases in filenames, <alt> text etc." |
Note that funandgames specifically said "too aggressively loaded"...he did not say that any of those things were bad per se.
In fairness to those who keep attacking "OOP", the problem seems to have to do with the terms 'optimization' and 'SEO' when used in the context of "over-doing it" (a.k.a. OOP).
Yes, taken out of context, "optimizing" implies "working towards optimum performance," "making perfect," etc. But in the context of SEO, optimizing is really a matter of developing and tweaking site elements to improve site rankings in the SERP's.
"Over-optimizing" can sink a site. This is not a new concept. In its simplest form, overdoing kw density in text will get you in trouble. In reality, the algos continue to become more complex, and it is the combination of multiple (excessive) tactics that more often than not causes problems.
Since Florida, when G tightened up its filters or changed its algo or whatever you want to say, "over-optimization" became a bigger, more emotional topic. For some of us, the term "OOP" became a shorthand way to refer to the process of site tweaking for rankings enhancement...taken too far.
Some will insist there's no such thing as "over-optimization," arguing that optimization can't be "overdone" by definition. "There is no such thing as an OOP filter - there are just algos, good webmastering, and bad webmastering."
While this may be technically correct, it is misleading if not downright sneaky wrt younger/junior webmasters, who may conclude that there is no such thing as going too far with their "optimization" efforts. Such a conclusion is likely to keep their site buried forever.
So perhaps we should stop calling it "OOP". I personally like the term just because it's colorful and pretty darned descriptive of the problem (which in most cases is overuse of keywords, a lack of understanding of the algos, or both). But the term clearly causes problems.
So: What shall we call it when a webmaster uses the same keyword(s) too frequently (e.g., in internal text, internal links, H1/2/3, titles, META, inbound links, etc.) ... and as a result their page/site vanishes from the SERP's? What shall we call that?
If we can come up with a term for that behavior - one that we all find to be accurate - perhaps we can get on with discussing in a useful way what constitutes, ummm, OOP. Because it ain't just your mother's kw density anymore.
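For anyone new to the term being argued over: "kw density" is just the share of words on a page that are the target keyword. A minimal sketch of the measurement (the sample copy and keyword are invented, and no real threshold is implied - as the post says, the algos have long since moved past raw density):

```python
# Rough sketch of the 'kw density' measure this thread keeps invoking:
# the fraction of a page's words that are the target keyword. The
# sample text is invented; no claim is made about what density is 'safe'.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(w == keyword.lower() for w in words)
    return hits / len(words)

copy = "Widgets here, widgets there. Buy widgets: our widgets beat other widgets."
d = keyword_density(copy, "widgets")
print(f"{d:.0%}")  # prints 45% -- 5 of the 11 words are the keyword
```

A density that high in running text is the "simplest form" of over-doing it described above; the thread's larger point is that it is the combination of many such excesses, not any single number, that causes trouble.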
| 6:30 pm on Jul 26, 2004 (gmt 0)|
The reason some of us said what we said is because we have sites with more over-op than the original poster, and the sites do very well because of it. I repeat my kw's a ton: I have the same kw in the title, h1, and file name. I have the same kw in links pointing out to several of my own internal pages. It is easier to disprove this theory than to prove it. Just making changes to your page can help. Renaming your pages can help because G sees them as new pages. I have just seen way too much evidence that disproves that theory. Some of us know this and are tired of hearing misinformation.
I accidentally de-optimized some pages the other day (most of my pages are auto-generated) and they fell like rocks. There is no penalty for too many kw's. There is a penalty for not having enough backlinks and PR.
| 6:42 pm on Jul 26, 2004 (gmt 0)|
|So: What shall we call it when a webmaster uses the same keyword(s) too frequently (e.g., in internal text, internal links, H1/2/3, titles, META, inbound links, etc.) ... and as a result their page/site vanishes from the SERP's? What shall we call that? |
In one thread a couple of months ago, someone referred to it as a filter looking for "unnatural" pages and linking. (I forget who said it)
I think that is a much better, and more accurate, way to talk about it.
| 6:47 pm on Jul 26, 2004 (gmt 0)|
|So: What shall we call it when a webmaster uses the same keyword(s) too frequently (e.g., in internal text, internal links, H1/2/3, titles, META, inbound links, etc.) ... and as a result their page/site vanishes from the SERP's? What shall we call that? |
What shall we call it when a site has way too much blue or too many vowels or too many pictures of cows and the site drops?
| 7:03 pm on Jul 26, 2004 (gmt 0)|
BigDave - Let's call it the UP filter (Unnatural Pages Filter). Trigger the UP filter, and G knocks you DOWN. :-)
Ogletree, we have proven to our own satisfaction that there are circumstances that will allow some sites to go further with their SEO tactics than others. A number of members have commented on it. Sort of a threshold thing. You hit a threshold level and G becomes more forgiving on other fronts...making the strong stronger...
It has been a while since I've covered this ground, but: We reserve 10 sites at all times for testing only. We have made (and still can make) pages and/or entire sites go away and return based on changes we make. There are filters that knock you out for going too far, in any number of areas. And as EFV noted, they often seem to be somewhat co-dependent.
| 7:30 pm on Jul 26, 2004 (gmt 0)|
While it has not affected me or my sites, and I have not done any testing on it, I tend to believe it exists simply because it makes sense for it to exist.
If a page has 10,000 external links, and they all have the same anchor text (not even a "click here" or "link"), that is not natural.
Repeating a word 3 times in almost every sentence is not natural.
The reason that I do not really like references to over-optimization is that I am under the impression that Google wants you to optimize your pages. They just don't want pages where the *only* thing they have going for them is the optimization. They want some objective proof that your page has value, not just that you have worked your way down the SEO checklist.
That is my uninformed opinion, but it is what I would be looking for if I were to be working on such a piece of code.
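That "same anchor text" observation is easy to make concrete. Here is a hypothetical sketch measuring how concentrated a page's inbound anchor text is (the sample anchors are invented, and any cutoff you might apply to the result is equally an assumption - nothing here is a known Google value):

```python
# Sketch of the 'not natural' signal described above: what fraction of
# a page's inbound links share the single most common anchor text?
# Sample anchors are invented; no real threshold is implied.
from collections import Counter

def anchor_concentration(anchors):
    """Fraction of inbound links sharing the most common anchor text."""
    counts = Counter(a.strip().lower() for a in anchors)
    return max(counts.values()) / len(anchors)

natural = ["click here", "Example Widgets", "this site", "example.com", "widgets"]
stuffed = ["cheap widgets"] * 9 + ["click here"]

print(anchor_concentration(natural))  # 0.2
print(anchor_concentration(stuffed))  # 0.9
```

A profile like `stuffed` - nine identical keyword anchors and barely a "click here" in sight - is exactly the pattern the post calls unnatural, while organically earned links tend to look more like `natural`.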
| 8:57 pm on Jul 26, 2004 (gmt 0)|
|They want some objective proof that your page has value, not just that you have worked your way down the SEO checklist. |
Exactly. In my experience, a good number of on-topic inbound links from expert sites is one thing that makes Google more forgiving of the UP filter.
| 9:24 pm on Jul 26, 2004 (gmt 0)|
Unnatural pages could be confused with a lack of properly constructed sentences, or with long lists, neither of which applies to oop.
What about an OTT filter (over the top)?
| 9:30 pm on Jul 26, 2004 (gmt 0)|
I asked a question about this topic back in May -- [webmasterworld.com ]
I inherited a site that was very spammy in nature and dropped off the radar--though not completely out. The site has mostly rebounded back into the top 10-30 with regular fluctuations for most keywords since then. I would definitely say that correcting on page over-optimization brought this site back to life.
Although the initial goal of this correction was to hit the then google algos, ultimately it was optimization for human use that brought the site back into balance (and sound advice from Marcia and caveman). The following quote was part of the advice I received, and I think it is also appropriate for this discussion:
|Perhaps it will help you to know that most of what causes a site to suffer in G is algo based ... so to the extent that you are able to remove or modify elements that might be causing a problem for the site (WRT its ranking in the SERP's), the more likely it is that the site will be free to rank based on its merits. |
| 9:35 pm on Jul 26, 2004 (gmt 0)|
"What shall we call it when a site has way too much blue or too many vowels or too many pictures of cows and the site drops? "
Did this happen to your site?
| 9:40 pm on Jul 26, 2004 (gmt 0)|
Over The Top filters...yeah I like that better. More of a ring to it...yet tasteful, and descriptive.