Google News Archive Forum

Is Google developing an SEO neutralising policy?
John_Caius
msg:87192
6:16 pm on Jul 7, 2003 (gmt 0)

Recent posts on the topic of the Dominic and Esmeralda updates have introduced the idea of SEO neutralisation as a possible goal of Google's. As far as I can tell, there is no hard data to back this up, but it remains an interesting topic to discuss - specifically, whether it makes rational sense.

Too often, analysis of updates and of the Google algorithm in general lapses into detailed hypotheses with no hard evidence, forgetting the basic core of search - identifying the most relevant results for a particular search query.

What is SEO neutralisation?

Some of us get a bit twitchy around update time - "have we used unethical practices, are we going to get penalised, have we linked to bad neighbourhoods" etc. This leads logically to "have we over-optimised?"

We might think that Google would introduce a new element to the algorithm - SEO neutralisation, i.e. penalisation of those who have over-optimised.

However, Google of course already does this, with its myriad algorithm elements designed to combat spam: penalising link farms, warning against linking for the purpose of increasing PageRank, introducing its spam report form, etc. Over-optimisation is really only a different term for spam, or for an unethical site design strategy.

Optimisation and over-optimisation

Optimisation is making sure that your <title> accurately reflects the content of your page, making sure that the header is marked as a header, i.e. <h1>, making sure that your images have alt text describing the images etc. Optimisation is analogous to good site design.

Over-optimisation is putting every possible keyword in your <title> tag, putting seven different keywords on the page, each in its own <h1> tag, and stuffing keywords into your image alt attributes. Over-optimisation is analogous to trying to kid the search engine that your page is more than it really is.
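
To make the contrast concrete, here is a rough sketch - the widget page and its keywords are invented for illustration:

  <!-- optimisation: markup that honestly describes the page -->
  <title>Blue Widgets - Acme Widget Co</title>
  <h1>Blue Widgets</h1>
  <img src="blue-widget.jpg" alt="A blue widget seen from the front">

  <!-- over-optimisation: the same elements stuffed with keywords -->
  <title>widgets blue widgets cheap widgets buy widgets discount widgets</title>
  <h1>widgets</h1> <h1>blue widgets</h1> <h1>cheap widgets</h1>
  <img src="blue-widget.jpg" alt="widgets blue widgets cheap widgets buy widgets">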

So let's get back to the basics

Let's take a big step back, forget about whether <h1> tags are still relevant, whether we've used the same anchor text too many times, whether our outbound link to keyword density ratio should be nearer 0.34 or 0.57 - and think as if we were a search engine reading a page that had never been near an SEO consultant.

The googlebot looks at pages. It needs to know what a page is about. It can't read, it can only see words. Certain specific words tell it what the page is about.

On most webpages, the title tag says something pretty accurate about what's on the page. Where <h1> tags are used, they almost always mark a heading. Not many people know about resizing h1 text with a CSS file, so <h1> means big bold text. When people link to a site, they usually use a sensible word or phrase that says something about the site, such as "buy books here" or "Amazon". If the page is about fish then it's pretty likely that the word "fish" will turn up in the body text and there'll be some pictures of fish, hopefully with some alt text saying that they're fish.
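
(For the record, that CSS trick is a one-liner - a rule like this in a linked stylesheet makes an <h1> render no bigger or bolder than body text:)

  /* makes an h1 look like ordinary body text */
  h1 { font-size: 1em; font-weight: normal; }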

However much you worry about the intricacies of SEO, at the end of the day, any search engine has to get its information about the relevance of a webpage from somewhere. The relevance of a page is defined by what's on the page and the links to that page - that's a fact and it's not going to change.

Good website design and good SEO

Good website design is about creating useful, interesting and relevant content that is graphically pleasing, simple to use and interact with.

Good SEO is simply about telling the search engine what the page is about. Most people don't use all the factors that googlebot assigns importance to, so their links might be called "click here", for example, or they don't assign <h1> tags to their headings.

Too much website with not enough SEO leaves a website that underachieves, simply because Google doesn't know what it's about and therefore doesn't know that the content is relevant to what the user's just searched for.

Too little website with too much SEO leaves a website that overachieves, for which Google uses the term 'spam'. If you feel that you've reached a limit as to the amount of ethical SEO you can do then build more website - your ethical SEO will therefore now be more powerful.

So let's have a bit less paranoia and remember the principle - Google needs to know what your page is about to be a meaningful search service, but it doesn't want to be cheated. So tell it in all the ways you can, but don't oversell an inferior product - improve on what you've got by building a better site for your users, and see your ranking and traffic improve with it.

Mohamed_E
msg:87193
7:29 pm on Jul 7, 2003 (gmt 0)

John_Caius,

Many thanks for an excellent post. It is posts such as yours that have prevented me from giving up completely on Forum 3.

swerve
msg:87194
7:32 pm on Jul 7, 2003 (gmt 0)

Great post, John.

I have thought about this too recently. Here is a question that I think professional SEOs should contemplate: Given your experience and knowledge about Google and SEO, do you think you can help a client's pages rank higher than other, more-relevant pages? If the answer to this question is yes, and if Google's goal is to provide the most relevant results to users, it would make sense that Google would want to make some changes to ensure (user) relevancy of search results.

That said, I don't know how successful they would be at attempting to identify "over-optimized" sites in order to improve relevancy. Keyword density, overuse of <h1> tags, and anchor-text stuffing are not as cut-and-dried as something like hidden text. The best solution, I think, would be better semantic page analysis algorithms - algorithms that do a better job of understanding what a page is about. That would allow Google to rely less on other scoring factors such as keywords, titles, and <h1> tags.

Oaf357
msg:87195
7:38 pm on Jul 7, 2003 (gmt 0)

Outstanding. Now I remember why I come to Forum 3.

JonR28
msg:87196
8:06 pm on Jul 7, 2003 (gmt 0)

I have some of the most irrelevant stuff ranking in front of me, even domain names for sale (okay, that was MSN Search), so I say bring on the SEO neutralising if it means more relevant results.

glengara
msg:87197
8:21 pm on Jul 7, 2003 (gmt 0)

I'm working towards pages that need to be looked at twice to determine whether they've actually been optimized.

satanclaus
msg:87198
8:46 pm on Jul 7, 2003 (gmt 0)

Great post. Someone should put that on the front page of WebmasterWorld.

MOOSBerlin
msg:87199
9:02 pm on Jul 7, 2003 (gmt 0)

John_Caius: Optimisation is making sure that your <title> accurately reflects the content of your page, making sure that the header is marked as a header, i.e. <h1>, making sure that your images have alt text describing the images etc. Optimisation is analogous to good site design.

You are right. Greetings from Germany, and thanks for your great post.

Mohamed_E
msg:87200
9:04 pm on Jul 7, 2003 (gmt 0)

I'm working towards pages that need to be looked at twice to determine whether they've actually been optimized.

An excellent way of putting it! I describe my pages as "lightly optimized", but I prefer the way you put it. Hmm, soon I may be saying that I coined the phrase.

ericjunior
msg:87201
9:30 pm on Jul 7, 2003 (gmt 0)

John - nice to see a post from someone who's taken a step back and looked at the bigger picture.

Things have got so fraught over the past months that I must admit I have been reading less and less of this forum, as every thread seems to repeat the same issues of confusion and despair. As you suggest - "improve on what you've got by building a better site for your users, and see your ranking and traffic improve with it" - I have refocussed on making my sites better. But herein also lies my problem with one part of your thoughts...

These new filters are hindering even the most cautious of SEOs, who ARE purely trying to improve their sites for the end user, not to spam!

I recently created 2000 pages of new content, generated dynamically, which DO very much improve the website's services, and which were built without a single thought as to how they would improve the site's listings - but I now sit here waiting nervously to find out whether I've sailed too close to the wind on one of these new algorithms and will end up getting penalised. Penalised for improving a site!

Your ideas make perfect sense - but I am afraid the reality of the current situation is a little harsher!

needhelp
msg:87202
9:30 pm on Jul 7, 2003 (gmt 0)

Not to put a damper on this thread, but... what if you have to consider more optimization than you would normally like, in order to rank better than pages that are clearly LESS relevant than yours but keep ranking better? There seems to be a very, very fine line, and for me, as a site owner who only wants to play by the rules, I have no clue anymore what to do. I know, I know... boo hoo!

Oaf357
msg:87203
9:53 pm on Jul 7, 2003 (gmt 0)

Relevancy is a general opinion, isn't it?

Perhaps others think your "more relevant" content isn't really relevant.

Just food for thought.

Mohamed_E
msg:87204
9:57 pm on Jul 7, 2003 (gmt 0)

An issue related to needhelp's question: Are there some subject areas in which there is so much spam that an honest site has no hope of doing even moderately well?

I have no personal experience, but suspect that there are indeed areas where spam rules.

Powdork
msg:87205
9:58 pm on Jul 7, 2003 (gmt 0)

Given your experience and knowledge about Google and SEO, do you think you can help a client's pages rank higher than other, more-relevant pages?

The first thing to go over with my clients is how they can add more content to make theirs the most relevant, or to make sure it stays the most relevant.
But in my experience, optimisation often means fixing things where someone has made them unspiderable: a page about widget packages that links to 20 different packages with a JS rollover that could easily be handled with CSS; a photo gallery about widgets where the alt text is hp318 6/03/03 (what the digital camera named it); everything hidden behind a Flash intro, etcetera. Finding the balance between design, searchability and usability is what keeps clients coming back. Not finding the balance brings them back - for a refund.
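
As a sketch of that rollover fix (class names and filenames invented for illustration), a plain text link styled with CSS :hover stays spiderable where a JavaScript image swap often isn't:

  <style type="text/css">
    /* illustrative only: a text rollover with no JavaScript */
    a.package { color: #036; text-decoration: none; }
    a.package:hover { color: #fff; background-color: #036; }
  </style>

  <a class="package" href="deluxe-widget-package.html">Deluxe widget package</a>

  <!-- and for the gallery: describe the photo instead of keeping the camera's filename -->
  <img src="hp318.jpg" alt="Deluxe widget package with all parts laid out">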

MOOSBerlin
msg:87206
10:03 pm on Jul 7, 2003 (gmt 0)

ericjunior: I am afraid the reality of the current situation is a little harsher!

Nobody knows (besides Google)!

steve128
msg:87207
10:09 pm on Jul 7, 2003 (gmt 0)

>>><title> accurately reflects the content of your page, making sure that the header is marked as a header, i.e. <h1>, making sure that your images have alt text describing the images<<<

If only it were that easy. If Google is attempting to stop "spammers" through Esmeralda et al., they are not doing a good job of it.

The original poster's message, in all sincerity, is naive in the extreme.

A good spammer will always beat a good "optimizer" - maybe only short-term, but that is their aim: short-term goals, and another site or sites the following month.

I have no doubt whatsoever that spammy sites will win. I have had it with Google - call me a "can't beat 'em, join 'em" member, so what!
It is far cheaper to create a spammy site than to try to go by the rules and still end up on page #50.

You know, creating a full-on, no-messing-about **** spam site is dead easy. Sure, it won't last long... will it?

swerve
msg:87208
10:11 pm on Jul 7, 2003 (gmt 0)

Relevancy is a general opinion, isn't it?

Relevancy is a Google opinion :-)

As you say, true relevancy is in the eye of the beholder: what is relevant to one person may not be relevant to someone else, even for the same query. At some point in the future, search engines will get to know users on a more personal level, taking into account a searcher's profile and/or previous searches in order to improve relevancy for individual users. In the meantime, relevancy is what is generally (and geographically) determined by Google's algorithms.

swerve
msg:87209
10:16 pm on Jul 7, 2003 (gmt 0)

The first thing to go over with my clients...

All good points, Powdork. However, since you did quote my question, I would like to point out that you didn't answer it :-)

mfishy
msg:87210
10:17 pm on Jul 7, 2003 (gmt 0)

<<Perhaps others think your "more relevant" content isn't really relevant.>>

Google now only thinks my content is relevant on odd days, but on those days it is REALLY relevant... :)

As far as an SEO neutralization algo goes, there has always been one. Google has never WANTED pages to rank high that have been heavily manipulated (backlinks). Their entire system is based on the presumption that web pages link to each other "naturally".

Of course, things that may have worked last month may not work at all today. But this is, and always will be, the way GG operates. I can think of a ton of tactics that worked really well five months ago and are already dead.

The only "strange" occurrence I see with the new and "flighty" Google is the random appearance and removal of pages with no real changes. Whether this is down to the new algo/crawling process or is a lousy side effect of the new algo, I have no clue.

I do not find it surprising that Google may have tightened up on some common SEO practices, as they have always done so. The problem is that many times when they change to combat abuse, overall quality tends to dip.

Either way, over the years I have learned to position myself so that even my bad days/months are pretty darn good :)

I hope that those who are just starting out, or have been lucky enough not to experience the wild swings of SEO, have learned that the only constant in this game is change.

Of course this doesn't mean that losing your index page, bouncing in and out of the SERPs, or being banned doesn't just flat-out suck. It's the actions you take from this point, however, that will either make you rich or send you back to the 9-5 world.

ericjunior
msg:87211
10:22 pm on Jul 7, 2003 (gmt 0)

"9-5 world" - now theres a distant memory! Currently working the 7am-1am world - where did i go wrong!

steve128
msg:87212
10:30 pm on Jul 7, 2003 (gmt 0)

lol... I'm 9am till 3am... but it is more fun, and the wages are better. I can even take a break when I want ;-)

steve128
msg:87213
10:33 pm on Jul 7, 2003 (gmt 0)

>>> Either way, over the years I have learned to position myself so that even my bad days/months are pretty darn good :) <<<

Position yourself in Google? lol, even David Blaine can't do that... you're on to something... or maybe, never mind ;-)

mfishy
msg:87214
10:38 pm on Jul 7, 2003 (gmt 0)

<<Position yourself in Google? lol, even David Blaine can't do that... you're on to something... or maybe, never mind ;-)>>

Not magic :)

I just have it set up so that if some pages/sites do poorly, there are always others out there to take up the slack. Not to mention a ton of return visitors.

ericjunior
msg:87215
10:41 pm on Jul 7, 2003 (gmt 0)

So, mfishy - are you immune to a site-wide penalty?

mfishy
msg:87216
10:47 pm on Jul 7, 2003 (gmt 0)

<<So, mfishy - are you immune to a site-wide penalty?>>

Of course not. Ranking poorly hurts everybody.

Having multiple sites, and of course other streams of revenue, makes "penalties" hurt a lot less. This was my point.

I could not sleep at night, however, if I had only one site and that one site generated all of its income through Google.

TheWhippinpost
msg:87217
11:10 pm on Jul 7, 2003 (gmt 0)

I think it's a "lock & key" situation; Google has the lock, but only hints at what key we should have.

If Google wants to provide its users with good, relevant results, it should try to perform algo analysis in a way that "forces" us to create pages that are relevant, and then provide us with the key. It could be argued that they're censoring content by not telling us what the key is.

All this "micro-crypto-analysis" that we've seen debated tirelessly here is symptomatic of the above, and reminds me of the code-breakers during times of war! These debates wouldn't exist if only we knew the format in which Google wants pages presented - i.e. if it gave us the key.

My theory is that we're ALL "spammers". Even if we play by the rules, we make sure we PLAY by those rules... we almost have a duty to ourselves and our audience to do so! The problem is, we don't know what those rules are; it's left to us to "accidentally" break them to find out.

I also think Google has unintentionally created another spam problem: webmasters spamming other webmasters for links in order to gain PageRank, whether the link is context-relevant or not... affecting members of this board too, I've come to notice!

mipapage
msg:87218
11:11 pm on Jul 7, 2003 (gmt 0)

A good spammer will always beat a good "optimizer" - maybe only short-term, but that is their aim: short-term goals, and another site or sites the following month.

Sad but true.

In my humble and not-so-expert opinion (though our sites do well!), GoogleGuy and others have mentioned that Google likes what is good for the user.

Well, I think the W3C likes that too, so for me, SEO for Google simply means making a good, well-thought-out, lean site with valid code. And have you seen this thread [webmasterworld.com]? Very good, very basic 'new-school' design ideas.

There are a lot of non-validating sites with poor usability and accessibility out there. I've had success just writing a site with valid code (h1s, h2s, p tags and all), a good title tag and some focused inbound anchor text. There's an advantage out there that I think Google embraces - I would call it designing with web standards. So in this respect, they won't be doing any SEO neutralising.
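
To show what I mean, here's a minimal skeleton (the widget subject is invented) - it validates as XHTML 1.0 Strict, and every element tells Google something about the page:

  <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
  <html xmlns="http://www.w3.org/1999/xhtml">
  <head>
    <title>Blue widget care guide</title>
  </head>
  <body>
    <h1>Blue widget care guide</h1>
    <p>How to keep a blue widget running for years.</p>
    <h2>Cleaning your widget</h2>
    <p>Step-by-step cleaning instructions go here.</p>
  </body>
  </html>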

Unfortunately the quote at the top of this message is true - and as long as Google can't do anything about it (and I see they aren't, as we're surrounded by TOS violations), people resort to 'fine-line SEO' (or just blatant spamming), for lack of a better term. I really don't see them SEO neutralising until they get rid of the easy stuff - black text on a black background, for instance. But some people may call 'BT on a BB' SEO - like the aforementioned TOS violators - sadly, each one offers web design and web promotion <rant> off they go, misleading clients with their TOS-violating methods </rant>.

So, we design to standards, build content, get good quality links, and hope for the future...
And while we hold our own against them, I think we're the lucky ones...

John_Caius
msg:87219
12:13 am on Jul 8, 2003 (gmt 0)

There are distinct groups of WW members here - some who go for "build a content-rich site well and watch for a general trend in ranking and traffic improvement" and some who go for "build a highly targeted site and go for high ranking in a single highly competitive keyword/phrase". The first group will probably generally agree with my first post and I would consider myself amongst them. The second group will probably generally disagree with my first post as the highly competitive areas are indeed often dominated by spammers, networks of interlinked sites etc.

However, perhaps it would help the second group in their endeavours to consider the views of the first, specifically in two areas:

1) The page versus the chapter approach

Take the first post's approach of thinking like a search engine, with the search term 'asthma' as an example. If I'm ranking pages in terms of relevancy, then I might have three types of results, in order of importance:

a) the authority, e.g. The American Asthma Association
How do I know it's an authority? Well, it's got 10,000 external incoming links, all with the word asthma in them, from some very important sites like the WHO and the CDC.

b) a chapter on asthma in an online medical textbook
There's the chapter homepage and then another 30 pages of content. The SEO bit is to link the subpages back to the chapter homepage with anchor text "asthma homepage", so that the googlebot comprehends the hub arrangement and sees that the homepage is actually the top of a pyramid of relevant content (see the sketch below).

c) a single page on asthma, perhaps someone's personal site.
Since there's only a single page of content, the googlebot has to assume that it's not as relevant or valuable a result as the 30-page chapter.

This approach can be applied to any keyword area. Even in the most competitive keyword areas, pages frequently stand alone as a single, albeit highly optimised, page of content. If you're trying to outrank your competitor then build a chapter of content under that main page and make it clear to the googlebot that the main page is now the hub of a little pyramid of information, hence more relevant and valuable than a single page of content.
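
In markup terms the hub arrangement is nothing exotic. A hypothetical asthma chapter (filenames invented for illustration) would look something like this:

  <!-- on the chapter homepage, asthma.html: link down to every subpage -->
  <a href="asthma-symptoms.html">Asthma symptoms</a>
  <a href="asthma-treatment.html">Asthma treatment</a>
  <a href="asthma-in-children.html">Asthma in children</a>

  <!-- on each subpage: link back up with descriptive anchor text -->
  <a href="asthma.html">Asthma homepage</a>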

2) The broad site theme versus the narrow site theme

When Google launched Google News, did they have to optimise that section very strongly to rank in the top ten for the keyword 'news'? Did they have to find a thousand high PR incoming links? No - because the main site, albeit based on a different specific theme, passed on ranking weight to the new service, such that it could compete instantly with some of the most widely linked sites on the web.

What lesson can we learn? Well, most sites built to compete in a highly competitive area do one thing - maybe they sell widgets, maybe they give booking information for Widget Hotel, whatever. Unfortunately, if your site only does one thing then you need a heck of a lot of optimisation before it will compete. Look at Google - its main focus is as a search engine and after several years and a quarter of a million incoming links, it's still behind Lycos and Altavista in the SERPs.

If your competition has optimised sales pages, then they also only have links in from sites that want to link to sales pages. If you have sales pages linked to from your comprehensive, informative guide to your product area, then many more people will link to your informative product area, and that passes significant ranking weight to your sales pages.

Added to this, you're no longer having to bank on your position for a single search term, which may fluctuate drastically from month to month - rather, you're now competing across a broad range of search terms: for some you'll go down, for some you'll go up, but overall you'll be more stable.

Analyse your own posts

If you're complaining because you feel a less relevant site is outperforming you then make the googlebot see your content as more valuable by enhancing it. If you've got an incredibly optimised single page then develop it into a chapter, which will make the chapter homepage rank better.

If you're complaining that "my ranking dropped from 5th to 17th so I hate Google" then you're hanging your hat (and your traffic and ROI) on a single keyphrase. Develop broader content and watch your monthly traffic both stabilise and increase.

Personal experience

One of my sites has about 100 pages of fairly simple content, in a theme that includes a lot of highly optimised but narrow commercial sites. Since January I haven't modified the site in any way, and my average unique visitors per day over each month has remained stable within a range of 10% - even through Dominic and Esmeralda, when SERPs for some keyphrases dropped quite a bit. I'm not ranking outstandingly well for any huge keyphrase, but I'm ranking very well for a wide variety of two, three, four and even up to eight word phrases. I've tagged on two or three targeted pages for associated commercial services, and these have gone straight to the top five for the keyphrases I targeted, plus several others I didn't even think of. The main content added powerful ranking weight to the targeted pages when they were competing against pages that were part of just three-to-five page sites.

Remember that there's always more than one way to skin a cat. If the keyphrase "buy blue widgets" is completely stuffed with sites you can't compete with, then build the best informative site all about blue widgets, get traffic for all the associated terms, and use intelligent within-site design to turn your visitors into customers for your product.

Patrick Taylor
msg:87220
12:38 am on Jul 8, 2003 (gmt 0)

TheWhippinpost:

My theory is that we're ALL "spammers". Even if we play by the rules, we make sure we PLAY by those rules... we almost have a duty to ourselves and our audience to do so! The problem is, we don't know what those rules are; it's left to us to "accidentally" break them to find out.

I agree with this completely. No one seems to be able to demonstrate exactly the point at which an optimizer becomes a spammer. It's quite natural to keep on "optimizing" to improve one's ranking. The example of alt and title text is a good one: what is unethical about alt text that uses a whole phrase to describe what the image is actually about, or title text that gives a short description of where a link leads? Where is the line to be drawn, and how is anyone to know where it is?

I used to be an architect, and we had something called Building Regulations. The regulations were very clear, and didn't leave one guessing whether the building would be closed down at short notice by an invisible higher authority.

rfgdxm1
msg:87221
12:49 am on Jul 8, 2003 (gmt 0)

You have a point, Patrick. The rules for Google are unclear. There are some clear no-nos, but the boundaries are fuzzy.
