
Google News Archive Forum

What should be the new linking strategy after Florida?
H1 or not, alt ... what and what not ...
AjiNIMC




msg:165360
 6:58 pm on Dec 1, 2003 (gmt 0)

Hi,

After the Florida update I thought of penning down a summary of it, apart from the cries.

Here is what I hate to see, but it is true:

Linking Strategy
------------------

You should not have one single keyphrase as the anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% use the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage. Do not do reciprocal linking; you will be penalized for that.
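As a rough illustration of the mixed-anchor-text idea above, here is a minimal Python sketch that tallies how anchor text is spread across a set of backlinks; the backlink list and phrases are invented for the example, and the percentages are only the rule of thumb suggested here, not anything confirmed by Google.

    from collections import Counter

    # Hypothetical anchor texts of backlinks pointing at one site
    backlinks = [
        "blue widgets", "blue widgets", "buy blue widgets",
        "blue widgets", "widget shop", "cheap widgets",
        "blue widgets", "example.com", "widget reviews", "blue widgets",
    ]

    counts = Counter(backlinks)
    total = len(backlinks)
    for phrase, n in counts.most_common():
        print(f"{phrase!r}: {n}/{total} = {n / total:.0%}")
    # If one phrase accounts for far more than ~40% of all anchors,
    # the suggestion above is to dilute it with other phrases.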

H1 tags
---------------

You should not have more than one H1 tag, and do not follow the H1, H2, H3 sequence. Have H1 and H4, and that's all.

Alt tags
-----------------

I recommend not using them right now, the reason being that all SEO tricks should not be applied on the same page. So leave it; lower priority.

Title
-------------

Keep the title and meta tags early in the page, but I will say do not use the title and H1 together.

Keyword density
--------------------

Between 3% and 10%.

Links + Title + 5% keyword density + nothing else = #1
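For what it's worth, keyword density is simply the share of the page's words taken up by the phrase; a small sketch of the arithmetic (the sample text is made up, and the 3-10% band is only the guess stated above):

    import re

    def keyword_density(text, phrase):
        words = re.findall(r"[a-z0-9']+", text.lower())
        phrase_words = phrase.lower().split()
        n = len(phrase_words)
        hits = sum(
            words[i:i + n] == phrase_words
            for i in range(len(words) - n + 1)
        )
        # density = words occupied by the phrase / total words on the page
        return hits * n / len(words) if words else 0.0

    sample = "Blue widgets for sale. Our blue widgets are the best blue widgets around."
    print(f"{keyword_density(sample, 'blue widgets'):.1%}")  # ~46% on this tiny, clearly over-stuffed sample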

More to add, but right now I will stop here for others' experiences; let's share them and get back on the right track again.

Thanks
Aji

 

onedumbear




msg:165390
 11:52 am on Dec 2, 2003 (gmt 0)

>Google does not penalize for SEO. Google does not penalise for proper HTML coding. Google does not penalize for incoming links or for outgoing links. In general, i will say that Google does not operate with penalties at all

I just had to "second that".
Good post, claus.

Nicola




msg:165391
 11:54 am on Dec 2, 2003 (gmt 0)

>The Florida update was not a lot about your pages. It was mostly about Google's own pages - specifically that little thing known as the search box. What people type in here is now treated a little different, and it's actually pretty impressive.
Why is it pretty impressive?

>By all means, it was a minor tweak in terms of page ranking and weighting. In terms of focus and understanding of search patterns it was a major leap.
Do you mean a commercial leap? I cannot post evidence of this, although it does exist at a website the mods would not want me linking to.

merlin30




msg:165392
 1:02 pm on Dec 2, 2003 (gmt 0)

I tend to agree with claus, although I feel that there has been a major shift in the "understanding" and "meaning" of data to supply some more ranking attributes. The ranking algorithm itself may not have changed that much - it just has some new attributes to play with.

Just Guessing




msg:165393
 1:27 pm on Dec 2, 2003 (gmt 0)

>What they do is to assign weights to all kinds of things and then they rank pages according to those weights. If you page has the highest sum of weights for some term or phrase, you will simply rank highest for that. And, yes, weights can be set to zero as well as any other number - and they keep changing a little now and then.

I'm not sure if Claus is suggesting that weights can be negative as well as positive or zero?

Personally, I don't see why not.

And a "penalty" might be a good name for a negative weight :)

jady




msg:165394
 1:32 pm on Dec 2, 2003 (gmt 0)

I think Google is beginning to score OTHER pages on your site to find relevance to, say, your index page or other main page. Meaning if you have a page about BLUE WIDGETS and strictly follow the patterns that seemed to work pre-Florida, it won't rank as well as a site with the SAME page that has 10 articles (on other pages of the same site) about Blue Widgets.

CONTENT CONTENT CONTENT

Was it Brett who always says, "It's better to get 1 hit from each of 50 pages vs. 50 hits from one page"?

too much information




msg:165395
 2:19 pm on Dec 2, 2003 (gmt 0)

Here's something for you, from Google's Help section:

>Google now uses stemming technology. Thus, when appropriate, it will search not only for your search terms, but also for words that are similar to some or all of those terms. If you search for "pet lemur dietary needs", Google will also search for "pet lemur diet needs", and other related variations of your terms. Any variants of your terms that were searched for will be highlighted in the snippet of text accompanying each result.

So it is better to theme your pages rather than target them for keyword combinations. Maybe that's what GG meant when he said that someone searching for Cheeseburger would most likely be interested in results for making one rather than buying one? So there is no penalty; your list of competitors just got much bigger.
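The stemming behaviour quoted from Google's help page is easy to get a feel for with the classic Porter stemmer; here is a small sketch using NLTK (an assumption on my part; Google's own stemmer is not public and certainly differs):

    from nltk.stem import PorterStemmer  # pip install nltk

    stemmer = PorterStemmer()
    query = "pet lemur dietary needs"
    print([stemmer.stem(w) for w in query.split()])
    # e.g. ['pet', 'lemur', 'dietari', 'need'] -- plural and derived forms collapse
    # toward a common root, which is how variants of your terms can match.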

I would think that by working for links on sites with themes similar to yours, you should be OK.

jady




msg:165396
 2:27 pm on Dec 2, 2003 (gmt 0)

I don't see this 100% yet, though; searching for web design in my local area vs. web designs in my local area pulls two totally different sets of results, each catered to the singular and the plural.

Not to say it isn't true, as it probably is - just not in full force yet.. :)

too much information




msg:165397
 2:53 pm on Dec 2, 2003 (gmt 0)

I completely agree; I see the same thing for different searches, but on a specific search where I used to rank #1 or #2 and ALL of my competitors were on the first two pages, there are now only directory-style sites.

I do not completely understand why they have taken over that page yet, but I think I am getting close to an answer. What I have noticed is that I am showing up for a broader range of searches, some of which get 6 times the search traffic of the one I once targeted.

The problem is that my site is only 'related' to the search, not specifically targeted for the search. This is a good thing for name recognition though so I'm not complaining.

kaled




msg:165398
 3:03 pm on Dec 2, 2003 (gmt 0)

With regard to the opening post of this thread, my site breaks most of the suggestions and is unaffected by Florida so far as I can tell (however, the search terms are not big money ones).

In the searches I make, I have not seen poor results; however, I take it on trust that many searches are producing daft results. I'm reasonably confident that I understand why now (in general terms). It would appear that, either as a result of a bug or by deliberate design, pages may be dynamically classed as spam depending on the search terms. This is simply not a strategy that can last - it's ludicrous - one of the most stupid ideas I've ever encountered.

For example, a vague search might put a relevant site at #1. A more precise search could see that relevant site classed as spam and therefore placed nowhere. Think about it for a moment. Does that sound sensible to anyone? GoogleGuy, does that sound sensible to you? You would need to be certifiably insane to think that was a good idea. So either Google will drop the idea/fix the bug or users will eventually stop using Google.

It therefore follows that making wholesale changes to sites and backlinks at this time is premature. It is natural that people will want to look for small tweaks to get back into the SERPS so here are my suggestions.

1) Change your titles slightly (but keep them sensible for users, other search engines and post-Florida Google). Perhaps use alternate spellings, or synonyms.

2) If you have H1 text at the top of the page, treat it as you would the title. Perhaps use CSS to create the same visual effect by other means.

3) If you have keywords stuffed at the bottom of your pages (or elsewhere) clean it up.

Beyond that you may be wasting your time or even worse - doing harm. Time would probably be better spent trying to get more backlinks (with neutral anchor text) and perhaps creating more content (on topic or off-topic). After all, it has been suggested that high pagerank is helpful in the current climate. Certainly, these should help in the long term.

Kaled.

Total Paranoia




msg:165399
 3:32 pm on Dec 2, 2003 (gmt 0)

Could it be true that you can now hurt a competitor's ranking on a competitive phrase if they did survive this update?

It appears that all you need are several sites you can link to them from, with their keywords in the anchor text, and they will be dropped along with the rest of us that have keyword-rich anchor text.

I have a site that does not link out but has around 80 inbound links with anchor text, with 30 showing in Google. This site has gone from #5 and #6 to nowhere on a competitive commercial phrase.

My new linking strategy -- forget about finding webmasters to link to your own sites; get them to link to your competitors instead! ;)

crobb305




msg:165400
 4:33 pm on Dec 2, 2003 (gmt 0)

1) Change your titles slightly (but keep them sensible for users, other search engines and post-Florida Google). Perhaps use alternate spellings, or synonyms.

I have often wondered if Google compares various cache versions of a page to determine exactly how much tweaking is done to the title and other tags. Excessive tweaking may indicate the type of SEO that Google is trying to eliminate. Whereas, changing body text might be more indicative of changing/adding content. I have noticed that many of the sites that have suffered recently have done continual title tweaking. Some of the sites I still see at the top have had relatively consistent titles for quite some time (possibly indicative of a stable theme/concept). Just some food for thought. I doubt that now is a good time to be doing any tweaking (which may send a red flag to Google that SEO is being done).
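Purely to illustrate the kind of check being speculated about here (the crawl snapshots below are hypothetical, and there is no evidence Google actually does this), measuring how often a title changes across stored crawls is trivial:

    # Hypothetical title text captured on successive crawls of one URL
    crawl_titles = [
        "Blue Widgets - Widget Co",
        "Cheap Blue Widgets | Widget Co",
        "Blue Widgets, Cheap Widgets, Buy Widgets - Widget Co",
        "Blue Widgets - Widget Co",
    ]

    changes = sum(1 for prev, curr in zip(crawl_titles, crawl_titles[1:]) if prev != curr)
    tweak_rate = changes / (len(crawl_titles) - 1)
    print(f"title changed on {changes} of {len(crawl_titles) - 1} recrawls ({tweak_rate:.0%})")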

C

Total Paranoia




msg:165401
 4:50 pm on Dec 2, 2003 (gmt 0)

I have noticed that many of the sites that have suffered recently have done continual title tweaking.

Nah - I tweak titles daily on some sites and Google seems to love it.

crobb305




msg:165402
 4:58 pm on Dec 2, 2003 (gmt 0)

Total_Paranoia, I am just saying that Google seems to be making a statement about obvious SEO. They may consider excessive tweaking to be bad SEO (obvious/aggressive attempts to remain at top of serps). Again, I am talking about excessive tweaking, and I may be way off. But in the end, you still have to do what you think is best to reach your users.

too much information




msg:165403
 5:29 pm on Dec 2, 2003 (gmt 0)

Google doesn't care about penalties; they have far too many sites to go through to worry about it. What they are trying to do is return good SERPs as fast as possible. I believe this update is only half over (remember the Google Patent [webmasterworld.com] discussion), and if your SERPs are filled with directory sites and link pages, then your best move is to get your site listed on every single one of them.

Mix up your anchor text, and exchange like a fool. When the next wave hits it will be nice to have links from all of those sites that fit your keyword category.

kaled




msg:165404
 5:34 pm on Dec 2, 2003 (gmt 0)

Tweaking does not equal SEO. I have tweaked my titles recently, partly for SEO and partly because I thought they were too verbose - yes I shortened my titles. The idea that any tweaking of a page could push it over the threshold into spam is daft. However, you are entitled to believe Google is daft - the evidence is there.

Kaled.

PS
This would require considerable CPU time that could be better used.

crobb305




msg:165405
 5:38 pm on Dec 2, 2003 (gmt 0)

Tweaking does not equal SEO

I said EXCESSIVE tweaking. I see SEOs' sites changing page titles after every new fresh tag, trying to get higher/better rankings. Think about it... if Google is theming as some have speculated [webmasterworld.com], and a theme, by definition, is consistent, then changing titles daily MAY reveal inconsistency. It may be just one of many red flags that Google could use to cause other (consistent/non-tweaked) sites to rank higher.

I know that some tweaking would be expected depending on the forces that drive a particular industry, and ever-changing marketing strategy. It just seems like Google could decide that too much tweaking (any of the tags) is bad. Who knows LOL. It was just a thought. Everything here is so speculative, but certainly interesting to talk about. ;)

seofreak




msg:165406
 6:10 pm on Dec 2, 2003 (gmt 0)

sparticus and marcia, as a web designer I am experiencing the same.

The funny thing about my "web design countryname" search is:

1) 4 out of 10 sites at the top have descriptions drawn from ALT tags, which are stuffed heavily. It's just funny how the ALT tag is being considered for the top positions. The first two SERPs have the exact same descriptions, word for word, so possibly one of them is spam of the other, but it's almost as if Google found the right combination of words and listed them at the top. The sites I am talking about are PR5 and 6; mine is 7 and I am listed on the 4th page.

2) Secondly, one site has 5 listings on the 1st page itself: the sub-domain, the parent domain, an internal page, and 2 other spam/doorway pages. It's disheartening to see other people get away with that.

SlyGuy




msg:165407
 6:19 pm on Dec 2, 2003 (gmt 0)

SEOFreak said:
sparticus and marcia, as a web designer i am experiencing the same.

I started a thread over here [webmasterworld.com] concerning this topic.

- Chad

AjiNIMC




msg:165408
 6:19 pm on Dec 2, 2003 (gmt 0)

Hi,

I am not giving certain advice in this uncertain realm; I was just putting down what I saw and what I have observed.

I know five sites for a keyphrase out of a million is far too few to draw a good conclusion, but I am sure of one thing: this exam has changed. As some of the geeks have mentioned about the weights:

1) 1% to a% ------- fail (you will not get good ranks)

2) a% to b% ------- optimum (you are there, close to #1 if not #1)

3) b% to 100% ------- again you fail (over-SEO tactics)

I am very, very sure about this; I have gone through enough sites to conclude it, but of course a% and b% are yet to be determined. All of the sites that are in the top ranks do not use all the SEO tricks. I will say they use only a few; some of them do not even have meta tags like keywords and description. Some of them do not have keyphrases in their title.

These are just a few attempts to understand the theories, but I know no one can be sure about this; all we can do is observe and guess.
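A crude way to picture the banded idea above (the thresholds a and b are unknown, so the values below are placeholders, and the whole model is speculation):

    def band(optimization_score, a=0.02, b=0.10):
        """Toy model: too little optimization fails, a middle band is optimum,
        too much trips an over-optimization filter. a and b are guesses."""
        if optimization_score < a:
            return "fail: under-optimized"
        if optimization_score <= b:
            return "optimum"
        return "fail: over-optimized"

    for s in (0.01, 0.05, 0.25):
        print(s, "->", band(s))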

Aji

mquarles




msg:165409
 6:24 pm on Dec 2, 2003 (gmt 0)

What if Inverse Document Frequency [www9.org] were being used to determine a and b?

This might also help to explain why big brands with their anchor text survived.
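For readers unfamiliar with it, inverse document frequency is a standard term-weighting measure: the rarer a term is across the collection, the more weight it carries. A textbook-style sketch (the toy documents are made up, and this is the classic formula, not necessarily Google's):

    import math

    def idf(term, documents):
        # idf(t) = log(N / df(t)), where df(t) = number of documents containing t
        df = sum(1 for doc in documents if term in doc.lower().split())
        return math.log(len(documents) / df) if df else 0.0

    docs = [
        "cheap blue widgets for sale",
        "blue widget repair guide",
        "history of the widget industry",
        "buy cheap widgets online",
    ]
    print("cheap:", round(idf("cheap", docs), 3))      # in 2 of 4 docs -> lower weight
    print("history:", round(idf("history", docs), 3))  # in 1 of 4 docs -> higher weight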

MQ

Chndru




msg:165410
 6:32 pm on Dec 2, 2003 (gmt 0)

Ahh..Google never attacks a problem with a hammer.

I seriously don't know whose intelligence we are undermining, Google's or ours, with our wild conspiracies/theories oozing from observations of 5 sites.

Dave_Hawley said it all, in msg #14 of this thread.

AjiNIMC




msg:165411
 6:34 pm on Dec 2, 2003 (gmt 0)

I am a maths guy but do not know how to solve this equation; all I can do is predict, as most of us are doing here.

Aji

plasma




msg:165412
 6:54 pm on Dec 2, 2003 (gmt 0)

Linking Strategy
------------------
You should not have one single keyphrase as the anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% use the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage. Do not do reciprocal linking; you will be penalized for that.

reciprocal linking: can't agree on that one
several keyphrases: agreed

But there's another thing I'd like to add.
The negative effects of Florida hit much harder with English/international keywords.
Our German keyphrases aren't affected at all. Maybe due to a dictionary, but most likely due to fewer occurrences.

claus




msg:165413
 7:58 pm on Dec 2, 2003 (gmt 0)

>> Why is it pretty impressive?

Have you tried a serial number search recently? Just entering some product code? Searching for "review soundwidget 5.1"? A technical/specific one like "font-face" or/vs. "font-family"? Or a more general/topical one like "apache url rewrite"? Or an ambiguous one like, say, "turkey christmas" (vs. "christmas turkey")?

It seems to me that even without entering special operators, the search box can often distinguish whether the terms should be interpreted very specifically or very broadly/topically (like suggesting alternatives). As for the broad results, think in terms of, say, 10 "pre-Florida related searches" (being performed, filtered and ordered) for every one word in a "post-Florida" query, with no apparent loss in query time.

>> Do you mean a commercial leap?

No, I do mean a leap in better understanding of search behavior. I don't know jack about the economy of Google, but I think it's better than mine. Let's not go there, please.

>> weights can be negative as well as positive

It doesn't really matter - you can accomplish the same with entirely positive numbers, even without zero; all you have to do is rank something higher than something else (or you could even make the whole scale negative). It could be binary as well as decimal, octal, percentages, fractions, rules and equations, whatever... I really don't know; I suppose that's the "rocket science" part of Google's inner workings: optimizing for maximum efficiency.

Perhaps my definition of "penalty" is unclear... To me, a penalty is not when you rank badly. It is not even when you rank well, then you do something (or you do nothing, but the algo is changed), and then suddenly you rank badly. A penalty would be if you ranked worse than you were supposed to, as dictated by the current rulesets/scores/whatever.

As in, say: two guys are bad spammers - okay, make that a boy and a girl, and make them wear nice white hats instead. Both do exactly the same things in terms of the factors that are considered for ranking, but one ranks significantly differently from the other.

However, I agree that "broad matching" can have some similarities to a "penalty", as in the extreme case two such sites could rank very differently - not that this would be caused by penalties, but due to "filtering" to obtain broad matches (for lack of better words). I don't think Google has shown us anything but a quick glimpse of this technology yet.

So, what to do about linking now?

AjiNIMC, if I did so, I'm sorry I caused this thread to go off topic. Here are my two cents at the moment:

  • It does not hurt you to link out. This has been considered a bad thing to do by some in the past, but it doesn't really seem to be the thing to avoid right now.
  • As for incoming links, it is probably a waste of time writing to others to make sure they all use the same anchor text.
  • It does not hurt you to have a clear structure and hierarchy across your site, and also on the individual pages, using titles as well as headline tags for pages and sections. I'll recommend it anytime.
  • Validated HTML might not benefit you directly, but it will reveal any coding errors you might have made, and it's a good tool for getting that document structure right, so it might be a wise thing to do anyway. As ALT text on images is part of valid HTML, it might even help you if important parts of your site are made using graphics.
  • Added: If you have more than one domain pointing to your site, or other "duplicate" issues, now might be a very good time to clean them up.

Still, the next update might even be different once again.

So, how to beat the broad match?

That's a tough one. Basically, the same things that made you rank high before Florida will also make you rank high now... although in some cases it will be for another query. And, as plasma points out, the broad match database (if such a thing exists) has not been translated yet, it seems. It's mostly an English thing (as in language, not geography).

So, there's a choice: go for good ranking in the broad match, or keep your good ranking for an exact search. Or perhaps you can even do both. For some competitive searches I've done post-Florida I have seen clearly commercial sites in the top SERPs for standard "broad" searches, but still, one industry's "broad" is another's "specific".

Still, the next update might even be different once again....oh, sorry about the repetition.

Btw, I think that the knob has been turned a little towards more exact match lately - the new algo seems even less "different" now than a few days ago, but I might be wrong here; I've only done 10 searches or so today.

/claus

mil2k




msg:165414
 8:51 pm on Dec 2, 2003 (gmt 0)

I think trying to understand the concept of Eigenvalue and eigenvector as well as webgraph can be valuable.

I think you are talking of Second eigenvalue of Google matrix [webmasterworld.com]. That would help you understand artificial linkage patterns better but IMHO that should have been applicable before Florida.
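For background: PageRank is the principal eigenvector of the "Google matrix" built from the web's link graph, and the second eigenvalue governs how quickly power iteration converges (and relates to tightly interlinked clusters). A textbook power-iteration sketch on a tiny made-up graph, nothing more:

    import numpy as np

    # Tiny invented link graph: links[i] = pages that page i links to
    links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
    n = len(links)
    d = 0.85  # damping factor from the original PageRank paper

    # Column-stochastic link matrix M, then the Google matrix G
    M = np.zeros((n, n))
    for src, outs in links.items():
        for dst in outs:
            M[dst, src] = 1.0 / len(outs)
    G = d * M + (1 - d) / n * np.ones((n, n))

    # Power iteration converges to the principal eigenvector of G
    r = np.full(n, 1.0 / n)
    for _ in range(50):
        r = G @ r
    print(np.round(r / r.sum(), 3))  # PageRank scores for pages 0..3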

The triggering of the current filter can best be seen in commercial searches. Not all commercial searches are affected, but many are.

You should not have one single keyphrase as the anchor text for all backlinks; try to have around 4 to 5 keyphrases (40% use the main keyphrase, the next 30% some other, ...). Also try to link to different pages instead of just your homepage.

I do not have much problem with that :)

Do not do reciprocal linking; you will be penalized for that.

I have only seen one paper by Krishna Bharat, which was about filtering based on reciprocal linking. But there are many sites using reciprocal linking that are perfectly all right. So I would like to disagree with this point.

I would not like to comment on the rest of the things because, like many here, I believe that many SERPs are in a mess and will not stay this way if Google is concerned about its QC.

allanp73




msg:165415
 10:46 pm on Dec 2, 2003 (gmt 0)

I noticed the majority of the discussion here and in the other update threads deals with what things on the page could be causing Google to filter them out. After analyzing the SERPs, I noticed that generally the sites that are well ranked for commercial terms have something the filtered-out sites didn't.
That something is links. It seems that Google is filtering sites out based on whether or not they lack outward links, rather than on page factors. Thus commercial SERPs are dominated by directories and sites with cross linking. I think Google might even take into account whether these outgoing links are located on the same IP, so you couldn't make a directory by just linking to your own sites; you have to link to others. In order to rank well in Google you have to make your site look like a directory. I noticed it gives a bonus to sites which link to others with anchor text that is the search term. Adding a bunch of links at the bottom of the page won't help. There also seems to be a preference for links which appear within a paragraph of text.
So my point is: stop de-optimizing, it won't work. Look at how to link to others within your money-word theme. If Google wants to be the search engine for directories, then become the best directory.
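To make the "directory-like" observation concrete, here is a minimal sketch that counts a page's off-site links; the detection method (and the sample HTML) is my own guess at what such a signal might look like, not anything Google has confirmed:

    import re
    from urllib.parse import urlparse

    def external_outlinks(html, own_domain):
        # Collect href targets and keep the ones pointing off-site
        hrefs = re.findall(r'href=["\'](http[^"\']+)["\']', html, flags=re.I)
        return [h for h in hrefs if urlparse(h).netloc.lower() != own_domain]

    page = ('<a href="http://www.example-widgets.com/">blue widgets</a> '
            '<a href="http://www.mysite.com/about.html">about us</a>')
    print(external_outlinks(page, "www.mysite.com"))
    # -> ['http://www.example-widgets.com/'] : one outward link to another domain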

I made this observation because I noticed many of the top directories now listed for my phrases are just my link partners. Also, one of my client's competitors remained listed for a money term, much to the confusion of the client. At first I couldn't figure out why they were ranked #1, but then I noticed that their links actually went to other sites, giving them the appearance of being a directory.

bekyed




msg:165416
 12:56 am on Dec 3, 2003 (gmt 0)

Nah - I tweak titles daily on some sites and Google seems to love it.

Total Paranoia,

Perhaps you should get out more lol!

;)

AjiNIMC




msg:165417
 1:22 am on Dec 3, 2003 (gmt 0)

Thanks claus, nice post; it made some of these things clear to me. It's been a shift here and there over a few days, but now it seems I have a ray of light and can follow it.

I think Google might even take into account whether these outgoing links are located on the same ip.

Linking sites under the same IP is a very risky move; sooner or later this will harm you.

It seems that Google is filtering sites out based on whether or not they lacked outward links rather than page factors.

Trust me allanp73, out of 15 sites only three had outward links, and the rest had:

  • only five to ten outward links, and some had none;
  • I am sure that none of them had all the things together: title, h1, alt, ...;
  • they had at least 2 to 3 different anchor texts. Only one site was an exception, which had the same anchor text almost 90% of the time. But they have more than 7k incoming links, and their anchor text is kw1-kw2-"another word", which is their company name as well.

I am on my way to do more on this topic, but surely nothing can be 100% accurate.

Aji

Dave_Hawley




    msg:165418
     1:49 am on Dec 3, 2003 (gmt 0)

I'm not going to win any friends here, but if I want friends I'll join a social club :o)

SEO is NOT rocket science, and it is *in my opinion* something that has grown from SEO companies playing on site owners' fears. You have all seen the ads: "We will get you in the top ten on Google for your search term..." What a load of rubbish - there are only 10 positions in the top ten! Most SEO business is founded on fear, lies and ignorance.

It appears to me that many here are getting Shakespeare from Jack and Jill. Forget all about SEO and focus almost entirely on writing good, easy-to-read, relevant text. This alone WILL have Google sit up and take notice of the page.

Rather than post here about how bad the SERPs are, use that energy and time to write content for your site(s).

Despite what is written elsewhere, DO use the title and description tags to give a brief, concise, highly relevant description that would entice a HUMAN, not a robot.

Make a site map that uses HTML only.

Do have an H1 heading that, again, gives a brief, concise, highly relevant description that would entice a HUMAN, not a robot.

Do have sub-headings.

Do link to relevant sites if you think your visitors would be interested in them.

Do have sites link to you. Do NOT worry about PR.

Google's ultimate aim is to return highly relevant results for humans. If you aim to do the same, you and Google are singing from the same song book.

Build your site for humans and they + the robots will find you; build your site for robots and only the robots will find you.

So the word of the day is humans :o)

Dave

dazzlindonna




    msg:165419
     2:40 am on Dec 3, 2003 (gmt 0)

dave said (with my remarks in caps and brackets):

Rather than post here about how bad the SERPs are, use that energy and time to write content for your site(s). [DID THAT]

Despite what is written elsewhere, DO use the title and description tags to give a brief, concise, highly relevant description that would entice a HUMAN, not a robot. [DID THAT]

Make a site map that uses HTML only. [DID THAT]

Do have an H1 heading that, again, gives a brief, concise, highly relevant description that would entice a HUMAN, not a robot. [DID THAT]

Do have sub-headings. [DID THAT]

Do link to relevant sites if you think your visitors would be interested in them. [DID THAT]

Do have sites link to you. Do NOT worry about PR. [DID THAT]

[RESULT: Went from #1 to nowhere]

AjiNIMC




    msg:165420
     2:56 am on Dec 3, 2003 (gmt 0)

Dave,

I do think the same, but when you have cut-throat competition, you can't afford to be honest enough to take the back seat.

Whatever you said is right, and I do not think SEO means a site full of keywords which is #1 in the SERPs and gives the users a tummy ache. It is a balance between both. I will put it this way:

I have a good site with good content, very attractive, good for the user, with a top-to-bottom and left-to-right strategy followed for the user. All well done.

Why would I do a link exchange or ask someone to link back, even though links and anchor text are the king and queen of this kingdom? I have done everything right.

Why am I not #1? What is it that I am lacking, what are the tricks behind it, how do Google and the other SEs work?

This analysis is SEO. Everyone knows what their users want (or is trying their best to understand them); otherwise having a #1 position is useless.

Good site first and SEO second, not the reverse. I am sure all of us here are following this sequence.

Aji
