Top 100 SEO tips
All time top 100 Search Engine Optimization tips
Habtom
10:52 am on Aug 23, 2007 (gmt 0)

I would love to see what the community here believes are the top 100 SEO tips.

To quote Google, "no evil": no "white text on white bg" tricks, please.

> I suggest tips only at first; discussion can follow later (right after we reach 0 :) )

100. Use the H1, H2 and H3 tags for what they are meant to be: the H1 tag once per page, H2 for a few of the sub-headers, and H3 for less important headings.
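A minimal sketch of that structure (the headings and topic are invented for illustration):

<h1>Widget Buying Guide</h1>
<p>Introductory paragraph about the page's one topic...</p>
<h2>Types of Widgets</h2>
<h3>Blue Widgets</h3>
<h3>Red Widgets</h3>
<h2>Where to Buy</h2>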

This is not the best tip, but hey

99 - anyone?

 

zuko105
3:45 pm on Aug 27, 2007 (gmt 0)

Steve:
61. Run Xenu or a similar link checker over your page's internal navigation and outbound links.

I love Xenu; I thought I was one of the only people using it.

Got any sweet Xenu mods?

mona
4:29 pm on Aug 27, 2007 (gmt 0)

58. Keyword research
57. Keyword research
56. Keyword research

OK, you get the point :) Use automated tools, Google AdWords, YSM, your web logs, your competition, whatever works for you, but do it. Rinse and repeat a couple of times a year.

zuko105
9:18 pm on Aug 27, 2007 (gmt 0)

55. Become a webmasterworld.com supporter. You'll get access to more specific, fine-tuned information there than any book you could ever read or any list of tips could cover.

zuko

Habtom
9:14 am on Sep 3, 2007 (gmt 0)

Any fresh ideas?

davec
5:21 pm on Sep 4, 2007 (gmt 0)

54. Link out to quality external sites where appropriate.

martinibuster
6:06 pm on Sep 4, 2007 (gmt 0)

99 - Use valid, lean code for your site. This has helped my sites tremendously.

92 - try and make your code as lean as possible,

63. Although this was the number 99 tip, this needs to be reiterated and maybe expanded upon.
valid html code.

Those three points are essentially the same tip three times, and it's probably one that can be thrown out. Valid and lean code doesn't matter. It's enough for the code not to have exceptionally bad mistakes in it. It's possible for a bot to get confused by exceptionally bad mistakes, but that's the exception. Bots are smarter today than they used to be.

As far as having lean code with content close to the top goes, that's nice, but I don't think it has an SEO effect anymore. A look at the SERPs bears this out. If anything, the order of your content is going to matter more than JavaScript clutter. For instance, moving the navigation to the end of the code is probably more beneficial than moving JavaScript into an external file, because JavaScript, table tags, etc. are going to be ignored anyway; the bots go straight to the content.

Just take a look at the SERPs if you don't believe me. TripAdvisor hasn't been hamstrung by the miles of code in its pages, and neither will your site be. Do searches for "cheap plane tickets to (country)" and you'll see them in the top five for practically everything. Take a look at the code on the top five sites and you're going to find spaghetti.

Then do less competitive searches, like "(city) kung fu school". Yelp, with its bird's nest of code, is all over the place. Content, often content with links, wins out time after time; code has little or nothing to do with it.


buckworks
6:14 pm on Sep 4, 2007 (gmt 0)

probably one that can be thrown out

I disagree with you on that, MB. It's certainly true that valid code won't be a make-or-break issue if you do enough other things right, but that's true for anything on this list.

Never turn down a chance to do something a bit better than your competitors. It all helps.

martinibuster
6:22 pm on Sep 4, 2007 (gmt 0)

Funny, I was just discussing this with someone who ranks on thousands of phrases, and he responded, "Well, it wouldn't hurt to have valid code."

I responded, "Dude, you don't even have a freaking doctype. You're the poster boy for ranking well without valid code." :)

phantombookman
6:26 pm on Sep 4, 2007 (gmt 0)

>>Those three points are essentially the same tip three times, and it's probably one that can be thrown out.

I agree with MB.
Some of my first sites and pages were hand-coded in Notepad as I sat with an HTML book on my lap; I was absolutely clueless! They might be embarrassing to me now, but they sit at #1 or in the top 5 in Google!

My latest stuff, years later, is spot on, or would be if 50% of the pages didn't go supplemental!
So long as your code works in Internet Explorer, that's enough SEO-wise.

Just my opinion, based on experience.
PBM

PS: That opinion is also offered on video somewhere by some fella by the name of "Matt Cutts" - whoever he may be!

CWebguy
6:38 pm on Sep 4, 2007 (gmt 0)

I would love to see what the community here believes are the top 100 SEO tips.
To quote Google, "no evil": no "white text on white bg" tricks, please.

I think you pretty much summed it up in your first two sentences.

Use valid, lean code for your site. This has helped my sites tremendously.

Does this really make a difference? I could see maybe a little bit of a change, but it almost seems a little biased. I just can't see how a search engine would care about your code; it seems like they should be after content, but who knows.

zuko105
1:42 pm on Sep 6, 2007 (gmt 0)


Use valid, lean code for your site. This has helped my sites tremendously.

Does this really make a difference? I could see maybe a little bit of a change, but it almost seems a little biased. I just can't see how a search engine would care about your code; it seems like they should be after content, but who knows.

Yes, Yes, YES!

Spiders aren't browsers. They don't interpret JavaScript. They tolerate only a certain degree of sloppy HTML, where browsers allow a HUGE degree of it. Exactly how much sloppy HTML a spider will tolerate while still being able to crawl your content, we don't know.

Therefore, the only sure thing we can do as webmasters is follow W3C standards, to eliminate any possibility that a spider will fail to index your documents properly. At least that's my opinion.
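To make that concrete, here is an illustrative sketch (the URL and link text are invented): a link that only exists after JavaScript runs, next to the same link as plain HTML.

<!-- A spider that doesn't execute JavaScript never sees this link: -->
<script type="text/javascript">
document.write('<a href="/widgets.html">Widgets</a>');
</script>

<!-- The plain-HTML version is always there to crawl: -->
<a href="/widgets.html">Widgets</a>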

I'm going to get off my web standards soap box now.

CWebguy
2:01 pm on Sep 6, 2007 (gmt 0)

Well, I could definitely see a difference if your page is not being crawled at all :) . But do you really know if this will improve SERPs? Does Google say "Hmm, his code looks nicer, I'm going to bump him up in the rankings?"

P.S. I definitely try to go light on the JavaScript, for the reason you mentioned.

callivert
11:07 am on Sep 7, 2007 (gmt 0)

Does Google say "Hmm, his code looks nicer, I'm going to bump him up in the rankings?"

No, Google says "this page is a load of crap, so I don't think I'll index it."

zuko105
12:57 pm on Sep 7, 2007 (gmt 0)

Well put callivert.

The point is that Google and the other SEs won't index a page that is a load of crap, because they simply can't parse it properly.

To what degree of crap any search engine allows is proprietary information, and I don't think anyone outside those organizations can accurately predict what the "allowable crap" threshold is. Therefore, the only way to make sure everything gets indexed properly is to validate.

martinibuster
2:27 pm on Sep 7, 2007 (gmt 0)

Therefore, the only sure thing we can do as webmasters is follow W3C standards, to eliminate any possibility that a spider will fail to index your documents properly. At least that's my opinion.

Sorry, I apologize; perhaps I should have been more explicit that I was stating a fact, not an opinion. As I said in my previous post, unless your code is exceptionally bad, the bots are smart enough these days to understand it.

Bots do not need valid code. They are intelligent enough these days that they can read invalid and even badly coded HTML and still index it properly.

I personally tend to code neat and tidy HTML that is valid to W3C standards. But that's my personal habit. It's probably good for future-proofing your site for browsers. However, because search engines do not need valid or clean code, I don't feel valid code qualifies for inclusion on a list of top SEO tips.
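For reference, a minimal page that validates against the HTML 4.01 Strict DTD looks something like this (the title and content are placeholders):

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<title>Page Title</title>
</head>
<body>
<h1>Page Title</h1>
<p>Content goes here.</p>
</body>
</html>

You can run a page like this through the W3C validator at validator.w3.org to confirm.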

buckworks
3:41 pm on Sep 7, 2007 (gmt 0)

It's obvious that spiders are capable of wading through many kinds of invalid code. But it doesn't follow that the issue of valid vs. invalid code has no effect on how the search engines perceive a page. Think "signals of quality".

It's not safe to conclude that [someone's favorite SEO tip] doesn't matter and won't help just because it's easy to find sites ranking high even though they haven't done that particular thing very well.

martinibuster
4:09 pm on Sep 7, 2007 (gmt 0)

Think "signals of quality".

Well, I have to respectfully disagree, and add that Google does not validate website code. Good clean code that validates is not necessary for Google to index your content, and it won't send a signal of quality, because Google is not validating website code.

If you think about it enough, it begins to make sense why Google (and probably Yahoo) wouldn't throw resources into validating every site they encounter and using that as a quality signal. Valid code says nothing about the quality of the content itself.

Running your code through a validator is a great practice and I encourage people to do that just like it's a good practice to keep your house clean. But because Google does not validate code nor need clean code to index the content, I don't believe having error-free valid code belongs in a list of top SEO tips.

I didn't mean to hijack this thread, so I started a new one on the subject over here.
[webmasterworld.com...]

buckworks
7:08 pm on Sep 7, 2007 (gmt 0)

Valid code says nothing about the quality of the content itself

True. But it might say something about the accessibility of the page ... which is a major quality issue for some folks.

don't believe having error-free valid code belongs in a list of top SEO tips.

The thread is about the top 100 tips, not the top ten. MB, you must know a lot of tips you're not sharing if you think a hundred other things are more important than validation! :)

martinibuster
7:29 pm on Sep 7, 2007 (gmt 0)

53. Plan the site so it is scalable

52. Plan the site so it can easily change web technology

51. Plan the site architecture using a rational naming hierarchy so that the folders make sense and are meaningful when seen in the SERPs.
(example.com/cheap-widgets/)

Those get highlighted in the SERPs and help draw attention to your listing. Inbound links can form the keywords in the anchors, etc.

50. Create content that a .edu, .gov, or .us site will find useful to link to. It's kind of like social engineering: you look at the behavior, then tailor your response/approach to appeal to and fit in with that behavior. In this case you look at what those pages link to and create pages that match that profile.

49. Create a site map (see the sketch after this list).

48. Link to your less tasty pages from the site map.

47. Eliminate or make less prominent all links to fluff pages (like member profiles).
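A hand-rolled HTML site map can be as simple as a list of links, one per section; the URLs here are invented for illustration:

<h1>Site Map</h1>
<ul>
<li><a href="/cheap-widgets/">Cheap Widgets</a></li>
<li><a href="/widget-reviews/">Widget Reviews</a></li>
<li><a href="/widget-history/">Widget History</a> (one of the less tasty pages, per tip 48)</li>
</ul>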

tedster
8:05 pm on Sep 7, 2007 (gmt 0)

46. Use a sound Information Architecture for your entire website. This includes, but is not limited to, choosing menu labels that make keyword sense (without stuffing) and are also no-brainer intuitive for your visitor.

45. And my second tip is similar, but more localized: use a solid semantic structure for your page. That makes the algorithm's job a whole lot easier.
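A rough sketch of what that might look like in 2007-era markup (the names and labels are invented):

<div id="content">
<h1>Widget Repair Guide</h1>
<p>Opening summary of the page's one topic...</p>
<h2>Common Faults</h2>
<ul>
<li>Worn gaskets</li>
<li>Stripped threads</li>
</ul>
</div>
<div id="nav">
<ul>
<li><a href="/widgets/">Widgets</a></li>
<li><a href="/repairs/">Repairs</a></li>
</ul>
</div>

Putting the content div before the nav div also echoes martinibuster's earlier point about content order.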

buckworks
6:26 am on Sep 8, 2007 (gmt 0)

44. Check regularly for dead links or old links that are redirecting to something else. Update or delete as appropriate. You won't get SEO brownie points for relevant outbound links unless they're working!

vincevincevince
6:50 am on Sep 8, 2007 (gmt 0)

43. Build everything you want indexed so that it works perfectly with JavaScript, Flash, Java, ActiveX, and CSS disabled.
42. Stop bots from indexing pages which you don't want in the index, especially dynamic sections which have no relevance to the SERPs.
41. Adjust your game plan based upon your market sector; every sector requires a different strategy, so don't think what worked for selling widgets will work for philosophy.
40. Cut out the 10,000,000 pages of recycled trash (that is, if you still want to rank next month).
39. Make tables make sense. 'Name', 'Born', 'Died' column headings won't help the SERPs. Rewrite it, row by row, as: "Queen Elizabeth was born in 1533 and died in 1603." Fancy CSS will tabulate that if you let it! (A sketch follows below.)
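An illustrative before-and-after for tip 39 (the markup is invented; the dates come from the tip itself):

<!-- Before: bare cells whose meaning lives only in the column headings -->
<table>
<tr><th>Name</th><th>Born</th><th>Died</th></tr>
<tr><td>Queen Elizabeth</td><td>1533</td><td>1603</td></tr>
</table>

<!-- After: each row rewritten as a sentence; CSS can still lay these out in columns -->
<p class="row">Queen Elizabeth was born in 1533 and died in 1603.</p>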

vivalasvegas
8:44 am on Sep 8, 2007 (gmt 0)

38. A few quality links can weigh much more than a lot of unrelated, low-quality ones.

greendrift
3:30 pm on Sep 10, 2007 (gmt 0)

37. Content, content, consistent content.
Check that each page is focused on one topic and that the page title and description tags are consistent with it. Check also that the main keyword chosen for that topic is included in the title, the description, and the first paragraph of the page (see the sketch after this post).
Ensure also that secondary keywords appear towards the top of the page and are consistent with the primary topic/keyword.

36. Content, content, more specialised content.
More pages of content mean more potential incoming links for your site = more traffic. This will also lead you towards breaking larger pages into two or more smaller ones.
Each page will then have a narrower focus and should rank higher for less common keywords, compared with a page covering a broader topic with a larger spread and number of keywords.
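A sketch of tip 37 in practice, carrying one main keyword through the title, description, and first paragraph (the keyword and copy are invented):

<head>
<title>Widget Polishing Guide</title>
<meta name="description" content="A step-by-step widget polishing guide for beginners.">
</head>
<body>
<h1>Widget Polishing Guide</h1>
<p>This widget polishing guide walks you through the whole process...</p>
</body>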
