
Google SEO News and Discussion Forum

This 41 message thread spans 2 pages; this is page 2.
Tricks Of The Trade to Improve Ranking
Tips on how best to make use of tools at your disposal.
AlgorithmGuy




msg:3103451
 3:40 pm on Sep 30, 2006 (gmt 0)

I will kick start this thread and hopefully many others will join in to disclose tricks, tips and other things that a webmaster should know.

The WWW in a website's URL
This is a sub-domain. It is useless and totally unnecessary for a run-of-the-mill website. Most registrars themselves have little or no idea why they sell two empty websites to a purchaser, and most hosts have not got a clue about canonical and duplicate content issues. That is what you get: TWO WEBSITES. Or one domain and a free sub-domain that is a time bomb waiting to explode, because it will cause duplicate content.

It is crazy and ridiculous. Why are they selling a sub-domain?
http://example.com/ is the domain you purchased, and the registrar throws in a sub-domain as a bonus, a totally useless and very dangerous and misleading one at that.

If you do not KILL the subdomain at source via the ANAME RECORDS your site stands to get canonical and duplicate content issues.
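[For context: the fix most webmasters actually use is not a DNS change but a server-side 301 redirect, so that only one hostname ever serves content. A minimal .htaccess sketch, assuming Apache with mod_rewrite and that the site should live at www.example.com:]

```apache
# .htaccess - send requests for the bare domain to the www host
# (a sketch; assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

[The same rule with the condition and target swapped enforces the no-www form instead; either way, pick one hostname and redirect the other.]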

[edited by: tedster at 8:12 pm (utc) on Sep. 30, 2006]
[edit reason] use example.com [/edit]

 

lmo4103




msg:3103635
 6:34 pm on Sep 30, 2006 (gmt 0)

An orphan page is a page that doesn't have any links pointing to it

Laugh all you like.

But I changed the url for a page and changed the link to point to the new url.
BANG! Supplemental!
Both the original url and the new url became supplemental.

I'm not laughing.

The lesson I learned is from the W3C: Cool URIs don't change.

g1smd




msg:3103648
 6:48 pm on Sep 30, 2006 (gmt 0)

I prefer to work to this logic. You buy the rights to use:

domain.com

and then you set up services on it that you require:

www.domain.com
smtp.domain.com
pop.domain.com
ftp.domain.com
irc.domain.com

etc

The fact that web server software usually assumes that you'll have a website directly at domain.com is a quirk that you can easily correct. You could just as easily have the mail server there or something else, or nothing.
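[In DNS terms, that per-service setup is just a handful of records. A hypothetical zone-file fragment (192.0.2.x is the reserved documentation address range; all hosts and addresses here are made up):]

```
domain.com.       IN  A      192.0.2.10  ; web server at the bare domain
www.domain.com.   IN  CNAME  domain.com. ; www is just an alias here
smtp.domain.com.  IN  A      192.0.2.20  ; mail on a separate machine
ftp.domain.com.   IN  CNAME  domain.com.
```

[Whether www.domain.com points at the same machine as domain.com, at a different one, or nowhere at all is purely a matter of these records.]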

tedster




msg:3103657
 6:53 pm on Sep 30, 2006 (gmt 0)

OK - is there hope for this thread? Any other tips to improve ranking -- besides making sure we don't have a "no-www" and "with-www" problem? (And that issue is already discussed in another long thread [webmasterworld.com]).

My own "tip", especially for a relatively new site, would be not to target those big one-word "trophy" rankings. First, we may well go beyond Google's limits in our efforts -- then we trip a filter or even get a spam penalty. And second, a lot of that traffic may not help our business and will only cost us bandwidth.

So instead, we should structure our site, our code and our server for the greatest possible overall clarity, not just around one keyword. In that way, we will be ranked for a wide variety of searches, some we never even dreamed of - not even prowling through Wordtracker results. Some of our most valuable and lucrative rankings can end up coming from unpredicted directions.

But of course, in order to know this (and here comes tip #2) we must watch our server logs at least as much as we watch our Google rankings and [site:] results. Probably a lot more. Worrying too much about [site:] results and trying to make them look clean can be a big mistake. We might even end up making ill-advised changes that only harm things on Google.

So I say we should stay focused on our actual visits, our real Google traffic, how those visitors find us and how they then react to our site.

And also notice googlebot, what crawling patterns we see, how our server responds, and so on. It all starts with the spider, before our content ever gets indexed.
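[To make the log-watching tip concrete: pulling googlebot's crawl pattern out of a raw access log is a few lines of scripting. A sketch, assuming Apache "combined" log format (the sample lines are invented for illustration; real logs vary):]

```python
import re
from collections import Counter

# Minimal parser for Apache "combined" log lines (a sketch; real formats vary).
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def googlebot_hits(log_lines):
    """Count which paths Googlebot requested, and with what status codes."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[(m.group("path"), m.group("status"))] += 1
    return hits

# Hypothetical sample lines: two Googlebot requests and one human visitor.
sample = [
    '66.249.66.1 - - [30/Sep/2006:18:34:00 +0000] "GET /widgets.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [30/Sep/2006:18:35:00 +0000] "GET /old-page.html HTTP/1.1" 404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [30/Sep/2006:18:36:00 +0000] "GET /widgets.html HTTP/1.1" 200 5120 "http://www.google.com/search?q=widgets" "Mozilla/4.0"',
]
print(googlebot_hits(sample))
```

[A spike of 404s in that output is exactly the kind of crawl-pattern signal tedster is talking about, and it shows up in the logs long before it shows up in the rankings.]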

[edited by: tedster at 8:11 pm (utc) on Sep. 30, 2006]

AlgorithmGuy




msg:3103658
 6:54 pm on Sep 30, 2006 (gmt 0)

lmo4103,

Sorry, I did not mean it directly to you.

An "orphan page" is the name webmasters have given to such a page to describe the fact that it is redundant.

Google does not apply a penalty to an ethical orphan page.

There is nothing wrong with an orphan page if it is used in an ethical manner.

If an orphan page resulted in what you mention, then simply looking at a competitor's website could reveal pages that can be considered orphans; all you would need to do is link to such a page to cause the competitor problems.

Orphan pages are useful, especially for the method I described.

Consider a directory or a scraper site that has one page with your link on it. Is that an orphan page pointing to your website? What if that directory restructured itself and its internal links deserted that page? It would become an orphan page pointing to your website.

It now has the same status as the orphan I described. It is on its own, pointing to your site. If not cleared out by the directory, it will stay there and Google may continue to request it.

That would be more unethical, as far as orphan-page tricks are concerned, than what I described about how a rogue link can be made to work for you.

And what if an unethical link pointed to your site, trying to cause you a duplicate content problem? Simply because you want to defend your site does not mean you are being unethical.

There are lots of websites with pages that are no longer part of the site but have never been cleared off the server. Many still get crawled.
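[Finding your own orphan pages is a simple set operation once you have a list of files on the server and a crawl of your internal links. A sketch (the page names are invented for illustration):]

```python
def find_orphans(pages, links, home="/index.html"):
    """Return pages that exist on the server but that no page links to.

    pages: every page actually present on the server
    links: mapping of page -> list of pages it links to
    """
    linked = {target for targets in links.values() for target in targets}
    # The home page is reachable from outside, so don't count it as orphaned.
    return set(pages) - linked - {home}

# Hypothetical site: /old-offer.html was dropped from the navigation
# but never deleted from the server.
pages = ["/index.html", "/about.html", "/widgets.html", "/old-offer.html"]
links = {
    "/index.html": ["/about.html", "/widgets.html"],
    "/about.html": ["/index.html"],
    "/widgets.html": ["/index.html", "/about.html"],
    "/old-offer.html": ["/index.html"],
}
print(find_orphans(pages, links))
```

[Note that /old-offer.html still links out to the rest of the site; that is what makes it an orphan rather than a dead page, and why a crawler that already knows the URL may keep requesting it.]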


AlexK




msg:3103694
 7:20 pm on Sep 30, 2006 (gmt 0)

What a damn shame - a good idea for a thread, spoiled by errors within the first post.

I still have a free sub-domain on Freeserve (the original UK free ISP). When that company bought 'freeserve.co.uk' they bought a million+ sub-domains - just as everybody else who buys a domain also gets an unlimited number of sub-domains. That is how name-based hosting has worked ever since HTTP/1.1 introduced the Host header.

"KILL the subdomain at source via the ANAME RECORDS" is one option to resolve canonical problems, but is not available for many, also not the only way to do it, nor even the preferred way to do it (depends on site setup, history, preferences). The last sentence in the first post--as it stands--is incorrect.

tedster:
Note: ww.google.com will resolve ... they use a 302 redirect

Nice catch, Tedster! Hmm, do as I say, not as I do, hey?

BigDave




msg:3103705
 7:42 pm on Sep 30, 2006 (gmt 0)

Good internal navigation, most importantly breadcrumbs.

Any site with more than one level of pages below the home page should include breadcrumbs. There are still a lot of sites out there that assume you can just use your back button to go one level up in the navigation structure.

The problem is that most of your visitors are going to arrive at an internal page from a search engine, and they may not want to go to the home page.

The other problem is with search engine ranking itself. PageRank and related factors from deep links should be spread among nearby pages, instead of all being sent back up to the top level.

As a believer in the long tail, I don't ever care if a searcher lands on my home page. The home page is for repeat visitors and regular readers. I would rather have deep links and keep the PR down deep.
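[A breadcrumb trail can be generated mechanically from the URL path, which keeps it consistent across the whole site. A rough sketch (the title lookup table is invented for illustration):]

```python
def breadcrumbs(path, titles):
    """Build (title, url) pairs for the home page and each ancestor of path."""
    trail = [("Home", "/")]
    parts = [p for p in path.strip("/").split("/") if p]
    url = ""
    for part in parts:
        url += "/" + part
        # Fall back to the raw path segment when no title is registered.
        trail.append((titles.get(url, part), url + "/"))
    return trail

# Hypothetical page titles keyed by path.
titles = {"/widgets": "Widgets", "/widgets/blue": "Blue Widgets"}
print(breadcrumbs("/widgets/blue/", titles))
```

[Each (title, url) pair becomes one link in the trail, so every deep page links upward through every ancestor, which is exactly the internal-link spreading described above.]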

Patrick Taylor




msg:3103734
 8:33 pm on Sep 30, 2006 (gmt 0)

A rule I've made for myself - a tip, maybe - is always to build pages that load quickly. I like to see them load in an instant, and this means lightweight markup (CSS layout and styling), php includes, and as few (but as meaningful) images as possible - optimised in Photoshop for the smallest possible filesize.

It's my belief that this approach can improve ranking in the sense that one is required to break down complex content into small, micro-topic-focussed chunks of content that can then be intelligently woven together by the navigation system - always with carefully chosen and positioned anchor text.

decaff




msg:3103761
 9:08 pm on Sep 30, 2006 (gmt 0)

Patrick Taylor:
A rule I've made for myself ... is always to build pages that load quickly ... break down complex content into small, micro-topic-focussed chunks of content that can then be intelligently woven together by the navigation system

Match this with rock solid hosting - 9.99% uptime minimum...and the spiders will easily and readily find your site ... and the human visitors will have a site that responds quickly to their queries...chances are you will improve your "stickiness" and increase your conversions (granted you are carefully targeting your visitors needs)...

climb512




msg:3104210
 12:10 pm on Oct 1, 2006 (gmt 0)

Wow, 9.99% uptime...can any host really deliver such stellar performance ;)

decaff




msg:3104230
 12:46 pm on Oct 1, 2006 (gmt 0)

ooops...well you know what I really mean ;-)
99.99% uptime...jeesh!...9.99% uptime would equate to
roughly 657 hours of down time per month...yeah...that's a recipe for success...!
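[For the record, the arithmetic (using a ~730-hour month, i.e. 365 × 24 / 12) works out like this:]

```python
HOURS_PER_MONTH = 730  # ~365 * 24 / 12

def downtime_hours(uptime_pct):
    """Expected monthly downtime, in hours, for a given uptime percentage."""
    return (1 - uptime_pct / 100) * HOURS_PER_MONTH

print(round(downtime_hours(99.99), 2))  # "four nines minus two": a few minutes a month
print(round(downtime_hours(9.99), 1))   # the typo'd figure: ~657 hours a month
```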

theBear




msg:3104286
 1:56 pm on Oct 1, 2006 (gmt 0)

Let's try this set.

Ranking factor tips, or maybe I should say SERP visibility tips:

Make certain you don't have any possibility of query-string issues. This is yet another content duplication issue.

Make certain you don't have content duplication issues with other sites on the same server (name-based shared hosting).

Make certain you don't have other server aliases that resolve and provide a means for a search engine bot to get yet another duplicate copy of a page.

Don't assume that one search engine doesn't have some of the same types of problems that other search engines have; chances are the odds are very close to 100% against you.

Don't leave to chance that which you have a means to control (aka: when faced by an attacking horde, seek shelter fast).

Pay attention to BigDave's advice on breadcrumbs.

What follows is a site stickiness tip.

Build a site that has useful information, is easy to navigate, has decent load times, and is on a reliable host.

What to consider when building a site tip:

The long tail, discover what it is, and make use of it.

Tips about other things:

Be very careful of what you allow on any forum system you may have.
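[On the query-string point above: one defensive habit is to normalize every URL to a single canonical form before it goes into a link, a sitemap, or a redirect, so that one piece of content only ever has one address. A sketch of the idea (the parameter names treated as junk are hypothetical examples; session IDs and tracking tags are the usual culprits):]

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that create duplicate URLs for identical content
# (hypothetical examples of session/tracking parameters).
JUNK_PARAMS = {"sessionid", "sid", "ref"}

def canonicalize(url):
    """Lowercase scheme and host, drop junk query parameters, sort the rest."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(query) if k.lower() not in JUNK_PARAMS]
    return urlunsplit((scheme.lower(), netloc.lower(), path,
                       urlencode(sorted(params)), ""))

print(canonicalize("HTTP://Example.com/page?b=2&sessionid=abc123&a=1"))
```

[Every URL variant that differs only in parameter order, host casing, or a session ID then collapses to the same string, which is the whole point of the duplicate-content checks in the list above.]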

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved