Google News Archive Forum

Absolute links easier to crawl?
How much of a difference will it make?
fom2001uk
msg:136648, 12:50 pm on Jul 30, 2003 (gmt 0)

If you use absolute links throughout your site, are they more likely to be followed and spidered than if you used relative links?

Most designers use relative links, I assume because they're easier to manage, but will this put off the spiders, particularly Googlebot?

davester28
msg:136649, 1:43 pm on Jul 30, 2003 (gmt 0)

Certainly Google has no problem following relative links.

There is a good point to the suggestion that Google may blindly follow the link as if it were an external link. If Google liked following external links more than internal ones, you might have a point (though I doubt there would be any advantage to this).

The point I wanted to make on the subject is that absolute links could slow down your site, as the user's PC would likely do a DNS check on every click (to resolve the domain name to the IP). IMO.

skipfactor
msg:136650, 1:44 pm on Jul 30, 2003 (gmt 0)

Absolute URLs [searchengineworld.com] are the way to go

richmc
msg:136651, 2:51 pm on Jul 30, 2003 (gmt 0)

It's pretty trivial really, I think. You could be spending your time on more important things.

Martin Dunst
msg:136652, 3:21 pm on Jul 30, 2003 (gmt 0)

fom2001uk,

IMHO, HTTP clients always resolve a relative link to an absolute URI before requesting the linked resource, i.e. they are able to generate a complete HTTP request from the relative link, including the protocol, host, etc.

If there is a relative link, the client checks for a <base href="..."> element.
If there is no such element, the document's own URI is assumed to be the base.

I don't think Googlebot would care about a link being absolute or relative.
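As a sketch, those resolution rules can be seen with Python's `urllib.parse.urljoin` (the URLs here are made up for illustration):

```python
from urllib.parse import urljoin

# With no <base href>, the document's own URI is the base.
page = "http://www.example.com/widgets/blue/index.html"
print(urljoin(page, "green.html"))
# -> http://www.example.com/widgets/blue/green.html

# With a <base href="...">, that URI is used as the base instead.
base_href = "http://www.example.com/catalog/"
print(urljoin(base_href, "green.html"))
# -> http://www.example.com/catalog/green.html
```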

regards
martin

ogletree
msg:136653, 3:44 pm on Jul 30, 2003 (gmt 0)

I have really weird URLs and Google has no problem with them. It is not that big a deal.

e.g. www.domain.com/?pane_2=page-name_keyword

I have no actual pages like .htm or .asp; every page looks like the above, and I have no problems.

Herenvardo
msg:136654, 4:12 pm on Jul 30, 2003 (gmt 0)

Once again, you are all caring more about SEO than about users. As has been said, a relative link gives a faster response to the user.
If you change your domain name or your directory structure for any reason, absolute links are a problem. With relative links this is easier.
Going further, a relative link tells you the displacement through the directory tree needed to reach the destination resource.

In my opinion: when linking within your own domain, use relative links; keep absolute links only for out-of-domain links. My page is in a subdomain, given as a free service by a large web portal, and it contains something like this:
<a href="/">Visit our hosters</a>
Even though this link goes outside my site, its destination is in the same domain, so I use relative linking.

So do the best for your users, and if you are worried about SEO, think of it this way: if Google's objective is to rank pages by their quality and relevance, then they will care about everything they find that is good for the user.
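As a minimal sketch of that advice (the function and host names are hypothetical), a link can be made domain-independent by stripping the scheme and host whenever it points at your own site:

```python
from urllib.parse import urlsplit

def to_relative(href: str, site_host: str) -> str:
    """If href points at site_host, strip the scheme and host so the
    link survives a domain change; otherwise leave it absolute."""
    parts = urlsplit(href)
    if parts.netloc == site_host:
        return parts.path or "/"
    return href

print(to_relative("http://portal.example.com/hosters/", "portal.example.com"))
# -> /hosters/
print(to_relative("http://other.example.org/page.html", "portal.example.com"))
# -> http://other.example.org/page.html (left alone: different domain)
```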

Regards,
Herenvardo

allanp73
msg:136655, 4:57 pm on Jul 31, 2003 (gmt 0)

I am in the process of making all pages use ASP URLs, but wanted to confirm that the following naming convention is suitable for Google:

e.g. ps.tw.widgets_blue_whitedots.asp

We use this naming convention to identify the top-level and second-level categories used to generate the left navigation. So this file would be part of

products & services / trading widgets

I just wanted to make sure we're not penalized for using periods and such long filenames.

Does anyone know how Google handles these types of URLs?
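For illustration, the convention above could be decoded along these lines; the category names come from the post, but the lookup tables and function are hypothetical:

```python
# Hypothetical lookup tables -- the real site's code mappings are assumptions.
TOP_LEVEL = {"ps": "products & services"}
SECOND_LEVEL = {"tw": "trading widgets"}

def categories_from_filename(filename: str):
    """Split 'ps.tw.widgets_blue_whitedots.asp' into its category path."""
    top, second, _rest = filename.split(".", 2)
    return TOP_LEVEL[top], SECOND_LEVEL[second]

print(categories_from_filename("ps.tw.widgets_blue_whitedots.asp"))
# -> ('products & services', 'trading widgets')
```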

skipfactor
msg:136656, 5:03 pm on Jul 31, 2003 (gmt 0)

Hi allanp73,

"link:.default.asp" shows there are files like this in the index.

allanp73
msg:136657, 5:45 pm on Jul 31, 2003 (gmt 0)

Thanks. I was trying to find an example and it looked like there was none.

Gus_R
msg:136658, 6:33 pm on Jul 31, 2003 (gmt 0)

allanp73:
I use dots in my URLs and it works, but I suggest using "-" because sometimes "." is treated as "/" in queries: prefix.name.htm for pages also returns prefix/name.htm for directories. I don't know how this affects rankings.
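A minimal sketch of that suggestion (the function name is made up): replace dots and underscores in the name part with hyphens, keeping the extension intact:

```python
import re

def slugify(name: str) -> str:
    """Replace dots and underscores in a page name with hyphens,
    keeping the file extension as-is."""
    stem, dot, ext = name.rpartition(".")
    return re.sub(r"[._]+", "-", stem) + dot + ext

print(slugify("prefix.name.htm"))                   # -> prefix-name.htm
print(slugify("ps.tw.widgets_blue_whitedots.asp"))  # -> ps-tw-widgets-blue-whitedots.asp
```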

allanp73
msg:136659, 7:41 pm on Jul 31, 2003 (gmt 0)

Thanks again. If dots become '/', that would not be good, because it would mean Google sees the page as being a level lower. Usually you will notice PR drop by one point per level.

Mozart
msg:136660, 8:20 pm on Jul 31, 2003 (gmt 0)

allanp73:

That is not quite correct!

If you go down a directory level (away from your root), the toolbar add-on for Internet Explorer reports a lower PR; so far, so good. But this is not the real PR as recorded by Google, only an estimate based on your homepage's PR.

So you don't have to worry if not all pages are at your root. Put the other way around, your naming convention does not help at all, because for the calculation of the true PR only the link structure matters, not the directory level.
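To illustrate the point that only the link graph matters, here is a toy power-iteration PageRank sketch (simplified: a fixed damping factor of 0.85 and every page having outbound links; this is not Google's actual implementation). Directory depth never enters the calculation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Classic power-iteration PageRank over a dict: page -> outbound links.
    Only the link graph matters; URL depth never appears."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs)  # each page splits its rank evenly
            for q in outs:
                new[q] += damping * share
        rank = new
    return rank

# A deeply nested page that the homepage links to outranks a
# shallow page that nothing links to.
graph = {
    "/index.html": ["/a/b/c/page.html"],
    "/orphan.html": ["/index.html"],
    "/a/b/c/page.html": ["/index.html"],
}
r = pagerank(graph)
print(r["/a/b/c/page.html"] > r["/orphan.html"])  # -> True
```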

Hope that helps!

Mozart

allanp73
msg:136661, 8:34 pm on Jul 31, 2003 (gmt 0)

Actually, I find that PR does drop. I noticed that lately the toolbar only shows PR for known pages. I added some new pages which were not yet indexed; previously the toolbar would show a guessed value, but now it shows that those pages have not yet been given PR. With this in mind, looking at indexed pages on different levels, I noticed a consistent drop in PR between levels. The toolbar is producing accurate results.

NOTE: Google has not gone about updating its PR information on a continuous basis yet, and shows PR based on the last major update.

g1smd
msg:136662, 11:43 pm on Aug 1, 2003 (gmt 0)

I use blue.widgets.twisted.html type filenames. They have not caused me any problems.

I don't like ../../../overthere/page.html links, as I get confused about where they are pointing. If you move things around (the page that contains the link, that is), it is easy to forget to edit a link, and then it no longer resolves.

I like /some/folder/deep/down/page.html, as I can see where it is pointing and can quickly spot a mistake. If the pointing page moves folder, the link does not need to be edited.
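The difference can be checked with Python's `urllib.parse.urljoin`: the same ../-style link resolves to a different target when the containing page moves, while a root-relative link always resolves to the same place (these URLs are made up):

```python
from urllib.parse import urljoin

# The same link markup, resolved from two different page locations.
old_page = "http://www.example.com/some/folder/deep/down/page.html"
new_page = "http://www.example.com/other/place/page.html"

# ../-style link: its target shifts when the containing page moves.
print(urljoin(old_page, "../../../overthere/page.html"))
# -> http://www.example.com/some/overthere/page.html
print(urljoin(new_page, "../../../overthere/page.html"))

# Root-relative link: same target from anywhere on the host.
print(urljoin(old_page, "/overthere/page.html"))
print(urljoin(new_page, "/overthere/page.html"))
# -> both http://www.example.com/overthere/page.html
```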

One point about links that do not contain the domain: you risk having the wrong domain returned for a result. I had a site on a .net free host with many backlinks. One person linked to the .com alias, and as it happens, the site is listed in Google as a .com URL. It has taken months for Google to see that they are the same site and to list the same backlinks whether you specify the .net or .com version.

© Webmaster World 1996-2014 all rights reserved