Certainly Google has no problem following relative links.
There is a good point to the suggestion that Google may blindly follow the link as if it were an external link. If Google liked following external links more than internal ones, you might have a point. (I doubt there would be an advantage to this, though.)
The point I wanted to make on the subject is... I think this would slow down your site, as the user's PC would likely do a DNS check on every click (to resolve the domain name to the IP). IMO.
Absolute URLs [searchengineworld.com] are the way to go
s'pretty trivial really i think. you could be spending your time doing more important things.
imho, http clients always calculate an absolute uri from relative links before requesting the linked resource.
i.e. they're able to generate a complete http request from the relative link, including the protocol, host, etc.
if there's a relative link, the client checks for a <base href="..."> element.
if there is no such element, the document's uri is assumed to be the base href.
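A minimal sketch of that resolution in Python, mimicking what a browser does (the page URI and base href here are made-up examples):

```python
from urllib.parse import urljoin

# With no <base href>, the document's own URI is assumed to be the base.
doc_uri = "http://www.example.com/products/index.html"  # hypothetical page
print(urljoin(doc_uri, "widgets/blue.html"))
# -> http://www.example.com/products/widgets/blue.html

# If the page declares <base href="http://www.example.com/catalog/">,
# the client resolves the relative link against that instead.
base_href = "http://www.example.com/catalog/"
print(urljoin(base_href, "widgets/blue.html"))
# -> http://www.example.com/catalog/widgets/blue.html
```

Either way, the request that goes out on the wire is for a complete absolute URL.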
i don't think googlebot would care about a link being absolute or relative.
I have really weird URLs and Google has no problem with them. It is not that big a deal.
I have no actual pages like .htm or .asp; every page looks like the above and I have no problems.
Once again, you are all caring more about SEO than about users. As has been said, a relative link gives a faster response to the user.
If you change your domain name or your directory structure for any reason, absolute links are a problem. With relative links this is easier.
Going further, a relative link lets you see the displacement through the directory tree to reach the destination resource.
In my opinion: while linking within your own domain, use relative links; keep absolute links only for out-of-domain links. My page is in a subdomain, given as a free service by an important web portal, and my page contains something like this:
<a href="/">Visit our hosters</a>
Even though this link goes outside my site, its destination is in the same domain, so I use relative linking.
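For what it's worth, a client resolves that href="/" against the page's own host, so it always lands at the root of whatever domain served the page (the subdomain below is a hypothetical example):

```python
from urllib.parse import urljoin

# Hypothetical page hosted on a portal's free subdomain.
page = "http://mysite.portal.example.com/about.html"
print(urljoin(page, "/"))
# -> http://mysite.portal.example.com/
```

That is why the link keeps working even if the portal renames its domain: the browser fills in whichever host the page was loaded from.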
So do what is best for your users, and if you are worried about SEO, think of this: if the objective of Google is to rank pages by their quality and relevance, then they will take care that everything they rank well is good for the user.
I am in the process of making all pages use asp urls, but wanted to confirm that the following naming convention was suitable for Google:
We use such a naming convention to identify the top-level and second-level category, to generate the left navigation. So this file would be part of
products & services/trading widgets
I just wanted to make sure we're not penalized for using periods and such long filenames.
Does anyone know how Google handles these types of URLs?
"link:.default.asp" shows there are files like this in the index
Thanks. I was trying to find an example and it looked like there was none.
I use dots in my URLs and it works, but I suggest using "-" because "." is sometimes treated as "/" in queries: a query for prefix.name.htm pages also returns prefix/name.htm directories. I don't know how this affects rankings.
Thanks again. If dots become '/', that would not be good, because it would mean Google sees the page as a lower level. Usually you will notice PR drops by 1 point per level.
That is not quite correct!
If you go down a directory level (away from your root), the toolbar add-on for Internet Explorer reports a lower PR; so far so good. But this is not the real PR as recorded by Google, it is only an estimate based on your homepage's PR.
So you don't have to worry if not all pages are at your root. Or, the other way around, your naming convention does not help at all, because only the link structure matters for the calculation of the true PR, not the directory level.
Hope that helps!
Actually, I find that PR does drop. I noticed that the toolbar lately only shows PR for known pages. I added some new pages which weren't yet indexed. Previously the toolbar would show a guessed value; now it shows that the pages have not yet been given PR. With this in mind, looking at indexed pages on different levels, I noticed a consistent drop of PR between levels. The toolbar is producing accurate results.
NOTE: Google has not gone about updating its PR information on a continuous basis yet, and shows PR based on the last major update.
I use blue.widgets.twisted.html type filenames. They have not caused me any problems.
I don't like ../../../overthere/page.html links, as I get confused about where they point. If you move things around (the page that contains the link, that is), it is easy to forget to edit a link, and then it will no longer resolve.
I like /some/folder/deep/down/page.html as I can see where it points and quickly spot a mistake. If the pointing page moves folder, the link does not need to be edited.
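To illustrate the difference (hypothetical paths, again using Python's urljoin to mimic what a browser does):

```python
from urllib.parse import urljoin

# Hypothetical location of the page containing the link.
page = "http://www.example.com/a/b/c/page.html"

# Dot-dot relative: the result depends on where the containing page sits,
# so it breaks if that page moves folder.
print(urljoin(page, "../../overthere/page.html"))
# -> http://www.example.com/a/overthere/page.html

# Root-relative: resolves the same wherever the containing page lives.
print(urljoin(page, "/some/folder/deep/down/page.html"))
# -> http://www.example.com/some/folder/deep/down/page.html
```

The root-relative form only breaks if the *target* moves, which is usually easier to keep track of.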
One point about links that do not contain the domain: you risk having the wrong domain returned for a result. I had a site on a .net free host. It has many backlinks. One person linked to the .com alias and, as it happens, the site is listed in Google under the .com URL. It has taken months for Google to see that they are the same site and list the same backlinks whether you specify the .net or .com version.