Forum Moderators: martinibuster


Placement of links on a page

Noted some recip links in weird places


Marval

12:57 am on Feb 21, 2004 (gmt 0)

10+ Year Member



I happened to be viewing the source of a page that links to me and noticed that the webmaster had placed his "trade" links inside a separate <html> and <body> tag set within the page. I was trying to track down why these links had not shown up as backlinks when much lower-ranked links already had. I ran the page through a "sim" spider and it saw the links, but it leaves me wondering whether a spider such as Googlebot will go beyond a closing body tag for links. Any experiences with this?
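Most real-world HTML parsers are lenient and simply keep reading past a closing </body> or </html> tag, which is likely why the sim spider still saw the links. A minimal sketch with Python's standard-library parser (the page markup and URL below are made up for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href, even ones that appear after a closing </body> tag."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page laid out like the one described: a second
# <html>/<body> block appended after the first one closes.
page = """<html><head><title>Page</title></head>
<body><p>Main content</p></body></html>
<html><body><a href="http://example.com/trade">trade link</a></body></html>"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['http://example.com/trade'] -- the parser keeps going past </body>
```

Whether Googlebot in particular treats such links the same way is a separate question, but a lenient parser has no trouble finding them.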

internetheaven

4:27 pm on Feb 21, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Did this extra HTML section have any meta tags, such as robots=noindex,nofollow? Does the sim spider you used catch the HTML before server-side includes are added? The page you are viewing may be built via an SSI <!--#include --> command where the included file sits in a location excluded by robots.txt.

A lot of webmasters use simple tricks to fool other webmasters into thinking a link has been placed, in order to obtain reciprocal links. The two above are the simplest and most common I've seen: when you look at the page in question, it appears you are being linked to, when in fact the spider is being told to ignore the link.
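The meta-tag trick described above is easy to check for yourself. A rough sketch using Python's standard-library parser (page markup and URL are invented for the example):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.directives.append(d.get("content", ""))

page = '''<html><head><title>Links</title>
<meta name="robots" content="noindex,nofollow"></head>
<body><a href="http://example.com/you">your site</a></body></html>'''

finder = RobotsMetaFinder()
finder.feed(page)
print(finder.directives)  # ['noindex,nofollow'] -- the link is visible, but spiders are told to ignore it
```

If this list comes back non-empty on a "link partner" page, the visible link is doing you no good.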

Marval

7:03 pm on Feb 21, 2004 (gmt 0)

10+ Year Member



I don't see any tags - as a matter of fact, the only tag this guy uses in any of the three heads he has on the page is the title tag. I'm a little lost on the sim spider part, as I don't know if there is one out there that would be able to see the page before the includes, if there are any. I used the sim spider available here in the links section.

rogerd

2:03 pm on Feb 24, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



I've heard the expression, "Two heads are better than one" but three? ;)

Server-side includes are inserted by the server before the page is delivered, so the result won't depend on the browser or on the robots.txt file (unless the site employs cloaking, which sounds unlikely).

I think this sounds like bad coding, either a bad cut & paste job or a poorly thought-out include (SSI or scripted) that is jamming an entire HTML page into the middle of another one.

Whatever it is, it's bad coding and won't be helping the site owner, either.
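The bad-include scenario described above can be sketched in a few lines. This is only a simulation of what the server does with an SSI directive - it replaces the comment with the raw contents of the named file before delivery - and the file names are made up for illustration:

```python
# Hypothetical template page containing an SSI include directive.
template = ('<html><body><p>Content</p>\n'
            '<!--#include virtual="links.html" -->\n'
            '</body></html>')

# If the included file is itself a complete HTML page rather than a fragment...
included = '<html><body><a href="http://example.com/partner">partner</a></body></html>'

# ...the server's substitution jams one full page into the middle of another.
delivered = template.replace('<!--#include virtual="links.html" -->', included)
print(delivered)
# The delivered page now contains two <html> and two <body> blocks,
# matching the broken markup described in this thread.
```

The fix is simply to make the included file a bare fragment (just the links, no <html>/<head>/<body> wrapper).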

Marval

10:48 pm on Feb 24, 2004 (gmt 0)

10+ Year Member



Discovered it was SSI being used by the site owner, with just bad coding - when I showed him the results of the coding, he immediately changed it.