
Google News Archive Forum

    
Broken links and Google
What does G do with broken links on the homepage?
freejung

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 25473 posted 6:54 pm on Aug 26, 2004 (gmt 0)

I ask because a few months ago a former client of mine had an amateur make some changes to his site, and the guy deleted a page and left a broken link on the homepage. Subsequently, and I have no idea if the two are related, Google stopped indexing the page properly, showing no title or description in the SERPs. Furthermore, G dropped most pages on the site from the index, leaving only about ten or so, with no titles or descriptions on any of them.

The strange thing is, the homepage didn't drop much in the SERPs. It's still on the first page for a highly competitive term that took me months of work to get. And there it is, at number five or so, with no description or title, and it has been like that for months.

Could this have been caused by the broken link?

I've fixed it now, so we'll see if G indexes it properly again.
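For anyone who wants to catch dangling links before Googlebot does, a rough sketch of a link extractor using only Python's standard library (the page fragment and example.com URLs below are made up for illustration, not taken from the site discussed in this thread):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag it sees."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return the absolute URL of every link on the page."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

# Hypothetical homepage fragment -- the deleted page leaves a dangling link.
page = '<a href="/about.html">About</a> <a href="/deleted-page.html">Old</a>'
links = extract_links(page, "http://www.example.com/")
print(links)
```

Each collected URL could then be fetched (e.g. with urllib.request inside a try/except) to see which ones come back 404.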

 

Marcia

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 25473 posted 7:15 pm on Aug 26, 2004 (gmt 0)

It doesn't sound like that's all that was wrong, just the one broken link. It might be a good idea to run the page through a validator.

I moved some sites a month or two ago and ended up with one broken link to a page on the site - the page never got put up on the new server. Only that page disappeared from the index and lost its PR, but when I put the missing page back up, it immediately got picked up again. No PR yet, but it will come back.

freejung

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 25473 posted 7:21 pm on Aug 26, 2004 (gmt 0)

Thanks, Marcia, I'll try that. It's odd, though, because it was doing fine before with the same code, unless the guy changed something else that I'm not aware of. I'll check it.

Chad

10+ Year Member



 
Msg#: 25473 posted 7:37 pm on Aug 26, 2004 (gmt 0)

Broken links have no negative effect besides the PR-passing opportunity cost of the dead link. Google just assumes that the link is good but the destination page is temporarily unavailable.

In fact, Google goes ahead and assigns a PR value to the ghost page. A neat SEO trick is to create such dangling links to sites or pages that do not yet exist. After a while, when the site or page is actually created, it has a PR value from birth.

JudgeJeffries

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 25473 posted 12:37 am on Aug 27, 2004 (gmt 0)

Chad,
Does that also apply to a site that has a robots.txt barring, say, Google?
I have such a site and G visits every day but then retreats. If I remove the disallow, will the site have instant PR?

Chad

10+ Year Member



 
Msg#: 25473 posted 3:10 am on Aug 27, 2004 (gmt 0)

Hi freejung,

I've no idea, but if you try it and report back, we'll all know a little bit more about googlebot. ;)

Just out of curiosity, why do you have a site with a robots.txt disallow?

Regards,
chad

freejung

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 25473 posted 4:18 am on Aug 27, 2004 (gmt 0)

Actually, it was JJ who had the robots.txt, not me.

Update: I've validated my page. It had a few missing and improper attributes, but nothing major - certainly nothing that wasn't already that way when it was doing fine in Google. I've been told that this is a common issue, and I've seen it on other sites. It just seems strange that it should go on for so long, and I was looking for an explanation.

freejung

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 25473 posted 5:42 am on Aug 27, 2004 (gmt 0)

Further update: I've been playing around with validation, and I have yet to find a single competitor in the SERPs I'm interested in that validates. Many of them have what I would call egregious errors, such as closing elements that were never opened - certainly much worse than anything my page had. I don't think validation was the problem. Sigh... I guess I'll just have to wait and see what happens.
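That particular class of error - a closing tag that was never opened - can be spotted with a crude tag-stack check. A sketch using Python's stdlib HTMLParser; this is nowhere near a real validator, just an illustration of the idea:

```python
from html.parser import HTMLParser

# Elements that never take a closing tag in HTML.
VOID = {"br", "img", "hr", "meta", "link", "input", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    """Flags closing tags that were never opened."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop up to and including the matching open tag.
            while self.stack.pop() != tag:
                pass
        else:
            self.errors.append(f"</{tag}> closed but never opened")

checker = TagBalanceChecker()
checker.feed("<div><p>hello</p></div></span>")  # stray </span>
print(checker.errors)
```

A full validator checks far more (attributes, nesting rules, doctype), but this catches the stray-closing-tag case described above.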

Chad

10+ Year Member



 
Msg#: 25473 posted 8:30 pm on Aug 29, 2004 (gmt 0)

I just got an interesting email from Google stating that they do add pages to their index even when those pages are disallowed by the robots.txt file. They just do not crawl the page.

They say that the only sure-fire way to keep a page out of the index entirely is to use the NOINDEX meta tag.
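To illustrate the difference Chad describes (the path and domain here are placeholders, not from his email):

```
# robots.txt -- stops Googlebot from crawling the page, but the URL can
# still end up in the index if other sites link to it:
User-agent: Googlebot
Disallow: /private/
```

```
<!-- In the page's <head> -- lets the bot crawl the page, but tells it
     to keep the page out of the index entirely: -->
<meta name="robots" content="noindex">
```

Note the catch: for the meta tag to be seen at all, the page must not also be blocked in robots.txt, since a blocked page is never fetched.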


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved