Need lateral thinking re googlebot's absence

Stopped visiting client sites


anallawalla

3:55 pm on May 10, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I do know that googlebot needs to find a site via links from an already-indexed site, but my client's sites are a puzzle. I am in Australia and they are in California, so communications with them are a bit sketchy.

They have three main sites - a.com, b.com and a.net (different products and services). a.net recently had its pages copied to c.com while a.net was left intact, so it looks like spam. I pointed out the risk, but they have an SSL cert for a.net and are hesitant about getting a new one (dunno how long it takes).

All three sites are indexed (that is, the older a.net, though not the new c.com), and b.com has many top SERPs. All are PR5, with a modest number of backlinks. None has a robots.txt.

"a" represents the company name, hence they used it on two sites with different TLDs.

OK, b.com was picked up by freshbot within days of some changes, but the other two have not been. The third site is the major worry (a.net, aka c.com) because its cached copy is "many, many months" old and the indexed pages are "old". They just made the IIS logs available to me, but googlebot does not appear in the log fragments I can see.
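Not from the thread itself, but for anyone doing the same log check: a quick way to confirm whether googlebot has been by is to scan the IIS (W3C extended) log lines for its user-agent. A minimal sketch; the sample lines and field layout here are made up for illustration, not the client's real logs:

```python
# Sketch: pull the googlebot requests out of IIS (W3C extended) log lines.
# Sample data below is invented for illustration.

def googlebot_hits(log_lines):
    """Return the log lines whose user-agent field mentions Googlebot."""
    hits = []
    for line in log_lines:
        if line.startswith("#"):        # W3C logs begin with #Fields: etc.
            continue
        if "googlebot" in line.lower():
            hits.append(line)
    return hits

sample = [
    "#Fields: date time c-ip cs-method cs-uri-stem cs(User-Agent) sc-status",
    "2003-05-10 01:00:00 10.0.0.1 GET /index.html Mozilla/4.0 200",
    "2003-05-10 01:05:00 66.249.64.1 GET /robots.txt Googlebot/2.1 200",
]

for hit in googlebot_hits(sample):
    print(hit)
```

If this returns nothing over a decent stretch of log, that confirms googlebot really isn't visiting rather than visiting and being ignored.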

The older a.net is linked from the freshbotted b.com, so I am hoping it will finally be crawled (I keep telling them to use only the new c.com domain for all links).

The bad stuff: they own a few more domains, and while some are 301'd to the working site, two are not, so they look spammy. One site uses <div style="display:none"> to hide some words, and I told them to fix that asap.

The three sites are between 2 and 5 years old, other than the new domain that matches the brand name. What would cause googlebot to stop visiting such a PR5 site?

- Ash

ciml

4:02 pm on May 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> its cached copy is "many, many months" old

If this is on www-sj or www2 then I wouldn't worry. That index doesn't seem to have recent backlink records yet.

I would avoid hiding words as you describe. If that has been caught already, then any penalty will hopefully reverse itself after a few weeks as part of Google's webmaster-friendly approach.

anallawalla

7:02 am on May 11, 2003 (gmt 0)




Hey Calum, that's a good tip, but here's a further puzzle.

www-sj, www-fi, www2 and www3 *do* have a fairly recent copy of the deprecated domain a.net but www, ex, va, dc, ab, in, zu and cw do not -- those cache copies are very old.

None of them have picked up the new domain c.com.

Thanks.
Ash

ciml

9:11 pm on May 14, 2003 (gmt 0)




This confused quite a lot of us when it first happened, some months ago. Fresh listings (injected over normal listings) come with a fresh cache. As a result, you can no longer use the cache to identify the age of the index you're looking at.

anallawalla

1:52 pm on May 14, 2003 (gmt 0)




Relief at last. New domain got freshbotted!

Don't know if my reinclusion request did it (the site was never banned, just forgotten) - it's too soon for that - or whether it was just good luck and incoming links.