
Googlebot/2.1 visits BUT no pages in SERPs

Google has NO cached pages for site:mydomain.com

Romster

10:44 pm on May 29, 2004 (gmt 0)

10+ Year Member



Has anyone had a similar experience, and what does it mean?

Googlebot/2.1 has visited my domain a couple of times and accessed ~100 pages in total. (The domain is a USED one I picked up 2 weeks ago because of the good incoming links it has.) The earliest Googlebot/2.1 visits were more than a week ago, but if I check whether Google has any pages cached, nothing shows up (site:mydomain.com - did not match any documents).

- Could it be that the domain has a penalty of some kind?
- IF there is a penalty > doesn't it mean that Googlebot/2.1 wouldn't come to the domain at all?
- What could be causing Google to have NO INDEXED PAGES of my site even after I see in the logs that the GoogleBot got the pages it asked for ("HTTP/1.0" 200)?

Marcia

11:35 pm on May 29, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>What could be causing Google to have NO INDEXED PAGES of my site even after I see in the logs that the GoogleBot got the pages it asked for ("HTTP/1.0" 200)?

Being included in the index isn't instantaneous when she crawls. And these days, even when sites are first indexed, they take a while to rank for keywords.

If it was an expired domain, the slate gets wiped clean; for all practical purposes, this is a new site - just give it time.

If you want to see whether there's a history behind the site, look it up in the Wayback Machine.

Romster

10:54 am on May 30, 2004 (gmt 0)

10+ Year Member



I understand that for expired domains the PR is dropped to 0 and has to be built up again.

Still, what I don't understand is how, after 2 weeks of repeated visits to the site, Google has NO CACHED PAGES. I don't need / don't expect the pages to rank high - I just want them to be in the Google DB...

Chico_Loco

12:05 pm on May 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm very skeptical about this filter on used domains - has it ever been proven?

Just think of the additional programming required.

if ($domain_has_ever_expired) {
    foreach my $backlink (@backlinks) {
        my $link_created = find_time_link_was_put_on_page($backlink);
        if ($link_created > $expiry_date) {
            count_link($backlink);
        }
    }
}

------
Doesn't seem like much, but think of all the additional data that would need to be stored for each and every link on the internet in order for this to be done. They would essentially need to parse every single page ever, at multiple points in time - which automatically bumps up their database size by about 300 times.

It doesn't seem like the benefits would be worth the time required to execute this kind of function.

Just storing the last expiry date alone for each domain would take hundreds of gigabytes of data that would need to be parsed.
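For scale, the per-domain part of that claim can be checked with a quick back-of-envelope calculation. This is a hedged sketch only: the domain count and record sizes below are assumptions (rough 2004-era figures), not Google's actual numbers.

```python
# Back-of-envelope: storage for one expiry timestamp per registered domain.
# All constants are assumptions for illustration, not real Google figures.

DOMAINS = 60_000_000      # assumed number of registered domains, circa 2004
TIMESTAMP_BYTES = 8       # one 64-bit expiry timestamp per domain
KEY_BYTES = 24            # assumed average lookup key (hashed domain name)

total_bytes = DOMAINS * (TIMESTAMP_BYTES + KEY_BYTES)
total_gb = total_bytes / 1_000_000_000

print(f"~{total_gb:.2f} GB for one expiry date per domain")
```

Under these assumptions the per-domain table comes out on the order of single-digit gigabytes; the real cost in Chico_Loco's scenario is the per-link timestamp data, which scales with the number of links on the web rather than the number of domains.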

sfxmystica

2:09 pm on May 30, 2004 (gmt 0)

10+ Year Member



I don't think Google would be worried about a few measly terabytes ... :)

Romster

5:48 am on May 31, 2004 (gmt 0)

10+ Year Member



Well, the situation still remains the same:

Googlebot/2.1 visits on a regular basis:
(hits + hits-on-robots.txt) 224+32, 7.14 MB (Last visit 30 May 2004 - 09:08)

At the same time, searching for cached pages on Google produces "did not match any documents." NO cached pages.

I have a couple of domains that have a Google penalty. On those domains I NEVER see Googlebot/2.1 and likewise NO cached pages, but I have never before seen regular visits from Google and still no cache...

What could be causing this?