Bots started crawling last week and everything seemed to be going well. Like I said, it only appeared yesterday and now it's gone again. The owner isn't going to be happy :(
P.S. Just checked, and info:www.example.com still lists the site!
Many thanks in advance for any help.
The accepted wisdom is that the indexing will propagate across the rest of the servers in time. If you know the IP addresses of the servers you normally see, you can check which ones have the new details.
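For anyone who wants to automate that check, here's a rough sketch in Python. The IPs below are placeholders (RFC 5737 documentation addresses), the page path is hypothetical, and the /search?q= URL pattern is an assumption on my part; Google may block or change direct queries like this at any time, so treat it as illustrative only:

from urllib.parse import quote
from urllib.request import Request, urlopen

# Placeholder IPs (RFC 5737 documentation range): substitute the
# datacenter addresses you normally see.
DATACENTER_IPS = ["203.0.113.10", "203.0.113.20"]
NEW_PAGE = "/new-page.html"  # a page you recently added (hypothetical)
QUERY = quote("site:www.example.com")

for ip in DATACENTER_IPS:
    # Query each datacenter directly by IP rather than via google.com,
    # so round-robin DNS can't send us somewhere else.
    req = Request(f"http://{ip}/search?q={QUERY}",
                  headers={"User-Agent": "Mozilla/5.0"})
    try:
        html = urlopen(req, timeout=10).read().decode("utf-8", "replace")
        status = "lists" if NEW_PAGE in html else "is missing"
        print(f"{ip} {status} {NEW_PAGE}")
    except OSError as err:
        print(f"{ip}: request failed ({err})")

Any IP that reports "is missing" presumably hasn't received the updated index data yet.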
Having said that, I started using sitemaps recently: the site was indexed, great SERPs, cache up to date. After a day or two we were back to the old (bad) SERPs and the old cache for the best part of a fortnight. Then we jumped back for 24 hours, with the cache updated again and good SERPs. Now we are back to the pre-sitemaps cache and SERPs.
I don't understand how it could be indexed for one day and then not again.
I've seen this happen with several new sites (and new pages on older sites). They start to appear, then drop out a few times, then come back, and so on. I think this is normal, and you will soon/eventually (depending on incoming links) see the site/pages consolidate and stop dropping out.
30 new static pages (all unique, hand-written, etc.) were added to an established site (over 5 years old). A week later they had been spidered, were in the cache, and were ranking in the top 20 (not for big commercial keywords). Three days later they were gone: the site: command showed the pages were yet to be spidered, and the page they were linked from was showing an earlier cache, i.e., one from before the pages were added.
At first, like others, I assumed it was just the data centre not being updated and that I was viewing a different data centre. I even convinced myself that, even when going direct to the same data centre, I might not have been getting that data centre's data (if you follow).
A few days later I added five more new pages to the site and saw exactly the same pattern of events.
I now believe that this is a Google policy: take out new content for a set period, even on established sites (a sort of mini sandbox), in the hope of the webmaster buying AdWords. I think it's that simple.
Perhaps I'm being over-cynical, but I think this is where we are with these missing pages. Google has clearly collected the data at a data centre; it just keeps it out of the SERPs for whatever reason.
In my own experience, on some pages added two months ago this mini-sandbox effect has lasted three weeks or more, depending on how competitive the keywords are.
- This is my take on it. I'm very interested to hear if others agree, disagree, or have had similar experiences recently, i.e., within the last three months.
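For anyone who wants to test this systematically, here's a rough sketch that logs once a day whether each new page shows up for a site: query. The page paths are hypothetical, and again the /search URL and result markup are assumptions that Google could change or block:

import csv
import datetime
from urllib.parse import quote
from urllib.request import Request, urlopen

# Hypothetical paths of the newly added pages to watch.
NEW_PAGES = ["/new-page-one.html", "/new-page-two.html"]
QUERY = quote("site:www.example.com")

# Fetch up to 100 results for the site: query in one request.
req = Request(f"http://www.google.com/search?q={QUERY}&num=100",
              headers={"User-Agent": "Mozilla/5.0"})
html = urlopen(req, timeout=10).read().decode("utf-8", "replace")

# Append today's listed/missing status for each page to a CSV log.
with open("index_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    today = datetime.date.today().isoformat()
    for page in NEW_PAGES:
        status = "listed" if page in html else "missing"
        writer.writerow([today, page, status])

Run it daily from cron; a few weeks of the log should show whether pages really drop in and out for a set period, or whether it's just datacenter flapping.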
Cheers,
Rich
So, in your experience, you would conclude this is now a new part of the Google algo?
Interesting. The only issue I can see with this is that Google runs the risk of the SERPs being stale in certain areas with this new filtering, if they keep the new data from established sites out too long.
Keep pages out long enough to push AdWords revenue up, but not so long that the SERPs become unusable.
you would conclude this is now a new part of the Google algo
Sometimes. As with all things Google, it's getting quite complex. Maybe the more trust a domain establishes, the fewer hurdles its new pages face. Or so it seems to me. I don't claim any insider knowledge on this; I just work with a number of sites and describe the patterns I see. This particular pattern seems to be coming up more often these days.
Certainly if this kind of filtering became universally applied, then as you said, search results would get pretty stale. So I could be way off-base here. But this is what it looks like to me.
My suspicion is that Big Daddy gave Google the elbow room they needed to roll out more of the "historical factors" that were mentioned in last year's patent [webmasterworld.com], and those factors are a big part of how Google measures trust in an ongoing and historical fashion.