Forum Moderators: Robert Charlton & goodroi
After this process the page is very stable and is very unlikely to get de-indexed, although it may drop in rankings as it gets older.
Is this normal behaviour for all sites? It seems to happen all the time on our site.
The site is 2 years old, has plenty of backlinks, is hosted in the UK and is a UK site.
Lee
I've also noticed that the pages still seem to be indexed if you check them on AOL search (powered by Google)... unless they show as de-indexed there at a later date, because it lags behind with up-to-date results.
On a slightly separate note... stats (rankings) are well down today, and it seems to affect the same bunch of pages over and over again, yet some pages seem to stay unaffected unless there is a real massive googlequake.
I cannot find any reason why it's the same pages which always get hit with small tremors first (no age, topic or design similarities), but they do act as a good warning sign.
Things always seem to recover (touch wood) to almost the exact same positions.
Lee
Within days of the 80 pages going online, Google showed (in a site:domain.com search) 3 pages, then 5 pages, then 8 pages, and then got stuck for a few days, then jumped to 16 pages when a new incoming link had been found, then got stuck for a few days, then jumped to 32 pages when another incoming link had been found.
Interestingly, 4 of the pages that were showing when only 8 pages were listed are NOT showing in the current list of 32 pages.
I find that very odd.
Webmaster Tools also does not compute.
Two weeks ago, the "Links - Internal" list showed 8 pages and showed that the maximum number of internal links to some of those pages was 8. That data was a few days old when it appeared in WMT, because a normal Google site: search was already listing 16 pages by then.
A few days later, the list showed only 7 pages, but it also showed that some of those 7 pages had up to 14 internal links pointing at them.
So why isn't it listing the 14 pages on the main "Links - Internal" screen then? As it has already discovered that those 14 pages exist, it should at least list all 14 of them, and then show at least one internal link to each of them.
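The inconsistency above is easy to check against your own crawl data. Here's a minimal sketch of a "Links - Internal" style report built from a page-to-outlinks map; the page names and the `internal_link_report` helper are invented for illustration, not anything Google or WMT exposes. The point it demonstrates: any page that appears as a link target has by definition been discovered, so a consistent report should list it.

```python
# Hypothetical sketch: rebuild a "Links - Internal" style report from a
# page -> outgoing-internal-links map, to sanity-check WMT's numbers.
from collections import Counter

def internal_link_report(outlinks):
    """Count how many internal links point at each discovered page."""
    inbound = Counter()
    for source, targets in outlinks.items():
        for target in targets:
            inbound[target] += 1
    # Every page that is a link source OR a link target has been
    # "discovered", so a consistent report lists all of them.
    discovered = set(outlinks) | set(inbound)
    return {page: inbound.get(page, 0) for page in sorted(discovered)}

# Invented three-page site: every page links to the other two.
site = {
    "/": ["/a", "/b"],
    "/a": ["/", "/b"],
    "/b": ["/", "/a"],
}
print(internal_link_report(site))  # each page has 2 inbound internal links
```

Comparing a report like this against what WMT shows would at least tell you whether the discrepancy is in your architecture or in Google's reporting.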
It has been like that for more than a week, maybe nearly two weeks now. A Google site: search shows 32 pages already, and has been stuck at that level for more than a week too.
Addendum - I have just checked a couple of smaller recent sites and WMT seems to be totally messed up and irreconcilable!
If only Google would rollback a few weeks!
I've always thought that it's related to Google needing up-to-the-minute results, but also requiring a deeper level of scrutiny for long-term result sets.
In some cases, it's highly desirable behaviour, since for many sites the previous pattern would be that a homepage or another 'index' page mentioning the 'deep' content would be the only one to perform for a while.
To be totally honest, I've taken this process for granted for a while, and I haven't looked at it in the detail I should have done. Time for a bit of data crunching/testing, I reckon ;)
[edited by: Receptional_Andy at 9:58 am (utc) on June 8, 2008]
It now lists ten internal pages, and shows that some of those have up to 40 internal links pointing at them. In effect it is saying "we have only found 10 of your pages, and we have analysed the outgoing (but still internal pointing) links on 40 of your pages".
Doesn't make any sense.
A site: search now lists 34 pages from the site.
As you can imagine, there are NO canonicalisation issues of any sort, no duplicate content issues, no architecture or code issues, but a few pages have been flagged for "meta description too short". Those have already been fixed.
Launched a site..
2 days later launched a test page.
7 hours later got number 1 in Google for the search term and many related to the page..
Today, 2 days later, the page is de-indexed in Google and totally gone.. I was devastated.. reading this post does give me a little hope...
My search term was non-competitive but I have to say I was over the moon... today I was thinking, does Google now think there is something bad about my site, and will it remove all new pages I build in this manner?
This is all very confusing!
Kindest regards David
In all probability, your page will return, and likely in the same results pages you had previously.
Look at it this way - in the past you still had the same wait for content to start getting referrals, now you get a few bonus days early on ;)
page 1 on (3rd June 2008) - indexed on 4th June, de-indexed on 5th June
page 2 on (3rd June 2008) - indexed on 4th June, de-indexed on 5th June
page 3 on (4th June 2008) - indexed on 5th June, de-indexed on 6th June
It's now 8th June and they are still not re-indexed, but I'll keep an eye on them to see if they come back in the same order they were created and de-indexed.
They are still indexed and ranking OK in AOL, which uses Google search.
Lee
The phenomenon has been reported a good bit - and as Andy mentioned above, I also have just come to expect it and haven't looked closely for a while. Most of the time there is a return to ranking within a short period.
It's as though Google has a separate partition for newly found pages and they get a different treatment. Then sometimes, as those pages transition to "regular" handling, there's a lapse. At least that's the way I've modelled it in my mind.
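Just to make that mental model concrete, here's a toy sketch of the "separate partition" idea: a fresh index serves newly found pages immediately, pages are dropped while being evaluated, and they later reappear in the main index. This is entirely speculative - the class and method names are invented, and nothing here claims to be how Google actually works.

```python
# Toy model of the speculative two-tier ("fresh" vs "main") index idea.
class TwoTierIndex:
    def __init__(self):
        self.fresh = set()   # newly discovered pages, served right away
        self.main = set()    # fully evaluated, "permanent" pages

    def discover(self, url):
        self.fresh.add(url)        # new page ranks within hours

    def evaluate(self, url):
        self.fresh.discard(url)    # dropped from the fresh partition...
        # ...and nothing serves it until promotion ("the lapse")

    def promote(self, url):
        self.main.add(url)         # back in its "permanent" position

    def is_indexed(self, url):
        return url in self.fresh or url in self.main

idx = TwoTierIndex()
idx.discover("/new-page")
print(idx.is_indexed("/new-page"))   # True: indexed within hours
idx.evaluate("/new-page")
print(idx.is_indexed("/new-page"))   # False: the temporary de-indexing
idx.promote("/new-page")
print(idx.is_indexed("/new-page"))   # True: returns to the same spot
```

The observable symptom - indexed, gone for a few days, then back in the same positions - matches the gap between `evaluate` and `promote` in this sketch.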
They are still indexed and ranking OK in AOL, which uses Google search.
Interesting!
A site:domain.com search lists 39 www pages, and clicking "omitted results" then lists only 32 www pages.
A site:www.domain.com search lists 39 www pages, and clicking "omitted results" then lists 40 www pages.
Why the difference?
There are no non-www pages on this site, nor any incoming links to non-www pages.
I've recently become quite interested in the data that Google makes available to AOL. It seems to be a smaller subset than even the "regular" index, but also a bit more stable.
Worth a discussion in itself, but I'll try to stay on topic ;)
So, if pages are 'dropped' from Google but remain in AOL, is the implication that Google could show such pages if it wanted to - but it doesn't want to do so?
A glitch, not an intention.
Hehe, I obviously have too much faith (don't think anyone's ever said that about me) ;)
I've always assumed it was a way of making sure time-sensitive content appeared immediately, although evaluating such content would be much more restricted to the text within it - then it drops (presumably after enough time for time-sensitivity not to matter so much). Then it can be properly evaluated, so it pops back into its "permanent" position.
So, AOL could have checked the 'new > quality' box on their algo ;)
If I do a site:/example.com where example is my 'plagued' domain, then Google tells me that I am logged into my account and that my home page is 22 hours old. I cannot log out whilst this is in the search box!
I then change the search to site:example.com and the page shows that the home page is 18 hours old, and I am asked to sign in even though I have never signed out! So it seems that I have two home pages with a 4-hour difference in age. According to WMT I have no duplicate page issues, but according to Google I do?
The only way that I can sign in and out is to clear all my cookies each time. In summary, my homepage is reported as different ages depending on whether I am logged into my Google account or not.
The results returned are just crazy numbers that keep changing by factors of 10 or so. Does anything work in Google like it should do anymore?
[edited by: confuscius at 2:02 pm (utc) on June 8, 2008]
A number (not all) of the 'fresh' AOL results are also those that have 'dropped' from Google.
Our site automatically links relevant topics together; these pages cover new topics, so they don't have as many internal links pointing to them as the other pages which have not been de-indexed.
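For anyone curious how "automatically links relevant topics" might work, here's a minimal sketch that relates pages by how many tags they share - new topics naturally share fewer tags with the existing corpus, so they attract fewer automatic internal links early on. The `related_pages` helper, page names and tags are all invented for illustration; the poster's actual mechanism isn't described.

```python
# Hypothetical sketch of tag-overlap based "related topics" linking.
def related_pages(pages, page, min_shared=1):
    """Return other pages sharing at least min_shared tags, best first."""
    tags = pages[page]
    scored = []
    for other, other_tags in pages.items():
        if other == page:
            continue
        shared = len(tags & other_tags)   # tags in common
        if shared >= min_shared:
            scored.append((shared, other))
    # Most tags in common first; these would become internal links.
    return [name for shared, name in sorted(scored, reverse=True)]

# Invented example: a new topic overlaps an old guide but not an
# unrelated page, so it gains exactly one automatic internal link.
pages = {
    "new-topic": {"seo", "google", "indexing"},
    "old-guide": {"seo", "google", "links", "indexing"},
    "unrelated": {"cooking"},
}
print(related_pages(pages, "new-topic"))  # ['old-guide']
```

Under a scheme like this, a brand-new topic can easily end up with only one or two inbound internal links, which fits the pattern of exactly those pages being the ones that drop.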