Forum Moderators: Robert Charlton & goodroi
Does somebody else here have this problem, or am I just missing something?
Is Google's index having problems these days?
Thank you for any light shed upon this troubling matter!
PS: If I don't get a decent answer I might slip back to the dark side and go back to scraping, since white hat = no hat! ;)
The page appears for:
site:www.domain-name.com/read/ keyword
It does not appear for keywords it previously ranked well on. But for an exact longer phrase, searched in quotes, it is the only result.
The page lost all its ranking overnight, and no black hat techniques have been used (and I know black hat).
Does this make it supplemental or penalized?
The page has 44 IBLs in Webmaster Tools, with no problems and no messages. Google crawls 50%-75% of the website daily, and news pages are crawled in a matter of hours.
Thanks for your replies.
Looking for sites that had published a particular email address, I found that the number of results actually containing that address varied depending on which parts of the address I typed in.
However, if I included the whole email address, I got zero results.
To me, it appears that Google doesn't always index every single word on the page.
From memory, most of the times this happened, the results were those from the Supplemental Results.
it appears that Google doesn't always index every single word on the page...most of the times this happened, the results were those from the Supplemental Results.
Exactly right. Supplemental Results have always been afflicted with a kind of partial indexing, and that's one of the reasons that duplicate meta descriptions - which ARE part of that top-level indexing - are particularly bad on supplemental pages. Those duplicate meta descriptions cause even more intensive filtering of urls that really aren't duplicates in any other way, except for the dupe meta tags.
I think the reason Google shuffles bazillions of urls off to a database partition is that urls in that partition get less computation-intensive handling, which economizes on resources. Given their recent patent on the issue, there may be even more types of partitions in the future, but we won't have names for them or see them tagged in any special way.
Given all that, I suspect that the OP's situation could also be compounded by those urls being in the supplemental index.
My site is quite new (one month), and this page used to bring in about 50 hits/day out of 100+ in total. I have about 30 pages of content and several pages that provide internal navigation.
Considering all pages are written by ME and are original, and the site is coded by me and does not have duplicate page problems in the content pages ... what might have caused this? Could this just be one of the recent Google problems?
As I said, no dirty stuff on this site of mine.
Right now I am working with a client site where they have a PR7 url with unique, original content - and it's in the Supplemental Index. Something in the algo has tagged this as a url of lesser importance, but darned if I can see what so far. Another client has a similar issue on a PR5 page.
Could my problem be related to the fact that it is a new site (1.5 months)? Perhaps Google gives me some traffic for motivation, then levels me to see if I stay true to the website and to quality. And if it sees I update and work on it as before, could it pull me back out into the light?
That is the theory of Supplementals for new sites, but ...
And I don't have a lot of links: about 150+ in Webmaster Tools for the whole site.
PS: I'm a serious coder! The site is as clean as it can be: no dupe pages due to URL problems, and it's well structured both in navigation and in internal file structure. It's very easy to maintain and change. I also keep keyword density low, below 5%, using a script that gives me keyword/keyphrase density for any content I write. So I play by the guidelines.
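For anyone curious what such a density script might look like: here is a minimal sketch (my own illustration, not the poster's actual script) that computes the density of a word or multi-word phrase as the share of the page's words occupied by matches of that phrase.

```python
import re

def keyword_density(text, phrase):
    """Return the density (%) of a keyword or phrase within the text.

    Density = (occurrences * words in phrase) / total words * 100.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    if total == 0:
        return 0.0
    phrase_words = re.findall(r"[a-z0-9']+", phrase.lower())
    n = len(phrase_words)
    # Count occurrences of the n-word phrase with a sliding window,
    # so "green tea" matches across the two-word sequence.
    hits = sum(1 for i in range(total - n + 1)
               if words[i:i + n] == phrase_words)
    return hits * n / total * 100

sample = "Green tea is healthy. Many people drink green tea daily."
print(round(keyword_density(sample, "green tea"), 1))  # → 40.0
```

A real script would of course strip HTML first and might weight title or heading text differently; this just shows the basic counting.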
I still hope this might end at the end of August. Maybe this was Google's time off too!