Msg#: 3645514 posted 3:41 pm on May 9, 2008 (gmt 0)
Maybe those old pages slipped out of the index and are now getting put back in.
Msg#: 3645514 posted 5:39 pm on May 9, 2008 (gmt 0)
This was something I was thinking: supplemental results returning, maybe. All pages of the site were indexed, just not all in the main index.
Msg#: 3645514 posted 5:51 pm on May 9, 2008 (gmt 0)
So we have an interesting question - can a supplemental page generate a Google Alert?
Msg#: 3645514 posted 8:02 pm on May 11, 2008 (gmt 0)
I would also like to add to this...
Two of my site's pages (our About Us and Contact Us sections) never appeared in a site:example.com search for over a year. But a few days back I received an email alert for link:example.com and 'example.com'. The alert was our contact us page.
And now when I do a search for site:example.com, I see the contact us page is back in the index.
So to summarize:
1) site:example.com does not return page x
2) receive email alert (link:example.com and 'example.com') with page x
3) page x now appears in the SERPs for site:example.com
So I guess, tedster, this is one small informational loophole that they forgot to close. How long before they close it? Any bets?
Msg#: 3645514 posted 7:25 am on May 12, 2008 (gmt 0)
Why would returning dumped pages create an alert when your own NEW pages don't?
Msg#: 3645514 posted 4:44 pm on May 12, 2008 (gmt 0)
That is the question. It looks like some kind of coding tangle in the Alerts selection process. Now that the issue is on the table, I remember seeing this on some keywords I monitor through Alerts, too. I just never focused on it before.
Whenever a process is automated, various "edge cases" come up that the original code did not take into account. And sometimes other related processes in the total environment change -- and then the original code does not accommodate the new environment very well.
Msg#: 3645514 posted 6:58 pm on May 12, 2008 (gmt 0)
I wonder if this is an edge effect of the multiple running algos they now use. For example, could a filter be tagging pages as low quality, or for some other quality reason, which results in those pages being classified as no longer counting as part of that domain? Then another running algo finds such a page when normally it shouldn't, does not count it as part of the domain because of the tag, and creates an alert. But then I have to ask why alerts are so rarely created even for genuine inbound links. All very confusing.