I had 20,300 pages showing for a site:www.example.com search yesterday and for the past month. Today it dropped to 509, but my traffic is still pretty constant. I normally get around 4,500 - 5,000 visitors to that site per day, and today I've already had 4,000.
So either Google doesn't account for even a small percentage of my traffic (which I doubt), or the way Google stores information about my site has changed. i.e. the 20,300 pages are still there, but Google will only tell me about 509 of them. As far as I can tell, I think the other pages have gone supplemental.
That resonated with something that I was talking about with the crawl/index team. internetheaven, was that post about the site in your profile, or a different site? Your post aligns exactly with one thing I've seen in a couple of ways. It would align even more if you were talking about a different site than the one in your profile. :) If you were talking about a different site, would you mind sending the site name to bostonpubcon2006 [at] gmail.com with the subject line of "crawlpages" and the name of your site, plus the handle "internetheaven"? I'd like to check the theory.
Just to give folks an update, we've been going through the feedback and noticed one thing. We've been refreshing some (but not all) of the supplemental results. One part of the supplemental indexing system didn't return any results for [site:domain.com] (that is, a site: search with no additional terms). So that would match with fewer results being reported for site: queries but traffic not changing much. The pages are available for queries matching the supplemental results, but just adding a term or stopword to site: wouldn't automatically access those supplemental results.
I'm checking with the crawl/index folks on whether this might factor into what people are seeing, and I should hear back later today or tomorrow. In the meantime, interested folks might want to check if their search traffic has gone up/down by a major amount, and see if there are fewer/more supplemental results for a site: search for their domain. Since folks outside Google couldn't force the supplemental results to return site: results, it needed a crawl/index person to notice that fact based on the feedback that we've gotten.
Anyone that wants to send more info along those lines to bostonpubcon2006 [at] gmail.com with the subject line "crawlpages" is welcome to. So you might send something like "I originally wrote about domain.com. I looked at my logs and haven't seen a major decrease in traffic; my traffic is about the same. I used to have about X% supplemental results, and now I hardly see any supplemental results with a site:domain.com query."
I've still got someone reading the bostonpubcon email alias, and I've worked with the Sitemaps team to exclude that as a factor. The crawl/index folks are reading portions of the feedback too; if there's more that I notice, I'll stop by to let you know.
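The suggestion above boils down to comparing daily Google-referred traffic before and after the drop. A minimal sketch of that check, assuming an Apache/NGINX combined-format access log and a hypothetical helper name (not anything from the thread):

```python
import re
from collections import defaultdict

# Combined log format: date in [..], request in "..", status, bytes,
# then the referrer in the next quoted field.
LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]+\] "[^"]*" \d+ \S+ "([^"]*)"')

def google_referrals_per_day(lines):
    """Count log lines per day whose referrer contains 'google.'."""
    counts = defaultdict(int)
    for line in lines:
        m = LINE.search(line)
        if m and "google." in m.group(2):
            counts[m.group(1)] += 1
    return dict(counts)

sample = [
    '1.2.3.4 - - [08/May/2006:10:00:00 +0000] "GET /page.html HTTP/1.1" 200 1234 "http://www.google.com/search?q=widgets" "Mozilla"',
    '1.2.3.5 - - [08/May/2006:11:00:00 +0000] "GET /other.html HTTP/1.1" 200 999 "-" "Mozilla"',
]
print(google_referrals_per_day(sample))  # {'08/May/2006': 1}
```

If the daily counts stay flat while the site: result count collapses, that matches the "supplemental results hidden from site:, pages still serving queries" explanation rather than an actual de-indexing.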
[edited by: Brett_Tabke at 8:07 pm (utc) on May 8, 2006]
Again thanks a lot.
[edited by: tedster at 4:54 pm (utc) on May 13, 2006]
My situation is better now than it has been since this site's issues began last September.
Until the current situation of missing pages started, my site was fully indexed but only a handful of pages were ranking.
1,100 pages went missing and the remaining pages returned to their pre-September rankings. A few days after emailing the bostonpubcon2006 at gmail.com addy, my site's pages started returning.
According to today's reply, I have "939 pages listed".
When I do a site: search on Google using the 4 variants suggested, they all return 10,300 pages. Of course the site doesn't have that many pages; it has around 1,300, and up to result 999 there are no supps.
Other points to note from the email:
"This suggests to me that the situation is currently self-correcting"
"...that your site has not been manually penalized"
I hope my mess and slow improvements give others a little hope in these times of trouble.
Thanks again to whoever took the time to look at my site and reply.
Edit to add that most of the reindexed pages are back to their pre-September positions.
4 out of the 10 pages have a %22 appended to them, for example:
www.widget.com/super-widgets.html%22 and that is the reason we get a 404 error.
What could be the problem?
Would anyone know if this has anything to do with the site not getting fully indexed?
They were all returned May 7th and 9th as 404 Not Found. I have no idea why, or where Google got these URLs; they are not in our site map.
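One likely explanation for the question above: %22 is the percent-encoding of a double-quote character, so URLs ending in %22 usually come from malformed link markup somewhere on the web that dragged a trailing quote into the href (for example a link missing its opening quote). A sketch under that assumption; the HTML snippet and regex are illustrative, not the actual site's markup:

```python
import re
from urllib.parse import quote, unquote

# %22 is simply an encoded double quote.
print(quote('"'))  # %22
print(unquote('www.widget.com/super-widgets.html%22'))  # trailing " restored

# A missing opening quote, e.g. <a href=page.html">, makes some parsers
# treat the closing quote as part of the URL. Quick scan for that pattern:
html = '<a href=super-widgets.html">Super Widgets</a>'  # hypothetical snippet
bad_links = re.findall(r'href=([^\s"\'>]+)"', html)
print(bad_links)  # ['super-widgets.html']
```

Grepping your own pages (and asking any sites linking to you to check theirs) for that pattern would explain where a crawler picked up the %22 URLs even though they are not in the Sitemap; a 301 redirect stripping the trailing %22 is a common server-side fix.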
Hope GoogleGuy looks into this as it seems widespread.
The sooner webmasters take G's crap affiliate program (AdSense) off their sites, the sooner this whole mess will be solved.
Don't believe me?
Get 10,000+ webmasters to remove their AdSense code today and I'll bet this "impossible" issue will be resolved by sometime next week.
The sooner webmasters look at the bigger picture (honestly, is a couple of weeks of missed AdSense income REALLY hurting you/us?), the sooner we get our voices heard.
Try Sitemaps! Try Sitemaps! That's all it seems to come back and say. Funny, you'll never guess what my email said: "try Sitemaps".
Thanks. That's a very interesting post you bring to our attention,
especially this part of the post:
"There are a few things to consider about our overall crawl and indexing pipeline. As part of some recent updates (http://www.mattcutts.com/blog/bigdaddy/) we're taking a much closer look at affiliate links, linkfarms, duplicate content, and other factors as described in our webmaster quality guidelines"
Does it read the same way to you?
I am trying to figure out why Google has chosen these 24 pages out of 440 pages.
What's wrong with the rest?
How many do you have on Google now and how many do you have in total? Are the pages increasing or decreasing, and at what rate?
I ask because as Google increased its crawling, I noticed more and more pages appearing.
There are hundreds of webmasters with perfectly legitimate sites, using original and interesting content, in trouble.
Even Google knows that webmasters and good site owners in general do not adhere to their quality guidelines, and cutting them out completely would damage their search product's access to interesting information for users.
The overall problem is still suspended at an earlier stage, with the DCs being monitored and tweaked [and yes, we webmasters are part of the testing phase] to produce accurately what Google intends.
We still have to see the reliable effects of backlinks on fully indexable pages. We're not even close to a solution; if the pages aren't even indexed properly, how long do you think it will be before the backlinks start to work?
2 months, 3 months, 6 months?
And then you have issues with unrelated or old pages [supps] still to be sorted.
When this is all sorted, then one can look at the on-page content and say: this isn't appearing because of these reasons.
But how I'd love to be wrong!
Thank you so much for the feedback.
I guess we've got no choice... but wait. I was just trying to figure out if... maybe... maybe... there is something wrong with our site and that might be the cause.
We haven't come up with anything on our end... I guess we have done a lot... We will not give up... Hopefully we will succeed...
I think that Google meant that affiliates using approved copy might suffer a duplication penalty.
NO, Google meant affiliate links outright.
"...we're taking a much closer look at affiliate links, linkfarms, duplicate content, and other factors as described in our webmaster quality guidelines..."
You'll have to use their crappy AdWords program and pay up if you want to promote affiliate links... and they can freely channel it all over the web via their spammy MFA affiliates. As long as you pay, there are no penalties or crawling errors.
No free affiliate-link promotion for you ("Soup Nazi" - Seinfeld).
It would be nice if we could get a little more clarification on what they mean here. Are they just talking about affiliate sites that are full of banners with no content, or, as in my case, a site with 40% affiliate content mixed in with my own unique content?
After all, if you've got a site selling, let's say, books, unless you're going to read every review you are going to have to use snippets from the affiliate. But if that snippet is in between your own content, with additional information from you, why is that so bad?