I personally wouldn't be totally surprised if integrating Panda into the front end of the algo has caused processing-power issues that are keeping the datacenters from staying aligned. IIRC, the reason they only ran Panda once in a while was the sheer size and processing power it required.
A bit before, and ever since, they announced its integration, I've seen a definite lag in indexing, slower spidering, obvious cycling in the results, and far more variation in site: result counts than I saw previously.
@netmeg - How do you file a DMCA report against a site that has scraped a couple hundred pages of your content?
I reserve the right to make as much of a nuisance of myself as I have time to become. In past years, this has included (but is not limited to) phoning your place of business to discuss the matter, firing off emails to your management, issuing DMCA takedown requests to your internet service provider to have your offending pages removed, issuing DMCA takedown requests to Google to have your offending pages removed from the search engine (it's amazingly easy now), calling you out as a douchebag on Facebook and Twitter, and posting screenshots of your theft alongside screenshots of my pages in my Hall of Shame.
Today, however, Googlebot is on a feeding frenzy - 1000+ pages being crawled per niche/category on our main site. At this rate it should have the entire site recrawled in a couple hundred hours, lol.
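If you want to put numbers on a feeding frenzy like that, the crawl rate is sitting right there in your access logs. Here's a minimal sketch, assuming a standard Apache/Nginx combined-format log; note that matching on the user-agent string is only a rough proxy, since spoofers fake "Googlebot" all the time - for anything serious, verify the IP with a reverse DNS lookup against googlebot.com:

```python
#!/usr/bin/env python3
"""Rough per-hour Googlebot request counts from a combined-format access log.

Illustrative sketch only: assumes the Apache/Nginx "combined" log format
and matches the user-agent string, which can be spoofed. For real
verification, reverse-DNS each client IP and confirm it resolves under
googlebot.com or google.com.
"""
import re
import sys
from collections import Counter

# Combined format: IP - - [dd/Mon/yyyy:HH:MM:SS +zzzz] "REQ" status bytes "ref" "ua"
LINE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):(\d{2}):\d{2}:\d{2} [^\]]+\].*"([^"]*)"$')

def hourly_googlebot_counts(log_path):
    """Return a Counter mapping 'dd/Mon/yyyy HH:00' buckets to hit counts."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LINE_RE.search(line)
            # group(3) is the last quoted field on the line: the user agent.
            if m and "Googlebot" in m.group(3):
                counts[f"{m.group(1)} {m.group(2)}:00"] += 1
    return counts

if __name__ == "__main__":
    for bucket, n in sorted(hourly_googlebot_counts(sys.argv[1]).items()):
        print(f"{bucket}  {n:6d} requests")
```

Run it against yesterday's log and today's and the spike (or the slowdown people are reporting) shows up immediately.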
The real "tying together" point for me though is when traffic is down, it's up overall due to expansions, but when it's down on an hourly basis relative to the previous week or year, I get a lower number of pages when I do a site: search. When it's up the site: search is "at it's peak" for pages indexed.
I see something similar to diberry, but in my case people still search for those terms; Google simply doesn't rank the site for them at all anymore.
My guess is that Google is actively comparing user metrics between results, putting all sites into a battle arena. Sites that deliver comparatively worse user metrics gradually lose traffic, while the winners grow and eventually dominate the SERP at the losers' expense.
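To make the "battle arena" intuition concrete, here's a toy model - this is purely an illustration of the speculation above, not Google's actual algorithm, and every number in it is made up. It just shows how repeatedly shifting a small slice of traffic toward the site with the better engagement metric compounds until one site dominates:

```python
"""Toy model of the 'battle arena' idea -- NOT Google's algorithm.

Two hypothetical sites compete for a fixed pool of clicks on one SERP.
Each round, a fixed fraction of the loser's traffic share moves to the
site with the better (made-up) user metric. A modest metric gap is
enough for the winner to dominate over time.
"""

def simulate(rounds=15, shift=0.10, metric_a=0.62, metric_b=0.58):
    share_a, share_b = 0.5, 0.5   # both sites start with equal traffic
    for r in range(1, rounds + 1):
        if metric_a > metric_b:
            moved = shift * share_b          # A wins: take a slice from B
            share_a, share_b = share_a + moved, share_b - moved
        else:
            moved = shift * share_a          # B wins: take a slice from A
            share_a, share_b = share_a - moved, share_b + moved
        print(f"round {r:2d}: site A {share_a:5.1%}  site B {share_b:5.1%}")

if __name__ == "__main__":
    simulate()
```

The gradual, compounding decline it prints is exactly the shape of traffic loss people in this thread are describing - which is what makes the theory tempting, even though nothing here proves it.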
I can't say this would stop traffic during extended periods of the day, but what about search ads in the SERPs? How many are there during the dead periods versus the high-traffic times? I've seen a good ad, or a group of three ads, above a #1 organic result do exactly what everyone is describing - stop a keyphrase cold.
In a more competitive niche I'm seeing the geo dial turned way up. This had been a pretty consistent SERP for the past few years, but over the last few days (maybe a week) it changes dramatically with location. Also, there are no Google Local results, just local pages.
But I sure can't catch Google "disappearing" my sites from the SERPs, even for a minute.