Forum Moderators: Robert Charlton & goodroi
Google Updates and SERP Changes - March 2011
< continued from [webmasterworld.com...] >
< related Panda Farm Update [webmasterworld.com] >
New Chrome extension: block sites from Google's web search results
Monday, February 14, 2011 | 12:00 PM
Today the Google web search team launched a new Chrome extension to block low-quality sites from appearing in Google’s web search results. Read more in the post below, cross-posted from the Official Google Blog. - Ed
[chrome.blogspot.com...]
Also - [webmasterworld.com...]
I think user behaviour data is being underestimated in this thread. Each website will have a depth profile being built that feeds into a potential quality assessment by Google. What say you?
[edited by: tedster at 8:15 pm (utc) on Mar 15, 2011]
@Walkman, you may be right. After all, John is a programmer, and programmers tend to speak in programmer's lingo, so the word "code" may have rolled off his tongue more naturally than "content".
This is not limited to this particular algorithm update & your site
I stumbled on this old thread
I think people sometimes over-analyze things, like every word said by a Google employee. John most likely didn't ponder each word for hours; he just said them casually.
[edited by: TheMadScientist at 7:34 am (utc) on Mar 16, 2011]
After all, John is a programmer, and programmers tend to speak in programmer's lingo, so the word "code" may have rolled off his tongue more naturally than "content"
99% of most sites is content, not code, so ...
Sometimes, even after recrawling parts of a site, our algorithms will need a bit of time to confirm that the site has really changed for good.
This can either refer to what Alyssa said (the algo testing user response - most likely) or maybe waiting for a more general site re-evaluation/ranking, as opposed to just the spider seeing that the page is a pretty good page.
Sometimes, even after recrawling parts of a site, our algorithms will need a bit of time to confirm that the site has really changed for good.
All of this can and will take time.
Another possible reason: they wait until they get around to crawling and analyzing the entire site after various changes have been made, since they need to decide if the good-to-low-quality content ratio is now within the "acceptable" range. And the lowest-quality pages would logically be the pages that are normally given the lowest priority for crawling and studying.
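Just to illustrate the idea, here's a rough sketch of that hypothesis in Python. To be clear: nobody outside Google knows the real algorithm, and every name and threshold below is made up for illustration - the point is only that a site-wide verdict would have to wait until enough of the site has been recrawled, and would then hinge on the ratio of good to low-quality pages.

```python
# Purely speculative model of the "quality ratio" hypothesis above.
# All function names, thresholds, and the page format are invented.

def site_passes_quality_check(pages, crawled_fraction_needed=0.9,
                              min_good_ratio=0.8):
    """pages: list of dicts like {"crawled": bool, "quality": "good" or "low"}.

    Returns None while too little of the site has been recrawled to judge,
    otherwise True/False depending on the share of good-quality pages.
    """
    crawled = [p for p in pages if p["crawled"]]
    # Not enough of the site recrawled yet -> no verdict, keep waiting.
    # (This would be why changes "take time" to show up.)
    if len(crawled) < crawled_fraction_needed * len(pages):
        return None
    good = sum(1 for p in crawled if p["quality"] == "good")
    return good / len(crawled) >= min_good_ratio

# Example: 8 good pages recrawled, 1 low page recrawled, 1 low page not yet.
site = ([{"crawled": True, "quality": "good"}] * 8
        + [{"crawled": True, "quality": "low"}]
        + [{"crawled": False, "quality": "low"}])
print(site_passes_quality_check(site))  # prints True (8/9 good >= 0.8)
```

Note how the low-priority, low-quality pages being crawled last is exactly what would delay the final verdict in a model like this.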
...99% of most sites are content (what a gross misconception or imprecise exaggeration)
potentially what changes to make to avoid getting hit by it