"We deeply care about the people who are generating high-quality content sites, which are the key to a healthy web ecosystem," Singhal said.
"Therefore any time a good site gets a lower ranking or falsely gets caught by our algorithm - and that does happen once in a while even though all of our testing shows this change was very accurate - we make a note of it and go back the next day to work harder to bring it closer to 100 percent."
"That's exactly what we are going to do, and our engineers are working as we speak building a new layer on top of this algorithm to make it even more accurate than it is," Singhal said.
[wired.com...]
Odd characters are being inserted into the URLs that Googlebot tries to crawl, causing those requests to return 404. For example: www.example.com/&837262intendedpage.htm. I don't know how to eradicate these if they are being "discovered" on search portals.
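One way to mop these up, if the junk always takes the same shape, is to 301 them to the clean URL at the server. Here's a minimal Python sketch, assuming the pattern is an ampersand plus digits glued onto the front of the real filename, as in the example above (the regex and the canonical_path name are my own illustration, not anything Google- or server-specific):

import re
from typing import Optional

# Assumed junk pattern: "&" followed by digits prepended to the real
# filename, e.g. /&837262intendedpage.htm -> /intendedpage.htm
JUNK_PREFIX = re.compile(r"^/&\d+(?=[A-Za-z])")

def canonical_path(path: str) -> Optional[str]:
    """Return the cleaned path when the junk prefix is present, else None."""
    cleaned = JUNK_PREFIX.sub("/", path, count=1)
    return cleaned if cleaned != path else None

# A redirect handler could 301 requests like these to the clean URL.
assert canonical_path("/&837262intendedpage.htm") == "/intendedpage.htm"
assert canonical_path("/intendedpage.htm") is None

If the cleaned path still doesn't resolve to a real page, letting the request 404 (or serving a 410) is the right answer; Googlebot will eventually drop URLs that consistently return those codes.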
If receiving a bunch of those bad links is in any way tripping a ranking problem, I'd be astounded.