Forum Moderators: Robert Charlton & goodroi
One site has not had any updates for over a year and has brief, one- or two-line stories that are badly written and often badly spelled, yet it has a higher PageRank.
Does Google have a way of screening writing for quality? This may be hard, but surely it's easy enough to see when spelling is all over the place?
Should Google be concerned about the quality of writing? A spelling mistake doesn't automatically make a page or a site bad, but repeated mistakes suggest a site isn't being put together with care.
However, if it is not used now, you can bet Google is planning for it in some way in the future.
According to who?! You?!
That's absolute nonsense and not based on any type of reality.
As pointed out above, most of the world does not speak English. And even those who know it do not speak or write it well or correctly.
And in most instances, good grammar has little to do with the usefulness of a page to the searcher. (Where are all the cries of "Google cares about its searchers"?)
I see well-constructed gibberish pages ranking well all the time.
And again, tell me how well constructed the grammar is on this page:
>> [google.com...]
lol. The people who speak like the above page are the same people who grew up inventing functional forms of speech which never equated to good grammar. (or even words that "make sense")
It'd be neat, though, for G's results page to add an "S" count next to the page file size in the SERPs where "S" stands for (probable) number of spelling mistakes. Then the user, if they give a damn, can skip it.
Example
Google Webmaster CentralGoogle's blog for webmasters. The latest news and info on how Google crawls and indexes websites. Webmaster tools (including Sitemaps) ...
www.google.co.uk/webmasters/ - 6k - S16 - Cached - Similar pages
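The suggested "S" count could be approximated with nothing fancier than a dictionary lookup over the snippet text. This is only a sketch of the idea, assuming a toy word list (a real checker would load a full dictionary); the `s_count` name and the min-length filter are my own inventions:

```python
import re

# A tiny stand-in dictionary; a real checker would load ~100k words.
DICTIONARY = {
    "google", "webmaster", "central", "blog", "for", "webmasters",
    "the", "latest", "news", "and", "info", "on", "how", "crawls",
    "indexes", "websites", "tools", "including", "sitemaps",
}

def s_count(snippet: str) -> int:
    """Count words not found in the dictionary (probable misspellings).

    Single letters (e.g. the 's' left over from "Google's") are ignored,
    since they are rarely genuine misspellings.
    """
    words = re.findall(r"[a-z]+", snippet.lower())
    return sum(1 for w in words if len(w) > 1 and w not in DICTIONARY)

snippet = ("Google Webmaster Central Google's blog for webmasters. "
           "The latest news and info on how Google crawls and indexes websites.")
print(s_count(snippet))       # 0 - clean snippet
print(s_count("breif news"))  # 1 - "breif" is flagged
```

Of course, this is exactly where the noise problem discussed below bites: every word missing from the dictionary counts as a "mistake", whether it is one or not.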
First they need to figure out how to deal with spammers and scrapers. If and when that happens, they can take the quality control to a new level.
Don't hold your breath.
p/g
2) is it possible to tell poorly written text versus well-written text from on-page/textual factors alone?
short answer: no.
long answer: they are so far from being able to do this, that it's not even worth talking about.
Let's take spelling and grammar. When you run a spellchecker, how many times do you hit "add" or "ignore"? I won't labor this point, but in short, it would be no use: way, way too much noise.
As for grammar, that's basically future technology, at least with the kind of accuracy that a search engine would need. It's the realm of science fiction. For one thing, parsers are computationally expensive, really expensive. But that's not even the biggest problem. There are at least three limits on parsing: first, long, convoluted sentences with multiple embedded clauses such as the one you're reading now (no parser I know of would manage it), second, slang and very abbreviated text, and third, real spoken language. So when the parser returns junk, what do you do?
Ebonics demonstrates another problem: the task of hard-coding a single correct syntax across a large, diverse user base. In terms of quality, trends change over time. What a 70-year-old considers high-quality writing and what a 40-year-old considers high-quality are probably two different things, and even then would vary by region, SES, and demographic.
Google seem to pay attention to some on-page text factors, but it wouldn't be smart to invest too heavily in this - at least as far as "quality" is concerned - because the returns just wouldn't be there for them. Not in 2007, anyway.