Forum Moderators: Robert Charlton & goodroi

Does Google care about quality of writing?

ToyTalk

3:24 pm on Sep 14, 2007 (gmt 0)

10+ Year Member



I have noticed several sites in the same area I work in with higher page ranks than mine, despite being littered with spelling mistakes, badly written sentences, and sometimes plain gibberish.

One site has not been updated in over a year and has brief one- or two-line stories that are badly written and often badly spelled, yet it has a higher page rank.

Does Google have a way of screening writing for quality? This may be hard, but surely it's easy enough to see when spelling is all over the place?

Should Google be concerned about the quality of writing? A spelling mistake doesn't automatically make a page or a site bad, but repeated mistakes suggest a site isn't being put together with care.

whitenight

10:04 pm on Sep 16, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



However if it is not used you can bet Google is planning for it in some way in the future

According to who?! You?!

That's absolute nonsense and not based on any type of reality.

As pointed out above, most of the world does not speak English. And even those who know it often do not speak or write it well or correctly.

And in most instances, good grammar has little to do with the usefulness of a page to the searcher. (Where are all the cries that Google cares about its searchers?)

I see well-constructed gibberish pages ranking well all the time.
And again, tell me how well constructed the grammar is on this page:

>> [google.com...]

lol. The people who speak like the above page are the same people who grew up inventing functional forms of speech that never equated to good grammar (or even words that "make sense").

potentialgeek

6:14 am on Sep 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Automated spellcheck shouldn't knock Google offline. They can do that today. Grammar? Another 10 years.

It'd be neat, though, for G's results page to add an "S" count next to the page file size in the SERPs where "S" stands for (probable) number of spelling mistakes. Then the user, if they give a damn, can skip it.

Example:

Google Webmaster Central
Google's blog for webmasters. The latest news and info on how Google crawls and indexes websites. Webmaster tools (including Sitemaps) ...
www.google.co.uk/webmasters/ - 6k - S16 - Cached - Similar pages
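A minimal sketch of how such an "S" count might be computed — a plain dictionary lookup over a snippet. The word list and the `s_count` function are my own illustration, not anything Google actually does:

```python
import re

# Tiny illustrative word list; a real system would use a full lexicon.
DICTIONARY = {"the", "latest", "news", "and", "info", "on", "how",
              "google", "crawls", "indexes", "websites", "webmaster",
              "tools", "including", "sitemaps", "blog", "for", "webmasters"}

def s_count(text: str) -> int:
    """Count words not found in the dictionary (probable misspellings)."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(1 for w in words if w not in DICTIONARY)

snippet = "The latest newz and info on how Google crawls and indexs websites."
print(s_count(snippet))  # → 2 ("newz" and "indexs" are out of vocabulary)
```

The number printed is exactly the "S16"-style figure proposed above, just computed over one snippet instead of a whole page.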

First they need to figure out how to deal with spammers and scrapers. If and when that happens, they can take the quality control to a new level.

Don't hold your breath.

p/g

callivert

7:13 am on Sep 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There are two different questions mixed in here.

1) Is it possible to tell the difference between auto-generated trash and something written by a human?
The short answer: yes.
The long answer: it's a cat-and-mouse game (yes, another "arms race") between spamming technology and anti-spamming technology.

2) Is it possible to tell poorly written text from well-written text using on-page/textual factors alone?
The short answer: no.
The long answer: they are so far from being able to do this that it's not even worth talking about.
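To make question 1 concrete, here is a toy version of the statistical approach: score text by how likely its character bigrams are under a model of normal English. The training sample and threshold-free scoring below are my own illustration of the idea, not any real anti-spam system:

```python
from collections import Counter
import math

# Character-bigram model "trained" on a tiny English sample; a real
# detector would train on a large corpus.
SAMPLE = ("the quick brown fox jumps over the lazy dog and then runs back "
          "home to sleep while the cat watches quietly from the window")

def bigrams(text):
    t = "".join(c for c in text.lower() if c.isalpha() or c == " ")
    return [t[i:i + 2] for i in range(len(t) - 1)]

COUNTS = Counter(bigrams(SAMPLE))
TOTAL = sum(COUNTS.values())

def avg_log_prob(text):
    """Average log-probability of the text's bigrams under the toy model.
    Lower (more negative) scores suggest gibberish."""
    grams = bigrams(text)
    if not grams:
        return float("-inf")
    # Add-one smoothing so unseen bigrams don't zero out the product
    # (27 * 27 = possible pairs of 26 letters plus space).
    return sum(math.log((COUNTS[g] + 1) / (TOTAL + 27 * 27))
               for g in grams) / len(grams)

print(avg_log_prob("the dog runs home"))     # closer to zero: English-like
print(avg_log_prob("xq zvw kjq ppft grxx"))  # much more negative: gibberish
```

This is also why it's a cat-and-mouse game: a spammer who generates text from the same kind of language model slides right under this test, which is what "well-constructed gibberish" means.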

Let's take spelling and grammar. When you run a spellchecker, how many times do you hit "add" or "ignore"? I won't labor the point, but in short it would be no use - way, way too much noise.
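The noise problem is easy to demonstrate: a dictionary check flags every legitimate word it has never seen. The word list and sentence below are my own illustration:

```python
# A perfectly well-written sentence, full of the technical terms and
# proper nouns any webmaster page contains.
ENGLISH = {"the", "crawler", "fetched", "our", "pages", "via", "and",
           "listed", "them", "in", "results"}

sentence = ("The Googlebot crawler fetched our AJAX pages via sitemap.xml "
            "and listed them in SERP results")

flagged = [w for w in sentence.lower().split()
           if w.strip(".,") not in ENGLISH]
print(flagged)  # → ['googlebot', 'ajax', 'sitemap.xml', 'serp']
```

Every "misspelling" flagged here is a correctly spelled term - exactly the add/ignore clicks described above, multiplied across billions of pages.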
As for grammar, that's basically future technology, at least with the kind of accuracy a search engine would need. It's the realm of science fiction. For one thing, parsers are computationally expensive - really expensive. But that's not even the biggest problem. There are at least three limits on parsing: first, long, convoluted sentences with multiple embedded clauses, such as the one you're reading now (no parser I know of would manage it); second, slang and heavily abbreviated text; and third, real spoken language. So when the parser returns junk, what do you do?

Ebonics demonstrates another problem: the task of hard-coding a single correct syntax across a large, diverse user base. Quality standards also shift over time. The style of writing a 70-year-old considers high-quality and what a 40-year-old considers high-quality are probably two different things, and even then they would vary by region, socioeconomic status, and demographic.

Google seems to pay attention to some on-page text factors, but it wouldn't be smart to invest too heavily in this - at least as far as "quality" is concerned - because the returns just wouldn't be there for them. Not in 2007, anyway.

Bewenched

1:20 pm on Sep 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm going to have to say NO. I've seen pages rank #1 for a term that appears once on the page and is surrounded by complete gibberish.

callivert

9:11 pm on Sep 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm going to have to say NO

I said it was possible, not easy.

cabbie

10:26 pm on Sep 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have a "well constructed gibberish" site, full of hundreds of gibberish pages for individual keywords, that has been in the SERPs for over 4 years now.
Anyone reading it will see it's gibberish, but its rankings keep improving.
I tell you, Google is not for white hat webmasters.
Why bust your guts making a squeaky clean website, only to see it disappear for no obvious reason?
Better just to throw heaps of mud and be secure in the knowledge that some will stick.