
Forum Moderators: Robert Charlton & andy langton & goodroi


Keywords - Outdated or Still Critical?

     
4:21 pm on Dec 29, 2016 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3237
votes: 195


The last couple of weeks there has been some debate over keywords. Are they still critical to your SEO success, or are they deprecated? From my perspective, it depends.

Too many people are using outdated thinking when it comes to keywords. They think you need to mention the specific keyword that searchers are typing into Google a certain number of times to perform well. This thinking is too close to the outdated keyword density metric, which is no longer a direct ranking factor but was important about 15-20 years ago (yikes, I've been doing this too long). I am not saying you should create content that never uses the keyword the searcher typed in. Rather, you should supplement the keyword with relevant synonyms and concepts as you provide a more comprehensive answer with significant value. One test I like to use when evaluating new content writers is to delete the specific keyword, then read their sample article. If I can still understand the article, then the writer has done a good job of explaining the article's idea without overusing a keyword. Users & search engines don't like keyword stuffing.
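For illustration, the old keyword density metric mentioned above is trivial to compute, which is partly why it was such a crude signal. A minimal Python sketch (the function name and sample text are invented for this example, not from any SEO tool):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that exactly match `keyword` --
    the crude 15-20-year-old metric, not a modern ranking factor."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

stuffed = "Widgets are great. Buy widgets here. Our widgets beat other widgets."
# 4 of the 11 words are "widgets" -- a density around 36%
```

Chasing a number like this is exactly the outdated thinking described above; the delete-the-keyword reading test is a far better editorial check.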

If your content can be summarized into a single sentence, you do not have sustainable content. Sooner rather than later Google will figure out that single-sentence answer and display it in the SERPs, likely making your content irrelevant. This is why you want to focus on substantial content that has real value, but I am going off on a tangent and should bring this back to keywords.

So when are keywords still critical? Keyword data lets you identify concepts & ideas to better direct your content creation. That is super important. Thoughtfully placing keywords can also help improve your usability. Properly placed keywords enhance the scent of information for users. Google isn't ignoring h tags or title tags, so some shrewd keyword handling can be beneficial.

How are you using or no longer using keywords in your SEO efforts?
4:33 pm on Dec 29, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member piatkow is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 5, 2006
posts:3382
votes: 36


As I understand things, the "keywords" meta tag is redundant as far as the major search engines are concerned.

Google looks at content so the words that people will search on do need to appear on your pages. For me the first pass of content is always purely for the reader. I may go back and tweak text to include the occasional synonym or make sure that a vital search term is included.
4:50 pm on Dec 29, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:2540
votes: 223


search engines don't like keyword stuffing


Sorry, I have to disagree, again.

I see this ALL the time in my widget SERPs from GooYahBing, and especially in image search, where the results can be woeful, absolutely dire and totally inaccurate.

Yes, I can compete with my accurate, honest and complete information, but the fact is I see, every day, almost identical pages from competing manufacturers within my industry stuffed full of keywords and nothing else of any value. Most of these are generated in India and seem to be impervious to any sort of penalty when they are clearly simply spamming the SERPs.

The thing is, I know that trade users take one look at those pages and know they're spam; how come the search engines or manual evaluators do not?
12:13 am on Dec 30, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 7, 2006
posts: 957
votes: 63


Google looks at content so the words that people will search on do need to appear on your pages.


Whilst I am in broad agreement, I have frequently seen page one results that didn't contain a search-term at all. However, this is probably sector-dependent: you might say Life Assurance instead of Life Insurance, but some products and product categories are more open than others to descriptive variation.

When I started ("yikes, I've been doing this too long" too), part of the art was in anticipating what search terms your potential clients were likely to use, and giving those prominence (in title, description, h1, content...), and while the ground has certainly shifted, I'm not convinced that the title of a page about widgets should no longer be Widgets: I don't personally interpret "content is king" to mean "verbose content is king", and even Quantum Mechanics can have a two-word title.

On the other hand, I think we can take Google's stated aim of matching user intention with outcome at face value, and I have certainly had the problem as an end-user of not knowing the technical name of the thing I want to find. In that scenario, which I am sure many users often face, keywords are useless: descriptions or synonyms are all we have. Content should allow for non-specialist searchers, and placing exclusive reliance on keywords doesn't do that. The fact is that Google has become better at matching inexact searches to relevant content. You can call a spade an earth-inverting horticultural implement without ending up in the wrong department.

The question here presents this issue as a dipole, when it isn't one. Are keywords outdated? No. Are they still critical? No. They are still important, but not as important as they once were.
2:33 am on Dec 30, 2016 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1028
votes: 213


Keywords - Outdated or Still Critical?
OUTDATED!

Keywords have been dead for years. What appears to be keywords working (as per RedBar above) is an artifact of their replacement. :) If you are still running keyword-anything tools you should have stopped yesteryear; if your fundamental SEO mindset is keyword driven you need a reset, an intervention asap.

The driving impetus since mid-2010 when Google acquired MetaWeb has been named entities.
Note: John Giannandrea, who founded MetaWeb, became head of search at Google last spring. I was flabbergasted at how few people took heed and connected the dots down the years, given such a huge hint.

The reason that keywords seem to still be 'working' in some/many instances is that keywords are a subset of entities. Simply put, as:
all cognac is brandy but not all brandy is cognac
so:
all keywords are entities but not all entities are keywords
By chasing keywords one is chasing a continually diminishing subset.

Of course that is not all there is to it. It's an entire rather fascinating field to research. Take a good hard look at all those 'knowledge panels', read about the knowledge graph, etc. et al. Yup, all courtesy of MetaWeb and named entities. Take another look at RankBrain. Yup. Now keep right on looking. There is more, lots more.

And for pity's sake raise your eyes up from keywords. There is an entire world past them you've been missing.
6:31 am on Dec 30, 2016 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5703
votes: 77


an earth-inverting horticultural implement


Actually, that should be a manually operated earth-inverting horticultural implement. ;)

raise your eyes up from keywords


Google might not "think" so much by keywords these days but they're still what users use to search with.

I'm wishing for a clearer understanding of what exactly to look at past keywords, and more importantly, what to DO about it. I'd like to hear more about what's actionable in all of this!

Side comment: I'm not convinced about how well Google really understands meaning, nuance or user intent, if the broad matching or keyword suggestions within AdWords are any indication.
6:34 am on Dec 30, 2016 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14476
votes: 327


I proposed the rethinking of keywords in September 2015, where I pointed out that the old way of keyword SEO was over. Take a look at all the folks arguing against the idea, still clinging to their keywords. 2016 was the year the SEO industry FINALLY started the process of rethinking how keywords are used.
2017 will see the SEO industry focusing on how Google really ranks sites, which has less to do with links than you may think.

https://www.webmasterworld.com/google/4767412.htm [webmasterworld.com]

I'm not proposing being ignorant of what keywords are being used. But I do believe we should retire the practice of beginning and ending with keywords. That's a strategy for a search engine that no longer exists. In fact, that strategy was born before Google existed. Think about that.

I am saying we should step back and consider adding in thinking in terms of concepts and user experience to the way we create a site and individual pages.

We don't need to focus on keyword phrases. Time to put a fork in it. The search engines don't care anymore. So why are SEOs still obsessed about whether their keyword is in the title tag or the H1?


Given the fact of how the SERPs look, I am questioning the wisdom of strategizing strictly in terms of keywords when Google consistently ranks sites that do not even feature the entire keyword phrase. It's time to look elsewhere because the classic SEO strategy is threadbare, dusty and expired.


Someone asked about site architecture constructed around keyword phrases. But the more I think about it, the more this tactic resembles spam. The tactic I'm referring to is the one of creating a keyword pyramid with the big-traffic keyword at the home page (top of the pyramid) and longer keyword phrases at the base of the pyramid, generally located several clicks away from the home page.

Is it time to retire Keywords as a way of organizing a web site?


I'm not saying we should abandon keywords. However, since Google is ranking pages according to the meaning of the page, because it understands what the page is about, instead of simply pattern matching, doesn't it make sense to at least stop and think of the implications?


Well... consider this. In May 2012 Google announced the Knowledge Graph. They said it was the first step in an overhaul of how Google presents the SERPs. Moving away from text strings (simple pattern matching of text in the user query matched to text on the web pages) and moving forward to things: people, places, things and all the objects that modify them. The Knowledge Graph was one small part. Search was the next shoe to drop.

In 2013 the other shoe dropped: the Hummingbird update introduced the ability to identify meanings. This was the first time the algo attempted to understand meanings. This goes way beyond keywords and demands that we think in terms of meanings, especially as they relate to user intent. Meanings, user intent and topics.

Google warned the SEO community they were moving away from strings of text in 2012. In 2013 Google announced that they had moved away from strings of text; it was done. So why in 2015 are we still hooked on a strategy that is more or less (more than less, I think) obsolete?

It could be said (and I am the one saying it) that the date Hummingbird was released in 2013 is the official date of the death of SEO strategies that begin with keywords.
6:50 am on Dec 30, 2016 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:7519
votes: 506


Keywords will remain a part, but instead of 200 points of reference it is more likely there's 2,000 points of reference these days. Only thing I know that works for ME is to never OVERDO anything ... natural keywords, associated content, clean, neat presentation, avoiding excessive bells and whistles and (and this is for my content sites might not work for yours) avoiding js if at all possible, no third party widgets (fonts, etc), and nailed down site security.

But there is no doubt that keywords are still important ... else how can a user find your content? When (using the above example) a spade can also be a shovel, use both, and if necessary hand or long handle, foot, square, rounded, narrow, steam, tons or ounces, etc. Every one of those is a keyword AND a synonym and can aid in placing your content in a query.
7:50 am on Dec 30, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Sept 7, 2006
posts: 957
votes: 63


all cognac is brandy but not all brandy is cognac


All keywords are words, but not all words are keywords.

What constitutes a keyword is a matter of definition. Is there anyone who thinks that generic names for the products common to their industry are unimportant? If you want to summarise "what this page is about" you have to accept that it requires fewer words than the whole page, and that the words that you end up with carry greater weight than those you leave out.

While the "keywords" meta-tag is certainly outdated - it has been for more than a decade - it doesn't mean you should forget about what the people who want to find your page are going to put in the search bar, or how search-engines will process it.

Out of interest, has anyone tried indexing two pages with identical word-content, but with one in random word-order (including title, h1, etc.)?
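For anyone curious to try that experiment, producing the scrambled variant is simple. A sketch (function name and seeding choice are mine, not from any published test):

```python
import random

def shuffle_words(text: str, seed: int = 42) -> str:
    """Return a page variant with identical word content in random order,
    for the two-page indexing experiment described above."""
    words = text.split()
    random.Random(seed).shuffle(words)  # seeded so the variant is reproducible
    return " ".join(words)

original = "quantum mechanics explained for the non-specialist reader"
scrambled = shuffle_words(original)
```

Both versions contain exactly the same multiset of words; only the order (and hence any meaning a parser could extract) differs.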
12:16 pm on Dec 30, 2016 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3237
votes: 195


@wilburforce If you want to have fun let's do a test right here on WebmasterWorld. Submit two threads to this Google SEO forum with your identical content but in different order. I'll lock both threads to remove the variable of user comments impacting the rankings. Then we can sit back and see how Google handles the pages.
2:03 pm on Dec 30, 2016 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3237
votes: 195


Here is an old keyword stuffing test that I can share [google.com...]
10:45 pm on Dec 30, 2016 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Mar 30, 2005
posts:12929
votes: 200


I'm spending more time on audiences (both for PPC and SEO) than keywords by a longshot.
3:29 am on Dec 31, 2016 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:8361
votes: 340


When writing detailed & engaging content, natural & relevant keywords & phrases will occur. Google rewards naturally occurring keywords & phrases in the SERPs.

It's the artificial keyword stuffing that has been deprecated.
3:32 pm on Dec 31, 2016 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:May 1, 2005
posts:402
votes: 1


I know someone who works for a well-known UK computer magazine with a good reputation in the industry.
They clearly stuff their online content with keywords and still play such tricks as using alt tags like "iphone 6 release date" on images totally unrelated to the iPhone 6. Obvs it's not their only reason for ranking VERY well, but it seems to be working!
4:04 pm on Dec 31, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 829
votes: 210


Obvs it's not their only reason for ranking VERY well but it seems to be working!

They may also be ranking despite these practices, and might even rank better if they avoided them.

My site ranks for keywords that do not appear anywhere on the site. Those "keywords" are relevant to the topic of my site, but they are not synonyms of other terms on the site.
8:04 am on Jan 2, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 16, 2005
posts:2637
votes: 80


There is an amazing amount of misinformation about. I just read an article by someone who used a title like "widgets in UK" instead of "widgets in the UK" because their webmaster advised them that the former was better SEO. I am pretty sure Google would have no trouble matching a search for "widgets uk" or "widgets in uk" or "uk widgets", and the incorrect English may make the page look lower quality, as may starting the article with an explanation of this rather than actual content.

From my own everyday searches it is very clear that Google has no trouble understanding common abbreviations and synonyms.
1:23 pm on Jan 2, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3125
votes: 212


graeme_p -- sometimes you have to adjust your page titles to try to counteract google's habit of changing them for display in the SERPs to something that misleads searchers and reduces your traffic.
2:32 pm on Jan 2, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14476
votes: 327


graeme_p makes an interesting observation. As I stated in 2015, Google is trying to understand the words and the concepts they are describing. That's one of the ways Google is ranking sites that do not feature the entire keyword phrase on the web page.

Words like the, a, etc. can be used by Google to help their algo understand the context of a word. There's a world of difference if a word is used as a noun or an adjective. In my opinion, being grammatically correct is part of the new SEO.

Removing stop words like "the," "a," and "and" is a practice carried over from the old SEO, when SEO was about matching search terms to keyword phrases on the web page. Think AltaVista.

In 2015 I stated that user experience is high on Google's list. Since then Google has instituted a number of algorithmic changes with the purpose of improving the experience of their users, such as the mobile index etc. Good grammar can be considered part of the user experience. Easy-to-read text can be considered part of the user experience. Easy-to-understand text can be considered part of the user experience. All of that can lead to users liking your site, staying on your site, recommending your site, etc.

As I stated in 2015, anything that rings the bells for the user is likely going to ring the bell for Google.
3:00 pm on Jan 2, 2017 (gmt 0)

Full Member

10+ Year Member

joined:July 26, 2006
posts:298
votes: 9


The great Tedster said it's the "SEMANTICS" of the site, not a keyword on a page. God bless him, RIP.
5:24 pm on Jan 2, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14476
votes: 327


I get how what I wrote could read like pie in the sky nonsense, I get it. However my focus was on reading where the algorithm was headed. It was an accurate breakdown of where the algorithm was in 2015, suggesting a break from the brainless synonym spamming (which in itself was a continuation of old school keyword term spamming). I suggested brand new SEO strategies to adapt to how keywords were being used and the part about User Experience extrapolated how that might play out in their algorithms for ranking purposes.
9:00 pm on Jan 2, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1028
votes: 213



I get how what I wrote could read like pie in the sky nonsense, I get it.

I often agree with your overarching thoughts but nitpick on the details. :) Neither then nor now did it read as 'pie in the sky nonsense'. The main problem I have, then as now, is that you, granted far less than most, refuse to let keywords go. To use an analogy: back in the beginning as children we learned our alphabet, each alpha-character had great importance. Then came words. The current keyword fixation is as if we hung onto our alpha-characters and refused to learn words.

From the very beginning - perhaps because I began on the web before AltaVista, let alone Google - I have let the subject of the site determine the architecture. Every subject has its own internal logic and natural divisions. By adding in the initial and probable subsequent audiences' prior (print, at the time) interactions, i.e. term differentiation, and a reasonable knowledge of synonyms, the content just naturally included all/most of the keywords that everyone was soon chasing via tools. The SEO approach to content was/is more an attempt to find (short-term) shortcuts to site content structure than anything else. And it mostly results in a horrible site architecture.

Since 2010 there has been an increasing use of natural language processing and an accompanying decrease in the simpler index (keyword) term matching. An early sign of this, generally missed or misunderstood, was that so-called stop words were no longer excluded. Also that word position began to change meaning and not just importance. Finally, as you mention, RankBrain came out into the open. Yet many/most webdevs and SEOs still don't seem to grasp the degree of change of the past 5+ years, let alone what those changes might be.

A lot of the friction is due to tool providers; many tools are based on keywords, so there is an economic, luddite incentive to maintain the importance and viability of keywords. Some of the articles attempting to explain the new while hyping the continuing relevance of the old are pure black humour. For the few of us who get the joke.

I'm very fortunate; fortunate in a long past experience with the web that puts events/changes in perhaps a better perspective than many, fortunate to have sites that are pretty much complete so no longer needing build time, fortunate to have a programming and database design background to bolster my webdev, fortunate to have a B&M sales/marketing background to bolster web success, etc. et al.

Where many/most webdevs have to keep their noses to the grind of building, which I know is one very deep and hyper-focussed rut (I did 5+ years of 100+ hour weeks on mine initially) I get to play out on the cutting and occasionally the bleeding edge. On things like visitor IDing and personalised contextual delivery, Long Short-Term Memory (LSTM) neural networks for analytics and site search. I now play in a different ballpark than most webdevs. However, without a solid understanding of the basics such as the shift from keywords to named entities I would be building on shifting sand as much as everyone else.

So, I get perturbed when hype and fad, myth and fallacy get spread as gospel; when fundamental change is missed, misunderstood, misapplied, or outright denied. The one thing that has been constant since day one of the internet, let alone the web, is change. To pick up on it, characterise it correctly, then implement it successfully is a regular frequent webdev necessity. And extremely hard to accomplish when the information net one relies on is skewed for some reason.

To martinibuster (and everyone else at WebmasterWorld) please keep up the good work and providing your opinion. Even where I disagree I value your commentary; at the very least it makes me think, reassess my own understanding. Thanks much.
4:01 pm on Jan 3, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:July 23, 2015
posts:206
votes: 61


@goodroi,

Absolutely critical.

Real-life example. I have a competitor who gets 60K/day visitors where we are struggling to get over the Google animals' death grip at 1.5K/day in a specific niche, because they are #1 for those specific KWs. Not that they are any better than us, maybe in some respects. I can guesstimate how much more money they are making off this. Probably in the neighborhood of $10 MILLION per year.

I am with tangor on this:
>> @tangor: Keywords will remain a part, but instead of 200 points of reference it is more likely there's 2,000 points of reference these days.
4:07 pm on Jan 3, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:July 23, 2015
posts:206
votes: 61


The main reason it SEEMS like Google has an algo that's less reliant on KWs is the thousands of SERP checkers they have working on analyzing and ranking results. Same for Bing.

The humans are the ones ranking sites with better usability and more readable text - and also "nicer looking" - higher. Which is a part of the algo.

Doesn't mean KWs are ever going to be dead.
4:54 pm on Jan 3, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 829
votes: 210


@smilie how do you know this about your competitor? How do you know they get 60k/day? How do you know they are #1 for those keywords? What keywords: "Widget", "blue widget", "buy blue widgets", "Best widgets", "how to use widgets"?
5:22 pm on Jan 3, 2017 (gmt 0)

Administrator

WebmasterWorld Administrator coopster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:July 31, 2003
posts:12547
votes: 2


How are you using or no longer using keywords in your SEO efforts?


You didn't specifically ask for external SEO so ...

My custom Content Management System includes an on-site index and search feature. The meta keywords field is a great place for our clients to add keywords such as misspellings and any other content that they would rather not have displayed on the page, yet can be indexed. Let your mind go crazy here. The indexer will use this area, and then, in the event a visitor enters search terms that match, there you go. Oh, the really cool part is that the search terms are logged and the information can be used to enhance and improve the on-site search feature.
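A minimal sketch of that kind of on-site indexer (class and method names here are invented for illustration, not from any real CMS): pages are indexed on their visible content plus hidden meta keywords such as misspellings, and every query is logged for later analysis.

```python
from collections import defaultdict

class OnSiteSearch:
    """Toy on-site search: index visible content plus non-displayed
    meta keywords (e.g. misspellings), and log every query."""

    def __init__(self):
        self.index = defaultdict(set)   # term -> set of page URLs
        self.query_log = []             # every search term entered by visitors

    def add_page(self, url, content, meta_keywords=()):
        for term in content.lower().split():
            self.index[term].add(url)
        for term in meta_keywords:      # indexed but never displayed on the page
            self.index[term.lower()].add(url)

    def search(self, query):
        self.query_log.append(query)    # logged to improve the index later
        terms = query.lower().split()
        hits = [self.index[t] for t in terms if t in self.index]
        return set.intersection(*hits) if hits else set()
```

The query log is the "really cool part" described above: reviewing what visitors actually typed tells you which misspellings and synonyms to add to the meta keywords next.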

Does it help with external SEO? :)
5:48 pm on Jan 3, 2017 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Sept 12, 2014
posts:377
votes: 64


Well, google gives different results for geo-house and geo-home so they haven't quite figured out the meaning of some very basic words yet or that some words mean the same thing. I think it is not yet time to abandon keywords.
6:10 pm on Jan 3, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14476
votes: 327


The humans are the ones ranking sites with better usability and more readable text - and also "nicer looking" - higher


No. No. Absolutely not.

Here's what the quality raters are about: aside from quality-control purposes, rating how well the algorithm is doing so it can be adjusted if there's a bug, there is another purpose, directly related to how the algorithm becomes a quality rater itself. The humans generate data for training the algorithm. The algorithm looks for things that are in common for sites rated highly/poorly. Those signals become what are known as classifiers. This is machine learning 101.

One of the purposes of human quality raters is to create a machine quality rater. One complication is that humans are subjective and sometimes contradict each other when rating the same site. What researchers did was take those instances, find out why the humans failed to agree, and then figure out how to make the machine do it better.
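As a toy sketch of that pipeline (the features, data and numbers are entirely invented for illustration; Google's real signals and models are not public): human ratings become training labels, page features become inputs, and a simple perceptron learns to imitate the raters.

```python
def predict(w, b, x):
    """Sign of the learned score: +1 predicts 'rated good', -1 'rated bad'."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (features, human_label) pairs, label +1 or -1."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            if predict(w, b, x) != y:    # misclassified: nudge toward the human label
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# hypothetical features: (keyword density %, readability score 0-10),
# with labels supplied by human quality raters
ratings = [
    ((40, 2), -1),  # keyword-stuffed, hard to read -> rated low
    ((35, 3), -1),
    ((5, 9), +1),   # natural wording, readable -> rated high
    ((8, 8), +1),
]
w, b = train_perceptron(ratings)
```

Once trained, the weights act as the "machine quality rater": new pages are scored without any human in the loop, which is the point being made above.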

Everyone who does SEO must know this; otherwise one risks creating a speculative reality-bubble about SEO that has nothing to do with how Google actually ranks websites.
7:38 pm on Jan 3, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:July 23, 2015
posts:206
votes: 61


>> @NickMNS: how do you know this about your competitor? How do you know they get 60k/day?

Because we do competitor research. Because we are technically top 15 in a HIGHLY competitive niche where we have to know what our competitors are up to.

>> @NickMNS: How do you know they are #1 for those keywords?

Hmm. Let's see: we are selling "widgets". We sell all of them: "green widgets", "blue widgets", "special widgets" etc. Among these, the keyword "widgets" is very generic. So there are niches. Let's say one of them is "green widgets". That niche alone, being one of the more trafficked, with all of its sub-keywords ("green home widgets", "commercial green widgets", "specialty green widgets", "green widgets guide" etc.) gives them that many visitors, as they are mostly #1 for a bunch of them, at least 10 I monitor.

[edited by: smilie at 7:46 pm (utc) on Jan 3, 2017]

7:42 pm on Jan 3, 2017 (gmt 0)

Full Member

Top Contributors Of The Month

joined:July 23, 2015
posts:206
votes: 61


>> @martinibuster: No. Absolutely not.

You are going to tell me you have a way to identify usability and "prettiness" of a site via a robot?
Let's discuss our new, Google-beating startup in PM.

Because what you are describing is machine learning of KNOWN cases. You can't possibly apply it to every situation. Impossible, with all the "machine learning"; it is mathematically a subset.

This is why self-driving cars are going to fail in this iteration. Impossible to predict all scenarios. Once a few people are killed by them, there will be laws preventing self-driving cars with no human drivers. Sure, you could sell a few IPOs and make billions and such. But the laws of physics (and the legal system) are against them. So the way G is doing it is spawning a startup, pretending it's not them, and making a run for commercial and military niches, where the military couldn't care less if they kill a few people.

[edited by: smilie at 7:51 pm (utc) on Jan 3, 2017]

7:47 pm on Jan 3, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:7519
votes: 506


AI of any kind is not ready for prime time, and it might be more than a few years just to get the thing to work. Not holding my breath on this one.

G's machine learning falls into this category, and I am happy they continue to insert human corrections as necessary.
This 53 message thread spans 2 pages: 53
 
