Forum Moderators: Robert Charlton & goodroi


Keeping Up With Google's Evolving Technology


goodroi

8:55 pm on Jan 9, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



If you are struggling with Google SEO, it might be because you haven't kept up with Google's evolving technology. You don't want to be using outdated SEO techniques. We all know that stuffing the meta keyword field was a good idea 20 years ago, but now it's worthless. Have you kept up with more recent technology changes?

Have you looked at [cloud.google.com...] ? This online tool is not Google's secret search ranking formula, but it is a window into Google's technical capabilities & mindset. See how Google uses NLP & AI to recognize that different words in different combinations & orders mean different things, and which things are relevant & connected? Again, this tool is not Google's secret search formula, but it is a great reminder of how much things have evolved & a peek into Google's mindset.
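The point about words in different combinations & orders can be seen with even a toy example: a bag-of-words view (roughly the level old keyword matching worked at) can't tell two sentences apart, while a view that keeps word order can. This is a self-contained illustration only, not Google's actual method:

```python
from collections import Counter

def bag_of_words(text):
    """Unordered word counts -- blind to word order."""
    return Counter(text.lower().split())

def bigrams(text):
    """Ordered word pairs -- sensitive to word order."""
    words = text.lower().split()
    return Counter(zip(words, words[1:]))

a = "man bites dog"
b = "dog bites man"

# Identical bags of words: a pure keyword view sees no difference.
print(bag_of_words(a) == bag_of_words(b))  # True
# Different bigrams: order-aware features tell the meanings apart.
print(bigrams(a) == bigrams(b))            # False
```

Modern NLP goes far beyond bigrams, of course, but the gap between the two results is the whole story: same keywords, different meaning.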

A new tool I've started using for some of my projects is [inlinks.net...], which is much more practical & useful. It looks at your content and pulls the basic data, but more importantly it analyzes the page from a semantic POV. It identifies content structure, potential schema opportunities, Q&As to exploit, and other things we never thought about 10 years ago. We are well past trying to stuff a keyword onto a page. Forget the simple content strategies from years ago; they are now outdated for many situations (though life is always YMMV). Today I'm seeing much more success by intertwining relevant concepts & themes that support each other with valuable content that satisfies users' intent. Anticipating what they want.
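On the schema-opportunity side, one concrete example of what this class of tools surfaces is Q&A content that could be marked up as FAQPage structured data. Here's a minimal hand-rolled sketch; the question/answer pairs are invented, and a real tool would extract them from your page:

```python
import json

def faq_jsonld(pairs):
    """Build FAQPage JSON-LD from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

pairs = [
    ("How long does shipping take?", "Usually 3-5 business days."),
    ("Do you ship internationally?", "Yes, to most countries."),
]
print(json.dumps(faq_jsonld(pairs), indent=2))
```

The resulting JSON would go inside a `<script type="application/ld+json">` tag on the page.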

IMHO we are no longer in a simple 2+2=4 SEO world and it has become much more like an ax+bx=cx/dx type of SEO world. If we don't keep up with the evolving technology we will miss the various opportunities & fall prey to the growing number of pitfalls.

NickMNS

9:15 pm on Jan 9, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This comes across as a shameless plug for inlinks.net. What is your relationship with the company/service?

goodroi

11:33 pm on Jan 9, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I'm a customer of theirs. I get no money from them. Nor am I connected with Google, though I do link to their API page, which is trying to sell their NLP technology.

My point is that the technology has changed and too many people haven't kept up with it. People are still asking me what keyword density or content length they should use. They don't understand that Google has evolved well past those simple factors. Too many people think that content creation is still a simple thing & that what they did 15 years ago is still good enough today. They complain about having self-proclaimed great content and not ranking, but the sad truth is that, too often, their "great content" is mediocre to the rest of the world.
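For what it's worth, the keyword-density question people keep asking is trivially answerable, which is part of why it stopped mattering as a differentiator. A quick sketch of the metric itself (the sample text is made up):

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` that equal `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

text = "Cheap widgets! Our widgets are the best widgets for widget fans."
print(f"{keyword_density(text, 'widgets'):.0%}")  # 27% of the words
```

Anything a dozen lines of code can compute from raw text was never going to stay a competitive advantage.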

TL;DR It would be wise to embrace technology and better understand the capabilities of your adversaries (Google or direct competitors).

RedBar

2:04 pm on Jan 10, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A couple of things, nope, three things:

1. In a global marketplace using many different languages, trying to use convoluted methods rather than the basic KISS principle is utterly pointless. My potential customers want to find the right product and specifications without some kiddie-script promotion. It's no wonder that G has not been a legitimate business driver for years. When I say business I mean genuine manufacturers, not drop-ship discounters.

2. What is G's obsession with websites full of incomprehensible cartoon drawings instead of honest, descriptive text?

3. Why do some SEOs constantly promote change for change's sake rather than evergreen? Job justification?

goodroi

3:24 pm on Jan 10, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



#1 IMHO the KISS principle is less important than Occam's Razor when it comes to SEO. As much as we may want to simplify things, the world is a lot more complicated today than it used to be. Blindly following the KISS principle might mislead a webmaster into making mistakes like thinking a single website design is sufficient for everyone including desktop, mobile & tablet users. Occam's Razor tends to deliver more pragmatic solutions. For example, instead of oversimplifying (a single fixed display for all users) or overcomplicating things (separate subdomains for mobile & desktop users), it points to efficient solutions like a single domain with a responsive design.

#3 Google has admitted to making over 3,000 changes to search last year. It would be wise for SEOs to adjust to those changes rather than ignore them. Facts can be evergreen, but internet development definitely isn't. For example, meta keywords were super important 20 years ago and now they're worthless. Simple html design was best practice & now responsive design is best practice for most sites. A few years ago relatively few sites were HTTPS & now the majority of traffic is via HTTPS.

NickMNS

4:06 pm on Jan 10, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@goodroi thanks for clarifying.
I agree technology is evolving and one needs to keep up with it; like it or not, evolution waits for no one.

I have been saying for some time now that keyword rankings are meaningless: there is no longer a 1-to-1 relationship between a keyword and a search result, and there is no way to know what the relationship is. On this point, there was a post on SERoundtable yesterday from Bing saying exactly that, while Google's position was a little more nuanced:
[seroundtable.com...]

The inlinks.net service is compelling. But two points worry me.
- First, if one uses Google's NLP demo as a guide: when I take text from my website and paste it in, the relevancy scores and the entities it discovers appear much lower than expected. But if I then look at the actual traffic, GSC keywords, and landing pages, it is absolutely clear that Google has no issue understanding the nature of my website (contrary to what the NLP tool suggests). There is no doubt that Google uses NLP, but exactly how, and in combination with what, is not clear. For a service such as the one offered by inlinks.net to be truly effective, it must have some means of determining this.
- Second, if one starts to "design" or write content to match the assumptions of a statistical algorithm (like NLP), and in turn that algorithm uses the content to train itself, then we will essentially cause the algorithm to overfit, producing garbage results and garbage content. But that state may still be a long way off.

To bring the two points together: inlinks.net states that it uses search results as a guide to determine what successful content looks like. Essentially its algorithm appears designed to make your content look more like your competitors' content, so at some point all the content will simply be a copy of itself. Basically it is a filter that produces more of the same. The service does the exact opposite of evolution: it causes stagnation.

This is a problem with machine learning in general. ML works great as long as it operates at a level where it does not affect the system it is interpreting. But if it reaches a scale where it can measurably impact the system, and the results of the algorithm are fed back into it, the system will implode.
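That feedback-loop worry can be made concrete with a toy simulation. Treat each competing page as a point on some content-feature scale; if, in every round, each page moves partway toward the average of the field (everyone imitating the ranking consensus), the spread between pages collapses toward zero. This is a deliberately crude illustration of the homogenization argument, not a model of any real ranking system:

```python
import random
import statistics

def imitate(pages, pull=0.5):
    """One round: each page moves `pull` of the way toward the field's mean."""
    mean = statistics.mean(pages)
    return [p + pull * (mean - p) for p in pages]

random.seed(0)
pages = [random.uniform(0, 100) for _ in range(20)]  # 20 "content" scores
print("diversity before:", round(statistics.stdev(pages), 2))

for _ in range(10):
    pages = imitate(pages)

# Standard deviation collapses as everyone copies the consensus.
print("diversity after: ", round(statistics.stdev(pages), 4))
```

Note that the mean never changes; only the diversity disappears, which is exactly the "more of the same" outcome described above.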

RedBar

5:06 pm on Jan 10, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm confused by your comments:
might mislead a webmaster into making mistakes like thinking a single website design is sufficient for everyone including desktop, mobile & tablet users.

And then you say:
Simple html design was best practice & now responsive design is best practice for most sites.

Errr ... I know you've heard of fully responsive HTML5.

All my sites are standalone and sufficient for desktop, mobile and tablets. I was the first in my industry globally to go HTML5; your "comments" do not match my experience whatsoever.

For those of us who have been around since before Google even began, it's been relatively easy to keep on top of what they've been doing for the last 20+ years, except for some of their totally and utterly bizarre manipulations attempting to make the real world work their way rather than reflect the actual reality of what goes on.

As ever, I look at everything from a global widget-trade point of view, so quite often this may be totally inapplicable to local and national promotional strategies in whichever country. However, for the sites I have built solely for UK companies, the same method has been equally successful.

goodroi

6:18 pm on Jan 10, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Sorry you didn't understand. HTML5 wasn't the first version of HTML, & sites designed over 5 years ago weren't using HTML5 :)

My point is still the same ... things change and it would be wise to keep up.

iamlost

10:43 pm on Jan 10, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There are three good reasons for SEOs/webdevs to research and utilise machine learning in one or another of its myriad permutations:
1. To accomplish one or more tasks.
2. To acquire a new skill/knowledge set.
3. To satisfy curiosity, to have fun.

What is not a good reason is thinking one is emulating (or can emulate) some Google behaviour or algo input. I quite expect many/most to treat the emerging publicly available ML tools as they did/do the multiplicity of keyword tools: in ignorance, with incompetence. And that goes equally for the tool hawkers, the self-marketers, the conference hype-sters...

Cynicism aside, ML can provide a valuable toolset. However, the most critical requirement/rule is GIGO (garbage in, garbage out): data can rarely be used 'as is'. And the most important skill is correctly framing the question.

And the most common mistake is an insufficiently large and diverse training dataset. Large and diverse: both are necessary.

I’ve been mucking about with NLP and ML and whatnot for two decades, actively using them for analytics analysis for 15 years, and as part of a live, real-time site backend for ten. It can be awesomely great; it can also be a yawning abyss. As the ancient cartographers noted: here be dragons.

tangor

11:00 pm on Jan 10, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Call me a Luddite if you like...

The world is not more complicated (we still eat, breathe, breed, survive, and, when possible, buy a few trinkets).

But those who would like to MANAGE the rest keep putting up roadblocks. :)

YMMV

(Life is not complicated. Information is not complicated. Chasing the almighty dollar and power for the masses is EXTREMELY COMPLICATED ... and is a human construct that does not deal with the REALITIES of life.)

KISS remains the most perfect, content remains KING, and all the rest is window dressing, managed by one set of computer-assisted ideology versus another set of computer use ... and in the middle is a smaller subset of humans making those determinations.

But it is what it is and if you want to play, you have to play by the tools/rules in current force.

YMMV

aristotle

2:05 am on Jan 11, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The only SEO "tools" I've ever used to any extent are GSC and the old Yahoo Site Explorer.

In my opinion the two most important factors for success are knowledge of the subject and good writing skills. Also, I've always used simple design layouts and basic html coding.

Maybe it comes down to how you want to spend your time. Everyone is different in that regard.

engine

12:43 pm on Jan 13, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Knowledge is power, and understanding how things work helps. If you want to be a clock repairer, it helps to understand how clocks work.
How Google works is probably a lot more complicated than it used to be, but knowing what works in Google's SERPs does require knowledge, and that knowledge can come from many places. One of them is using the tools available.

I have always believed in testing, testing again, and then testing some more. Some tests are a disaster; others show positive improvements. However, doing the right thing in the first place is going to help.

You'd be surprised at how many sites are not using natural language.

On the subject of keywords: abuse many years back resulted in keyword stuffing becoming a waste of time. In any case, keyword-stuffed text isn't natural to read. Keywords are still important, though, and if your keywords aren't present in your natural language you're wasting an opportunity. Having said that, it's only one small part of the research any SEO should undertake. Most importantly, it's about the whole topic, entities, and semantic search.
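That "keywords in your natural language" point is easy to sanity-check mechanically: not density, just presence. A rough sketch that flags which target terms a page's text never mentions (the term list and page text here are invented):

```python
def missing_terms(text, terms):
    """Return target terms that never appear in the text (case-insensitive)."""
    lowered = text.lower()
    return [t for t in terms if t.lower() not in lowered]

page_text = "We repair antique clocks, restoring movements and dials by hand."
targets = ["clock repair", "antique clocks", "movement restoration"]
print(missing_terms(page_text, targets))
# -> ['clock repair', 'movement restoration']
```

A substring check like this is crude (it misses plurals, reorderings, and synonyms), which is itself a small demonstration of why topic- and entity-level analysis took over from literal keyword matching.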