Forum Moderators: Robert Charlton & goodroi


2015 Google On-page SEO Ranking Factors List (Including Deprecated Factors)

         

martinibuster

1:05 pm on May 6, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I want to split off on-page from off-page and discuss solely on-page ranking factors, including the deprecated factors. What's your list of important on-page factors and those that are less important?

2015 Ranking factors
User experience metrics (all of them)
Shorter title tags
Original content
Engaging content that provides an answer, teaches, informs, is useful, delights
Original images
Quality site design
Descriptive meta description

Deprecated
Keywords
Focus on longtail phrases
Focus on ranking for specific keyword phrases
Lean code

EditorialGuy

3:43 pm on May 20, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Martinibuster: It will be interesting to see how this evolves.

It's easy to imagine Bubba's MFA site disappearing from the SERPs, but what about the corporate-owned megasites that were built around keywords and SEO (TripAdvisor is one obvious example)? Will they change? Or, over the years, have they acquired enough real content and positive user metrics to compensate for the SEO-driven editorial strategies that made them what they are today?

martinibuster

4:25 pm on May 20, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Thought experiment
If a user requests a specific website, can that be interpreted as a vote for that site, a signal of quality even?

Take a look at how often e-how is "voted for" by users. [google.com]

Take a look at those same votes for e-how compared with user "votes" for TripAdvisor. [google.com]

What does that tell you about how much people want to see e-how versus tripadvisor?

Those are user queries.
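For anyone who wants to pull those trend lines programmatically instead of eyeballing the charts, here's a minimal sketch using the unofficial pytrends library (the library choice and the timeframe are my own assumptions, not part of the comparison above):

    from pytrends.request import TrendReq

    # Unofficial Google Trends client; the terms are the two sites discussed above.
    pytrends = TrendReq(hl="en-US", tz=0)
    pytrends.build_payload(["ehow", "tripadvisor"], timeframe="2010-01-01 2015-05-01")

    # Returns a pandas DataFrame of relative search interest over time.
    df = pytrends.interest_over_time()
    print(df.tail())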

fathom

5:43 pm on May 20, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I want to split off on-page from off-page and discuss solely on-page ranking factors, including the deprecated factors. What's your list of important on-page factors and those that are less important?


Due to the way SEO evolved, I really think you need to divide your discussion into three parts.

On Page (sort of useful for obscure terms): quite useful for usability, but that isn't about ranking.

Off Page but On Site (these are the important things most people involved totally ignore, IMHO).

Off Page (absolutely needed, but won't produce ranks without the Off Page but still On Site factors).

As you noted, ranking factors are not really important; the whole system fails without links, which is why Google continues to have an obsession with links.

No matter - if this discussion is only about On Page (only) and not what you control within your own domain, it is a waste of my time, so I will stay out of the discussion.

elguiri

6:23 pm on May 20, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



An entity is simply a person, place, or thing, and none of those need optimization.


I think we all know that. But neither do search engines require optimization, and yet we've been talking as if they did for nearly 20 years.

If you want to create a list of ranking factors that still exist that would please a Google engineer, then fine, but creating something so "on message" is likely to kill this forum, where people with a certain base knowledge might come looking for clues as to how they might get more traffic for their sites. I accept there are novices as well, but should we always comment just for them?

Other ranking factors still exist - there is a large annual study by a software company that identifies nearly 40 with a significant positive correlation. If I were advising someone who knew nothing about this game, I'd tell them to ignore studies like that and just build the best page possible. The reality is that if you have an idea what those factors are, and understand the concepts of subtlety, nuance and moderation, then there's still plenty of hay to be made. If there weren't, I wouldn't be eating tonight.

webcentric

8:29 pm on May 20, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Speaking of user experience: I got on my hosting company (which I love dearly) about implementing GZIP compression on their shared servers. Today I got an email that it was done. My page speed scores went up 25 points overnight. I have a 95 for desktop, with mobile in the mid-80s. I'm gonna watch and see if this has any bearing on ranking in the near or distant future. The site already gets a "Mobile friendly" nod. Anyway, after quite a struggle with UX over the last six months or so, I think I've done about everything I can do on that front (everything that's in the realm of practicality anyway). This isn't the biggest factor (IMHO) in this discussion, but I certainly think it's one of the few factors we can actually do something about.
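If you want to verify that compression is actually being served (not just switched on in a control panel), a minimal Python check of the response headers does the job; the URL below is a placeholder:

    import requests

    # Placeholder URL; substitute one of your own pages.
    url = "https://www.example.com/"

    # requests sends Accept-Encoding: gzip by default and decompresses transparently.
    resp = requests.get(url)

    print("Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
    print("Decompressed size:", len(resp.content), "bytes")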

Nutterum

6:17 am on May 21, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



OK @martinibuster, I'll bite. Now please enlighten me: how will you create a website for luxury accommodations? I am giving this specific example because I know for a fact that:

1) you are not allowed to write free content on your website related to the properties you sell;
2) you are not allowed to use images not provided by the luxury home owner or agent unless you shot them yourself and have legal, written consent;
3) due to the nature of your customers, you are not allowed to disclose who they are or how much they paid.

And this is just one niche. If I broaden the field, I can safely say that 99% of all websites that provide accommodation, from Expedia and TripAdvisor to affiliate local resellers, are in the same boat content-wise, where everything is used and reused time and again. So now what? Should they close doors and walk away until they have a really unique proposition Google can benefit from?

fathom

7:57 am on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The problem here is that you are attempting to turn organic search into free advert search: ad copy that doesn't deserve natural links, so you need a workaround.

Organic search rank is about the editorial value of a website to capture link juice (PageRank), not the mundane value of ad copy - which will still rank if your website is organized correctly, and which is why I said On Page, Off Page & then Off Site (the latter being traditionally considered Off Page).

elguiri

9:16 am on May 21, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Nutterum, best practice in this scenario is typically to dilute the duplicate content with rich related content, and create the mother of all mash-ups. For your example, you might consider rich information about the area and its amenities (pull in info on local schools, local restaurants, shops etc.). You could add rich demographic information for the location - use public databases or whatever you can get your hands on (all the better if it's already nicely structured data). You can add royalty-free images of the area too (Google does this on its maps). That probably won't be good for ranking directly, but it will have people clicking around on your site for longer. With a bit of head-scratching you can probably come up with a way of getting people who live or have lived in the area to give their opinion on it. Some of the sites with the most traffic on the net have largely the same kind of limitations that you describe, and they solve the problem through huge mash-ups combined with user-generated content. Think Yelp, TripAdvisor, Oyster (great proprietary photos added in), Zillow (great interactive maps), even Amazon.

martinibuster

12:03 pm on May 21, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Good answers, elguiri and fathom!

To expand on the e-how vs. TripAdvisor example above: e-how created a website based around keywords for the purpose of ranking in the SERPs. Period. Nobody wakes up and thinks, gee, I wonder what the newest e-how article is about. Nobody shares content from e-how. Nobody gets excited about e-how content at all, because it wasn't built to delight, enlighten, or do anything else for humans. e-how was created using the old-school "best practices" to rank well in the SERPs.

TripAdvisor has a community. I asked my wife about it, and she said she loves the site and navigates directly there often when planning a vacation or a weekend getaway. Please note this, and I will emphasize this: she loves the site and is enthusiastic about it. What made it so delightful? She discovered hidden beaches that weren't mentioned in any of the published travel guides she had purchased, and she received feedback about places we had been considering visiting. That's one example out of many.

The key point here is that my wife, and no doubt others like her, are enthusiastic about TripAdvisor: she finds it uniquely useful, and it has become a go-to destination. That last part is important - figuring out how to become a destination for a certain kind of information. Part of that means figuring out what you do with the visitors to make them willingly stay.

Look at the Google Trends comparisons I linked to above. Searchers rarely used the e-how name to search in Google, and that trend did not improve over the years. It got worse. Why do you think fewer and fewer people searched Google using e-how's name? I suspect because e-how created a site around keywords instead of building a site around the visitors. No love shown. No love given.

Take a look at TripAdvisor's trend. There are more people searching with their name and the trend is one of growth. Every year it continues to grow. Not because they rank well. But because TripAdvisor is a destination that people like, myself included.

The above comparison between the two approaches illustrates why I chose to list User Experience as the leading ranking factor. Everything else grows from there.
2015 Ranking factors
User experience metrics (all of them)


It's not an empty "build it and they will come" platitude either. I hate that kind of pap and would never waste your time with drivel. These observations are based on my research into what I believe is in the Black Box, and on my experience building links for myself and for clients using this exact methodology. I'm being pragmatic, not pie in the sky. My clients and I attract the best links with content that puts the site visitor as the first consideration.

EditorialGuy

2:31 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Nutterum, best practice in this scenario is typically to dilute the duplicate content with rich related content, and create the mother of all mash-ups. For your example, you might consider rich information about the area and its amenities.

That approach was popular about 15 years ago. I remember seeing quite a few destination sites (especially for islands and beach destinations) which were owned by local businesses that sold property, rented villas, etc.

In many cases, the sites had better visitor information than the official tourist sites did, since local tourist offices were still clueless about the Web. I always had the impression that the site owners were genuinely passionate about their topic: They weren't just Web marketers in Silicon Valley, New York, or London who relied on spun guidebook content or crowdsourcing to provide filler for their sell pages. They were more likely to be expats living in places like Mallorca and Tenerife who knew and loved their adopted communities.

I don't see too many sites like those around today, even though "content marketing" is all the rage.

netmeg

2:38 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



(That is pretty much how it works with my sites as well, and it's also how you can avoid having the knowledge graph eat your lunch.)

Imma keep quoting Rae till it sinks in. Google doesn't want to make sites popular. Google wants to rank popular sites.

cre8pc

5:33 pm on May 21, 2015 (gmt 0)

10+ Year Member



I'm with netmeg. How do you define "user experience metrics (all of them)" as on-page ranking factors? And how does this connect (if it does) with Google's Gary Illyes saying at SMX Sydney that UX is "never" as important as the content on your pages?

lucy24

5:57 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



UX is "never" as important as the content on your pages

Can you point to a longer version? I don't understand the difference. (That is: in my lexicon, content is a subset of UX. Whether you hate a page because it takes ten minutes to load or you hate it because the words are gibberish, those are both UX. But "as important" implies that they are entirely separate things.)

fathom

6:47 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A link is like a democratic vote of approval (or disapproval), but it's the page of information that invoked the response.

UX cannot be directly measured.

These signals, again, are not On Page signals.

They may be important, or not, but the old-style SEO of On/Off Page optimization no longer applies.

webcentric

6:52 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



As lucy24 implies, user experience is related to user expectation. Users expect fast these days and have no compunctions about hitting the back button when a page is slow. The next expectation is that they're going to find the content they're expecting to find, rather than having to sift through a bunch of ads (overlaid with more interstitial ads, for example). Once you've satisfied the user's initial inquiry, the question becomes whether you can keep them engaged and coming back for more. That's where martinibuster's TripAdvisor example comes into the picture.

I like to think of UX as a process rather than a thing. It's a series of experiences, a process of discovery unfolding in front of the user in some satisfying way. I also like to let users feel like they are in charge of that process rather than having someone blatantly dictate their next move for them. As webmasters, we tend to think in funnels (which is a necessary evil), but I think providing options actually makes things more interesting and engaging for the user.

Added:

Google doesn't want to make sites popular. Google wants to rank popular sites.


So is UX an on-page factor? Who cares? I thoroughly agree with the above statement and it's pretty much the only factor I pay attention to anymore. My goal is to create popular sites and that starts with giving the user a great experience.

cre8pc

7:15 pm on May 21, 2015 (gmt 0)

10+ Year Member



Barry referenced the quote from SMX here [seroundtable.com...] and there doesn't appear to be much of anything after that.

I've been haggling over the use of terms lately because they mean different things to people. UX is usability (for which there are some 2,000 known usability heuristics, and I'm willing to bet nobody in SEO applies 2,000 UX heuristics to their on-page metrics), whereas user experience is, agreed, not measurable by search engines unless they are spying on our every move. A great user experience is also limited because, well, we limit the humans who are invited to experience our websites. Think accessibility here.

webcentric

7:58 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A poor user experience can smother good content. And Google can count the number of ads on a page and how long they take to load. Whether the poor user experience will push a page down in the rankings significantly is the question I suppose. Or can a good UX score move the page up? Maybe 2000 factors aren't being taken into account but some are for sure.
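As a crude illustration of how countable this is (nothing like Google's actual page-layout analysis; the HTML and the AdSense-style class name are just placeholders):

    from bs4 import BeautifulSoup

    # Toy page containing two AdSense-style ad units.
    html = """<html><body>
    <script src="https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js"></script>
    <ins class="adsbygoogle"></ins>
    <ins class="adsbygoogle"></ins>
    <p>Actual content...</p>
    </body></html>"""

    soup = BeautifulSoup(html, "html.parser")
    # AdSense renders ad units as <ins class="adsbygoogle">; count them.
    print("Ad units found:", len(soup.select("ins.adsbygoogle")))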

martinibuster

7:59 pm on May 21, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



...user experience is, agreed, not measurable...


So... the top heavy algorithm doesn't exist? The one that checks for excessive ads that may ruin the user experience?

No offense intended, but if you believe user experience is not measurable, then your understanding of what the search engines are capable of may not be as thorough as you may presume. Here's an example:

One method is to take log data generated from the search box (CTR, how long the session lasted, how many listings were clicked, how long before returning) plus data from browsers and toolbars, then create models of user behavior. The machine can use that vast amount of user data to predict whether a particular web page will be satisfactory to the user for a particular query or not. In fact, machines have been trained to be quality raters so that the function of rating web pages can be scaled. Not just in research papers but in many granted patents, there are functions related to re-ranking the SERPs which rely on machine training and user feedback, sometimes referred to as explicit feedback and implicit feedback.
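To make that concrete, here is a toy sketch of the modeling step (my own illustration with made-up features and labels; not any engine's actual system):

    from sklearn.linear_model import LogisticRegression

    # Hypothetical implicit-feedback features per (query, page) pair:
    # [click-through rate, dwell time (s), clicks in session, quick-return rate]
    X = [
        [0.42, 180.0, 1, 0.05],  # clicked, long dwell, rarely bounced back
        [0.31, 240.0, 1, 0.02],
        [0.08, 12.0, 3, 0.60],   # pogo-sticking: quick returns to the SERP
        [0.05, 8.0, 4, 0.72],
    ]
    y = [1, 1, 0, 0]  # 1 = satisfied, 0 = dissatisfied (e.g. rater labels)

    model = LogisticRegression().fit(X, y)

    # Predict satisfaction probability for an unseen behavior profile.
    print(model.predict_proba([[0.20, 90.0, 2, 0.30]])[0][1])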

Check out what this research paper out of Microsoft [research.microsoft.com] says about how machines interpret search queries:

Ranking functions for Web search engines are typically trained by machine learning algorithms using either direct human relevance judgments or indirect judgments obtained from click-through data from millions of users. The rankings are thus optimized to this generic population of users...


They take the user data and use it to train the machines to second-guess the SERPs, to learn from users how satisfied they are. This is really cool stuff that has been going on. Their machines use the data to create models of what will likely satisfy a user or not, based on the vast amount of training data available. What they do with that information is view a site and predict user satisfaction as if a user had actually viewed it and generated click data.

On that page I linked to above there's a link to a page about another study titled:
Toward Self-Correcting Search Engines: Using Underperforming Queries to Improve Search
In this paper, we present a method for automatically identifying poorly-performing query groups where a search engine may not meet searcher needs. This allows us to create coherent query clusters that help system designers generate actionable insights about necessary changes and helps learning-to-rank algorithms better learn relevance signals via specialized rankers. The result is a framework capable of estimating dissatisfaction from Web search logs and learning to improve performance for dissatisfied queries.


You see? The machines can learn from the search logs: they examine the pages that produced dissatisfaction and determine what qualities those pages share.
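A toy illustration of that clustering step (made-up queries and features; the paper's actual method is more involved):

    from sklearn.cluster import KMeans

    # Hypothetical per-query dissatisfaction features:
    # [abandonment rate, avg reformulations, quick-return rate]
    queries = ["cheap villas mallorca", "villa rental forms",
               "luxury homes nyc", "beach house deeds"]
    X = [
        [0.70, 2.5, 0.65],
        [0.65, 2.8, 0.60],
        [0.10, 0.4, 0.08],
        [0.72, 2.2, 0.70],
    ]

    # Group queries with similar dissatisfaction profiles together.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    for q, label in zip(queries, labels):
        print(f"cluster {label}: {q}")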

I've been reading these research papers, hundreds of them, and I realized that many people in the search industry are simply unaware of many of the ways search engines detect spam, rank sites, etc. I know this because when I search for articles written about some of these topics, the articles simply do NOT exist; the industry is unaware of many of the things the search engines do. So I'm selectively sharing this information in the hopes of raising the level of understanding. Good luck!

fathom

8:25 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Interpreting this:

So... the top heavy algorithm doesn't exist? The one that checks for excessive ads that may ruin the user experience?


That could very well be Google's interpretation of UX, but saying that and then reviewing Google AdWords, Shopping, etc. suggests it isn't a factor - unless they ignore their own factors for themselves.

Putting three ads across the top of your website and eight down the side on most pages, plus comparison ads and other adverts, isn't excessive? Clearly they aren't scoring themselves. But are you sure that this is UX?

No offense intended, but if you believe user experience is not measurable, then your understanding of what the search engines are capable of may not be as thorough as you may presume.


I want a direct quote from Google to suggest that UX is tied to algorithms for excessive ads.

Sounds like a great argument but why would Google ignore it?

martinibuster

8:41 pm on May 21, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



...why would Google ignore it?


Because they're not in the habit of handing the algorithm over on a silver platter to the hag masses of the French revolution? Hehe. :P

It may be useful, to carry that discussion forward, to define what we mean when we discuss user experience.

Sorry, but that's as much homework as I'm going to do for today. Although I respect you and all, and you're asking good questions, go on and do some of your own research. ;)

fathom

9:34 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So when you have no ads, what is being implied then?

Maybe the implied signal is devaluing you because you don't sell anything via your editorial content.

Just because Google has an algorithm does not mean it is connected to UX.

Many businesses have a blog, but many don't place ads anywhere they place editorial content.

Admittedly, I don't read the quality raters handbook for clues, which is a fair point to make... but in the greater scheme of things, the law of diminishing returns applies. The more of the most beneficial variable (PageRank, or something you can see today: links) is added, the less impact the least beneficial variables have on the process. Many claim, "If all other things were equal, which site would rank better?" The problem with that theory is that links are never equal. So if you ignore the traditional SEO thinking of "every little bit helps" and focus only on those things that generate measurable impact, ordered ranks will occur faster.

Occam's Razor is the other principle I rely on: a scientific and philosophic rule that entities should not be multiplied unnecessarily, which is interpreted as requiring that the simplest of competing theories be preferred to the more complex, or that explanations of unknown phenomena be sought first in terms of known quantities.

If ads negatively impact ranks, remove them; that solves any UX issue with them. But great UX still does not imply that better ranks will occur based on other On Page factors.

My homework is done now too!

webcentric

10:38 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Crap, the dog ate my homework!

fathom

11:32 pm on May 21, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



lol :-)

martinibuster

1:44 am on May 22, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



So when you have no ads, what is being implied then?... My homework is done now too!


Not quite. It's not about there being no ads. And it's not about ads at all. Nothing wrong with ads. What I said and demonstrated is that user experience counts now more than ever. I demonstrated this with the example of e-how and TripAdvisor. Then I linked to research papers demonstrating that it was indeed possible to gauge the user experience.

I don't object to differences of opinion; I encourage an intelligent response - it gives me joy. Unfortunately, your response falls short because you're differing with something I never said or suggested. Please take a moment to read what I posted before differing with something I never said.

So no, your homework is not done. Give me a real difference of opinion. Challenge me with something intelligent, well thought out, and logical. But please don't waste my time arguing about something I never said. ;)

Selen

2:29 am on May 22, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Two points about TripAdvisor:

1.
TripAdvisor has a community.

I agree that it's all about people/community/forums etc. Without people who contribute their knowledge in a passionate way the success wouldn't be possible.

2.
Take a look at TripAdvisor's trend. There are more people searching with their name and the trend is one of growth.

It's important not to miss the fact that TripAdvisor indirectly but smartly 'manipulates' Google search rankings through national TV commercials. All of these commercials include this slogan (accompanied with a search box):

Don't Just Visit {DESTINATION_NAME}, Visit TripAdvisor {DESTINATION_NAME}

(for example: Don't Just Visit Hawaii, Visit TripAdvisor Hawaii) - [youtube.com...]

As a result, thousands (or millions) of people go to Google and search for: tripadvisor {DESTINATION_NAME}. In time (as has recently been confirmed), Google considers this an important signal that the website is useful/popular/quality etc., and the domino effect starts. It works well for big companies/corporations, but small companies cannot afford to implement this marketing trick.

webcentric

2:57 am on May 22, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



When you throw the principles of the "semantic web" into this discussion (not just what's been implemented to date but where it's all headed), it's easy to envision how simple details like the number of ads on a page, load time, and a million other details could be used to help infer (or define) the quality of a resource. It's easy for most of us to think of G's algos in terms of a bunch of variables plugged in and rendered on a scale (1 to 100, for example), but that's surely a gross over-simplification. I've not spent a great deal of time researching G's patents and such, but I've seen enough evidence to know that G is looking to infer facts from the dataset we call the Internet. Where we see pages and links, Google sees triples (subjects, predicates and objects). There's no reason they can't take every bit of UX info they track and weave it into their analysis eventually (if not sooner). What weight it gets is anyone's guess.
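To make the triple idea concrete, here's a minimal sketch with made-up facts (no real vocabulary or Google internals implied):

    # Pages, links, and UX details recast as (subject, predicate, object) statements.
    triples = [
        ("ExamplePage", "isAbout", "Mallorca"),
        ("ExamplePage", "loadTimeMs", 850),
        ("ExamplePage", "adCount", 2),
        ("Mallorca", "isA", "Island"),
    ]

    def facts_about(subject, store):
        # Naive lookup: everything asserted about a subject.
        return [(p, o) for s, p, o in store if s == subject]

    print(facts_about("ExamplePage", triples))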

"Left turn Clyde!"

fathom

4:55 am on May 22, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I did... the law of diminishing returns and Occam's Razor, which your own research supports. As in, "training machines to second-guess SERPs": is that a fact today, or a guess, that is indeed a ranking factor?

Speculation does not count as a ranking factor at all.

"We present a method for automatically identifying poorly-performing query groups where a search engine may not meet searcher needs"... is this a fact today, or more speculation?

As Selen points out, national TV commercials are a contributing factor for UX, which is equally implicit and thus must be a ranking factor. But I doubt Google intentionally introduced that into their algorithm.

Just because Google isn't in the habit of handing the algorithm over on a silver platter to the hag masses of the French Revolution... does not mean deeper research will prove that they secretly do. They simply hide it well.

Nutterum

7:57 am on May 22, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



@martinibuster - thank you for continuing the conversation.

What I would like to state is that I support your ranking factors. To further your point, I would like to note that one of the bigger sites I work on does not follow the standard Book of SEO. The website has almost no outgoing links, and only very trusted inbound ones. The content, while written with some SEO tricks in mind, is made for humans, and it combines information that already exists out on the web into all-in-one info pages. These info pages rank terribly on Google; however, once direct, e-mail and referral visitors navigate to them, they refer to them quite often. In the end, after several months, I noticed that landing pages with a relevant link to the complementary info page rank way better than the stand-alone pages. So yes, I do understand and use the notion of quality complementary content. Sometimes I rank above competitor websites with hundreds of thousands of backlinks, simply because the information I provide is way easier to consume, despite the fact that the products I try to sell on the side are not necessarily better or much different.

To add more detail on why martinibuster is correct in his assumptions:

I rank in the top 5 results for more than 200 very competitive (even if seasonal) keywords, and in the top 3 for over 1,500 long-tail keywords used when people have the intention to buy. At the same time, I have fewer than 500 backlinks, almost never link out, and have a very small niche social media presence. Yet I have seen over 200% growth in organic traffic and a drop in bounce rate year over year. Yes, there are competitors with blatantly spammy websites or downright duplicated content from other sources. Yes, I could possibly do better if I employed various link-building techniques, like a link-baiting blog or T3 link networks. But I won't, because I trust that Google will penalize the ones that do, sooner rather than later.

My above post was to spur further conversation about the topic because I firmly believe it needs to be discussed.

The SEO is dead. Long live the SEO. :)

fathom

12:33 pm on May 22, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You also stated a fact that Google confirmed a long time ago: link quality trumps link quantity. Setting that apart from this discussion, links are surely based on UX that isn't just implicit but explicit, and that is why Google continues to have a link obsession.

While external link anchors cannot be so easily manipulated anymore, on-site navigation is a main driver here.

That said... in an SEO world still dominated by links, the other stuff might make a nice kool-aid project to build links, but every single reference pointed out has a link component, so trying to split user actions into smaller pieces is flawed, IMHO.

anallawalla

3:11 am on May 26, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



User experience metrics (all of them)
Shorter title tags
Original content
Engaging content that provides an answer, teaches, informs, is useful, delights
Original images
Quality site design
Descriptive meta description


I agree with all the above. I am monitoring a niche that is dominated by big companies and the occasional comparison site. Each of the top-ranking big players offers most of the same products, so I assume they use the same on-page tactics for each product. The content mix for a given product isn't always the same, e.g. adding a calculator widget to the page helps tremendously with search traffic, which in turn helps with ranking.

Short title tags are definitely winning. Among the top-ranking pages, titles average about 8 words, including the company name.
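Running that audit yourself is trivial (the titles below are placeholders):

    # Average word count of top-ranking pages' title tags.
    titles = [
        "Acme Widgets | Compare Home Loan Rates Today",
        "Best Travel Credit Cards 2015 | ExampleCo",
        "Cheap Flights to Sydney | ExampleAir",
    ]
    avg_words = sum(len(t.split()) for t in titles) / len(titles)
    print(f"Average title length: {avg_words:.1f} words")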


Deprecated
Keywords
Focus on longtail phrases
Focus on ranking for specific keyword phrases
Lean code


In my niche the big players range from coding in enterprise-grade CMSs such as the efficient Adobe AEM (formerly CQ5) to the inefficient IBM WCM, to WordPress. I see ranking pages with anywhere from 3,000 lines of code down to 200. Google knows that "bloat" has some purpose: large companies have a lot of campaign tracking code, analytics code, inline CSS/JS etc. Large companies don't have the budget or time to tidy up the whole page, and Google knows that. Hence lean code would have to be a very minor factor.

I have analysed several hundred ranking pages and noted no consistency or pattern in the presence or quantity of keywords in the title element, body text, h1, h2, h3, bold, italic, internal or outbound links, meta keywords, meta description, or alt tags. I am looking at one #2-ranking page where the search term is not in any of these elements! The page does have many relevant variations of the term, so synonyms are perhaps very relevant now. I sense that comes from Hummingbird, where a spoken search term might differ from a written one, but both "deserve" the same SERP.

Keyword density has not been valuable for many years, but it is worth checking out TF-IDF to study keywords, particularly long-tail ones. This will help you see that long tail phrases tend to have a lower TF-IDF value, hence they don't rank well.
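A minimal sketch of inspecting phrase weights with scikit-learn's TfidfVectorizer (toy corpus, purely illustrative):

    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = [
        "luxury villa rental mallorca private pool",
        "mallorca travel guide beaches restaurants",
        "villa rental booking terms and conditions",
    ]

    # Include phrases of up to three words so long-tail terms get weights too.
    vec = TfidfVectorizer(ngram_range=(1, 3))
    tfidf = vec.fit_transform(docs)

    # Weight of one long-tail phrase in the first document, if present.
    phrase = "luxury villa rental"
    if phrase in vec.vocabulary_:
        print(phrase, tfidf[0, vec.vocabulary_[phrase]])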

We will continue to obsess over ranking for money terms, or our managers will continue to demand it. I have been looking at competitor sites not just for specific keywords, but for a bank of highly searched terms, then assigning a score to each competitor based on their positions across the whole bank of keywords. This is a good way of declaring which is the "best"-ranking site. A certain site might have just five keywords in the top 10 and still outscore another site that has twenty ranking keywords, all at low positions.
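A sketch of that scoring scheme (made-up sites and ranks; the linear weighting is one plausible choice, assumed for illustration):

    # Rank position per keyword for each competitor (1 = top result).
    rankings = {
        "siteA.example": {"blue widgets": 1, "cheap widgets": 9, "widget reviews": 3},
        "siteB.example": {"blue widgets": 8, "cheap widgets": 7, "widget reviews": 6,
                          "widget store": 9, "buy widgets": 10},
    }

    def score(positions, max_rank=10):
        # Position 1 earns 10 points, position 10 earns 1; unranked earns 0.
        return sum(max(0, max_rank + 1 - p) for p in positions.values())

    for site, positions in rankings.items():
        print(site, score(positions))

Here siteA outscores siteB despite ranking for fewer terms, which is exactly the effect described above.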