Forum Moderators: Robert Charlton & goodroi


2015 Google On-page SEO Ranking Factors List (Including Deprecated Factors)

         

martinibuster

1:05 pm on May 6, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I want to split off on-page from off-page and discuss solely on-page ranking factors, including the deprecated factors. What's your list of important on-page factors and those that are less important?

2015 Ranking factors
User experience metrics (all of them)
Shorter title tags
Original content
Engaging content that provides an answer, teaches, informs, is useful, delights
Original images
Quality site design
Descriptive meta description

Deprecated
Keywords
Focus on longtail phrases
Focus on ranking for specific keyword phrases
Lean code

lucy24

5:54 pm on May 26, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



almost all have an average of about 8 words

Eight WORDS? And that counts as short? Look up at your browser's tabs. Once you've got a string of them open, you're not likely to see much more than eight CHARACTERS. (Right now, mine's at eight or nine-- depending on character width-- before it goes into ...) That, itself, is an aspect of user experience. If you've got several tabs that all say "WidgetCo..." you've got an annoyed user trying to find out which of your pages is the one they want to go back to.

Hm. Come to think of it, the one time putting your sitename first in the <title> could be useful is
:: drumroll ::
when all those tabs were opened as part of a preliminary search-engine inquiry. In this one, specific, narrow area, are search engines' interpretations of usefulness getting skewed because the only behavior they see is this particular stage of browsing?
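lucy24's tab-width observation is easy to demonstrate in miniature: truncate a handful of page titles to an eight-character prefix (roughly what a crowded tab shows) and count how many become indistinguishable. A rough sketch; the titles and the eight-character width are invented for illustration:

```python
# Truncate page titles to a tab-sized prefix and spot collisions.
# The eight-character width and the example titles are illustrative only.

TAB_WIDTH = 8  # characters visible before the browser elides the rest

titles = [
    "WidgetCo | Red Widgets",
    "WidgetCo | Blue Widgets",
    "Red Widgets - WidgetCo",
    "Blue Widgets - WidgetCo",
]

def tab_labels(titles, width=TAB_WIDTH):
    """What the user actually sees on each tab."""
    return [t[:width] for t in titles]

labels = tab_labels(titles)
collisions = len(labels) - len(set(labels))
print(labels, "collisions:", collisions)
```

The sitename-first titles collapse into identical "WidgetCo" tabs, while the front-loaded ones stay distinguishable.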

webcentric

7:22 pm on May 26, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



In this one, specific, narrow area, are search engines' interpretations of usefulness getting skewed because the only behavior they see is this particular stage of browsing?


It's a sort-of UI dilemma I've struggled with for some time. For me, the favicon can indicate which site you're looking at, so I try to get distinction into the opening of the page title. Google typically appends the site name to the end of the page title anyway (at least with my main site). In some ways, I think what shows up in the tab has influenced my titling more than SEO ever did.

cre8pc

8:09 pm on May 26, 2015 (gmt 0)

10+ Year Member



@martinibuster, the fact that you even mentioned UX and put it first was a real treat for me. (As a usability consultant who was once an SEO).

I've been saying UX is an SEO factor for 15 years. Been teaching UX at the Searchenginecollege (run by Kalena Jordan) for a dozen years so that SEO students have the basics.

I am not allowed to betray NDA's and client confidences, but it should be noted that several sites noted in this discussion have undergone usability testing as part of their journey.

Metrics are what we apply during usability and functional testing with test cases and test plans based on site and app requirements. My role in site audits is always separate in scope and delivery from the SEO's part of a site audit.

What is not done is user testing. This is where I find the truth about why people are not using a site or referring others to it. It's the data I collect from humans who perform tasks and answer questions that tells me what's happening. An interesting lesson from user testing... participants tell me they had never heard of the site before, then become interested enough to become a new user, or recommend it (even with its issues) because the site owner cared enough to seek their feedback.

martinibuster

11:53 pm on May 26, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



...the site owner cared enough to seek their feedback.


That's a gem right there. Thanks for sharing! :)

elguiri

10:52 am on Jun 1, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



For those who don't understand how Google can analyze and extract data about a site's layout, have a look at their browser, Chrome.

Visit a web page, right click and select Inspect Element. Put your mouse over any div element in the code window and see how the area on the page is highlighted and its dimensions in pixels displayed. To the extent that Google is able to classify the content of each div, they can measure the % of visible content dedicated to advertising, images, text, widgets, maps etc. Together with your stylesheet and js files (to which they now demand access) they can calculate the % of white space around content, because they know the fonts and font sizes used, borders and the space between lines and paragraphs.
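The layout arithmetic described here can be sketched roughly: given rendered bounding boxes for page elements (the kind of data Inspect Element displays), estimate the share of the visible viewport each content category occupies. The element rectangles below are hypothetical, not output from any real crawler:

```python
# Estimate what share of the visible viewport each content category
# occupies, given rendered bounding boxes. All element data here is
# made up for illustration; it is not from any real rendering engine.

VIEWPORT = (1280, 800)  # width, height in pixels

# (category, x, y, width, height) for rendered block elements
elements = [
    ("text",   40,  120, 800,  500),
    ("ads",    880, 120, 360,  600),
    ("images", 40,  640, 400,  150),
    ("nav",    0,   0,   1280, 100),
]

def area_shares(elements, viewport):
    """Percentage of viewport pixels covered by each category."""
    total = viewport[0] * viewport[1]
    shares = {}
    for cat, x, y, w, h in elements:
        # clip each box to the viewport so off-screen pixels don't count
        vis_w = max(0, min(x + w, viewport[0]) - max(x, 0))
        vis_h = max(0, min(y + h, viewport[1]) - max(y, 0))
        shares[cat] = shares.get(cat, 0) + vis_w * vis_h
    return {cat: round(100 * px / total, 1) for cat, px in shares.items()}

print(area_shares(elements, VIEWPORT))
```

With numbers like these, a crawler that can classify each block could flag, say, a page whose above-the-fold area is dominated by ads.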

With all that data available, they can look for correlations to user experience data. Sites that don't work for users, will not, over time, work in Google.

martinibuster

12:15 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Very good post, El Guiri.

Together with your stylesheet and js files (to which they now demand access)


At first the SEO community thought it was for identifying cloaking and whatnot. Then it was noticed that bots were downloading sites as if they were browsers, rendering them. Now we know, as El Guiri pointed out, that the search engines are rendering sites the same way a user will see them.

fathom

3:54 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Understanding does not imply a measurable ranking change. A theoretical one, maybe, but I do SEO in the real world against real-world ordered ranks, where superior understanding does not actually demonstrate anything and does not afford you superior ranks for a real-world phrase with even mediocre competition.

If you can't demonstrate the theory, it isn't practical SEO.

Anecdotal evidence such as:

Visit a web page, right click and select Inspect Element. Put your mouse over any div element in the code window and see how the area on the page is highlighted and its dimensions in pixels displayed. To the extent that Google is able to classify the content of each div, they can measure the % of visible content dedicated to advertising, images, text, widgets, maps etc. Together with your stylesheet and js files (to which they now demand access) they can calculate the % of white space around content, because they know the fonts and font sizes used, borders and the space between lines and paragraphs.


is conjecture at best.

Knowing this and doing something with this knowledge are two completely different things.

It strikes me as odd that the one set of factors that actually might demonstrate UX is tossed aside from this discussion so you can discuss things that the USER has no control over.

How can I use that knowledge to create a superior web page that will beat all other web pages because I perfectly captured the best user experience according to Google's data compilations? Or are you saying that can't be done?

I'm not interested in speculation. I'm absolutely positive that page load speed is a factor, but it isn't the best indication that users had a positive experience... A link that demonstrates a positive experience (or a negative one) is the output of all your hair splitting, but the user cannot change those, so UX, IMHO, is demonstrated by links.

Until you can show Google has dropped its obsession with links, links remain the best overall gauge of UX.

aristotle

9:26 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



fathom -- what about all the sites that have popup windows popping up on top of what you're trying to read?

Also, what about the sites that freeze your browser to force you to wait for ads to load?

webcentric

9:38 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



what about all the sites that have popup windows popping up on top of what you're trying to read?

Also, what about the sites that freeze your browser to force you to wait for ads to load?


I thought that was what the back button is for. I try to click it as quickly as possible if coming from Google so they know I think their result was crap. Then I type Bing.com in my address bar and hit Enter just to emphasize the point.

fathom

9:53 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



fathom -- what about all the sites that have popup windows popping up on top of what you're trying to read?

Also, what about the sites that freeze your browser to force you to wait for ads to load?


What, like [searchengineland.com...] (trying to get me to subscribe to their State of Link Building whitepaper)? Do you really believe Google will downgrade them? Or [searchenginejournal.com...] (wanting me to go to their SEJ Summit)? What if no one goes but everyone clicked on the ad?

Here's my question back... what if I walked away from the computer and didn't do anything for days... would that be a good or bad UX?

webcentric

10:05 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you consistently fight your way through the ads and regularly come back to the site and spend time there, then what's to be concluded but that you want to be there and are finding something of value, bad UX or not?

There's a difference between you regularly visiting a site directly and someone landing there for the first time as a result of a search. I think the way Google weighs the engagement is going to be different just based on how they're able to collect data about the visit but also based on the fact that one method can be measured against their search product and the other can't.

fathom

10:07 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Not sure if this was a joke... but

I thought that was what the back button is for.


...and if you back buttoned out of Google that would really **** them up.

I try to click it as quickly as possible if coming from Google so they know I think their result was crap.


Are you now a bot?

Then I type Bing.com in my address bar and hit Enter just to emphasize the point.


actually [lycos.com...] would be better.

fathom

10:12 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you consistently fight your way through the ads and regularly come back to the site and spend time there, then what's to be concluded but that you want to be there and are finding something of value, bad UX or not?

There's a difference between you regularly visiting a site directly and someone landing there for the first time as a result of a search. I think the way Google weighs the engagement is going to be different just based on how they're able to collect data about the visit but also based on the fact that one method can be measured against their search product and the other can't.


So loyalty devalues your value to a website.

Me thinks this is purely conjecture.

webcentric

11:06 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Of course I'm full of conjecture, but I wasn't making a joke. While you're trying to unravel Google's system of weights and measures, I'm simply making common-sense observations based on my own behavior and other observed behavior. Then again, my point isn't about how to rank in the SERPs with on-page techniques; it's about how rankings might be shaped by users' interactions with UX.

If I run a search, click on a result link and land on a bogged-down page of ads, I will normally hit my back button pretty quickly. Given what I think we all know Google is capable of measuring, that should be an indication that the destination did not hold my interest and that I made my decision to leave rather quickly. Add to that that Google can measure page-load times, the number of ads on a page and the ratio of content to ads (for starters), and it's easy to imagine the kinds of information Google can infer about a site's UX based on my decision to back right out. It doesn't take a doctorate in statistics to see what it's possible for Google to know. The question, for some anyway, revolves around how much weight such observations (specifically UX observations in this case) carry in the algo. I couldn't care less. What I care about is the possibility that my behavior is sending a message to Google about the (ad-bloated) resource they sent me to.
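The quick back-click behavior described here is sometimes called pogo-sticking, and the aggregate signal can be sketched as a simple dwell-time classification. The click log and the 10-second threshold below are made-up assumptions, not anything Google has published:

```python
# Toy illustration of the "quick back-click" signal: classify
# search-result clicks as pogo-sticks when the visitor returns to the
# results page within a short dwell threshold. Both the log entries and
# the threshold are invented for illustration.

POGO_THRESHOLD_SECONDS = 10  # hypothetical cutoff

clicks = [
    {"url": "example.com/ads-everywhere", "dwell": 3},
    {"url": "example.com/useful-article", "dwell": 240},
    {"url": "example.com/popup-hell",     "dwell": 6},
]

def pogo_stick_rate(clicks, threshold=POGO_THRESHOLD_SECONDS):
    """Fraction of clicks where the user bounced back quickly."""
    if not clicks:
        return 0.0
    quick = sum(1 for c in clicks if c["dwell"] < threshold)
    return quick / len(clicks)

print(round(pogo_stick_rate(clicks), 2))  # 0.67: two of three bounced
```

A page that piles up quick returns like this, relative to its neighbors on the same results page, is exactly the kind of pattern the thread is speculating about.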

To be continued. My power is about to go out because of a thunderstorm...

[edited by: webcentric at 11:30 pm (utc) on Jun 1, 2015]

webcentric

11:25 pm on Jun 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



...hopefully, the flickering is over.

@fathom I realize that your statement about conjecture was related to a specific point (not necessarily what I'm discussing in the immediate post above). So

So loyalty devalues your value to a website. Me thinks this is purely conjecture.


I'm pretty sure I didn't say that. What I tried to say is that there's a difference in how Google can even measure
1. A click from a Google search to a web page, and
2. A direct visit to a web page that doesn't involve a Google search.

#1 can be analyzed against the quality of results on a Search Engine Results Page.
They can still measure time on page, bounce rate, etc. via GA, AdSense or other vehicles (Chrome, for example) for direct visits, but that information is just about the page, not about how a searcher evaluated the quality of a search result.
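The two measurement situations above reduce, in the simplest possible terms, to a referrer check. Real analytics pipelines are far more involved; this is only an illustration with hypothetical inputs:

```python
# Label a visit by where it came from, in the crudest possible way.
# Real analytics tooling does far more (campaign tags, session
# stitching, etc.); this only illustrates the distinction in the text.

from urllib.parse import urlparse

def visit_type(referrer):
    """Classify a visit by its referrer (simplified)."""
    if not referrer:
        return "direct"          # typed URL, bookmark, etc.
    host = urlparse(referrer).netloc
    if host.endswith("google.com"):
        return "search"          # measurable against the SERP it came from
    return "referral"

print(visit_type(""))                                          # direct
print(visit_type("https://www.google.com/search?q=red+widgets"))  # search
print(visit_type("https://twitter.com/somebody"))              # referral
```

Only the "search" case can be scored against the results page that produced it, which is the asymmetry the post is pointing at.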

In contrast to your conclusion about my statement, I believe devoted users are like gold. They are the foundation of my traffic strategy. It's a different process, marketing to return visitors vs. organic traffic. The former can suffer from ad blindness, for example, but you get the chance to develop trust and put that trust to work for you over time.

p.s. I agree that there are better examples than bing.com, but I also think you get the point. Backing out of a site sends one message and exiting Google for another search engine sends another.

Added: I'm not promoting any given behavior here. I'm just pointing out that this kind of thing has to happen a lot naturally and I'm guessing that it leads Google to conclusions. Call it guesswork or call it blatantly obvious.

fathom

4:10 am on Jun 2, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yes... my statements are rhetorical rebuttals. (I realize that neither you nor anyone else made such statements; those are merely my interpretations of your suggestions.)

There's a difference between you regularly visiting a site directly and someone landing there for the first time as a result of a search.


I rarely go to either of those websites and in fact I arrive there via Google. Since I had a stroke in Oct 2014 which left me a single hand typist I use Google to finish my queries.

My points are not to disrupt knowledge, skills or building on experience nor am I here to promote a superior agenda but to merely cast doubt on what is obvious guesswork.

To show you completely agree with me... you used guesswork as your own claim.

guess·work
noun

the process or results of guessing.
synonyms: guessing, conjecture, surmise, supposition, assumptions, presumptions, speculation, hypothesizing, theorizing, prediction.


When these processes are facts, I will embrace them: when Google reports that they have devalued PageRank and embraced UserRank.

I've been saying UX is an SEO factor for 15 years. Been teaching UX at the Searchenginecollege (run by Kalena Jordan) for a dozen years so that SEO students have the basics.


Kim, I know you only by reputation, but a claim of 15 years and still no proof that can be made public isn't a statement I would want to make public. That's just an opinion.

webcentric

6:48 am on Jun 2, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have no desire to prove anything here. There's enough circumstantial, if not physical, evidence available to lead one to believe that the comings and goings of visitors to the Google SERPs are being tracked in a variety of ways, and that that tracking information (along with other data) is capable of suggesting conclusions (to those who have access to the data) about the quality of a site's UX.

To show you completely agree with me... you used guesswork as your own claim.


Actually, I just gave you license to call it guesswork if you want to. I'm thinking theoretically and embracing the what-ifs in this conversation. I'm presenting a hypothesis, if you will. Prove it or disprove it if you care to.

Quoting myself: What I care about is the possibility that my behavior is sending a message to Google about the (ad bloated) resource they sent me to.


Do you contest the notion that Google is monitoring behavior? Do you contest my consideration of the "possibility" that if Google is monitoring behavior, they could easily be drawing conclusions about UX based on that monitoring?

When these processes are facts - I will embrace them


OK, but that's not really the best way to get ahead of the curve now is it? Frankly, I don't think we're really arguing here. We're simply drawing a distinction between logical conclusions and hypothetical notions. :)

fathom

8:01 am on Jun 2, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Do you contest the notion that Google is monitoring behavior? Do you contest my consideration of the "possibility" that if Google is monitoring behavior, they could easily be drawing conclusions about UX based on that monitoring?


I don't presume to know precisely what Google is up to... I agree they likely desire a matrix upgrade that will replace PageRank by the time their patent expires in 2017, and user-driven data is a likely replacement, but pretending that exists for use in 2015 is equal to the fairytale of Chicken Little: "THE SKY IS FALLING!"

PageRank works... it's proven and it continues to drive better websites to the top. The fact that Google did a proverbial Potomac Two-Step to protect its archive from manipulation for the foreseeable future does not mean the next best thing is here already.

The opening post of this thread suggested keywords are dead, including longtail phrases, and that you can't rank for specific keyword phrases, which is a totally false suggestion... you simply can't manipulate results with external link anchors anymore.

But that isn't the same thing.

Just because you can't see the little green bar of PageRank anymore doesn't mean the replacement is being used today.

UserRank (or whatever they call it) will not likely predate Google's patent for it. Advance predictions of its use aren't overly useful until it actually happens.

I love how the community embraced keyword density, and many continue to claim that if you add up all uses of a keyword and divide that by all unrelated words, it will give you some magical ranking value. TOTALLY PREPOSTEROUS! ...but then all you need is some expert to claim its usefulness [searchenginewatch.com...]

Time to bow out and let the folklore continue.

mikhailblaze

8:24 am on Jun 2, 2015 (gmt 0)

10+ Year Member



Deprecated
Keywords
Focus on longtail phrases
Focus on ranking for specific keyword phrases
Lean code


Does this mean we should all avoid using long tailed keywords now, or do we need to make necessary edits? Most of the keywords on my site (particularly the linked ones) are long tailed and I've mostly relied on them.

Robert Charlton

9:33 am on Jun 2, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Does this mean we should all avoid using long tailed keywords now?

I don't want to answer for how a lot of other posters are thinking about it, but this for me means that there are certain bits of long-tail vocabulary that pretty much mean the same thing, almost like stop words... "how do I", "how do you", "should I", "can I", "is there a way", etc... and that Google probably isn't looking at each variant the way maybe AltaVista did once upon a time, which is as a literal text string. It's about the misplaced energy of trying to get all these matches in. Instead, the effort should be about trying to get at the concepts, to think about what's really important, and to spend energy on that.

As I'm seeing it, there are a lot of ways of asking the same question, and Google is abstracting those variants into a more focused core question. To create a half dozen pages to cover all of the long-tail vocabulary variants is silly, which doesn't mean you should avoid the phrases at all. It just means, don't build your site around them, and instead try to address a range of needs and intents.

You also want to expand on the implications of a simple broad keyword phrase, like "red widgets"... What does that phrase mean as a query? Do users want descriptions, specifications, prices, their history, where to buy them, how to fix them? Most ecommerce websites are focused pretty much on keyword matching for sales, with a happy-talk sales-pitch thrown in, and nothing else.

The time has come when companies need to really think about the context of the whole shopping and ownership experience, and how they can help their users.

Last week, I wasted an hour and a half trying to get info about how to replace a part in an appliance that should have taken me two minutes to find out and do. Anybody who'd really thought out the situation of what customers do with their widgets after they buy them never would have left out that information.

So, getting back to the long-tailed keywords... you want to stop building sites around keyword variants... and to think instead about long-tailed customer experience, essentially "customer variants" instead of "keyword variants", and the range of things that different site visitors and searchers need to know and do before they deal with you and after they've bought from you.

Long-tail keywords are just one factor of many on martinibuster's list. I'm not in lock-step with the assertions about things like the deprecation of lean code... but even that, as one item in a list of dope-slaps meant to readjust the thinking of many members here who obsess about code and not much else, was, I think, a gutsy and perhaps fruitful try.

For more about long-tail phrases, I just posted about them in another thread...

Keyword tracking for large content sites
https://www.webmasterworld.com/google/4750099.htm [webmasterworld.com]

Relevant to this thread, pick it up at...
...IMO, exact match long-tail data isn't a very fruitful way to target any more. It hasn't been for quite a while. Longtail SEO is not about creating content to match exact long-tail variations... [It's] going to take a familiarity with the marketplace and what the site is offering.

toidi

10:37 am on Jun 2, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



If user metrics really mattered, we would not see so many brands at the top. Corporate websites are some of the most user unfriendly and useless websites out there.

elguiri

7:15 pm on Jun 2, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Knowing this and doing something with this knowledge are completely two different things.


@Fathom, I agree with you. But where I've made money over the years has been in avoiding practices that Google can probably whack at some point in the not too distant future.

Where I've lost money is when I've seen what was coming and not done anything about it. (I had to lay off 25 staff on the back of panda - all the result of very compliant sites that had done nothing sinister getting murdered).

But right now, I'm building a pretty good little business (cross fingers, touch wood, sacrifice fatted lambs to Google gods) with niche minisites, solid content, SUBTLE SEO, great link building (content marketing, semi-industrialized) and tick-the-box user experience. i.e. mobile responsive, clean, big fonts, click bait and watching my stats. I'm aware however that this might not work forever.

seoskunk

9:39 pm on Jun 2, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Deprecated

Page Rank Sculpting. Does more harm than good.

martinibuster

9:18 pm on Jun 4, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Google Planning to Devalue Content Behind Interstitials [thesempost.com]

Maile Ohye from Google warned webmasters at SMX Advanced that they will also be bringing up the issue of interstitials and how pages that use them will be affected. “Interstitials are bad for users, so be aware this is something we are thinking about,” she said.

She then continued on to say that content hidden behind interstitials would be devalued.


I stick to my opinion that User Experience is important, and I'm doubling down on it. :)
User Experience is a leading on-page SEO consideration for 2015. Don't wait for Google to say something is going to change. Read the writing on the wall and get ahead of everyone else. ;)

JS_Harris

9:52 pm on Jun 4, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



2015 Ranking factors
- User experience metrics (all of them, especially speed and mobile layout)
- Title tags with a clear keyword/subject but that also use adjectives (please avoid using "insane") and form a complete sentence
- Original content OR added value to replicated content
- Lean code, I can't say strongly enough that less is more and that speed matters
- Social stickiness of site incl ease of sharing and images that are share worthy
- Schema.org, especially for breadcrumbs if you use them

Deprecated
- Keywords
- Focus on longtail phrases
- Focus on ranking for specific keyword phrases
- Meta Descriptions, I removed all meta descriptions from a large site 6 months ago and lost no rank/traffic
- Adding numbers to a post title, ie: "3244435342343 ways to *", numbers are overdone
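On the Schema.org breadcrumbs item in the list above: breadcrumb markup is expressed as BreadcrumbList JSON-LD. A minimal sketch that generates it for a hypothetical page (the trail names and URLs are invented):

```python
# Build Schema.org BreadcrumbList JSON-LD from an ordered breadcrumb
# trail. The page and URLs are hypothetical; the @type/position/item
# structure follows the published Schema.org vocabulary.

import json

def breadcrumb_jsonld(trail):
    """Build BreadcrumbList JSON-LD from (name, url) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }

trail = [
    ("Home", "https://example.com/"),
    ("Widgets", "https://example.com/widgets/"),
    ("Red Widgets", "https://example.com/widgets/red/"),
]
print(json.dumps(breadcrumb_jsonld(trail), indent=2))
```

The resulting JSON would be embedded in the page inside a script tag of type application/ld+json.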

fathom

5:02 am on Jun 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@martinibuster, my plan is to stay out of this thread, but here are two rebuttals exclusively for your most recent points. I realize you already know this, but for everyone else:

Rebuttal #1

You have to consider a few things with everything a Googler says.

1. The one-million-mile view of the Earth lacks details. That is the information you'll get from Googlers: not overly specific, rather vague, and thus we as their consumers can interpret it to mean almost anything.

2. Googlers are bound by NDA with their employer, Google Inc. They will never disclose precisely "how to", so believing they just gave you the secret to their sauce in some how-to points is a flaw in logic. I don't believe they outright lie, but their version is tailored, or has a certain slant towards their ideals, that may not match the ideals of someone simply wanting to rank. My point: if Google doesn't want you to reverse engineer their signals, why would they show you precisely how to, contrary to their own wisdom?

3. Analogies used to convey useful facts without violating their NDA don't always convey the message actually intended. Google's Public Relations is simply that... Public Relations, not insight into their secret sauce. You can certainly do research to prove or disprove a perceived fact that they provided, but they will hardly provide you the proof of it, or tell you what they might be considering. At best you have anecdotal evidence, not certainty for a ranking-factor claim in 2015.

4. Are Googlers actually addressing your specific claim, or are they talking about something completely different? I once asked Matt Cutts a very specific question in front of half a dozen others (back in 2003). A year later Danny Sullivan heard my claim (about penalties being awarded) and cornered Matt Cutts on it... and that's when it seemed like I had just made the whole thing up... Matt suggested he was only using a set of examples, not real variables. Sounds plausible... but I was asking an exact question, not a vague one, and at the time his response never hinted it was completely void of reality; he never stated it was hypothetical or remotely implied he wasn't saying anything useful to understanding a devaluation (penalty, if you wish to continue calling it that... but Google doesn't, so I don't anymore).

5. As we are both aware, MOZ's 2013 Ranking Factors made the amazing claim that Google +1s were a ranking factor. In fact, they scored as a very high factor, right behind PageRank (MOZ's version of PageRank), and 80 experts agreed that was the case. It was much more important to be the first to claim they were a factor than to be accurate. Shortly thereafter that was debunked by Matt Cutts [news.ycombinator.com...] Not saying this is the case here, just making a point: this thread is what Eric Ward calls linkbait kool-aid [ericward.com...]. Accuracy does not matter so long as you attract links.

6. I want to trust what I read here at WebmasterWorld, and I don't trust MOZ; they enjoy kool-aid too much. Using a Googler's quote as anecdotal evidence is great filler, but when the person states (like Kim did here) "I can't comment about specifics due to NDA", it makes me think there is much more to the story... making them lousy rebuttals.

I find it odd that a genuine output of UX isn't included in this discussion. Natural links are 100% about UX (even bad experiences), even nofollow experiences. So I'm not disagreeing with any of you... but the use of the back button isn't a powerful signal, nor is bounce rate, and certainly most if not all of the proverbial 2000 UX references will likely never offset the output power of a third-party linked review or testimonial, and those are true factors today.

The reason I point these out is that your suggested evidence also prompts my:

Rebuttal #2

Maile Ohye... "Interstitials are bad for users"... but isn't that what occurs with a YouTube video, where you can't just watch the video without being subjected to billions of unwanted ads? Clearly YouTube uses interstitials. Of course those are tied to videos; what about PDFs, what about Flash, Shockwave, Javascript, etc.?

Ads have nothing to do with better ranks, so this would be a negative signal, but one based not on actual UX, only on perceived UX. The language does not state that if a genuinely value-added offer were in an interstitial and users overwhelmingly embraced it, that superb UX would be rewarded. In fact, if users massively enjoyed it they would share it with links, yet Google would be devaluing even those; there is zero evidence of that today. Or is there?

Wouldn't ranks increase shortly after removing interstitials... but what happens if actual UX declines as a result?

[edited by: fathom at 5:46 am (utc) on Jun 5, 2015]

lucy24

5:17 am on Jun 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I will normally hit my back button pretty quickly

Huh. Is that what most searchers do? I almost never use the back button; each result is opened in a separate tab. The search engine has no way of knowing how long each of those tabs stays open, or what I do with it after the first page (the one they sent me to).

Not sure the search engine even has a way of knowing when I've force-quit the browser due to beyond-the-pale atrocious behavior of some first-page result.

hasek747

7:19 am on Jun 5, 2015 (gmt 0)

10+ Year Member



Not sure the search engine even has a way of knowing when I've force-quit the browser due to beyond-the-pale atrocious behavior of some first-page result.


Even if 97% of searchers use search engines the same way you do (mass-opening tabs in new windows), and only 3% use the "open-then-hit-back" approach, that 3% is still entirely sufficient for Google to draw very meaningful conclusions as to what should rank where.
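The 3% argument above is ordinary sampling: even a small minority of searchers who hit the back button estimates a page's quick-bounce rate closely at search-engine scale. The numbers below are entirely synthetic, for illustration only:

```python
# Show that a 3% sample of a large click population still estimates a
# page's "true" quick-bounce rate closely. All figures are synthetic.

import random

random.seed(42)

TRUE_BOUNCE_RATE = 0.40  # hypothetical real rate for some page

# 100,000 visits; True means the visitor bounced straight back
population = [random.random() < TRUE_BOUNCE_RATE for _ in range(100_000)]

# The 3% of visitors whose back-button behavior is observable
sample = random.sample(population, k=3_000)
estimate = sum(sample) / len(sample)

print(round(estimate, 3))  # lands close to 0.40
```

With 3,000 observations the sampling error is under a percentage point in expectation, which is why the minority behavior is still a usable signal.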

fathom

7:37 am on Jun 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Even if 97% of searchers use Search Engines the same way you do (mass-opening tabs in new windows), and only 3% use the "open-then-hit-back" approach, the 3% is still entirely sufficient for Google to draw very meaningful conclusions from as to what should rank where.


Not that I disagree, but how committed would they be to making that a major ranking factor that replaces their link obsession? If it can't compete with PageRank, what is the point?

How committed would they be to making it share the limelight with PageRank as a secondary alternative? For all the value I hear attributed to UX, the hints about its use appear to be all negative factors, not positive ones.

Leosghost

10:27 am on Jun 5, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The search engine has no way of knowing how long each of those tabs stays open, or what I do with it after the first page (the one they sent me to)

If the search engine is Google, and you use their Chrome browser, Google knows all of that and more...