
Forum Moderators: Robert Charlton & goodroi

Google Retires rel=prev/next

     
4:39 pm on Mar 21, 2019 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:26372
votes: 1035


Google has said it's retiring rel=prev/next.

I'm not sure this is one of Google's best moves, imho.

As we evaluated our indexing signals, we decided to retire rel=prev/next. Studies show that users love single-page content, aim for that when possible, but multi-part is also fine for Google Search. Know and do what's best for *your* users!

[twitter.com...]
8:24 pm on Mar 21, 2019 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12390
votes: 409


I'm not sure this is one of Google's best moves, imho.

I agree. At the same time, I'd take this as a signal about how most mobile users are viewing content... i.e., very quickly, and scrolling is easier than clicking a link.

8:29 pm on Mar 21, 2019 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 609
votes: 98


So they retired the signal and you are supposed to do what is best for your users. I am surprised that they dropped this, since they have been really pushing structured data, and while rel=prev/next is not part of schema, it helps define the structure of your site. I sure hope Google's algo is good at detecting what the previous or next page is. (I don't have high hopes.)

Sure, users like single-page content (a hint there, I guess) for things like articles, but for things like product lists and long forum topics I'd much rather have it split up.
9:54 pm on Mar 21, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15882
votes: 875


Well, I hope they don’t expect me to rearrange that 58-chapter cheesy novel into a single vast html file, because that’s the only thing I have ever used prev and next for.
10:04 pm on Mar 21, 2019 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1339
votes: 438


I have a different perspective.
Disclaimer 1: I've never used rel=prev/next (nor any other "rel=" SE sponsored signal).
Disclaimer 2: the following is simply my opinion.

Rel=prev/next, as a G signal [webmasters.googleblog.com], was initiated in 2011 to indicate that certain page collections/flows should be thought of as one even though separate.
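For anyone who never used it, the markup from that 2011 announcement was a pair of link elements in the head of each page in the series; the URLs below are hypothetical, just to show the shape:

```html
<!-- On page 2 of a hypothetical three-page article -->
<head>
  <link rel="prev" href="https://example.com/article?page=1">
  <link rel="next" href="https://example.com/article?page=3">
</head>
```

The first page of a series carried only rel=next, the last only rel=prev.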

We all know how certain popular sites, particularly celeb tabloids, abused this process, i.e. "10 celebs with big hair": an intro plus 10 more rel=next pages, each with 10KB of text, 500KB of images, and 1000KB of dark-pattern ads hoping you'll mistake an ad link for the next-page link. Also, as with rel=canonical, rel=prev/next was often a bandaid for poor architecture. And a few, a very few, actually used it to provide a good visitor experience.

I can think of some possible reasons behind rel=prev/next being dropped as an indexing signal.
Google...
1. is confident that they can determine the page series connection without needing a hardcoded signal. It's been seven or eight years, so this is quite possible.

2.1. has decided that the traffic drop-off rate on entering such a page series is such that treating each page as an individually indexed, standalone page will give a better search result.
2.2. has seen diminishing ad value in later pages of a sequence, such that rel=prev/next page-value flow is out of sync with actual ad value, especially as many such ads are not via G. There may be more than one target in this change.

3. believes that mobile users are less inclined (see 2.1 and 2.2 above) than non-mobile users to put up with the crap user experience (and bloated bandwidth cost, render time...).

Of course their logic could include none, some, or all of the above. :)

Actually, that linked Twitter conversation thread was quite revealing in a sad way... ah well.

In closing, I'll swing back to "but multi-part is also fine for Google Search", which leans me towards '1' above for sites such as lucy24's, with a possible side order of whoop-a** for the craptastic abusers. Time will tell.
10:31 pm on Mar 21, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10459
votes: 1093


I generally take what g suggests with a grain of salt, rarely implement it (whatever "it" might be) and have rarely had an adverse experience for failing to play their game.

Why?

Two, three years after they start something "to help publishers" they end up changing their mind.

Sadly, too many webmasters have been willing lab rats for g experimentation and are usually abused in the end when g changes their mind.
10:49 pm on Mar 21, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15882
votes: 875


Google... is confident that they can determine the page series connection without needing a hardcoded signal. It's been seven or eight years so this is quite possible.

Ya think they can figure out the sequential relationship of URLs /title/chap32, /title/chap33, /title/chap34 without benefit of "next" and "prev"?

Yeah. I expect you're right on that point. In fact it may be another case of “if they can’t figure it out unaided, then heaven help us all”.
2:04 am on Mar 22, 2019 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11842
votes: 242


indexing signals

what about discovery?

there are other reasons to use link rel=prev/next that are irrelevant to google's current whim, such as accessibility.
from the Web Content Accessibility Guidelines (WCAG) 2.1:
The objective of this technique is to describe how the link element can provide metadata about the position of an HTML page within a set of Web pages or can assist in locating content within a set of Web pages.

(source: https://www.w3.org/WAI/WCAG21/Techniques/html/H59 [w3.org])
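WCAG technique H59 uses the same link element for navigation and orientation, independent of any search engine. A minimal sketch (the chapter filenames and titles here are invented):

```html
<!-- Head of chapter 2 in a multi-chapter work, per WCAG technique H59 -->
<head>
  <title>Chapter 2 - The Widget Novel</title>
  <link rel="prev" href="chap1.html" title="Chapter 1">
  <link rel="next" href="chap3.html" title="Chapter 3">
  <link rel="contents" href="toc.html" title="Table of contents">
</head>
```

User agents and assistive technology can expose these as "previous/next/contents" navigation regardless of what Google does with them.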
2:20 am on Mar 22, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15882
votes: 875


Consider by analogy the late unlamented "keywords" meta. Google has been ignoring it since approximately five minutes after it was created ... but they're not going to hold it against you, are they? It's just another thing for the Does Not Matter bin.

Granted, “rel=next” does not take up many bytes, and is useful to somebody, while some of those keywords metas could be ludicrously long and were useful to nobody, but still.
2:34 am on Mar 22, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10459
votes: 1093


I suspect that g hoped the hoi polloi would solve some of their indexing problems (back then) with their "helpful new instructions" and belatedly realized they had unleashed yet another game against the black box.

Reality: Code simple. Code for the user (and yourself) and most times all will be what was desired: function, action, resolution.
12:38 am on Mar 23, 2019 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1339
votes: 438


Well... now we know that Google actually stopped using rel=prev/next as an indexing signal more than a year ago...
Google's Latest Advice On Pagination & Page Series Post rel=next and rel=prev [seroundtable.com].

Shades of rel=nofollow... now that was a rollercoaster of an SEO soap opera!
Poor rel=prev/next is quite boring in comparison.
Similar in that no one actually noticed... the rel that no longer barked in the night reboot...

Guess my option 1 stands.
Sadly, all the craptastic pagination abuse stands as well.
3:15 pm on Mar 25, 2019 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 609
votes: 98


So now they are saying keep it all the same? [seroundtable.com...]

I hate the pagination abuse and after searching for something the other day I see that WebMD does it as well.

Given Google's track record, I have zero confidence in their ability to find what the next page should be in all cases, so I'd keep the prev and next tags. They can't even tell which link should be the canonical link without a tag. I don't blame them for this, as there are so many different variations that mistakes will happen.
6:03 pm on Mar 25, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2711
votes: 822



Google hasn't supported them in years
Google just realized they haven't supported them in years.
Google immediately wanted to let the community know they found this out.

This just really qisses me off...
I can accept that Google decides, for whatever reason, to stop using some feature. But "we just realized it", really? John Mueller had recommended to me less than a year ago to implement rel=prev/next, and I did, and it did sweet f-all. Thanks! Finally, after months of not seeing results, I took action and merged my pages. I wasted months and possibly lost a year's worth of revenue.

What is the purpose of holding those Hangouts if the Google people don't know anything? I would be just as well off watching videos from the millions of other SEO charlatans out there.
8:36 pm on Mar 25, 2019 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1339
votes: 438


@NickMNS: I've been on both sides of 'wrong info', so I sympathize with you, having worked to an assumed best practice, and with JohnMu having to admit to passing on outdated info.

However, given that the probable reason for discontinuation is the flag/identifier having become superfluous, including it should, at worst, have made no difference in indexing. Of course, whether single pages, paginated multi-pages, or ordinary pages are best for visitors or for ranking is a totally different question.

My last opinion, not aimed at anyone, simply a general disclaimer, is that doing something only for a third party is usually not a best practice as it most often has limited value over time.
10:01 pm on Mar 25, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2711
votes: 822


My last opinion, not aimed at anyone, simply a general disclaimer, is that doing something only for a third party is usually not a best practice as it most often has limited value over time.

In my case it wasn't doing something only for a third party. The case in point is a type of page (I have many of these) with a considerable amount of content, including graphs and other visual elements. These elements can take some time to load, and users may or may not be interested in all of them. I split the pages into three sections and then added links from the main page to the sub-pages. This made the pages load much faster and loaded only the content the user was interested in, saving time and bandwidth. The problem was that each page (or sub-page) now had only a third of the content on it, and Google demoted them all. I lost nearly 60% of my traffic.

On John Mueller's recommendation, I then added rel=next/prev and hoped that Google would see these pages as one, but no luck. Unsure of the cause of my demise, 6 months after the initial change I concatenated the pages back together, but only for half the website. This allowed me to compare split versus combined. After another 4 months, I finally began to see a pattern emerge: the split pages continued to lose traffic and ranking while the combined pages were steady with an upward trend. At the beginning of this month I glued all the pages back together; I'm now waiting to see the result.

The content never changed, just the layout. Users were always able to find the content; in the split version it was much quicker than in the combined version. From a UX perspective the split version remains the better one, but UX doesn't trump Google's opinion. Basically, great UX is worth squat if Google stops sending traffic.

Basically, the only reason I followed John Mueller's recommendation was to provide a better UX to users while still allowing Google to understand the content. But as we now know, that isn't what happened.

And yes, I am aware that there are a wide variety of other factors that could have impacted, or could still be impacting, me negatively, but this would have been one less variable to worry about.

Again, what's the point of these Google people blabbing away week after week if they don't know what is going on? This seems terribly amateurish for the Guardians of the (web)Galaxy. Not to mention their generally nonchalant attitude (if not outright arrogant, specifically in the case of Gary Illyes) to the whole affair.
11:09 am on Mar 26, 2019 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 609
votes: 98


I always thought prev/next showed Google that the content was related and followed a certain order. An example would be a forum topic: it is not 10 individual pages of posts about a widget (which, if the prev/next signals did not exist, could maybe lead to duplicate content from quotes, or just content that appears somewhat the same). I guess these signals are not as important to Google anymore. They are not going to show the searcher the beginning of the forum topic if the info they searched for appears in the middle.

Still as NickMMS says there are other issues to consider like page speed and UI which is even more difficult with smaller screens of mobile devices. I too have found longer pages tend to perform better than shorter pages. One thing you may want to add to your longer articles is anchor links to any sub headings on the page (with a content block at the top linking to those headings). Google does display sometimes these anchors in the SERPS.