Forum Moderators: Robert Charlton & goodroi


Re: SEO in a Cookieless World


JS_Harris

4:13 am on May 10, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Search Engine Journal published a lengthy piece today about a topic a lot of SEOs have on their minds: where cookieless tracking is going and what it means - [searchenginejournal.com...]

You have to skip about halfway through it to get past the "what is a cookie" basics, but there are some valid points in there. As I was reading it, my instinct told me that the entire article's premise was based on a wrong assumption about Google.

Just a hunch, but it dawned on me that Google doesn't need to track people to know what to serve them; it gets a hint from the query itself. Google also doesn't truly care about anything on a page except the precise (and hopefully concise) bit that answers the user's query.

Hard to wrap one's head around, but I am seriously questioning whether all of the content bound to a page has been untethered from that page. A 5000-word article covering all aspects of a topic doesn't answer a very specific query as well as, well, a very specific response to the query with less unrelated stuff.

I'd been reading a bunch about FAQ schema after seeing questions under more results, and thought I'd run a quick test. I took a blog with roughly 40 pages and performed a site: search to confirm 40 pages indexed. I then picked a single page, added schema markup (20 FAQs, in fact), and submitted a crawl request to propagate it. 30 minutes later I ran the site: command again, and sure enough, search now said it had 60 pages for this 40-page site. I tried going to pages 5 and 6 of the site: results, but they don't exist; only 40 articles are shown.
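For anyone wanting to reproduce the test, FAQ schema is a JSON-LD block dropped into the page's HTML. Something like this (the questions and answers here are placeholders I made up, not the actual blog's content; a real page would list all 20 Question entries):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is FAQ schema?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Structured data that marks up question-and-answer pairs so search engines can surface them directly in results."
      }
    },
    {
      "@type": "Question",
      "name": "Does it affect indexing?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "It changes how the page can be displayed in the SERPs; whether it changes the site's reported page count is what this test was probing."
      }
    }
  ]
}
</script>
```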

Clearly a bug, but a telling one? Is Google treating a 40-page site with 20 FAQ sections as 60 "pages"? Who needs cookies and tracking when you can give a single-paragraph direct answer to a query, detached from the page it came from, instead of sending a person to a 5000-word article that is 95% irrelevant?

SEOs have their work cut out for them to get on top of this change. We've gone from needing an entire site to be about a subject to rank it, to individual pages standing on their own, and now a simple paragraph is enough if it answers a specific query. No need to cookie entire pages if you don't care about them anymore.

Thoughts?

martinibuster

5:55 pm on May 10, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



That article is based on the false premise that the future will be "cookieless." That is incorrect.

Only third party cookies are being blocked.
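To illustrate the distinction (the cookie names below are made up, the attributes are standard): a cookie is "third party" when it is set or read in a cross-site context, which in current browsers requires SameSite=None. Ordinary same-site cookies are untouched by the phase-out.

```
Set-Cookie: session=abc123; Secure; HttpOnly; SameSite=Lax
    (first-party: only sent back to the site that set it; unaffected)

Set-Cookie: ad_id=xyz789; Secure; SameSite=None
    (usable cross-site, e.g. by an ad network's iframe on someone
     else's page; this is the kind being blocked)
```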

[edited by: martinibuster at 6:04 pm (utc) on May 10, 2021]

martinibuster

6:00 pm on May 10, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Re:
"A 5000 word article covering all aspects of a topic doesn't answer a very specific query as much as, well, a very specific response to the query with less unrelated stuff."


Sites with a lot of "unrelated stuff" do not tend to rank in Google.
Google's Passage Ranking algorithm is supposed to help those poorly structured articles, but Google's Martin Splitt said [searchenginejournal.com] that a well-written article will still beat a poorly written one. It's not that articles with huge word counts are suddenly going to flood the SERPs. Martin Splitt said that good page structure still matters.

JorgeV

6:16 pm on May 10, 2021 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Hello,

So you mean that word count is not the most important ranking factor?! Heretic!

NickMNS

6:38 pm on May 10, 2021 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



and performed a site: search

The "site:" operator is unreliable at best. You cannot draw any conclusions from the use of the site: operator. It has been messed up for years.

That article is based on the false premise that the future will be "cookieless." That is incorrect.

If the future were cookieless, then Google SERPs would be the least of your problems. First-party cookies are an essential part of web security: logins, sessions, and CSRF protection all depend on them.
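The session cookies being described here are plain first-party cookies carrying security attributes. A minimal sketch in Python's standard library (the cookie name and token value are illustrative):

```python
from http.cookies import SimpleCookie

# Build the kind of first-party session cookie a login system relies on.
cookie = SimpleCookie()
cookie["session"] = "abc123"            # illustrative session token
cookie["session"]["secure"] = True      # only sent over HTTPS
cookie["session"]["httponly"] = True    # not readable by page JavaScript
cookie["session"]["samesite"] = "Lax"   # withheld from cross-site requests

# The Set-Cookie header value the server would emit:
header = cookie["session"].OutputString()
print(header)  # session=abc123; Secure; HttpOnly; SameSite=Lax
```

None of those attributes involve third-party contexts, which is why the third-party phase-out leaves logins and checkouts alone.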

JorgeV

9:22 pm on May 11, 2021 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Hello,

Never underestimate the "creativity" of regulators. Tomorrow they could decide that first-party cookies are forbidden. I even remember, some years ago, talk in the EU about forbidding the encryption of data between a client and a server...

phranque

10:10 pm on May 11, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



the blockage of third party cookies is a tech industry reaction to consumer demand and has zero to do with a creative regulator.

JorgeV

10:42 pm on May 11, 2021 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



the blockage of third party cookies is a tech industry reaction to consumer demand and has zero to do with a creative regulator.

I am sorry, but I do not agree. With the EU ePrivacy Directive (2009), EU sites started to have to more or less require consent from their visitors. This was later reinforced by the GDPR (2016, applied starting in 2018), and this year there is no more tolerance: you have to obtain explicit consent from each EU visitor for all these third-party cookies, which get dropped by everything. Even if you disable interest-based ads (AdSense), you still need the visitor's consent before any third-party cookie is set. The tech industry is just following what has been imposed in the EU for the past five years. When Apple or Google move, it is just because they want to look proactive, whereas they are really catching up. Smaller players like Firefox, Opera, Brave, and DDG are just seeing an opportunity.

Google, for example, clearly sees the mess that obtaining explicit consent from visitors causes in Europe, and how much money they lose to all those refusing cookies. So they decided to switch to something else, then pretend they are dropping third-party cookies because they believe it's a good thing. Which, by the way, can be a way to hurt competitors that rely on these cookies, and may force them to adopt Google's new system.

phranque

11:10 pm on May 11, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



all of what you wrote is true except that you haven't referred to a regulation which mandates the blockage of third party cookies.

regarding google's attempt to bypass third-party cookies, it took no time for the tech industry to react accordingly, not because of regulation, but because of consumer demand for privacy options:
Vivaldi and Brave Begin Blocking Google FLoC [webmasterworld.com]
Make WordPress Core: Call to Treat Google's FLoC as a Security Concern [webmasterworld.com]
DuckDuckGo Extension Will Block Google's FLoC Tracking [webmasterworld.com]

not to mention:
Alternatives to Google's FLoC Tracking Proposed [webmasterworld.com]

martinibuster

4:15 am on May 12, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Tomorrow, they can decide that first party cookies are forbidden...


You wouldn't be able to log on to WebmasterWorld or purchase anything online without a cookie. Cookies are not going away.

As I said in a previous post, a cookie-less future is not happening.

engine

4:02 pm on May 12, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Just to bring a bit of focus back on this: I agree with MB, cookies will persist (excuse the pun), but FLoC is really about advertising cookies.
It's true, many people are rejecting cookies, which is not helping advertisers, and, of course, the world's largest, Google, has the most to lose.

I believe Google's move to a cohort system is meant to appease regulators before regulation arrives and forces change.