Forum Moderators: Robert Charlton & goodroi

Google is an illegal advertising monopoly - Judge rules

         

Whitey

4:34 pm on Apr 17, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Things are heating up for Google (and big tech).
[bbc.com...]

Kendo

1:30 am on May 10, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Whitey - there are many pros and cons and you have hit a few already.

While it is doable, it may not be popular because most websites want exposure at any cost, and those that need to feed the current search engines will lose out... but only in surface area. They could still maintain search engine fodder as doorway pages... it is merely applying DRM to parts of one's website.

And yes, the usual problems arise:
1. as a monopoly it will be discredited by those who do such things
2. it will be plagiarised by every opportunist who has the means
3. there will eventually be too many doing the same thing
4. in the end it could evolve to be just another sewer

Whitey

2:05 am on May 10, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@cnvi – I hear the passion in that. There’s definitely a story worth telling—about how the original value loop of the web got broken, and how AI and monopoly power accelerated that fracture. The idea of documenting it, not just as critique but as blueprint, has tremendous merit.

But whether through storytelling, standards, or infrastructure, the challenge is threading that needle: "restoring value without destroying the openness" that made the web work in the first place. We’re at a point where old rules no longer apply, but no one’s agreed on the new ones yet.

If the commercial layer starts paying attention, because it’s their lunch being eaten too, that might be the inflection point. Until then, we're left navigating the tension between innovation and erosion, openness and control. The direction isn't settled, but the stakes are becoming clearer by the day.

@Kendo – Good points all around. Applying DRM selectively, much like a paywall or API gate, could work in principle, especially for proprietary or high-value content. The notion of maintaining public-facing "search fodder" while protecting core value areas mirrors how media already compartmentalizes open vs. subscriber access.

That said, the issues you flagged are real. Anything that scales draws imitation, dilution, and eventually entropy. Monopoly suspicion, copycat platforms, value saturation: it's the same pattern we've seen across every digital gold rush. Even the cleanest tech can become a sewer if misused or overrun.

Maybe the lesson is this: it’s not just the system, it’s the stewardship. Technical controls can help, but if the incentives reward extraction over contribution, the outcome repeats. Without a shift in how value is assigned, and shared, we’re back to patching leaks in a ship that keeps sinking.

Kendo

2:53 am on May 10, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



it could work in principle

The technology already exists. Tagged pages are encrypted server side and sent to the user's browser, a specially designed web browser that can decrypt the pages while uniquely identifying the user. Thus a secure tunnel is created that cannot be accessed or exploited by any other means.
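As a purely illustrative sketch of that tunnel idea: the server derives a per-user keystream and encrypts the tagged page, and only a browser that can reproduce the same derivation for that identified user can decrypt it. Everything here is invented for illustration (the user IDs, the secret, the toy XOR keystream standing in for real AEAD encryption); a production system would use an established cipher, not this.

```python
import hashlib
from itertools import count

def keystream(user_id: str, secret: bytes, length: int) -> bytes:
    """Derive a per-user keystream (toy construction -- use real AEAD in practice)."""
    out = b""
    for block in count():
        out += hashlib.sha256(
            secret + user_id.encode() + block.to_bytes(4, "big")
        ).digest()
        if len(out) >= length:
            return out[:length]

def xor(data: bytes, ks: bytes) -> bytes:
    """XOR the page bytes with the keystream (symmetric: encrypts and decrypts)."""
    return bytes(a ^ b for a, b in zip(data, ks))

SECRET = b"server-side secret"          # hypothetical server key
page = b"<html>tagged, protected page</html>"

# Server side: encrypt the tagged page for one uniquely identified user.
cipher = xor(page, keystream("user-42", SECRET, len(page)))

# Client side: the custom browser, knowing the same derivation, decrypts.
plain = xor(cipher, keystream("user-42", SECRET, len(cipher)))
```

The point of the sketch is that the ciphertext is useless to anyone but the identified user: a different user ID derives a different keystream and recovers only garbage, which is what makes the "secure tunnel" per-user rather than per-site.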

Anyway, the day is young so let's see what comments we get :-)

Whitey

3:05 am on May 10, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Kendo – Understood. If the encryption layer is already deployable via a custom browser with user-level ID and tunnel integrity, that’s a serious shift, more DRM than HTTP. A hardened delivery model like that could indeed create scarcity and enforce licensing in a way that the open web never could.

The real question, then, may not be can it be done, but whether there's a tipping point in content economics that makes people want it. Until the pain of zero-click AI and value loss outweighs the benefits of exposure, it’s a hard sell for most publishers.

Anyway, the day is young so let's see what comments we get :-)

Yep, it’s early in this cycle. It will be interesting to see where the next wave of comments tilts; either toward adaptation or reinvention.

cnvi

3:11 am on May 10, 2025 (gmt 0)

10+ Year Member Top Contributors Of The Month



@Whitey: “If the commercial layer starts paying attention, because it’s their lunch being eaten too, that might be the inflection point.”

This is my basic “point”. Millions of SMBs have been taken for thousands of dollars, some for millions or even more, by G: not just by restraining trade, but by allowing bots (well known to previous G employees, who wrote about their displeasure with their employer) to click ads, creating tremendous wealth for G.

Why are we still kicking this can around?

A few of us know the facts, know the public quotes, know who to interview, etc. It’s the biggest elephant in this room. It’s tough to cough up $ for a new commercially funded CDN.

Getting the public on board is marketing 101. We all know that here. So why aren’t we doing anything about it? I’ll organize if I get a private message from just one of you. As I’ve said, I’m the inventor of USPTO 7082470. I am happy to lead; I just need some help from a few other experts in this matter. All legal and professionally handled.


[edited by: not2easy at 6:58 pm (utc) on May 12, 2025]
[edit reason] Please see TOS #26 [/edit]

Kendo

3:25 am on May 10, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How come Google has a patent on USPTO 7082470?

Whitey

3:34 am on May 10, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@cnvi – You’ve articulated what many suspect but few frame so clearly: that the economic asymmetry goes far beyond just monopoly control of discovery, it touches ad economics, bot behavior, and accountability gaps that have persisted for years. The quotes, the history, the patterns… they’re all there. And it’s not just a technical problem—it’s a systemic one.

You’re right to say the story practically writes itself. And perhaps it takes someone with the perspective, patent history, and credibility you bring to start shaping it into something that sticks—whether as analysis, documentary, or policy provocation. Not everyone is positioned to do that. But you clearly are.

This thread’s persistence shows there’s appetite for clarity. Maybe that’s reason enough not to let it just fade out.

Kendo

4:07 am on May 10, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@cnvi - I will get my Amazon tech to guesstimate the CDN cost.

cnvi

4:21 am on May 10, 2025 (gmt 0)

10+ Year Member Top Contributors Of The Month



[patents.google.com...]

USPTO 7082470 was produced by me; I linked it above to clear up any misunderstandings. G just has a fancy database to catalogue US patents. The name on the patent, initials JL, is me.

I’m ready to rumble, and I am a paralegal at the largest civil law firm in the southeast US, so I know patent law and civil law and how to write this story. What I need are a few experts to help me with fact-checking, interviews, etc. Co-authors. I want Brett Tabke too. I don’t know him personally, but if any of you do, he would be a huge asset writing this story, as he has all of the contacts… if he doesn’t throw this thread out first, lol.


[edited by: not2easy at 7:01 pm (utc) on May 12, 2025]
[edit reason] Please see TOS #26 and Charter [/edit]

not2easy

7:21 pm on May 12, 2025 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Mods Note:
You may notice that a few of these posts needed editing. I have tried to leave as much as possible intact. Two things to keep in mind: Discussing the formation of a group or calls to action against any company or person will be removed. See ToS #26: - [webmasterworld.com...]
Claims of action, flames, and calls to action against any company or person will be removed.

and from the Charter: [webmasterworld.com...]
Private Messages
Posts that request Sticky Mail contact, or that mention "I sent you a PM" and so forth are not appropriate in this forum. Those who want to have private discussion with another member should simply send a private message to that person, or assume that those who want private contact with you will reach out. This practice helps create a thread that can help many people well into the future.


New ideas are wonderful, but we do not organize groups here. As stated in our Mission Statement:
We are not here to actually do business with one another.
- [webmasterworld.com...]

Whitey

10:25 pm on May 12, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@cnvi – Between your legal credentials, the patent trail, and the growing weight of this thread, I’m starting to think we need a new genre: True Crime: Algorithm Edition. Netflix, call us.

But in all seriousness, there is a story here—one that deserves telling, not torching. If it takes a few old-school forum legends and a paralegal with a plan to tease out the systemic failures behind today’s broken discovery economy, then so be it. Let’s just do it within the lines so the mods don’t throw the book at us (again) :)

Out of curiosity—where do you all see the tipping point? Was it AI Overviews? The collapse of referral traffic? The moment bots outnumbered humans? If we mapped the decline of the open web like a stock chart, what would be the red circle and note that says “You are here”?

We can’t fix the web in one thread—but maybe we can sharpen the lens.

Kendo

12:51 am on May 13, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



build a new bold framework


I have been researching what could feasibly be done, have come up with one idea, and am currently building an example as a POC.

But I do think that to continue this discussion it should be moved to a new topic without naming any entities, and in saying that, we may still have a problem with TOS because it will be difficult for it not to be "promotional".

cnvi

12:58 am on May 13, 2025 (gmt 0)

10+ Year Member Top Contributors Of The Month



Sadly there is no way this conversation can continue here. I can’t state why but it seems pretty obvious.

Whitey

2:42 am on May 13, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@all – It’s worth drawing attention to @brett_tabke 's new piece on SearchEngineWorld.com on the DOJ Trial and newly released documents: DOJ Trial: How Google Force-Fed Website Owners AI Scraping to Stay in Search [searchengineworld.com...]

This article digs into the DOJ trial transcripts and internal Google documents, showing how Google quietly created six “opt-out” levels for publishers. The most protective level, blocking content from AI model training, was marked a “Hard Red Line” and never made public.

What’s more revealing is that Google VP Eli Collins confirmed under oath: unless publishers block Googlebot entirely via robots.txt, their content might still be ingested for AI training. That means even if you use meta tags like NOAI or similar, unless you pull the nuclear option and block Google altogether, your content remains fair game for AI.

So while on the surface, Google says it respects robots.txt, the trial suggests there's a functional separation between search crawling and AI data ingestion. And if that’s the case, we’re not just talking about zero-click economics anymore, we’re talking about outright data extraction without viable opt-out pathways.

This could explain why more publishers are experimenting with paywalls, content cloaking, or simply giving up on Google referrals entirely.

I’d be interested in others’ technical observations here. Has anyone detected LLM-style scraping from non-Googlebot agents? Is it time we redefined “bot blocking” with AI in mind?
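For anyone auditing their own setup, a minimal robots.txt that separates search crawling from AI-training opt-outs might look like the sketch below. The user-agent tokens are the ones the vendors have published (Google-Extended, GPTBot, CCBot); note that, per the testimony discussed above, blocking Google-Extended may not stop content collected by Googlebot itself from being used, so treat this as a starting point rather than a guarantee.

```text
# Allow classic search crawling
User-agent: Googlebot
Allow: /

# Opt out of Google AI training (Google's published token)
User-agent: Google-Extended
Disallow: /

# Opt out of OpenAI and Common Crawl training corpora
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Default for everything else
User-agent: *
Allow: /
```

The only opt-out Collins' testimony leaves watertight is the "nuclear option": `User-agent: Googlebot` / `Disallow: /`, which also removes you from Search.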

Kendo

2:57 am on May 13, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Actually I have seen some new scraping in my traps that leaves a trail like
example.com; https://myproduct.example.com

example.com is a very famous CMS that cannot be named
myproduct is one of my solutions, which cannot be named

example.com has scraped all of my pages related to the product, including news and blogs, and published them at myproduct.example.com

This I don't mind, but while the scrape is new, their archive is quite old and it didn't show up when I dug into 200 pages of search results not so long ago.

Whitey

2:49 am on May 14, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



18-Year Google Search Engineer Ryan Moulton Speaks Out on DOJ Antitrust Case

If you missed it, Ryan Moulton, a senior Google Search engineer who’s been shaping the ranking algorithms for over 18 years, broke protocol and shared a detailed thread on X, frustrated by how his work has been cited (and misrepresented) in the DOJ antitrust trial.

Barry Schwartz aka @rustybrick picked it up here: [seroundtable.com...]

Key Takeaways:

Moulton criticizes how courts interpret internal Google experiments and metrics, calling it "Kafka-esque" and saying they misunderstand both the scale and intent.

He argues that Google does run rigorous tests to improve search, and using those tests as evidence of lack of incentive is absurd.

He states there’s a strict firewall between search and ads — he never saw revenue data, only UX metrics.

On a personal note, he laments the broader web environment:

People expect a lot more from their search results than they used to, while the market for actually writing content has basically disappeared.

That last part hit home for me. As someone who’s watching the steady erosion of content ecosystems from affiliate sites to hobby blogs it’s sobering to hear this acknowledged by someone inside Google. If the very person optimizing results sees the incentive to produce content evaporating, where does that leave the rest of the web economy?

The question isn’t just whether Google is a monopoly; it’s far more serious than that. Authorities will have to rely on Google to self-manage before, if ever, the legal system is able to remedy things. It’s also whether Google’s central role is sustainable when the open web stops participating.

cnvi

5:23 am on May 14, 2025 (gmt 0)

10+ Year Member Top Contributors Of The Month



“This article digs into the DOJ trial transcripts and internal Google documents, showing how Google quietly created six “opt-out” levels for publishers.”

The average small business owner / website manager doesn’t know much about these levels, let alone:

- robots.txt
- link disallow
- link nofollow
- how to opt out of publisher scraping
- but most have been brainwashed into thinking the only way to get traffic is from clickbait articles leading them to ad placement on the Great search engine, which allows bots to click on the ads, generating billions and damaging the balance sheets of millions of SMBs.
- some still don’t realize that utilizing any opt-out tools or good behavior requires an invasive G my business account.

When, and more so HOW, did the WWW become commanded by one monopoly? Why is any business required to have a GMB account? What happened to the world wide web?

The fact remains the largest corporate crook on the planet is getting away with it because forums and conferences discussing this topic rely on its revenue to survive.

Whitey

8:48 am on May 14, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I thought I’d run through some further observations on the “bombshell” revelations in the DOJ trial mentioned above, which has unearthed fascinating internal details about Google’s ranking systems, particularly from documents describing how core signals are developed and used.

Key Takeaways:

1. Most Signals Are Manually “Hand-Crafted”
• Aside from RankBrain and DeepRank (LLM-based), nearly every other signal used by Google is hand-tuned by engineers.
• This involves analyzing data, plotting sigmoid curves, and manually selecting thresholds, meaning Google’s ranking logic is highly human-curated, not just AI-driven.

2. ABC Signals = Anchors, Body, Clicks
• These three raw signals form the foundation of Google Search:
  • A: Anchors – inbound link text pointing to a page.
  • B: Body – on-page text and keyword content.
  • C: Clicks – user behavior data (e.g., dwell time before bouncing back to SERPs).
• These are still at the heart of how relevance is scored today.

3. Navboost
• A major internal project led by one of Google’s senior engineers (HJ), relying heavily on clickstream and engagement data.
• This suggests behavioral signals play a much bigger role in rankings than Google publicly admits.

4. Topicality Score (T)
• Google uses a metric that combines the ABC signals to calculate how relevant a document is to a query.
• This replaced older IR models and was in development for years until stabilizing around five years ago.
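To make the “hand-crafted” idea concrete, here is a toy sketch of what hand-tuned sigmoid curves over A/B/C-style signals could look like. Everything in it is invented for illustration (the midpoints, steepness values, and weights are not Google’s), but it shows why choosing those numbers by hand amounts to editorial judgment:

```python
import math

def sigmoid(x: float, midpoint: float, steepness: float) -> float:
    """Squash a raw signal into (0, 1); midpoint and steepness are hand-picked."""
    return 1.0 / (1.0 + math.exp(-steepness * (x - midpoint)))

def topicality(anchors: float, body: float, clicks: float) -> float:
    """Toy 'T'-style score blending A, B, C signals with hand-tuned curves."""
    a = sigmoid(anchors, midpoint=10, steepness=0.3)   # inbound anchor matches
    b = sigmoid(body, midpoint=5, steepness=0.8)       # on-page term relevance
    c = sigmoid(clicks, midpoint=0.4, steepness=12)    # long-click rate
    return 0.3 * a + 0.3 * b + 0.4 * c                 # weights chosen by hand

# A page with strong engagement can outrank one with far more links:
strong_clicks = topicality(anchors=50, body=20, clicks=0.9)
weak_clicks = topicality(anchors=100, body=20, clicks=0.1)
```

Nudge any midpoint or weight and the ordering of results can flip, which is exactly why "plotting sigmoid curves and manually selecting thresholds" is human curation rather than neutral machinery.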

Why This Matters:

These insights reaffirm what many SEOs have long suspected:
• Rankings are not purely algorithmic; human hands are deeply involved in shaping outcomes.
• Click and user engagement data are core ranking factors, not peripheral ones.
• The algorithm still hinges on classic SEO components: links, content, and user interaction.

As a personal note, the “hand-crafted” nature of signals makes it even more disheartening to see content creators being sidelined in the zero-click, AI-summarized era. When judgment calls govern so much of what ranks, creators deserve more transparency and support—not obsolescence.

Full discussion and source via Seroundtable:
[seroundtable.com...]

Whitey

9:01 am on May 14, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Why “Hand-Crafted” Signals Matter to the DOJ Case Against Google

An important addition to the above.

The DOJ antitrust trial against Google isn’t just about market share—it’s about how Google maintains that dominance. The recent revelation that many of Google’s ranking signals are hand-crafted (i.e., manually designed and tuned by engineers) could be a major turning point in the case. Here’s why:

1. Destroys the “Algorithmic Neutrality” Argument

Google has long insisted that its search results reflect objective relevance via machine learning. But if engineers manually tweak signal weights and thresholds, then the system isn’t neutral—it’s curated. This makes it harder for Google to argue it’s simply “letting the algorithm decide.”

2. Suggests the Ability to Favor or Demote at Will

Hand-crafting signals means Google has the ability to:
• Favor its own verticals (Flights, Hotels, YouTube)
• Suppress competing sites
• Adjust rankings in ways that benefit its commercial goals

That’s editorial control, not just search optimization.

3. Validates Complaints from Publishers and Competitors

If you’ve ever lost rankings or seen inexplicable traffic drops, this might explain why. Google’s manual control gives it the power to shape entire sectors, and publishers have had no transparency or recourse.

4. Removes the “Black Box” Excuse

In the past, Google’s fallback has often been: “It’s too complex to explain.” But hand-crafted systems mean they do know how and why results are ranked. That opens the door for legal accountability.

Bottom Line:

This goes to the heart of the DOJ’s case. It’s not just about scale or success—it’s about how Google preserves dominance through opaque, human-shaped ranking logic, potentially stifling competition in the process.

Does this change how you view Google’s search neutrality?

Source: [seroundtable.com...]

tangor

7:33 pm on May 14, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Sad thing is that most of us knew all of the above from simple experience. SOMETHING had to be making changes to the "algo", and if it wasn't the algo, it had to be HUMANS.

As far as I know machines and code do NOT know what GREED and POWER is...

Whitey

9:49 pm on May 14, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Tangor – Exactly. And that’s the sting, isn’t it?

It doesn’t matter how good your content is, how much effort you put in, or how well you align with supposed “best practices.” If manual overrides or hand-crafted signals can tip the scale, then the playing field was never level.

The myth of algorithmic meritocracy gave us something to work toward—but if behind the curtain it’s human hands tuning dials based on opaque priorities, then trust in the system is shot.

No wonder so many publishers feel gaslit.

Why Would Google Hand-Craft Ranking Signals? And Is It Legal?

Commercial Purpose?

Simple—money and control. Hand-crafted signals let Google:

• Prioritize its own properties (Flights, Hotels, YouTube, etc.)
• Increase reliance on paid ads by subtly downranking organic results
• React quickly to competitive threats
• Manage brand risk under the guise of “quality”

Is It Legal?

Maybe. But that’s exactly what the DOJ antitrust case is testing.
If Google’s manual tuning is proven to:
• Harm competition (not just competitors),
• Mislead users and publishers about “neutral algorithms,”
• Or unfairly entrench its dominance,

Then it could breach Section 2 of the Sherman Act.

Bottom line:

It’s not about whether the algorithm is smart—it’s about whether Google uses it to lock down the market. And hand-crafted signals make that a lot easier to prove.

Not a good look, G.

cnvi

10:14 pm on May 14, 2025 (gmt 0)

10+ Year Member Top Contributors Of The Month



@tangor…. well stated in just a few words.

Whitey

11:10 am on May 15, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Manage brand risk under the guise of “quality”

The relationship between ad spend, brand strength, and organic search visibility has never been purely coincidental. Since the mid-2000s, it’s been evident that while Google’s technical teams may operate in silos, senior management must understand the economic interplay between paid and organic channels, and how that dynamic reinforces their broader market position.

With roughly 75% of Google’s $350 billion in annual revenue driven by advertising, major brands participate in a closed-loop system: invest heavily in ads, bolster brand signals, and benefit from enhanced organic visibility, framed publicly as a function of “quality.”

This creates a structural bias favoring the largest ad spenders, enabling Google to entrench its monopoly while excluding smaller players from meaningful visibility. It’s not just self-interest, it’s systematized anti-competition under the veneer of algorithmic neutrality.

Now you have AI overviews in the mix reducing competition further.

Kendo

4:35 am on May 16, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



75% of annual revenue driven by advertising

Some say that ads pay but most just contribute regardless.

- Many are scared that they will be ignored if not contributing to Adwords
- Management courses recommend a percentage of turnover spent on advertising

I have a client who imports goods and wholesales nationally. Part of his agreement for exclusivity was that he spend $x on Google Ads. I told him that he would be wasting his money, but that was a condition of his contract. Even though he created a new website just for that one product that made first page of search results overnight and cost more than the $x, it was not accepted as contra.

His sales come from regular wholesale clients, the retailers. Most of what he sells he has exclusive dealership for. I have seen the stats... what he has to spend on advertising is an absolute waste.

Whitey

6:13 am on May 16, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have seen the stats... what he has to spend on advertising is an absolute waste.

The big players operating at scale have cross channel / tech / brand advantages and feed the ecosystem with Google. One player has managed to build a massive “walled garden” away from Google so that they are no longer seen as a top 10 player in the vertical I follow. But they are the biggest globally.

There are lessons for smaller players here, I feel.

londrum

8:03 am on May 16, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I wouldn't be surprised if we all wake up one morning to find Google is suddenly on the brink. Half their numbers seem like smoke and mirrors. They've been making their SERPs so bad that people have to search over and over to find what they want; they get funnelled back to another search page through links in AIO, and now they're even running their own ads to more search pages. And all this gets reported as users doing more searches.

Kendo

1:14 pm on May 16, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I cancelled my ad account long ago, but recently got offered $600 of free ads if I spend $600... seriously?

Whitey

9:45 pm on May 30, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



TL;DR – Judge zeros in on Google’s AI monopoly risk
(Also relevant to the ongoing scraping/copyright debate)

As the U.S. antitrust trial against Google closes, Judge Amit Mehta is turning the focus to AI: can AI enable new competitors, or will it entrench Google further?

The DOJ proposes serious remedies:

• Break off Chrome

• End Apple-style default search deals

• Restrict Google’s AI expansion

• Force Google to share search data with rivals

Google says it’s overreach, claiming it would hurt U.S. national security and that OpenAI etc. are already valid competition.

But the judge wants real, forward-looking safeguards, especially as AI becomes central to search visibility, rankings, and content usage (sound familiar?).

This ties directly to what we’ve been discussing in the scraping/copyright threads: [webmasterworld.com...]

AI-powered search is consolidating power while extracting value from publishers and creators.

Final ruling expected by August.
Source: NYPost [nypost.com...]

tangor

10:57 pm on May 30, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Final ruling expected by August.


</appropriate="Dog Days of Summer...">

Whitey

11:40 pm on Jun 4, 2025 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



</appropriate="Dog Days of Summer...">

Meanwhile the pursuit of AI search domination continues, unabated. Lobster on lobster; cyclical consumption.

I wonder what the Judge thinks about this domination coming in AI with Google’s ubiquitous and monopolistic presence.

'For the next two or three years, both those modes are going to be growing and necessary. We plan to dominate both,' says Demis Hassabis, CEO of DeepMind.
[searchengineland.com...]