Forum Moderators: Robert Charlton & goodroi


AI Overviews in Search rolls out around the world

         

Whitey

2:34 am on Oct 29, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's rolling out.

[blog.google...]

My initial thoughts on results I saw. Clumsy. But it’s early days and I can see where it’s heading.

blog.google
AI Overviews in Search are coming to more places around the world
AI Overviews in Search are expanding to 100+ countries and multiple languages, bringing monthly users to over 1 billion.

seokees

9:18 am on Oct 29, 2024 (gmt 0)

Top Contributors Of The Month



Great news! Good job, Google (AI, a surefire deal to run your company down in hopes of short-term gains).

RubicCubed

10:34 am on Oct 29, 2024 (gmt 0)

Top Contributors Of The Month



Others around the world will finally understand what it's like in the USA to lose 80%+ of your organic traffic to AI Overviews. That old saying applies: misery loves company.

engine

11:53 am on Oct 29, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



It's coming, so let's embrace it and see how we can use it to our advantage.

superclown2

12:34 pm on Oct 29, 2024 (gmt 0)



let's embrace it and see how we can use it to our advantage.


I'd love to - but any ideas on how?

saladtosser

12:35 pm on Oct 29, 2024 (gmt 0)

5+ Year Member Top Contributors Of The Month



I genuinely don’t see the current situation with Google, Meta, and other major USA corporations being sustainable. A small group of U.S.-based companies has effectively monopolised global digital advertising, siphoning billions in revenue from local markets worldwide and concentrating it in the United States. This shift not only undermines native publishers, who are essential to local voices and diversity, but also drains GDP from countries globally, benefiting only U.S. GDP in return. This imbalance isn’t just unfair; it’s destabilising and likely to implode as local economies and industries reach a breaking point. The dominance of a few companies at the expense of the global digital ecosystem is a short-sighted strategy bound to provoke backlash.

Whitey

12:45 pm on Oct 29, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



EU countries have been excluded. I see @rustybrick commented that he thinks rules around AI may be a factor.

[seroundtable.com...]

I'm not sure, but I did some digging in ChatGPT and it came back with this:

"Google has excluded the EU from its AI overviews rollout, likely due to strict data privacy and regulatory hurdles:

1. **GDPR Compliance**: The EU's GDPR requires strict controls on data use, especially for personalization, which Google AI relies on.

2. **Digital Services Act (DSA)**: This new law mandates accountability and transparency for digital platforms, making compliance complex for AI services.

3. **AI Act**: The EU is also preparing an AI regulatory framework, and Google might be waiting for clear guidelines before launching. ref: [artificialintelligenceact.eu...]

4. **Local Privacy Concerns**: Some EU countries are cautious about data use, so Google may want more time to prepare an EU-compliant version.

Essentially, Google seems to be taking a cautious approach to avoid regulatory challenges."

Let's see if there are any reports out there supporting this.

My personal focus is on how to embrace this as @engine mentions.

www.seroundtable.com
Google AI Overviews Now Available In Over 100 Countries For More Than 1 Billion Users
Google announced it is rolling out AI Overviews in Google Search to over 100 more countries, making it available to over one billion users and in all supported languages. This is the biggest expansion of AI Overviews yet, which was previously available in seven countries by default (without opting into it).

superclown2

1:57 pm on Oct 29, 2024 (gmt 0)



EU countries have been excluded.

I'm in the UK and hope this is correct (though, believe me, I do have great sympathy for those affected by this latest Google thievery of our content).

I did notice an example, though, when I was logged into a Google account. The article was lifted straight from one of my sites and obviously spun, and the link back to it was virtually invisible. I have to treat the claim that this is a benefit to website owners the same way I treat everything else these people claim.

In the meantime, even without Overviews, traffic to a multitude of sites I manage has been slashed over the past few months to a fraction of what came in previously.

engine

5:22 pm on Oct 29, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



@superclown2
>I'm in the UK and hope this is correct
The exclusion only applies to the E.U.; the UK is included in the expansion.

>I'd love to - but any ideas on how?

I'm experimenting now, and that's the way. Try to weave in unique material so that it gets picked up by the bot and then presented. These experiments take time to carry out. If everybody tries, we'll see results quicker.

Whitey

11:16 pm on Oct 29, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm experimenting now, and that's the way. Try to weave in unique material so that it gets picked up by the bot and then presented. These experiments take time to carry out. If everybody tries, we'll see results quicker.

Structured and unique data will help visibility, but it's no guarantee (I've seen very good sites expunged by Google during the 2023-24 period).

One thing I think I'm seeing is that AI cannot handle live data, news excepted. I mean, things like pricing and products that shift around. It's more about lifting static data and repurposing it into a layer that Google is trying to masquerade as its own and possess as its own asset.

AI at the search-engine level cannot handle the power required to do that at a global, site-by-site scale, IMO. This has to be done at the site level, and the best that can be done is to send Google live structured data to set off signals for Google to respond to.

This is why the sample Overview results I looked at seemed very clumsy and "boring", and why they will have trouble competing for information against the big brands with live, personalized sites that use AI themselves. But it's early days and my observations are limited.

Focusing SEO on bridging the gap between static and live data will be key. Also, outlying offsite data that creates value cannot be ignored by Google AI, yet it is not necessarily picked up by Google and combined into Overviews at this stage, although this could change in the future.

No doubt Google will insert advertising into that gap in Overviews. But remember, Google and marketers are all BS artists who will muddy the information to force greater engagement, then provide an illusion of differentiation to lift revenues. Search is about "searching" by definition, and finding what you want is going to be about search engines supporting engagement through less-than-perfect answers that prompt comparison searches.

Google Overviews, Copilot, SearchGPT, Perplexity, etc. ... they're all in the same game with monetization motives whilst they compete for eyeballs.

It will be interesting to see how the EU handles AI. Businesses and consumers need access to the internet, and there has to be recognition of Google throttling competition and access to choice through limited search-engine controls. Google's navigation of this dilemma is equally interesting. Roll on SearchGPT and the others.

NickMNS

11:46 pm on Oct 29, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



AI at a search engine level cannot handle the power required to do that at a global site by site scale IMO.

This isn't true. In fact very little additional work is required to include live data in AI search results. This assumes that Google is using semantic or vector search to provide search results, which is almost certain given that Google started down that road with Hummingbird more than a decade ago. AI search works using a RAG (Retrieval-Augmented Generation) system, where one selects the most relevant and up-to-date data (drawn from recently crawled results) and feeds it to the LLM along with the query, so the LLM can use the new data to formulate its answer. I'm currently working on a RAG system myself. It is pretty amazing stuff. If I can do it, then Google et al, with their hordes of engineers, certainly can too.
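For anyone who hasn't worked with one, the retrieve-then-generate loop I'm describing can be sketched in a few lines. This is a toy, assuming a bag-of-words stand-in for a real embedding model, and the final LLM call is left as a comment; the sample documents and names are mine, not anything from Google:

```python
import math
from collections import Counter

def embed(text):
    # Toy embedding: bag-of-words token counts. A real system uses a
    # trained embedding model, but the retrieval logic is identical.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    # The corpus is embedded once at crawl time; only the query is
    # embedded per search, which is why the marginal cost is small.
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, d["vec"]), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    # Retrieved passages are stuffed into the prompt so the LLM answers
    # from fresh data instead of its (stale) training set.
    context = "\n".join(d["text"] for d in docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    {"text": "Widget price today is $9.99 after the sale."},
    {"text": "The company was founded in 1998."},
    {"text": "Shipping takes 3-5 business days."},
]
for d in corpus:
    d["vec"] = embed(d["text"])  # done once, at index time

top = retrieve("what is the widget price today", corpus)
prompt = build_prompt("what is the widget price today", top)
# `prompt` would now be sent to the LLM for generation.
```

Note that nothing here retrains the model; freshness comes entirely from what gets retrieved.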

Whitey

12:08 am on Oct 30, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This isn't true. In fact very little additional work is required to include live data in AI search results.

Thanks for challenging this statement and providing an informed view. Appreciated.

I picked up on a remark by Eric Schmidt recently where he said that the US should be friends with Canada, as it has sufficient hydroelectric power to run the systems required to underpin AI. He also used a throwaway figure of $300 billion as the required funds. This was the trigger for my remarks, although it's great to have it challenged.

Eric Schmidt's comment on the projected cost of global AI infrastructure highlights an important consideration: scalability and the enormous resource demand for running AI at a truly global level. Schmidt's figure of $300 billion likely considers the massive computing power, energy, and storage needs to sustain an advanced AI-driven search engine across billions of daily queries and interactions worldwide. It implies that even if AI models are technically capable, there are real-world challenges in ensuring the infrastructure is reliable, efficient, and scalable without prohibitive costs.

He did mention something about China being 10 years behind, and that Europe was in a quagmire of legality that would slow advancement.

I did some digging with ChatGPT in response to your reply, and it said:

"It's true that advancements in RAG (Retrieval-Augmented Generation) systems make real-time, live data integration feasible by pulling the most relevant information and feeding it into large language models (LLMs). This approach indeed bypasses the need to constantly retrain the LLM itself on vast data sets, which would be computationally unsustainable. Instead, RAG relies on selectively retrieving and combining the latest data, which is computationally less intensive than processing everything from scratch.

However, achieving the global scale that Google or other major engines require would necessitate a highly optimized and robust infrastructure to process billions of live retrievals simultaneously.

So while a RAG system theoretically enables more real-time answers, the costs and logistical challenges of deploying it globally at the level Schmidt envisions are substantial."


Then there's the issue of websites denying access to sensitive data that doesn't fit their commercial objectives, e.g. personalization and dynamic pricing, leaving Google AI with an inferior UX.

My thoughts are that significant opportunities exist in offering the personalization that Google AI Overviews may lack (more inputs needed): focusing on user-centric, niche, and localized content, and on authority and trust in specialized, real-time content.

On balance, I think Google will not be able, or allowed, to penetrate this data at scale, which could make for competitive advantages for site owners able to exploit these opportunities.

@NickMNS / others - how do you see this? (ChatGPT - garbage in, garbage out - maybe I overlooked something in my limited understanding.)

NickMNS

1:23 pm on Oct 30, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The massive energy consumption you reference is required for the training and fine-tuning of the LLMs (i.e. Gemini, GPT-4, etc.), as well as the further retraining and refinement of those models. The other consumer of energy is the computation required to embed, or vectorize, the crawled content. But the assumption I am making is that the content is being vectorized regardless, as some form of vector search is already in use for traditional search results.

One can't use LLMs alone for search because of the time and energy required to retrain/fine-tune the models; it is impossible to keep up to date with current events at that scale. So instead some form of RAG is used, and since you already have all the data vectorized, the marginal cost in terms of time, money, and energy is likely small to negligible. As to the quality of the results, it's as you said: GIGO.
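To make the "marginal cost" point concrete, here's a rough sketch of why adding freshly crawled content to an already-vectorized index is cheap: only the new document gets embedded, and nothing about the model is retrained. The hashed bag-of-words embedding and the URLs are stand-ins of my own, not how Google actually does it:

```python
import hashlib

DIM = 64  # fixed vector size for the toy index

def embed(text):
    # Hashed bag-of-words into a fixed-size vector (a stand-in for a
    # real embedding model). Cost is O(number of tokens) per document.
    vec = [0.0] * DIM
    for tok in text.lower().split():
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)
        vec[h % DIM] += 1.0
    return vec

index = {}  # url -> vector, built once at crawl time

def add_document(url, text):
    # Incremental update: embed only the new page. The rest of the
    # index, and the LLM itself, are untouched.
    index[url] = embed(text)

add_document("example.com/old-page", "static evergreen article")
add_document("example.com/breaking", "breaking news published minutes ago")
```

So a page published minutes ago is searchable as soon as it's crawled and embedded, which is exactly why retraining isn't the bottleneck people assume.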

NickMNS

1:32 pm on Oct 30, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My thoughts are that significant opportunities exist by engaging personalization that Google AI Overviews may lack (more inputs needed); focusing on user-centric, niche, and localized content, authority and trust in specialized, real-time content.

This is exactly correct; there is significant opportunity. IMO it lies in applying LLMs to private data: for example, enabling a company's employees to query the company's own proprietary data in a way that keeps the data private. You wouldn't want your employees sending company secrets to Google Gemini so that Google could then use those secrets to sell ads on behalf of your competitors.

Whitey

10:12 pm on Oct 30, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This is exactly correct, there is significant opportunity. IMO it is applying LLM's to private data, for example


I put together a summary list, using a ChatGPT prompt, that small businesses and bloggers could reflect on over here: [webmasterworld.com...].

It's coming, so let's embrace it and see how we can use it to our advantage.


Hopefully it will provide some motivation to get creative and respond to Google's AI Overviews with some new perspectives.