No. of Recommendations: 3
The multibillion-dollar advertising model that has underpinned the internet economy could “fall apart” due to the rise of generative artificial intelligence, according to the inventor of the World Wide Web.
Speaking at the FT Future of AI Summit in London yesterday, Sir Tim Berners-Lee warned that large language models (LLMs) might eventually replace humans in consuming the internet.
“If web pages are all read by LLMs, then people ask the LLM for the data and the LLM just produces the result, the whole ad-based business model of the web starts to fall apart,” he said.
This shift threatens the collapse of the decades-old advertising-based model that has made the likes of Google and Meta multitrillion-dollar businesses on the back of powerful ad networks.
“Advertising relies on people actually reading web pages . . . if they all assume that a human being is reading the web page, but the LLM is reading it and the human is not, then we have a problem.”
https://digitaleditionapp.ft.com/i9DX/hw952xwn SUBSCRIPTION REQUIRED
No. of Recommendations: 13
“Advertising relies on people actually reading web pages . . . if they all assume that a human being is reading the web page, but the LLM is reading it and the human is not, then we have a problem.”
Advertising does not rely on people actually reading web pages. Most advertising today occurs at the point of search (served by the search engine itself, not by the target website) rather than while people are reading a web page. The model he is thinking of was in popular use 20-30 years ago (banner ads within websites).
When future LLMs interlace more narrowly targeted advertising within the LLM result, owing to the user's query being more detailed than before, the auction rates for advertising will rise. It is astonishing to me that this point is never articulated.
Google will need to retain its center-of-attention position for mainstream search, and part of that requires making sure its LLM models are on par with the best. They don't have to be the very best at all times, but at least on par. Because they have so many other types of results integrated, and such tremendous momentum of formed search habits, I believe they will remain the center of search attention 10 years from today. With the higher auction rates, Google will be bringing in far more income than it does today.
- Manlobbi
No. of Recommendations: 0
Hi Manlobbi,
I think Berners-Lee does get it. He is not questioning the viability of the search-cum-LLM engines, but of the websites from which they extract content for free.
There will still be a lot of digital advertising, for a while longer. Except almost all of the revenue will be captured by the search and LLM engines.
If humans don’t click through and visit the websites from which the engines accessed the content for free, then those websites will eventually all die. A minuscule number of them may survive behind paywalls. Where will the engines go in future for new content?
Regulators may well have to step in and strengthen copyright laws to ensure content creators are compensated like songwriters are when their songs are played.
-rnam
No. of Recommendations: 3
When future LLMs interlace more narrowly targeted advertising within the LLM result, owing to the user's query being more detailed than before, the auction rates for advertising will rise. It is astonishing to me that this point is never articulated.
You can almost draw a straight line from the first version of Google search to AI. Starting from word matching, early improvements included identifying likely misspellings ("Did you mean..."), then predictive suggestions, and search has since moved on to understanding meaning and intent: for example, understanding that someone asking for a Python tutorial is almost certainly asking about the programming language, not the snake. This allows searches to be conducted in a conversational tone as well, which in turn means better results. As search improved, the ads became more relevant, hence higher rates, hence the reason Alphabet is basically printing money.
Using LLMs is just the logical next step in this progression. The better the engine understands your question, the better the ad results become. If I'm not mistaken, Manlobbi made this point before LLMs were available to the public, using booking airline tickets as an example of how much better search could still become. That exact example is pretty close to being realized.
No. of Recommendations: 13
If humans don’t click through and visit the websites from where the content was accessed for free by the engines, then the websites will eventually all die. A minuscule number of them may survive behind paywalls. Where will the engines go in future for new content?
We're already seeing a lot of that even without the death of the ad model.
I'm already sick of going to web sites, even from reputable, knowledgeable companies, and finding little other than AI slop as content. You land on a page about, say, tire compounds, and the first 200 words are a paragraph on "What is a tire?", while the last 200, under the "In Conclusion" section, conclude that you should buy tires.
Perhaps the LLM crawlers can discount a lot of that rather than fall into a crap-feedback death loop--maybe--but the broader death of "ordinary" content web sites is accelerating for more than one reason.
I used to mock those "enthusiast" pages with black backgrounds and 12 colours of blinking text and italics, and a page hit counter at the bottom, but now I miss them. They were written by humans who cared, some fraction of whom were actually knowledgeable.
Jim