No. of Recommendations: 3
When future LLMs weave more narrowly targeted advertising into the LLM result, made possible by the user's query being more detailed than before, the auction rates for advertising will rise. It astonishes me that this point is never articulated.
You can almost draw a straight line from the first version of Google search to AI. Starting from word matching, early improvements included things like identifying likely misspellings ("Did you mean...") and predictive suggestions, and search has since moved on to understanding meaning and intent. For example, it understands that someone asking for a python tutorial is almost certainly asking about the programming language, not the snake. This also allows searches to be conducted in a conversational tone, which in turn means better results. As search improved, the ads became more relevant, hence higher rates, hence the reason Alphabet is basically printing money.
Using LLMs is just the logical next step in the progression. The better the system understands your question, the better the ad results become. If I'm not mistaken, Manlobbi made this point before LLMs were available to the public, using booking airline tickets as an example of how much better search could still become. That exact example is pretty close to being realized.