Since the internet went mainstream, roughly two and a half decades ago, any company wanting a meaningful online presence has had to worry about its positioning in search engines. That has not changed at all, and until now it has meant worrying about appearing on Google and, if possible, prominently. That appearing on the first page of search results generates business is beyond all discussion.
Beyond giving a page minimum technical sufficiency (among other things, because search engines don't want to send traffic to broken, insecure or malfunctioning pages) or paying to appear in good positions when a user searches for services like yours, there is SEO (Search Engine Optimization): a professional specialty organized around the set of techniques and best practices applied to a web page so that it appears higher in search engine results.
SEO is largely about providing your website with clear signals (technical optimization, user experience, appropriate and well-structured content…) so that Google and other engines understand what your page offers and conclude that it is worth recommending to people searching for related topics. For businesses, these "topics" are mainly "services or products". Around this goal, improving the organic positioning of web pages, a solid and consolidated industry has grown.
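One concrete example of such a "clear signal" is schema.org structured data, which SEO practice commonly embeds in pages as JSON-LD so that engines can parse what a business offers. The sketch below is purely illustrative and not from the article: the business name, URL and service type are hypothetical placeholders, and the field choices simply follow the public schema.org vocabulary.

```python
import json


def jsonld_local_business(name: str, url: str, service: str) -> str:
    """Build a minimal schema.org JSON-LD block describing a business.

    The vocabulary (@context, LocalBusiness, Offer, Service) comes from
    schema.org; the concrete values passed in are hypothetical examples.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "url": url,
        "makesOffer": {
            "@type": "Offer",
            "itemOffered": {"@type": "Service", "name": service},
        },
    }
    # Pages typically embed this string inside a
    # <script type="application/ld+json"> tag in the HTML head.
    return json.dumps(data, indent=2)


if __name__ == "__main__":
    print(jsonld_local_business("Example Co", "https://example.com", "Web design"))
```

Markup like this does not guarantee rankings; it is simply one machine-readable way of telling an engine (or, increasingly, an LLM crawler) what a page is about.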
It's no secret that LLMs are altering the way users search for information online, and much has been written about this in recent months. SEO is evolving very rapidly, and the emergence of generative artificial intelligence is transforming how search engines understand, select and present the information their users request.
For a long time, internet users would go to a search engine and receive lists of links ordered by relevance. That world still exists, but the popularization of LLMs, with their fluid and conversational interaction, is causing a seismic shift that is becoming perceptible to practically anyone paying a little attention. As always happens, catastrophist voices emerge announcing the death of SEO. How much truth is there in this?
Are LLMs so different from search engines like Google or Bing?
Google's multimillion-dollar business rests on a very basic fact: for decades, its search engine is where we have been able to find quickly, reliably and precisely what we were looking for, from a plane ticket to a technical article or an industrial supplier, covering everything one can imagine. Search engines made themselves useful by understanding what users expect to find when we type a question or a word, navigating monstrously large data sets and showing us relevant results in a matter of seconds. In short: we have used search engines because they were essential, worked very well and saved us an enormous amount of time.
SEO has shaped much of what is written and published on digital channels. It is all too common for the texts we read to be designed more to stimulate algorithms than to provide any kind of knowledge or value to us humans. If you have some appreciation for language and your eyes often hurt when reading content on the web, rest assured you are not alone; many of us experience exactly the same thing, and SEO could fairly be blamed for it. This is anecdotal to a certain extent, but it is clear evidence of the importance SEO has had during all this time.
Some say the SEO era has ended and the moment of GEO (Generative Engine Optimization) has arrived. This point of view is worth knowing, both because it contains weighty arguments and because it will gain relevance over time. As our hours working with LLMs grow, going to a search engine to find what we're looking for becomes a less natural action, one doomed to decline, mainly because conversational search is far more practical and efficient in the vast majority of cases. With ChatGPT reporting more than 700 million weekly active users at the time of writing, the all-powerful Google, Gemini aside, is moving towards direct answers and gradually away from providing lists of links. Those lists still appear and may never disappear entirely, but direct answers generated by artificial intelligence are increasingly common, and the trend towards this model is clear.
To grasp how much hope is being placed in the market strength LLMs will have, one only needs to look at recent funding rounds in the artificial intelligence ecosystem, such as the latest one by Anthropic, which reached a valuation of $183 billion. And this for a company with current revenue of "only" $5 billion a year. Sometimes it is hard to know whether we are facing a bubble bigger than the dot-com one or witnessing the birth of the great technology giants of the future, but it is clear that these valuations either price in the prospect of winning a large portion of the search engine "pie" or would be really difficult to justify.
The big question is: are LLMs so different from search engines?
The quick and easy answer would be yes, they are very different. A more elaborate answer leads us to think that if LLMs are to replace search engines, or a good part of their market share, they must prove useful for finding quickly, precisely and reliably what their users ask of them, and must therefore rely on algorithms similar to those used by search engines themselves. So the SEO battle has not ended; it is partially moving to the terrain of artificial intelligence models. And it is not such a different battle: it is about becoming a reliable reference for these models and having them suggest you as a source or potential provider.
The industry dedicated to making web pages well understood by algorithms therefore has plenty of life left in it. Practices change (as they do whenever a Google Core Update occurs), but the needs that created the SEO industry remain. It will consequently evolve to keep meeting those needs, rebalancing its activities as technology evolves; today, that means giving ever greater weight to optimization understood as being attractive to LLMs.
Is there less traffic on the horizon?
Faced with an informational query, an LLM may cite you, but it is quite possible that this never triggers a visit to your web page: the user can learn almost everything they need to know about you without leaving the interface of their trusted LLM. That is why the new challenge many companies face is being attractive enough to be cited by artificial intelligence. This is a substantial change that opens a transition in which more traditional SEO will coexist with the practices of what has been baptized GEO. The truth is that the two are very similar, will coexist in harmony and are merging into one thing. What is happening, and will continue to happen, is that SEO professionals will incorporate GEO practices into their catalogue of capabilities. Clickbait aside, it is the natural evolution of things.
In the heat of these changes in the positioning industry, companies and analytics tools have begun to appear that promise to help us get cited by LLMs. One of the first problems the artificial intelligence ecosystem will have to face is precisely the overabundance of questionable-quality content (text, images, video, music…) that it is itself generating. Google has been dealing with this same problem for some time, precisely to protect its main asset: being a source of quality references (links) that help its users perform all kinds of tasks or simply satisfy their curiosity.
It's true: the future involves less human traffic to web pages, replaced by synthetic traffic from LLMs that will analyze us and decide whether our pages deserve to be cited in searches taking place within their interfaces. Once that first filter is passed, human traffic towards our pages may well follow. The absolute volume of traffic may not decline, but most of it will be "synthetic traffic" intended to assess the credibility of pages, and we can also expect models to be better at this task than search engines have been so far.
Entrepreneur and professional with experience in sectors such as digital agencies, corporate communications, the music industry and public administration. Specialist in organizations and business development. Focused on understanding and using digital technologies.