Rolling out Google Generative Search Experience
Digital marketing (and the internet in general) has a dirty little secret. Energy. Or, to be more precise, its gargantuan thirst for energy. Even before the advent of artificial intelligence, Google had a voracious appetite for electricity. It uses country-sized quantities of energy each year.

Yes, we're facing unprecedented impacts as a result of climate change. Countries are having to go to massive lengths to curtail their energy consumption. Yet that online platform you use to search for cat memes and do the shopping is using immense amounts of electricity. As energy-industry commentator Robert Bryce has written, if Google were a country, its electricity use would rank among the top 90 countries in the world. What's more, the company's electricity use is doubling approximately every three years. And that's for Google as is.

However, we appreciate we're talking in generalities here. How about some context? According to Statista, Alphabet (Google's parent company) used 15.4 TWh (terawatt-hours) of energy in 2020. That is the same as, or more than, the annual energy consumption of each of the following countries:
- Sri Lanka.
- Angola.
- Slovenia.
- Uruguay.
- Lithuania.
- Costa Rica.
- Estonia.
- Albania.
- Luxembourg.
- Jamaica.
- Malta.
The energy consumption of artificial intelligence
Okay, now we get to the crux of my article. In fact, if there's only one part of this article you're going to remember (or screenshot and laugh about on Twitter or LinkedIn (although other social networks are available)), it's this part.

A clever chap by the name of Alex de Vries has crunched the numbers on integrating AI into Google Search. His paper, The growing energy footprint of artificial intelligence, published in Joule, finds that both the training and inference stages of the LLMs (Large Language Models) that underpin AI chatbots such as ChatGPT use vast amounts of energy.

How much? It's difficult to gain a 100% accurate picture, because the tech companies behind many of the latest AI innovations are being decidedly coy (aka, not transparent in the slightest) about their energy usage. Some clever bods have done some digging, though, and have come up with some realistic-looking figures.

Let's first consider the 'training stage' of an LLM. GPT-3 reportedly consumed 1,287 MWh of electricity during its training stage.

Unit of measure equivalents for electricity:
| Unit | Equivalent |
| --- | --- |
| Kilowatt (kW) | 1,000 (one thousand) watts |
| Megawatt (MW) | 1,000,000 (one million) watts |
| Gigawatt (GW) | 1,000,000,000 (one billion) watts |
| Terawatt (TW) | 1,000,000,000,000 (one trillion) watts |
| Kilowatt-hour (kWh) | 1,000 (one thousand) watt-hours |
| Megawatt-hour (MWh) | 1,000,000 (one million) watt-hours |
| Gigawatt-hour (GWh) | 1,000,000,000 (one billion) watt-hours |
| Terawatt-hour (TWh) | 1,000,000,000,000 (one trillion) watt-hours |
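To put the table to work, here's a quick sketch converting that reported 1,287 MWh training figure into other units. The household comparison at the end is my own illustrative assumption (roughly 2,700 kWh per UK household per year), not a figure from the article:

```python
# Unit arithmetic using the equivalences in the table above.
KWH_PER_MWH = 1_000
MWH_PER_GWH = 1_000

training_mwh = 1_287  # GPT-3's reported training consumption

training_kwh = training_mwh * KWH_PER_MWH  # 1,287,000 kWh
training_gwh = training_mwh / MWH_PER_GWH  # ~1.29 GWh

print(f"{training_kwh:,} kWh (~{training_gwh:.2f} GWh)")

# Illustrative assumption: a typical UK household uses ~2,700 kWh/year.
households = training_kwh / 2_700
print(f"~{households:,.0f} UK households' annual electricity use")
```

In other words, one training run sits in the same ballpark as the yearly electricity use of a small town.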
BuT wHaT AbOuT EnErGy EfFiCiEnCy?!
Of course, the natural answer to this is energy efficiency. "If we just make LLMs more energy efficient, then that'll solve the problem," I can hear the AI advocates shout. And, to an extent, they're correct. It's become almost a law of nature that any technology will become more energy efficient over time. And, indeed, Google is busily working away on this problem. As a recently filed patent revealed, Google is attempting to allocate computing resources more efficiently when training an LLM. Given the stakes (and the sheer sums of brainpower and money involved), I suspect Google will succeed in reducing the energy consumption of the LLMs underpinning its Search Generative Experience.

But even if Google does succeed in making LLMs more energy efficient, it will face an issue first summarised by William Stanley Jevons in 1865. Born right here in Liverpool, Jevons was tasked by the government of the day with investigating Britain's reliance on coal (which at the time was the country's primary energy source). His findings had implications which continue to resonate today. In short, he concluded that increased energy efficiency inexorably leads to an overall increase in energy consumption. Sounds counterintuitive, right? Well, that's why it's a paradox…

To make things clearer, let's take a look at a tangible example. Imagine you have an indoor tomato farm. You have an acre of space which requires heating in order to grow the tomatoes. It costs you £100 per day in electricity to provide that heat. Suddenly, a clever inventor discovers a way to make heating technology that is 50% more efficient. Brilliant! It now only costs you £50 per day to heat that acre of tomatoes.

The thing is… what do you do with that extra £50 that's now burning a hole in your pocket? Well, thanks mainly to the vagaries of capitalism, it makes sense to grow two acres for £100. You've just cancelled out that gain in energy efficiency. You can see this in action in the digital space right now.
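The tomato-farm arithmetic works out as a tiny rebound-effect calculation (all numbers taken from the example above):

```python
# Jevons-paradox toy model: the tomato farm from the example above.
daily_budget = 100.0          # £ the farmer spends on heating per day
cost_per_acre_before = 100.0  # £/day to heat one acre
efficiency_gain = 0.50        # heating becomes 50% more efficient

cost_per_acre_after = cost_per_acre_before * (1 - efficiency_gain)  # £50/day
acres_heated = daily_budget / cost_per_acre_after                   # 2 acres

# Per-acre energy halved, but acreage doubled: total energy use is
# unchanged, so the efficiency gain is fully cancelled out.
relative_energy_use = acres_heated * (1 - efficiency_gain)
print(acres_heated, relative_energy_use)  # 2.0 acres, 1.0x the energy
```

And that's the benign case: if demand grows faster than the efficiency gain (more than two acres' worth of appetite), total energy use actually rises.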
Consider the slew of new apps and devices that 'feature AI'. Surely, it won't be long before we're all using AI-powered petrol pumps and LLM-equipped refrigerators. Sure, LLMs may become less energy intensive in themselves, but, as per old Jevons, the total number of LLMs (and their concomitant energy use) is undoubtedly set to skyrocket.

Paying for the Generative Search Experience
Okay, so it's my contention that if Google wants to roll out its Generative Search Experience in any meaningful way, it's going to incur significant energy costs. According to estimates by global investment firm Morgan Stanley, 'Google's 3.3 trillion search queries last year (2022) cost roughly a fifth of a cent each'. What does that number look like if an AI-powered search experience rolls out? Again, back to Morgan Stanley: 'Google… could face a $6 billion hike in expenses by 2024 if ChatGPT-like AI were to handle half the queries it receives with 50-word answers' (emphasis mine).

However, the fun doesn't end there. All of those AI-powered searches won't just require gobs of electricity, but costly hardware, too. As per de Vries, 'SemiAnalysis estimated that implementing AI similar to ChatGPT in each Google search would require 512,821 of NVIDIA's A100 HGX servers, totalling 4,102,568 GPUs'. I'm sure you don't need me to point out that that's a bleedin' enormous number of servers. Oh, and the cost of all these servers? A total investment cost of $100 billion. When you consider that Google Search generated revenues (that's revenues, not profits) of $162.5 billion in 2022, it's clear that the sums don't exactly add up.

Bear in mind that the aforementioned $100 billion isn't a 'one-time' cost, either. Servers are subject to the same laws of entropy as everything else, and thus will need replacing over time. Factor in the cost of all this hardware, and its exorbitant electricity usage, and, as John Hennessy (chairman of Google's parent company Alphabet) admitted to Reuters, AI-powered search could "cost 10 times more" than using traditional search tools (a.k.a. traditional Google). On the face of it, Google is dangling its Generative Search Experience in front of an eager audience of digital marketing professionals, but may not actually be able to deliver it in any meaningful way (at least not on an all-query, search-wide basis).

The embodied energy problem
So, what have we learnt so far? That:
- AI uses enormous amounts of energy, both in its training and inference phases.
- Thanks to Jevons Paradox (and the frankly cosmic ambitions of AI hucksters), any breakthroughs in AI energy efficiency are likely to be cancelled out by the increased utilisation and deployment of AI.
- The cost of sufficiently powerful hardware to run an AI-powered search is enormous. So much so, that it may be unfeasible for Google to roll out its Generative Search Experience in any meaningful way.
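As a sanity check on that last point, the headline figures quoted earlier hang together under some quick back-of-envelope arithmetic. Every input below is a number quoted in the article; the 8-GPUs-per-server ratio simply falls out of the two quoted totals:

```python
# Back-of-envelope check on the cost figures quoted above.
queries_2022 = 3.3e12   # Google searches in 2022 (Morgan Stanley)
cost_per_query = 0.002  # 'roughly a fifth of a cent', in dollars

implied_search_cost = queries_2022 * cost_per_query
print(f"Implied annual cost of serving search: ~${implied_search_cost / 1e9:.1f}bn")  # ~$6.6bn

servers = 512_821    # NVIDIA A100 HGX servers (SemiAnalysis, via de Vries)
gpus = 4_102_568     # total GPUs quoted alongside
print(f"GPUs per server: {gpus / servers:.0f}")  # 8

hardware_cost = 100e9           # quoted total hardware investment
search_revenue_2022 = 162.5e9   # Google Search revenue, 2022
print(f"Hardware bill as a share of a year's Search revenue: "
      f"{hardware_cost / search_revenue_2022:.0%}")  # 62%
```

A one-off hardware bill equal to roughly 62% of a full year's Search revenue (before any electricity or replacement costs) is why the sums, as the article puts it, don't exactly add up.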
