Artificial Intelligence Power Consumption Could Match a Small Country’s by 2027

Amid global sustainability priorities, AI's high energy and resource demands raise questions about its future growth and acceptance.

Image: Baltic data center (Wikimedia)

A recent study estimates that the energy consumption of artificial intelligence could reach the level of a small country's within just a few years. By 2027, AI could be consuming between 85 and 134 terawatt-hours (TWh) of electricity annually. This alarming figure is contingent on sustained growth in interest in artificial intelligence, coupled with the continued availability of AI chips.
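The 85–134 TWh range can be roughly reconstructed from annual AI-server shipments running at full capacity. The sketch below is an assumption-laden back-of-the-envelope calculation based on public reporting about the study: the 1.5 million server figure and the per-server power draws are not stated in this article.

```python
# Back-of-the-envelope reproduction of the 2027 estimate (a sketch; the
# server count and per-server power figures below are assumptions drawn
# from public reporting about the study, not from this article).
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_twh(servers: int, kw_per_server: float) -> float:
    """Annual electricity use in terawatt-hours at full utilization."""
    kwh = servers * kw_per_server * HOURS_PER_YEAR
    return kwh / 1e9  # kWh -> TWh

SERVERS_2027 = 1_500_000  # assumed annual AI-server shipments by 2027

low = annual_twh(SERVERS_2027, 6.5)    # ~6.5 kW per A100-class server
high = annual_twh(SERVERS_2027, 10.2)  # ~10.2 kW per H100-class server
print(f"{low:.0f}-{high:.0f} TWh per year")  # roughly 85-134 TWh
```

Under these assumptions the low and high ends land at about 85 and 134 TWh respectively, matching the study's reported range.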

Accompanying the energy concern is a rising worry about the significant amount of water required to cool the data centers that power AI chatbots such as ChatGPT and Bing Chat. Every time these bots respond to a user's query, roughly a bottle's worth of water is used for cooling.

The Financial Strain and Falling User Base of AI Companies

OpenAI, the organization behind the AI chatbot ChatGPT, reportedly spends up to $700,000 daily on operations, pushing the organization toward potential bankruptcy. User dissatisfaction, stemming from a perceived decline in the chatbot's intelligence, poses a further challenge for OpenAI. The study predicts that such resource-intensive technologies could be consuming as much energy as a country the size of the Netherlands by 2027.

Similarly, Microsoft, despite its substantial investment in the technology, saw Bing's market share stagnate for most of the year. ChatGPT's user base also declined for three consecutive months.

Falling Short of AI Chip Demand

Alex de Vries, a PhD candidate at the VU Amsterdam School of Business and Economics who conducted the study, stresses that this large-scale energy consumption rests on the assumptions that interest in AI continues to escalate and that AI chips remain available. However, the recent inability of Nvidia, the dominant manufacturer of the GPUs used in AI, to meet the climbing demand for these chips threatens this scenario.

To mitigate costs and make its AI ventures more profitable, Microsoft is preparing to launch its first dedicated AI chip at Ignite, its annual developer conference.

Despite the technology's conveniences and advancements, it remains to be seen whether the AI industry can sustain itself amid these energy concerns. In a world focused on sustainability and resource conservation, AI's high cost and intense resource demands could hinder its future development and widespread adoption.