AI data centers could consume 4.4% of all global electricity, or 1.6 billion kilowatt-hours, by 2035, News.az reports, citing Bloomberg.
According to its estimates, the amount of electricity required to train and run AI is set to quadruple within a decade. Modern data centers are designed to draw hundreds or even thousands of megawatts, whereas just 15 years ago they ran on as little as 5 megawatts.
As the agency notes, if AI data centers were a country, they would rank fourth in electricity use, behind only China, the US and India.
The publication cites two main reasons for the growth in electricity consumption: with the advent of AI, computing has become more energy-intensive, and the scale of data processing AI requires far exceeds that of traditional systems.