
The climate paradox: how AI can save and sabotage the planet

By Abdul Sattar Abbasi 2024-12-10
IN A world racing to combat climate change, the newest weapon in our arsenal comes with an ironic twist. Artificial Intelligence (AI), the digital saviour in the fight against global warming, has developed quite a thirst of its own.

It seems the very machines we're building to help save our planet are guzzling water and energy at rates that have scientists worried.

Every time you ask ChatGPT a few dozen questions, it 'drinks' roughly 500ml of clean water to cool the computers that power its intelligence.

For Pakistan, perched precariously on the frontlines of the climate crisis, this paradox holds particular significance. With temperature records being shattered each summer and water scarcity looming, the environmental cost of these digital solutions cannot be ignored.

Essential applications

At its best, artificial intelligence acts like a planetary nervous system: sensing, predicting, and helping us respond to environmental threats. Google's Flood Hub, for instance, provides flood forecasts up to seven days in advance across more than 80 countries, helping protect communities across Africa, Europe, South and Central America, and the Asia-Pacific region, including Pakistan.

As of 2023, their forecasts cover areas where over 460 million people live. In October 2023, this service expanded to the US and Canada, covering more than 800 riverbanks where over 12m people live.

These applications aren't just impressive; they're essential. The World Meteorological Organisation estimates that improving early warning systems could reduce climate disaster damages by 30 per cent.

And AI is already delivering real climate action results. For example, Global Forest Watch uses AI and satellite imagery to create a real-time tool to monitor and combat deforestation.

But running these powerful AI systems requires enormous computing power, and their carbon footprint is equally large; according to researchers at the University of Massachusetts Amherst, training one of these models can generate carbon emissions equivalent to the lifetime emissions of five average American cars.

The water footprint is equally startling. Modern data centres use vast amounts of water for cooling: up to 500,000 gallons per day for a large facility. By 2027, some experts project that AI systems could demand as much water as half the UK's annual consumption.

Geography of AI

While the benefits of AI (improved climate predictions, optimised energy systems, better disaster responses) can be global, the environmental costs fall disproportionately on certain regions and communities.

In water-stressed regions like Pakistan, where every drop counts, the water demands of AI infrastructure could compete with basic needs like agriculture and drinking water.

This disparity extends beyond resources to expertise. According to a Stack Overflow survey published by the OECD, North America hosts 30pc of the world's AI experts (who are active online), while Pakistan accounts for 0.67pc. This concentration of talent and computing power in the Global North risks creating solutions that may not fully account for the needs and constraints of developing nations.

But the real innovation is happening at the intersection of efficiency and design.

'Training better models can actually save more energy over time,' explains Justin Burr from Google AI. 'Given the number of times they're used each day for inference, in less than a week they save more in energy than the old hand-tuned versions.'

Location matters too, and the carbon intensity of AI training can vary dramatically depending on where the data centres are located.

Norway's electric grid, for instance, produces just 29g of CO2/kWh, compared to 709g in South Africa.
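The scale of that difference is easy to work through. Using the two grid-intensity figures above, the sketch below computes the emissions of one hypothetical training run in each location; the 1,000 MWh training-energy figure is an illustrative assumption, not a measurement from the article.

```python
# Illustrative only: how grid carbon intensity changes the footprint
# of the same training run. The 29g and 709g CO2/kWh figures are from
# the article; the 1,000 MWh energy figure is a hypothetical example.

GRID_INTENSITY_G_PER_KWH = {
    "Norway": 29,
    "South Africa": 709,
}

TRAINING_ENERGY_KWH = 1_000_000  # assumed: 1,000 MWh for one training run

for country, intensity in GRID_INTENSITY_G_PER_KWH.items():
    # grams of CO2 -> tonnes (divide by 1,000,000)
    tonnes = TRAINING_ENERGY_KWH * intensity / 1_000_000
    print(f"{country}: {tonnes:,.0f} tonnes CO2")
```

Under these assumptions, the identical workload emits roughly 24 times more CO2 on the South African grid than on the Norwegian one, which is why siting decisions matter so much.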

Pakistan's challenge

For Pakistan, these developments present both challenge and opportunity; choices made today will shape the environmental impact for years to come.

Dr Kashif Talpur, a machine-learning researcher at Solent University in the UK, emphasises the importance of considering the full lifecycle of AI models.

'An AI model has to go through two stages: training and inferencing. Inferencing, which occurs when the AI model is in the hands of end users, constitutes about 90pc of its lifecycle. This stage incurs significantly higher costs... the more complex the problems... the more computational resources each inference requires.

'Today's large language models, such as ChatGPT, Gemini, and various other copilots, are equipped with billions of parameters.

'Maintaining and cooling the infrastructure, like data centres, adds to the ongoing costs in terms of energy consumption, hardware wear and tear, and operational expenses,' Dr Talpur notes.
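Dr Talpur's 90pc figure implies that a model's one-off training cost is only a small fraction of its total energy bill. The sketch below works that split through; the training-energy number is a hypothetical placeholder, with only the 90pc inference share taken from the article.

```python
# Sketch of the lifecycle split described above: if inference accounts
# for ~90pc of a model's lifecycle energy, the total is the training
# energy divided by the remaining share. Training figure is hypothetical.

TRAINING_KWH = 100_000   # assumed one-off training cost
INFERENCE_SHARE = 0.90   # from the article: ~90pc of lifecycle

total_kwh = TRAINING_KWH / (1 - INFERENCE_SHARE)
inference_kwh = total_kwh * INFERENCE_SHARE

print(f"Total lifecycle energy: {total_kwh:,.0f} kWh")
print(f"Inference portion:      {inference_kwh:,.0f} kWh")
```

On these assumptions, every unit of training energy is matched by roughly nine units spent serving users, which is why efficiency gains at inference time compound so quickly.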

For Pakistan, charting a path forward means investing not just in AI adoption, but in making that uptake sustainable. It means cultivating local talent and ensuring Pakistani voices are heard in the global dialogue about responsible AI.

Most of all, it means recognising that embracing AI's transformative potential comes with a responsibility: a duty to deploy these tools in a way that doesn't exacerbate the very problems they're meant to solve.

A detailed version of this article can be accessed on Dawn.com