AI has become a significant asset in tackling the biggest environmental problems we currently face. It has already been integrated into technologies used to chart methane emissions, track air quality, measure air pollution, and assess the ecological footprints of various products and businesses. Yet although AI has helped countries and businesses become more environmentally conscious, the use of AI itself carries enormous environmental ramifications. With this tension in mind, we must ask: are the ecological costs of AI worth its environmental applications?
What is AI and how can it be applied to environmental issues worldwide?
AI, or artificial intelligence, is a technology that enables computers and software to mimic human learning and problem-solving, as well as perform advanced functions such as analyzing data, forming recommendations, and creating proposed action plans.
This kind of technology has become a critical element in formulating solutions to environmental issues because of AI's advanced analytical capabilities. AI can draw on historical and current data, and monitor present environmental conditions, to reach comprehensive conclusions about worldwide situations. These capabilities currently offer hope for environmental issues including:
Climate change
Renewable energy
Biodiversity loss
Wildlife protection
Waste management
Due to its wide scope of capabilities and its ability to tackle an array of issues around the world, AI has become a vital tool in resolving environmental affairs.
How does AI impact the environment?
Evidently, AI looks to be a valuable asset for combating environmental issues around the world. However, powering this technology carries many underlying environmental costs, which in turn take a massive toll on our planet and its resources.
AI deployments are usually housed in data centres, which produce an immense amount of electronic waste: roughly 2,600 tonnes in 2023, projected to rise to 2.5 million tonnes by 2030. Forecasts also project that by 2030, data centres will draw around 21% of the world's electricity supply. Furthermore, the surge in AI usage in recent years has seen the number of data centres grow from 500,000 in 2012 to 8 million. This hike over the past 12 years has not only generated a great deal of electronic waste but has also consumed an abundance of raw materials needed to construct the centres' hardware. For example, manufacturing a 2 kg computer requires roughly 800 kg of raw materials. These computers also require many rare elements, usually for their microchips, which are unfortunately mined in environmentally destructive ways.
These data centres use water both during construction and during operation, as a way to cool their electrical components. Studies suggest that AI-related infrastructure will soon consume six times more water than Denmark, a country of nearly 6 million people. On top of this, data centres require a great deal of energy, usually supplied by burning fossil fuels, which contributes to greenhouse gas emissions. Considering that a single ChatGPT request consumes roughly ten times as much energy as a Google search, the energy consumption of AI is noticeably alarming. Evidently, AI requires an enormous amount of resources to power its functionality, which becomes more and more concerning as its consumption of resources increases.
What are companies doing to tackle this issue?
As companies gain awareness of the energy costs of AI and learn how unsustainable its usage is for our planet, several solutions have emerged to lessen AI's consumption of energy. These include optimizing hardware, cooling, power management, and data centre designs. Companies are beginning to adopt energy-efficient servers and all-flash storage, which lower power use without compromising fast performance. The integration of renewable energy sources such as wind and solar may also lessen the burning of fossil fuels that currently powers data centres. Along with several other innovative strategies, these solutions enable AI operations to be more sustainable and energy-efficient while maintaining high performance and scalability.
What can we do to limit our AI energy consumption?
Although a single individual may not consume much energy alone, changing small habits in our AI usage can make a difference. Some ways we can lessen our AI energy footprint are:
Power capping:
This limits the maximum power consumption of hardware so that it operates within a specific energy budget.
Off-peak scheduling:
Schedule energy-intensive AI tasks within off-peak hours, when energy costs and demand are lower.
Batch processing:
Group smaller tasks into larger batches to optimize resource utilization and reduce the energy cost of switching between tasks frequently.
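The batch-processing idea above can be sketched in a few lines of Python. This is a minimal illustration, not a real AI pipeline: the `run_model` function below is a hypothetical stand-in for any energy-intensive model call, and the batch size is an arbitrary assumption.

```python
# Sketch: grouping small tasks into batches so an expensive model call
# runs once per batch instead of once per item. `run_model` is a
# hypothetical placeholder for any energy-intensive AI workload.

def run_model(batch):
    # Placeholder for one costly model invocation on a whole batch.
    return [item.upper() for item in batch]

def process_in_batches(items, batch_size=8):
    """Process items batch by batch, minimizing model invocations."""
    results = []
    for start in range(0, len(items), batch_size):
        batch = items[start:start + batch_size]
        results.extend(run_model(batch))  # one call per batch, not per item
    return results

tasks = ["draft email", "summarize report", "translate note"]
print(process_in_batches(tasks, batch_size=2))
```

With a batch size of 2, the three tasks above trigger only two model invocations instead of three; at realistic scales, the savings from fewer, larger invocations are what reduce the energy spent on task switching.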