In addition to consuming vast quantities of natural resources, artificial intelligence (AI) inadvertently pollutes the Earth. Each step of its supply chain adversely affects forests, pastureland and waterways. As its popularity grows, so does its ecological impact, prompting questions about whether this technology can be made sustainable.
The AI Supply Chain’s Environmental Impact
The AI supply chain starts with mineral extraction, where workers in open-pit mines extract materials like cobalt or lithium from the Earth. Companies must clear miles of land to make way for mining and machinery. The resulting habitat loss adversely affects biodiversity and migration — and this is only the first stage.
Suppliers move those minerals and rare earth elements to manufacturers for hardware fabrication, generating greenhouse gas emissions in transit. Research suggests the production phase has the greatest negative environmental impact. For reference, manufacturers need an estimated 800 kilograms of raw materials to produce a single two-kilogram computer.
Model training can begin once the hardware is assembled. This process requires an enormous amount of power, which produces greenhouse gases. Data center energy consumption was responsible for 0.3% of global carbon emissions in 2022. In the United States, most electricity generation comes from fossil fuels like natural gas, coal and petroleum.
Every user interaction consumes electricity. Like standard hardware, AI servers get hot when processing data, so they rely on air or water cooling systems, both of which use even more power. The latter is more energy-efficient but depletes a scarce natural resource.
While water covers 70% of the planet, only 3% is freshwater and roughly 66% of that is trapped in frozen glaciers. Despite being in short supply, many data center operators use this resource to cool their AI servers, evaporating up to nine liters per kilowatt-hour of energy. This is a problem because many people lack access to clean, drinkable water.
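To put that evaporation rate in perspective, here is a minimal back-of-envelope sketch using the per-kilowatt-hour figure cited above; the 1-megawatt-hour daily workload is a hypothetical example for illustration, not a number from any specific data center.

```python
# Back-of-envelope estimate of evaporative cooling losses.
# The 9 liters-per-kWh upper bound is the figure cited above; the
# 1 MWh/day workload is a hypothetical example for illustration only.
WATER_PER_KWH_LITERS = 9            # upper-bound evaporation per kWh
daily_energy_kwh = 1_000            # hypothetical 1 MWh of daily server load

daily_water_liters = WATER_PER_KWH_LITERS * daily_energy_kwh
annual_water_liters = daily_water_liters * 365

print(f"Daily evaporation:  {daily_water_liters:,} liters")
print(f"Annual evaporation: {annual_water_liters:,} liters "
      f"(about {annual_water_liters / 1_000:,.0f} cubic meters)")
```

Under those assumptions, even a modest 1 MWh of daily cooling load could evaporate millions of liters of freshwater per year.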
The Final Stage of the AI Supply Chain
While AI is intangible and typically runs decentralized in the cloud, it still exists in the physical world in the form of microchips and servers. Tensor processing units, central processing units and storage systems are also important components.
No hardware lasts forever. Even though mineral extraction and fabrication are expensive and time-consuming, components are replaceable, especially when backups exist. When they can no longer hold a charge or keep up with processing speed requirements, they become electronic waste destined for landfills.
The World Health Organization says this waste is one of the fastest-growing waste streams in the world, with around 62 million tonnes produced globally in 2022. Just 22.3% was formally collected and recycled. When this waste ends up in landfills in developing countries, people often scavenge for valuable parts. They burn the rest, producing toxic fumes.
Discarded electronics are hazardous because they leach heavy metals and pollutants into the ground and nearby waterways, destroying arable land and contaminating aquifers. Also, nearby wildlife and people are exposed to hazards like cadmium, mercury and lead, affecting their well-being and quality of life.
How AI’s Popularity Could Compound This Issue
Despite its heavy environmental toll, AI is capable of doing some great things for the planet. For example, it can monitor forested areas to identify and mitigate wildfires, saving countless plants, animals and people.
Its versatility makes it increasingly popular — it can be a tutor, an assistant or an advisor. Corporations have embraced it because it improves productivity and decreases costs. In major industries like oil and gas, aviation, and health care, a 1% increase in efficiency could yield $276 billion in savings over 15 years.
AI is not all good or all bad. However, recognizing its flaws is essential for improving it. It is resource intensive, so its adverse effects compound at scale. Investments are driving research and development, causing algorithms to grow larger and more sophisticated.
Typically, the bigger the model, the larger its impact. Research shows training a 175-billion-parameter model equivalent to GPT-3 would consume 1,287 megawatt-hours of electricity and produce 552.1 tons of carbon dioxide equivalent emissions over 14.8 days. This is just to get the model ready for launch and assumes no one has used it yet.
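For readers who want to sanity-check those figures, the short sketch below derives the implied average power draw and grid carbon intensity using only the numbers quoted above; no additional data is assumed.

```python
# Sanity check on the cited GPT-3-scale training figures.
# All inputs are the values quoted above; nothing else is assumed.
energy_mwh = 1_287            # total electricity consumed during training
emissions_tons_co2e = 552.1   # total emissions, tons of CO2 equivalent
duration_days = 14.8          # reported training duration

training_hours = duration_days * 24
avg_power_mw = energy_mwh / training_hours            # implied average draw
intensity_kg_per_mwh = emissions_tons_co2e / energy_mwh * 1_000

print(f"Average power draw: {avg_power_mw:.1f} MW sustained for {duration_days} days")
print(f"Implied grid carbon intensity: {intensity_kg_per_mwh:.0f} kg CO2e per MWh")
```

Those cited figures imply a sustained draw of roughly 3.6 megawatts for two weeks, on a grid emitting around 430 kilograms of CO2 equivalent per megawatt-hour.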
Algorithms are getting larger to accommodate heavier workloads, and more data centers are being built every day. As the ecological impact of the AI supply chain grows, finding a solution becomes increasingly urgent.
Is It Possible to Improve AI’s Ecological Footprint?
This technology has developed rapidly, and thousands of facilities are already dedicated to it. In 2025, there were nearly 5,450 data centers in the U.S. alone. While not all of them power AI applications, many do.
Tearing down data centers and closing open-pit mines is unlikely, but stakeholders can still make a difference. Simply downsizing large language models would help curb the effects of mineral extraction and server use, decreasing greenhouse gas emissions.
Powering data centers with renewables would make a more dramatic difference. These sources account for only 27% of data center energy consumption as of 2025, so there is room for improvement. While the feasibility of wind, hydroelectric and geothermal power varies by location, photovoltaic panels are versatile because almost every corner of the planet receives sunlight.
Incorporating renewables into existing facilities will require retrofitting, but the process should be relatively easy since they are passive generators. Public-private partnerships could incentivize states to supplement local power grids with solar panels or wind turbines, supporting the transition from fossil fuels.
Even if renewables become more common, air cooling will remain energy-inefficient, and water cooling with freshwater will keep consuming a scarce natural resource. Dielectric liquids like mineral oil can safely contact electronic components, so server racks could be fully submerged in them instead.
Can Policymakers Curb This Issue?
Even with potential solutions within reach, tech giants may not feel incentivized to take action when they have to foot the bill. Regulations are key for consistently applying a standardized, enforceable solution throughout the country.
No U.S. laws currently regulate AI’s environmental impact, and indicators suggest policymakers may even oppose setting rules. A budget reconciliation measure that passed the U.S. House of Representatives in May 2025 includes a provision that would ban state-level AI regulation for at least 10 years, preempting 1,000 active bills.
Federal laws supersede state laws if they conflict, so curbing the AI supply chain’s ecological impact through policymaking may be challenging. Although nonbinding recommendations and suggested safeguards are helpful, they are unenforceable. That doesn’t mean no one should take action — public perception and expectations are powerful.
People who feel strongly about this issue can call their local representatives. Even if policymakers don’t regulate this technology’s environmental impact, improving renewable energy utilization and leveraging more efficient cooling techniques are still viable solutions.
The Future Outlook
Because this technology is developing so rapidly, no one can say with certainty how its influence will evolve. However, since it is quickly becoming a focal point for many industries, companies will likely build more sustainable data centers to house additional AI servers. Mitigating its adverse effects now is vital if the technology is to grow sustainably.