Artificial intelligence (AI) is the new technology sweeping every industry, AEC included. But while AI is great for streamlining workflows and eliminating administrative work, it also has a profound impact on the environment. That impact is now taking center stage as AI models consume significant energy, contribute to greenhouse gas emissions, and compromise our freshwater sources. The damage is projected to reach into the billions and strain our electrical grids if we fail to take more stringent measures to mitigate it.
The environmental costs of AI are skyrocketing
With the rise in AI adoption across every industry comes a rise in energy consumption, and the demand is only growing. But why does AI, in particular, use so much more energy than other online activities?
The answer is simple: the data centers that host the complex electronics behind AI need enormous amounts of power. In most places, as of now, that means burning fossil fuels and producing greenhouse gases. To put that in perspective, consider the popular AI virtual assistant ChatGPT. The International Energy Agency (IEA) reports that a single ChatGPT request consumes ten times the electricity of a regular Google search. The agency also reports that in Ireland’s tech hub alone, the rise in AI use will drive data centers to account for nearly 35% of the country’s total energy use by 2026.
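For a rough sense of scale, here is a minimal back-of-envelope sketch in Python. The ten-times multiplier is the IEA figure cited above; the 0.3 Wh per standard search and the one-million-requests-per-day volume are illustrative assumptions, not figures from this article.

```python
# Back-of-envelope sketch: scaling the IEA's "ten times a Google search" figure.
# NOTE: the per-search energy and the request volume below are assumptions for
# illustration only; they are not figures reported in this article.

SEARCH_WH = 0.3          # assumed watt-hours per standard web search
AI_MULTIPLIER = 10       # IEA: one ChatGPT request ~ 10x a regular search

ai_request_wh = SEARCH_WH * AI_MULTIPLIER           # ~3 Wh per AI request

requests_per_day = 1_000_000                        # hypothetical daily volume
annual_kwh = ai_request_wh * requests_per_day * 365 / 1_000

print(f"One AI request: ~{ai_request_wh:.1f} Wh")
print(f"{requests_per_day:,} requests/day for a year: ~{annual_kwh:,.0f} kWh")
```

At these assumed values, a single service handling a million requests a day would draw on the order of a gigawatt-hour of electricity per year, before counting the energy used to train the model in the first place.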

Burning additional fossil fuels means thermal pollution in our water, higher production of solid waste and hazardous materials, and, of course, local air pollution. There is also additional strain on freshwater sources from the increased demand for water to cool these data centers, worsening conditions in areas already affected by drought.
Lastly, elevated carbon emissions in a given region pose a risk of higher levels of ozone and particulate matter, as well as the potential for premature mortality in localized areas.
Energy consumption of AI models
When we zoom in further on AI use, it’s apparent that there are staggering environmental consequences even before an AI tool gets off the ground. Training a new AI model or large language model (LLM) involves thousands of megawatt-hours of electricity and hundreds of tons of carbon emissions. For scale, that is roughly equivalent to the annual carbon emissions of hundreds of American homes. And like everyday AI use, the training process puts additional strain on freshwater resources.
The impacts on our environment are expected to keep growing from here. Global AI energy demand is expected to rise to ten times its current level and exceed the annual electricity consumption of a small country like Belgium as soon as next year. In the United States, data center energy use is expected to grow to about 6% of total electricity consumption by 2026, putting even more pressure on existing grid infrastructure.
Data centers are being built with massive infrastructure requirements
Data centers, the temperature-controlled buildings that house storage drives, network equipment, and servers, also support computing for non-AI purposes. Amazon, for example, operates over 100 data centers worldwide, each with approximately 50,000 servers supporting its cloud computing activities. These centers have been around for decades, but AI changes the game with a steep increase in demand.
Researchers who have tracked data center power requirements in recent years found that North America’s demand stood at around 2,688 megawatts in 2022. By the end of 2023, it had grown to 5,341 megawatts, largely due to the increased demand for generative AI.
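Using only the figures above, a quick calculation shows how steep that jump was: nearly a doubling of demand in a single year.

```python
# Year-over-year growth implied by the North American figures cited above.
power_2022_mw = 2_688   # megawatts of data center demand, 2022
power_2023_mw = 5_341   # megawatts of data center demand, end of 2023

growth_pct = (power_2023_mw - power_2022_mw) / power_2022_mw * 100
print(f"Demand grew by roughly {growth_pct:.0f}% in one year")  # ~99%
```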

While not all data center usage involves AI, it has been a significant factor in driving emissions. The increase in AI use has also increased demand for new data centers, which require vast quantities of raw materials, including:
- Steel
- Concrete
- Drywall
- Lumber
- Copper
- Glass
Some of these materials, especially concrete, are particularly harsh on the environment to produce. The more material produced to meet the energy demands of AI, the greater the environmental toll.
How to mitigate the impacts of AI and build with sustainability in mind
As of spring 2025, over 190 countries have adopted recommendations to keep AI use ethical, including ways to lessen its impact on the environment. The United States and the European Union have also established legislation to temper the environmental impact of AI, but such policies are, as of now, few and far between.
The United Nations Environment Programme (UNEP) has recommended five steps to lessen the environmental harm from AI:
- Countries need to establish standardized procedures for measuring the environmental impact of AI. A lack of information is where the problem starts.
- Governments can create regulations that require AI companies to disclose the environmental impact of their products or services.
- Tech businesses can make AI algorithms more efficient, reducing their energy demand, and recycle water and data center components where applicable.
- Countries should encourage businesses to “green” their data centers, which includes using renewable energy to offset emissions.
- Lastly, countries should integrate their AI policies into their broader environmental policies so that these new regulations become standard practice.
The demand for AI is expected to continue exacerbating environmental issues, particularly in the short term. Luckily, countries and businesses are starting to consider how to lessen the damage and create more energy-efficient AI use for the future.
It will take a worldwide effort to counteract the environmental damage driven by this demand. Still, if countries band together on this unified priority, we might see a greener AI future.
Don’t forget to subscribe to our newsletter and follow us on LinkedIn to stay in the loop on how AI is sweeping through the architecture, engineering, and construction spaces.