
Carbon Accounting for AI Workloads: Measuring and Reporting GPU Emissions


Updated December 11, 2025

December 2025 Update: NVIDIA has published an H100 PCF of 1,312 kg CO2e per 8-card baseboard (164 kg/card). A Cornell study projects 24-44M metric tons of annual AI CO2 emissions by 2030. Amazon's emissions rose to 68.25M metric tons in 2024, the company's first increase since 2021. AI servers are projected to consume 70-80% of U.S. data center electricity (240-380 TWh) by 2028.

NVIDIA published a Product Carbon Footprint (PCF) for the H100 baseboard carrying eight H100 SXM cards, estimating embodied emissions at 1,312 kg CO2e, or approximately 164 kg CO2e per card.1 Memory contributes 42% of the embodied footprint, integrated circuits 25%, and thermal components 18%. The disclosure is the first vendor-published assessment of its kind to offer transparency into GPU environmental impact, establishing a baseline for organizations tracking AI infrastructure emissions.

The stakes for AI carbon accounting grow with each deployment. A Cornell study found that by 2030, current AI growth rates would annually emit 24 to 44 million metric tons of CO2—equivalent to adding 5 to 10 million cars to U.S. roadways.2 Amazon reported emissions rose from 64.38 million metric tons in 2023 to 68.25 million metric tons in 2024, the company's first increase since 2021, primarily driven by data centers and delivery operations.3 Google's 2023 greenhouse gas emissions increased 13% year-over-year amid the AI race.4 Organizations deploying AI infrastructure need carbon accounting practices that measure impact and identify reduction opportunities.

Understanding AI carbon footprint components

The carbon footprint of AI consists of two main components: embodied emissions from manufacturing IT equipment and constructing data centers, and operational emissions from electricity consumed during AI-related computations.5 Comprehensive carbon accounting must address both categories.
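
As a concrete illustration of how the two categories combine, the sketch below folds a vendor-supplied embodied figure (amortized over equipment lifetime) together with operational emissions computed from power draw and grid carbon intensity. The specific numbers in the example are placeholders, not measurements.

```python
# Minimal sketch: combine amortized embodied emissions with operational
# emissions for a single accelerator. All figures are illustrative.

def total_annual_emissions_kg(
    embodied_kg_co2e: float,     # vendor PCF figure, e.g. ~164 kg CO2e per H100 card
    lifetime_years: float,       # amortization period, typically 3-5 years
    avg_power_kw: float,         # average draw under load
    hours_per_year: float,       # operating hours per year
    grid_kg_co2_per_kwh: float,  # regional grid carbon intensity
) -> float:
    """Annual footprint = amortized embodied + operational emissions."""
    embodied_annual = embodied_kg_co2e / lifetime_years
    operational_annual = avg_power_kw * hours_per_year * grid_kg_co2_per_kwh
    return embodied_annual + operational_annual

# Example: one H100-class card, 4-year life, 0.7 kW average draw,
# 90% duty cycle, on a grid at 0.4 kg CO2/kWh (assumed values).
print(total_annual_emissions_kg(164, 4, 0.7, 8760 * 0.9, 0.4))
```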

Embodied emissions

Embodied emissions occur before AI systems process a single token. Manufacturing GPUs requires energy-intensive semiconductor fabrication, rare earth extraction, and global supply chain transportation. NVIDIA's H100 PCF breakdown shows the distributed nature of embodied carbon across memory, processing, and cooling components.6

Data center construction adds embodied emissions from concrete, steel, and mechanical systems. A hyperscale data center embodies a substantial amount of CO2e in construction materials alone before becoming operational. Construction emissions amortize over the facility's lifetime, but high-density AI deployments may require more frequent infrastructure refresh than traditional computing.

Operational emissions

Operational emissions depend on compute intensity and electricity carbon intensity. AI servers consumed 23% of total U.S. data center electricity in 2024 and are projected to consume 70-80% (240-380 TWh annually) by 2028, according to the 2024 U.S. Data Center Energy Usage Report.7

The carbon intensity of that electricity varies dramatically by location and time. A GPU consuming 700W in a coal-powered grid produces far more emissions than identical hardware in a renewable-powered facility. Approximately 56% of electricity consumed by U.S. data centers came from fossil fuel-burning plants, with 16% from coal.8 Geographic and temporal optimization of AI workloads offers substantial emissions reduction opportunity.
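
The following back-of-the-envelope comparison shows how much grid carbon intensity alone changes the footprint of a single 700W GPU running for a year; the intensity figures are assumed, illustrative values rather than measured grid data.

```python
# Illustrative comparison: identical 700 W GPUs on grids with different
# carbon intensity. Intensity values below are assumptions.
power_kw = 0.7
hours = 8760  # one year of continuous operation

coal_heavy = 0.9        # kg CO2 per kWh, assumed for a coal-dominated grid
renewable_heavy = 0.05  # kg CO2 per kWh, assumed for a largely renewable grid

print(f"coal-heavy grid:      {power_kw * hours * coal_heavy:,.0f} kg CO2/yr")
print(f"renewable-heavy grid: {power_kw * hours * renewable_heavy:,.0f} kg CO2/yr")
```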

Metrics and standards

Carbon accounting for AI requires standardized metrics enabling comparison across organizations and verification of reduction claims.

Established data center metrics

Power Usage Effectiveness (PUE) measures data center infrastructure efficiency by comparing total facility power to IT equipment power.9 A PUE of 1.5 means the facility draws 1.5 watts for every watt delivered to IT equipment, with the extra half watt going to cooling and other overhead. Modern data centers target PUE below 1.2.

Data Center Infrastructure Efficiency (DCIE) inverts PUE, expressing IT power as a percentage of total power. Both metrics help optimize infrastructure but don't directly measure carbon emissions.

Carbon Usage Effectiveness (CUE) links energy use with carbon emissions, expressed as kg CO2 per kWh of IT energy, and accounts for the electricity source.10 CUE captures the carbon intensity dimension that PUE misses.

Energy Reuse Factor (ERF) quantifies waste heat reuse, crediting facilities that supply heat to external consumers.11 District heating arrangements where data center waste heat warms buildings reduce net emissions.
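
The metrics above reduce to simple ratios. The sketch below expresses PUE, DCIE, and CUE as functions and evaluates them against assumed facility figures; the energy and intensity numbers are illustrative only.

```python
# Sketch of the data center efficiency metrics described above,
# using assumed facility figures for illustration.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy."""
    return total_facility_kwh / it_equipment_kwh

def dcie(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Data Center Infrastructure Efficiency: IT share of total energy."""
    return it_equipment_kwh / total_facility_kwh

def cue(total_co2_kg: float, it_equipment_kwh: float) -> float:
    """Carbon Usage Effectiveness: kg CO2 emitted per kWh of IT energy."""
    return total_co2_kg / it_equipment_kwh

# Example: facility drawing 12 GWh/yr, 10 GWh of it for IT load,
# on a grid at 0.4 kg CO2/kWh (all assumed).
total_kwh, it_kwh = 12_000_000, 10_000_000
print(pue(total_kwh, it_kwh))            # 1.2
print(dcie(total_kwh, it_kwh))           # ~0.83
print(cue(total_kwh * 0.4, it_kwh))      # 0.48 kg CO2 per IT kWh
```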

AI-specific measurement challenges

GPU TDP has increased by an average of 41.5% per year (compounded) from 2021 to 2025.12 The power increase outpaces efficiency improvements, meaning next-generation GPUs consume more energy even as they process more tokens per watt.

Training versus inference emissions require separate accounting. A training run consuming months of GPU time on thousands of accelerators generates substantial one-time emissions. Inference emissions accumulate as users query the trained model over its operational lifetime. Organizations must track both phases.
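
One way to keep the two phases separate in an accounting system is to model training as a one-time quantity and inference as a rate that accumulates with request volume, as in the sketch below. All of the GPU counts, energy figures, and request volumes shown are hypothetical.

```python
# Sketch: account for training and inference emissions separately.
# Every figure below is a hypothetical placeholder.

def training_emissions_kg(gpus: int, avg_kw_per_gpu: float,
                          hours: float, grid_kg_per_kwh: float) -> float:
    """One-time emissions of a training run."""
    return gpus * avg_kw_per_gpu * hours * grid_kg_per_kwh

def inference_emissions_kg(requests: float, kwh_per_request: float,
                           grid_kg_per_kwh: float) -> float:
    """Cumulative emissions over the model's serving lifetime."""
    return requests * kwh_per_request * grid_kg_per_kwh

# 1,000 GPUs at 0.6 kW for 30 days vs. 500M requests at 0.3 Wh each,
# both on a grid at 0.4 kg CO2/kWh (assumed).
train = training_emissions_kg(1_000, 0.6, 30 * 24, 0.4)
serve = inference_emissions_kg(500_000_000, 0.0003, 0.4)
print(f"training: {train:,.0f} kg CO2, inference: {serve:,.0f} kg CO2")
```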

Tracking and reporting tools

Several tools and frameworks support AI carbon accounting.

eco2AI

The eco2AI open-source package helps data scientists track energy consumption and equivalent CO2 emissions of machine learning models.13 The tool focuses on accurate energy tracking and regional CO2 emissions accounting, converting compute time into carbon impact based on grid carbon intensity.

Researchers integrate eco2AI into training pipelines to accumulate emissions estimates across experiments. The approach surfaces carbon cost alongside accuracy metrics, enabling carbon-aware model development decisions.
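
A minimal integration might look like the sketch below, which wraps a training entry point with an eco2AI tracker. The parameter names follow the package's documentation but may differ across versions, and train_model() is a placeholder for your own pipeline.

```python
# Minimal sketch of wiring eco2AI into a training run. Treat the tracker
# arguments as illustrative; consult the package docs for your version.
import eco2ai

def train_model():
    ...  # placeholder for your existing training loop

tracker = eco2ai.Tracker(
    project_name="llm-finetune",
    experiment_description="baseline run on 8x H100",
    file_name="emissions.csv",   # per-run energy and CO2e estimates
)

tracker.start()
try:
    train_model()
finally:
    tracker.stop()               # records consumption for this run
```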

Regulatory frameworks

In early 2024, lawmakers introduced the Artificial Intelligence Environmental Impacts Act, which would direct the EPA to study AI's environmental footprint and have NIST develop measurement standards.14 The proposed voluntary reporting system would standardize how organizations disclose AI emissions.

The EU AI Act establishes data governance requirements affecting AI sustainability reporting. Organizations deploying high-risk AI systems may face disclosure requirements including environmental impact. The regulatory trajectory suggests mandatory reporting will follow voluntary frameworks.

Reduction strategies

Cornell researchers concluded that "there isn't a silver bullet" for AI emissions reduction.15 Siting, grid decarbonization, and efficient operations together can achieve reductions of roughly 73% for carbon and 86% for water. Effective strategies combine multiple approaches.

Geographic optimization

Data center location determines baseline carbon intensity. Facilities in regions with renewable-heavy grids produce fewer emissions than identical facilities in fossil-dependent regions. Virginia hosts more U.S. data centers (301) than any other state, followed by California (248) and Texas (221).16 Each state offers different grid carbon profiles.

Organizations with workload flexibility can route jobs to lower-carbon locations. Training runs that tolerate latency can shift to times and places where renewable generation peaks. The optimization requires carbon-aware scheduling capabilities.
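
A minimal version of carbon-aware routing simply compares current grid carbon intensity across candidate regions and picks the cleanest one, as sketched below. The region names and intensity values are hypothetical; a production system would pull live numbers from a grid carbon-intensity API.

```python
# Sketch of carbon-aware job routing: send flexible work to the region
# whose grid is currently cleanest. Values below are assumptions.

def pick_lowest_carbon_region(intensity_by_region: dict[str, float]) -> str:
    """Return the region with the lowest current intensity (kg CO2/kWh)."""
    return min(intensity_by_region, key=intensity_by_region.get)

snapshot = {
    "us-east":  0.42,   # assumed values for illustration
    "us-west":  0.25,
    "eu-north": 0.05,
}
print(pick_lowest_carbon_region(snapshot))  # eu-north
```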

Operational efficiency

Efficient operations reduce energy consumption regardless of grid carbon intensity. Schneider Electric partnered with NVIDIA to design reference architectures reducing cooling energy usage by nearly 20%.17 Similar efficiency improvements across power distribution, cooling, and compute utilization compound into significant emissions reductions.

AI workload optimization reduces compute requirements for equivalent output. Model distillation, quantization, and efficient inference frameworks like NVIDIA NIM reduce energy per inference. Efficiency improvements at the workload level multiply across millions of inference requests.

Renewable procurement

Direct renewable procurement through power purchase agreements ensures clean electricity for AI operations. Organizations can procure 24/7 carbon-free energy matching consumption hour-by-hour rather than annually, eliminating residual fossil consumption during renewable lulls.

On-site generation with solar and battery storage provides renewable power without transmission losses. Data center campuses in high-solar regions can generate substantial portions of their consumption locally.

Implementing carbon accounting

Organizations implementing AI carbon accounting should establish baseline measurements, integrate tracking into operations, and develop reduction roadmaps.

Baseline establishment

Inventory existing AI infrastructure including GPU counts, power ratings, and utilization patterns. Document data center locations and electricity sources. Calculate current operational emissions using measured power consumption and regional carbon intensity factors.

Estimate embodied emissions using vendor disclosures like NVIDIA's H100 PCF and industry averages for unlisted components. Amortize embodied emissions over expected equipment lifetime, typically 3-5 years for accelerators.
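
Putting the baseline steps together, a fleet-level estimate might look like the sketch below, which amortizes a per-GPU embodied figure over an assumed four-year lifetime and adds operational emissions from utilization-weighted power draw. The fleet composition and grid intensities shown are placeholders.

```python
# Sketch of a fleet-level baseline: operational emissions from measured
# power and utilization, plus embodied emissions amortized over lifetime.
# All fleet figures below are placeholders.

fleet = [
    # (gpu_count, avg_power_kw, utilization, embodied_kg_per_gpu, grid_kg_per_kwh)
    (512, 0.70, 0.85, 164, 0.38),   # site A (assumed)
    (256, 0.70, 0.60, 164, 0.05),   # site B (assumed)
]
LIFETIME_YEARS = 4
HOURS_PER_YEAR = 8760

operational = sum(n * kw * util * HOURS_PER_YEAR * ci
                  for n, kw, util, _, ci in fleet)
embodied = sum(n * pcf / LIFETIME_YEARS
               for n, _, _, pcf, _ in fleet)

print(f"operational: {operational / 1000:,.1f} t CO2/yr")
print(f"embodied (amortized): {embodied / 1000:,.1f} t CO2e/yr")
```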

Operational integration

Integrate emissions tracking into infrastructure monitoring. GPU monitoring tools like NVIDIA DCGM provide power consumption data. Combine power telemetry with carbon intensity APIs providing real-time grid carbon information.
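
A simple telemetry loop along these lines appears below. It samples per-GPU power draw (via nvidia-smi here for brevity, though DCGM exposes the same data) and multiplies the resulting energy by a grid carbon-intensity value, which in this sketch is hard-coded rather than fetched from a live API.

```python
# Sketch: estimate emissions from sampled GPU power draw and an assumed
# grid carbon intensity. A production setup would use DCGM telemetry and
# a real-time carbon-intensity API instead of the hard-coded value.
import subprocess
import time

GRID_KG_CO2_PER_KWH = 0.35  # placeholder; fetch from a carbon-intensity API

def gpu_power_watts() -> list[float]:
    """Read the current power draw (W) of each GPU via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [float(line) for line in out.splitlines() if line.strip()]

total_kg = 0.0
for _ in range(60):                       # sample once a minute for an hour
    kw = sum(gpu_power_watts()) / 1000
    total_kg += kw * (1 / 60) * GRID_KG_CO2_PER_KWH  # energy over one minute
    time.sleep(60)

print(f"estimated emissions this hour: {total_kg:.3f} kg CO2")
```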

Introl's network of 550 field engineers supports organizations implementing sustainability-focused GPU infrastructure monitoring.18 The company ranked #14 on the 2025 Inc. 5000 with 9,594% three-year growth, reflecting demand for professional infrastructure services.19

Deploying monitoring across 257 global locations requires consistent practices enabling comparable carbon accounting across facilities.20 Introl manages deployments reaching 100,000 GPUs with over 40,000 miles of fiber optic network infrastructure, providing operational scale for comprehensive emissions tracking.21

Reduction roadmaps

Set emissions reduction targets aligned with science-based targets and organizational commitments. Identify reduction opportunities across efficiency, renewable procurement, and geographic optimization. Prioritize actions by emissions impact and implementation feasibility.

Track progress against baseline and targets. Report emissions following emerging standards and regulatory requirements. Prepare for increasing disclosure requirements as AI carbon accounting matures.

The accountability imperative

Goldman Sachs Research estimates 60% of increasing data center electricity demand will come from burning fossil fuels, increasing global carbon emissions by approximately 220 million tons.22 The emissions trajectory makes carbon accounting essential rather than optional for AI organizations.

Organizations that establish robust carbon accounting now prepare for regulatory requirements while identifying efficiency opportunities that reduce both emissions and costs. The measurement foundation enables reduction strategies that would otherwise lack targeting. Carbon-aware AI infrastructure represents both environmental responsibility and operational excellence.

Key takeaways

For sustainability teams:
- NVIDIA H100 embodied emissions: 1,312 kg CO2e per baseboard (8 cards); memory contributes 42%, ICs 25%, thermal 18%
- AI could emit 24-44 million metric tons of CO2 annually by 2030, equivalent to 5-10 million cars on US roadways (Cornell)
- GPU TDP increased at a 41.5% CAGR (2021-2025); power growth outpaces efficiency improvements

For infrastructure architects:
- AI servers: 23% of US data center electricity (2024), projected at 70-80% (240-380 TWh) by 2028
- 56% of US data center electricity comes from fossil fuels, 16% from coal; geographic optimization is critical
- PUE measures infrastructure efficiency; CUE links energy use with carbon intensity (kg CO2/kWh)

For operations teams:
- The eco2AI open-source package tracks energy consumption and CO2 emissions of ML models by regional grid intensity
- Combine GPU power telemetry (NVIDIA DCGM) with real-time carbon intensity APIs for emissions monitoring
- Training and inference require separate accounting: training is one-time, inference accumulates over the operational lifetime

For compliance teams:
- The Artificial Intelligence Environmental Impacts Act would direct an EPA study and NIST measurement standards
- The EU AI Act establishes data governance affecting sustainability reporting for high-risk AI systems
- Goldman Sachs: 60% of increasing data center electricity demand will come from fossil fuels, adding 220M tons of CO2

For strategic planning:
- Cornell: no "silver bullet"; siting, grid decarbonization, and efficient operations together achieve 73% carbon and 86% water reductions
- The Schneider Electric/NVIDIA partnership reduced cooling energy by nearly 20% through optimized architectures
- Virginia leads the US in data center count (301), followed by California (248) and Texas (221); each state offers a different grid carbon profile

References




  1. Interact. "Understanding GPU's Energy and Environmental Impact – Part I." Interact DC. 2024. https://interactdc.com/posts/understanding-gpus-energy-and-environmental-impact-part-i/ 

  2. Cornell Chronicle. "'Roadmap' shows the environmental impact of AI data center boom." November 2025. https://news.cornell.edu/stories/2025/11/roadmap-shows-environmental-impact-ai-data-center-boom 

  3. Smithsonian Magazine. "A.I. Is on the Rise, and So Is the Environmental Impact of the Data Centers That Drive It." 2025. https://www.smithsonianmag.com/science-nature/with-ai-on-the-rise-what-will-be-the-environmental-impacts-of-data-centers-180987379/ 

  4. Smithsonian Magazine. "A.I. Is on the Rise." 2025. 

  5. MIT News. "Explained: Generative AI's environmental impact." January 17, 2025. https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117 

  6. Interact. "Understanding GPU's Energy and Environmental Impact." 2024. 

  7. arXiv. "Environmental Burden of United States Data Centers in the Artificial Intelligence Era." November 2024. https://arxiv.org/html/2411.09786v1 

  8. arXiv. "Environmental Burden of United States Data Centers." November 2024. 

  9. Carbon Direct. "Understanding the carbon footprint of AI and how to reduce it." 2024. https://www.carbon-direct.com/insights/understanding-the-carbon-footprint-of-ai-and-how-to-reduce-it 

  10. FAS. "Measuring AI's Energy/Environmental Footprint to Access Impacts." 2024. https://fas.org/publication/measuring-and-standardizing-ais-energy-footprint/ 

  11. FAS. "Measuring AI's Energy/Environmental Footprint." 2024. 

  12. Interact. "Understanding GPU's Energy and Environmental Impact." 2024. 

  13. Springer. "eco2AI: Carbon Emissions Tracking of Machine Learning Models." Doklady Mathematics. 2022. https://link.springer.com/article/10.1134/S1064562422060230 

  14. FAS. "Measuring AI's Energy/Environmental Footprint." 2024. 

  15. Cornell Chronicle. "'Roadmap' shows the environmental impact." November 2025. 

  16. arXiv. "Environmental Burden of United States Data Centers." November 2024. 

  17. MIT News. "Responding to the climate impact of generative AI." September 30, 2025. https://news.mit.edu/2025/responding-to-generative-ai-climate-impact-0930 

  18. Introl. "Company Overview." Introl. 2025. https://introl.com 

  19. Inc. "Inc. 5000 2025." Inc. Magazine. 2025. 

  20. Introl. "Coverage Area." Introl. 2025. https://introl.com/coverage-area 

  21. Introl. "Company Overview." 2025. 

  22. MIT News. "Responding to the climate impact of generative AI." September 2025. 

  23. Frontiers. "Forecasting US data center CO2 emissions using AI models." Frontiers in Sustainability. 2024. https://www.frontiersin.org/journals/sustainability/articles/10.3389/frsus.2024.1507030/full 
