Introduction
AI is becoming an increasingly large part of our day-to-day lives. With that comes energy consumption, emissions, and water usage. A lot of people worry about this — and rightly so. But we’ve noticed that very few companies actually measure their AI footprint, and even fewer do anything about it.
So we did. We looked into our numbers, realized the impact was real but smaller than expected, and decided to offset it — significantly. This post is not a victory lap. It’s a pledge, a methodology, and an invitation.
We pledge to:
- Track all AI usage in Gradr
- Be as transparent as possible regarding our impact
- Offset any AI-related emissions
This applies to Gradr’s AI usage only, not the company’s full operational footprint. Our methodology and exact pledge are outlined below.
We challenge other AI companies — especially Nordic ones — to do the same. The rest of this post explains why we think the barrier is lower than most assume.
Estimated Impact
We track everything: every call, why it was made, which model was used, where it ran, and how many tokens it consumed. From this we derive a best-effort, conservative estimate of our consumption.
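A per-call usage record of this kind might look roughly like the sketch below. The field names are illustrative assumptions, not Gradr's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AIUsageRecord:
    """One logged AI call; fields mirror what the post says is tracked."""
    timestamp: datetime
    purpose: str          # why the call was made (e.g. "essay-feedback")
    model: str            # which model served it
    region: str           # where the inference ran
    input_tokens: int
    output_tokens: int

record = AIUsageRecord(
    timestamp=datetime.now(timezone.utc),
    purpose="essay-feedback",
    model="frontier-model",
    region="eu-north-1",
    input_tokens=1200,
    output_tokens=350,
)
```

With records like this, aggregate token counts per model and region fall out of a simple sum over the log.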
AI providers still publish limited information on per-token energy consumption. Some have begun releasing prompt-level estimates, but these are self-reported and difficult to verify independently. We chose to base our estimates on third-party, peer-reviewed research (Luccioni et al., 2024) and public grid data instead — less precise, but more independent. We consistently choose the higher end of published ranges, and use grid-based carbon accounting rather than the market-based approach that would let us claim near-zero emissions. We believe this better reflects physical reality.
With that said, our current annual estimates are:
| Metric | Annual estimate | For reference |
|---|---|---|
| Energy | ~7,200 kWh | ~half the annual electricity use of a typical Swedish villa |
| Carbon emissions | ~225 kg CO2e | ~1 economy seat, Stockholm → London |
| Water consumption | ~9,100 liters | ~60 bathtubs, or ~3 days of average household use |
These numbers are based on a representative week of usage from March 2026, extrapolated to a full year. For reference, that week included ~50M tokens, roughly 80% from frontier models. Usage varies seasonally (exam periods are heavier). We will update these figures at least annually.
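The extrapolation itself is a few lines of arithmetic. In the sketch below, the per-unit factors (inference energy per 1,000 tokens, grid carbon intensity, water intensity) are illustrative assumptions chosen at the high end of published ranges to roughly reproduce the table; they are not Gradr's actual internal values:

```python
# Extrapolate a representative week of token usage to annual estimates.
WEEKLY_TOKENS = 50_000_000      # representative week (from the post)
WEEKS_PER_YEAR = 52

# Assumed conversion factors (hypothetical, high end of published ranges):
WH_PER_1K_TOKENS = 2.8          # inference energy per 1,000 tokens
GRID_G_CO2_PER_KWH = 31         # grid-based intensity, Swedish grid scale
LITERS_PER_KWH = 1.26           # data-center water intensity

annual_tokens = WEEKLY_TOKENS * WEEKS_PER_YEAR
energy_kwh = annual_tokens / 1_000 * WH_PER_1K_TOKENS / 1_000  # Wh -> kWh
carbon_kg = energy_kwh * GRID_G_CO2_PER_KWH / 1_000            # g -> kg
water_l = energy_kwh * LITERS_PER_KWH

print(f"{energy_kwh:,.0f} kWh, {carbon_kg:,.0f} kg CO2e, {water_l:,.0f} L")
# → 7,280 kWh, 226 kg CO2e, 9,173 L (within rounding of the table above)
```

Swapping in a provider's own prompt-level figures, once verifiable ones exist, would only change the three assumed constants.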
Approach and mitigation
We offset significantly more than we emit — not because we’re certain of the exact numbers, but because we’re certain we’re not certain. A large multiplier ensures we remain net-negative even if our estimates are off by several times.
Concretely, we purchase carbon removal credits equal to 5x our estimated annual emissions (~1.1 tonnes removed vs. ~225 kg emitted), and fund water restoration equal to 25x our estimated annual water use (~227,500 liters restored vs. ~9,100 liters consumed). These contributions apply to Gradr’s AI usage only, not to the company’s full operational footprint.
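The multiplier arithmetic is deliberately simple; a minimal sketch using the figures stated above:

```python
# Offset multipliers applied to the annual estimates from the post.
ESTIMATED_CO2_KG = 225       # estimated annual AI-related emissions
ESTIMATED_WATER_L = 9_100    # estimated annual AI-related water use

removal_kg = ESTIMATED_CO2_KG * 5      # carbon removal purchased: ~1.1 tonnes
restored_l = ESTIMATED_WATER_L * 25    # water restoration funded: 227,500 L
```

The headroom means the pledge holds even if the underlying estimates are off by the factor of 2–3 acknowledged later in this post.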
Why removal, not avoidance?
Carbon credits come in two main types. Avoidance credits fund projects that prevent emissions (e.g., protecting a forest that might have been cut down). They’re cheaper, but additionality is hard to verify — a 2023 study in Science (West et al.) found that the vast majority of rainforest preservation credits did not represent real emission reductions.
Removal credits fund projects that physically remove CO2 from the atmosphere — biochar, enhanced weathering, direct air capture, biomass carbon capture. They’re more expensive, but the impact is measurable, and the removal is designed for permanent storage. We believe 1 verified tonne removed is more impactful than several uncertain tonnes avoided.
Our carbon removal is purchased through Hafslund Celsio’s biomass CCS project in Oslo, Norway — a facility that captures CO2 from waste biomass combustion and stores it permanently underground. These are forward delivery contracts: the removal is funded and scheduled, not yet physically completed. This is standard in the carbon removal market, where projects are funded ahead of execution.
Our water restoration is funded through certified Water Restoration Certificates.
We also estimate and compensate for past usage based on available data. Going forward, measurements are updated and credits purchased quarterly.
Reduction measures
Before offsetting, the most important thing is to minimize emissions in the first place. For AI inference, climate impact and cost are strongly aligned — every token we don’t generate saves both money and energy. This means we are naturally incentivized to minimize our footprint.
Steps we take include: using smaller, more efficient models where quality requirements allow; hosting on Swedish low-carbon energy grids (~98% fossil-free); caching results to avoid repeated inference; and continuously optimizing prompt length and structure.
Transparency and limitations
All figures in this post are estimates and should be understood as such. There is currently no standardized or fully reliable way to measure the environmental impact of AI use: no provider publishes verified per-token energy data, model architectures are proprietary, and actual per-request energy depends on GPU utilization rates we cannot observe.
We consistently use conservative assumptions: higher-end energy estimates, grid-based (not market-based) carbon accounting, and generous uncertainty margins on our offset multipliers. Usage patterns vary seasonally, and our per-token estimates could be off by a factor of 2–3.
Our estimates also cover inference only — the energy consumed when Gradr makes an AI call. They do not include the energy used to train the underlying models, which is substantial but shared across all users of those models worldwide. We have no way to meaningfully attribute our share of training costs, and excluding them is standard practice, but we acknowledge it as a limitation.
We share these numbers to be transparent about how we approach the issue, not to claim precision. The methodology will be reviewed annually, while measurements and offset purchases are updated quarterly.
Conclusion
We use AI as part of Gradr, and that comes with an environmental impact. Our approach is to measure it, reduce it where possible, and take responsibility for what remains.
For companies running AI on the Swedish grid — or any Nordic grid — the barrier to doing this is remarkably low. The emissions are small, the offsets are cheap, and the data to estimate your footprint already exists in your logs. If your AI runs on one of the cleanest grids in the world, you don’t have a good reason not to take a similar initiative.
We’d like to see more companies — especially Nordic ones — do the same. Measure, publish, offset. The methodology doesn’t have to be perfect. It just has to be honest.
Sources
- De Vries, A. (2023). “The growing energy footprint of artificial intelligence.” Joule, 7(10).
- Electricity Maps (2024). Electricity carbon intensity data.
- European Environment Agency (2024). Greenhouse gas emission intensity of electricity generation.
- Hafslund Celsio / Longship CCS project. frontierclimate.com/writing/hafslundcelsio
- Li, P. et al. (2023). “Making AI Less Thirsty.” arXiv:2304.03271.
- Luccioni, A. S., Jernite, Y., & Strubell, E. (2024). “Power Hungry Processing: Watts Driving the Cost of AI Deployment?” ACM FAccT 2024.
- Swedish Energy Agency (2024). Electricity production and energy mix in Sweden.
- Water Restoration Certificates (WRC). Bonneville Environmental Foundation. b-e-f.org/programs/water-restoration-certificates
- West, T. A. P. et al. (2023). “Action needed to make carbon offsets from forest conservation work for climate change mitigation.” Science, 381(6660).

