A comprehensive repository to explore resources aimed at reducing the environmental footprint of AI systems.
Artificial Intelligence (AI) has the potential to solve many global challenges, but its environmental impact cannot be overlooked. This repository is designed to help AI practitioners and researchers adopt sustainable practices throughout the AI development lifecycle. By implementing energy-efficient models, carbon-aware algorithms, and eco-friendly tools, we can reduce the carbon footprint and energy consumption of AI systems.
Contributions and pull requests are welcome!
These tools are designed to measure and quantify energy usage, carbon emissions, and other environmental impacts of AI workloads.
- AIPowerMeter – A library for monitoring the energy usage of machine learning programs, using RAPL for the CPU and nvidia-smi for the GPU.
- CodeCarbon – Estimates the carbon footprint of training machine learning models (a usage sketch follows this list).
- EnergyMeter – A Python module combining pyRAPL, NVIDIA-SMI, and eBPF to estimate energy consumption of CPU, memory, GPU, and storage on Linux with only three lines of code.
- pyJoules – A Python library that uses hardware measurement tools (Intel RAPL, NVIDIA GPU tools, etc.) to measure device energy consumption.
- EcoLogits – Tracks the energy consumption and environmental impacts of using generative AI models through their APIs.
- RouteLLM – A framework for serving and evaluating LLM routers. It acts as a drop-in replacement for OpenAI's client (or can launch an OpenAI-compatible server) to route simpler queries to cheaper models (a routing sketch follows this list).
- Green Algorithms – Provides an online calculator to estimate the carbon footprint of workloads.
- ML CO2 Impact – Provides an online calculator to estimate the carbon footprint of AI workloads.
- EcoLogits Calculator – An online calculator, powered by EcoLogits, to estimate the environmental impacts of LLM inference.
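
As a minimal illustration of how a measurement tool such as CodeCarbon is typically wired into a training script, the sketch below wraps a stand-in training loop with an `EmissionsTracker`. The project name and the `train_model` function are placeholders for illustration, not part of any particular project.

```python
# Minimal sketch: measuring the emissions of a training run with CodeCarbon.
# The project name and train_model() are placeholders for illustration.
from codecarbon import EmissionsTracker


def train_model():
    # Stand-in for a real training loop
    return sum(i * i for i in range(10_000_000))


tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2eq
    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```

By default CodeCarbon also logs its estimates to a local `emissions.csv` file, which makes it straightforward to compare runs over time.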
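
Similarly, to illustrate the routing idea behind RouteLLM: because it can expose an OpenAI-compatible server, a plain OpenAI client pointed at that server is enough to have simpler prompts answered by a cheaper model. The base URL and router model name below are placeholders; consult the RouteLLM documentation for the actual values.

```python
# Sketch: sending a request through a locally running, OpenAI-compatible
# RouteLLM server. The URL and router model name are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:6060/v1", api_key="placeholder")

response = client.chat.completions.create(
    model="router-example",  # placeholder; RouteLLM picks the underlying model
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(response.choices[0].message.content)
```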
These dashboards provide insights into the carbon footprint and energy usage of cloud workloads on the major hyperscalers. The data is typically delayed rather than real-time, so it cannot be used for real-time actions or optimizations.
- AWS Customer Carbon Footprint Tool – Allows AWS customers to view their estimated carbon emissions associated with AWS cloud services.
- Azure Emissions Impact Dashboard – Helps Azure users estimate and monitor the carbon emissions of their Azure cloud workloads.
- Google Cloud Carbon Footprint Dashboard – Helps measure, report, and reduce cloud carbon emissions on Google Cloud.
- Software Carbon Intensity – A specification that defines a methodology for calculating the carbon emissions rate of a software system, known as its SCI score (a toy calculation follows this list). The specification is currently being extended to cover Classical and Generative AI workloads.
- Green AI Patterns – A collection of patterns aimed at reducing carbon emissions and improving energy efficiency in AI systems, helping practitioners adopt more sustainable AI development practices.
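
To make the SCI score concrete, the toy calculation below applies the specification's formula, SCI = ((E × I) + M) per R, to a single hypothetical inference request. Every number is invented for illustration; in practice the inputs would come from measurement tools like those listed above.

```python
# Toy SCI calculation following the Green Software Foundation formula
# SCI = ((E * I) + M) per R. All input values are hypothetical.

energy_kwh = 0.002        # E: energy used by one request (kWh), hypothetical
carbon_intensity = 400.0  # I: grid carbon intensity (gCO2eq/kWh), hypothetical
embodied_g = 0.05         # M: embodied emissions amortised to one request (gCO2eq), hypothetical
functional_unit = 1       # R: the functional unit, here a single API request

sci = (energy_kwh * carbon_intensity + embodied_g) / functional_unit
print(f"SCI ≈ {sci:.2f} gCO2eq per request")
```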
- Can AI Be Truly Green? – Explores how AI can reduce its carbon footprint by leveraging carbon-aware software, energy-efficient hardware, and smaller models, making AI systems more environmentally sustainable.
- What Would It Take to Make AI ‘Greener’? – Discusses the energy usage of large AI models like GPT-3 and how AI-driven initiatives can reduce carbon emissions through smarter energy use and climate action.
- Greening AI: A Policy Agenda for the Artificial Intelligence and Energy Revolutions – This paper sets out a policy agenda that aims to minimize AI's energy and carbon costs while maximizing its benefits.
- A Review of Green Artificial Intelligence: Towards a More Sustainable Future – A comprehensive review of the current state of Green AI, analyzing key trends and offering strategies for reducing the environmental impact of artificial intelligence.
- Foundations of Responsible and Sustainable AI: A Beginner's Handbook – A beginner’s handbook to get started on Sustainable AI.
- Uncovering Energy-Efficient Practices in Deep Learning Training: Preliminary Steps Towards Green AI (2023) – This paper provides a comprehensive overview of techniques for reducing the energy consumption of deep learning models, including model compression, efficient architectures, and hardware-aware training.
- A Systematic Review of Green AI (2023) – A detailed review of Green AI research, covering energy efficiency, carbon footprint reduction, and ecological impacts of AI systems.
- Green AI: Exploring Carbon Footprints and Mitigation Strategies (2022) – Examines carbon footprint and energy-saving strategies for AI development and training.
- Carbon Emissions and Large Neural Network Training (2022) – This paper focuses on the computational demands of large-scale AI models and provides a detailed analysis of their carbon emissions. The authors suggest ways to optimize neural network training to reduce the carbon footprint.
- OpenCarbonEval: A Unified Carbon Emission Estimation Framework in Large-Scale AI Models (2023) – Introduces a framework for estimating carbon emissions in large-scale AI models, offering dynamic predictions by analyzing various model architectures and hardware configurations.
- MLPerf Power: Benchmarking the Energy Efficiency of Machine Learning Systems from μWatts to MWatts for Sustainable AI (2024) – This paper presents the MLPerf Power benchmark, evaluating energy efficiency across a wide range of hardware systems, from edge devices to large cloud systems, offering insights for designing energy-efficient AI solutions.
This section lists key organizations and communities focused on reducing the environmental footprint of digital technologies, including AI and software systems.
- Green Software Foundation – A non-profit focused on reducing the environmental impact of software systems by developing standards, tools, and best practices.
- Climate Change AI – An organization dedicated to harnessing AI to address climate change.