Academic publications that study environmentally sustainable artificial intelligence – Green AI. In particular, this bibliography focuses on research that aims at making the development of AI systems more environmentally friendly.

Contributions are welcome! If you have suggestions for other readings, consider sending a pull request or creating an issue in the GitHub repository! 😉

Papers

📅 2024

Jieke Shi, Zhou Yang, Hong Jin Kang, Bowen Xu, Junda He, David Lo (2024). Greening Large Language Models of Code. ICSE-SEIS.
model optimization energy estimation

This paper presents an interesting approach to compressing language models of code into smaller and more efficient models. The approach consists of using optimisation to deliver a model that minimises size and FLOPs (mentioned as a proxy for time, energy consumption, and carbon emissions), and maximises mod… Read more.
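
As a rough, hypothetical illustration of the size/FLOPs/accuracy trade-off such an optimisation navigates (the candidate names, weights, and numbers below are invented, not the paper's actual method), a scalarised version could look like this:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One candidate configuration of a compressed model (illustrative)."""
    name: str
    size_mb: float    # on-disk model size
    gflops: float     # FLOPs per prediction, a proxy for time/energy/carbon
    accuracy: float   # task accuracy in [0, 1]

def score(c: Candidate, w_size=0.3, w_flops=0.3, w_acc=0.4,
          size_ref=500.0, gflops_ref=20.0) -> float:
    # Scalarised objective: reward accuracy, penalise size and FLOPs.
    return (w_acc * c.accuracy
            - w_size * c.size_mb / size_ref
            - w_flops * c.gflops / gflops_ref)

candidates = [
    Candidate("baseline", size_mb=480.0, gflops=18.0, accuracy=0.92),
    Candidate("pruned",   size_mb=120.0, gflops=5.0,  accuracy=0.90),
    Candidate("tiny",     size_mb=12.0,  gflops=0.8,  accuracy=0.83),
]
print(max(candidates, key=score).name)  # -> "tiny" with these illustrative weights
```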

H Järvenpää, P Lago, J Bogner, G Lewis, H Muccini, I Ozkaya (2024). A Synthesis of Green Architectural Tactics for ML-Enabled Systems. ICSE-SEIS.

Timur Babakol, Yu David Liu (2024). Tensor-Aware Energy Accounting. ICSE-SEIS.

📅 2023

A Tundo, M Mobilio, S Ilager, I Brandić, E Bartocci, L Mariani (2023). An Energy-Aware Approach to Design Self-Adaptive AI-based Applications on the Edge. ASE.

Samarth Sikand, Vibhu Saujanya Sharma, Vikrant Kaulgud, Sanjay Podder (2023). Green AI Quotient: Assessing Greenness of AI-based software and the way forward. ASE.

Tim Yarally, Luís Cruz, Daniel Feitosa, June Sallou, and Arie van Deursen (2023). Uncovering Energy-Efficient Practices in Deep Learning Training: Preliminary Steps Towards Green AI. CAIN.

Jie You, Jae-Won Chung, Mosharaf Chowdhury (2023). Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training. NSDI ’23.
Green computing Cloud computing Optimizing energy Tool

📅 2022

Mirazul Haque, Yaswanth Yadlapalli, Wei Yang, Cong Liu (2022). EREBA: Black-box Energy Testing of Adaptive Neural Networks. ICSE.
Energy testing Adaptive neural networks


Stefanos Georgiou, Maria Kechagia, Tushar Sharma, Federica Sarro, Ying Zou (2022). Green AI: Do Deep Learning Frameworks Have Different Costs? ICSE.
Deep Learning AI frameworks

📅 2021

Carole-Jean Wu, Ramya Raghavendra, Udit Gupta, Bilge Acun, Newsha Ardalani, Kiwan Maeng, Gloria Chang, Fiona Aga Behram, James Huang, Charles Bai, Michael Gschwind, Anurag Gupta, Myle Ott, Anastasia Melnikov, Salvatore Candido, David Brooks, Geeta Chauhan, Benjamin Lee, Hsien-Hsin S. Lee, Bugra Akyildiz, Maximilian Balandat, Joe Spisak, Ravi Jain, Mike Rabbat, Kim Hazelwood (2021). Sustainable AI: Environmental Implications, Challenges and Opportunities. FAccT ’21: Conference on Fairness, Accountability, and Transparency.

Ana Radovanovic, Ross Koningstein, Ian Schneider, Bokan Chen, Alexandre Duarte, Binz Roy, Diyue Xiao, Maya Haridasan, Patrick Hung, Nick Care, Saurav Talukdar, Eric Mullen, Kendal Smith, MariEllen Cottman, Walfredo Cirne (2021). Carbon-Aware Computing for Datacenters. arXiv preprint.
Green computing Cloud computing Carbon-aware datacenter

This paper presents Google’s Carbon-Intelligent Computing System (CICS), which uses carbon-intensity data to shift datacenter jobs in time. Based on carbon intensity and power cost, the system computes a metric named virtual capacity curve (VCC) that limits the amount of CPU resources avail… Read more.
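
This is not the VCC computation itself, but a toy sketch of the underlying idea of time-shifting a delay-tolerant job under an assumed carbon-intensity forecast (all numbers below are made up):

```python
# Hypothetical hourly carbon-intensity forecast (gCO2eq/kWh) for one day.
forecast = [520, 480, 430, 390, 360, 350, 370, 420,
            470, 510, 530, 540, 545, 540, 520, 490,
            450, 410, 380, 370, 390, 440, 490, 515]

def best_start_hour(job_hours: int, deadline_hour: int) -> int:
    """Start a delay-tolerant job in the window with the lowest total
    carbon intensity, finishing before `deadline_hour`."""
    starts = range(deadline_hour - job_hours + 1)
    return min(starts, key=lambda s: sum(forecast[s:s + job_hours]))

print(best_start_hour(job_hours=3, deadline_hour=24))  # picks the overnight dip
```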

Timothy Nguyen, Zhourong Chen, Jaehoon Lee (2021). Dataset Meta-Learning from Kernel Ridge-Regression. International Conference on Learning Representations (ICLR).
Dataset Distillation Data Efficiency Green Data

This paper presents a technique called Kernel Inducing Points (KIP) to generate new smaller datasets that yield similar performance when used to train a machine learning model. In other words, KIP is an algorithm for dataset distillation. Although the paper does not primarily focus on energy … Read more.
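
A minimal NumPy sketch of the kernel-ridge-regression objective at the heart of KIP is given below: the paper optimises the small support set by differentiating through exactly this kind of loss, whereas here we only evaluate it (the kernel, regulariser, and toy data are placeholders):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the row vectors of A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kip_loss(X_s, y_s, X_t, y_t, reg=1e-6):
    """Kernel ridge regression fit on the small support set (X_s, y_s),
    evaluated on the large target set (X_t, y_t). KIP minimises this
    loss with respect to the support set itself."""
    K_ss = rbf_kernel(X_s, X_s)
    K_ts = rbf_kernel(X_t, X_s)
    alpha = np.linalg.solve(K_ss + reg * np.eye(len(X_s)), y_s)
    return ((K_ts @ alpha - y_t) ** 2).mean()

rng = np.random.default_rng(0)
X_t = rng.normal(size=(200, 5)); y_t = np.sin(X_t.sum(axis=1))  # full data
X_s = rng.normal(size=(10, 5));  y_s = np.sin(X_s.sum(axis=1))  # distilled set
print(kip_loss(X_s, y_s, X_t, y_t))
```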

Emily M. Bender, Timnit Gebru, Angelina McMillan-Major, Shmargaret Shmitchell (2021). On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜. FAccT ’21: Conference on Fairness, Accountability, and Transparency.
NLP Position Paper

A call to realign research directions in the NLP research community. This position paper reflects on the societal issues raised by the development of NLP models and their potential risks and harms. In particular, it raises awareness that NLP models can reproduce or ampl… Read more.

📅 2020

Roy Schwartz, Jesse Dodge, Noah A. Smith, Oren Etzioni (2020). Green AI. Communications of the ACM.
Red AI Position Paper

📅 2019

Emma Strubell, Ananya Ganesh, Andrew McCallum (2019). Energy and policy considerations for deep learning in NLP. arXiv.
NLP Estimating Energy Carbon Emissions

Measures the energy consumption and carbon footprint of training state-of-the-art natural language processing (NLP) models. The authors estimate energy consumption by collecting the reported time required to train those models and the average power consumption of the GPU hardware. The authors als… Read more.
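
This style of estimate boils down to runtime × average power × PUE × grid carbon intensity. A minimal sketch, with illustrative constants (the PUE and grid-intensity defaults below are assumptions; substitute values for the actual datacenter and electricity mix):

```python
def training_emissions_kg(train_hours: float, avg_power_watts: float,
                          pue: float = 1.58,
                          kg_co2_per_kwh: float = 0.432) -> float:
    """Emissions estimate from reported runtime and average power draw."""
    energy_kwh = train_hours * (avg_power_watts / 1000.0) * pue
    return energy_kwh * kg_co2_per_kwh

# e.g. 120 h of training at an average draw of 1,000 W:
print(f"{training_emissions_kg(120, 1000):.0f} kg CO2eq")  # ≈ 82 kg
```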

Alexandre Lacoste, Alexandra Luccioni, Victor Schmidt, Thomas Dandres (2019). Quantifying the carbon emissions of machine learning. arXiv.
Estimating Energy Carbon Emissions Tool Position paper

The paper presents a tool to estimate carbon emissions: the Machine Learning Emissions Calculator. Based on the hardware type (e.g., Tesla P100), cloud provider (e.g., Google Cloud Platform), location of the data centre, and number of hours used, it computes the total carbon emitted in kg CO2eq. T… Read more.
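
A sketch mirroring the calculator's inputs is shown below; the lookup tables are hypothetical stand-ins (the real calculator uses measured hardware power draw and per-provider, per-region grid data):

```python
# Hypothetical lookup tables for GPU power draw and regional grid intensity.
GPU_WATTS = {"Tesla P100": 250, "Tesla V100": 300}
REGION_KG_CO2_PER_KWH = {"us-central1": 0.48, "europe-north1": 0.09}

def emissions_kg(gpu: str, region: str, hours: float) -> float:
    """Carbon emitted (kg CO2eq) for `hours` of use of one GPU."""
    kwh = GPU_WATTS[gpu] / 1000.0 * hours
    return kwh * REGION_KG_CO2_PER_KWH[region]

print(f"{emissions_kg('Tesla P100', 'us-central1', 100):.1f} kg CO2eq")  # 12.0
```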

Eva García-Martín, Crefeda Faviola Rodrigues, Graham Riley, Håkan Grahn (2019). Estimation of energy consumption in machine learning. Journal of Parallel and Distributed Computing.
Estimating Energy Literature Review

Everything there is to know about the theory behind creating power models. The paper also gives an overview of five existing tools that help create power models: ARM Streamline, Powmon, Intel Power Gadget, McPAT, and PAPI. Although it describes the compatibility of the tools in terms of operating sys… Read more.
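
Many power models in this literature are regressions from utilisation or performance counters to measured power. A minimal single-predictor sketch (the calibration measurements below are invented):

```python
import numpy as np

# Made-up calibration data: CPU utilisation (fraction) vs. measured power (W).
util  = np.array([0.10, 0.25, 0.40, 0.60, 0.80, 0.95])
power = np.array([38.0, 52.0, 66.0, 85.0, 104.0, 118.0])

# Fit a linear power model P(u) = P_idle + k * u by least squares.
k, p_idle = np.polyfit(util, power, 1)
print(f"P(u) ≈ {p_idle:.1f} + {k:.1f}·u W")
print(f"predicted at 50% load: {p_idle + 0.5 * k:.1f} W")
```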