Efficiency is Not Enough: A Critical Perspective of Environmentally Sustainable AI (2309.02065v2)
Abstract: AI is currently spearheaded by ML methods such as deep learning, which have accelerated progress on many tasks previously thought to be out of reach of AI. These recent ML methods are often compute hungry and energy intensive, and they result in significant greenhouse gas emissions, a known driver of anthropogenic climate change. Additionally, the platforms on which ML systems run are associated with environmental impacts that go beyond the carbon emissions driven by their energy consumption. The primary solution lionized by both industry and the ML community to improve the environmental sustainability of ML is to increase the compute and energy efficiency with which ML systems operate. In this perspective, we argue that it is time to look beyond efficiency in order to make ML more environmentally sustainable. We present three high-level discrepancies between the many variables that influence the efficiency of ML and its environmental sustainability. First, we discuss how compute efficiency does not imply energy efficiency or carbon efficiency. Second, we present the unexpected effects of efficiency on operational emissions throughout the ML model life cycle. Finally, we explore the broader environmental impacts that efficiency does not account for. These discrepancies show why efficiency alone is not enough to remedy the adverse environmental impacts of ML. Instead, we argue for systems thinking as the next step towards holistically improving the environmental sustainability of ML.
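The first discrepancy, that compute efficiency does not imply energy or carbon efficiency, can be made concrete with a back-of-the-envelope calculation: operational emissions depend not only on how many operations a model needs, but also on the energy efficiency of the hardware it runs on and the carbon intensity of the electricity grid. The sketch below is illustrative only and not from the paper; all workload sizes, hardware efficiencies, and grid intensities are hypothetical assumptions chosen to make the point concrete.

```python
# Illustrative sketch (hypothetical numbers): fewer FLOPs does not
# automatically mean less energy or less carbon.

def energy_kwh(flops: float, flops_per_joule: float) -> float:
    """Energy in kWh for a workload of `flops` on hardware that
    achieves `flops_per_joule` (hardware energy efficiency)."""
    joules = flops / flops_per_joule
    return joules / 3.6e6  # 1 kWh = 3.6e6 J

def carbon_kg(energy: float, grid_kg_co2_per_kwh: float) -> float:
    """Operational carbon = energy used * grid carbon intensity."""
    return energy * grid_kg_co2_per_kwh

# Model A: more compute, but efficient hardware and a low-carbon grid.
a_energy = energy_kwh(flops=1e18, flops_per_joule=5e10)
a_carbon = carbon_kg(a_energy, grid_kg_co2_per_kwh=0.05)

# Model B: 10x more compute-efficient, but older, less energy-efficient
# hardware and a carbon-intensive grid.
b_energy = energy_kwh(flops=1e17, flops_per_joule=2.5e9)
b_carbon = carbon_kg(b_energy, grid_kg_co2_per_kwh=0.7)

print(f"A: {a_energy:.1f} kWh, {a_carbon:.2f} kg CO2")
print(f"B: {b_energy:.1f} kWh, {b_carbon:.2f} kg CO2")
# Under these assumed numbers, model B needs 10x fewer FLOPs yet uses
# roughly twice the energy and emits far more CO2: compute efficiency
# is only one factor alongside hardware efficiency and grid intensity.
```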