Articles from the MIT Lincoln Laboratory Supercomputing Center on Datacenter Energy and Reducing the Environmental Impact of AI Training and Inference
Papers:
Nature Machine Intelligence (accepted) - Neural Scaling of Deep Chemical Models [preprint]
SoCC'2023 (coming soon) - Sustainable Supercomputing for AI: Experiences from GPU Power Capping at HPC Scale
SC'2023 (coming soon) - Sustainable HPC: Modeling, Characterization, and Implications of Carbon Footprint in Modern HPC Systems [preprint]
SC'2023 (coming soon) - Clover: Toward Sustainable AI with Carbon-Aware Machine Learning Inference Service [preprint]
HPEC'2023 (accepted) - From Words to Watts: Benchmarking the Energy Costs of Large Language Model Inference
HPEC'2023 (Outstanding Student Paper; accepted) - An Analysis of Energy Requirement for Computer Vision Algorithms
HPDC'2023 (Best Paper Award) - Kairos: Building Cost-Efficient Machine Learning Inference Systems with Heterogeneous Cloud Resources
HPCA NetZero 2023 - Interventions to Reduce AI Energy Requirements
HPCA NetZero 2023 - Challenges in Building the Carbon Footprint Model for Large-Scale GPU Systems
NAACL 2022 - Great Power, Great Responsibility: Recommendations for Reducing Energy for Training Language Models
IPDPS ADOPT 2022 - Loss Curve Approximations for Fast Neural Architecture Ranking & Training Elasticity Estimation
IPDPS ADOPT 2022 - A Green(er) World for AI
IPDPS ADOPT 2022 - Energy-Aware Neural Architecture Selection and Hyperparameter Optimization
IEEE HPEC 2021 - Serving Machine Learning Inference Using Heterogeneous Hardware
Presentations:
Talk at TTI/Vanguard - [YouTube] [Slides]
Talk at Open Compute Project - [Slides]