Author

Leo Mei

Publication Date

Spring 2025

Degree Type

Master's Project

Degree Name

Master of Science in Computer Science (MSCS)

Department

Computer Science

First Advisor

Mark Stamp

Second Advisor

Fabio Di Troia

Third Advisor

Faranak Abri

Keywords

Deep Learning, Model Compression, Energy Conservation, Sustainable AI, Steganographic Capacity

Abstract

In recent years, neural network models have achieved phenomenal performance, driven by ever-larger parameter counts and increasingly complex architectures. However, these advances come at a high environmental cost: they require massive computational resources and consume substantial electricity, leading to significant carbon emissions. Previous studies have demonstrated that large pre-trained models contain substantial redundancy, and that removing this redundancy through compression need not compromise model performance. While those studies focused on retaining comparable model performance, the direct impact of compression on energy consumption during training appears to have received little attention. By quantifying the energy usage of both uncompressed and compressed models, we investigate whether compressed models consume less electricity, and thereby validate whether compression is an effective strategy for reducing environmental impact. We first train models without compression and record their electricity usage to establish a baseline for comparison. We then consider three compression techniques: steganographic capacity reduction, pruning, and low-rank factorization. In each case, we measure and record each model's electricity usage during training, comparing energy consumption against model performance in terms of classification accuracy. By determining the minimum electricity required to maintain comparable model performance, this research contributes to the development of sustainable artificial intelligence (AI) practices.
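Two of the compression techniques named above, magnitude pruning and low-rank factorization, can be illustrated with a minimal NumPy sketch. This is illustrative only: the function names and the use of NumPy are assumptions for exposition, not the project's actual models or tooling.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Zero out the smallest-magnitude fraction of weights in w.

    sparsity is the target fraction of entries to set to zero
    (ties at the threshold may zero slightly more).
    """
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    # k-th smallest absolute value becomes the pruning threshold
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

def low_rank_factorize(w, rank):
    """Approximate an (m, n) weight matrix as a product A @ B
    of shapes (m, rank) and (rank, n) via truncated SVD,
    reducing the parameter count from m*n to rank*(m + n).
    """
    u, s, vt = np.linalg.svd(w, full_matrices=False)
    a = u[:, :rank] * s[:rank]  # absorb singular values into A
    b = vt[:rank, :]
    return a, b

# Hypothetical example: a single 8x8 weight matrix
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 8))

pruned = magnitude_prune(w, 0.5)     # at least half the weights zeroed
a, b = low_rank_factorize(w, 2)      # 2*(8+8) = 32 parameters vs. 64
```

In a real experiment, the pruned or factorized layers would replace the originals and the model would be retrained while its electricity usage is logged, exactly the comparison against the uncompressed baseline described above.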

Available for download on Monday, May 25, 2026
