Energy-Efficient and Low-Carbon Model Training through Adaptive Pruning for AI Training
Students & Supervisors
Student Authors
Supervisors
Abstract
Artificial Intelligence (AI) has reshaped daily life, transforming education, healthcare, social media, automation, and large industrial organizations. However, the rapid development of Large Language Models (LLMs) and deep learning technologies has created a major problem: extensive training consumes a large amount of energy and significantly increases carbon dioxide (CO2) emissions. Prior research found that the carbon released to train a large NLP model is equivalent to the total lifetime carbon emissions of five gasoline-powered cars. In this study, we propose adaptive pruning as a solution to high energy consumption and carbon emissions. Unlike static pruning, which relies on fixed criteria, adaptive pruning dynamically discovers and eliminates unnecessary weights and neurons throughout training. This flexibility helps balance accuracy and energy efficiency. We evaluated CNN and Transformer models in their baseline and pruned forms on the MNIST, CIFAR-10, and WikiText-2 datasets. The results show that adaptive pruning decreases energy usage by 20–40% while keeping accuracy within 2% of the baseline. This study demonstrates that Green AI is an important step toward sustainable machine learning without sacrificing performance.
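The core idea described above (dynamically eliminating unnecessary weights as training progresses) can be sketched as follows. This is a minimal NumPy illustration under assumptions of our own: a magnitude-based pruning criterion and a linear sparsity ramp over training steps. The paper's actual criterion and schedule may differ; the function and parameter names here are illustrative, not the authors' implementation.

```python
import numpy as np

def adaptive_prune(weights, step, total_steps, final_sparsity=0.5):
    """Zero out the smallest-magnitude weights, ramping sparsity over training.

    Assumed schedule: target sparsity grows linearly from 0 to
    `final_sparsity` as `step` approaches `total_steps`. Returns the
    pruned weight array and the boolean keep-mask.
    """
    # Current target sparsity under the assumed linear ramp.
    sparsity = final_sparsity * (step / total_steps)
    k = int(weights.size * sparsity)  # number of weights to remove now
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    # Find the k-th smallest magnitude; everything at or below it is pruned.
    flat = np.abs(weights).ravel()
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Example: at the final step, half of ten weights are removed.
w = np.arange(1.0, 11.0)
pruned, mask = adaptive_prune(w, step=10, total_steps=10, final_sparsity=0.5)
```

In a real training loop this pruning would be re-applied every few steps, so weights that become unimportant later in training are still discovered, which is what distinguishes adaptive from static (one-shot) pruning.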
Keywords
Publication Details
- DOI: DOI will be available once the paper is indexed in IEEE Xplore
- Type of Publication:
- Conference Name: International Conference on Computer and Information Technology (ICCIT)
- Date of Conference: 19/12/2025 - 19/12/2025
- Venue: Cox’s Bazar, Bangladesh
- Organizer: IEEE Bangladesh Section