DOI : https://doi.org/10.5281/zenodo.20196394
- Open Access
- Authors : Dr. Arun Kumar Gupta, Shamhira Khatoon, Vinay Kumar Sachan
- Paper ID : IJERTV15IS050966
- Volume & Issue : Volume 15, Issue 05, May 2026
- Published (First Online): 15-05-2026
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License:
This work is licensed under a Creative Commons Attribution 4.0 International License
Energy-Efficient and Carbon-Conscious Approaches for Sustainable Computing
Shamhira Khatoon, Vinay Kumar Sachan, Arun Kumar Gupta
School of Engineering and Technology, Department of Chemical Engineering, CSJM University, Kanpur, India
Abstract – The rapid growth of computing technologies, particularly artificial intelligence and large-scale data processing, has significantly increased global energy consumption and carbon emissions. Sustainable computing has therefore emerged as an important approach to reduce the environmental impact of modern computing systems. This study examines the latest trends in sustainable computing, focusing on Green AI, energy-efficient data centers, and carbon-aware computing. Green AI emphasizes the development of machine learning models that minimize computational cost and energy usage while maintaining high performance. Energy-efficient data centers aim to optimize power utilization through advanced cooling technologies, efficient hardware, and improved resource management. Carbon-aware computing further enhances sustainability by scheduling computational workloads based on the carbon intensity of available energy sources. The study also highlights key research challenges associated with these technologies, including high energy demand of AI models, difficulties in balancing efficiency with performance, limited transparency in energy consumption data, and the complexity of integrating renewable energy into computing infrastructures. Addressing these challenges requires innovative algorithms, improved hardware design, and collaborative efforts between academia and industry. The findings of this study emphasize the importance of sustainable computing practices in reducing environmental impact while supporting the continued growth of digital technologies.
Keywords: Sustainable computing; Green AI; Carbon-aware computing; Green data centers; Energy-efficient computing
INTRODUCTION
The rapid growth of digital technologies, cloud computing, and artificial intelligence (AI) has significantly increased global demand for computational resources. As modern societies rely heavily on data-driven services such as machine learning, big data analytics, and Internet-based applications, the infrastructure supporting these technologies, particularly data centers, has expanded rapidly. However, this growth has raised serious environmental concerns due to high electricity consumption and increasing carbon emissions associated with large-scale computing systems. Data centers alone consume a substantial portion of global electricity, and their energy demand is expected to increase further with the rapid development of AI-driven applications and large-scale machine learning models.
Sustainable computing has emerged as a critical research field that aims to minimize the environmental impact of digital infrastructure while maintaining high computational performance. The concept encompasses energy-efficient hardware design, optimized software algorithms, and environmentally responsible operational practices in computing systems. By integrating renewable energy sources, improving resource utilization, and adopting advanced cooling and power management techniques, sustainable computing seeks to reduce energy consumption and greenhouse gas emissions associated with computing technologies.
Among the recent developments in this domain, Green AI has gained significant attention as a framework for designing artificial intelligence systems that prioritize energy efficiency and reduced carbon footprint. Traditional AI models often require extensive computational power during training and deployment, leading to high energy consumption. Green AI focuses on optimizing algorithms, reducing computational complexity, and developing efficient training strategies that achieve high performance while minimizing energy usage and environmental impact.
Another important aspect of sustainable computing is the development of energy-efficient data centers. Data centers are the backbone of modern digital infrastructure but require significant power for server operation, cooling systems, and network equipment. Recent research emphasizes the use of energy-efficient hardware, intelligent workload management, advanced cooling technologies, and renewable energy integration to reduce power consumption. Techniques such as liquid cooling, hot-aisle/cold-aisle containment, and improved resource scheduling have been widely adopted to improve energy efficiency and reduce operational costs.
In addition, carbon-aware computing has emerged as an innovative approach that considers the carbon intensity of electricity sources when scheduling computational workloads. By dynamically shifting computing tasks to locations or time periods where renewable energy availability is higher or carbon intensity is lower, carbon-aware systems can significantly reduce the environmental impact of large-scale computing operations. This approach integrates real-time energy data, intelligent scheduling algorithms, and workload optimization techniques to achieve sustainable digital infrastructure.
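The scheduling idea described above can be sketched in a few lines: given an hourly carbon-intensity forecast, a deferrable job is started in the cleanest slot that still meets its deadline. The forecast values and the `best_start_hour` helper below are illustrative assumptions for the sketch, not part of any particular scheduler.

```python
# Hypothetical hourly carbon-intensity forecast for the local grid (gCO2/kWh).
# Midday values are low, reflecting assumed solar generation.
forecast = {0: 480, 3: 410, 6: 350, 9: 210, 12: 160, 15: 190, 18: 320, 21: 450}

def best_start_hour(forecast, deadline_hour):
    """Pick the start hour with the lowest forecast carbon intensity
    among the slots that still allow the job to meet its deadline."""
    candidates = {h: c for h, c in forecast.items() if h <= deadline_hour}
    return min(candidates, key=candidates.get)

# A deferrable batch job that must start by 15:00 is shifted to midday,
# when the forecast carbon intensity is lowest.
print(best_start_hour(forecast, deadline_hour=15))  # 12
```

Real carbon-aware schedulers combine such forecasts with workload migration across regions and performance constraints; this sketch only shows the time-shifting component.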
Therefore, the integration of Green AI, energy-efficient data center technologies, and carbon-aware computing strategies represents a promising pathway toward environmentally sustainable digital systems. These emerging trends not only reduce the ecological footprint of computing but also contribute to improved operational efficiency and cost savings. This study explores the latest developments in sustainable computing and highlights the role of advanced technologies in achieving greener and more energy-efficient computing infrastructures.
GREEN ARTIFICIAL INTELLIGENCE
Figure 1. Growth of ICT Energy Consumption
The graph illustrates the increasing trend in global electricity consumption by the Information and Communication Technology (ICT) sector over the period from 2015 to 2024. The horizontal axis represents the years, while the vertical axis shows the global ICT electricity consumption measured in terawatt-hours (TWh). The plotted data points are connected by a line to highlight the overall growth trend over time.
At the beginning of the period in 2015, the electricity consumption of the ICT sector was approximately 220 TWh. This value reflects the energy required to support computing infrastructure, communication networks, and digital services worldwide at that time. In 2016, consumption increased slightly to about 235 TWh, indicating the growing demand for digital services such as cloud computing, mobile communications, and internet-based applications.
The upward trend continues in 2017, where electricity usage reached around 250 TWh. This gradual rise can be attributed to the expansion of data centers, increased internet penetration, and rapid adoption of smartphones and connected devices. By 2018, the consumption further increased to approximately 270 TWh, reflecting the accelerating demand for computing power and digital infrastructure.
In 2019, global ICT electricity consumption rose to about 295 TWh, showing a noticeable increase compared to previous years. This growth corresponds with the rapid development of artificial intelligence (AI), big data analytics, and large-scale cloud computing platforms, which require significant computational resources. By 2020, the energy consumption reached approximately 320 TWh, partly due to the global shift toward remote working, online education, and digital communication platforms, which significantly increased internet traffic and data processing requirements.
The upward trend becomes even more pronounced from 2021 onward. In 2021, ICT electricity consumption increased to about 350 TWh, followed by 380 TWh in 2022. This period reflects the rapid expansion of hyperscale data centers, edge computing infrastructure, and the growing use of AI-driven technologies.
By 2023, the consumption reached approximately 420 TWh, highlighting the substantial energy demand associated with modern digital ecosystems. The highest value on the graph appears in 2024, where ICT electricity consumption rises sharply to around 460 TWh. This indicates a nearly twofold increase compared to 2015, emphasizing the significant energy requirements of the global digital infrastructure.
Overall, the graph clearly demonstrates a consistent and accelerating growth in ICT-related electricity consumption over the ten-year period. This trend highlights the urgent need for sustainable computing practices, including energy-efficient data centers, Green AI technologies, and carbon-aware computing strategies, to reduce the environmental impact of rapidly expanding digital systems. Improving energy efficiency and integrating renewable energy sources into ICT infrastructure will be essential to support future technological growth while minimizing carbon emissions.
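The roughly twofold growth described above can be checked directly from the cited figures. The short sketch below restates the approximate values from the text and computes the implied compound annual growth rate; the `cagr` helper is an illustrative calculation, not data from the original graph.

```python
# Approximate global ICT electricity consumption (TWh), as cited above.
consumption = {2015: 220, 2016: 235, 2017: 250, 2018: 270, 2019: 295,
               2020: 320, 2021: 350, 2022: 380, 2023: 420, 2024: 460}

def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

growth = cagr(consumption[2015], consumption[2024], 2024 - 2015)
print(f"Overall increase: {consumption[2024] / consumption[2015]:.2f}x")  # 2.09x
print(f"Approximate annual growth rate: {growth:.1%}")                    # 8.5%
```

An annual growth rate of roughly 8 to 9 percent is consistent with the "nearly twofold increase" over the decade noted in the text.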
ENERGY-EFFICIENT DATA CENTERS
The graph illustrates the distribution of energy consumption among different components of a data center. The horizontal axis represents the major data center components, namely servers, cooling systems, storage units, networking equipment, and power systems, while the vertical axis represents the percentage share of total energy consumption (%). The bar chart clearly shows how energy usage is distributed across these different components within a typical data center infrastructure.
From the graph, it is evident that servers consume the largest share of energy, accounting for approximately 45% of the total electricity consumption in data centers. Servers perform the primary computing tasks such as processing data, running applications, and handling cloud services. Due to the continuous operation of processors, memory modules, and supporting hardware, servers require a substantial amount of electrical power. The increasing demand for cloud computing, artificial intelligence (AI), and big data analytics has further increased the energy consumption associated with server infrastructure.
The second largest contributor to energy consumption is the cooling system, which accounts for about 30% of the total energy usage. Data center servers generate a large amount of heat during operation, and efficient cooling systems are necessary to maintain optimal operating temperatures and prevent hardware failure. Cooling mechanisms such as air conditioning systems, chilled water cooling, liquid cooling, and airflow management techniques are widely used in modern data centers. Although cooling is essential for maintaining system reliability, it also represents a significant portion of the overall energy demand.
The storage systems contribute approximately 10% of the total energy consumption. These systems include hard disk drives (HDDs), solid-state drives (SSDs), and other data storage technologies that store and retrieve large volumes of digital information. As the amount of global data continues to grow rapidly due to digital services, multimedia content, and IoT devices, storage infrastructure has become an important component of data center energy consumption.
The networking equipment, which includes routers, switches, and communication devices responsible for transferring data within and outside the data center, accounts for around 8% of the total energy usage. Although networking devices consume less energy compared to servers and cooling systems, they are critical for maintaining high-speed communication and connectivity between different computing resources.
Finally, power systems, including power distribution units (PDUs), transformers, backup power supplies, and uninterruptible power supply (UPS) systems, consume about 7% of the total energy. These systems ensure stable power delivery and provide backup during power interruptions, which is essential for maintaining uninterrupted data center operations.
Figure 2. Energy Distribution in Data Centers
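The percentage breakdown above translates directly into component-level power budgets. The sketch below splits an assumed facility load across components using the shares cited in the text; the 2 MW total is purely illustrative.

```python
# Approximate energy share per data-center component, as cited above.
shares = {"servers": 0.45, "cooling": 0.30, "storage": 0.10,
          "networking": 0.08, "power systems": 0.07}

def component_load_kw(total_load_kw, shares):
    """Split a facility's total electrical load across components
    according to their fractional energy shares."""
    return {name: total_load_kw * frac for name, frac in shares.items()}

# For an assumed 2 MW facility:
for name, kw in component_load_kw(2000, shares).items():
    print(f"{name:>14}: {kw:.0f} kW")
```

Under these shares, servers draw 900 kW and cooling 600 kW, which is why server efficiency and cooling optimization dominate data-center energy research.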
The table presents several important metrics used to evaluate the sustainability and energy efficiency of data centers. These metrics help researchers, engineers, and data center operators assess how efficiently resources such as energy, carbon emissions, and water are managed within computing facilities.
Power Usage Effectiveness (PUE) is one of the most widely used metrics for measuring data center energy efficiency. It represents the ratio of the total energy consumed by the entire facility to the energy used by IT equipment such as servers, storage devices, and networking systems. A lower PUE value indicates higher efficiency because more of the consumed energy is directly used for computing rather than supporting infrastructure like cooling and lighting. Ideally, a PUE value close to 1.0 represents a highly efficient data center.
Carbon Usage Effectiveness (CUE) measures the amount of carbon emissions generated per unit of IT energy consumption. This metric reflects the environmental impact of a data center in terms of greenhouse gas emissions. CUE depends on the type of energy sources used to power the facility. Data centers powered by renewable energy sources such as solar or wind typically have lower CUE values, indicating reduced carbon footprints and more sustainable operations.
Water Usage Effectiveness (WUE) evaluates the amount of water consumed by a data center for cooling purposes relative to the energy used by IT equipment. Cooling systems in large-scale data centers often require significant water resources, especially in evaporative cooling technologies. Lower WUE values indicate better water efficiency, which is important in regions where water scarcity is a concern.
Energy Reuse Effectiveness (ERE) measures the extent to which waste energy generated in a data center is reused within or outside the facility. For example, excess heat generated by servers can sometimes be captured and reused for heating nearby buildings or industrial processes. A lower ERE value indicates better reuse of energy and improved overall efficiency of the facility.
Overall, these metrics collectively provide a comprehensive framework for evaluating the sustainability performance of modern data centers. By monitoring and optimizing parameters such as PUE, CUE, WUE, and ERE, organizations can improve energy efficiency, reduce environmental impact, and promote the development of green and sustainable computing infrastructures.
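The metrics discussed above are simple ratios and can be computed directly from metered values. The following minimal sketch uses assumed, illustrative figures for one day of operation; they are not drawn from any real facility.

```python
def pue(total_facility_energy_kwh, it_energy_kwh):
    """Power Usage Effectiveness: total facility energy / IT energy.
    Values close to 1.0 mean most power reaches IT equipment."""
    return total_facility_energy_kwh / it_energy_kwh

def cue(total_co2_kg, it_energy_kwh):
    """Carbon Usage Effectiveness: kg of CO2 emitted per kWh of IT energy."""
    return total_co2_kg / it_energy_kwh

def wue(water_litres, it_energy_kwh):
    """Water Usage Effectiveness: litres of water per kWh of IT energy."""
    return water_litres / it_energy_kwh

# Illustrative figures: 120 MWh drawn by the facility in a day,
# 80 MWh of it by IT equipment, 36 t CO2 emitted, 144 m^3 of water used.
print(f"PUE = {pue(120_000, 80_000):.2f}")  # 1.50
print(f"CUE = {cue(36_000, 80_000):.2f}")   # 0.45 kg CO2/kWh
print(f"WUE = {wue(144_000, 80_000):.2f}")  # 1.80 L/kWh
```

A PUE of 1.50 means that for every kWh of computing, another half kWh goes to cooling, power conversion, and other overheads.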
Table 1. Major Sustainability Metrics for Data Centers
| Metric | Description |
| --- | --- |
| PUE (Power Usage Effectiveness) | Ratio of total facility energy to IT equipment energy |
| CUE (Carbon Usage Effectiveness) | Carbon emissions per unit of IT energy consumption |
| WUE (Water Usage Effectiveness) | Water consumption for cooling relative to IT energy use |
| ERE (Energy Reuse Effectiveness) | Measure of waste-energy reuse within or outside the facility |
CARBON-AWARE COMPUTING
The graph illustrates the carbon intensity associated with different energy sources used for electricity generation. Carbon intensity refers to the amount of carbon dioxide (CO2) emissions produced per unit of electricity generated, typically expressed in grams of CO2 per kilowatt-hour (gCO2/kWh). The horizontal axis represents different energy sources, namely coal, natural gas, hydro, solar, and wind, while the vertical axis shows their corresponding carbon intensity values.
From the graph, it is evident that coal has the highest carbon intensity, with a value of approximately 820 gCO2/kWh. This indicates that coal-based power generation releases a large amount of carbon dioxide into the atmosphere during combustion. Coal has traditionally been one of the most widely used fossil fuels for electricity production; however, it is also one of the largest contributors to greenhouse gas emissions and climate change. The high carbon intensity shown in the graph highlights the environmental concerns associated with continued reliance on coal for power generation.
The second highest carbon intensity is observed for natural gas, which produces approximately 490 gCO2/kWh. Although natural gas emits significantly less carbon dioxide than coal, it is still a fossil fuel and contributes to greenhouse gas emissions. Natural gas power plants are often considered a transition energy source because they are relatively cleaner compared to coal, but they still have a notable environmental impact.
In contrast, renewable energy sources such as hydro, solar, and wind show significantly lower carbon intensity values. Hydropower has a carbon intensity of about 24 gCO2/kWh, which is much lower than fossil fuels. Hydropower generates electricity through the movement of water, resulting in minimal direct carbon emissions during operation.
Similarly, solar energy has a carbon intensity of approximately 45 gCO2/kWh. Although solar power itself does not emit carbon dioxide during electricity generation, some emissions occur during the manufacturing, transportation, and installation of solar panels. Nevertheless, its overall carbon footprint remains significantly lower than that of fossil fuels.
Among the energy sources presented, wind energy has the lowest carbon intensity, at around 11 gCO2/kWh. Wind power generates electricity through wind turbines and produces almost no direct emissions during operation. The small carbon footprint mainly results from the manufacturing and installation of wind turbines.
Overall, the graph clearly demonstrates the significant difference in environmental impact between fossil fuel-based energy sources and renewable energy technologies. While coal and natural gas contribute substantially to carbon emissions, renewable sources such as wind, hydro, and solar provide cleaner alternatives with much lower carbon intensity. This comparison highlights the importance of transitioning toward renewable energy sources in order to reduce global carbon emissions, mitigate climate change, and support sustainable energy systems.
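The practical consequence of these intensity differences is easy to quantify. The sketch below estimates annual CO2 emissions for a hypothetical 1 MW facility running around the clock under each source, using the intensity values cited above; the facility size is an assumption for illustration.

```python
# Carbon intensity of electricity sources (gCO2/kWh), as cited above.
carbon_intensity = {"coal": 820, "natural gas": 490, "solar": 45,
                    "hydro": 24, "wind": 11}

def annual_emissions_tonnes(load_kw, intensity_g_per_kwh):
    """Annual CO2 emissions (tonnes) of a constant electrical load."""
    kwh_per_year = load_kw * 24 * 365
    return kwh_per_year * intensity_g_per_kwh / 1e6  # grams -> tonnes

# A hypothetical 1 MW data center running continuously:
for source, intensity in carbon_intensity.items():
    print(f"{source:>12}: {annual_emissions_tonnes(1000, intensity):,.0f} t CO2/yr")
```

Under these assumptions, the same workload emits roughly 7,200 tonnes of CO2 per year on coal but under 100 tonnes on wind, which is the core motivation for carbon-aware workload placement.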
Figure 3. Carbon Intensity of Energy Sources
EMERGING TECHNOLOGIES SUPPORTING SUSTAINABLE COMPUTING
Emerging technologies play an important role in supporting sustainable computing, which focuses on reducing the environmental impact of computing systems while maintaining efficiency and performance. One of the key technologies is energy-efficient hardware, where modern processors, GPUs, and memory systems are designed to consume less power and generate less heat. Another important development is the use of green data centers, which incorporate renewable energy sources such as solar and wind power, along with advanced cooling techniques and server virtualization to minimize energy consumption. Cloud computing also contributes to sustainability by allowing multiple users to share centralized computing resources, which improves server utilization and reduces the need for excessive hardware infrastructure. In addition, edge computing processes data closer to the source rather than sending it to distant data centers, thereby reducing network energy usage and latency. Artificial Intelligence (AI) is also increasingly used to optimize energy consumption by managing workloads efficiently and improving cooling systems in data centers. Furthermore, research into quantum computing and biodegradable electronic materials offers promising future solutions for reducing the energy requirements and electronic waste associated with traditional computing devices.
A conceptual graph representing sustainable computing trends can show technology advancement on the horizontal axis and environmental impact or energy consumption on the vertical axis. In traditional computing systems, energy consumption is high, but as emerging technologies such as cloud computing, edge computing, AI-based optimization, and green data centers are adopted, the environmental impact gradually decreases. This trend demonstrates how technological innovation contributes to making computing systems more energy-efficient, environmentally friendly, and sustainable over time.
RESEARCH CHALLENGES
Sustainable computing has gained significant attention due to the rapid growth of artificial intelligence and large-scale cloud infrastructure. However, several research challenges remain in the areas of Green AI, energy-efficient data centers, and carbon-aware computing. One major challenge is the high energy consumption of AI models. Modern machine learning systems require massive computational resources for training and inference, which leads to significant electricity use and increased carbon emissions. Training large AI models may consume electricity comparable to hundreds of households annually, making it difficult to maintain environmental sustainability.
Another challenge is optimizing algorithms for energy efficiency without sacrificing performance. Green AI aims to design machine learning models that achieve high accuracy while consuming less energy. Researchers must develop techniques such as model compression, pruning, and efficient training methods to reduce the computational cost of AI systems. Although such techniques can reduce emissions significantly, maintaining model accuracy and scalability remains a difficult task.
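Magnitude pruning, one of the model-compression techniques mentioned above, can be illustrated in plain Python: the smallest-magnitude fraction of weights is set to zero so that sparse execution can skip them. This is a minimal sketch with a toy weight list, not a production pruning routine; real implementations operate on tensors and usually fine-tune the model afterwards to recover accuracy.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest
    absolute value -- a simple one-shot magnitude-pruning sketch."""
    k = int(len(weights) * sparsity)
    # Threshold = k-th smallest absolute value; nothing is pruned if k == 0.
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else float("-inf")
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02, -0.3, 0.08]
pruned = prune_by_magnitude(weights, sparsity=0.5)
print(pruned)             # half the weights zeroed, large ones preserved
print(pruned.count(0.0))  # 4
```

The pruned model performs fewer multiply-accumulate operations, which is the direct mechanism by which compression reduces inference energy.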
In the case of energy-efficient data centers, managing power consumption and cooling systems is a major research issue. Data centers require large amounts of energy to operate servers and maintain cooling systems that prevent overheating. High-density computing increases performance but also creates challenges in power distribution and thermal management. Developing advanced cooling technologies, efficient power usage mechanisms, and better resource allocation strategies is therefore a critical research area.
Another challenge involves carbon-aware workload scheduling. Carbon-aware computing attempts to reduce emissions by scheduling computational tasks when renewable energy is available or when the carbon intensity of the power grid is lower. However, predicting energy availability, managing distributed data centers, and balancing performance with carbon reduction are complex problems that require advanced optimization algorithms and real-time monitoring systems.
A further research challenge is the lack of transparency and data availability regarding energy usage and carbon emissions in data centers. Many technology companies treat operational information such as energy consumption and emission levels as proprietary data. This lack of transparency makes it difficult for researchers to analyze real-world energy patterns and develop effective solutions for sustainable AI infrastructure.
In addition, balancing operational performance with environmental sustainability remains a key challenge. Data centers must maintain high availability, low latency, and strong computational performance while minimizing energy consumption and carbon emissions. Achieving this balance requires interdisciplinary research involving hardware design, software optimization, renewable energy integration, and intelligent resource management.
In conclusion, although Green AI, energy-efficient data centers, and carbon-aware computing are promising approaches for sustainable computing, several research challenges remain. These include reducing AI energy consumption, improving energy-efficient algorithms, optimizing data center cooling and power usage, developing carbon-aware scheduling techniques, and increasing transparency in energy data. Addressing these challenges is essential for building environmentally sustainable and energy-efficient computing systems in the future.
CONCLUSION
In conclusion, sustainable computing has become an essential focus in modern information technology due to the increasing energy consumption and environmental impact of large-scale computing systems. Emerging approaches such as Green AI, energy-efficient data centers, and carbon-aware computing aim to reduce carbon emissions and improve the overall efficiency of computing infrastructure. Green AI focuses on developing machine learning models that require fewer computational resources, while energy-efficient data centers emphasize optimized hardware, advanced cooling systems, and better resource management to reduce power consumption. Carbon-aware computing further supports sustainability by scheduling workloads based on the availability of low-carbon energy sources.
Despite these advancements, several research challenges remain, including balancing computational performance with energy efficiency, improving transparency in energy usage data, and developing intelligent algorithms for workload and energy management. Addressing these challenges requires collaboration between researchers, industry, and policymakers to develop innovative technologies and sustainable practices. Overall, the continued development of these technologies will play a crucial role in building environmentally responsible and energy-efficient computing systems for the future.