Generative artificial intelligence (AI) has recently emerged as the hottest topic in the tech industry, but the environmental costs associated with the technology are raising deep concerns. While the success of large language models such as ChatGPT, image-generation AI, and similar technologies symbolizes the pinnacle of data-driven innovation, the massive energy consumption and significant carbon footprint they entail are drawing increasing attention. As Korea strives to become a global AI hub, it cannot be exempt from these issues.

Melissa Heikkilä, a journalist for MIT Technology Review, recently shed light on the carbon footprint of generative AI in an in-depth, data-driven analysis. According to her report and a range of recent academic studies, the power consumed in training and operating large language models far exceeds that of typical IT systems. Synthesizing multiple research findings, the carbon emissions required to train a single AI model are substantial on average, and for some large models the impact is even more severe. Energy consumption by data centers, in particular, is cited as the largest contributor to these emissions. Experts warn that a typical large data center emits tens of thousands of tons of carbon dioxide (CO2) annually, continuously harming not only local communities but also the global environment.

The energy-consumption problem of data centers stems from several intertwined factors. First is the computational load of AI model training. Generative AI repeatedly performs complex mathematical operations, demanding enormous amounts of power in the process. For large language models, high-performance GPU clusters must run for days to weeks to optimize billions of parameters, consuming far more electricity in a single training run than an average household or small business uses in a year.
Second are the cooling systems needed to keep these machines running stably. Computers within data centers heat up during continuous computation, and vast resources are devoted worldwide to cooling them. Cooling alone accounts for a significant share of a data center's total power consumption, and in some regions large quantities of water are consumed for cooling as well.

Leading global AI companies have pledged to strengthen ESG (Environmental, Social, and Governance) management to address these environmental issues, but in practice there is still a long way to go. Major tech companies in advanced countries, including the United States, claim to be actively incorporating renewable energy into their data center operations, yet the pace of AI development is currently outstripping the progress of environmental policy. The transition to renewable sources such as solar and wind power is undoubtedly a positive step, but critics argue that the exponential growth in the scale and complexity of AI models is driving overall energy consumption up so quickly that renewable adoption alone cannot fully resolve the problem. Moreover, many companies that have declared carbon neutrality rely on purchasing carbon offsets rather than actually cutting emissions, drawing criticism that the real environmental gains are limited.

The issue is also attracting significant attention within Korea's AI industry. Korea has established itself as a global hub for AI research and development, with numerous data centers being built, particularly in Seoul and the surrounding metropolitan area. Indeed, the government's 'Digital New Deal' policy prioritizes fostering the AI and data industries and supports the large-scale expansion of data centers. The accompanying environmental responsibility, however, remains relatively neglected.
In Korea, fossil fuels such as coal and natural gas still account for a high share of electricity generation, so increased power consumption by data centers translates directly into increased carbon emissions. Korea's AI industry therefore faces a dilemma: it must satisfy the demands of energy efficiency and sustainability at the same time. For Korea to maintain global competitiveness in AI technology while also meeting its environmental goals, improving data center energy efficiency and transitioning to renewable energy are urgent tasks.

Some argue that the problem can be addressed through more efficient AI model design and the adoption of low-power hardware. For example, model quantization techniques, which reduce the numerical precision of a model's parameters (say, from 32-bit floating point to 8-bit integers) while maintaining similar performance, are being actively researched. Optimization algorithms that improve efficiency during the inference process are also under active development. On the hardware front, low-power chips specialized for AI computation are gaining attention for their potential to significantly reduce power consumption while performing the same workloads.
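To make the quantization idea above concrete, the sketch below maps floating-point weights to 8-bit integers plus a single scale factor. This is a minimal, framework-free illustration written for this article; the function names and sample weights are invented, and production systems use considerably more sophisticated schemes (per-channel scales, calibration, quantization-aware training).

```python
# Toy illustration of symmetric 8-bit weight quantization.
# All names and values here are hypothetical, for explanation only.

def quantize_int8(weights):
    """Map float weights to integers in [-127, 127] plus a scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate float weights from the quantized form."""
    return [v * scale for v in q]

weights = [0.82, -1.27, 0.05, 0.61]       # pretend these are model weights
q, scale = quantize_int8(weights)
restored = dequantize_int8(q, scale)

# Rounding introduces at most scale/2 of error per weight:
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

In this toy example each weight shrinks from 4 bytes (float32) to 1 byte, roughly a fourfold memory reduction, which is one reason quantized models can cut the energy spent moving data during inference.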