While the digital age has enabled an explosive growth in the production and dissemination of information, this innovation does not always lead to positive outcomes. The rapid flow of information is transforming the function of the public sphere, and at the same time it is increasingly identified as one of the risk factors facing democracy.

Dr. Elena Petrova, a political sociologist at the London School of Economics (LSE), recently published an article on the LSE blog titled 'The Digital Transformation Era: New Challenges and Resilience for Democracy' (April 4, 2026). In it, she describes the situation as 'a contemporary challenge demanding digital resilience,' warning that democracy faces new threats in the digital environment.

Among the most concerning elements of the digital landscape are the surge in fake news and hate content. In her article, Dr. Petrova points out that misinformation spreads far faster than real news through social media platforms, often exerting greater influence on people than the truth. This is not merely a matter of individuals' ability to discern information; it also reflects a structural problem in which platform algorithms steer users toward extreme content. Because algorithms are designed to maximize user engagement and dwell time, provocative and extreme content tends to be prioritized for exposure.

This phenomenon is supported by numerous studies showing that misinformation significantly influenced voter decisions in major elections in the United States and Europe. Similar patterns are emerging in Korean society, where large-scale dissemination of misinformation has repeatedly occurred during sensitive political periods such as local and presidential elections, raising concerns that this could degrade the quality of democracy.

Digital platforms such as social media allow for rapid dissemination of and access to information, but they are simultaneously deepening the polarization of public opinion. Dr. Petrova emphasizes that people's information consumption tends to follow politically biased paths, and that even neutral users face a growing risk of gradually drifting toward extreme positions. Algorithm-based recommendation systems create 'filter bubbles' by prioritizing content that aligns with users' existing beliefs, blocking exposure to diverse perspectives and reinforcing confirmation bias. She argues that 'the polarization of public opinion fundamentally obstructs the deliberation and communication required by democracy,' urging platform companies to modify their algorithms and make greater efforts at responsible content management. Democracy inherently functions through dialogue and compromise between differing opinions, and in a polarized environment such democratic processes cannot operate smoothly.

Strengthening platform responsibility is one of the key solutions Dr. Petrova proposes. She stresses that digital platform companies are not merely technology providers but important actors shaping the public sphere, and must therefore bear social responsibility. The Digital Services Act (DSA), recently introduced by the European Union (EU), is a prime example of this approach: it demands proactive measures from platform companies against misinformation and hate content, strengthens the accountability of platform operators, and requires algorithmic transparency. The legislation aims to mitigate the impact of misinformation on the public sphere and mandates that large platforms disclose information about their content moderation policies and how their algorithms operate.

Korea, by contrast, has not yet fully established such a policy framework. The Korean government has relied primarily on platform self-regulation, but experts are calling for a more systematic and clear legal foundation. The prevailing academic view is that algorithmic social responsibility, together with comprehensive legislation to support it, is essential to ensuring digital fairness.
The Importance of Platform Responsibility and Citizen Education

Media literacy education is another key element emphasized by Dr. Petrova. She argues that technical and institutional solutions alone are insufficient: citizens must acquire the ability to critically evaluate and rationally consume information in the digital environment. Educational programs designed to help citizens make better choices amid the flood of information in the digital age are being implemented in countries around the world. Media literacy goes beyond merely judging the veracity of information; it denotes a comprehensive capability to verify information sources, recognize biases, and comparatively evaluate diverse perspectives. Dr. Petrova notes in particular that citizens who have received media literacy education tend to be less susceptible to fake news and to participate more actively in democratic dialogue and decision-making. Strengthening such education is urgent in South Korea as well, where digital technology is highly diffused. Programs aimed at fostering information discernment,