Gangnam Perspective: Who Will Stop Our Addiction to Short-Form Videos?
- Published: 2026-01-19 18:04:59
- Updated: 2026-01-19 18:04:59

Short-form videos are addictive. Even when they are not particularly informative or entertaining, you come to your senses only to find that an hour or two has flown by. At a recent gathering, one friend said with a hollow laugh, “I was blankly watching some trivial skincare videos, and when I snapped out of it, two hours had passed.” We all burst into laughter, but it ended on a bitter note, because I knew I was no different.
In front of short-form content, minors’ self-control is even more precarious. From the perspective of brain science, adolescence is a period when the prefrontal cortex, which governs impulse control, is not yet fully developed. When a constant stream of stimulating 15-second clips pours in, children’s brains are said to enter a “popcorn brain” state in which only the reward circuit is intensely activated. They gradually become desensitized, responding only to ever-stronger stimuli.
Given this situation, governments have begun to draw a sharp sword of direct and powerful regulation. They are starting to view the issue as a social disaster and a national task that can no longer be left unattended. In November last year, Australia became the first country in the world to enact a law that completely bans the use of social networking service (SNS) platforms by those under 16, regardless of parental consent. If platforms such as TikTok, Facebook, Instagram, and X (formerly Twitter) fail to block underage accounts, they face extremely heavy penalties, including fines that can reach hundreds of billions of won. Countries across Europe, including Norway and France, are also considering similar legislation, putting pressure on the platforms.
Confronted with this regulatory sword, global big tech companies are hastily lowering their profile. It is no coincidence that YouTube recently introduced a feature to limit teenagers’ viewing of YouTube Shorts, and that Meta Platforms Inc. and TikTok have strengthened their safety features. Their moves are less an act of pure goodwill than a strategic response aimed at avoiding astronomical fines and easing the intensity of regulation.
Even so, these changes are welcome. Whatever the motivation, it is undeniable that at least a minimal line of defense is being put in place for children who had been left completely exposed.
Of course, there is controversy. Some raise fundamental concerns that such measures excessively infringe on young people’s freedom of expression, while others question their effectiveness, pointing to the possibility of bypassing restrictions through tools like a virtual private network (VPN). Yet it is also true that there is growing social consensus that, like alcohol and cigarettes, strong regulation of SNS is necessary to protect minors.
What about South Korea? Our situation is much the same. Studies have already shown that the longer teenagers use smartphones, the higher their levels of depression and their risk of suicide. Nevertheless, we still place more weight on “autonomy” and “parental supervision.” Even YouTube’s new features must be turned on by a guardian.
This raises a question: can parents really be the main agents of control? While telling their children to put down their smartphones, adults themselves cannot let go of short-form content. Expecting every household to have the time and capacity to meticulously manage digital devices is far removed from reality.
So should the government step in, or should we prioritize respect for individual freedom? It is hard to give an easy answer, but one thing is clear: we cannot continue to neglect the problem as we are doing now.
Platform ethics, state institutions, and education at home are interlocking gears, none of which can be neglected. Yet the allure of short-form content is now too sophisticated and powerful for us to rely solely on individual self-restraint and media literacy education. In the face of a massive temptation engineered by astronomical capital and artificial intelligence (AI) algorithms, it is harsh to blame only personal willpower. Before we scold children and tell them to “stop watching,” our society must first draw a safety line that can rein in runaway technology.
yjjoe@fnnews.com Reporter