Note By Ryunsu

Dec 23, 2024

Is AI Sovereignty Just a Pipe Dream?


Ryunsu Sung


GPU Count Is National Competitiveness

While Meta Platforms, which operates user-generated-content media such as Facebook and Instagram, is installing more than 100,000 H100 cards in a single “cluster” inside its data centers to train its Llama 4 model, it is said that even if you scraped together every H100 card in Korea, you would not reach 2,000 in total. Hyperscalers such as Meta, Google, Amazon, and Microsoft are finding even 100,000 cards per cluster insufficient, so they are preparing to link multiple clusters over private networks to enable multi-cluster training. Once the Blackwell generation, which massively increases throughput, reaches customers in early 2025, the gap in compute infrastructure between Korea and the United States will widen to a completely different order of magnitude.

Nvidia GPUs at the Core of National AI Strategy
“The most important element of a national AI strategy is securing Nvidia GPUs at stable prices.” (Sung-Hyun Park, CEO of Rebellions) There are growing calls for Korea to move at the national level to secure Nvidia GPUs in order to strengthen its AI competitiveness. As the global AI boom accelerates, the supply of Nvidia GPUs has been disrupted, and the argument is that the government should step in to support GPU supply for companies and researchers. Jae-Ho Yeom, president of Taejae University, Kakao CEO Shin-Ah Jung, KAIST computer science professor Hye-Yeon Oh, and Rebellions CEO Sung-Hyun Park said on the 5th at COEX in Gangnam-gu, Seoul…
(Source: MediaToday)

The reason I have been somewhat skeptical about developing a Korean-language foundation LLM (large language model) is this: as long as the scaling law holds (though there is recent debate over whether it is hitting limits), no domestic “magic” will close the performance gap if U.S. big tech, backed by enormous capital, simply trains on more data with its vast GPU stockpiles. That concern is steadily becoming reality. As of February 2024, Naver’s HyperClova X reportedly outperformed GPT-4 on Korean-language LLM benchmarks (KMMLU et al.), but compare the Korean fluency of GPT-4o and Clova X today and 4o produces the more natural output. Tokens for 4o are much more expensive, but if OpenAI releases GPT-5, which it claims will have PhD-level intelligence, and a 5-mini version delivers performance similar to 4o, even that price advantage is likely to disappear.

o1-preview Scores in the Top Tier on the CSAT

o1-preview, the first large language model launched with an explicit chain-of-thought mechanism, scored within the top 4% on the Korean-language section of the 2025 College Scholastic Ability Test (CSAT). Even the lighter o1-mini model scored at Grade 4, above the average test taker. On an exam like the CSAT, which demands a high level of reading comprehension, LLMs are already achieving scores that surpass the majority of humans, and they continue to improve.

CSAT Korean scores by GPT model

Some argue that we must first solve the “hallucination” problem before fully applying LLMs to real work, but they seem to willfully ignore the fact that humans also hallucinate. For mission-critical applications, you can simply wire multiple LLMs together and have them cross-check each other’s reasoning processes and outputs. AI agents have yet to demonstrate human-level job performance, but I believe that too is only a matter of time.
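The cross-checking idea can be sketched as follows. This is a minimal illustration, not a production design: `ask_model` is a hypothetical stand-in for a real LLM API call (here it just returns canned answers), and the consensus rule is a simple majority vote over the models' outputs.

```python
from collections import Counter


def ask_model(model: str, question: str) -> str:
    """Hypothetical stand-in for a real LLM API call.

    For illustration, each 'model' returns a canned answer; one of
    them deliberately 'hallucinates' a wrong response.
    """
    canned = {
        "model-a": "Seoul",
        "model-b": "Seoul",
        "model-c": "Busan",  # the hallucinating outlier
    }
    return canned[model]


def cross_checked_answer(models: list[str], question: str) -> str:
    """Query several models and keep the majority answer.

    Responses that disagree with the majority are discarded as
    potential hallucinations; if no answer wins a majority, the
    question is escalated instead of answered.
    """
    answers = [ask_model(m, question) for m in models]
    winner, count = Counter(answers).most_common(1)[0]
    if count <= len(models) / 2:
        raise ValueError("No consensus; escalate to a human reviewer")
    return winner


print(cross_checked_answer(["model-a", "model-b", "model-c"],
                           "What is the capital of South Korea?"))  # → Seoul
```

In a real mission-critical pipeline the models would also critique each other's reasoning traces, not just the final strings, but the fail-closed shape is the same: disagreement triggers escalation rather than a confident wrong answer.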

Because they are trained primarily on overseas data, GPT models are often criticized for high error rates on questions about Korea. Hand them an internet search engine, however, and that problem is likely to be largely resolved. Coincidentally, OpenAI recently launched ChatGPT Search, signaling its interest in the search market as well. What if GPT pulls in incorrect information from the internet? Judging by the performance of o1-preview, the odds of that happening are lower than the odds that you, the reader, would make the same mistake.
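The search-grounding pattern can be sketched like this. Everything here is hypothetical scaffolding: `search_web` stands in for a real search API, the snippets are canned example strings, and the word-overlap scoring is a deliberately naive relevance heuristic standing in for the LLM that would normally read the retrieved context.

```python
def search_web(query: str) -> list[str]:
    """Hypothetical search tool.

    A real agent would call an actual search API here; these canned
    snippets merely illustrate retrieved context.
    """
    return [
        "Korea's CSAT (Suneung) is administered every November.",
        "The 2025 CSAT was held on November 14, 2024.",
    ]


def answer_with_search(question: str) -> str:
    """Ground the answer in retrieved snippets instead of relying
    only on (possibly stale or Korea-sparse) training data.

    A real implementation would pass the snippets plus the question
    to an LLM; here a naive word-overlap score picks the snippet
    most relevant to the question.
    """
    snippets = search_web(question)
    return max(snippets,
               key=lambda s: sum(word in s for word in question.split()))


print(answer_with_search("When was the 2025 CSAT held?"))
```

The point of the pattern is that the model's answer is anchored to fresh retrieved text, so Korea-specific facts no longer depend on what happened to be in the training corpus.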

Why Top Engineers Have No Reason to Work in Korea

Aravind Srinivas, CEO of Perplexity, an AI-based search engine, has repeatedly spoken publicly about how hard it is to hire talent capable of working with LLMs. At one point he had finally managed to hire an experienced Google engineer, but once Google learned of the planned move, it offered four times the salary to keep that employee, and the hire fell through. On another occasion, he tried to recruit a senior AI researcher at Meta Platforms, only to be effectively turned away with the response, “Get back to me after you’ve secured 10,000 H100 cards.”

To attract top AI researchers and engineers, salary matters, but you also need an environment where they can actually produce research results. Imagine inviting Cho Seong-jin, one of the world’s greatest pianists, only to have a toy piano sitting alone on stage at the concert hall: no matter how much you pay him, he won’t accept. Perplexity has already raised more than 200 billion won in funding, yet it still struggles to secure talent because it is up against the “money war” of OpenAI and U.S. big tech firms investing at the trillion-won scale. Even among Korean engineers, almost no one will choose to work in Korea purely out of patriotism when, elsewhere, they can receive compensation packages of salary plus stock options worth over 1 billion won and a chance to make a quantum leap in their career.

Why AI Sovereignty Is Overrated

The core argument seems to be that without original technological capabilities, we will inevitably be dragged around by AI powers like the United States. I don’t agree with that premise. For example, no modern economy lacks access to oil, essential as it is today. Oil prices are not extraordinarily high; in fact, thanks to technological innovations like shale fracking, crude oil has become more accessible than before, and the influence of the Middle East is waning.

Right now, Google, Meta, Microsoft, OpenAI, Anthropic, and others are investing tens of trillions of won to develop better foundation AI models, but ultimately this is to monetize those models later. Monetization, by definition, means selling them externally in some form, which in turn means we will also have access. There is no Korean company that can build GPU clusters worth tens of trillions of won to develop a foundation AI model domestically. Korea does have companies like Samsung Electronics, which holds more than 100 trillion won in cash, but as a manufacturer it must spend heavily on capital expenditures, so it doesn’t have the financial leeway to buy that many GPUs. An even bigger problem is that even if you show up with money in hand, Nvidia can only produce a limited number of GPUs, so without the right network, you simply can’t get them.

David Ricardo, a giant in economics, used the theory of comparative advantage to explain how countries can still benefit from trade even when one has an absolute advantage over the other. The United States enjoys an absolute advantage over Korea in most industries, and it is focusing on its strongest relative advantage—knowledge industries—while Korea has concentrated on high value-added manufacturing such as semiconductors and heavy industry and is only now entering knowledge industries. Yet Korea still has no company with global influence in the knowledge sector. Google delivers better search results than Naver both in absolute and relative terms, which raises the question of whether Naver needs to exist at all. In effect, Naver has given up competing as a search engine and has become a commerce and media content company instead.

Every industry creates added value through abstraction. New layers are built on top of basic industries to create new sectors, and as these abstraction layers stack up, the added value grows. Amazon’s AWS cloud service is essentially a way of taking semiconductor products, layering software on top, and abstracting them into the resource we call “computing,” which it then sells. Software companies using AWS further abstract those computing resources by adding their own data-processing algorithms and user interfaces, and then resell them. Our economy as a whole is the sum of dozens or hundreds of such abstraction layers built on top of foundational industries.

It’s not only countries or companies that produce crude oil that create value. Downstream industries like refining, and chemicals that use refined products to produce materials, also generate added value. In fact, you could argue that companies capable of refining and chemical processing have higher technological value added than those that only extract crude oil. This is why we don’t need to be so obsessed with developing a homegrown AI foundation model.
