The government will invest 826.2 billion won ($632 million) to build an AI ecosystem in Korea based on indigenous high-end AI chips by 2030.
A pre-launch announcement for the first stage of the “K-Cloud Project” was released by Korean chipmakers, AI chip designers, cloud companies, scholars and the Ministry of Science and ICT in Pangyo, Gyeonggi, on Monday.
During the first stage of the K-Cloud Project, some 100 billion won will be spent to develop a neural processing unit (NPU) for AI data centers by 2025.
NPUs are chips that are more energy-efficient and faster at data processing than central processing units (CPUs) or graphics processing units (GPUs). NPUs specialize in AI computations such as deep learning and pattern-based prediction, while CPUs handle general-purpose computing and GPUs focus on rendering graphics. NPUs function as the brains of AI data centers.
The three-stage K-Cloud Project was announced in December as part of the Yoon Suk Yeol administration’s efforts to strengthen the local semiconductor industry. The roadmap for stage one was disclosed in February and the involved parties were finalized last month.
A chip alliance consisting of Korean chipmakers Samsung Electronics and SK hynix, chip designers Furiosa AI, Rebellions and Sapeon, cloud service companies Naver, KT and NHN, AI startups including Upstage, Laon Road and Nota, research institutes and the Science Ministry will spearhead the project.
Public-private representatives from the consortium on Monday pledged in a joint declaration to cooperate and foster AI chip talent.
“The government will strongly support the K-Cloud Project to promptly secure a reference for homemade AI chips and rise victorious in the global chip war,” Science Minister Lee Jong-ho said.
Team Korea will capitalize on its edge in memory chips to develop AI chips in phases by 2030 and exert all efforts to acquire world-class semiconductor technology, the science minister added.
The K-Cloud Project aims to set up data center clusters that deliver 39.9 petaflops of AI computing power based on the new NPU, split evenly at 19.95 petaflops each for the public and private sectors. One petaflop represents one quadrillion, or a thousand trillion, calculations per second.
At stage two, the K-Cloud consortium plans to create a DRAM-based low-power processing-in-memory (PIM) chip by 2028. PIM chips integrate memory and processing to reduce latency and address the von Neumann bottleneck.
The final stage will upgrade the PIM chip to one based on non-volatile memory with ultra-low energy consumption by 2030.
Fueled by the so-called “AI moment,” the AI chip market has been booming since the ChatGPT hype earlier this year. The global AI chip market is forecast to reach $86.1 billion by 2026, growing 16 percent annually, according to market tracker Gartner.
BY DONG-JOO SOHN [email@example.com]