Will TurboQuant Burst the Semiconductor Bubble? "It Might Actually Be a Good Thing"
- Published
- 2026-03-27 14:31:15
- Updated
- 2026-03-27 14:31:15

[The Financial News] A new technology that can cut the memory required to run Artificial Intelligence (AI) models to as little as one-sixth of current levels has shaken global semiconductor companies. Many experts argue, however, that in the long run the development could actually be positive for the industry.
Global semiconductor stocks hit by the "TurboQuant shock"
According to Investing.com on the 27th, the semiconductor index on the New York Stock Exchange (NYSE) plunged 4.79% on the 26th (local time) to close at 7,585.87 points. In the U.S. market, memory stocks tumbled across the board: Micron Technology, the largest DRAM producer in the U.S., fell 6.97%; SanDisk, the largest U.S. NAND flash maker, dropped 11.02%; and its parent company Western Digital slid 7.70%. As memory stocks slumped, Nvidia also sank 4.16%, and its rival Advanced Micro Devices (AMD) fell 7.49%. Intel lost 6.53%, Broadcom declined 2.95%, and Taiwan Semiconductor Manufacturing Company (TSMC) dropped 6.22%, leaving Qualcomm, which inched up 0.15%, as a rare exception among major chip stocks. Samsung Electronics and SK hynix have also been moving down in tandem.
The sell-off is widely seen as a reaction to Google LLC’s announcement of a new AI compression algorithm called TurboQuant. Google’s research team introduced a compression technique that can dramatically cut the amount of memory needed for AI. TurboQuant is reported to reduce memory usage to one-sixth of current levels while boosting AI inference speed by up to eight times. This has fueled concerns that the earnings outlook for semiconductor companies supplying memory to data centers is now under serious pressure.
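The article does not describe how TurboQuant works internally, but memory reductions of this scale are typically achieved through low-bit quantization of model weights. The sketch below is purely illustrative — generic 4-bit group quantization in NumPy, not the actual TurboQuant algorithm — and shows how storing weights as small integers plus per-group scales shrinks the memory footprint:

```python
# Illustrative sketch only: TurboQuant's algorithm is not public in this
# article, so this shows generic low-bit weight quantization, the broad
# family of techniques such memory-compression methods build on.
import numpy as np

def quantize_int4(w, group=64):
    """Quantize a 1-D float array to 4-bit integer codes with per-group scales."""
    w = w.reshape(-1, group)
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0   # int4 range: -7..7
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)
    return q, scale.astype(np.float16)

def dequantize(q, scale):
    """Reconstruct approximate float weights from codes and scales."""
    return (q.astype(np.float32) * scale).reshape(-1)

rng = np.random.default_rng(0)
w = rng.standard_normal(1 << 16).astype(np.float16)  # a toy fp16 weight tensor

q, scale = quantize_int4(w)
fp16_bytes = w.nbytes
# 4-bit codes pack two per byte; add the per-group fp16 scales.
quant_bytes = q.size // 2 + scale.nbytes
print(f"fp16: {fp16_bytes} B, int4+scales: {quant_bytes} B, "
      f"ratio: {fp16_bytes / quant_bytes:.1f}x")
```

With these (assumed) settings, 4-bit codes plus fp16 group scales give roughly a 3.8x saving over fp16 storage; reaching the reported sixfold reduction would require closer to 2.7 effective bits per weight, which is why such results attract attention.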
Kim Dong-won, a research analyst at KB Securities, explained, "Because TurboQuant can reduce the amount of memory needed for AI inference, share prices of memory semiconductor companies have seen a sharp short-term drop."
"The market reaction right now is excessive"
However, many analysts say this downturn reflects an overreaction by the market.
For now, TurboQuant exists only at the research-paper stage, and it will likely take considerable time before it is commercialized. Lee Sung-hoon, research analyst at Kiwoom Securities, said, "We believe it is unlikely that the emergence of Google’s TurboQuant will lead to a structural slowdown in memory demand." He added, "If TurboQuant lowers AI operating costs and improves efficiency, and if many latecomers enter the AI ecosystem, we should also consider a scenario where the overall AI market expands and total memory demand ultimately increases."
He also noted that during the "DeepSeek episode" in January last year, the initial market reaction was largely one of shock, but chip stocks went on to stage another rally over the medium to long term.
Kim Dong-won pointed out, "After DeepSeek was unveiled in January 2025, Nvidia’s share price plunged in the short term but rebounded within just a month and even surpassed previous levels, which is highly instructive." He continued, "TurboQuant and DeepSeek are both meaningful as software optimization technologies for implementing low-cost, high-efficiency AI. But given the pace of AI data center investment expected over the next five years, such technologies alone have practical limits in absorbing the explosive growth in AI demand."
Kim Rokho, a research analyst at Hana Securities, also commented, "These compression technologies will not simply work in a way that reduces memory demand." He predicted, "From the perspective of memory manufacturers, they could actually open up new forms of high-performance demand. As you need to decompress and process compressed data, demand for customized High Bandwidth Memory (HBM) could expand, and the need to advance Processing-in-Memory (PIM) technology, which handles compression and decompression at the memory level, may also grow." In other words, competition may shift from sheer capacity to boosting bandwidth and processing speed.
"In the end, memory chip makers will be the winners"
Kim Dong-won stressed, "TurboQuant and other low-cost AI technologies lower the barriers to using AI and work in a way that explosively expands overall demand." He added, "This directly translates into more computation and higher memory content per system, so the biggest beneficiaries of the race to expand the AI ecosystem are expected to be memory semiconductor companies."
Youngjin Lee, a research analyst at Samsung Securities, said, "If the TurboQuant algorithm is applied, it can bring down inference costs, and demand will explode as a result." He added, "It will enable the use of long context windows and large batch sizes without sacrificing speed or quality."
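A rough back-of-the-envelope calculation shows why memory compression translates into longer context windows and larger batches: the key-value (KV) cache that inference servers keep in memory grows linearly with both. The model dimensions below are illustrative assumptions (roughly a 70B-class transformer), not figures from the article:

```python
# Back-of-the-envelope sketch: KV-cache memory vs. context length and batch
# size. All model dimensions here are illustrative assumptions, not figures
# from the article or from any published TurboQuant specification.
def kv_cache_bytes(batch, context, layers=80, kv_heads=8, head_dim=128,
                   bytes_per_value=2):
    # K and V: 2 tensors per layer, each of shape [batch, kv_heads, context, head_dim]
    return 2 * layers * batch * kv_heads * context * head_dim * bytes_per_value

base = kv_cache_bytes(batch=32, context=128_000)  # fp16 cache
compressed = base / 6                             # the sixfold reduction reported for TurboQuant
print(f"fp16 KV cache: {base / 2**30:.0f} GiB -> compressed: {compressed / 2**30:.0f} GiB")
```

At these assumed dimensions, serving a batch of 32 sequences at a 128K-token context needs about 1,250 GiB of fp16 KV cache; a sixfold reduction brings that to roughly 208 GiB — the difference between a large accelerator cluster and a handful of devices, which is the cost effect the analyst describes.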
Lee Jong-wook, a research analyst at Samsung Securities, argued, "Factors that would reduce AI memory demand will mainly appear when AI functionality becomes saturated, such as a slowdown in the pace of AI service improvement, easing competition among AI model companies, and slower growth in the AI industry’s potential market." He emphasized, "DRAM and semiconductor prices, data center costs, the profitability of AI model or cloud companies, and optimization or cost reductions in AI models do not materially affect demand."
Han Young-joon, Reporter (fair@fnnews.com)