Published: 11:52, March 5, 2025
LLM price war may lead to 'catfish effect'
By Li Jiaying

Intensifying AI competition to make cutting-edge tech cheaper, more efficient

A government employee works with an online services platform supported by DeepSeek in Beijing on Feb 26, 2025. (TAO RAN / FOR CHINA DAILY)

As large language model prices continue to decline, evolving market dynamics and intensified price competition will determine which players thrive in the coming years, said industry experts.

"Low pricing doesn't mean losses. When it comes to LLM profitability, doing more with less is entirely achievable," said Zhang Tong, a senior director analyst at market consultancy Gartner.

Zhang's remarks follow Chinese AI startup DeepSeek's revelation of its daily costs and theoretical revenue on Saturday, which has sparked intense discussion about the profitability of LLMs.


According to DeepSeek's data, during a 24-hour period from Feb 27-28, assuming an H800 GPU rental cost of $2 per hour, its total daily costs amounted to $87,072. If all tokens were priced at its R1 model's rate, theoretical daily revenue would total $562,027, implying a daily net profit of $474,955 and a cost-profit ratio of 545 percent.
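The two disclosed figures are enough to reproduce the headline numbers. The short sketch below (written in Python purely as an illustration; the variable names are this article's, not DeepSeek's) recomputes the net profit and cost-profit ratio from the reported daily cost and theoretical revenue.

    # Back-of-the-envelope check of DeepSeek's disclosed daily figures.
    # The two inputs are the numbers reported by the company; the rest is derived.
    DAILY_COST_USD = 87_072            # total daily cost, assuming $2/hour H800 rental
    THEORETICAL_REVENUE_USD = 562_027  # if all tokens were billed at R1 rates

    net_profit = THEORETICAL_REVENUE_USD - DAILY_COST_USD   # 474,955
    cost_profit_ratio = net_profit / DAILY_COST_USD         # ~5.45

    print(f"Daily net profit: ${net_profit:,}")             # Daily net profit: $474,955
    print(f"Cost-profit ratio: {cost_profit_ratio:.0%}")    # Cost-profit ratio: 545%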

However, several operational factors mean the company's actual revenue falls far below this theoretical figure, Zhang said.

For example, the V3 model is priced lower than R1; the web and app services are free, so only certain features generate revenue; and off-peak usage, including nighttime discounts, further reduces income, he said.

"Despite the above-mentioned factors, the key takeaway is clear — high-quality LLMs coupled with efficient engineering optimizations can tap into sufficient market demand to achieve profitability," Zhang said.

He said the Chinese AI startup's cost advantages could trigger a wave of consolidation in the AI model market, where only the most efficient players could survive.

"For both businesses and individual users, the trend of declining inference costs is indisputable. Models that can't match the capabilities of V3 and R1, while being more expensive, will struggle to stay relevant. This 'catfish effect' will either push competitors to optimize or force them out of the market," said Zhang, who expects other LLM providers could face a tough road ahead due to this round of cost and profitability revelation.


"First, securing funding will become more challenging. Then companies will have no choice but to continuously refine their strategies to drive down costs," the analyst said.

As the price war intensifies, LLM prices are expected to keep falling in the years to come. Gartner predicts that by 2027, the average price of GenAI APIs with a given set of capabilities will be less than 1 percent of the current average price.

However, Mike Fang, another senior director analyst at Gartner, emphasized that the prediction refers only to a decline in LLM prices at constant quality, throughput and latency, and does not necessarily mean the price of the most powerful models will drop.

"As LLMs continue to evolve and upgrade, the most advanced ones will still retain a certain premium," Fang said. "The factors influencing the model prices include not only their operational costs, but also their capabilities, which may greatly impact their pricing power."

lijiaying@chinadaily.com.cn