The Pursuit of Scaling Law: China’s AI Start-ups on the Rise
China’s demand for computing power is rising fast, driven by the boom in generative artificial intelligence, as the country’s AI start-ups race to catch up with their US peers.
StepFun, founded in 2023 by Jiang Daxin, a former chief scientist at Microsoft Research Asia, is one of the many Chinese start-ups in that race. It has launched Step-1V, a multimodal large language model (LLM) with more than 100 billion parameters, and is testing Step-2V, which boasts more than one trillion parameters.
“Computing power, systems, data and algorithms are the core elements in the pursuit of the scaling law,” Zhu Yibo, StepFun’s head of systems, said at a media briefing in Shanghai.
The push for greater computing power stems from a belief in the scaling law, which holds that a model’s performance improves as its size and the amount of its training data grow. However, Chinese AI start-ups share a common handicap: US export controls restrict their access to the most advanced AI chips from supplier Nvidia, making the pursuit more difficult.
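The article does not spell out a precise formulation, but one widely cited version of the scaling law, from Hoffmann et al.’s 2022 “Chinchilla” study rather than from StepFun or Zhu, expresses expected training loss L in terms of the model’s parameter count N and the number of training tokens D:

    L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}

Here E is an irreducible loss floor and A, B, \alpha and \beta are empirically fitted constants; the predicted loss falls as either N or D grows, which is the quantitative basis for the industry’s appetite for ever more compute and training data.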
China’s domestic market has already spawned at least 200 LLMs, many of them claiming to rank among the world’s top performers. With demand for computing power still climbing, the country’s AI start-ups have strong momentum; how quickly they can work around the chip restrictions will decide how fast they close the gap with their US peers.