Chinese AI giants pivot toward proprietary models to drive revenue, performance
People walk past the Alibaba Qwen booth at the Mobile World Congress in Barcelona, Spain, March 2, 2026. Photo: Xinhua

Chinese companies including Alibaba Cloud and Zhipu AI have opted not to open-source some of their latest artificial intelligence models, as they look to capture the full value of their usage through official revenue-generating channels.
While none of the companies have said that they are moving away from their open-source strategies, the development reflects an industry trend where the most powerful models are growing in size, making them increasingly difficult to host on local hardware.
This week, Alibaba released three proprietary models that were all accessible only via its official cloud platform or chatbot website.
These included Qwen3.6-Plus, a model with enhanced coding abilities, and Qwen3.5-Omni, a multimodal model that can process text, audio, images and video.
The previous generation of the Omni model released in September, the Qwen3-Omni, was open-sourced.
According to an Alibaba Cloud spokesperson, the latest version will not be open-sourced as the Omni series is less popular among developers, based on download figures on open-source AI platform Hugging Face.
Alibaba Cloud is the AI and cloud computing unit of Alibaba Group Holding, owner of the South China Morning Post.
The logo of Z.ai is seen on a smartphone in this arranged photograph taken October 24, 2025. Photo: Shutterstock Images

Following the release, Qwen researcher Zheng Chujie suggested on X that the team was prioritising model size and capabilities. “I believe building flagship models with SOTA (state-of-the-art) performance is always the top priority,” he wrote. “Delivering less competent smaller models is not that important or urgent.”

Since the release of DeepSeek’s R1 model in January last year, Chinese companies have dominated the global open-source ecosystem, with Qwen amassing more derivative models in the developer community than Google and Meta combined, according to figures from open-source AI platform Hugging Face.
This growth in adoption was primarily driven by Alibaba’s release of many smaller versions of its Qwen models, typically sized at several billion parameters, which developers around the world can freely download and customise for particular use cases.
Parameters are the mathematical variables encoding a model’s “intelligence”.
Industry-leading models such as Anthropic’s Claude family have trillions of parameters.
Original article: South China Morning Post
