Alibaba's Next-Gen Qwen3.5 Model Emerges, Multiple Open-Source Releases Likely

Deep News
10 hours ago

On February 9, a new pull request for integrating Qwen3.5 appeared in Hugging Face's open-source Transformers repository. Industry observers speculate that this means Alibaba's next-generation foundation model, Qwen3.5, is nearing its official release.
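
If the integration follows the usual Transformers pattern, loading a future Qwen3.5 checkpoint would presumably look like the minimal sketch below. The model identifier "Qwen/Qwen3.5-2B" is purely a placeholder for illustration; no official checkpoint name has been published.

```python
# Minimal sketch of loading a hypothetical Qwen3.5 checkpoint via Transformers.
# The model ID below is an assumption, not an announced repository name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3.5-2B"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick the dtype stored in the checkpoint
    device_map="auto",    # place weights on available GPUs/CPU automatically
)

prompt = "Give a short introduction to large language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```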

Some observers commented that this signals the start of a "crazy February" led by Chinese large language models. According to available information, Qwen3.5 uses a novel hybrid attention mechanism and is very likely a native vision-language model with built-in visual understanding. Developers have also found indications that the Qwen3.5 release may open-source at least a 2B-parameter dense model and a 35B-A3B MoE (Mixture-of-Experts) model.

Previously, The Information reported that Qwen3.5 would be open-sourced during the Chinese New Year holiday. In early February, Tang Jie, Chief Scientist at Zhipu AI, also posted on Weibo suggesting that numerous new models, including DeepSeek v4, Qwen3.5, and GLM-5, would be launched soon.

