DeepSeek kicks off 2026 with paper signalling push to train bigger models for less
Published 2 hours ago
Source: scmp.com

Chinese artificial intelligence start-up DeepSeek has ushered in 2026 with a new technical paper, co-authored by founder Liang Wenfeng, that proposes a rethink of the fundamental architecture used to train foundational AI models.

The method – dubbed Manifold-Constrained Hyper-Connections (mHC) – forms part of the Hangzhou firm's push to make its models more cost-effective as it strives to keep pace with better-funded US rivals that have deeper access to computing power.

It also reflected the...
