Chinese artificial intelligence start-up DeepSeek has ushered in 2026 with a new technical paper, co-authored by founder Liang Wenfeng, that proposes a rethink of the fundamental architecture used to train foundation AI models.
The method – dubbed Manifold-Constrained Hyper-Connections (mHC) – forms part of the Hangzhou firm’s push to make its models more cost-effective as it strives to keep pace with better-funded US rivals with deeper access to computing power.
It also reflected the...
DeepSeek kicks off 2026 with paper signalling push to train bigger models for less
Published 4 hours ago
Source: scmp.com
