Thursday, 1 Jan 2026
On January 1, 2026, DeepSeek announced a new research paper proposing a novel neural network architecture called mHC (manifold-constrained Hyper-Connections). The work aims to address the training instability commonly observed in Hyper-Connections (HC) at scale while preserving their substantial performance gains.
The paper lists three co-first authors: Zhenda Xie, Yixuan Wei, and Huanqi Cao. Notably, Wenfeng Liang, founder and CEO of DeepSeek, is also among the authors.
Abstract (paraphrased):
Recent advances, exemplified by Hyper-Connections (HC), extend the residual connection paradigm that has been ubiquitous for the past decade by widening the residual stream and diversifying connectivity patterns. While these innovations yield significant performance improvements, the added complexity compromises the identity-mapping property inherent to residual connections, leading to severe training instability, limited scalability, and substantial memory-access overhead.
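To make the identity-mapping point concrete, here is a minimal toy sketch. It is not the paper's construction: the stream count, the mixing matrix `H`, and the way the branch is added back are all illustrative assumptions. It only shows why a plain residual block is an exact identity when its branch is inactive, while unconstrained mixing of widened streams is not.

```python
import numpy as np

def residual_block(x, f):
    # Standard residual connection: y = x + f(x).
    # When the branch f contributes nothing, the block is an exact identity map.
    return x + f(x)

def hyper_connection_block(streams, f, H):
    # Toy HC-style block (illustrative only, not the paper's exact design):
    # n parallel residual streams are mixed by a learned matrix H, and the
    # transformed branch f is added back to one stream.
    mixed = H @ streams                    # (n, n) @ (n, d) -> (n, d)
    mixed[0] = mixed[0] + f(mixed[0])
    return mixed

zero_f = lambda x: np.zeros_like(x)        # a branch that contributes nothing

x = np.array([1.0, 2.0, 3.0])
streams = np.stack([x, x, x])              # 3 widened copies of the signal

# Unconstrained mixing weights whose rows do not sum to 1.
H = np.array([[0.7, 0.2, 0.4],
              [0.1, 0.9, 0.3],
              [0.0, 0.5, 0.8]])

# The plain residual block is an exact identity when the branch is inactive...
assert np.allclose(residual_block(x, zero_f), x)

# ...but unconstrained mixing distorts the signal even with an inactive branch.
out = hyper_connection_block(streams, zero_f, H)
assert not np.allclose(out[0], x)
```

The second assertion is the instability story in miniature: once the mixing weights drift away from an identity-preserving configuration, even a "do nothing" layer rescales the signal, and such rescaling compounds across depth.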
To tackle these issues, the team proposes manifold-constrained Hyper-Connections (mHC), a general-purpose framework that projects the residual-connection space of HC onto a specific manifold to restore the identity-mapping behavior. This architectural refinement is paired with rigorous infrastructure-level optimizations to ensure computational efficiency.
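The mechanism of "projecting onto a manifold to restore identity mapping" can be sketched with a deliberately simple, hypothetical constraint: rescaling each row of the mixing matrix to sum to 1. The paper's actual manifold is not specified in this summary and may well be different; the sketch only demonstrates how a projection can make the mixing exactly transparent to a signal shared across streams.

```python
import numpy as np

def project_to_row_stochastic(H):
    # Hypothetical manifold constraint for illustration: rescale each row of
    # the mixing matrix so it sums to 1. Under this constraint, a signal that
    # is shared by all streams passes through the mixing unchanged, restoring
    # identity-mapping behavior. The paper's actual constraint may differ.
    return H / H.sum(axis=1, keepdims=True)

# The same unconstrained mixing weights as before (rows do not sum to 1).
H = np.array([[0.7, 0.2, 0.4],
              [0.1, 0.9, 0.3],
              [0.0, 0.5, 0.8]])
H_proj = project_to_row_stochastic(H)

x = np.array([1.0, 2.0, 3.0])
streams = np.stack([x, x, x])              # 3 widened copies of the signal

# Every row of the projected matrix sums to 1...
assert np.allclose(H_proj.sum(axis=1), 1.0)

# ...so mixing a shared signal now returns it exactly: identity restored.
assert np.allclose(H_proj @ streams, streams)
```

The design intuition, as the abstract frames it, is that the constraint removes the failure mode without removing the expressive freedom: the matrix can still redistribute information across streams, but only within a family of configurations that cannot silently rescale the residual signal.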
Experiments demonstrate that mHC enables stable large-scale training, delivers tangible performance gains, and exhibits superior scalability. The authors anticipate that mHC, as a flexible and practical extension of HC, will deepen the understanding of topological architecture design and offer a promising pathway for the evolution of foundation models.
Reposted for informational purposes only; the views expressed are not ours.