Google DeepMind’s AlphaEvolve just broke long-standing mathematical records by evolving algorithms that made progress on several difficult Ramsey theory problems. Moonshot AI also revealed a new transformer concept called Attention Residuals, which lets models attend back to earlier layers instead of blending everything equally. And researchers from Zhipu AI and Tsinghua University released GLM-OCR, a compact model built to read complex documents containing tables and formulas.
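Moonshot's actual formulation of Attention Residuals isn't detailed here, so the snippet below is only a generic sketch of the underlying idea: instead of averaging earlier-layer hidden states equally, a query from the current layer assigns each earlier layer its own attention weight. The `layer_attention` function and its signature are illustrative assumptions, not Moonshot's API.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def layer_attention(layer_states, query):
    """Blend earlier-layer states with attention weights rather than
    an equal average (toy sketch, not Moonshot's formulation).

    layer_states: (num_layers, dim) stack of per-layer representations
    query: (dim,) vector from the current layer
    """
    # scaled dot-product score for each earlier layer
    scores = layer_states @ query / np.sqrt(layer_states.shape[-1])
    weights = softmax(scores)  # one weight per earlier layer, sums to 1
    return weights @ layer_states, weights

rng = np.random.default_rng(0)
states = rng.normal(size=(4, 8))  # 4 earlier layers, hidden dim 8
q = rng.normal(size=8)
mixed, w = layer_attention(states, q)
print(mixed.shape, round(w.sum(), 6))
```

A plain residual stream would weight every layer equally; the softmax here lets the model sharpen or flatten that mixture per token.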
AI progress is happening across several layers at once. AlphaEvolve helps advance difficult mathematics, Attention Residuals explores transformer improvements, GLM-OCR shows powerful document AI in a small model, OpenViking rethinks agent memory, and Granite Speech focuses on efficient multilingual speech systems.