GLM-4.7-Flash: 30B MoE model achieves 59.2% on SWE-bench, runs on 24GB GPUs

(curateclick.com) by czmilo | Jan 20, 2026 | 1 comment on HN