DeepMind's new crystal structure prediction model outperforms CGCNN
The new model combines message passing with symmetry-aware tokenization and reports stronger performance than CGCNN and MEGNet baselines on formation-energy and elastic-tensor prediction benchmarks. The authors' ablation suggests that most of the gains come from enforcing space-group constraints during training rather than from raw parameter count.
I would still like to see broader transfer tests. Many benchmark sets overlap heavily with the public repositories used in pretraining, so true out-of-domain generalization is hard to judge. If anyone has tried this model on low-symmetry organic-inorganic hybrids, please share failure cases.
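To make the "space-group constraints during training" idea concrete, here is a toy sketch of one common way such a constraint can be imposed: penalize the variance of a model's predictions across symmetry-equivalent copies of a structure. This is purely illustrative and not the paper's actual implementation; the group (a 2D C4 rotation group standing in for a full space group), the `toy_model` predictor, and the penalty form are all my own assumptions.

```python
import numpy as np

def symmetry_ops_c4():
    """Rotation matrices for the 2D point group C4 (a toy stand-in
    for a full crystallographic space group)."""
    ops = []
    for k in range(4):
        theta = k * np.pi / 2
        ops.append(np.array([[np.cos(theta), -np.sin(theta)],
                             [np.sin(theta),  np.cos(theta)]]))
    return ops

def toy_model(coords, w):
    """A deliberately non-invariant toy predictor (hypothetical):
    a weighted sum over atomic coordinates."""
    return float(np.sum(coords @ w))

def symmetry_penalty(coords, w, ops):
    """Variance of predictions over all symmetry-equivalent
    orientations of the structure. Zero exactly when the model
    output is invariant under the group; added to the training
    loss, it pushes the model toward symmetry-respecting outputs."""
    preds = [toy_model(coords @ R.T, w) for R in ops]
    return float(np.var(preds))

coords = np.array([[1.0, 0.0],   # two "atoms" in 2D
                   [0.0, 2.0]])
ops = symmetry_ops_c4()
w_biased = np.array([1.0, 0.0])     # direction-dependent -> penalty > 0
w_invariant = np.array([0.0, 0.0])  # trivially invariant -> penalty == 0

assert symmetry_penalty(coords, w_biased, ops) > 0
assert symmetry_penalty(coords, w_invariant, ops) < 1e-12
```

In a real setting the group operations would come from the structure's actual space group, and architectures can alternatively build the invariance in by construction (equivariant layers) rather than via a soft penalty; the ablation result reported above does not say which route the authors took beyond "enforcing constraints during training".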
Paper Reference
arXiv: 2601.08812
Posting as Anonymous Researcher
Comments
Symmetry constraints helping this much is a good reminder that domain priors still beat brute scaling in many scientific settings.
I can post a reproducibility pack once the authors release training scripts. Right now we only have inference checkpoints.