If this sounds too good to be true, don't believe me, see for yourself: there is already a recipe waiting for you to try at atomistic-cookbook.org/examples/men...
The simulation is stable for a combined 1.6 ns, and the final structures make a lot of physical sense: noble gases, and at higher temperature more volatile species, leave the particle. Accuracy is quantitative: the MAE on forces is ~150 meV/Å - pretty exceptional for something so outlandish!
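If you want to compute this kind of metric on your own trajectories, here is a minimal, dependency-free sketch of a force MAE (the `force_mae` helper is ours, not part of any PET-MAD tooling):

```python
def force_mae(f_pred, f_ref):
    """Mean absolute error over all force components.

    f_pred, f_ref: lists of per-atom [fx, fy, fz] vectors; the result is in
    the same units as the inputs (e.g. meV/Å).
    """
    flat_pred = [c for atom in f_pred for c in atom]
    flat_ref = [c for atom in f_ref for c in atom]
    return sum(abs(p - r) for p, r in zip(flat_pred, flat_ref)) / len(flat_ref)
```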
So let us show you just how *universal* #PET-MAD-1.5 can be. This is a movie of a parallel tempering simulation, with replicas from 300 K to 3000 K, of what we call a "Mendeleev cluster": one atom of each element from 1 to 102.
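For anyone unfamiliar with parallel tempering, here is a stdlib-only toy sketch of the replica-exchange idea behind such a simulation, with a 1D double-well standing in for the real potential energy surface (everything here is illustrative, nothing is taken from the PET-MAD setup):

```python
import math
import random

def potential(x):
    # toy 1D double-well with minima at x = ±1
    return (x**2 - 1.0) ** 2

def parallel_tempering(temps, n_sweeps=500, seed=0):
    """Toy replica-exchange Monte Carlo: one walker per temperature,
    local Metropolis moves, plus swap attempts between neighboring replicas."""
    rng = random.Random(seed)
    betas = [1.0 / t for t in temps]
    xs = [1.0 for _ in temps]
    n_try = n_acc = 0
    for _ in range(n_sweeps):
        # local Metropolis move in each replica
        for i, beta in enumerate(betas):
            x_new = xs[i] + rng.uniform(-0.5, 0.5)
            d_e = potential(x_new) - potential(xs[i])
            if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
                xs[i] = x_new
        # swap attempt between a random pair of neighboring temperatures,
        # accepted with probability min(1, exp((beta_i - beta_j)(E_i - E_j)))
        i = rng.randrange(len(temps) - 1)
        d_beta = betas[i] - betas[i + 1]
        d_e = potential(xs[i]) - potential(xs[i + 1])
        n_try += 1
        if rng.random() < min(1.0, math.exp(d_beta * d_e)):
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
            n_acc += 1
    return xs, n_acc / n_try
```

The high-temperature replicas hop freely over the barrier and feed decorrelated configurations down to the cold ones via the swaps, which is exactly why the scheme helps a Mendeleev cluster explore its configuration space.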
Looking forward to seeing what you do with the dataset, MAD-1.5, and the universal potential, PET-MAD-1.5! Paper: arxiv.org/html/2603.02... code: github.com/lab-cosmo/up...
This is the result of a massive effort by @cesaremalosso.bsky.social, @filippobigi.bsky.social, @ppegolo.bsky.social, @jwasci.bsky.social, @mahrossi.bsky.social and Arslan Mazitov, as well as everyone working tirelessly on the conceptual and software infrastructure.
Now, this is just a milestone on that path, but it's already something worth sharing, so thanks to arXiv you can read about it at arxiv.org/html/2603.02..., and thanks to uPET github.com/lab-cosmo/up... and metatomic, you can already try a universal MLIP trained on it.
We have been working on a new universal atomistic dataset that combines the principles of MAD with a meta-GGA level of theory, so we can all simulate water that does not freeze at 500 K, and have all our bases covered, with reference data for every isotope with a half-life above 24 hours.
Snapshot of a ring-polymer instanton calculation for the reaction of methane with a H radical
New recipe just landed on atomistic-cookbook.org. Thanks to @yairlitman.bsky.social for explaining how to use ipi-code.org to perform ring-polymer instanton calculations of reaction rates that include quantum tunneling effects. Check it out: atomistic-cookbook.org/examples/rin...
New #preprint is out! Investigating the many flavors of last-layer #UQ, Moritz and Matthias propose a practitioners' guide on "how to train a shallow ensemble". TL;DR: for good calibration use NLL, include forces, and optimize the backbone; fine-tune for speed! arxiv.org/html/2602.15...
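As a toy illustration of the kind of objective discussed here (our own sketch, not the authors' code), this is the Gaussian negative log-likelihood of a shallow-ensemble prediction; it rewards ensembles whose spread matches their actual error, which is what "good calibration" means in practice:

```python
import math

def ensemble_nll(member_preds, y):
    """Gaussian NLL of a shallow-ensemble prediction for one sample.

    member_preds: predictions of the M last-layer heads; the ensemble mean
    is the prediction, the member spread is the uncertainty estimate.
    """
    m = len(member_preds)
    mu = sum(member_preds) / m
    var = sum((p - mu) ** 2 for p in member_preds) / (m - 1)
    var = max(var, 1e-12)  # floor to avoid log(0) for degenerate ensembles
    return 0.5 * math.log(2 * math.pi * var) + (y - mu) ** 2 / (2 * var)
```

With the true value at y = 2.0, an ensemble at [0.0, 1.0, 2.0] (same mean, honest spread) scores a much lower NLL than an overconfident one at [0.9, 1.0, 1.1]: underestimating your own error is exactly what the NLL punishes.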
welcome!
... but it is too inflexible to learn highly-correlated fragment energies, and does not extrapolate well to different densities. If you are curious to learn more, read the whole story here pubs.aip.org/aip/jcp/arti...
Our conclusion is that the body-ordered decomposition is not a very useful prior for conditioning MLIPs. An architecture such as MACE, which forces a fast-converging effective decomposition, learns well in the low-data regime ...
Interpolative and extrapolative performance of different architectures for intermediate H cluster densities, with and without fragments included into training.
Last but not least, of the architectures we tried, only PET could actually make good use of learning these fragments, while MACE sees its extrapolative power degrade significantly when forced to learn the slowly converging vacuum cluster expansion.
Changes in the learned expansion as the fraction of fragments added to the training set is increased
Third, only architectures with sufficient expressive power, such as MACE or PET, can learn the true expansion when exposed to fragments in the training set.
Body-ordered energy decomposition for low and high-density hydrogen clusters
Body-ordered energy decomposition for different MLIP architectures. None spontaneously learns the "true" decomposition
This is a story with many layers. First, the quantum cluster expansion does not converge, mostly because the fragments have crazy geometries and highly-correlated electronic structure. Second, MLIPs exposed only to compact structures learn their own effective body-ordered decomposition.
A dream team led by Sanggyu Chong, with some FCI help from Joonho Lee @harvard.edu, tries to answer this question, and to understand what it implies for the accuracy and transferability of MLIPs, in this new paper fresh off the press at @aip.bsky.social #JCP.
pubs.aip.org/aip/jcp/arti...
The body-ordered expansion, equations
Many #machinelearning potentials are built (or understood) in terms of "atomic cluster expansions" that link directly to a body-ordered energy decomposition, one that can be computed explicitly with a sequence of electronic structure calculations. But what kind of expansion do they learn in practice? A thread.
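To make the idea concrete, here is a toy sketch (ours, not from the paper) of how body-ordered terms are extracted from fragment energies by inclusion-exclusion; `toy_energy` is a made-up stand-in for the sequence of electronic structure calculations on isolated fragments:

```python
import itertools

def toy_energy(positions):
    """Stand-in for an electronic-structure total energy of a fragment:
    a Lennard-Jones-like pair term plus a weak (made-up) triplet term."""
    e = 0.0
    for xa, xb in itertools.combinations(positions, 2):
        r = abs(xa - xb)
        e += 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)
    for trip in itertools.combinations(positions, 3):
        e += 0.1 / (max(trip) - min(trip))
    return e

def body_ordered_terms(positions, max_order=3):
    """Recover the intrinsic n-body terms of the expansion: each fragment
    energy minus the sum of the terms of all its strict sub-fragments."""
    terms = {}  # frozenset of atom indices -> intrinsic n-body energy
    idx = range(len(positions))
    for order in range(1, max_order + 1):
        for sub in itertools.combinations(idx, order):
            e = toy_energy([positions[i] for i in sub])
            for k in range(1, order):
                for subsub in itertools.combinations(sub, k):
                    e -= terms[frozenset(subsub)]
            terms[frozenset(sub)] = e
    return terms
```

For this toy energy the recovered 3-body term of a trimer is exactly the triplet contribution (the pair parts cancel by construction), which is the behavior an MLIP would have to reproduce if it learned the "true" decomposition.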
Hot off the press at @aip.bsky.social #JCP, an introduction to the metatensor ecosystem: high-quality tools for atomistic #machinelearning. Read on at pubs.aip.org/aip/jcp/arti... check it out at metatensor.org, and find recipes at atomistic-cookbook.org/software/met...
It is a stormy day, but the COSMO retreat is going strong!
Filippo, Arslan and Paolo talking PET with our friends at the @epfl-ai-center.bsky.social ai.epfl.ch/a-new-refere...
If you want to learn about materials modeling, from DFT to MD, well marinated in a spicy ML sauce, don't miss the @ictp.bsky.social / @nccr-marvel.bsky.social college. Details and application instructions here: indico.ictp.it/event/11146/. See you in Miramare!
PET continues its victory lap of benchmarks and challenges. And this one has a (somewhat far-fetched) end goal that would also make it useful! Congrats to Filippo and Cesare (and @marceldotsci.bsky.social, who earned an honorable mention and will also push his LOREM model further) dtu.dk/english/news...
Sergey collects the EDMX doctoral distinction prize
Congratulations to Sergey Pozdnyakov, who very deservedly won the @materials-epfl.bsky.social doctoral distinction award. A good time to check out his papers, if you haven't read them already!
Cluster highlights in chemiscope 1.0 RC3
Release candidate 3 of chemiscope 1.0 is out, with class and range based highlighting of points. Try it, break it, report it on github.com/lab-cosmo/ch...
Fantastic news from the @snf-fns.ch, who despite the budget cuts managed to fund six new NCCRs. Looking forward to doing some cool simulations to advance separation science! actu.epfl.ch/news/a-new-n...
Pareto front for PET-OMAT models
If you're scared by the 700M parameters (you shouldn't be), there's a whole set of models, from tiny to mammoth-sized. You can find them all on github.com/lab-cosmo/upet!
Table showing results of a few representative universal models on the matbench leaderboard
If the PET-OAM results from a week ago made you curious, you can learn more by reading arxiv.org/abs/2601.16195, including some general considerations on how to safely train and use an unconstrained ML potential.
You can fetch the model from github.com/lab-cosmo/upet, as easy as `pip install upet`; then, for the ASE interface, `from upet.calculator import UPETCalculator; calculator = UPETCalculator(model="pet-oam-xl", version="1.0.0", device="cuda")`. Have fun and go break it!
Screenshot of the matbench discovery leaderboard as of 14.01.2026, showing a PET based model in the top position
Not going to make a big deal out of a benchmark table, but PET just got the top spot on matbench-discovery.materialsproject.org. And don't be fooled by the huge parameter count: it's faster and can handle larger structures than eSEN-30M. Kudos to Filippo, Arslan and Paolo!
Zooming in on a large-scale dataset to showcase the new adaptive resolution features in chemiscope
chemiscope.org 1.0.0rc1 just dropped on PyPI! We are making (a few) breaking changes to the interfaces, fixing a ton of bugs and introducing some exciting features (you can finally load datasets with >100k points!). We'd be grateful if you test, break and report: github.com/lab-cosmo/ch...