The "publish or perish" culture must perish. Scientists need time to think.
We just published our Slow Science Manifesto, where we argue that huge changes are needed in the way we fund, publish, and evaluate science.
Read more and sign here: www.slow-science.com
20.02.2026 16:11
Finally, a call to the neuroscience community: the BID is the perfect tool to study low-dimensional dynamics directly at the level of spikes in both spiking network models and data. Feel free to get in touch if you're interested in collaborating. N/N
27.01.2026 15:33
We look forward to applying this approach to study intrinsic dimension in non-equilibrium systems and neural representations in quantized neural networks. 4/N
27.01.2026 15:32
We find that the BID scales sublinearly in the spin-glass phase, with changes of scaling exponents sitting at phase transitions. 3/N
27.01.2026 15:30
We directly link the BID estimator for the intrinsic dimensionality of the spin dynamics to the overlap distribution in finite-size systems. 2/N
27.01.2026 15:28
Delighted to share our new work arxiv.org/abs/2601.17427 with Santiago Acevedo and Cristopher Erazo from SISSA, where we show that the Binary Intrinsic Dimension (BID) [ nature.com/articles/s42... ] can detect phase transitions in the Hopfield model. 1/N
27.01.2026 15:27
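The thread above is about estimating the intrinsic dimension of binary (spin) configurations. As a rough illustration of the general idea, here is a TwoNN-style nearest-neighbour-ratio estimator applied with Hamming distances to toy ±1 patterns. This is an illustrative sketch only, not the BID estimator from the paper; the `binary_data` generator (sign of a random linear map of a low-dimensional Gaussian latent) is a hypothetical toy model chosen just to produce binary data with tunable latent dimensionality.

```python
import numpy as np

rng = np.random.default_rng(1)

def two_nn_id(X):
    """Intrinsic-dimension estimate from the ratio of 2nd to 1st
    nearest-neighbour Hamming distances (TwoNN-style sketch;
    not the actual BID estimator)."""
    n_spins = X.shape[1]
    # for +/-1 patterns, Hamming distance = (n_spins - dot product) / 2
    D = (n_spins - X @ X.T) / 2
    np.fill_diagonal(D, np.inf)
    r = np.sort(D, axis=1)[:, :2]               # 1st and 2nd NN distances
    ok = (r[:, 0] > 0) & (r[:, 1] > r[:, 0])    # drop duplicates and ties
    mu = r[ok, 1] / r[ok, 0]
    return ok.sum() / np.log(mu).sum()          # maximum-likelihood estimate

def binary_data(n, n_spins, d_latent):
    """Toy binary patterns x = sign(A z) with a d_latent-dim latent z."""
    A = rng.normal(size=(n_spins, d_latent))
    Z = rng.normal(size=(n, d_latent))
    return np.sign(Z @ A.T)

id_low  = two_nn_id(binary_data(2000, 100, d_latent=3))
id_high = two_nn_id(binary_data(2000, 100, d_latent=50))
```

With discrete distances the ratio estimator is biased (ties must be discarded), which is one reason a dedicated estimator for binary data is needed; the sketch only shows that low-latent-dimension data yields a markedly smaller estimate than near-random data.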
Interested in doing a Ph.D. to work on building models of the brain/behavior? Consider applying to graduate schools at CU Anschutz:
1. Neuroscience www.cuanschutz.edu/graduate-pro...
2. Bioengineering engineering.ucdenver.edu/bioengineeri...
You could work with several comp neuro PIs, including me.
27.09.2025 20:30
PhD Position: Theory of Learning in Artificial and Biologically Inspired Neural Networks | Radboud University
Do you want to work as a PhD candidate on the Theory of Learning in Artificial and Biologically Inspired Neural Networks? Check our vacancy!
Please RT - Open PhD position in my group at the Donders Center for Neuroscience, Radboud University.
We're looking for a PhD candidate interested in developing theories of learning in neural networks.
Applications are open until October 20th.
For more info: www.ru.nl/en/working-a...
22.09.2025 17:17
The school will open the thematic period on Data Science and will be dedicated to the mathematical foundations and methods for high-dimensional data analysis. It will provide an in-depth introduction ...
Just got back from a great summer school at Sapienza University sites.google.com/view/math-hi... where I gave a short course on Dynamics and Learning in RNNs. I compiled a (very biased) list of recommended readings on the subject, for anyone interested: aleingrosso.github.io/_pages/2025_...
15.09.2025 11:57
With all the sad developments in the US, consider studying in the Netherlands: relatively low tuition (and an adequate job-search visa after graduation) for high-quality programs like this one in Neurophysics or Cognitive Neuroscience at the Donders neuroscience hub
27.05.2025 19:57
Fantastic. Congrats Will.
15.05.2025 15:54
Statistical Mechanics of Transfer Learning in Fully Connected Networks in the Proportional Limit
Tools from spin glass theory such as the replica method help explain the efficacy of transfer learning.
Our paper on the statistical mechanics of transfer learning is now published in PRL. Franz-Parisi meets Kernel Renormalization in this nice collaboration with friends in Bologna (F. Gerace) and Parma (P. Rodondo, R. Pacelli).
journals.aps.org/prl/abstract...
01.05.2025 16:13
External seminar - Alessandro Ingrosso (Radboud University, NL) | QBio
🇳🇱 For the next FRESK seminar, Alessandro Ingrosso (Radboud University, NL) will give a lecture on "Statistical mechanics of transfer learning in the proportional limit"
@aingrosso.bsky.social
More info on QBio's website!
qbio.ens.psl.eu/en/events/ex...
28.03.2025 18:51
Announcing our StatPhys29 Satellite Workshop "Molecular biophysics at the transition state: from statistical mechanics to AI" to be held in Trento, Italy, from July 7th to 11th, 2025: indico.ectstar.eu/event/252/.
Co-organized with Raffaello Potestio and his lab in Trento.
11.03.2025 13:38
Density of states in neural networks: an in-depth exploration of...
Learning in neural networks critically hinges on the intricate geometry of the loss landscape associated with a given task. Traditionally, most research has focused on finding specific weight...
Our paper on density of states in NNs is now published in TMLR. We show how the loss landscape in simple learning problems can be characterized by Wang-Landau sampling. A nice collaboration with the Potestio Lab in Trento, at the interface between ML and soft matter.
openreview.net/forum?id=BLD...
18.02.2025 13:20
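The post above characterizes loss landscapes via Wang-Landau sampling. For readers unfamiliar with the algorithm, here is a minimal textbook Wang-Landau sketch on a toy system (a periodic 1D Ising chain), not the paper's setup: the random walk in energy space is biased by 1/g(E), and the running estimate of the density of states g(E) is refined until the modification factor is small.

```python
import numpy as np

rng = np.random.default_rng(0)

def wang_landau_ising(L=8, f_final=1e-4, flatness=0.8):
    """Wang-Landau estimate of log g(E) for a periodic 1D Ising chain.
    Returns {E: log g(E)} shifted so the minimum is 0."""
    spins = rng.choice([-1, 1], size=L)
    # allowed energies of the periodic chain: -L, -L+4, ..., L
    energies = list(range(-L, L + 1, 4))
    idx = {E: i for i, E in enumerate(energies)}
    log_g = np.zeros(len(energies))     # running log density of states
    hist = np.zeros(len(energies))      # visit histogram
    log_f = 1.0                         # modification factor ln f
    E = -int(np.sum(spins * np.roll(spins, 1)))
    while log_f > f_final:
        for _ in range(1000 * L):
            i = rng.integers(L)
            dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % L])
            # accept flip with probability min(1, g(E)/g(E_new))
            if np.log(rng.random()) < log_g[idx[E]] - log_g[idx[E + dE]]:
                spins[i] *= -1
                E += dE
            log_g[idx[E]] += log_f      # penalize the current energy level
            hist[idx[E]] += 1
        if hist.min() > flatness * hist.mean():   # histogram "flat enough"
            hist[:] = 0
            log_f /= 2                  # refine the modification factor
    return dict(zip(energies, log_g - log_g.min()))
```

For L=8 the exact counts are g(±8)=2 and g(0)=140, so the estimated gap log g(0) − log g(±8) should land near ln 70 ≈ 4.25; the paper applies the same machinery to NN loss landscapes instead of spin energies.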
Event-based backpropagation on the neuromorphic platform SpiNNaker2
Neuromorphic computing aims to replicate the brain's capabilities for energy efficient and parallel information processing, promising a solution to the increasing demand for faster and more efficient ...
🤖🧠🧪 New #Preprint Alert! Imagine AI systems that can learn and adapt on-chip with minimal energy usage. We've just made a step towards unlocking the final piece of the puzzle needed to deploy neuromorphic computing at scale using SpiNNaker2! (1/8)
arxiv.org/abs/2412.15021
28.01.2025 20:06
New paper with @leonlufkin.bsky.social and @eringrant.bsky.social!
Why do we see localized receptive fields so often, even in models without sparsity regularization?
We present a theory in the minimal setting from @aingrosso.bsky.social and @sebgoldt.bsky.social
13.12.2024 10:49
Excess kurtosis strikes back.
13.12.2024 08:56