it's sad that the UK doesn't have Mormons, because they would be incredible at Taskmaster
What are good examples of meta-training *plastic* neural networks (either optimizing plasticity rules or network structure) to address neuroscience questions?
Stuff like @tpvogels.bsky.social & @bconfavreux.bsky.social’s rule search, Tyulmankov Neuron 2021, our stuff on transitive inference etc.
Google Meet is down
A million meetings cancelled
The people exult
Less flippantly:
With LLMs acting as intelligent mutation operators, meta-optimization moves from purely quantitative exploration of pre-defined spaces (hyperparam search) to *qualitative* proposals (defining new loss functions, identifying target hyperparams, etc.)
tl;dr: DeepMind automates the Graduate Student Descent algorithm!
The way most CS/AI has worked for decades too
Really struck by how @tyrellturing.bsky.social's model of hippocampus (yesterday at Cosyne) is almost symmetric to the very similar Whittington/Fiete lab models.
The latter are based on grid cells; the former has no grid cells at all!
Max Bennett’s keynote was, predictably, spectacular 🤯
I’m off to Montréal for Cosyne 2025. Say hi if you see me!
First Added First Out seems to be very popular on social media these days, not sure why
Now out in @NatureNeuro!
Animals can do Transitive Inference (TI): quickly learn, and reconfigure, a global order from partial info (learn A>B, B>C, C>D, immediately know A>D).
But currently no neural model explains the observed features of TI.
So we asked the machine to find one for us, and it did!
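The TI task above can be sketched as a toy transitive-closure computation (names and code are mine, purely illustrative; this is the task definition, not any neural model of it):

```python
# Transitive inference: given only adjacent premises (A>B, B>C, C>D),
# the never-trained pair A>D should follow. Here we make that explicit
# by computing the transitive closure of the premise relation.

def transitive_closure(pairs):
    """Return every (x, y) such that x > y follows from the premises."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))  # chain a > b and b > d into a > d
                    changed = True
    return closure

premises = {("A", "B"), ("B", "C"), ("C", "D")}
inferred = transitive_closure(premises)
print(("A", "D") in inferred)  # prints True: the unseen pair is inferred
```

The interesting question for a neural model is, of course, not whether this closure exists but how a network comes to behave as if it had computed it after seeing only the adjacent pairs.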
It was bound to happen eventually!
I’m at NeurIPS! Say hi if you want to talk about meta-learning, computational neuroscience, plastic neural networks, or any combination of the above.
There’s a special place in hell for departments that require letters of recommendation for all job applicants rather than just those who make some initial cut.