Merry Christmas everyone!
Day 21: Read 2 studies on the neural correlates of memory, specifically focusing on knowledge retrieval and updating. #neuroscience #deeplearning
Day 20: Crossed a major hurdle today. My sleep model can now classify wakefulness, REM, and NREM sleep from EEG+EMG signals. #100DaysOfCode #neuroscience #deeplearning
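Not my actual model, but a minimal sketch of what 3-class sleep-stage classification over per-epoch features could look like (the feature count, architecture, and names are all illustrative assumptions):

```python
import torch
import torch.nn as nn

# Toy 3-class classifier: Wake / NREM / REM from per-epoch features
# (the feature set and sizes here are illustrative, not my real model)
N_FEATURES = 8
model = nn.Sequential(
    nn.Linear(N_FEATURES, 32),
    nn.ReLU(),
    nn.Linear(32, 3),  # logits for Wake, NREM, REM
)

features = torch.randn(16, N_FEATURES)  # 16 scoring epochs of features
logits = model(features)
stages = logits.argmax(dim=1)           # predicted stage per epoch
```

In practice the inputs would be something like EEG band powers plus EMG amplitude per scoring epoch.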
Day 19: Today I read 2 blog posts on Understanding GPU Memory by PyTorch.
Read them here t.co/v4NC15dWrq and here t.co/o9dNDsMJhI. I love how simply and directly they are written, no BS. #100DaysOfCode #DeepLearning
Day 18: I was consumed by a JAX issue. #100DaysOfCode
Day 17: More than a month since I posted my progress. Been busy with applications, my job, and lab work! Mostly worked on a pipeline to infer and analyse latent sleep states. Hoping to make more progress before I go on vacation next week. #100DaysOfCode #deeplearning
Day 16: Reviewed matrix calculus (gradients, Jacobians, and the scalar and Jacobian chain rules). Also worked on improving brain2image retrievals using SigLIP; now have decent performance. #100DaysOfCode #neuroimaging #neuroscience
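The Jacobian chain rule from today, as a quick numerical sanity check: for h = f∘g, J_h(x) = J_f(g(x)) · J_g(x). The toy functions f and g below are my own examples:

```python
import numpy as np

def g(x):   # R^2 -> R^3
    return np.array([x[0]**2, x[0] * x[1], np.sin(x[1])])

def f(y):   # R^3 -> R^2
    return np.array([y[0] + y[1], y[1] * y[2]])

def Jg(x):  # 3x2 Jacobian of g
    return np.array([[2 * x[0], 0],
                     [x[1], x[0]],
                     [0, np.cos(x[1])]])

def Jf(y):  # 2x3 Jacobian of f
    return np.array([[1, 1, 0],
                     [0, y[2], y[1]]])

x = np.array([1.0, 2.0])
# chain rule: Jacobian of the composition is the matrix product
J_chain = Jf(g(x)) @ Jg(x)
```

Comparing `J_chain` against central finite differences of `f(g(x))` confirms the rule.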
And slow-fluctuation Ca2+ spikes (10-100 ms) that also contribute to the electric field outside neurons. What's interesting is that they can spread throughout the neuron, even crossing different layers of input. They are relatively large and long-lasting and can be measured outside the neuron.
Also read about fast Na+ spikes (action potentials) that generate the strongest currents across the neuronal membrane. Even though each lasts only ~2 ms, they contribute substantially to the high-frequency components of the LFP because of synchronous action potentials from many neurons.
Day 15: Theory of extracellular fields and currents: EEG recorded from the scalp, ECoG recorded by subdural grid electrodes on the cortical surface, LFP, and the contributors to extracellular fields (ionic processes, from fast action potentials to the slowest fluctuations). #100DaysOfCode #neuroskyence
Day 14: Learned new things related to signal processing; spent a lot of time inspecting HPC and PFC data to understand why the 3D plot was not making sense; turns out you need to look at it from the correct angle (feels silly). Also learned about FFT and Welch (pwelch) analysis. #100DaysOfCode #neuroskyence
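A minimal example of the Welch analysis using SciPy's `welch` (the Python counterpart of MATLAB's `pwelch`); the 50 Hz sine in noise is a made-up test signal:

```python
import numpy as np
from scipy.signal import welch

fs = 1000                       # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)

# Welch's method: average periodograms over overlapping windowed segments
f, pxx = welch(x, fs=fs, nperseg=256)
peak = f[np.argmax(pxx)]        # frequency of the spectral peak, near 50 Hz
```

Averaging over segments trades frequency resolution (fs/nperseg per bin) for a much less noisy spectral estimate than a single FFT periodogram.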
Day 13: Attempted to improve the brain2image forward and backward retrievals by making the labels soft in the SigLIP loss function and somehow made it worse. I will continue tomorrow! #100DaysOfCode #neuroskyence
Day 12: Trained mcRBM model again after minor changes and increasing the batch size from 256 to 16384 on an A4000; took 7 mins to run 10K epochs. #100DaysOfCode #neuroskyence
Day 11: Trained mcRBM model on the sleep scoring dataset. mcRBM is an energy-based model used for unsupervised learning. #100DaysOfCode #neuroskyence
Day 10: In the MedARC journal club, today we discussed the BrainLM foundation fMRI model introduced by Van Dijk Lab.
Preprint: t.co/MUobqXULfb
#100DaysOfCode #neuroskyence
Upon sliding this kernel function over the manifold, we're essentially adding up the values of the kernel function at different points, weighted by the data at those points, to capture information about patterns or features on the manifold.
And instead of a grid, we have a "kernel function" that tells us how to weight or combine information from nearby points on the manifold.
On a manifold, there are no Cartesian coordinates like (x, y). Instead, there are local coordinate systems, and at each point on the manifold we describe how to measure distances and directions around that point. These help in defining the equivalent of Cartesian coordinates.
Day 9: Convolutions on manifolds! Okay, in image convolutions we have a grid of numbers (the filter) that we slide over an image to perform operations like edge detection. On a manifold, we need to adapt this because there's no natural grid structure. #100DaysOfCode #AcademicSky
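A toy sketch of that kernel idea on a sampled manifold (a point cloud), using a Gaussian kernel over plain Euclidean distances as a stand-in for true geodesic distances; everything here is illustrative:

```python
import numpy as np

def manifold_conv(points, values, center_idx, sigma=0.5):
    """Kernel-weighted combination of the values at points near a center point."""
    # Gaussian kernel weights from distance to the center point
    d = np.linalg.norm(points - points[center_idx], axis=1)
    w = np.exp(-d**2 / (2 * sigma**2))
    w /= w.sum()
    # weight the data at each point by the kernel value there
    return float((w * values).sum())
```

"Sliding" the kernel just means repeating this at every point; a real manifold convolution would use geodesic distances derived from the Riemannian metric rather than Euclidean ones.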
Day 8: The concept of manifolds is essential to how diffusion models work. I never really understood them before, and after spending a few hours today I only understand them to a small extent. Maybe a few more days! Let's see what I know now. #100DaysOfCode #AcademicSky
Also, the Riemannian metric, which gives the rules for measuring distances and angles on a manifold. And the geodesic, which is the most efficient route between two points on a manifold.
What I learned today about manifolds is basically a lot of prerequisite concepts, like tangent vectors, which are like little arrows that tell you how to move around on the curved surface of the manifold.
Manifolds help represent high-dimensional data in a lower-dimensional (latent) space, and diffusion models operate in that latent space, which can be thought of as a representation of the data where the underlying structure is preserved.
There's still work I need to do on this, e.g. making the bias learnable.
Reading the OpenCLIP code really helped me. I think I understand how it works much better now!
[plots: train loss and test loss curves]
Day 7: Made a lot of progress on the SigLIP loss function I have been working on. I am seeing some decent performance metrics on brain2image retrievals and vice versa (Test Bwd Pct Correct: 0.6875, Test Fwd Pct Correct: 0.8906). #100DaysOfCode #neuroscience #neuroskyence #deeplearning
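For reference, a minimal sketch of the SigLIP pairwise sigmoid loss with temperature and bias (assuming L2-normalized brain and image embeddings; this is my reading of the idea, not the exact OpenCLIP code):

```python
import torch
import torch.nn.functional as F

def siglip_loss(emb_a, emb_b, log_temp, bias):
    """Pairwise sigmoid loss: +1 labels on the diagonal (true pairs), -1 elsewhere."""
    # similarity logits, scaled by a learnable temperature and shifted by a bias
    logits = emb_a @ emb_b.t() * log_temp.exp() + bias
    labels = 2 * torch.eye(emb_a.size(0), device=logits.device) - 1
    # every pair is an independent binary classification: match or not
    return -F.logsigmoid(labels * logits).mean()
```

In training, `log_temp` and `bias` would be `nn.Parameter`s so both are learnable.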