Day n: Why was 6 scared of 7? CAUSE SEVEN ATE NINE!!! Working on latent spaces in autoencoders. Fun. #100DaysOfCode
Day 65: Yesterday I did some coding on paper to understand how to design my next experiment! #100DaysOfCode
Day 64: Variational autoencoder session - trying to disentangle the latent variables (supervised version). PLEASE WORK. #100DaysOfCode
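For context, a minimal sketch of what a supervised disentanglement term can look like in a VAE loss, assuming PyTorch. The weights beta and lam, and the convention that the first k latent dimensions are tied to labelled factors y, are my own placeholders, not the project's actual loss.

import torch
import torch.nn.functional as F

def supervised_vae_loss(x, x_hat, mu, logvar, y, beta=1.0, lam=1.0):
    # Standard VAE terms: reconstruction + KL to the N(0, I) prior.
    recon = F.mse_loss(x_hat, x, reduction="mean")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    # Supervision: tie the first k latent means to the known factors y,
    # nudging those dimensions to disentangle along the labelled factors.
    k = y.shape[1]
    supervision = F.mse_loss(mu[:, :k], y, reduction="mean")
    return recon + beta * kl + lam * supervision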
Structural extraction of structures.
Day 63: attended a really questionable Python introduction and learned about concept bottleneck models!! #100DaysOfCode
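A minimal sketch of the concept bottleneck idea, assuming PyTorch; all dimensions and names here are made up. The point is that the label is predicted only from an intermediate layer of human-interpretable concepts.

import torch.nn as nn

class ConceptBottleneck(nn.Module):
    def __init__(self, in_dim=64, n_concepts=8, n_classes=3):
        super().__init__()
        # First stage: map the input to interpretable concept scores.
        self.to_concepts = nn.Sequential(
            nn.Linear(in_dim, 32), nn.ReLU(),
            nn.Linear(32, n_concepts), nn.Sigmoid(),
        )
        # Second stage: the label sees nothing but the concepts.
        self.to_label = nn.Linear(n_concepts, n_classes)

    def forward(self, x):
        c = self.to_concepts(x)  # concepts can be supervised directly
        return self.to_label(c), c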
Day 62: RELOCATION OF NODES IS WORKING NOW. A mega mess, but well. Functional. Gotta improve the indexing though. #100DaysOfCode
Those Tuesday morning messages from your bio prof give you a little smile.
"This seems to be a fishing e-mail - do not react to it."
P.S. No phishing and no fishing, just an automated notification from Moodle.
Day 61: Doctorate meeting in the morning was interesting and fun as always!! Working on the literature review. #100DaysOfCode
And finally, a useful version!! More or less handmade ...
other versions!!
Day 60: ChatGPT: will take over the world. Also ChatGPT: struggles to build a LaTeX table for me. #100DaysOfCode
Day 59: Refactoring this mess of code. Functional: sure, funny: maybe, pretty: definitely not. #100DaysOfCode
Is it normal that every time you touch your laptop a new screw falls out? I am at screw number 3 now.
only ever used that to play solitaire with my grandmother, so cannot judge
Christina, but close enough
We should have stopped at 98
and then closer to 3am it did start to work
I don't know which day, but I am sharing the things that don't work at 2am, and haven't worked for the past two days (or ever) #100DaysOfCode
Haven't had that problem before. But sounds similarly magical!
Day 57: Classic moment of "It works and nobody knows why". The training of my model should break when there is no connection between green and yellow filters. Well, it doesn't. #100DaysOfCode
Day 56: Doctorate meeting went well - talked about a multi-objective loss. Prof said, if the project was a funnel, we'd be at the red arrow. I think that's something positive! Also, for the next "sprint": two P's - Passion and a little bit of Panic! #100DaysOfCode
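A minimal sketch of what a multi-objective loss can look like as a weighted sum; the term names and weights are placeholders, not the actual project loss.

def total_loss(recon, sparsity, modularity, w=(1.0, 1e-3, 1e-2)):
    # Each argument is a scalar loss tensor; the weights trade the
    # competing objectives off against each other.
    return w[0] * recon + w[1] * sparsity + w[2] * modularity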
Day 55: OMG, I did it, the loss function doesn't throw any errors anymore. It's differentiable now; I understood how .to(device) is handled in Lightning and learned more about gradients and how computation graphs are built. #100DaysOfCode
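A minimal sketch of how device placement works in Lightning, assuming pytorch-lightning; LitModel and the mask buffer are hypothetical. Tensors registered as buffers are moved together with the module, so a loss term using them never mixes CPU and GPU tensors.

import torch
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 8)
        # Buffers follow the module on .to(device); Lightning calls that
        # for you, so self.mask is always on the same device as the weights.
        self.register_buffer("mask", torch.ones(8, 8))

    def training_step(self, batch, batch_idx):
        x, y = batch
        penalty = ((self.layer.weight * self.mask) ** 2).sum()
        return torch.nn.functional.mse_loss(self.layer(x), y) + 1e-4 * penalty

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())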
Day 54: Today, I realised that I overcomplicated my whole loss function. Instead of just looping through the parameters saved in the model, I am crawling through the network. At least I know it now and can fix the mess! #100DaysOfCode
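For the record, the simple version: iterate over what the model already exposes instead of crawling the graph by hand. A minimal sketch assuming PyTorch; the L1 penalty is just an example term.

def l1_penalty(model):
    # model is any torch.nn.Module; .parameters() yields every weight
    # and bias without any manual traversal of the network.
    return sum(p.abs().sum() for p in model.parameters() if p.requires_grad)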
Day 53: Fundus images didn't work, so I will switch to en-face OCT using some slabs. Hence, I will get into layer segmentation next!! #100DaysOfCode
My absolute most-hated error when using Python. Similar to C's "Segmentation fault (core dumped)".
Day 52: Turns out my loss term wasn't differentiable. Thanks go out to my amazing PhD team for spotting it! #100DaysOfCode
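A minimal sketch of how such a bug shows up, assuming PyTorch: hard operations like argmax cut the autograd graph, while a soft surrogate keeps gradients flowing. Checking .grad_fn is a quick way to spot it.

import torch

scores = torch.randn(4, 3, requires_grad=True)
hard = scores.argmax(dim=1).float()   # grad_fn is None: the graph is cut here
soft = torch.softmax(scores, dim=1)   # differentiable surrogate
print(hard.grad_fn, soft.grad_fn)     # None vs. <SoftmaxBackward0 ...>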
Day 51: Looking at my neural network layers with deep dream! #100DaysOfCode
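A minimal deep-dream style sketch, assuming PyTorch and a pretrained torchvision VGG16; the layer index, step count, and step size are arbitrary. It runs gradient ascent on the input to maximise one layer's mean activation.

import torch
from torchvision import models

def deep_dream(x, steps=20, lr=0.05, target_layer=10):
    net = models.vgg16(weights="DEFAULT").features.eval()
    x = x.clone().requires_grad_(True)
    for _ in range(steps):
        act = x
        for i, layer in enumerate(net):
            act = layer(act)
            if i == target_layer:
                break
        act.mean().backward()          # ascend on the layer's activation
        with torch.no_grad():
            x += lr * x.grad / (x.grad.norm() + 1e-8)
            x.grad.zero_()
    return x.detach()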
thank youuu!!
Day 50: Dataloader for Kandinsky patterns is done! Now I need to find a loss term that makes my network not only sparse but also modular! So in love with my PhD team, they help me verbalise my problems. #100DaysOfCode
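A minimal sketch of one way to encode "sparse and modular" as a penalty, assuming PyTorch; the block-mask idea is an illustration I made up, not the actual loss term from the project.

import torch

def sparse_modular_penalty(weight, block_mask, l1=1e-3, cross=1e-2):
    # L1 pushes all weights toward zero (sparsity); the second term only
    # penalises weights that fall outside the allowed blocks (modularity).
    sparsity = weight.abs().sum()
    cross_block = (weight * (1 - block_mask)).abs().sum()
    return l1 * sparsity + cross * cross_block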
Day 49: Meeting with my Doctoral supervisor and advisors was fun as always! Great new ideas for how to proceed - mainly dataset and loss term related. #100DaysOfCode