I'd love to hear a deep dive on Radiohead's use of tech, like the Ondes Martenot and other electronic instruments/production; could even pull in Cole Cuchna as a guest!
Could also see a broader "tech in music" episode!
After that Spotify section (and the theme song episode a while back!), I'm wondering when we might get a full Radiohead HF episode
Oh man, and Step Functions as agentic orchestration systems!
Any plans for a ModernBERT variant that uses RTD as a pretraining objective, to potentially optimize for classification tasks?
And as the authors note in the paper, ModernBERT may not be the best (yet) for classification tasks, given its MLM-only training objective (as seen in its GLUE results)
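For anyone unfamiliar with RTD (replaced token detection, the ELECTRA-style objective): instead of masking tokens, you corrupt some of them and train the model to classify each position as original vs. replaced. A minimal toy sketch below; the random corruption stands in for a real generator model, and all names here are made up for illustration:

```python
import random

def make_rtd_example(tokens, vocab, corrupt_prob=0.15, seed=0):
    """Toy RTD data construction: randomly replace some tokens and
    emit per-position labels (1 = replaced, 0 = original).
    A real ELECTRA-style setup samples replacements from a small MLM
    generator, not uniformly at random."""
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < corrupt_prob:
            corrupted.append(rng.choice([v for v in vocab if v != tok]))
            labels.append(1)  # this position was replaced
        else:
            corrupted.append(tok)
            labels.append(0)  # original token kept
    return corrupted, labels

tokens = ["modern", "bert", "is", "fast"]
vocab = ["modern", "bert", "is", "fast", "slow", "old"]
corrupted, labels = make_rtd_example(tokens, vocab)
```

The appeal for classification is that the discriminator gets a learning signal from every position, not just the ~15% masked ones.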
The long-context retrieval and efficiency benefits make ModernBERT a no-brainer for IR in RAG apps, but it's interesting that it comparatively excels most in multi-vector retrieval (i.e. in comparison with GTE-en-MLM)
ModernBERT future directions I'm esp excited about:
1. Multi-vector retrieval (i.e. ColBERT)
2. A new version of ModernBERT trained with a combo of MLM and RTD objectives for classification
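For context on point 1: ColBERT-style multi-vector retrieval keeps one embedding per token and scores with late interaction (MaxSim). A minimal sketch with random toy embeddings; a real system would use per-token embeddings from ModernBERT or similar:

```python
import numpy as np

def maxsim_score(query_vecs, doc_vecs):
    """ColBERT-style late interaction: for each query token, take its
    best cosine similarity over all doc tokens, then sum."""
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sim = q @ d.T                 # (n_query_tokens, n_doc_tokens)
    return float(sim.max(axis=1).sum())

rng = np.random.default_rng(0)
query = rng.normal(size=(4, 8))   # 4 query tokens, toy dim 8
doc = rng.normal(size=(10, 8))    # 10 doc tokens
score = maxsim_score(query, doc)
```

This is why long context helps so much here: more doc tokens means more candidates for each query token's max, without squeezing everything into a single pooled vector.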
ModernBERT and GLiNER (and maybe to a lesser extent, GraphRAG) are definitely the advances I expect to build with the most in 2025