Full write-up in the blog post.
twosmallfish.vc/portfolio-hi...
#EdgeAI #OnDeviceAI #VoiceAI #TimeSeriesAI #Semiconductors #StateSpaceModels #Robotics #Wearables #AR #HealthTech #Automotive #VentureCapital
State Space Models Accelerate Recommendations, LLMs Boost Quality
The article notes that state space models accelerate recommendation engines and large language models improve content quality. Read more: getnews.me/state-space-models-accel... #statespacemodels #llms #recommendations
State‑Space Duality Expanded to Diagonal Models for Faster Sequences
The paper expands Structured State‑Space Duality to diagonal state matrices, preserving the O(T) recurrence vs O(T²) attention trade‑off when an SSM matches masked attention. getnews.me/state-space-duality-expa... #statespacemodels #attention
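The duality the post refers to can be illustrated with a tiny sketch (my own toy example, not code from the paper): a diagonal SSM's O(T) linear recurrence computes the same sequence map as an O(T²) masked "attention" matrix whose entries are the decayed state transitions.

```python
import numpy as np

# Toy scalar diagonal SSM: h_t = a*h_{t-1} + b*x_t, y_t = c*h_t.
# The equivalent masked matrix has M[t, s] = c * a**(t-s) * b for s <= t.
rng = np.random.default_rng(0)
T = 6
a, b, c = 0.9, 1.0, 1.0  # hypothetical parameters, chosen for illustration
x = rng.standard_normal(T)

# O(T) recurrence
h, y_rec = 0.0, np.empty(T)
for t in range(T):
    h = a * h + b * x[t]
    y_rec[t] = c * h

# O(T^2) masked-matrix ("attention"-style) form of the same map
M = np.array([[c * a**(t - s) * b if s <= t else 0.0
               for s in range(T)] for t in range(T)])
y_mat = M @ x

assert np.allclose(y_rec, y_mat)  # both forms agree
```

The point of the duality result is that when an SSM admits such a masked-matrix view, you can pick whichever computation is cheaper for the hardware and sequence length at hand.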
In-Training Compression Improves Efficiency of State Space Models
In‑training compression trims SSM hidden dimensions during training, preserving performance while speeding up optimization; paper submitted Oct 2025. Read more: getnews.me/in-training-compression-... #statespacemodels #modelcompression
Damped Oscillatory State‑Space Models Boost Long‑Range Learning
D‑LinOSS adds learnable damping to oscillatory state‑space models, improving long‑range dependency learning. It hits state‑of‑the‑art results on 50 k‑token tasks; paper posted on arXiv May 2025. Read more: getnews.me/damped-oscillatory-state... #dlinoss #statespacemodels
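The core idea can be sketched in a few lines (an illustrative discretization of a damped harmonic oscillator; the function name, parameters, and update rule are my assumptions, not the D‑LinOSS formulation): each 2‑D state rotates at a frequency `w` while a damping factor `gamma` shrinks it, so the impulse response decays at a learnable rate.

```python
import numpy as np

def damped_osc_step(h, x, w=1.0, gamma=0.1, dt=0.1):
    """One step of a damped 2-D oscillator state update (toy example)."""
    decay = np.exp(-gamma * dt)                          # damping: |A| < 1 for gamma > 0
    cos_, sin_ = np.cos(w * dt), np.sin(w * dt)
    A = decay * np.array([[cos_, -sin_],                 # damped rotation matrix
                          [sin_,  cos_]])
    return A @ h + np.array([dt, 0.0]) * x               # drive the first state channel

# Feed an impulse, then let the state evolve with zero input:
h = np.zeros(2)
for x in [1.0, 0.0, 0.0, 0.0]:
    h = damped_osc_step(h, x)
```

With `gamma > 0` the state norm contracts every step, which is what lets the model control how far back its memory reaches; setting `gamma = 0` recovers an undamped oscillator that never forgets.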
Memory Length Drives Learning in State Space Models, Study Finds
Researchers find that giving state space models the longest memory horizon improves gradient descent, and fixing recurrent weights matches or exceeds adaptive versions. Read more: getnews.me/memory-length-drives-lea... #statespacemodels #memorylength
Aligning Inductive Bias Boosts Data Efficiency in State Space Models
Task-Dependent Initialization (TDI) aligns an SSM's spectral bias with a dataset's power spectrum, improving accuracy in low-data regimes. Sep 26 2025. Read more: getnews.me/aligning-inductive-bias-... #taskdependentinitialization #statespacemodels
Tired of #AI search hallucinations?
The root cause often lies in the architecture behind most AI models: the #Transformer.
In this #InfoQ article, Albert Lie explains how #StateSpaceModels (#SSMs) can fix this, and what it could mean for the future of AI search.
Read now: bit.ly/4nojJrZ
#LLMs
New episode is out!
We covered some of the most practical and under-discussed tools in #Bayesian econometrics:
🔷 #DynamicRegression
🔷 #StateSpaceModels
🔷 Predictively consistent priors
🔷 Bayesian R²
🔷 Whether AI could help us elicit better priors
I am looking forward to presenting this work in Nashville! Let's connect at CVPR 2025 in person. Would love to chat about dynamic graph modeling, efficient state propagation, and the future of ML models.
#CVPR2025 #MachineLearning #TimeSeries #StateSpaceModels #DeepLearning #Nashville #Algorithms
Mamba(2) is great but not without flaws. Combining it with attention can alleviate its shortcomings. If you want to learn more about current SSM-attention hybrids, I compiled an extensive review with pros and cons:
n1o.github.io/posts/ssm-tr...
#AI #Mamba #StateSpaceModels #LLM
Huge thanks to my incredible co-authors Nicola Cirone, Antonio Orvieto, Cristopher Salvi, and Terry Lyons!
#NeurIPS2024 #MachineLearning #DeepLearning #StateSpaceModels
🧵6/6