#StateSpaceModels
Posts tagged #StateSpaceModels on Bluesky
Portfolio Highlight: ABR's Funding Round - Two Small Fish Edge AI has been a key pillar of our Advanced Computing Hardware investments and a core part of our thesis for a long time. It is the same arc I wrote about in The Next Data Centre: Your Phone a while...

Full write-up in the blog post.

twosmallfish.vc/portfolio-hi...

#EdgeAI #OnDeviceAI #VoiceAI #TimeSeriesAI #Semiconductors #StateSpaceModels #Robotics #Wearables #AR #HealthTech #Automotive #VentureCapital

State Space Models Accelerate Recommendations, LLMs Boost Quality

The article notes that state space models accelerate recommendation engines and large language models improve content quality. Read more: getnews.me/state-space-models-accel... #statespacemodels #llms #recommendations

State‑Space Duality Expanded to Diagonal Models for Faster Sequences

The paper expands Structured State‑Space Duality to diagonal state matrices, preserving the O(T) recurrence vs O(T²) attention trade‑off when an SSM matches masked attention. getnews.me/state-space-duality-expa... #statespacemodels #attention
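The linear-time recurrence the post contrasts with attention can be sketched for a diagonal SSM in a few lines (an illustrative NumPy toy, not the paper's code; the parameters `a` and `b` here are hypothetical):

```python
import numpy as np

def diagonal_ssm_scan(a, b, x):
    """Diagonal SSM recurrence h_t = a * h_{t-1} + b * x_t, computed in O(T).

    a, b: (d,) diagonal state matrix and input gain
    x:    (T, d) input sequence
    """
    T, d = x.shape
    h = np.zeros(d)
    out = np.empty((T, d))
    for t in range(T):
        h = a * h + b * x[t]  # O(d) work per step -> O(T * d) overall
        out[t] = h
    return out

# Attention, by contrast, materializes a T x T score matrix: O(T^2) in length.
rng = np.random.default_rng(0)
T, d = 8, 4
x = rng.normal(size=(T, d))
states = diagonal_ssm_scan(np.full(d, 0.9), np.ones(d), x)
```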

In-Training Compression Improves Efficiency of State Space Models

In‑training compression trims SSM hidden dimensions during training, preserving performance while speeding up optimization; paper submitted Oct 2025. Read more: getnews.me/in-training-compression-... #statespacemodels #modelcompression
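The idea of trimming SSM hidden dimensions can be sketched as magnitude-based pruning of a diagonal model's state modes (a loose illustration of the general idea, not the paper's method; the scoring heuristic here is an assumption):

```python
import numpy as np

def prune_state_dims(a, b, c, keep_ratio=0.5):
    """Keep the state dimensions with the largest steady-state influence.

    Scores each mode by |c_i| * |b_i| / (1 - |a_i|), a rough proxy for its
    contribution to the output, and drops the weakest modes.
    """
    score = np.abs(c) * np.abs(b) / (1.0 - np.abs(a) + 1e-8)
    k = max(1, int(keep_ratio * a.size))
    keep = np.argsort(score)[-k:]  # indices of the k strongest modes
    return a[keep], b[keep], c[keep]

a = np.array([0.99, 0.5, 0.9, 0.1])  # per-mode decay rates
b = np.ones(4)
c = np.array([1.0, 0.01, 0.5, 2.0])  # per-mode output weights
a2, b2, c2 = prune_state_dims(a, b, c, keep_ratio=0.5)
```

Done periodically during training, this shrinks the state that the optimizer has to carry forward, which is where the speed-up comes from.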

Damped Oscillatory State‑Space Models Boost Long‑Range Learning

D‑LinOSS adds learnable damping to oscillatory state‑space models, improving long‑range learning. It hits state‑of‑the‑art results on 50k‑token tasks; paper posted on arXiv May 2025. Read more: getnews.me/damped-oscillatory-state... #dlinoss #statespacemodels
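The damping the post describes can be illustrated with a discretized damped harmonic oscillator as the state update (a toy sketch; D‑LinOSS's actual parameterization is in the paper, and `gamma` here stands in for its learnable damping):

```python
import numpy as np

def damped_oscillator_step(state, u, omega=1.0, gamma=0.1, dt=0.1):
    """One semi-implicit Euler step of x'' = -omega^2 * x - 2*gamma*x' + u.

    gamma is the damping coefficient (learnable in D-LinOSS).
    """
    x, v = state
    v_new = v + dt * (-omega**2 * x - 2.0 * gamma * v + u)
    x_new = x + dt * v_new
    return np.array([x_new, v_new])

# With gamma > 0 the impulse response decays instead of ringing forever,
# which lets each mode forget at a learned rate over long sequences.
state = damped_oscillator_step(np.array([0.0, 0.0]), u=1.0)  # impulse in
trace = []
for _ in range(500):
    state = damped_oscillator_step(state, u=0.0)  # free evolution
    trace.append(state[0])
```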

Memory Length Drives Learning in State Space Models, Study Finds

Researchers find that giving state space models the longest memory horizon improves gradient descent, and fixing recurrent weights matches or exceeds adaptive versions. Read more: getnews.me/memory-length-drives-lea... #statespacemodels #memorylength

Aligning Inductive Bias Boosts Data Efficiency in State Space Models

Task-Dependent Initialization (TDI) aligns an SSM's spectral bias with a dataset's power spectrum, improving accuracy in low-data regimes. Sep 26 2025. Read more: getnews.me/aligning-inductive-bias-... #taskdependentinitialization #statespacemodels
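The general idea of aligning an SSM's spectral bias with the data can be sketched as initializing per-mode poles from the strongest frequencies in the training signal (an illustration only, not the paper's TDI procedure; the function name and damping constant are made up for this sketch):

```python
import numpy as np

def init_poles_from_spectrum(signal, n_modes=4, dt=1.0):
    """Place each mode's resonance near one of the signal's dominant frequencies.

    Returns complex discrete-time poles exp((-gamma + i*omega) * dt)
    with mild fixed damping gamma.
    """
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=dt)
    top = np.argsort(spec[1:])[-n_modes:] + 1  # strongest bins, skipping DC
    omega = 2 * np.pi * freqs[top]
    gamma = 0.01
    return np.exp((-gamma + 1j * omega) * dt)

# A signal dominated by two frequencies -> poles land near 0.05 and 0.2 cycles/step.
t = np.arange(1024)
signal = np.sin(2 * np.pi * 0.05 * t) + 0.5 * np.sin(2 * np.pi * 0.2 * t)
poles = init_poles_from_spectrum(signal, n_modes=2)
```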

Tired of #AI search hallucinations?

The root cause often lies in the architecture behind most AI models: the #Transformer.

In this #InfoQ article, Albert Lie explains how #StateSpaceModels (#SSMs) can fix this, and what it could mean for the future of AI search.

Read now: bit.ly/4nojJrZ

#LLMs

New episode is out!
We covered some of the most practical and under-discussed tools in #Bayesian econometrics:
🔷 #DynamicRegression
🔷 #StateSpaceModels
🔷 Predictively consistent priors
🔷 Bayesian R²
🔷 Whether AI could help us elicit better priors

I am looking forward to presenting this work in Nashville! Let's connect at CVPR 2025 in person. Would love to chat about dynamic graph modeling, efficient state propagation, and the future of ML models.

#CVPR2025 #MachineLearning #TimeSeries #StateSpaceModels #DeepLearning #Nashville #Algorithms

Mamba(2) and Transformer Hybrids: An Overview Abstract Link to heading We have already looked into Mamba and Mamba2. In terms of efficiency, with their linear complexity and the absence of Key-Value cache, they are a significant improvement over ...

Mamba(2) is great but not without flaws. Combining it with attention can alleviate its shortcomings. If you want to learn more about current SSM–attention hybrids, I compiled an extensive review with pros and cons:
n1o.github.io/posts/ssm-tr...
#AI #Mamba #StateSpaceModels #LLM
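The hybrid pattern such reviews survey can be sketched as a layer stack that is mostly linear-time SSM blocks with an occasional attention block for precise token-to-token recall (a schematic only, not any specific hybrid; the layer ratio is an arbitrary example):

```python
def build_hybrid_stack(n_layers=12, attn_every=4):
    """Interleave SSM blocks with one attention block every `attn_every` layers."""
    layers = []
    for i in range(n_layers):
        layers.append("attention" if (i + 1) % attn_every == 0 else "ssm")
    return layers

# Most compute stays O(T); only a few layers pay the O(T^2) attention cost,
# and only those layers need a key-value cache at inference time.
stack = build_hybrid_stack()
```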

Huge thanks to my incredible co-authors Nicola Cirone, Antonio Orvieto, Cristopher Salvi, and Terry Lyons!

#NeurIPS2024 #MachineLearning #DeepLearning #StateSpaceModels

🧵6/6
