We're probably at peak research science. Low-level stuff is getting automated, coding is basically reviewing, plots are beautiful, and everyone gets a brainstorming buddy. Work is mostly ideation/planning. Soon it might become mostly meetings/auditing. Hope I'm wrong; work is pretty fun right now!
05.02.2026 12:51
Humming in denial as Material 3 takes over all my screens
04.11.2025 00:34
Atrapanubes is such a good Chilean beer. Great taste, great art, great name.
16.03.2025 23:25
Gotta wait until he double-crosses Indiana Jones to steal the Holy Grail, I'm afraid.
09.03.2025 14:02
We've upgraded Le Chat and it's blazing fast right now!
Also available for Android and iOS as of today
mistral.ai/en/news/all-...
06.02.2025 18:07
mistral-small
Mistral Small 3 sets a new benchmark in the “small” Large Language Models category below 70B.
Mistral Small 3 is also available on many partner platforms:
- Ollama: ollama.com/library/mist...
- Kaggle: kaggle.com/models/mistr...
- Fireworks: fireworks.ai/models/firew...
- Together: together.ai/blog/mistral...
And many more soon!
30.01.2025 21:17
Mistral Small 3 Base model
huggingface.co/mistralai/Mi...
30.01.2025 21:17
Mistral Small 3's architecture is optimised for latency while preserving high quality.
30.01.2025 21:17
Mistral Small 3
Apache 2.0, 81% MMLU, 150 tokens/s
We're releasing Mistral Small 3!
- 24B params, 81% MMLU
- Latency optimized: 150 tokens/s
- Competitive with Llama-3.3 70B, Qwen-2.5 32B, GPT-4o mini
- Apache 2.0
mistral.ai/news/mistral...
30.01.2025 21:17
What people are going to do with AGI
26.01.2025 16:30
Screen cap from one of the Thor movies featuring a dark-haired, pale-skinned woman as Thor's sister Hela. She has her hand out, stopping Thor's hammer (Mjölnir) in mid-air. The hammer is labeled "It's basic biology". Hela is labeled "Advanced Biology"
I know, but it's just an application of one of my favorite memes:
21.01.2025 19:07
agent swarm framework aces spatial reasoning test
25.12.2024 16:59
Inventors of flow matching have released a comprehensive guide going over the math & code of flow matching!
Also covers variants like non-Euclidean & discrete flow matching.
A PyTorch library is also released with this guide!
This looks like a very good read! 🔥
arxiv: arxiv.org/abs/2412.06264
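The guide above comes with a PyTorch library; as a feel for the core idea, here is a minimal numpy sketch (my own illustration, not the library's API) of the conditional flow matching objective with a linear interpolation path: interpolate between noise and data, and regress a velocity model onto the path's known velocity.

```python
import numpy as np

def cfm_training_targets(x0, x1, rng):
    """Conditional flow matching with a linear interpolation path.

    Given noise samples x0 and data samples x1, the path is
    x_t = (1 - t) * x0 + t * x1, whose velocity is dx_t/dt = x1 - x0.
    A network v_theta(x_t, t) is regressed onto this target velocity.
    """
    t = rng.uniform(size=(x0.shape[0], 1))  # one random time per sample
    x_t = (1.0 - t) * x0 + t * x1           # point on the interpolation path
    v_target = x1 - x0                      # conditional velocity target
    return x_t, t, v_target

def cfm_loss(v_pred, v_target):
    # Mean squared error between predicted and target velocities.
    return np.mean((v_pred - v_target) ** 2)

rng = np.random.default_rng(0)
x0 = rng.normal(size=(128, 2))        # source: standard Gaussian noise
x1 = rng.normal(size=(128, 2)) + 3.0  # stand-in for "data" samples
x_t, t, v_target = cfm_training_targets(x0, x1, rng)
loss = cfm_loss(np.zeros_like(v_target), v_target)  # dummy zero "model"
```

In a real setup the zero "model" is replaced by a neural network and the loss is minimized by gradient descent; sampling then integrates the learned velocity field from noise to data.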
10.12.2024 08:35
Building Machine Learning Systems for a Trillion Trillion Floating Point Operations
YouTube video by Jane Street
Jane Street, a quant trading firm, has a very good YouTube channel. For comparison, DeepSeek is also a quant trading firm.
They recently published a video on "Building Machine Learning Systems for a Trillion Trillion Floating Point Operations".
Link: www.youtube.com/watch?v=139U...
09.12.2024 17:26
AI Scientists: here is a technology that will automate your grunt work so you can spend more time with your kids
AI Ads: here is a technology that will automate spending time with your kids
03.12.2024 22:35
A dataset of 1 million or 2 million Bluesky posts is completely irrelevant to training large language models.
The primary use case for the datasets that people are losing their shit over isn't ChatGPT; it's social science research and developing systems that improve Bluesky.
28.11.2024 18:57
Arxiv sharing reminder
pdf ❌
abs ✅
26.11.2024 08:42
In fact, statistical malpractice is the main driver of progress in machine learning. At some point, we need to come to terms with this.
22.11.2024 14:40
French Revolution: Cyclists Now Outnumber Motorists In Paris
Official measurements have found that Paris is rapidly becoming a city of cyclists.
READ: “3,337 Parisians were equipped with GPS trackers to record their journeys… for journeys from the outskirts of Paris to the center, the number of cyclists now far exceeds the number of motorists, a huge change from just 5 years ago.”
Evidence of leadership.
www.forbes.com/sites/carlto...
19.11.2024 19:11
Comparison table of various AI models across different benchmarks: Mathvista, MMMU, ChartQA, DocVQA, VQAv2, AI2D, and MM MT-Bench. Models are categorized into Open Weights, Closed, and Unreleased. Key models include Pixtral Large, Llama-3.2 90B, Gemini-1.5 Pro, GPT-4o, Claude-3.5 Sonnet, Llama-3.1 405B, and Grok-2. The table shows measured and reported performance scores, highlighting differences in model capabilities across various tasks. Pixtral Large excels in Mathvista, DocVQA, AI2D and MM MT-Bench benchmarks.
Pixtral Large:
- 123B decoder, 1B vision encoder, 128K sequence length
- Frontier multimodal model
- Maintains text performance of Mistral Large 2
HF weights: huggingface.co/mistralai/Pi...
Try it: chat.mistral.ai
Blog post: mistral.ai/news/pixtral...
18.11.2024 17:56
Two announcement cards from the Mistral AI team, dated November 18, 2024. The first card announces 'Mistral has entered the chat' with a brief description: 'Search, vision, ideation, coding... all yours for free.' The second card announces 'Pixtral Large' with the description: 'Pixtral grows up.' Both cards feature an orange 'Read More' button.
We have 2 new big updates today at Mistral:
- New Le Chat: With canvas, web search, image understanding and generation & more - and free!
- Pixtral Large, our Frontier 124B open weight multimodal model that powers it.
Try it: chat.mistral.ai
Blog post: mistral.ai/news/mistral...
18.11.2024 17:56
Diffusion is spectral autoregression
A deep dive into spectral analysis of diffusion models of images, revealing how they implicitly perform a form of autoregression in the frequency domain.
There seems to be some renewed interest in making this work in the ML/AI space, so I'm here as well!
Here's my latest blog post for good measure, about how diffusion models of images perform autoregression in frequency space: sander.ai/2024/09/02/s...
When I write more, I'll share here as well!
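The blog post's central observation, that Gaussian noise swamps an image's high frequencies first, so iterative denoising proceeds coarse-to-fine like autoregression over frequencies, is easy to check numerically. A small sketch of my own (not the post's code), using a radially averaged power spectrum:

```python
import numpy as np

def radial_power_spectrum(img):
    """Radially averaged power spectrum of a 2D array."""
    f = np.fft.fftshift(np.fft.fft2(img))
    power = np.abs(f) ** 2
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(yy - h // 2, xx - w // 2).astype(int)
    # Average power over rings of (roughly) constant spatial frequency.
    sums = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    return sums / counts

rng = np.random.default_rng(0)
# A smooth, natural-image-like signal: power concentrated at low frequencies.
x = np.linspace(0, 2 * np.pi, 64)
img = np.sin(x)[None, :] + np.cos(x)[:, None]
noisy = img + 2.0 * rng.normal(size=img.shape)

clean_spec = radial_power_spectrum(img)
noisy_spec = radial_power_spectrum(noisy)
# At low frequencies the signal still dominates the noisy spectrum; at high
# frequencies the flat noise spectrum sits far above the clean one.
```

Comparing `clean_spec` and `noisy_spec` at a high-frequency ring shows the noise floor burying the signal there, which is the frequency-domain picture the post builds on.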
15.11.2024 18:57
Quick thread in response to a question on token packing practices when pretraining LLMs!
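The thread body isn't captured in this log, so as context: the most common baseline for token packing is to concatenate tokenized documents with an EOS separator and slice the stream into fixed-length training sequences. A minimal sketch with illustrative token ids and a toy sequence length (real pretraining uses e.g. 2048 to 8192):

```python
EOS_ID = 0    # illustrative end-of-document token id
SEQ_LEN = 8   # toy value; real pretraining sequence lengths are much larger

def pack_documents(docs, seq_len=SEQ_LEN, eos_id=EOS_ID):
    """Concatenate tokenized docs with an EOS separator, then slice the
    stream into fixed-length sequences. The tail that doesn't fill a full
    sequence is dropped (a common, simple choice)."""
    stream = []
    for doc in docs:
        stream.extend(doc)
        stream.append(eos_id)  # marks the document boundary in the stream
    n_full = len(stream) // seq_len
    return [stream[i * seq_len:(i + 1) * seq_len] for i in range(n_full)]

docs = [[5, 6, 7], [8, 9], [10, 11, 12, 13, 14]]
batches = pack_documents(docs)  # every sequence has exactly SEQ_LEN tokens
```

Packing avoids wasting compute on padding, at the cost of sequences that straddle document boundaries; variants differ in whether attention is masked across those boundaries.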
07.11.2024 18:21