To learn more, read Eliezer and Nate's recent NYT bestseller: ifanyonebuildsit.com
@intelligence.org
For over two decades, the Machine Intelligence Research Institute (MIRI) has worked to understand and prepare for the critical challenges that humanity will face as it transitions to a world with artificial superintelligence.
Senator Sanders met with Eliezer Yudkowsky, Nate Soares, Daniel Kokotajlo, and Jeffrey Ladish to discuss the extinction threat posed by the race to build superhuman AI systems.
Get yours here: www.planetadelibros.us/libro-si-alg...
Now available in Spanish: "Si alguien la crea, todos moriremos" by Eliezer Yudkowsky and @so8res.bsky.social.
The New York Times bestseller on why artificial superintelligence is a threat to humanity, and why the race to create it must stop.
On BBC, MIRI CEO @malo.online discusses the dispute between Anthropic and the DoW:
"I really worry about the big questions of how we'll coordinate to set regulation, and potentially coordinate internationally[...] This is not a good first test."
Learn more about the proposal here: techgov.intelligence.org/blog/new-rep...
This week at the IASEAI conference, MIRI researcher @pbarnett.bsky.social discusses how and why the Technical Governance Team's proposed international agreement could halt the development of superintelligence.
Check out the full interview here:
www.cnn.com/2026/02/26/t...
CNN's Paula Newton asks: Which is the greater threat? Governments racing to militarize AI, or companies racing to monetize it?
MIRI President Nate Soares says there are dangers from both, but also a third danger: AI systems themselves could become smart enough to defy control.
I was on ABC News Live last week!
My first live interview. I think it went pretty well, especially considering I only had ~20 mins notice :p
Final Update: From ~$450k earlier today, we're now down to just over $250k left in unclaimed matching funds!
4 hours left to go, and by golly it looks like we've got a real shot at securing all the matching.
Thanks everyone! Happy New Year!
With just under 7 hours to go, we're now down below $300k of unclaimed matching funds!
Update 2: We're down to ~$450k left of unclaimed matching funds, with just over 12 hours to go!
Thanks to all those who stepped up in the last couple of days to close the gap by ~$500k. ❤️
PSA: It's worth reaching out to old donors, because sometimes this happens.
Update: We've received over $250k since this was posted.
~$700k in matching funds remaining.
And of course, to everyone who's already donated, including the >100 first-time MIRI donors who gave during the fundraiser, thank you!
If you're looking for other ways to help, sharing this thread, or quote-posting it with why you chose to support us, would mean a lot.
If you'd like to help us secure as much of the remaining matching funds as we can, we'd be grateful for your support.
(If you don't see your preferred donation method, including cryptocurrencies, reach out to us at development@intelligence.org.)
Why is this real counterfactual matching?
The funds come from a Survival and Flourishing Fund matching pledge, not a traditional grant.
You can learn more about SFF's matching pledges here:
Donations to MIRI before Jan 1 are high-leverage. We've got ~$1.6M in 1:1 matching from SFF, over half of which has yet to be claimed!
This is real counterfactual matching: whatever doesn't get matched by the end of Dec 31, we don't get. 🧵
“If Anyone Builds It, Everyone Dies” was recently added to the New Yorker's “The Best Books of the Year So Far” list!
newyorker.com/best-books-2...
“If Anyone Builds It, Everyone Dies” coauthor Nate Soares recently chatted with Major Garrett on @cbsnews.com.
@hankgreen.bsky.social rarely does interviews or 30+ min long videos.
His latest video, an hour+ long interview with Nate Soares about “If Anyone Builds It, Everyone Dies,” is a banger. My new favorite!
www.youtube.com/watch?v=5CKu...
In the Bay Area? Come join Nate Soares, in conversation with Semafor Tech Editor Reed Albergotti, about Nate's NYT bestselling book “If Anyone Builds It, Everyone Dies.”
🗓️ Tuesday Oct 28 @ 7:30pm at Manny's in SF.
Get your tickets:
Academy Award-winning director Kathryn Bigelow is reading “If Anyone Builds It, Everyone Dies.”
From an interview in The Guardian by Danny Leigh: www.theguardian.com/film/2025/oc...
“The book uses parables, very well told, to argue that evolutionary processes are not predictable, at least not easily. [...] I came away far more concerned than I had been before opening the book.”
www.forbes.com/sites/billco...
Today's episode of The Ezra Klein Show.
The researcher Eliezer Yudkowsky argues that we should be very afraid of artificial intelligence's existential risks.
www.nytimes.com/2025/10/15/o...
youtu.be/2Nn0-kAE5c0?...
Michael talks with Nate Soares, co-author of "If Anyone Builds It, Everyone Dies," about the risks of advanced artificial intelligence. Soares argues that humanity must treat AI risk as seriously as pandemics or nuclear war.
Hear the #bookclub #podcast 🎧 https://loom.ly/w1hBbWM