
Will ~ Data Center Drainer

@wboler

Democratic Socialist, Angry, AuDHD, Veteran, PhD Dropout, AI, Math, Engineering, Gamer, Thinker, Aging, Griping, Lefty, Snowflake

2,049
Followers
1,195
Following
8,448
Posts
18.11.2024
Joined

Latest posts by Will ~ Data Center Drainer @wboler

Show me on the doll where the internet threatened you AES

07.03.2026 01:25 👍 1 🔁 1 💬 0 📌 0

Even if I get barred from using AI, jokes on you, I can still code without it fuckers

07.03.2026 01:24 👍 0 🔁 0 💬 0 📌 0
Inside a NEW AI Cluster - Tour with NVIDIA B200 (YouTube video by ServeTheHome)

Here's a happy tour of an AI data center in Columbus, Ohio. There's a part where you can hear how loud it is. 32 kW per rack is extreme compute density, and it will only get worse.

youtu.be/N5AJJ0tAoxc?...

06.03.2026 19:19 👍 0 🔁 0 💬 0 📌 0

There’s also a fun part where he’s asked about getting his company into crypto and bitcoin and he’s like hey buddy we got into *bitcoin*, not *crypto*. Sure man that rocks.

06.03.2026 17:52 👍 229 🔁 5 💬 4 📌 0

Help, we made a democratic platform and can't figure out why it turned socialist

06.03.2026 18:33 👍 1 🔁 0 💬 0 📌 0

Probably worse than that, because at some point you can probably reason with Charlie

06.03.2026 18:31 👍 2 🔁 0 💬 0 📌 0

My blood pressure went up 10 points after a social media post.

06.03.2026 18:18 👍 1 🔁 0 💬 0 📌 0
hallucination noun (COMPUTERS)
 
[ C ]
false information that is produced by an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human):
If the chatbot is used in the classroom as a teaching aid, there is a risk that its hallucinations will enter the permanent record.
Because large language models are designed to produce coherent text, their hallucinations often appear plausible.
She discovered that the articles cited in the essay did not exist, but were hallucinations that had been invented by the AI.
 
[ U ]
the fact of an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) producing false information:
The system tends to make up information when it doesn’t know the exact answer – an issue known as hallucination.
Is it possible to solve the problem of AI hallucination?
(Definition of hallucination from the Cambridge Advanced Learner's Dictionary & Thesaurus Β© Cambridge University Press)


We can try to come up with other terms, but "hallucination" is the term that stuck: it has been published, has been used for decades, and is now making its way outside of LLM circles. It's already in Cambridge's dictionary, so it's a moot point.

06.03.2026 18:17 👍 2 🔁 0 💬 1 📌 0

Hallucination is actually the correct way to think about this, because the models are just making stuff up based on probability via their generative process. They're drawing lines in a latent space, connecting grains of sand of data, and in between those points are hallucinations.

06.03.2026 18:17 👍 0 🔁 0 💬 2 📌 0
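The "drawing lines in latent space" picture above can be made concrete with a toy sketch. Everything here is invented for illustration (the vectors, the names); no real model or API is involved. The point is only that interpolating between two learned points yields a perfectly valid coordinate that corresponds to no real datum:

```python
import numpy as np

# Toy sketch of interpolation in a latent space. These vectors are made up;
# a real generative model would learn such points from its training data.
cat = np.array([0.9, 0.1, 0.2])  # latent point for one real example
dog = np.array([0.8, 0.2, 0.3])  # latent point for another real example

# A generative model will happily decode ANY point on the line between
# them, including points that correspond to no training example at all.
alpha = 0.5
between = (1 - alpha) * cat + alpha * dog

print(between)  # a valid point in the space, but pure invention as "data"
```

The decoded output of such an in-between point is exactly the "connecting grains of sand" idea: coherent by construction, grounded by nothing.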

It's basically like saying "my dream was wrong about the invasion of Normandy." No, it was just a dream and has no connection to reality, which is basically what LLMs are doing. The generative models are putting pieces together to interpolate a space, but when they extrapolate, they fail.

06.03.2026 18:17 👍 0 🔁 0 💬 1 📌 0
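The interpolate-vs-extrapolate failure in the post above is easy to demo with synthetic data. This is nothing LLM-specific, just a generic flexible curve fit (all numbers invented): it tracks the data well inside the training range and runs off wildly outside it.

```python
import numpy as np

# Synthetic demo: a flexible model interpolates well inside its training
# range and fails badly outside it. Toy data, not any real LLM.
rng = np.random.default_rng(0)
x_train = np.linspace(-1, 1, 50)
y_train = np.sin(np.pi * x_train) + 0.05 * rng.standard_normal(50)

# A degree-9 polynomial: plenty of capacity to fit the training region.
coeffs = np.polyfit(x_train, y_train, deg=9)

inside = np.polyval(coeffs, 0.5)   # inside [-1, 1]: close to sin(pi/2) = 1
outside = np.polyval(coeffs, 3.0)  # outside the range: runs off wildly

print(f"inside:  {inside:.3f}")
print(f"outside: {outside:.1f}")
```

Inside the training interval the fit is nearly exact; at x = 3 the same polynomial is off by orders of magnitude, with no internal signal that anything went wrong.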

the existing information manifold, i.e., the model leaves its surface, it "hallucinates" stuff. It's not necessarily right or wrong, because it is specifically designed to model the probability space of the data set, and the data set itself may not map well to reality, either.

06.03.2026 18:17 👍 0 🔁 0 💬 1 📌 0

Part of my understanding with hallucinations dives into the idea of information and latent space manifolds. The model attempts to map the information space to its internal representation, but where those voids or misrepresentations exist within the latent space which does not match with...

06.03.2026 18:17 👍 0 🔁 0 💬 1 📌 0
Hallucination (artificial intelligence) - Wikipedia

Sorry, I wasn't trying to dunk.

I worked in this area and was studying hallucinations for my PhD. We used the term hallucination in a different sense; it's not something that tech bros made up to feel special, is all I'm saying.

en.wikipedia.org/wiki/Halluci...

06.03.2026 18:17 👍 1 🔁 0 💬 1 📌 0

Imagine that.... a WHITE POLITICIAN wants to put a POLLUTING DIESEL POWERED DATA CENTER in a BLACK NEIGHBORHOOD. Color me surprised....

bsky.app/profile/mrwi...

06.03.2026 16:05 👍 5 🔁 2 💬 1 📌 0

The interesting thing about AI is how committed executives are to using it after it measurably fails catastrophically at its goals. Like there seems to be no limit to how badly it can fuck up and still be heralded as the future of technology

06.03.2026 14:14 👍 257 🔁 58 💬 15 📌 9
Hogsett’s economic development org supports Metrobloks data center The mayor chairs the board of the organization backing a proposed data center in Martindale Brightwood.

Several tech companies want to build data centers in Indianapolis.

Mayor Joe Hogsett has taken a neutral position on data centers, but a letter from his economic development organization in support of a proposed data center in Indianapolis sends a different signal 📑

06.03.2026 14:10 👍 1 🔁 2 💬 0 📌 1

Hogsett, retire bitch.

06.03.2026 15:04 👍 5 🔁 4 💬 0 📌 0

They fund it in Gaza. They're doing it to Venezuelan fishermen. They're doing it in Iran. They're doing it in the streets and in concentration camps to immigrants in the US. They've done it to US citizen protestors on the streets in Minnesota.

There are no limits to what they are capable of.

06.03.2026 13:31 👍 914 🔁 221 💬 4 📌 2
The Guys Who Invented Q-Tips (YouTube video by Ryan George)

youtu.be/BZ7IEWwQ4Cg?...

06.03.2026 02:01 👍 6 🔁 1 💬 2 📌 0

I bought one of those ear cameras to check to see if my ear had impacted wax. No, both were perfectly clear. I've been using q tips for over 30 years, they can eat my wax

06.03.2026 15:52 👍 0 🔁 0 💬 0 📌 0

Ear doctor: Don’t use Q-Tips
Me already imagining putting a screw driver I found in the driveway in my ear when I get home: I womt

06.03.2026 00:50 👍 3559 🔁 485 💬 36 📌 11

I need people to understand, this technology has no real way to know when it's right or wrong. It's not rationalizing. It's just predicting which word is most likely to come next. Then they use a bunch of similar technology to analyze that output. Hallucination is all it's ever doing.

06.03.2026 15:47 👍 0 🔁 0 💬 0 📌 0
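The "just predicting the next word" claim above can be sketched with a bare-bones sampler. All probabilities here are invented for illustration, not taken from any real model; the point is that the loop only ever consults likelihood, never truth.

```python
import random

# Toy sketch (all numbers invented): the model only has a probability
# distribution over next tokens. Nothing in this loop checks facts.
next_token_probs = {
    "Paris": 0.90,   # plausible continuation of "The capital of France is"
    "Lyon": 0.07,    # also plausible-sounding, and wrong
    "Berlin": 0.03,  # still assigned nonzero probability
}

def sample(probs, rng):
    """Draw one token in proportion to its probability."""
    r = rng.random()
    cum = 0.0
    for token, p in probs.items():
        cum += p
        if r < cum:
            return token
    return token  # fallback for floating-point edge cases

rng = random.Random(0)
picks = [sample(next_token_probs, rng) for _ in range(1000)]

# The wrong answers WILL be sampled sometimes; the model has no mechanism
# that flags them as false, they are just assigned lower probability.
print(picks.count("Paris"), picks.count("Lyon"), picks.count("Berlin"))
```

Scaled up enormously and wrapped in training tricks, real LLM decoding is still this shape: sampling from a learned distribution, with correctness nowhere in the objective.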

Hallucination is an actual term we use in the field, when the model incorrectly interpolates the information space and leaves the manifold. It's different from "getting it wrong" because there is no mechanism in the LLM to get it right or wrong.

06.03.2026 15:45 👍 0 🔁 0 💬 2 📌 0

I had a pet rock as a kid and it died. Devastated me so much I tried to bury it in the yard but I found a mass grave of other pet rocks. I realized I must’ve stumbled upon some crime, or tragedy at least. Perhaps a battle site from before rocks were domesticated?

06.03.2026 14:57 👍 92 🔁 11 💬 1 📌 1

I don’t think LLMs would take off without this kind of userbase, because they’re the ones talking to ChatGPT and Grok all day every day because they think they’re reaching Jesus levels of insight. This is why the technology gets called a scam, because the industry is definitely scamming these people

06.03.2026 14:06 👍 570 🔁 42 💬 10 📌 2

Once again it pisses me off to no end to see what these techbro assholes have done to legitimate subfields of Computer Science.

LLMs should just be one of many pieces of AI/ML that y'all who are not computer scientists simply don't ever have to think about.

06.03.2026 15:26 👍 10 🔁 1 💬 2 📌 0

It will probably be released at the same time in the same data dump, so you're better off betting in alphabetical order

06.03.2026 15:40 👍 0 🔁 0 💬 0 📌 0

Join my new predictive market platform Douchero to place a wager on which one of these guys gets caught with CSAM first.

06.03.2026 10:18 👍 1005 🔁 172 💬 21 📌 2

My dad was one of the first people to realize the internet could be useful for organizing and had their entire org on CompuServe when it was still DOS-based! I was on CompuServe forums as a 7 year old. Like, I was basically groomed to loooooove tech and I still hate this GenAI shit.

06.03.2026 04:34 👍 379 🔁 27 💬 10 📌 0

My dad was an early adopter, and he was hired at his first job because he could figure out tech that nobody in Indianapolis knew. He would bring home computer parts to teach me how to build them. I worked as an AI researcher, and even I hate this shit.

06.03.2026 15:34 👍 1 🔁 0 💬 0 📌 0