Show me on the doll where the internet threatened you AES
Even if I get barred from using AI, jokes on you, I can still code without it fuckers
Here's a happy tour of an AI data center in Columbus, Ohio. There's a part where you can hear how loud it is. 32 kW per rack is extreme compute density, and it will only get worse.
youtu.be/N5AJJ0tAoxc?...
There's also a fun part where he's asked about getting his company into crypto and bitcoin and he's like hey buddy we got into *bitcoin*, not *crypto*. Sure man that rocks.
Help, we made a democratic platform and can't figure out why it turned socialist
Probably worse than that, because at some point you can probably reason with Charlie
My blood pressure went up 10 points after a social media post.
hallucination noun (COMPUTERS) [ C ] false information that is produced by an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human): If the chatbot is used in the classroom as a teaching aid, there is a risk that its hallucinations will enter the permanent record. Because large language models are designed to produce coherent text, their hallucinations often appear plausible. She discovered that the articles cited in the essay did not exist, but were hallucinations that had been invented by the AI. [ U ] the fact of an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) producing false information: The system tends to make up information when it doesn't know the exact answer, an issue known as hallucination. Is it possible to solve the problem of AI hallucination? (Definition of hallucination from the Cambridge Advanced Learner's Dictionary & Thesaurus © Cambridge University Press)
We can try to come up with other terms, but "hallucination" is the term that's stuck: it has been published, has been used for decades, and is starting to be understood outside of LLM research. It's already in Cambridge's dictionary, so it's a moot point.
Hallucination is actually the correct way to think about this, because the models are just making stuff up based on probability via their generative process. They're drawing lines in a latent space, connecting grains of sand of data, and in between those points are hallucinations.
It's basically like saying "my dream was wrong about the invasion of Normandy." No, it was just a dream and has no connection to reality, which is basically what LLMs are actually doing. The generative models put pieces together to interpolate a space, but when they extrapolate they fail.
Part of my understanding of hallucinations draws on the idea of information and latent-space manifolds. The model attempts to map the information space to its internal representation, but where voids or misrepresentations exist within the latent space that don't match the existing information manifold, i.e., where the model leaves its surface, it "hallucinates" stuff. That's not necessarily right or wrong, because the model is specifically designed to capture the probability space of the data set, and the data set itself may not map well to reality, either.
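The "leaving the manifold" idea above can be sketched with a toy example of my own (not from the thread): take the unit circle as a stand-in for a data manifold in a 2-D latent space. Every real sample lies on the circle, but a straight-line interpolation between two samples falls off it.

```python
import math

# Toy sketch, my own assumption of a "data manifold": the unit circle
# in 2-D latent space. Real data points sit exactly on the circle.
def on_circle(theta):
    return (math.cos(theta), math.sin(theta))

a = on_circle(0.0)          # a real data point: (1, 0)
b = on_circle(math.pi / 2)  # another real data point: (0, 1)

# Linear interpolation between them, the midpoint of the chord:
mid = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
norm = math.hypot(*mid)

# norm is ~0.707, not 1: the interpolated point is off the manifold.
# It still "looks like" a latent point, which is the hallucination analogy.
print(norm)
```

Nothing about the interpolated point flags it as invalid; it only looks wrong if you know the manifold, which the sampling process doesn't consult.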
Sorry, I wasn't trying to dunk.
I worked in this area, and was studying hallucinations for my PhD. We used the term hallucination in a different sense, it's not something that tech bros made up to feel special, is all I'm saying.
en.wikipedia.org/wiki/Halluci...
Imagine that.... a WHITE POLITICIAN wants to put a POLLUTING DIESEL POWERED DATA CENTER in a BLACK NEIGHBORHOOD. Color me surprised....
bsky.app/profile/mrwi...
The interesting thing about AI is how committed executives are to using it after it measurably fails catastrophically at its goals. Like there seems to be no limit to how badly it can fuck up and still be heralded as the future of technology
Several tech companies want to build data centers in Indianapolis.
Mayor Joe Hogsett has taken a neutral position on data centers, but a letter from his economic development organization in support of a proposed data center in Indianapolis sends a different signal
Hogsett, retire bitch.
They fund it in Gaza. They're doing it to Venezuelan fisherman. They're doing it in Iran. They're doing it in the streets and in concentration camps to immigrants in the US. They've done it to US citizen protestors on the streets in Minnesota.
There are no limits to what they are capable of.
I bought one of those ear cameras to check to see if my ear had impacted wax. No, both were perfectly clear. I've been using q tips for over 30 years, they can eat my wax
Ear doctor: Don't use Q-Tips
Me already imagining putting a screwdriver I found in the driveway in my ear when I get home: I won't
I need people to understand: this technology has no real way to know when it's right or wrong. It's not rationalizing. It's just predicting which word is most likely to come next, and then they use a bunch of similar technology to analyze that output. Hallucination is all it's ever doing.
Hallucination is an actual term we use in the field, when the model incorrectly interpolates the information space and leaves the manifold. It's different from "getting it wrong" because there is no mechanism in the LLM to get it right or wrong.
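The "just predicting the next word" point can be caricatured in a few lines. This is my own minimal sketch (a hand-built probability table, not anyone's actual model, and "Freedonia" is a made-up token): the sampler picks whatever is likely, and there is no step anywhere that checks the output against reality.

```python
import random

# Hand-written next-token table, purely illustrative. The keys are
# two-word contexts; the values are next-token probabilities.
next_token_probs = {
    ("the", "capital"): {"of": 0.9, "city": 0.1},
    # A fictional country is assigned the same weight as a real one:
    ("capital", "of"): {"France": 0.5, "Freedonia": 0.5},
}

def sample_next(context):
    """Sample one next token from the table; no truth-checking anywhere."""
    dist = next_token_probs[context]
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights)[0]

# The sampler emits the fiction about as often as the fact, because
# likelihood under the table is its only criterion.
print(sample_next(("capital", "of")))
```

Everything the table emits is produced by the same mechanism; "right" and "wrong" are labels we apply from outside, which is why "hallucination" fits better than "error."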
I had a pet rock as a kid and it died. Devastated me so much I tried to bury it in the yard but I found a mass grave of other pet rocks. I realized I must've stumbled upon some crime, or tragedy at least. Perhaps a battle site from before rocks were domesticated?
I don't think LLMs would take off without this kind of userbase, because they're the ones talking to ChatGPT and Grok all day every day because they think they're reaching Jesus levels of insight. This is why the technology gets called a scam, because the industry is definitely scamming these people
Once again it pisses me off to no end to see what these techbro assholes have done to legitimate subfields of Computer Science.
LLMs should just be one of many pieces of AI/ML that y'all who are not computer scientists simply don't ever have to think about.
It will probably be released at the same time in the same data dump, so your best bet is alphabetical order
Join my new predictive market platform Douchero to place a wager on which one of these guys gets caught with CSAM first.
My dad was one of the first people to realize the internet could be useful for organizing and had their entire org on compuserve when it was still DOS-based! I was on CompuServe forums as a 7 year old. Like, I was basically groomed to loooooove tech and I still hate this GenAI shit.
My dad was an early adopter, and he was hired at his first job because he could figure out tech that nobody in Indianapolis knew. He would bring home computer parts to teach me how to build them. I worked as an AI researcher, and even I hate this shit.