Training on AI output should clearly be legal, just as training on published books or public social network content is. Fair use and social benefit go both ways
Thanks for reporting it, but I felt their view that it was an "attack" was stated as fact, when it can also be seen as very similar to their own crawling and scraping methods, just applied to different content
AI companies scraped the whole internet, and even made some websites too expensive to run as a result. They use Google search results to improve their prompt inputs.
We should challenge the view that their tools' output can't be used by anyone to train models, as it is logical that we should be able to
@rebeccabellan.bsky.social hi! I'm a bit surprised that you took the framing of companies using Anthropic AI generation to train their models as attacks without challenging it. Anthropic trained on downloaded and scanned books, copyrighted materials, when AI answers bear no copyright at all.
Test it here: juliendorra.com/blobkeyboard
Students using the Blob Keyboard (2005) simulator to learn about the iPhone OS design story and the strange paths of prototyping
@kocienda.bsky.social
Invent machines for imagining. Sway between control and the unexpected. SLOP Gala lets you create an outfit no one has seen before you. There are millions of fits to craft. Some might be ridiculous. Others, brilliant. And you decide which is which! Try it: slopgala.com
Alright, I'll say it: I created the best avant-garde fashion AI
And it's super simple to use: you pick inspos you like, mix them, and compose an outfit never seen before, by anyone!
slopgala.com
I think I created the best avant-garde fashion AI, and it's super easy to use: pick inspos you like, mix them, and craft an outfit that nobody has ever seen!
slopgala.com
Are you ready for the SLOP Gala?
The Fit / The Look at the SLOP Gala. Did you get your invitation?
Limitation: the system disk isn't saved between sessions. Use screenshots + timelapse videos to keep an archive of your work.
Thanks @persistent.info for the emulator and embed API. Infinite Mac infinitemac.org has been an incredible resource! I'm excited to use it more in workshops and lectures
- Save clean screenshots of the Macintosh display. Save and name your drawing in MacPaint; the screenshots use that name (via in-browser OCR).
- Save a timelapse of your drawing as a super-light MP4 (in-browser ffmpeg). The easiest MacPaint timelapse workflow I know (I used it to make the video in this post.)
Macintosh emulated display captured at real size on a MacBook screen
- Real-size display option to match the 9" Macintosh CRT screen. Uses a custom list to guess your physical display size, with calibration for purists. Great to feel the constraints Bill Atkinson, Jeff Raskin, and Susan Kare faced when designing the UI.
(even more below!)
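The real-size option boils down to one conversion. A minimal sketch, assuming the original 9" Macintosh CRT showed its 512×342 pixels at roughly 72 DPI, and that `hostPpi` comes from a lookup of known laptop panels (the tool's actual list and calibration step are not reproduced here); `dpr` is the browser's `devicePixelRatio`:

```typescript
// Classic Macintosh display: 512×342 px at ~72 pixels per inch.
const MAC_DPI = 72;
const MAC_W = 512;
const MAC_H = 342;

// Compute the CSS-pixel size that makes the emulated screen occupy
// the same physical inches as the original 9" CRT.
function realSizeCss(hostPpi: number, dpr: number): { width: number; height: number } {
  const cssPxPerInch = hostPpi / dpr;   // CSS pixels covering one physical inch
  const scale = cssPxPerInch / MAC_DPI; // CSS px per original Macintosh px
  return { width: MAC_W * scale, height: MAC_H * scale };
}
```

For example, a hypothetical 254 PPI Retina panel with `dpr = 2` gives 127 CSS px per inch, so the Macintosh display renders at about 903 CSS px wide.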
It uses the insanely great Infinite Mac as the embedded emulator, and adds useful tools around it:
- One-click shortcut to open MacPaint
- Crisp fullscreen by default, perfect for the original 1-bit UI.
(more below!)
Itβs also pure fun: draw in a retro way and explore the creative constraints of 1-bit illustrations.
MacPaint is a key step in the history of UX (and visual design!)
I built a super simple way to use MacPaint 1.0 in workshops and lectures. Try it: juliendorra.com/macintosh/
2-minute pottery in MacPaint 1.0.
Time lapse created using juliendorra.com/macintosh/
In Accelerando (2005) by @cstross.bsky.social, the main character uses smart glasses at the start and has the same disconnect with people. He's an early adopter, tech-obsessed. It costs him something. Very relevant book.
Captures from @mkbhd.com's review of Meta's Ray-Ban Display
You should care about Kare
(because she's the first digital design professional)
Here are three people who are present every day in your life: Jef Raskin, Bill Atkinson, and Bas Ording. They make you move in very specific ways. Do you know why?
the full book: vintageapple.org/macprogrammi...
Artificial Intelligence programming book from… 1986
I wanted to see if this video model would keep the grain, blooming, and specific color quality of old sensors. It did. What's the benefit? It lets you define the style visually rather than textually, which is faster and better for people who have already mastered photo and video.
What if the Atomium were a giant rotating sculpture? Shot with a 1997 camera, animated with a 2025 AI
Black Lava, made with Live Code Lab
AI and video games: using light, fast AI models to get impactful results
- Combining the game's ingredients into a detailed monster description => GPT5 Thinking (OK, a heavy model)
- Monster image gen => Flux Schnell, optimized for speed, <2s generation
- Monster video gen => Luma Ray3 draft, fast, 20-30s
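The three stages above chain into one pipeline. A sketch under stated assumptions: `generateText`, `generateImage`, and `generateVideo` are hypothetical stand-ins for the real model calls (GPT5 Thinking, Flux Schnell, Luma Ray3 draft); the actual API clients and parameters are not shown in the posts.

```typescript
// Hypothetical stubs standing in for the real model APIs.
async function generateText(prompt: string): Promise<string> {
  return `A detailed monster combining: ${prompt}`; // stand-in for GPT5 Thinking
}
async function generateImage(description: string): Promise<string> {
  return `image-of(${description})`; // stand-in for Flux Schnell (<2 s)
}
async function generateVideo(imageUrl: string): Promise<string> {
  return `video-of(${imageUrl})`; // stand-in for Luma Ray3 draft (20-30 s)
}

async function makeMonster(ingredients: string[]): Promise<{ image: string; video: string }> {
  // 1. Heavy text model turns the game's ingredients into a rich description.
  const description = await generateText(ingredients.join(", "));
  // 2. Small, fast image model renders the monster from that description.
  const image = await generateImage(description);
  // 3. Fast draft video model animates the still.
  const video = await generateVideo(image);
  return { image, video };
}
```

The design point is that only the first step uses a heavy model; the image and video stages trade quality for latency so results stay interactive.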
Create the problem, the better to announce the solution
By the way, the images are all generated with Flux Schnell, a very fast (<2 s) and quite small open-weight image-gen model that can give you great results if prompted right. In this case, the prompts are themselves generated and decided by the text model based on a mix of traits.
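To show the shape of that trait-to-prompt step: a minimal sketch, assuming traits are plain strings. In the actual system the text model writes the final prompt; `composePrompt` here is a hypothetical template stand-in, not the model itself.

```typescript
// Hypothetical stand-in: fold a mix of traits into one image prompt.
function composePrompt(traits: string[]): string {
  const mix = traits.join(" + ");
  return `A monster combining ${mix}, detailed game art, full body`;
}
```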