What if we could use AI models like Llama 3.2 or Mistral 7B in the browser with JupyterLite? 🤯
Still at a very early stage of course, but making some good progress!
Thanks to WebLLM, which brings hardware-accelerated language model inference to web browsers via WebGPU 🚀
17.02.2025 08:00
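For context, a minimal sketch of what loading a model through WebLLM looks like. This is an assumption based on WebLLM's OpenAI-compatible chat API, not the JupyterLite integration itself, and the model ID below is taken to be one of WebLLM's prebuilt options:

```typescript
// Runs entirely in the browser; requires a WebGPU-capable browser.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Downloads and compiles the model into the browser (cached after the first run).
// "Llama-3.2-1B-Instruct-q4f16_1-MLC" is an assumed ID from WebLLM's model list.
const engine = await CreateMLCEngine("Llama-3.2-1B-Instruct-q4f16_1-MLC");

// OpenAI-style chat completion, executed locally on the GPU via WebGPU.
const reply = await engine.chat.completions.create({
  messages: [{ role: "user", content: "Say hello from the browser!" }],
});
console.log(reply.choices[0].message.content);
```

No server round-trip is involved: inference happens on the user's own GPU, which is what makes pairing it with a fully in-browser kernel like JupyterLite plausible.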
I asked a Stack Overflow question recently and got back a vague answer that looked LLM-generated. And that was after putting up a bounty; otherwise no one answers.
It's going to be fun 10 years from now, when models will have ingested all of this as training data.
04.12.2024 10:18
Thoughts on moving from Twitter/X to BlueSky
Some quick thoughts on moving from Twitter/X to BlueSky and how I'll try to use social media after being burned once by Twitter.
I'm going to try using BlueSky more reliably for a while. Here are a few thoughts that are guiding my engagement here, and hopefully learning from our collective experience over at Twitter.
chrisholdgraf.com/blog/2024/bl...
22.11.2024 18:32