#bitnet
Posts tagged #bitnet on Bluesky
Run a 100B Model on CPUs?! Microsoft

[JP] A 100B model running on CPUs?! Microsoft's 1-bit LLM inference framework "bitnet.cpp" is revolutionary
[EN] Run a 100B Model on CPUs?! Microsoft

ai-minor.com/blog/en/2026-03-12-17733...

#BitNet #Microsoft #ローカルAI #LLM #AI #Tech


Just ran setup_env.py and it compiled the BitNet‑b1.58‑2B‑4T C++ backend with CMake in seconds. Ready for local inference on your machine—no Hugging Face hassle. Dive into the details! #BitNet #PythonCMake #LocalInference

🔗 aidailypost.com/news/python-...


TicketLLM: Next-Generation Sparse and Low-bit Transformers with Supermask-based Method

Yasuyuki Okoshi, Hikari Otsuka, Daichi Fujiki, Masato Motomura

Action editor: Brian Kingsbury

https://openreview.net/forum?id=sE69HKykQw

#bitnet #supermasks #supermask

Awakari App

The Era of 1.58-Bit Computing: Introducing Trion Core. Continue reading on Medium »

#artificial-intelligence #bitnet #llm #machine-learning

The bitnet demo cannot process the word penchant. It reads it as "Pellant"


Check out the official bitnet demo bitnet-demo.azurewebsites.net

it is hilarious

#LLM #AI #BitNet

Deploying Microsoft Bitnet 1.58-bit 2 billion parameter LLMs on AWS Lambda

"Microsoft Bitnet 1.58-bit LLMs on AWS Lambda" by Manu Mishra

#lambda #llm #bitnet #1-bit-llm #quantized

Original post on franksworld.com

1-Bit LLM: The Most Efficient LLM Possible? Ever found yourself window-shopping, not for clothes or gadgets, but for the latest open-source AI models? If you have, you’ll quickly realize that own...

#AI #LargeLanguageModels #AIEfficiency #AIModels #bitnet #ComputationalResources […]

Original post on infosec.exchange

Weird thing is that I am on a martial arts mailing list (originally created to mock the newbie rec.martial-arts poseurs) that I have been on since 1987 and I am by far the youngest member of the group. I have no idea why they invited me, weird old croaks probably just wanted a youngster […]


🔬🤯 1-bit models are a revolution in AI! Neural network weights are stored with just 1 bit instead of 32 or 16. That means up to 16x smaller size and huge energy savings, while preserving the quality of classic LLMs. The future of AI is lightweight! 🚀 #AI #LLM #quantization #BitNet
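The "up to 16x smaller" claim in the post above can be checked with simple arithmetic. A rough sketch, using illustrative figures (a 2B-parameter model; these are assumptions, not measurements of any released model):

```python
# Back-of-the-envelope check of how much smaller 1-bit / 1.58-bit weights
# are versus FP16. Illustrative arithmetic only, not measured model sizes.
def model_size_gb(params: int, bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes."""
    return params * bits_per_weight / 8 / 1e9

params = 2_000_000_000                   # a 2B-parameter model (assumed)
fp16    = model_size_gb(params, 16)      # 16-bit baseline
ternary = model_size_gb(params, 1.58)    # 1.58-bit ("ternary") weights
binary  = model_size_gb(params, 1)       # pure 1-bit weights

print(f"FP16:     {fp16:.2f} GB")
print(f"1.58-bit: {ternary:.3f} GB ({fp16 / ternary:.1f}x smaller)")
print(f"1-bit:    {binary:.2f} GB ({fp16 / binary:.0f}x smaller)")
# FP16 -> 4.00 GB, 1.58-bit -> 0.395 GB (10.1x), 1-bit -> 0.25 GB (16x)
```

Against a 32-bit baseline the ratios double, which is where the headline "16x" figure comes from for 1.58-bit weights.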

Microsoft’s BitNet Outperforms AI Giants With 400MB Memory
Discover how Microsoft’s BitNet surpasses Llama and Gemma using just 400MB and no GPU, revolutionizing AI on standard devices.

Microsoft’s BitNet Crushes AI Giants Using Just 400MB and No GPU #ArtificialIntelligence #TechNews #Microsoft #BitNet #MachineLearning #GreenTech
www.squaredtech.co/microsofts-b...


When minimalism becomes the key to thinking (and helping others think) better: my reflection on BitNet, the AI model that whispers where the others shout. #AIdemocratica #AIlowpower #AIminimalista #AIpertutti #BitNet #BitNetMicrosoft #bitnetcpp #efficienzacomputazionaleAI
www.corrierenerd.it/bitnet/

Original post on medium.com

The 1-Bit Revolution: A New Era in LLM Efficiency with BitNet b1.58 2B4T. Introduction and Motivation...

medium.com/@cenghanbayram35/1-bitli...

#machine-learning #bitnet #llm […]


Rest assured: the authors have all been paid. ^^'
#OhWait! #AI #BitNet #SpywareWithASmile ^^'

GitHub - microsoft/BitNet: Official inference framework for 1-bit LLMs

🌟 #Microsoft AI Revolution: Bitnet.cpp

Fast inference for binary LLMs on CPU:
💻 100B #BitNet b1.58 model on single CPU
🔢 Binary math instead of floating point: faster processing
🔄 Potential GPU architecture shift

👉 Discuss in Discord: linktr.ee/qdrddr

#LLM #AI #MachineLearning #Technology #Innovation
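The "binary math instead of floating point" point above comes down to this: with weights restricted to {-1, 0, +1}, a dot product needs no multiplications at all, only additions and subtractions. A minimal sketch of the idea (bitnet.cpp uses packed low-level kernels, not Python loops):

```python
# Why ternary (1.58-bit) weights speed up CPU inference: every
# weight * activation product collapses to an add, a subtract, or a skip.
def ternary_dot(weights, activations):
    """Dot product where every weight is -1, 0, or +1."""
    total = 0.0
    for w, x in zip(weights, activations):
        if w == 1:
            total += x        # multiply-by-one becomes an add
        elif w == -1:
            total -= x        # multiply-by-minus-one becomes a subtract
        # w == 0 contributes nothing and can be skipped entirely
    return total

w = [1, 0, -1, 1]
x = [0.5, 2.0, 1.5, 3.0]
print(ternary_dot(w, x))  # 0.5 - 1.5 + 3.0 = 2.0
```

Since additions are far cheaper than floating-point multiplies, and zero weights can be skipped, matrix multiplication (the bulk of LLM inference) gets dramatically faster on plain CPUs.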

Microsoft researchers say they've developed a hyper-efficient AI model that can run on CPUs | TechCrunch
Microsoft researchers have developed — and released — a hyper-efficient AI model that can run on CPUs, including Apple's M2.

"Microsoft researchers claim they’ve developed the largest-scale 1-bit #AI model, also known as a 'bitnet,' to date. Called #BitNet b1.58 2B4T, it’s openly available under an MIT license and can run on CPUs, including Apple’s M2."
#Microsoft #LLM #GenAI
techcrunch.com/2025/04/16/m...
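How do weights end up in {-1, 0, +1} in the first place? A minimal sketch of the absmean quantization described in the BitNet b1.58 paper: scale each weight matrix by its mean absolute value, then round and clip to the ternary set. Simplified for illustration (real implementations work per-tensor with fused kernels):

```python
# Absmean ternary quantization, per the BitNet b1.58 paper (simplified):
# W_q = clip(round(W / (gamma + eps)), -1, 1), where gamma = mean(|W|).
def quantize_ternary(weights, eps=1e-8):
    gamma = sum(abs(w) for w in weights) / len(weights)  # absmean scale
    scaled = [w / (gamma + eps) for w in weights]
    return [max(-1, min(1, round(s))) for s in scaled], gamma

w = [0.9, -0.05, -1.2, 0.4]
q, gamma = quantize_ternary(w)
print(q)  # [1, 0, -1, 1]
```

Small weights collapse to 0 (giving sparsity), large ones to ±1; the scale gamma is kept around so activations can be rescaled after the cheap ternary matmul.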

Microsoft Releases BitNet b1.58 2B4T, a 1.58-Bit AI Model That Runs on Standard CPUs - WinBuzzer
BitNet b1.58 2B4T from Microsoft Research aims for efficient AI use on CPUs with a native 1.58-bit architecture and custom frameworks.

Microsoft Releases BitNet b1.58 2B4T, a 1.58-Bit AI Model That Runs on Standard CPUs

#AI #Microsoft #LLMs #BitNet #OpenSource #MachineLearning #CPU #DeepLearning #1bitLLM #GenAI

winbuzzer.com/2025/04/17/m...


Microsoft open-sourced BitNet b1.58, a 2B-parameter 1-bit AI model. Super efficient on CPUs like Apple's M2, it rivals similarly sized models while using less memory and running faster. It needs a custom framework for now; GPU support is pending.
#AI #Microsoft #BitNet

Screenshot of source code to arXiv submission. The visible lines read:

%Paper: hep-ph/9210243
%From: <GRADWOHL%UCLAHEP.BITNET@CORNELLC.cit.cornell.edu>
%Date: Fri, 16 Oct 92 15:19 PDT

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%  1.) The file has to be "TeXed", using the macropackage `phyzzx'.
%  2.) The 5 figures are available on request. They can be received
%      through e-mail in form of PS files, or by regular mail.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%\input phyzzx

\hsize 6.5truein
\vsize 9.0truein
\hoffset 0.2truein
\voffset 0.06truein


Was looking at the source of a very early arXiv paper (https://arxiv.org/abs/hep-ph/9210243). The PDF is unavailable, for reasons that are obscure ("pre-1996 submission which cannot be processed"). But there's a lot of history in the source code: it looks […]

[Original post on mementomori.social]

GitHub - microsoft/BitNet: Official inference framework for 1-bit LLMs

Microsoft publicly released BitNet in October 2024.

It achieves speedups of 1.37x to 5.07x on ARM CPUs, with performance improving as models grow larger, and reportedly cuts energy consumption by 55.4% to 70.0%.
GPU/NPU support is in development.

Looking forward to the new landscape that practical, higher-performance AI will open up.

github.com/microsoft/Bi...

#BitNet


💻 Supports running 100B #BitNet b1.58 model on single CPU at 5-7 tokens/sec
🛠️ Built on #opensource #llamacpp framework with optimized kernels
🔄 Compatible with existing 1-bit models from #HuggingFace
📱 Future support planned for #NPU and #GPU platforms
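The quoted 5-7 tokens/sec for a 100B model on a single CPU is easier to feel with a quick calculation: how long a typical answer takes to generate (a ~500-token reply is an illustrative assumption):

```python
# Rough feel for the 5-7 tokens/sec figure above: wall-clock time to
# generate a ~500-token answer. Illustrative arithmetic only.
def generation_seconds(tokens: int, tokens_per_sec: float) -> float:
    return tokens / tokens_per_sec

for rate in (5, 7):
    secs = generation_seconds(500, rate)
    print(f"{rate} tok/s -> {secs:.0f} s for a 500-token answer")
# 5 tok/s -> 100 s, 7 tok/s -> 71 s
```

Around human reading speed, in other words: slow for batch workloads, but usable for interactive local chat.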

GitHub - grctest/FastAPI-BitNet: Running Microsoft's BitNet inference framework via FastAPI, Uvicorn and Docker.

Check out FastAPI-BitNet on GitHub: github.com/grctest/Fast... #python #bitnet #fastapi #llm #docker

GitHub - grctest/Electron-BitNet: Running Microsoft's BitNet via Electron, React & Astro

Check out Electron-BitNet on GitHub: github.com/grctest/Elec... #electron #astro #bitnet #llm

How Microsoft's next-generation BitNet architecture is accelerating LLM efficiency - Tecnocrata

#IA #InteligênciaArtificial #MachineLearning #BitNet #LLMs #MicrosoftResearch #Tecnologia #Inovação #Privacidade #Eficiência

tecnocrata.com.br/2024/11/14/como-a-arquit...
