this is really neat. classifiers will always be a great outcome of access to LLMs. and running one on follower bios is a pretty neat idea.
@bsky.app if you ever consider adding insights at scale, neat idea by @mahdiyusuf.com
The largest context windows in LLMs range from 128k to 2 million tokens.
That's up to 100,000 lines of code, 10 years of text messages, or 16 average English novels.
But what business cases are large context windows best for?
#ai #chatgpt #claude #gemini #tokens #llms
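The comparisons above can be sanity-checked with a quick back-of-envelope calculation. The conversion rates below (tokens per line of code, tokens per word, words per novel) are my own ballpark assumptions, not figures from the post:

```python
# Rough sanity check of the 2M-token context window comparisons.
# All conversion rates are assumed ballpark figures.
TOKENS_PER_LOC = 20        # assumed avg tokens per line of code
TOKENS_PER_WORD = 1.3      # assumed avg tokens per English word
WORDS_PER_NOVEL = 95_000   # assumed avg English novel length

context = 2_000_000  # tokens, the upper end cited above

lines_of_code = context // TOKENS_PER_LOC
novels = context / (TOKENS_PER_WORD * WORDS_PER_NOVEL)

print(lines_of_code)   # 100000
print(round(novels))   # 16
```

Under these assumptions the numbers line up: roughly 100,000 lines of code or about 16 novels in a 2M-token window.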
If you're having a hard time finding senior software engineers, you're not alone. 93% of employers are having a hard time hiring the right people (Linux Foundation Jobs Report).
Here's how to hire senior engineers, save money, and speed up delivery:
t.co/4clqmJSejY