
Marcin Junczys-Dowmunt (Marian NMT)

@marian-nmt

NLP. NMT. Main author of Marian NMT. Research Scientist at Microsoft Translator. https://marian-nmt.github.io

160
Followers
134
Following
16
Posts
25.11.2024
Joined

Latest posts by Marcin Junczys-Dowmunt (Marian NMT) @marian-nmt

Still no bookmarks?

05.02.2025 07:04 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Search Jobs | Microsoft Careers

Hi, the Microsoft Translator research team is looking for an intern for the summer. If you are a PhD student in Machine Translation, Natural Language Processing, or a related field, check it out: aka.ms/mtintern

28.01.2025 17:55 πŸ‘ 5 πŸ” 4 πŸ’¬ 0 πŸ“Œ 0

Just 10 days after o1's public debut, we’re thrilled to unveil the open-source version of the technique behind its success: scaling test-time compute

By giving models more "time to think," Llama 1B outperforms Llama 8B in mathβ€”beating a model 8x its size. The full recipe is open-source!

16.12.2024 21:42 πŸ‘ 83 πŸ” 18 πŸ’¬ 4 πŸ“Œ 2
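The post above describes test-time compute scaling; one common form of it is best-of-N sampling against a verifier: draw several candidate answers and let a scorer pick the best one. A toy, library-free sketch (the "model" and "verifier" below are deterministic placeholders for illustration, not the actual open-source recipe):

```python
def generate_candidates(prompt, n):
    # Stand-in for sampling n completions at nonzero temperature from a
    # small model: guesses at 17 * 24 with a fixed pattern of arithmetic slips.
    true_answer = 17 * 24  # 408
    slips = [-3, 5, 0, -1, 2, 4, -2, 1]
    return [true_answer + slips[i % len(slips)] for i in range(n)]

def score(prompt, candidate):
    # Stand-in for a reward model / verifier rating each candidate;
    # here it simply rewards closeness to the true answer.
    return -abs(candidate - 17 * 24)

def best_of_n(prompt, n):
    # More test-time compute = more samples = better odds that the
    # verifier can pick out a correct answer.
    candidates = generate_candidates(prompt, n)
    return max(candidates, key=lambda c: score(prompt, c))

print(best_of_n("What is 17 * 24?", 1))  # one sample: stuck with a slip (405)
print(best_of_n("What is 17 * 24?", 8))  # eight samples: verifier finds 408
```

The point of the Llama 1B vs. 8B comparison is exactly this trade: spend inference-time samples (plus a verifier) instead of parameters.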

Rant: Apparently every vector-based sentence alignment tool insists on having an unusable file-based API.

16.12.2024 21:49 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

It's more confusing than that. It does exist and seems to mean crucifix which had me even more confused. Suddenly very high stakes πŸ˜€

16.12.2024 18:26 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

Rood?

16.12.2024 18:19 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

Missing bookmarks are a much bigger deal for me. But I think it's funny that they didn't go for one of the most requested features. Seemed like an easy win.

16.12.2024 18:11 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Phi-4 Technical Report Phi-4 is the latest LLM from Microsoft Research. It has 14B parameters and claims to be a big leap forward in the overall Phi series. From [Introducing Phi-4: Microsoft’s Newest …

Wrote up some notes on Microsoft's new Phi-4 LLM. They trained it on a LOT of synthetic data, and the details of how and why they did that are really interesting.
https://simonwillison.net/2024/Dec/15/phi-4-technical-report/

16.12.2024 00:02 πŸ‘ 14 πŸ” 6 πŸ’¬ 0 πŸ“Œ 0
Chart of time vs:
- number of cameras (exponentially increasing),
- giant squid footage (exponentially increasing),
- bigfoot footage (small and not increasing), and
- good quality UFO footage (small and not increasing)

It's messier, but I think this one slaps the point home a bit stronger by adding the giant squid footage. I think unique weather, like lightning sprites, would make the point just as well.

14.12.2024 20:25 πŸ‘ 91 πŸ” 3 πŸ’¬ 2 πŸ“Œ 0
Frontier Models are Capable of In-context Scheming Frontier models are increasingly trained and deployed as autonomous agents. One safety concern is that AI agents might covertly pursue misaligned goals, hiding their true capabilities and objectives - ...

the anthropomorphizing in this LLM scheming paper is through the roof and the interpretations are wild, but still a cute set of experiments and a fun skim, showing some interesting behaviors.

arxiv.org/abs/2412.04984

13.12.2024 09:36 πŸ‘ 37 πŸ” 3 πŸ’¬ 4 πŸ“Œ 0

πŸš€ Introducing the Byte Latent Transformer (BLT) – An LLM architecture that scales better than Llama 3 using patches instead of tokens 🀯
Paper πŸ“„ dl.fbaipublicfiles.com/blt/BLT__Pat...
Code πŸ› οΈ github.com/facebookrese...

13.12.2024 16:53 πŸ‘ 60 πŸ” 15 πŸ’¬ 5 πŸ“Œ 3
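The patching idea above can be illustrated with a toy: split a raw byte stream into variable-length patches wherever a per-byte score crosses a threshold. BLT derives those scores from a small entropy model; the fixed scores below are a made-up stand-in to keep the sketch self-contained:

```python
def patch_bytes(data: bytes, boundary_scores, threshold: float):
    # Start a new patch whenever the per-byte score exceeds the threshold
    # (i.e. the next byte is "surprising"), so easy spans become long
    # patches and hard spans get split finely.
    patches, current = [], bytearray()
    for b, s in zip(data, boundary_scores):
        if current and s > threshold:
            patches.append(bytes(current))
            current = bytearray()
        current.append(b)
    if current:
        patches.append(bytes(current))
    return patches

text = b"hello world"
# Pretend the entropy model finds the space "surprising".
scores = [0.1] * len(text)
scores[text.index(b" ")] = 0.9
print(patch_bytes(text, scores, 0.5))  # [b'hello', b' world']
```

The transformer then attends over patch representations rather than fixed tokens, which is where the claimed scaling advantage comes from.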

So... no edit button, huh?

13.12.2024 23:33 πŸ‘ 2 πŸ” 0 πŸ’¬ 2 πŸ“Œ 0

Oh no no. VSCode is an actual recommendation. My actual favorite piece of software that I didn't write.

13.12.2024 23:25 πŸ‘ 2 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

Whispers in Microsoft: VSCode

13.12.2024 23:11 πŸ‘ 2 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

This place needs bookmarks.

13.12.2024 18:28 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Leave links. I am old.

12.12.2024 20:47 πŸ‘ 1 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

And they laughed at us when we pursued PhDs. Who's laughing now?

12.12.2024 19:50 πŸ‘ 3 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Ah, are these size-limited? And you guys continue with running numbering?

12.12.2024 19:46 πŸ‘ 0 πŸ” 0 πŸ’¬ 2 πŸ“Œ 0

Oh. There you are. Where's that starter pack?

12.12.2024 19:40 πŸ‘ 1 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

You convinced me πŸ˜‰

12.12.2024 17:46 πŸ‘ 1 πŸ” 0 πŸ’¬ 2 πŸ“Œ 0

Hi!

11.12.2024 08:13 πŸ‘ 10 πŸ” 1 πŸ’¬ 1 πŸ“Œ 0