If the technical barrier to migrating tools and systems approaches zero thanks to AI, how else can teams package their people and process changes to be palatable to higher ups?
The course has guidance on using Claude Code, but also Codex, GitHub Copilot, and Cursor, plus tons of practical tips on using AI for data pipeline development generally. I can't recommend it highly enough!
Read more: dagster.io/blog/announc..., get started: courses.dagster.io/courses/ai-d...
We just launched a new course that walks you through how to use AI to build data pipelines. I've been using our Dagster skills and this approach and it is truly like having a Dagster expert right next to you.
Now we have Alexa+, and every time it says: “Hm, I’m having trouble turning on Christian’s Fire TV”
Until I say: “Alexa, turn on K-R-I-S-T-I-N’s Fire TV,” and then it works. So:
* Alexa+ assumes the man owns the TV?
* LLMs are weird
* Clearly the only solution is for one of us to change our name
My name is Christian, my wife’s is Kristin.
We have an Amazon Fire TV named, very originally, “Kristin’s Fire TV”
With Alexa, we used to say, “turn on the Fire TV” and it would, you know, turn the TV on.
1/2
I mean I don’t look at the lines of code but like titles, descriptions, sometimes code if I feel like it, kinda like Cipher watching the Matrix
Back on my bullshit*
* looking at every PR coming in to our monorepo
With open core you can kind of borrow the taste, that’s true
The thing you want is access to the model of the world, and the continuing development of that model, that the people who made that app embody.
It’s why I don’t really believe in moats other than the moat of taste and the data gravity of a system of record; neither is replaceable by AI.
3/3
I work for a B2B SaaS company so _of course_ I’m biased, but I think it’s a huge misunderstanding to believe that the primary thing you buy with SaaS is a license to a bunch of code. In a very real way, the code is the easy part, an implementation detail.
2/3
SaaS isn’t dead because you are buying taste and a system of record — two things no LLM or agentic system can build for all the tokens in the world.
1/3
* in the onboarding docs and guides, we went from offering a single blessed path 0 (!!) times to 19 times in the current docs.
If you’ve tried out Dagster before and found the learning curve a little too steep, take another look; I think you’ll be surprised at how much easier it is.
3/3
* Huge improvements to the core framework, including the dg CLI to create, validate, and deploy assets and Components; Components for a YAML DSL; and the create-dagster CLI to generate new Dagster projects with a defined structure.
* Docs that went from un-opinionated to Very Opinionated
2/3
Looking back on my 2 years (!!) at Dagster Labs, the thing about Dagster that improved the most is having Opinions about the best way to do things.
We’ve baked those Opinions throughout our docs and right into the heart of the Dagster library. Since I started, we have:
1/3
Give it a read if you’re uv-curious, or if you already use it you might learn a thing or two:
#dataBS
2/2
www.realcloudnative.com/why-i-now-us...
The LinkedIn algorithm of all things heard my cry for a great uv + python skill, and while it didn’t quite deliver that, it did send me this incredible blog by Adriaan de Jonge that will now be my go-to for explaining what problems uv solves and why it’s so useful.
1/2
Oh yeah, if you don’t create a lot of envs and don’t update them, the speed is absolutely not a huge deal.
For iteration and prod deployments (e.g. the joy that is slow Docker builds) it’s a godsend
The other sick feature is uv run --with requests script.py, which runs that script after installing the library in an ephemeral virtual env (or you can declare the dependencies as inline metadata in the script itself). Soo useful for utility scripts
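For reference, a sketch of both flavors (the script name and dependency are just examples):

```shell
# one-off script with a third-party dep, no venv to manage yourself
uv run --with requests script.py

# or declare deps inline in the script (PEP 723 metadata), e.g. at the top:
#   # /// script
#   # dependencies = ["requests"]
#   # ///
# and then simply:
uv run script.py
```

Either way uv builds the ephemeral environment on the fly and tears it down for you.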
You can create virtual envs for a folder outside of that folder, and then make sure you’re pointing at the correct one / activating the right one in the workflow.
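A minimal sketch of that workflow (the paths are just illustrative):

```shell
# create the env somewhere outside the project folder
uv venv ~/.venvs/myproject

# activate the right one when you start working in that project
source ~/.venvs/myproject/bin/activate

# or skip activation and point uv at the env per command
uv pip install --python ~/.venvs/myproject/bin/python requests
```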
Oh very interesting. I guess this is one reason why this is a Hard Problem, so many different ways of working!
I do loads of new environments for customer / prospects / demos and that works marvelously.
What went wrong? I did the same and never looked back so curious what made our experiences so different
My kingdom for a Skill that uses uv to install Python and manage virtual environments.
Just as linters and formatters take the discussion out of style questions, can we take the discussion and questions out of how to install Python and manage virtual environments?
Joining the resistance, one Claude Code session at a time
Claude Code for iOS doesn’t have the plugin system, but since plugins are just plain text you can ask Claude Code to clone the repo and copy the files to .claude/ and it just works!
Fun when things feel hackable like that
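Roughly what that workaround looks like under the hood (the repo URL and plugin layout here are hypothetical):

```shell
# fetch the plugin repo, then drop its plain-text files into your user config
git clone https://github.com/example/claude-plugins.git /tmp/claude-plugins
cp -r /tmp/claude-plugins/my-plugin/commands ~/.claude/commands
cp -r /tmp/claude-plugins/my-plugin/skills ~/.claude/skills
```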
Oh yeah! I didn’t post this but I also do a “this week I learned” async thread. Best ones are the simple ones.
When I led an analytics eng team I abolished standup and replaced it with a “questions and blockers” meeting where everyone joined and helped each other
Second, I have a weekly Slack thread where we each share feedback on 1-2 calls from each other. The good, the bad, the ugly. No ego, everyone has great days and bad days. I’ve learned so much about what’s working, what’s not, what I should try next, and it’s very, very fun.
3/3
Two things I started doing recently to help with this. First, in our weekly team meetings our first 10 minutes is devoted to the “brag corner” — SEs brag for themselves or for someone else! We share our wins and put it in a running doc.
2/3
I’ve led two remote teams where the work is similar but you do it independently (analytics engineering and now sales engineering). It’s really important to share what’s working and what’s not, but it’s hard if you don’t have “water cooler” conversations. 1/3
The best tools, models, and workflows will achieve large adoption, and the time we all spent yak shaving will be wasted, right?
Wrong. Think of yourself as a VC. ~75% of your bets will be ROI negative, but the ~25% that pay off will swamp the bad bets.
Sunk cost fallacy is a fallacy for a reason
I see AI-forward teams spending a ton of time building bespoke tools to help them develop more effectively with agents (see: the proliferation of CLIs for working with git worktrees), trying out different models, and testing out new workflows.