We don’t talk enough about how morally depraved the tech industry turned out to be. Every single ounce of their self-regarding statements of values was an outright lie.
"Developers won't need to understand syntax anymore, just as long as they can <goes on to describe things that will require understanding of syntax>"
I see posts on LinkedIn from people who believe dev teams should be pushed to adopt "agentic" coding so they can reap the "10x" productivity gains.
Putting aside the lack of evidence for even 1.5x gains on teams, if it really were possible, why would they need to be pushed towards it?
If I could pick one word to sum up the last decade, it would be "preoccupied".
When can we start getting on with our lives again?
Ten to twelve years from now wars will be fought over the remaining sources of fresh water because of these fucking dipshits.
from reddit r/ExperiencedDevs • 21h ago
Tech-Cowboy
An AI CEO finally said something honest

Dax Raad from anoma.ly might be the only CEO speaking honestly about AI right now. His most recent take:

"everyone's talking about their teams like they were at the peak of efficiency and bottlenecked by ability to produce code. here's what things actually look like
- your org rarely has good ideas. ideas being expensive to implement was actually helping
- majority of workers have no reason to be super motivated, they want to do their 9-5 and get back to their life
- they're not using AI to be 10x more effective, they're using it to churn out their tasks with less energy spend
- the 2 people on your team that actually tried are now flattened by the slop code everyone is producing, they will quit soon
- even when you produce work faster you're still bottlenecked by bureaucracy and the dozen other realities of shipping something real
- your CFO is like what do you mean each engineer now costs $2000 extra per month in LLM bills"
someone in AI fucked up and actually told the truth
three panel webcomic 1st panel: one man says to another "we invented a robot that answers questions" 2nd panel: he continues "we just have to feed it 10 baby giraffes a day" 3rd panel: the other man asks "but it answers the questions correctly?" and the first man replies "oh my goodness, no. no no no no no."
It's Mandatory Monday and AI is clearly the future.
mandatoryrollercoaster.com/post/8081046...
"TDD slows me down"
Good
Carl Sagan's The Demon-Haunted World should be required reading in high school. And congress.
Technical debt is organizational debt wearing a compiler-approved disguise.
A new study from Anthropic finds that gains in coding efficiency when relying on AI assistance did not meet statistical significance, while AI use noticeably degraded programmers’ understanding of what they were doing. Incredible.
Seeing more and more posts from engineering leaders along the lines of "Tried Claude Code over the Xmas holiday and generated 10 squillion lines of code (that almost worked) in 10 minutes. Now mandating all our engineers use it as much as possible. Because I don't understand bottlenecks."
Very often when trying to think about a problem I start wondering:
"How does software X deal with that?"
And 98.5% of the time the answer is: they don't, they just let it fail. Often silently. Every time, I relearn how low the standards in this industry usually are.
The software industry's lack of concern for business or user outcomes has never been more visible.
I really do love the spec-style syntax for tests (describe, beforeEach, it). When used correctly it's the best at representing test setups and expressing test intent.
I also recognize it has many downsides that have to be accounted for.
Let me tell you, it really sucks resolving merge conflicts.
Yes, that's the main problem in our industry.
And, of course, there'll be the folks who warn about mythical developers "wasting time" making the software "too good".
As our civilisation relies ever more critically on software, we've collectively decided this would be a good time to lower our standards?
You call it "over-engineering" when devs make software more complicated than it needs to be.
But simpler solutions often require *more* thought. Complexity's easy. You just keep typing.
That's why I call over-complicating "under-engineering".
Dear product folks who may be wondering what a *complete* software specification looks like, I recommend taking a look on GitHub. It's full of them.
The solution is the same now as it always was: SLOW DOWN.
Take smaller steps - no, *smaller* than that! (No, even SMALLER than that!) - and test, inspect, refactor and merge more often.
No, *more* often than that...
etc
I'm sorry, I'm calling it. The software industry has lost its fucking mind.
I’m exhausted by the world Silicon Valley has foisted upon us — one we’re just expected to accept and adopt en masse, with little say into the direction of technological travel or input on whether the technology that benefits companies and CEOs is actually benefiting the public that’s expected to use it. Typically, I would call for better technology, and that’s at the core of the argument my colleagues and I made for digital sovereignty last year — not just for non-US technology, but for technology with a wholly different set of economic incentives and social values at its foundation. But as we wait to see if that will ever arrive, there is a stronger argument forming with every passing month that rejecting the technologies being sold to us — and even going back to physical and analog alternatives — is the right move in the present. Maybe not everything has to be digital or digitized, maybe the internet shouldn’t be inserted into absolutely everything, maybe we shouldn’t be constantly connected in the way we’re now expected to be, and generative AI certainly does not need to be forced into every facet of society.
The world Silicon Valley has created is exhausting and terrible. We absolutely need better technology, but in the meantime it’s absolutely right to reject what doesn’t serve us and reassess our relationship to digital technology more broadly.
disconnect.blog/we-need-to-r...
The Planning Game in XP has a rule: if Johnny reckons he can do it quicker than Jane, then Johnny just volunteered to do it.
If your CEO/CTO/Head of Engineering comes back to the office on Monday and says "2 weeks?! I could do it in 2 hours using Cursor!", hand them the keyboard and wish them luck.
I get it. The boss is asking for an estimate. And you're thinking "But there are so many variables".
Pro tip: the answer is 12.
12 what?
Well, that's their problem.
2040: what was the AI revolution of the 2020s like, in 10 seconds
As 2025 draws to a close, my hopes that "AI" would force more teams to address the real bottlenecks in development fade into the distance.
95% will just ship less reliable software, and take longer and spend more £ doing it. And users will be coerced into eating the costs.
Business as usual.
Imagine how much progress we could make if our profession wasn't expected to reinvent software engineering from scratch every 5 years.
2025 was the year when demand for "pre-AI" software developers started to pick up.
Will 2026 be the year when demand grows for devs who don't use it at all?