
Steph Johnson

@bkstephj1

CEO at Multiplayer.app: autonomous AI debugging in production

123 Followers · 284 Following · 220 Posts · Joined 13.12.2024

Latest posts by Steph Johnson @bkstephj1

Automatic capture and correlation of every piece of data from a user's session plus the corresponding system behavior.

06.03.2026 13:48 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

The problem: AI generates code faster than teams can review (and debug) it.

The constraint: AI tools need complete visibility into runtime context, not sampled fragments, to more accurately generate code or assist with debugging.

The solution: πŸ‘‡

06.03.2026 13:48 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Preview
The hidden costs of tech support: Quantifying the engineering cost of customer support.

Check out this article about the hidden cost of technical support: leaddev.com/software-qua...

03.03.2026 13:48 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

This triage matrix assumes visibility. Without full-stack, auto-correlated, unsampled data, you're making expensive decisions based on incomplete information.

How often does that guess turn out wrong?

03.03.2026 13:48 πŸ‘ 2 πŸ” 0 πŸ’¬ 2 πŸ“Œ 0

Oftentimes, you're making high-stakes decisions (what issues to prioritize, which developers to pull off other work, whether to wake someone up at 2am) before you fully understand:

β€’ Root cause
β€’ Blast radius
β€’ Ramifications

03.03.2026 13:48 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Post image

This is how engineering leaders triage production issues: πŸ‘‡

What this matrix doesn't show: the hidden cost of triaging blind.

03.03.2026 13:48 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Post image

AI tools boost velocity but erode deep system knowledge.

Debugging and system understanding are the next challenge.

Great article by Stephane Moreau: open.substack.com/pub/blog4ems...

26.02.2026 11:23 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Teams are rushing to add AI debugging to their observability stacks. But if the underlying data is:

β€’ Aggressively sampled
β€’ Missing payloads
β€’ Scattered across disconnected tools

Adding AI on top just means faster access to incomplete data.

Fix the data problem first.
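The sampling point above can be made concrete with a bit of arithmetic. A minimal sketch (the sampling rates are hypothetical, chosen only for illustration): when each disconnected tool samples independently, the odds that *every* signal you need for one failing request survives is the product of the rates, which collapses fast.

```python
def joint_capture(rates):
    """Probability that every independently sampled signal for one
    request was kept, when each tool decides on its own."""
    p = 1.0
    for r in rates:
        p *= r
    return p

# Hypothetical stack: frontend errors kept at 10%, backend traces
# at 10%, verbose logs at 50% -- each sampled by a separate tool.
print(f"{joint_capture([0.10, 0.10, 0.50]):.2%}")  # prints 0.50%
```

At those (made-up) rates, only one failing request in two hundred has its full context preserved; AI layered on top inherits that gap.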

23.02.2026 21:00 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Your AI tools (a) don't have access to the data they need, or (b) require humans to manually gather and correlate the data.

AI agents need correlated, contextual data to be useful. Right now, most teams don't have that, and their observability tools weren't built to provide it.

19.02.2026 15:43 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

The data exists, but it's scattered and unstructured:

β€£ Frontend errors live in Sentry
β€£ Backend traces live in Datadog
β€£ User actions live in... Screen recordings? Support tickets?

So when an AI tries to answer "why did checkout fail for this user?", it can't.
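The missing ingredient in that "why did checkout fail?" question is a shared key. A minimal sketch, assuming every tool tagged its events with the same session identifier (the event data and `session_timeline` helper here are hypothetical, not any specific product's API):

```python
from datetime import datetime

# Hypothetical events exported from three disconnected tools.
frontend_errors = [
    {"session": "s-42", "ts": "2026-02-19T15:01:07",
     "event": "checkout: TypeError in payment form"},
]
backend_traces = [
    {"session": "s-42", "ts": "2026-02-19T15:01:06",
     "event": "POST /charge -> 500 from payment gateway"},
]
user_actions = [
    {"session": "s-42", "ts": "2026-02-19T15:01:05",
     "event": "clicked 'Pay now'"},
]

def session_timeline(session_id, *sources):
    """Stitch events from every source into one ordered timeline --
    only possible when all tools share a session identifier."""
    merged = [e for src in sources for e in src if e["session"] == session_id]
    return sorted(merged, key=lambda e: datetime.fromisoformat(e["ts"]))

for e in session_timeline("s-42", user_actions, backend_traces, frontend_errors):
    print(e["ts"], "|", e["event"])
```

Without that shared key, the merge in `session_timeline` has nothing to join on, and the question stays unanswerable.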

19.02.2026 15:43 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

Imagine you're looking for a specific email, but:

β€£ Your inbox has 100,000 of them
β€£ They're all labeled "Email"
β€£ There's no search function
β€£ Some emails are in Gmail, some in Outlook, some in Yahoo

That's what AI agents face when trying to debug your system. 🧡

19.02.2026 15:43 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

If your observability data isn't correlated across frontend and backend (or you're missing critical data due to sampling or lack of instrumentation), adding AI on top won't fix it.

It'll just give you faster access to incomplete information.

AI debugging is only as good as the data you feed it.

17.02.2026 09:42 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

That's the skill gap that's emerging:

not who can ship features fastest, but who can explain why their system behaves the way it does (and fix it with confidence when it doesn't).

13.02.2026 08:42 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

AI has lowered the barrier to writing code.
But it hasn't made systems easier to understand.

When something breaks in production, you still need deep knowledge of your system, the ability to read traces, and the instinct to know where to look.

13.02.2026 08:42 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

The best engineers were never the ones who wrote code fast or with β€œclever” solutions.

The gap between top and bottom performers continues to widen.

13.02.2026 08:42 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

PS. Multiplayer captures all of this πŸ‘† automatically (request/response content and headers from internal services AND external dependencies), correlated in a single session recording.

28.01.2026 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Post image

There's a debugging bottleneck few talk about: the hours engineers spend reconstructing what happened in production because critical context is missing. For example:

β€’ What payload did we send?
β€’ What did the external API return?
β€’ Which headers were set?
β€’ What did the middleware modify?
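One way to stop losing that context is to capture it at the call site. A minimal sketch of the idea, not any particular product's implementation (`record_call` and `fake_gateway` are hypothetical names): wrap each outbound call so the payload, headers, and response are logged even when the call fails.

```python
import json

def record_call(session_log, name, send):
    """Wrap an outbound call so its payload, headers, and response
    are appended to the session log, even if the call raises."""
    def wrapped(payload, headers):
        entry = {"call": name,
                 "request": {"payload": payload, "headers": dict(headers)}}
        try:
            response = send(payload, headers)
            entry["response"] = response
            return response
        finally:
            session_log.append(entry)
    return wrapped

# Hypothetical external dependency standing in for a payment gateway.
def fake_gateway(payload, headers):
    return {"status": 402, "body": {"error": "card_declined"}}

log = []
charge = record_call(log, "payment-gateway", fake_gateway)
charge({"amount": 1999}, {"Idempotency-Key": "abc"})
print(json.dumps(log, indent=2))
```

With the wrapper in place, every bullet above (payload sent, response received, headers set) is answerable from the log instead of from memory.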

28.01.2026 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Post image

Which pie chart is your team living in?

This is the difference between 3 hours of context switching and 10 minutes of clarity.

Bad debugging = manual correlation across scattered tools.
Good debugging = auto-correlated runtime context in one place.

23.01.2026 08:52 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 1

Not all session replays are built for the same job.

πŸ“Š Product analytics tools answer questions about user behavior.
πŸͺ² Debugging tools need to answer questions about system behavior.

When bugs span APIs, services, and data layers, engineers need replays that correlate user actions to backend data. πŸ‘‡

16.01.2026 09:12 πŸ‘ 1 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Preview
30 Min Meeting | Multiplayer | Cal.com

If you’re open to sharing what didn’t click, what felt heavy, or what made you pause, it would genuinely help us build a better experience for all of our users.

You can schedule time with me here: cal.com/multiplayer/...

12.01.2026 09:21 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0
Post image

Building developer tools means constantly stress-testing your own assumptions.

If you signed up for Multiplayer and bounced during onboarding, understanding why is incredibly valuable to us.

We’re offering a $50 gift card for a short conversation about your experience (15–20 min).

12.01.2026 09:21 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

Session replay is useful, but when visibility stops at the UI, engineers are left stitching together logs, traces, and payloads by hand. That friction adds up quickly.

Multiplayer is worth a look (and a free try!) if your debugging workflow still involves too much tab-hopping.

30.12.2025 16:35 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

Question for teams using LogRocket: how much time do you spend jumping between tools to connect frontend issues to backend problems?

30.12.2025 16:35 πŸ‘ 1 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Multiplayer 2025: year in review. In 2025 we focused on a simple but ambitious goal: making debugging faster, less fragmented, and less manual. Check out all the releases that made it possible.

6/6 Grateful to our customers, design partners, and community for supporting us and pushing us forward … we’re excited for what we’re building next.

www.multiplayer.app/blog/multipl...

24.12.2025 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0

5/

I’m incredibly proud of our team. Not just for shipping fast, but for shipping thoughtfully, listening closely to our users, and raising the bar on quality with every release. πŸ’œ

24.12.2025 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

4/

β€’ An MCP server to feed full-stack context into AI tools
β€’ A VS Code extension to debug from inside the editor
β€’ Mobile (React Native) support
β€’ Notebooks for full-cycle debugging and documentation
β€’ Automatic system architecture maps that stay up to date

24.12.2025 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

3/ Seeing the full list of everything we produced all in one place really brought it home for me.

This year, with a lean team, we shipped:

β€’ Multiple recording modes for capturing issues when they happen
β€’ Annotations and sketches directly on session recordings

24.12.2025 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

2/ We took vacations, tended to our families and protected our mental health.

Our partners and our customers were surprised at the pace we were able to keep. When you’re deep in the day-to-day, it’s easy to forget how unusual that is.

24.12.2025 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0
Post image

1/ Looking back at 2025, what stands out most isn't one single thing. It's how a very small team managed to ship our product and achieve our goal: making debugging faster, less fragmented, and less manual. And they did it without sacrificing their sanity.

24.12.2025 09:32 πŸ‘ 0 πŸ” 0 πŸ’¬ 1 πŸ“Œ 0

For engineering teams: what percentage of bugs in your app are purely frontend vs. backend or integration issues?

As systems become more complex, partial visibility creates friction across support and engineering.

This πŸ‘‡ highlights why end-to-end context is becoming table stakes for debugging.

19.12.2025 18:38 πŸ‘ 3 πŸ” 0 πŸ’¬ 0 πŸ“Œ 0