Thanks. I've had good luck with "bhauman/clojure-mcp-light", and recently learned about "licht1stein/brepl" which also looks pretty good
@felixbarbalet.com
software engineering w/ #clojure, #data, #electronics, #photography, #infosec and #economics. My longer form writing is @ https://felixbarbalet.com https://keyoxide.org/hkp/535FFD2607AFCB2C97831159CD1D679C615C17C8
Learning Clojure was a revelation for me, changing my life and career. I know there are some in the Clj community who really don't like LLMs - I respect that. I'm choosing to use LLMs to write Clojure for the same reasons I've chosen to use it myself whenever I could felixbarbalet.com/simple-made-...
As software engineers, is this our future? If you haven't already read Steve Yegge's #GasTown, I recommend you do steve-yegge.medium.com/welcome-to-g...
The result is "Draft for Consultation." It proves my point: just because you can, doesn't mean you should. It's terrible. You're going to love it.
Sorry in advance to anyone who listens to it.
I pushed AI too far, and I apologize.
I decided to test the boundaries of good taste by forcing AI to generate an entire album of songs about #government #bureaucracy, #procurement delays, and policy reform.
soundcloud.com/felixcbr/set...
#GenerativeAI #Music #PublicSector #Humor #Canberra #AusPol
10/10. The Prerequisite of Deep Work
The most important development tool is not the keyboard, the IDE, or the latest LLM API. It is the chair, the notebook, and the quiet contemplation of the problem.
The rush to "build" often means prematurely entangling ideas, leading to accidental complexity that paralyzes future development. You cannot stumble into a robust architecture for high-stakes stochastic systems.
The real work is not typing; the real work is thinking.
#DeepWork #SoftwareEngineering #DevSky #Tech #EngineeringLeadership #SystemDesign #Philosophy
9/10. We Confuse the Process with the Facts About the Process
Your execution engine shouldn't just run the workflow; it must record every step as an immutable fact. We use Datomic not just for the domain data, but for the orchestration itself: workflows, stages, tasks. These are entities.
Their states are assertions and retractions over time. If you don't have a rigorous, temporal model of your execution, you aren't engineering; you're just hoping.
Spend time designing that model before you worry about the execution.
#SoftwareArchitecture #Datomic #DevSky #SystemDesign #Database #Tech #SoftwareEngineering #Data
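A toy sketch of the accretion idea, using a plain append-only vector in place of Datomic (attribute names like :task/status are illustrative, not a real schema):

```clojure
;; Facts are only ever appended, never updated in place.
(def ledger (atom []))

(defn assert-fact! [entity attr value]
  (swap! ledger conj {:entity entity :attr attr :value value
                      :at (System/currentTimeMillis)}))

;; A task changes state by accreting new assertions over time:
(assert-fact! :task-42 :task/status :started)
(assert-fact! :task-42 :task/status :completed)

;; "Current state" is just the latest assertion for that entity/attr;
;; the full temporal record stays queryable.
(defn current-value [entity attr]
  (->> @ledger
       (filter #(and (= (:entity %) entity) (= (:attr %) attr)))
       last
       :value))
```

Datomic gives you this model (plus real queries and retractions) for free; the point is that the workflow's history is data, not log lines.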
8/10. The Knowledge Graph as Constraint
Many see the Knowledge Graph solely as a source for Retrieval-Augmented Generation (RAG). This misses half the value. In a high-stakes system, the KG is also a constraint. It provides the deterministic boundaries within which the LLM is allowed to operate.
The LLM suggests actions or interpretations, but the KG validates them against established facts and schema. The LLM is a powerful reasoning engine, but the KG is the referee.
Designing this interplay, where the stochastic meets the deterministic, is the hardest part of the architecture.
#KnowledgeGraph #AI #RAG #SystemDesign #DevSky #Tech #GraphDatabase #SoftwareEngineering
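A minimal sketch of "KG as referee": an LLM-proposed triple is only accepted if it type-checks against the graph's schema and known facts (all names here are made up for illustration):

```clojure
;; Schema: which predicate is allowed between which entity types.
(def schema #{[:person :works-for :org]
              [:org :located-in :place]})

;; Established type facts about known entities.
(def facts #{[:alice :person] [:acme :org]})

;; Accept a proposed [subject predicate object] triple only if some
;; schema rule matches the predicate AND both entities have the
;; required types in the graph.
(defn valid-assertion? [[subj pred obj]]
  (some (fn [[s-type p o-type]]
          (and (= p pred)
               (contains? facts [subj s-type])
               (contains? facts [obj o-type])))
        schema))

(valid-assertion? [:alice :works-for :acme])   ;; => truthy
(valid-assertion? [:alice :located-in :acme])  ;; => nil (wrong types)
```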
7/10. The Tyranny of Structure vs. The Power of Data
We often force information into rigid schemas too early. A well-designed engine stores intermediate results (the outputs of LLM calls) as simple, serialized data (like EDN payloads).
We don't force it into a predefined relational structure immediately. Using a database like Datomic allows us to accrete facts about this data later, evolving the schema as our understanding grows. By prioritizing data over structure, we maintain flexibility.
Choosing when to impose structure, and when to defer it, is the essence of sound design.
#DataModeling #Datomic #SoftwareEngineering #DevSky #Tech #Database #SystemDesign #Data
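The deferred-structure idea in miniature: stash the raw LLM output as an opaque EDN string first, and only later lift out the pieces you've come to trust (field names here are hypothetical):

```clojure
(require '[clojure.edn :as edn])

;; Step 1: store the result as-is. No schema commitment yet.
(def raw-result
  (pr-str {:model "some-model" :entities ["Acme Corp"] :confidence 0.87}))

;; Step 2 (later, once understanding has grown): read it back and
;; accrete only the facts we now trust as structured attributes.
(def trusted-facts
  (let [parsed (edn/read-string raw-result)]
    (when (> (:confidence parsed) 0.8)
      {:extraction/entities (:entities parsed)})))
;; trusted-facts => {:extraction/entities ["Acme Corp"]}
```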
6/10. Isolating the Plumbing
We often complect the core logic of a task (the prompt engineering, the tool definitions) with the accidental complexity of its execution (error handling, retries, timing, validation). This entanglement guarantees rigidity.
If you want a system that can evolve, you must separate these concerns. We use interceptor chains: middleware wrapping the execution of every task. This allows us to inject cross-cutting "plumbing" without polluting the core logic.
Choosing where the seams are, and what belongs in an interceptor versus the task itself, is not trivial.
#SoftwareArchitecture #DevSky #CleanCode #SoftwareEngineering #Tech #DesignPatterns #Refactoring
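One way to keep the plumbing out of the task body: wrap it in middleware-style handlers. A toy sketch (not a real interceptor library) with timing and retries as separable layers:

```clojure
;; Cross-cutting concern #1: timing, added without touching the task.
(defn with-timing [handler]
  (fn [ctx]
    (let [start  (System/nanoTime)
          result (handler ctx)]
      (assoc result :elapsed-ns (- (System/nanoTime) start)))))

;; Cross-cutting concern #2: retries on exception, up to n attempts.
(defn with-retries [n handler]
  (fn [ctx]
    (loop [attempt 1]
      (let [r (try (handler ctx)
                   (catch Exception e
                     (if (< attempt n) ::retry (throw e))))]
        (if (= r ::retry) (recur (inc attempt)) r)))))

;; The core task is pure domain logic; it knows nothing of the above.
(defn run-task [ctx]
  {:output (str "processed " (:input ctx))})

(def task (with-timing (with-retries 3 run-task)))
```

Real interceptor chains (as in Pedestal) are data-driven and two-phase (enter/leave), but the separation of concerns is the same.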
5/10. Dynamic Adaptation vs. Static Plans
A static workflow definition is a plan made when you know the least. Agentic systems, by definition, adapt. The path depends on the results of the previous stochastic step. The engine must support dynamic stages guided by "assessors."
After a task runs, an assessor determines the continuation, potentially rewriting the workflow DAG at runtime. This is how you bridge the gap between predefined orchestration and autonomous behavior.
Designing the rules for this adaptation requires profound clarity about the problem domain, not just the technology.
#AIAgents #SystemDesign #DevSky #SoftwareEngineering #Tech #AI #Orchestration #Automation
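A minimal sketch of an assessor, assuming a workflow represented as a plain sequence of stages (stage names and result keys are invented for illustration):

```clojure
;; After a task completes, the assessor inspects its result and
;; returns the continuation - possibly splicing in stages that
;; were never in the original plan.
(defn assess [result remaining-stages]
  (cond
    ;; Low model confidence: insert a human-review stage at runtime.
    (< (:confidence result 1.0) 0.7)
    (cons :human-review remaining-stages)

    ;; The model surfaced sub-documents: fan out extra extraction tasks.
    (seq (:sub-documents result))
    (concat (map (fn [d] [:extract d]) (:sub-documents result))
            remaining-stages)

    :else remaining-stages))

(assess {:confidence 0.5} [:summarise :publish])
;; => (:human-review :summarise :publish)
```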
4/10. The Necessity of an Immutable Ledger
In high-stakes environments, "It usually works" is malpractice. When an LLM makes a decision, you must know why. This requires more than logs. It requires an immutable, accretive Execution Ledger.
This ledger must capture the exact inputs (fingerprinted), the precise function executed, the outputs, and the duration. Datomic gives us this history inherently. If you cannot replay and interrogate every probabilistic decision your system makes, you are flying blind.
Designing this ledger is a prerequisite, not an afterthought.
#LLMs #AI #Observability #DevSky #SoftwareEngineering #Datomic #Tech #SystemArchitecture #AIGovernance
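The shape of a ledger entry, sketched with plain maps rather than Datomic transactions (attribute names are illustrative):

```clojure
;; A cheap content fingerprint for the inputs; a real system would
;; use a cryptographic hash of a canonical serialization.
(defn fingerprint [x]
  (format "%08x" (hash (pr-str x))))

;; Wrap any task function so every execution yields one immutable
;; fact: which function ran, on what (fingerprinted), producing
;; what, and how long it took.
(defn record-execution [f-name f inputs]
  (let [start  (System/nanoTime)
        output (apply f inputs)]
    {:exec/fn          f-name
     :exec/inputs-hash (fingerprint inputs)
     :exec/output      output
     :exec/duration-ns (- (System/nanoTime) start)}))

(def entry
  (record-execution 'classify-document
                    (fn [doc] {:label :invoice})
                    [{:id 1 :text "hello"}]))
```

Transact entries like this into Datomic and replaying or interrogating any past decision becomes a query, not an archaeology dig through logs.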
3/10. Composition over Command: The Functional Dataflow
If your architecture diagrams look like flowcharts, you're building a brittle system. Flowcharts are about control and sequence, concepts that stochastic components actively resist. Instead, design for dataflow.
Model workflows functionally: map, reduce, aggregate. We are transforming information, not dictating steps. A reduce operation shouldn't be a monolith; it should dynamically expand into a sequence of dependent reducer tasks at runtime based on the data shape.
This composability allows you to manage complexity and scale without centralized control. Thinking functionally about parallel, probabilistic processes takes significant effort.
#FunctionalProgramming #SoftwareEngineering #SystemArchitecture #DevSky #Tech #Dataflow #Backend
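A toy sketch of that dynamic expansion: instead of one monolithic reduce, each level of the reduction is a set of independent pairwise merge tasks whose shape is determined by the data at runtime (and which could run in parallel):

```clojure
;; Collapse the chunks level by level: each pass merges adjacent
;; pairs, so an n-chunk reduce expands into ~log2(n) waves of
;; independent merge tasks rather than one sequential fold.
(defn pairwise-reduce [merge-fn chunks]
  (if (<= (count chunks) 1)
    (first chunks)
    (recur merge-fn
           (mapv (fn [pair]
                   (if (= 2 (count pair))
                     (apply merge-fn pair)
                     (first pair)))          ; odd chunk passes through
                 (partition-all 2 chunks)))))

(pairwise-reduce (fn [a b] (merge-with + a b))
                 [{:errors 1} {:errors 0} {:errors 2}])
;; => {:errors 3}
```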
Designing this requires understanding the epistemology of your system. How do you know what you know, and when did you know it?
#LLMs #AI #DevSky #SoftwareEngineering #Data #Tech #Datomic #SystemDesign #KnowledgeGraph