Post-deterministic IT
Date: 2026-01-04 09:27
Thirty Years of IT: Knowledge and Ignorance
I have been working in IT for thirty years now. I have taken almost all the common courses and followed the major movements: from ITIL to SAFe for Architects, from SQL to system architecture. But that mountain of knowledge is now accompanied by a sea of ignorance. A paradigm is shifting before my eyes. I suddenly find myself in a different world, one for which there are hardly any books or articles to provide guidance. What do I spend my free time on these days? Building an agent ecosystem.
"The Computer Is Always Right"
During my first computer science class, a teacher said, "The computer is always right." That statement has stuck with me ever since; it has helped me analyze and solve countless problems. Yet the computer sometimes seems to have a mind of its own, and we talk about it as if it were a person. We all know these kinds of statements:
- It's slow today
- I restarted it, but it keeps acting up
- It worked fine yesterday
That kind of behavior almost always turns out to be due to carelessness: incorrect initialization, incomplete error handling, implicit assumptions. I call that unintentional non-determinism, in plain English: a bug.
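A classic instance of such carelessness is Python's mutable default argument: an implicit assumption about initialization that makes a function appear to "remember" earlier calls. A minimal sketch (the function names are mine, chosen for illustration):

```python
def add_item(item, items=[]):
    """Buggy: the default list is created once, at definition time,
    and silently shared across calls -- unintentional non-determinism
    from the caller's point of view."""
    items.append(item)
    return items

def add_item_fixed(item, items=None):
    """Correct: a fresh list per call when none is supplied."""
    if items is None:
        items = []
    items.append(item)
    return items

# The buggy version "remembers" earlier calls:
assert add_item("a") == ["a"]
assert add_item("b") == ["a", "b"]   # surprise: "a" is still there

# The fixed version behaves the same on every call:
assert add_item_fixed("a") == ["a"]
assert add_item_fixed("b") == ["b"]
```

Nothing mystical is going on; the behavior is fully determined, just not by anything the caller intended. That is what separates a bug from the deliberate non-determinism discussed below.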
The Paradigm Shift
Since I started working with agents, I have seen a paradigm shift developing. Not primarily because we work faster, although I now create a conceptual data model in half a day where it used to take months, with a better result on many points (though not all).
The real shift is elsewhere. We are increasingly building systems that interpret instead of execute. Systems that derive meaning, weigh intentions and take context into account.
That will not happen anytime soon in applications for internet banking or calculating energy bills. There, determinism remains essential.
But in software that:
- Answers customer questions
- Analyzes files
- Assesses mortgage applications

the picture is different: there, interpretation is becoming part of the system itself.
The Foundation is Shifting
For years our starting point was clear:
code + rules + data = predictable behavior
That foundation is shifting. Not because developers have become sloppier. Not because testing is inadequate. But because interpretation becomes an explicit part of our systems.
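The old formula can be made concrete with the energy-bill example from above: a pure function where identical inputs always yield identical output. The function name and rates are invented for illustration:

```python
def energy_bill(kwh_used: float, rate_per_kwh: float, standing_charge: float) -> float:
    """Deterministic billing: code + rules + data = predictable behavior.
    Same inputs in, same amount out, every time, on every machine."""
    return round(kwh_used * rate_per_kwh + standing_charge, 2)

# 350 kWh at 0.25 per kWh plus a 12.50 standing charge:
assert energy_bill(350.0, 0.25, 12.50) == 100.0
```

This is the kind of software where determinism must remain the contract. The shift described here is about the systems that sit next to it, not about replacing this.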
New Architecture Questions
The counterpart of deterministic software is probabilistic software. Architecture will be less and less about pinning everything down, and more and more about deciding where uncertainty is allowed to exist and where it is not.
My belief: in the coming years, consciously placing probability will become an important skill for architects. Not everything that can be made probabilistic should be. But pretending that everything remains deterministic is no longer an option.
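"Consciously placing probability" can be sketched as a routing decision: a probabilistic component is allowed to act on its own only above a confidence threshold; below it, the system escalates. Everything here is a hypothetical illustration: the stub scores stand in for a real classifier, and the threshold value is an assumed policy choice, not a recommendation:

```python
# Stand-in for a probabilistic intent model: fixed labels and confidences,
# invented purely for illustration.
_STUB_SCORES = {
    "what is my balance?": ("account_query", 0.96),
    "can i get a mortgage for this houseboat?": ("mortgage_question", 0.62),
}

def classify_intent(text: str) -> tuple[str, float]:
    """Returns (intent_label, confidence) from the stub model."""
    return _STUB_SCORES.get(text.lower(), ("unknown", 0.0))

def route(text: str, threshold: float = 0.85) -> tuple[str, str]:
    """The architectural decision: where is uncertainty allowed?
    High confidence -> the probabilistic component may act autonomously.
    Low confidence  -> escalate to a human (determinism of process, if
    not of content)."""
    label, confidence = classify_intent(text)
    if confidence >= threshold:
        return ("auto", label)
    return ("human_review", label)
```

The interesting design work is not in the `if` statement but in choosing, per capability, whether a threshold-and-escalation pattern is acceptable at all; for internet banking transactions it is not, for answering customer questions it may be.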
Giving Language to Change
What architects face here is new. I didn't get it from a book, an article, or a webinar; I see it happening before my eyes. This post is a first, tentative attempt to give language to that change, so that we can discuss it with the stakeholders with whom we make strategic choices.
https://www.linkedin.com/pulse/post-deterministic-why-architects-need-embrace-systems-hans-blok-ii4ae