The Study of Apps
Technology moves on. The business doesn’t.
COBOL systems still run critical operations at banks, insurers, and government agencies — decades after the engineers who built them moved on. Not because nothing better exists, but because replacing them is prohibitively expensive. The business looked at the cost of migration, weighed it against the risk of leaving things alone, and chose the devil they knew.
But here’s what happens over time: as a technology falls out of favor, the talent pool shrinks. Fewer developers learn COBOL. The ones who know it can command higher salaries. Supply and demand kicks in, and maintenance costs quietly climb — not because the system got more complex, but because the people who understand it got more scarce. Eventually, companies find themselves paying a premium just to keep the lights on, locked into a technology not by choice but by economics.
This pattern isn’t unique to COBOL. It plays out every time a platform, framework, or language falls out of mainstream adoption. The technology moves on. The business is left holding the bag.
The spec problem
Here’s the deeper issue: rewriting code isn’t hard in itself. What’s hard is the tedious, painstaking work of ensuring the new version faithfully reproduces production behavior. And that work is hard for one reason: the knowledge the system encodes lives only in the code, and rediscovering it is the real cost.
Most software teams don’t maintain adequate specifications. In theory, there’s a tight loop between the product manager who defines requirements, the engineer who implements them, and the testers who verify the result. In practice, that loop breaks almost immediately. Requirements get written once and never updated. A bug gets fixed in code but the spec that produced the bug is never corrected. Edge cases discovered during development live only in the implementation.
Keeping that loop intact requires discipline that most teams simply don’t have — not because they’re bad at their jobs, but because the pressure to ship always wins. Updating documentation is overhead. Maintaining specs is overhead. And when the choice is between shipping a feature and updating a requirements document, the feature wins every time.
The result is predictable: over months and years, the codebase drifts from whatever specifications existed. The code becomes the source of truth — not by design, but by default.
Code as accidental source of truth
When code is the source of truth, you inherit a set of problems that compound over time.
Migration becomes archaeology. When it’s time to move to a new platform or language, teams can’t just pick up a spec and reimplement it. They have to reverse-engineer the existing system — reading code, tracing behavior, interviewing the people who built it (if they’re still around). The specification has to be extracted from the implementation, which is the exact inverse of how it should work.
Knowledge walks out the door. The engineer who made a critical architectural decision three years ago? They left. The context behind that decision — the constraints they were working around, the business rule they were encoding — lives nowhere except the code. And code tells you what it does, not why.
Duplication across ecosystems is staggering. Think about how many times the same fundamental problems have been solved across programming languages. Image manipulation, HTTP protocol handling, cryptographic algorithms, date parsing — every language community rebuilds these from scratch. The business logic is identical. The specifications are identical. But because code is the source of truth, and code is bound to a language, the work gets duplicated endlessly.
The business doesn’t care what language you use. The customer doesn’t either. They want things to work. But the industry’s dependence on code-as-truth means that every technology transition carries the full cost of reimplementation, not just the incremental cost of adaptation.
Software runs everything. Where are the standards?
Software doesn’t just run tech companies. It runs hospitals, banks, power grids, aircraft, elections, and supply chains. It processes prescriptions, executes trades, controls traffic signals, and files your taxes. There is virtually no industry left that doesn’t depend on software to function.
And yet, the profession responsible for building all of it has no unified standards, no oath, and no enforceable code of conduct.
Doctors take the Hippocratic Oath. Lawyers pass the bar and can be disbarred. Dentists, engineers, and even barbers are licensed and held to professional standards. If a doctor causes harm through negligence, there are consequences — not just lawsuits, but the loss of the right to practice. These professions regulate themselves because the stakes demand it.
Software engineers? Anyone can call themselves one. There’s no licensing body, no professional oath, no minimum standard of care. Robert C. Martin — Uncle Bob — has argued this point for years: programmers are the modern scribes, writing the rules that society runs on, yet operating with less professional accountability than the person who cuts your hair.
His argument is simple: if we don’t regulate ourselves, eventually a disaster will force governments to do it for us — and they’ll do it badly. The Programmer’s Oath he proposed is modeled on the Hippocratic Oath: do no harm, always produce your best work, never knowingly allow defective code to accumulate.
Now add AI to the equation. Writing code has been democratized. Tools that generate functional code from a prompt are available to anyone. The barrier to producing software has never been lower. But the barrier to producing correct, safe, maintainable software hasn’t moved. If anything, the gap between “code that runs” and “code that should be trusted” is widening.
This makes the specification problem even more urgent. When anyone can generate code, the question isn’t whether the code works — it’s whether anyone can verify what it’s supposed to do. Without rigorous specifications, there’s no standard to hold the code to. Without standards, there’s no accountability. And without accountability, the software that runs our world is only as reliable as the last person who touched it.
What other professions already figured out
The irony is that every other high-stakes profession has already solved this problem — just not with code.
Medicine has evidence-based protocols and a pharmacopeia: a shared, versioned library of treatments with known effects, dosages, and interactions. A doctor in Tokyo and a doctor in Toronto can reference the same body of verified knowledge. When a new treatment is validated, it gets added. When one is found to be harmful, it gets revised. The knowledge is maintained independently of any individual practitioner.
Law has case law and precedent. Every new case doesn’t start from scratch — it builds on prior rulings that have been argued, tested, and locked down. A legal decision in one jurisdiction can inform reasoning in another. The system accumulates institutional knowledge by design.
Civil engineering has building codes — specifications that are independent of materials. The code says a load-bearing wall must support a given weight. Whether you build it from steel, timber, or concrete is an implementation detail. The spec is the constant. Sound familiar?
Accounting has double-entry bookkeeping and GAAP — standardized rules that make every organization’s finances auditable against the same specification. It doesn’t matter what software the company uses or who their accountant is. The books either conform to the spec or they don’t.
Every one of these professions solved the same fundamental problem: how do you separate intent from implementation in a way that’s verifiable, transferable, and durable?
Software is the outlier. We skipped that step. We went straight from “build it” to “ship it” and never built the institutional layer in between. The specification problem isn’t just a software engineering problem — it’s the missing professional infrastructure that every other serious discipline takes for granted.
What if specifications were the source of truth?
Imagine a library of specifications — precise, structured, language-agnostic descriptions of what software should do. Not vague requirements documents. Not user stories on sticky notes. Formal specifications that define behavior clearly enough that an implementation can be validated against them.
Now imagine those specifications mapped to implementations across languages. A spec for an HTTP client that’s been confirmed against a C# implementation could be used to generate — or validate — a C++ implementation, a Rust implementation, a Go implementation. The spec is the constant. The code is the variable.
Natural language is the right foundation for this. Yes, it has a reputation for imprecision, and the goal here is precisely to make it rigorous. But it’s the one language every stakeholder already speaks. Product managers, engineers, testers, and executives can all read a natural language specification. Nobody has to learn a new formalism to participate in defining what the software should do.
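To make the shape of this concrete, here is a minimal sketch in Python. Every name in it (`Spec`, `Clause`, `validate`, the header-normalization example) is hypothetical, invented for illustration — none of it is an existing Appology API. The idea it shows: natural-language clauses paired with machine-checkable examples, so any implementation in any language can be tested against the same spec.

```python
from dataclasses import dataclass, field

@dataclass
class Clause:
    text: str  # natural-language rule, readable by any stakeholder
    examples: list = field(default_factory=list)  # (input, expected output) pairs

@dataclass
class Spec:
    name: str
    clauses: list

    def validate(self, impl) -> bool:
        """Check an implementation against every example in every clause."""
        return all(impl(inp) == out
                   for clause in self.clauses
                   for inp, out in clause.examples)

# A tiny illustrative spec for normalizing HTTP header names.
header_spec = Spec(
    name="normalize-header-name",
    clauses=[
        Clause("Header names are compared case-insensitively.",
               examples=[("Content-Type", "content-type")]),
        Clause("Surrounding whitespace is not significant.",
               examples=[("  Accept ", "accept")]),
    ],
)

def normalize(name: str) -> str:  # one candidate implementation
    return name.strip().lower()

print(header_spec.validate(normalize))  # True: this implementation satisfies the spec
```

The point of the sketch is the separation of concerns: `header_spec` could just as well validate a C# or Rust implementation, because nothing in it depends on Python beyond the harness itself.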
Locking it down
One of the most powerful ideas in this model is the ability to lock a specification once its behavior is confirmed.
When a spec has been implemented and validated — the tests pass, the behavior matches, the stakeholders sign off — that spec gets locked. It becomes a verified contract. Now AI can reason about it with confidence: “This C# code implements this locked spec. Therefore, this C++ code should implement the same spec, and here’s how.”
Locked specs become building blocks. They accumulate over time into a library of verified behavior that any future implementation can draw from. The knowledge doesn’t decay. It doesn’t walk out the door when someone leaves. It’s encoded in the specification layer, independent of any single codebase or technology.
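One way to picture locking is as a content fingerprint: once the behavior is confirmed, the spec’s exact text is hashed, and any later edit either fails or forces a new version. The sketch below is a hypothetical illustration under that assumption — the class and error names are invented, not a real Appology interface.

```python
import hashlib

class LockedSpecError(Exception):
    """Raised when someone tries to edit a spec that has been locked."""

class SpecRecord:
    def __init__(self, name: str, text: str):
        self.name = name
        self.text = text
        self.lock_digest = None  # None until validated and signed off

    def lock(self):
        """Freeze the spec by recording a digest of its exact content."""
        self.lock_digest = hashlib.sha256(self.text.encode()).hexdigest()

    def update(self, new_text: str):
        """Allow edits only while the spec is still unlocked."""
        if self.lock_digest is not None:
            raise LockedSpecError(
                f"{self.name} is locked; open a new version instead")
        self.text = new_text

    def verify(self) -> bool:
        """Confirm the stored text still matches the locked fingerprint."""
        return (self.lock_digest ==
                hashlib.sha256(self.text.encode()).hexdigest())

spec = SpecRecord("http-client/redirects", "Follow at most 10 redirects.")
spec.lock()
print(spec.verify())  # True: content matches the locked fingerprint
```

The design choice worth noting: a locked spec is immutable by construction, so anything built on top of it — including an AI reasoning “this C++ code should implement the same spec” — is reasoning against a fixed target, not a moving one.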
Not just functional specs
Software behavior isn’t one-dimensional. A function can be correct according to its business logic and still fail in production because of performance, security, or operational concerns. That’s why specifications need categories.
Functional specifications define what the software does — the business rules, the transformations, the expected inputs and outputs.
Operational specifications define how the software behaves under real-world conditions — performance requirements, availability targets, resource constraints, failure modes.
And there will be others. Security specifications. Compliance specifications. Accessibility specifications. Each category captures a different dimension of “correct,” and each can be versioned, validated, and locked independently.
The point is that a complete specification isn’t a single document. It’s a structured collection of contracts that, taken together, define what it means for the software to work.
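The idea that each category is its own independently checkable contract can be sketched like this. The contracts, thresholds, and function names below are illustrative assumptions, not a real specification format; the shape to notice is that one implementation must pass every category before the software counts as working.

```python
import time

def normalize(name: str) -> str:
    """The implementation under test."""
    return name.strip().lower()

def functional_contract(impl) -> bool:
    # Business rule: case-insensitive comparison, whitespace ignored.
    return impl("  Content-Type ") == "content-type"

def operational_contract(impl) -> bool:
    # Illustrative performance budget: 100,000 calls in under a second.
    start = time.perf_counter()
    for _ in range(100_000):
        impl("Content-Type")
    return time.perf_counter() - start < 1.0

# A "complete specification" as a structured collection of contracts,
# each of which could be versioned, validated, and locked on its own.
contracts = {
    "functional": functional_contract,
    "operational": operational_contract,
}

results = {category: check(normalize) for category, check in contracts.items()}
print(all(results.values()))  # the software "works" only if every category holds
```

Security, compliance, or accessibility contracts would slot into the same dictionary: new dimensions of “correct” are added without touching the ones already verified.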
What we’re building toward
We’re not claiming this system exists today in its final form. We’re saying the pieces are coming together — and AI is the catalyst.
Large language models can read code and extract intent. They can generate implementations from structured descriptions. They can compare behavior across languages. They can maintain the mapping between specification and implementation that humans have always struggled to keep current.
The vision is a world where the business logic isn’t trapped in any single codebase. Where migrating to a new technology means pointing your specifications at a new target, not reverse-engineering years of accumulated decisions. Where the reasoning behind the software is as durable as the software itself.
That’s what Appology is building toward. Not just better tools for writing code — better tools for understanding, specifying, and preserving what software is supposed to do.