In the world of banking, there's a well-known secret: mainframes and COBOL still power the majority of financial services. Despite wave after wave of digital transformation initiatives, these systems remain at the heart of core banking operations, and for good reason.
Banks have invested massively in mainframes and COBOL over the decades. That gradual accumulation of functionality has created complex systems that, paradoxically, no one fully understands anymore, yet which continue to work reliably.
That same complexity makes them incredibly hard to maintain and evolve. They also depend on a shrinking pool of COBOL and mainframe specialists, and they are typically batch-oriented, a poor fit for today's real-time, digital-first world.
As such, one could argue they represent a significant risk. But the opposite is also true: they are incredibly stable, fast, and reliable. These systems have been hardened over decades, processing trillions in transactions with remarkable efficiency.
The truth is that these systems aren't legacy because they're outdated; they're legacy because they're mission-critical.
Banks continue to rely on COBOL and mainframes for a mix of technical, economic, operational, and regulatory reasons:
Throughput and performance that still outperform many modern stacks.
Mature audit trails and compliance processes regulators trust.
Complex business logic that has evolved over decades, with little to no documentation.
Interconnected ecosystems involving middleware, batch schedulers, and legacy databases.
High costs and risk of replacement, far beyond simple developer hours.
Replacing core COBOL systems is tempting until the real cost becomes clear. Functional logic is just the tip of the iceberg; the non-functional requirements (performance, availability, robustness, auditability, and so on) are much harder to replicate.
Large-scale rewrites tend to fail because:
They break subtle business logic nobody knew existed.
They require perfect parity in performance and auditability.
They trigger costly regulatory revalidation of previously accepted systems.
Instead, most banks adopt more pragmatic approaches:
Encapsulating legacy systems with APIs and adding orchestration layers on top (see the sketch after this list)
Replatforming to cloud-hosted mainframes
Partially rewriting specific, less-critical modules
Outsourcing development and maintenance to specialist providers (like TCS or Accenture).
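To illustrate the encapsulation approach: a thin facade can expose a legacy inquiry as a modern JSON endpoint while the mainframe keeps doing the real work. The sketch below is a minimal example in Python with Flask; the gateway URL, the route, and the copybook-style field names are hypothetical placeholders rather than any real bank's interface, and it assumes the COBOL transaction is already reachable over HTTP (for example through a gateway product such as IBM z/OS Connect).

```python
# Minimal sketch of an API facade over a legacy core banking transaction.
# Assumption: the mainframe transaction is already exposed over HTTP via a
# gateway; the URL, route, and field names below are hypothetical.
from flask import Flask, jsonify, abort
import requests

app = Flask(__name__)

# Hypothetical gateway sitting in front of the COBOL/CICS transaction.
LEGACY_GATEWAY = "https://mainframe-gateway.internal/legacy/balance-inquiry"

@app.route("/accounts/<account_id>/balance")
def get_balance(account_id: str):
    # Forward the request to the legacy transaction unchanged.
    resp = requests.post(
        LEGACY_GATEWAY,
        json={"ACCT-NO": account_id},  # copybook-style field name (illustrative)
        timeout=5,
    )
    if resp.status_code != 200:
        abort(502, description="Legacy system unavailable")

    legacy = resp.json()
    # Translate the legacy payload into a modern, consumer-friendly shape.
    return jsonify({
        "accountId": account_id,
        "balance": legacy.get("CURR-BAL"),
        "currency": legacy.get("CURR-CODE", "EUR"),
    })

if __name__ == "__main__":
    app.run(port=8080)
```

The design point is that the orchestration layer owns the modern contract while the COBOL program stays untouched, which is exactly why encapsulation carries far less risk than a rewrite.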
However, the rise of AI may introduce new alternatives. AI could play a role in accelerating legacy transformation by:
Mapping undocumented logic and business rules
Assisting with modular rewrites into modern stacks
Generating test cases and test data to validate the migration (a minimal sketch follows this list)
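One way such generated test cases get used is a characterization (or parallel-run) test: feed the same inputs to the legacy transaction and to its rewritten replacement, and flag any divergence. The sketch below is a minimal Python illustration of that idea; the two endpoints, the payload fields, and the random input generator are hypothetical stand-ins, and in a real migration the inputs would come from recorded production traffic or AI-generated edge cases rather than a random generator.

```python
# Minimal sketch of a parallel-run (characterization) test comparing a
# legacy COBOL transaction with its rewritten replacement.
# The endpoints and field names are hypothetical placeholders.
import random
import requests

LEGACY_URL = "https://mainframe-gateway.internal/legacy/interest-calc"
REWRITE_URL = "https://new-core.internal/api/interest-calc"

def generate_cases(n: int = 100) -> list[dict]:
    """Produce test inputs; in practice these would be AI-generated edge
    cases or replayed production traffic rather than purely random values."""
    return [
        {"principal": round(random.uniform(0, 1_000_000), 2),
         "rate_bps": random.randint(0, 500),
         "days": random.randint(1, 365)}
        for _ in range(n)
    ]

def run_parallel(cases: list[dict]) -> list[dict]:
    mismatches = []
    for case in cases:
        legacy = requests.post(LEGACY_URL, json=case, timeout=10).json()
        rewrite = requests.post(REWRITE_URL, json=case, timeout=10).json()
        # Any divergence, even a rounding difference, is worth investigating:
        # subtle business logic often hides in exactly these details.
        if legacy != rewrite:
            mismatches.append({"input": case, "legacy": legacy, "rewrite": rewrite})
    return mismatches

if __name__ == "__main__":
    cases = generate_cases()
    diffs = run_parallel(cases)
    print(f"{len(diffs)} mismatching cases out of {len(cases)}")
```

Even with thousands of matching cases, this only demonstrates functional parity; the non-functional behaviours discussed next remain the harder part.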
But even AI cannot fully remove the risk, especially around non-functional behaviours, which are notoriously difficult to simulate, test and debug in modern distributed systems.
AI may become an accelerator, but it's not a silver bullet. The decision to (partially) replace legacy systems is ultimately a strategic trade-off, balancing reliability vs. innovation, risk vs. reward, and stability vs. agility.
For now, mainframes and COBOL aren’t going away, not because banks are resistant to change, but because keeping them remains the most rational choice. The big question is: for how long?
As neobanks like Revolut, Monzo, and Nubank continue to scale, operating with leaner cost-to-income ratios and greater agility, traditional banks may eventually face a tipping point. When that moment arrives, the risk may no longer be in staying with legacy systems, but in having waited too long to change.
