Why your COBOL code isn't going anywhere
Over time, applications become stale and clunky, and perhaps just stop working altogether. So it's amazing that, 60 years after the COBOL programming language first came on the scene, many COBOL applications are still with us. That's an incredible testament to the resilience of the language.
Here's why that happened, and how COBOL and the many business applications written in the language are continuing to evolve and adapt to the modern enterprise.
COBOL arose from a revolution
The change from paper-based systems to computers was arguably bigger than the transition to the Internet. Businesses such as banks had little choice: Adopt or go out of business.
This new technology was expensive, affordable only for the most important processes. C-level executives had to buy into it, and businesses had to create new functions to run it. Enterprise IT was born.
The mainframe technology stack included hardware, middleware, and applications. The latter were the magic that represented the business. But what language should businesses write applications in?
Religious wars were fought among PL/I, Assembler, Fortran, and COBOL fans. But IBM said COBOL was the correct choice, and its recommendation carried a lot of weight. Truth was, COBOL was a sensible choice. It included much richer data storage, processing, and reporting features than did the alternatives.
Over the following decades, people who worked on mainframes began writing applications for newer generations of computers. Although COBOL was originally synonymous with the mainframe, a plethora of new COBOL applications evolved that ran on platforms familiar to IT professionals today.
The result: COBOL was used for business-critical apps, both on and off the mainframe.
Why COBOL has survived
Several factors have kept COBOL working over 60 years of hardware evolution. These include:
Its precise data layout provided an early advantage
Early applications were constrained by hardware. In the popular media, the mainframe is seen as a monolithic, uber-powerful machine. But the early ones weren't, and programmers needed ingenuity for their work to fit into the constraints.
Memory was at a premium, and COBOL let programmers specify exactly how each byte was to be used. (Also, overlays allowed programmers to determine which pages of code were loaded, to optimize the working set.)
Hardware evolved from 8-bit architectures to 16-bit, 32-bit, 64-bit, big-endian, little-endian, RISC, and so on.
COBOL's precise data layout provided a level of isolation. In COBOL, a variable to hold eight digits is a PIC 9(8). In languages such as C, data types such as int change size with the architecture.
The COBOL standard defines arithmetic behavior; it's not left to an arbitrary CPU instruction. Performance wasn't always optimal, but the code ran as expected, without any changes, on a wide variety of platforms. Performance was important, but many applications were I/O-bound, so saving a few clock cycles on in-memory conversions usually didn't matter.
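As an illustrative sketch (fixed-format COBOL; it should compile with a modern compiler such as GnuCOBOL, and the field names are invented for this example), the PIC clause nails down the byte layout of each field independently of the machine's word size:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. LAYOUT-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> Eight decimal digits stored as eight display bytes on
      *> every platform: the layout comes from the PIC clause,
      *> not from the CPU's word size.
       01 WS-BALANCE   PIC 9(8)     VALUE 12345678.
      *> Packed decimal: digits stored two per byte.
       01 WS-RATE      PIC 9(6)V99  COMP-3 VALUE 1.05.
       PROCEDURE DIVISION.
      *> Decimal arithmetic; rounding and truncation behavior
      *> are defined by the COBOL standard, not the hardware.
           MULTIPLY WS-RATE BY WS-BALANCE
           DISPLAY "NEW BALANCE: " WS-BALANCE
           STOP RUN.
```

WS-BALANCE occupies exactly eight bytes whether the program runs on a 16-bit minicomputer or a 64-bit server, and the multiplication behaves the way the standard says it must.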
It has few dependencies
COBOL is a big language. When I worked on a COBOL compiler, we played games such as constructing the longest sentence from COBOL's reserved words (there are hundreds). By comparison, Java has about 50. COBOL has so many because much of its functionality is implemented as syntax.
For example, it has a built-in implementation of indexed files (a precursor to databases that often still performs very well), along with syntax for sorting, formatting, string handling, and more.
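String manipulation, for instance, is a first-class verb rather than a library call. A minimal sketch (it should compile with a modern compiler such as GnuCOBOL; the data names are invented for illustration):

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. SYNTAX-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 WS-FIRST  PIC X(10) VALUE "ADA".
       01 WS-LAST   PIC X(10) VALUE "LOVELACE".
       01 WS-FULL   PIC X(25).
       PROCEDURE DIVISION.
      *> Concatenation is part of the language itself:
      *> no imports, no runtime library to install.
           STRING WS-FIRST DELIMITED BY SPACE
                  " "      DELIMITED BY SIZE
                  WS-LAST  DELIMITED BY SPACE
                  INTO WS-FULL
           DISPLAY WS-FULL
           STOP RUN.
```

Sorting (the SORT verb), report formatting (the report section), and indexed files (SELECT ... ORGANIZATION IS INDEXED) are expressed the same way: as syntax, with no external dependencies to install or version.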
COBOL code is hard to replace
Many engineers will tell you that COBOL is old and should be replaced with something better. And they can be very emotional about it.
But how exactly is that "something" better?
It's no surprise that businesses place value on something that runs reliably, and that they aren't keen to take unnecessary risks with a core process (which is what COBOL tends to be used for). Usually, the desire is to change what's there rather than replace it. In that case, it's probably best to keep what you have and adapt it.
No one wants to do a rewrite
There is rarely an easy path from one language to another, which means you face the daunting task of a rewrite.
The more mature an application is, the less likely your organization is to have good knowledge of it. If you have a specification, has it been updated as the application has changed over the years? The staff that built it are less likely to still be with you, especially decades later. Your best source of knowledge may be how the application behaves today, and often that means the source code.
Engineers are keen to write from scratch with new technology, but understanding and rewriting someone else's code is less appealing. A great deal of determination and diligence is required. I've not seen any recent primary research on rewrite success rate, and few publicly announce their failures, but it definitely isn't 100%.
Martin Fowler once proposed a way for rewrites to be successful: sacrificial architecture. The idea is to plan for a rewrite when you first start writing the app. But while that's a good idea, it's unlikely that the original engineers who created your COBOL programs did this.
So is there something better?
COBOL continues to evolve
Today, COBOL supports many of the language features found in C# and Java, and it can run directly on the .NET and Java platforms, compiling to their respective bytecode representations. This allows in-process mixing of languages and the ability to naturally exploit the power of these ecosystems. COBOL is far from the island you might think it is.
It's also fully backward-compatible with old COBOL, so this is a more sensible place to start than pursuing a full rewrite.
Today, enterprise architects don't care what language an application is written in; they care about how to decompose the application into services that they can integrate and reuse. The building block is no longer a subroutine or library; it's a microservice, probably running in a container.
Provided you meet certain hygiene criteria, you can code in any language you want. Today, applications are assembled, not built. Multiple languages are inescapable.
Traditional COBOL has a different structure than modern languages, but it's not hard to learn. Once you've cracked the data division, you're largely there. The concepts are far easier than the lambdas, promises, and async constructs found in other languages.
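To show what that structure looks like, here is a complete minimal program (an illustrative sketch that should compile with a modern compiler such as GnuCOBOL). Every COBOL program is organized into the same fixed divisions:

```cobol
      *> IDENTIFICATION DIVISION: names the program.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. HELLO-STRUCTURE.
      *> ENVIRONMENT DIVISION: describes files and devices
      *> (empty in this example).
       ENVIRONMENT DIVISION.
      *> DATA DIVISION: declares every piece of data the
      *> program uses, with its exact layout.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 WS-GREETING PIC X(12) VALUE "HELLO, WORLD".
      *> PROCEDURE DIVISION: the executable logic.
       PROCEDURE DIVISION.
           DISPLAY WS-GREETING
           STOP RUN.
```

Once you can read a data division's level numbers and PIC clauses, the procedure division reads almost like structured English.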
You can write just about anything in COBOL; the Micro Focus COBOL compiler is itself written in COBOL. And the development experience is comparable to that of other languages: you can use your favorite tools, such as Visual Studio or Eclipse, complete with code completion and more.
What's next for COBOL
Don't dismiss legacy applications without understanding the technology or the value of the investment in the applications your organization has developed. And you certainly shouldn't dismiss a critical application for reasons as fickle as the programming language in which it was created.
We're in a world where it really doesn't make sense to keep rewriting things over and over, because rewriting isn't the best use of our software engineers' intellect.
If I'm right, then most COBOL applications that were ripe for migration have already been rewritten or replaced. Most of the rest, I expect, will continue to run for the foreseeable future.
What's the future of COBOL in your organization? Do you agree that in most cases these applications will continue to run well into the future? Share your thoughts and stories below.