COBOL (Common Business Oriented Language) emerged as the essential language of the business world, focused on the rigorous management of files and reports. Although it lacked the academic refinement of ALGOL or the brute force of Assembler, COBOL offered a quality born of dependability. It was the instrument that ensured correct payments and the precise closing of accounts.
And this language did not die, against all the odds.
The COBOL of the 1960s is still alive and well today, in 2026.
Software packages written in COBOL still run in banking systems, insurance companies and public administration.
It is estimated that over 200 billion lines of COBOL code are in production, running on mainframes that process trillions of transactions every year.
What is the problem?
Those who wrote it are retiring or have already died. And young programmers do not want to learn it; they prefer JavaScript, Python or C++.
The result?
COBOL programmers now rank among the highest-paid in the United States.
They can command rates of 150 to 200 US dollars per hour on emergency contracts.
In 2020, when the pandemic struck, New Jersey's computer systems for processing unemployment benefits, written in COBOL forty years earlier, collapsed under the load. The surge of claims demanded far more processing capacity, along with a series of small modifications to the ancient code.
The authorities then issued a public appeal: “We urgently need COBOL programmers!”
This was because nobody dared to rewrite those systems. It was safer to pay a fortune to a seventy-five-year-old programmer than to risk building something new that might not work.
COBOL has become the victim of its own success: it is so reliable that nobody wants to replace it.
But it is so old that nobody wants to learn it any more. And so the few who know it can name their price.
COBOL would not let you forget anything. It was like a sober office manager who demanded that you fill in every box, even when they did not seem necessary. But at the same time, it protected you from mistakes. It required you to state everything you wished to do — with subject, predicate, comma and confirmation.
For example, to display a simple message, in COBOL you wrote:
DISPLAY "Buna ziua!" UPON CONSOLE.
Where in other languages a simple print would have sufficed, COBOL obliged you to specify the exact destination. And if you wished to send data to a printer for an official report, things became more rigorous still: it was no longer a simple display, but a write operation (WRITE) to a precisely defined output file.
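As a sketch of what that rigour looked like (the file and record names here are illustrative, not from any real system), a report line destined for the printer had to pass through a file declared in advance, division by division:

```cobol
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
      *    The output destination is declared up front, by name.
           SELECT REPORT-FILE ASSIGN TO PRINTER.

       DATA DIVISION.
       FILE SECTION.
       FD  REPORT-FILE.
       01  REPORT-LINE        PIC X(132).

       PROCEDURE DIVISION.
           OPEN OUTPUT REPORT-FILE.
           MOVE "SALARY REPORT - JANUARY" TO REPORT-LINE.
      *    WRITE goes to the declared file, never to a guessed one.
           WRITE REPORT-LINE AFTER ADVANCING PAGE.
           CLOSE REPORT-FILE.
           STOP RUN.
```

Nothing here is implicit: the file, the record layout and the page control are all stated before a single character is printed.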
It was a language that never said “let me guess.”
The variable declarations were equally flavoursome:
01 SALARIU-LUNAR PIC 9(5)V99 VALUE 00000.00.
You had to define everything: the field, the size, the position of the decimal, the default value. As though you were sending a letter with a letterhead, an incoming-outgoing register and a return receipt.
But beyond the redundancy, COBOL also had subtleties that impressed me — and one of them was the REDEFINES clause.
This clause allowed you to view the same area of memory in several forms, depending on the context. With a single declaration, you could treat a string of raw characters as, variously, a date field, a numeric code or a compound structure. A kind of old-epoch polymorphism, but without classes, inheritance or sophisticated compilers — just well-thought-out lines of code.
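A minimal sketch of the idea (the field names are my own, chosen for illustration): the same eight bytes can be read as a raw string, as a date, or as a numeric code, depending on which declaration you reference.

```cobol
       WORKING-STORAGE SECTION.
       01  RAW-FIELD                          PIC X(8).
      *    Same eight bytes, viewed as a structured date.
       01  FIELD-AS-DATE REDEFINES RAW-FIELD.
           05  THE-YEAR                       PIC 9(4).
           05  THE-MONTH                      PIC 9(2).
           05  THE-DAY                        PIC 9(2).
      *    Same eight bytes again, viewed as one numeric code.
       01  FIELD-AS-CODE REDEFINES RAW-FIELD  PIC 9(8).
```

All three names refer to the same area of memory: after MOVE "20260115" TO RAW-FIELD, the field THE-YEAR already reads 2026, with no conversion and no copying.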
Personally, I have never encountered such elegance in data description in any other language I have used since. And sometimes, I have been sorry not to have had REDEFINES when I needed it most.
And finally, there came PL/I.
A promising language, conceived by IBM as a fusion of FORTRAN and COBOL. In theory, it was a dream: a language that combined scientific rigour with economic applicability.
In practice, however, PL/I was a nightmare.
This was because it suffered from overambition: it wanted to be good at everything — and ended up being difficult to master in anything.
The manuals were ponderous, the compilers slow and fragile, and the syntax was full of exceptions rather than rules.
For the programmers of the day, who had only just learnt to get along with FORTRAN or COBOL, PL/I was like a foreign language that claimed to speak your tongue but corrected you at every sentence.
Perhaps that is why it never truly caught on: it was too large for the machines of the time, too complicated for real users, too expensive for applications in the socialist economy. It died before it was understood.
And yet, for me, that mixture of languages was like a festival of logical dialects: each language had its own temperament, its own way of asking for something and of delivering results.
In the end, the lesson was a single one: what matters is not how difficult a language is, but how clear the idea behind it.