I have left the material on programming languages until last. Not because they were unimportant, but because they were… hard to forget.
We learnt Assembler, a language in which one wrote with bated breath. You knew that every bit counted. Assembly language programmers were a breed apart. Not only did they write code, but they knew what was happening behind every bit. People spoke of them with a mixture of respect and fear, as one might speak of the snipers of computing: they did not fire often, but when they did, everything was precise. They could work miracles with a computer that gave everyone else nothing but errors. And there was always an air of weariness about them. They had the gaze of someone who does not waste time.
They knew how to read the dumps — those mysterious listings of hexadecimal traces and frozen registers — and to see where the so-called high-level programmes had run out of control, whereupon they calmly added a few zeroes. They were not surprised, they were not alarmed. They simply raised an eyebrow, took out a pencil and rewrote the faulty section of code… much as a surgeon might operate without so much as blinking.
Then there was ALGOL (Algorithmic Language): elegant and clean, but too theoretical for the task we had to accomplish. Few people realise that ALGOL was not initially intended for “practical programming” in the way FORTRAN or COBOL were, but to define a standard notation for expressing algorithms. It was created collaboratively, as an international, academic and formal effort, and for decades it served as the language in which algorithms were published in computer-science papers. First defined in 1958 and revised in 1960, ALGOL left a profound imprint on programming languages up to the present day:
When you write today if x > y then … else …, you are following in the footsteps of ALGOL, which gave priority to clarity over jumps, and to structure over the chaos of labels. ALGOL was the first language to introduce expressive, logical conditionals with clear, predictable branching, without “wild jumps.” This if … then … else structure was subsequently adopted directly in Pascal, Java, Python, JavaScript and virtually every modern language. ALGOL also introduced a clear separation between statements and expressions, a valuable principle that has been lost over time.
What does this mean? In ALGOL, statements (commands that affect programme execution) and expressions (which compute a value) were treated as clearly separate and were not mixed.
Y := X + 1;        // assignment statement
if Y > 10 then     // control statement
    Y := 0;        // assignment statement

if ... then ... is a control statement.
X + 1 is an expression (it evaluates to a value).
Y := X + 1 is a statement (it assigns the value).
You could not mix, on the same line, the logic of a decision with the evaluation that fed it. In other words, ALGOL said: “First calculate, then decide what to do with the result.” No “tricks.” This idea was carried forward by Pascal, a direct descendant of ALGOL. Pascal took the essence of ALGOL and simplified it for the real world: it stripped away the academic complexity and produced a language that worked efficiently on practical problems. That is why Pascal succeeded where ALGOL remained merely a theoretical ideal.
And the same philosophy — clarity, the separation between what you calculate and what you do with the result, a syntax that prevents confusion — was adopted by Python, one of the most popular programming languages of the present day. Python is similar to ALGOL in principle, but it is every bit as practical as Pascal: it does not allow you to write ambiguous code, it obliges you to think clearly, and indentation — the spaces that structure the code — is not optional but part of the language itself.
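That inheritance is easy to see in a few lines of Python; the variable names here are arbitrary examples, a minimal sketch rather than anything from the original ALGOL texts:

```python
x = 4              # assignment statement
y = x + 1          # "x + 1" is an expression; the whole line is a statement
if y > 10:         # the condition must be an expression
    y = 0          # the branch body is a statement
# A statement cannot stand where an expression is expected:
# "if y = 0:" is a SyntaxError, so the confusion ALGOL guarded against
# is rejected before the programme ever runs.
print(y)           # prints 5
```

The discipline is the same one ALGOL imposed: the calculation happens first, and only then does the control structure decide what to do with the result.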
When I returned to programming after more than twenty years away, I worked briefly in Visual Basic for Applications (VBA) — a very powerful language, but with serious limitations when it came to working in the cloud. In the end I chose Python. Not out of nostalgia for ALGOL, but for practical reasons: simplicity, rigour and the strength of the community. Python gives you what ALGOL promised: clarity and elegance. But, unlike ALGOL, Python works. Of course, Python has its own challenges. The “jungle” of library versions can be quite overwhelming — what worked yesterday may not work today, and dependencies can indeed become a jungle. But that is why virtual environments exist — a wonderfully elegant concept: each project lives in its own bubble, with its own library versions, without interfering with one another. It is rather like having several separate offices in the same building, each with its own tools, without any cross-contamination.
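Those “bubbles” can even be created from Python itself. A minimal sketch using only the standard library’s venv module; the directory name project_env is an arbitrary example:

```python
import pathlib
import tempfile
import venv

# Create an isolated environment in a throwaway directory.
tmp = tempfile.mkdtemp()
env_dir = pathlib.Path(tmp) / "project_env"
venv.EnvBuilder(with_pip=False).create(env_dir)   # with_pip=False keeps it fast

# Each environment carries its own configuration file, marking it
# as a separate "office" with its own tools.
cfg_exists = (env_dir / "pyvenv.cfg").exists()
print(cfg_exists)   # True
```

In day-to-day work one would usually run python -m venv from the command line and install each project’s library versions inside it, which is what keeps the dependencies of one project from contaminating another.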
FORTRAN (Formula Translation) was everybody’s friend: robust, with a permissive syntax and intuitive behaviours, which were sometimes too lenient for the programmer’s own good. For example, if you did not declare a variable type, FORTRAN automatically assumed that any variable name beginning with I, J, K, L, M or N was of type INTEGER, whilst the rest were taken to be REAL.
Thus X = 5 produced no error: since X does not begin with one of the “integer” letters, you quietly got a REAL variable instead.
A detail that could cause no end of headaches.
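The rule is simple enough to restate in a few lines of Python; implicit_type is a hypothetical helper written only to illustrate FORTRAN’s default, not part of any real compiler:

```python
def implicit_type(name: str) -> type:
    # FORTRAN's implicit typing: undeclared names beginning with
    # I, J, K, L, M or N were INTEGER; everything else was REAL.
    return int if name[0].upper() in "IJKLMN" else float

print(implicit_type("INDEX").__name__)   # prints "int"
print(implicit_type("X").__name__)       # prints "float"
```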
Another example was the DO loop, where a misplaced full stop changed the meaning of the entire instruction:
DO 10 I = 1.10
What should have been the loop DO 10 I = 1, 10 the compiler read, because FORTRAN ignored blanks, as the assignment statement DO10I = 1.10, without any warning.
The programme ran, but it did something quite different from what you would have expected.
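FORTRAN’s habit of discarding blanks is exactly why that full stop was fatal. A toy Python check, a sketch rather than a real FORTRAN parser, and with an invented helper name, makes the ambiguity visible:

```python
def reads_as_assignment(stmt: str) -> bool:
    # FORTRAN discarded blanks before parsing, so "DO 10 I = 1.10"
    # became "DO10I=1.10": a valid assignment to a variable named DO10I.
    squeezed = stmt.replace(" ", "")
    # With a comma it is a DO loop header; without one, just an assignment.
    return "=" in squeezed and "," not in squeezed

print(reads_as_assignment("DO 10 I = 1.10"))   # True: the accidental assignment
print(reads_as_assignment("DO 10 I = 1, 10"))  # False: the intended loop
```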
For those of us who were learning, FORTRAN was forgiving. But it was precisely in this indulgence that a silent punishment sometimes lurked: the code worked, but it was not what you thought you had written.