Everyone’s all, “Wow, ChatGPT, amazing, a real milestone, everything will change from now on”. And they’re right – but they probably don’t realize that this is not the first time something like this has happened. In fact there’s been wave after wave of technological innovation in computing ever since the industry got started in the 1950s. Here are some of the waves I’ve experienced personally.
The Stone Age of computing
When I got into computing, in 1965 at UBC in Vancouver, it was all very primitive. I took a numerical analysis course while doing a math degree.
I learned Fortran and later, that summer, IBM 7040 assembler, which I got pretty good at. However, when I moved on to Berkeley to do a math PhD I almost completely dropped computing because I thought it was primitive – punched cards and Fortran – and not really going anywhere. But I kept my hand in.
LISP
While I was at Berkeley I was introduced to LISP and my mind was boggled. I was very taken by recursively defined algorithms working on hierarchical structures. FORTRAN didn’t support recursion, nor did it have hierarchical data structures.
Given that FORTRAN was my first language, these omissions could have scarred me for life, but instead I really took to LISP. In retrospect LISP was a major milestone because it brought recursion into the mainstream.
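To give a feel for what boggled me, here’s a rough sketch – in C, a language that enters this story a bit later – of the kind of recursively defined algorithm over a hierarchical structure that LISP made natural and the FORTRAN of that era couldn’t express. The structure and names here are mine, purely for illustration:

    #include <stdio.h>

    /* A cons-cell-like node: each node is either a leaf holding a
       number or an interior node with two subtrees -- the kind of
       hierarchical structure LISP handled natively. */
    struct node {
        int is_leaf;
        int value;                  /* meaningful when is_leaf is set */
        struct node *left, *right;  /* meaningful for interior nodes  */
    };

    /* Recursively sum every leaf in the tree. */
    int sum(struct node *t) {
        if (t == NULL) return 0;
        if (t->is_leaf) return t->value;
        return sum(t->left) + sum(t->right);
    }

    int main(void) {
        struct node l1 = {1, 10, NULL, NULL};
        struct node l2 = {1, 20, NULL, NULL};
        struct node l3 = {1, 12, NULL, NULL};
        struct node inner = {0, 0, &l1, &l2};
        struct node root  = {0, 0, &inner, &l3};
        printf("%d\n", sum(&root));  /* prints 42 */
        return 0;
    }

The function calls itself on each subtree – a one-liner in LISP or C, and something FORTRAN’s static call conventions of the time simply ruled out.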
However the first LISP systems were pretty sluggish and I mistakenly dismissed LISP as impractical. I continued my math research into what would become known as Wadge degrees.
I was wise to keep up my interest in computing because by the time I finished my PhD the math job market had collapsed. My first teaching position was in computer science, at the University of Waterloo. In those days there weren’t enough computer science PhDs to staff a department so they hired mathematicians, physicists, engineers and so on.
Time Sharing
Between my time at UBC and my arrival at Waterloo there was one big milestone – time sharing. Dozens of people could share connections to mainframes, which had gotten bigger and more powerful. This was mind-boggling, because before that you had to wait your turn for computer access. At UBC I punched my program onto cards and left the deck at reception. I’d come back one or two hours later (the “turnaround time”), when the programs in the “batch” had been run, and pick up the output – which would typically contain an error report.
It took forever to produce a correct program. Time sharing cut turnaround time to seconds and greatly sped up the software development process. Thus was born “interactive” computing.
Time sharing needed a lot of supporting technology: an operating system with a file system, terminals for the users, and software such as file browsers, compilers and file editors.
UNIX
All this technology quickly arrived with the UNIX operating system, an unofficial project at Bell Labs developed on an abandoned computer. UNIX conquered the world and still dominates to this day.
When UNIX arrived in a department it was like Christmas because there was a whole tape full of goodies. You got the C language and its compiler, the vi screen editor, and utilities like yacc, awk, sed, and grep. UNIX was a huge leap forward and an unforgettable milestone.
Soon there was a whole ecosystem of software written in C by UNIX users. UNIX was a real game changer.
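To give a taste of how low the barrier was, here’s a minimal sketch of the kind of small filter a UNIX user could write in C – a stripped-down cousin of grep that copies to its output every line of its input containing a given string. (Purely illustrative; the real grep did far more.)

    #include <stdio.h>
    #include <string.h>

    /* mini-grep: print each line of standard input that contains
       the string given as the first command-line argument. */
    int main(int argc, char *argv[]) {
        char line[4096];
        if (argc != 2) {
            fprintf(stderr, "usage: %s string\n", argv[0]);
            return 2;
        }
        while (fgets(line, sizeof line, stdin) != NULL)
            if (strstr(line, argv[1]) != NULL)
                fputs(line, stdout);
        return 0;
    }

Compile it with cc, drop it into a pipeline next to sed and awk, and a dozen lines of C became a new tool.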
Terminals
One of the UNIX utilities was the shell (sh), a simple CLI. It was easy to set up a terminal, like a vt100, to run the shell. The shell allowed users to define their own shell commands, even recursively. An indispensable command was vi, the screen editor. It replaced line editors, which were very difficult to use.
The end result was that anyone with a UNIX terminal had at their disposal the equivalent of their own powerful computer. At first the terminals were put in terminal rooms but gradually they were moved into individual offices.
Digital typesetting
About the time UNIX showed up, departments started receiving laser printers that could print any image. In particular they could print beautifully typeset documents, including ones with mathematical formulas. The only question was, how to author them?
UNIX had the answer, a utility called (eventually) troff. To author a document you created a file with a mixture of your text and what we now call markup. You ran the file through troff and sent the output to the printer.
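To give the flavor, here’s a tiny made-up example of what such a file might look like (this sketch assumes the ms macro package; the details varied):

    .TL
    A Sample Memo
    .PP
    This is an ordinary paragraph.  The formatter fills and
    justifies the text, and inline escapes like \fIthis\fP
    switch fonts mid-sentence.
    .PP
    Requests and macros begin with a dot at the start of a line.

Lines beginning with a dot are formatting instructions; everything else is the text to be typeset.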
The only drawback was that you had to print the document to see it – a vt100 could only display characters. This problem was solved by the next milestone, namely the individual (personal) computer.
Workstations
A personal computer (often called a “workstation”) was a stand-alone computer designed to be used by one individual at a time – a step back from time sharing. The workstations were still networked and usually ran their own UNIX. The crucial point is that they had a graphics screen and could display documents with graphs, mathematics, and even photographs and other images.
As workstations became more common, time sharing declined in importance until only the communication between machines remained.
Originally communication was by file transfer but this was quickly replaced by email as we know it. At first senders had to spell out a step-by-step path to the receiver, naming each machine along the way (e.g. hostA!hostB!user), but then the modern system of email addresses (user@site) was introduced.
At this point many computer people relaxed, thinking we’d finally reached a point where there were few opportunities left for innovation – boy were they wrong!
The Web
At CERN in Switzerland Tim Berners-Lee decided that email lists were inefficient for distributing abstracts and drafts of physics papers. He devised a system whereby each department could mount a sort of bulletin board. This was the origin of the Web.
Unfortunately, at first no one used it – until Berners-Lee put the CERN phone directory on the Web, and its popularity took off. Soon thousands of web sites were opening every day.
If anything the Web was a more significant milestone than even UNIX or time sharing. I remember playing around and discovering Les Très Riches Heures du Duc de Berry – medieval images in glorious color on the Web (via a Sun workstation). My mind was, of course, boggled, and obviously not for the first time.
Minor milestones
After time sharing, UNIX and the Web, other important milestones seem almost minor by comparison. There’s Microsoft’s Office suite, including the spreadsheet Excel. Then there’s the iPhone and other smart phones: people with a phone carry around a computer thousands of times more powerful than the old 7040 I learned computing on.
And yet the innovation doesn’t stop. For a long time AI was a bit of a joke amongst computer scientists – we called it “natural stupidity”. Machine translations were of poor quality, and although programs mastered chess, a more complex game like Go (Baduk) remained beyond their reach. Periodically industry and granting agencies would get discouraged and an “AI winter” would set in.
AI Summer
Then, a very few years ago, this changed dramatically. Computers mastered Go, and machine translation became almost perfect (at least between European languages). What I call the “AI Summer” had arrived.

This was all thanks to new techniques (machine learning, neural nets) and the vast stores of data on the internet.
And now we have Midjourney and the like, and ChatGPT. Very significant milestones, but not, as we have seen, the first – or the most significant.