A Bad Trip to Infinity [4800 views]

The fear of infinity is a form of myopia that destroys the possibility of seeing the actual infinite
~ Georg Cantor

Recently NETFLIX released a documentary on the mathematical concept of infinity, titled A Trip to Infinity. NETFLIX’s trip is a bad one.

Continue reading

Posted in Uncategorized | Leave a comment

GOFAI is dead – long live (NF) AI! [10,000 views]

Art is what you can get away with.
Marshall McLuhan

[All the images in this post were produced with generative AI – Midjourney, DALL-E 2, Stable Diffusion.]

I’d like to give you my thoughts on the recent amazing developments in AI (Artificial Intelligence).

I’m a retired (emeritus) professor of computer science at the University of Victoria, Canada. I ought to know a bit about AI because I taught the Department’s introduction to AI course many times. 

All I can say is thank God I’m retired. I couldn’t have kept up with the breakthroughs in translation, game playing, and especially generative AI.

Continue reading

Posted in Uncategorized | 1 Comment

50 Years of Wow- I lived through 5 decades of computing milestones [4400 views]

Everyone’s all, “Wow, chatGPT, amazing, a real milestone, everything will change from now on”. And they’re right – but probably don’t realize that this is not the first time something like this has happened. In fact there’s been wave after wave of computing technological innovation ever since the industry got started in the 1950s. Here are some of the waves I’ve experienced personally.

The Stone Age of computing.

When I got into computing, in 1965 at UBC in Vancouver, it was all very primitive. I took a numerical analysis course while doing a math degree.

IBM 7040

I learned Fortran and later, that summer, IBM 7040 assembler, which I got pretty good at. However, when I moved on to Berkeley to do a math PhD I almost completely dropped computing because I thought it was primitive – punched cards and Fortran – and not really going anywhere. But I kept my hand in.


While I was at Berkeley I was introduced to LISP and again my mind was boggled. I was very taken by recursively defined algorithms working on hierarchical structures. Fortran didn’t support recursion, nor did it have hierarchical data structures.

Given that Fortran was my first language, these omissions could have scarred me for life, but instead I really took to LISP. In retrospect LISP was a major milestone because it brought recursion into the mainstream.
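The LISP style – recursive algorithms over hierarchical data – translates directly into modern languages. Here is a minimal sketch in Python (my illustration, not period code): flattening a nested list, the kind of tree-walking that was natural in LISP and impossible in early Fortran.

```python
def flatten(tree):
    """Recursively flatten a nested list (a LISP-style tree) into a flat list."""
    if not isinstance(tree, list):
        return [tree]                    # a leaf: wrap it so callers can concatenate
    result = []
    for branch in tree:
        result.extend(flatten(branch))   # recurse into each sub-tree
    return result

print(flatten([1, [2, [3, 4]], 5]))      # → [1, 2, 3, 4, 5]
```

The function calls itself on each sub-tree – exactly the pattern Fortran’s lack of recursion ruled out.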

However the first LISP systems were pretty sluggish and I mistakenly dismissed LISP as being impractical. I continued my math research into what would be known as Wadge degrees.

I was wise to pursue my interest in computing because by the time I finished my PhD the math job market had collapsed. My first teaching position was in computer science, at the University of Waterloo. In those days there weren’t enough CSC PhDs to staff a department so they hired mathematicians, physicists, engineers etc.

Time Sharing

Between my time at UBC and my arrival at Waterloo there was one big milestone – time sharing. Dozens of people could share connections to mainframes, which had gotten bigger and more powerful. This was mind-boggling, because before you had to wait your turn for computer access. At UBC I punched my program on to cards and left the deck at reception. I’d come back one or two hours later (the “turnaround time”) when the programs in the “batch” had been run and collect the output, which typically contained an error report.

It took forever to produce a correct program. Time sharing cut turnaround time to seconds and greatly speeded up the software development process. Thus was born “interactive” computing.

Time sharing needed a lot of supporting technology: an operating system with files, terminals for users, and software such as file browsers, compilers, and file editors.


All this technology quickly arrived with the UNIX operating system, an unofficial project at Bell Labs developed on an abandoned computer. UNIX conquered the world and still dominates to this day.

When UNIX arrived in a department it was like Christmas because there was a whole tape full of goodies. You got the C language and its compiler, the vi screen editor, and utilities like yacc, awk, sed, and grep. UNIX was a huge leap forward and an unforgettable milestone.

Soon there was a whole ecosystem of software written in C by UNIX users. Unix was a real game changer.


One of the UNIX utilities was the shell (sh), a simple CLI. It was easy to set up a terminal, like a vt100, to run the shell. The shell allowed users to define their own shell commands, and even allowed recursive definitions. Another irreplaceable command was vi, the screen editor. It replaced line editors, which were very difficult to use.
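A user-defined recursive shell command might look like this (a sketch in modern POSIX sh syntax – slightly anachronistic for the original Bourne shell, but the idea is the same):

```shell
#!/bin/sh
# countdown: a user-defined shell command with a recursive definition.
# It prints its argument, then calls itself with one less, stopping at zero.
countdown() {
    echo "$1"
    if [ "$1" -gt 0 ]; then
        countdown $(( $1 - 1 ))
    fi
}

countdown 3      # prints 3, 2, 1, 0 on successive lines
```

Once defined, `countdown` can be used at the prompt just like any built-in or utility – which is what made the shell feel like a programming language in its own right.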

The end result was that anyone with a UNIX terminal had at their disposal the equivalent of their own powerful computer. At first the terminals were put in terminal rooms but gradually they were moved into individual offices.

Digital typesetting

About the time UNIX showed up, departments started receiving laser printers that could print any image. In particular they could print beautifully typeset documents, including ones with mathematical formulas. The only question was, how to author them?

UNIX had the answer, a utility called (eventually) troff. To author a document you created a file with a mixture of your text and what we now call markup. You ran the file through troff and sent the output to the printer.
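To give the flavor, here is a hypothetical fragment of troff input (using the ms macro package; the equation between .EQ and .EN is handled by the eqn preprocessor, so the file is piped through eqn before troff):

```troff
.TL
A Sample Paper
.AU
A. N. Author
.PP
Body text goes here, set into justified paragraphs
by the formatter.
.EQ
x sup 2 + y sup 2 = r sup 2
.EN
.PP
More body text after the displayed equation.
```

The markup (.TL for title, .PP for paragraph, and so on) is interleaved with the text, exactly the authoring model that HTML and LaTeX users know today.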

The only drawback was that you had to print the document to see it – a vt100 could only display characters. This problem was solved by the next milestone, namely the individual (personal) computer.


A personal computer (often called a “workstation”) was a stand-alone computer designed to be used by one individual at a time – a step back from timesharing. The workstations were still networked and usually ran their own UNIX. The crucial point is that they had a graphics screen and could display documents with graphs, mathematics, and even photographs and other images.

As workstations became more common, timesharing decreased in importance till only communication between stations was left.


Originally communication was by file transfer but this was quickly replaced by email as we know it. At first senders had to spell out a step-by-step path through intermediate machines to the receiver (a UUCP “bang path”), but then the modern system of email addresses was introduced.

At this point many computer people relaxed, thinking we’d finally reached a point where there were few opportunities left for innovation – boy were they wrong!

The Web

At CERN in Switzerland Tim Berners-Lee decided that email lists were inefficient for distributing abstracts and drafts of physics papers. He devised a system whereby each department could mount a sort of bulletin board. This was the origin of the Web.

At first almost no one used it; then Berners-Lee put the CERN phone directory on the Web and its popularity took off. Soon there were thousands of new web sites opening every day.

If anything the Web was a more significant milestone than even UNIX or timesharing. I remember playing around and discovering Les Très Riches Heures du Duc de Berry – medieval images in glorious color on the Web (via a Sun workstation). My mind was, of course, boggled, and obviously not for the first time.

Minor milestones

After timesharing, UNIX, and the Web, the other important milestones seem almost minor by comparison. There’s Microsoft’s Office suite, including the spreadsheet Excel. Then there’s the iPhone and other smartphones. People with a phone carry around a computer thousands of times more powerful than the old 7040 I learned computing on.

And yet the innovation doesn’t stop. For a long time AI was a bit of a joke amongst computer scientists – we called it “natural stupidity”. Machine translations were of poor quality, and although programs mastered chess, a more complex game like Go (Baduk) was beyond their reach. Periodically industry and granting agencies would get discouraged and an “AI winter” would set in.

AI Summer

Then, a very few years ago, this changed dramatically. Go was conquered and translations became almost perfect (at least between European languages). What I call the “AI Summer” had arrived.

This was all thanks to employing new strategies (ML, neural nets) and using the vast stores of data on the internet.

And now we have Midjourney etc., and chatGPT. Very significant milestones – but not, as we have seen, the first, or the most significant.

Posted in Uncategorized | 1 Comment

Just How Smart are You, ChatGPT? I quiz chatGPT about math. [7500 views]

Everyone’s heard about chatGPT, the latest and most sophisticated chatbot to date. We all know it can BS proficiently about ‘soft’ topics like English literature. I decided to quiz it about a hard topic: mathematics. As you probably know, I have a PhD in math, so I won’t go easy.

Continue reading
Posted in Uncategorized | 2 Comments

To Be or Not to Be – Mathematical Existence and the Axiom of Choice [3900 views]

The axiom of choice (AC) seems harmless enough. It says that given a family of non-empty sets, there is a choice function that assigns to each set an element of that set.
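Stated formally (in one common formulation): for every family $\{S_i\}_{i \in I}$ of non-empty sets indexed by a set $I$, there is a function picking one element from each member of the family.

```latex
\[
\bigl(\forall i \in I\;\; S_i \neq \varnothing\bigr)
\;\Longrightarrow\;
\exists f : I \to \bigcup_{i \in I} S_i
\;\text{ such that }\;
\forall i \in I\;\; f(i) \in S_i .
\]
```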

AC is practically indispensable for doing modern mathematics. It is an existential axiom that implies the existence of all kinds of objects. But it gives no guidance on how to find examples of these objects. So in what sense do they exist?

Continue reading
Posted in Uncategorized | Leave a comment

We Demand Data – the story of Lucid and Eduction [1700 views]

Power concedes nothing without a demand.
-Frederick Douglass

When the late Ed Ashcroft and I invented Lucid, we had no idea what we were getting into.

Continue reading

Posted in Uncategorized | 1 Comment

Hyperstreams – Nesting in Lucid [1000 views]

When Ed Ashcroft and I invented Lucid, we intended it to be a general purpose language like Pascal (very popular at the time).

Pascal had while loops, and we managed to do iteration with equations. However, in Pascal you can nest while loops, so an inner iteration runs its course while the enclosing loops are frozen. This was a problem for us.
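To give the flavor of iteration-as-equations (a sketch in Lucid-style notation – not necessarily the exact syntax of any one dialect): each variable denotes the stream of values it takes on successive iterations, and `fby` (“followed by”) gives the initial value and the rule for the next one.

```
i   = 0 fby i + 1;          // i takes the values 0, 1, 2, 3, ...
fac = 1 fby fac * (i + 1);  // fac takes the values 1, 1, 2, 6, 24, ...
```

The pair of equations plays the role of a Pascal while loop’s body, with `fby` standing in for initialization plus update.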

Continue reading

Posted in Uncategorized | Leave a comment

PyLucid – where to get it, how to use it [130 views]

Recently there was a post on the HN front page pointing to a GitHub repository containing an old (2019) version of the source for PyLucid (I don’t know who posted it). It generated a lot of interest in Lucid, but unfortunately the 2019 version is crude and out of date, and probably put people off.

I’m going to set things right by releasing an up to date version of PyLucid (Python-based Lucid) and up to date instructions on how to use it to run PyLucid programs. The source can be found at pyflang.com and this blog post is the instructions. (The source also has a brief README text file.)

Continue reading

Posted in Uncategorized | Leave a comment

Shennat dissertation: Dimensional analysis of Lucid programs [410 views]

I recently graduated my 17th and last PhD student, Monem Shennat. This is the abstract of his dissertation with my annotations (the abstract of a University of Victoria dissertation is limited to 500 words).

The problem he tackled was that of dimensional analysis of multidimensional Lucid programs. This means determining, for each variable in the program, its set of relevant dimensions – those whose coordinates are needed to evaluate its individual components.

Objective: to design Dimensional Analysis (DA) algorithms for the multidimensional dialect PyLucid of Lucid, the equational dataflow language. 
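As a rough illustration (my own sketch, in hypothetical PyLucid-style notation with dimension-indexed operators – not taken from the dissertation): suppose a program uses dimensions t and s.

```
C = 42;                  // a constant: relevant dimensions {}
X = 0 fby.t X + 1;       // varies only along t: relevant dimensions {t}
Y = X + (0 fby.s Y);     // depends on both: relevant dimensions {t, s}
```

Dimensional analysis would infer these sets statically, so the evaluator knows which coordinates it can safely ignore when computing each variable.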

Continue reading

Posted in Uncategorized | Leave a comment

Portrait vs Landscape – more than meets the eye [2100 views]

I have some theories about these modes – for example, cropping one into the other. I tried them out on the Monna Lisa and … read on!

Continue reading

Posted in Uncategorized | Leave a comment