A Little Ludwig Goes a Long Way

A smattering of opinions on technology, books, business, and culture. Now in its 4th technology iteration.

Playing with Quantum Computing

07 January 2022

As an antidote to the web3 nonsense, I’ve been trying to play around with some more interesting and more fundamental innovations. Microsoft has a really nice quantum computing tutorial that I have been working my way through. I have enough math, quantum, electronics, and CS background to wade through it. You probably want to be reasonably comfortable with vectors and imaginary numbers, or it might be frustrating.
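To give a flavor of why that math matters, here is a minimal sketch of my own in Python with NumPy (not code from the tutorial itself): a qubit state is just a 2-vector of complex amplitudes, and a gate is a 2x2 unitary matrix acting on it.

```python
import numpy as np

# A qubit state is a length-2 vector of complex amplitudes (alpha, beta),
# with |alpha|^2 + |beta|^2 = 1; the quantum analog of a bit's "0" and "1".
zero = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)    # |1>

# A gate is a 2x2 unitary matrix. The Hadamard gate turns |0> into an
# equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero
print(state)               # roughly [0.707+0j, 0.707+0j]
print(np.abs(state) ** 2)  # measurement probabilities: [0.5, 0.5]
```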

My general impression is that it is very, very, very early in the development of quantum computing, and that there is probably not much interesting software work to do right now.

Looking at the history of digital computing, we had to invent a lot of layers of technology to get to the point where it was a mass-market useful thing. We had to invent the transistor (and the triode before that). We had to combine transistors into circuits to make logic gates, coming up with conventions that a certain voltage/current value was a “1” and another value was a “0”. We had to invent Boolean algebra. We had to combine gates into higher-order components like adders. We had to move from discrete transistors to integrated circuits. We had to develop von Neumann machines. We had to get to a standard system architecture of clocked CPUs with memory. We had to develop assembly language and then higher-level languages like Fortran. And we had to develop a lot of tools and process technology all along the way. All this work overlapped and moved in fits and starts, and it took decades to wrangle it all together. By the time we got to Fortran, we could write code that many humans could understand and in which you could express real-world problems.
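To make the gate-to-adder step concrete, here is a toy sketch of my own in Python, with booleans standing in for voltage levels; it is purely illustrative of the layering, not how real hardware is described.

```python
# A half adder combines XOR and AND; a full adder chains two half adders
# plus an OR for the carry. Everything above this is just more layering.

def xor(a, b): return a != b
def and_(a, b): return a and b
def or_(a, b): return a or b

def half_adder(a, b):
    return xor(a, b), and_(a, b)          # (sum, carry)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, or_(c1, c2)                # (sum, carry_out)

# 1 + 1 with no carry in: sum 0, carry 1 (binary 10)
print(full_adder(True, True, False))      # (False, True)
```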

Without all this, writing “software” for transistors would have been terribly hard.

With quantum computing, we are throwing out the transistor and starting over at the bottom of the stack, working our way back up. We are establishing conventions for what the equivalent of a “1” and a “0” is. But the stack above that is still early or nonexistent. We are nowhere close to a development platform that lets a lot of humans write code that is understandable and relates to a large body of real-world problems.
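For contrast, here is roughly what working at today’s quantum layer feels like, sketched in plain NumPy rather than any real quantum SDK: you hand-apply individual gates to a state vector, the quantum equivalent of wiring up logic gates one at a time.

```python
import numpy as np

# Build a Bell state (two entangled qubits) from |00> by hand-applying gates.
zero = np.array([1, 0], dtype=complex)
state = np.kron(zero, zero)                     # two-qubit state |00>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],                  # flips qubit 1 when qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.kron(H, I2) @ state                  # Hadamard on qubit 0
state = CNOT @ state                            # entangle qubit 0 with qubit 1

print(np.round(state, 3))                       # amplitudes on |00>,|01>,|10>,|11>
print(np.round(np.abs(state) ** 2, 3))          # 50/50 split between |00> and |11>
```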

Sam pointed me to this article that says useful quantum computers may require a few million qubits, and we are toying around today with systems of ~10-100 qubits. Feels about right.

I am optimistic about quantum computing. Today’s transistors require us to fling around a huge number of electrons to do anything useful. If we can get to circuit elements that can work with discrete quanta, man, that will be a lot of computing performance. But it is going to take us a while.