“OK,” you say, “you told us in your last post that quantum computers are about to emerge big time and that they are really fast compared to the ordinary computer today. How does being ‘quantum’ make them fast?”

I’ll try to give you an idea without lapsing into science-speak too much.

In the 1980s some really smart people realized that certain quantum phenomena can be used to find solutions to mathematical problems. They can be used to transmit spy-proof messages, generate true random numbers, and break cryptographic codes hitherto deemed unbreakable. Since then, many other problems have been formulated in a way that makes them amenable to quantum computing.

Your credentials as a geek and hacker of ordinary computers aren’t going to be of much help with quantum computers.

At the heart of an ordinary computer rests a **C**entral **P**rocessing **U**nit, the so-called CPU, which performs all calculations. A CPU can only do one computation at a time. Since it does them very quickly, it appears as if it were doing several of them in parallel, but deep down this is only a deception.

The basic unit of information on which a CPU performs its calculations is called a “bit”, representing the smallest amount of information possible: either 0 or 1. A bit can have only one of those two values. But that’s OK, because all it takes is a sufficiently long string of bits to represent whatever you want: large numbers, letters, images, sound, videos, etc. Everything can be represented by bits.
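To see how the same bits can represent different things, here is a tiny Python sketch (the bit pattern and interpretations are just illustrative examples):

```python
# The same string of bits can be read as a number or as a letter,
# depending entirely on how we choose to interpret it.
bits = "01000001"

value = int(bits, 2)   # interpret the bits as a base-2 number
letter = chr(value)    # interpret that number as a character code

print(value)   # 65
print(letter)  # A
```

Eight zeros and ones, one number, one letter: it is all in the interpretation.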

A quantum computer works not on bits but on “qubits” (not to be confused with “Q-tips”, the little sticks with cotton tips you swab your ears with). Qubit stands for “quantum bit”, which suggests a close correspondence with classical bits, but on the quantum level.

“What’s the difference?” you ask. Well, the main difference is that while a classical bit can only be 0 or 1, a qubit can be a bit of both to varying degrees. There is an infinity of possible states a qubit can be in, compared to only two states of a classical bit. This difference comes into play when you look at several qubits.

Eight bits are usually strung together to form a byte, which can represent any number between 0 and 255. One byte stands for one number in the range 0…255. Eight qubits, however, can be in a blend of every number in that range *at once*. And therein lies the power of the quantum computer: it can, in a sense, perform many calculations truly concurrently. (The catch is that when you read out the result, you only get one answer, so quantum algorithms have to be cleverly designed so that the right answer is the likely one.)
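Here is a back-of-the-envelope sketch of that claim. In this toy model, the joint state of eight qubits is a list of 2⁸ = 256 weights, one per possible byte value; giving every qubit an equal blend of 0 and 1 spreads the weight evenly over all 256 numbers (again, my own illustrative code, not a real quantum program):

```python
# The joint state of n qubits is a list of 2**n amplitudes, one per
# possible value. With n = 8 that is one amplitude for each of the
# 256 byte values -- all represented at the same time.
n = 8
amplitude = (1 / 2**0.5) ** n   # each qubit contributes a factor 1/sqrt(2)
state = [amplitude] * (2**n)    # equal weight on every value 0..255

print(len(state))                          # 256 numbers at once
print(round(sum(a**2 for a in state), 6))  # probabilities still sum to 1.0
```

Note the scaling: the list doubles in length with every qubit you add, which is exactly why classical computers struggle to simulate even modest quantum machines.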

Comparing classical computers and quantum computers is like comparing two pianists: one plays a melody with one finger, a note at a time. The other uses millions of fingers at once and plays all melodies at the same time. I’m not sure which one I would actually prefer listening to, but the quantum pianist is certainly going to finish first.

The difficult part is how to make qubits, keep them alive long enough, and have them interact in specific ways to perform calculations. The handling of qubits is where research is needed most. Scientists have already found several promising approaches to this problem. Given the speed at which quantum computing has progressed in the past couple of years alone, it may start moving into mainstream computing within 5-10 years.

In my upcoming novel “The Pi Effect”, a quantum computer and the number pi interact in a completely unforeseen way that is bound to keep you on the edge of your seat, turning the pages.