# E Johnston

Author
since Sep 17, 2019
Eric Johnston ("EJ") is a co-author of Programming Quantum Computers from O'Reilly Media, and also an acrobat and competitive gymnast.
As an engineer, EJ values surprise and whimsy above all other things. He studied Electrical Engineering and Computer Science at U. C. Berkeley, and worked as a researcher in Quantum Engineering at the University of Bristol Centre for Quantum Photonics.
EJ worked at Lucasfilm for two decades as a software engineer for video games and movie effects, along with occasional motion-capture stunt performance.
Fun links: siggraph-talk / gears / gymnast-outtakes / my-instructables
Boston and San Francisco

## Recent posts by E Johnston

Hi Frank! This is a great question; the answer is sometimes simple and sometimes baffling.

Frank Röber wrote: For the following, let's ignore the noise of real existing systems. I'm talking of an ideal QPU.

I agree that this is useful. Considering a fault-tolerant QPU lets you focus on the issue, not the noise. And we'll have those soon enough. Here's a link to the company I work for, which is building one such device.

Frank Röber wrote: For the following, let's ignore the noise of real existing systems. I'm talking of an ideal QPU.
If I know there is only one result, the candidate should be clear after, let's say, 10 cycles. But this is just a feeling, I cannot evaluate the error rate.
If there're two expected results (e.g. root-finding like Newton), how many cycles would I need to become confident about the result? It gets worse if there are more roots I didn't expect...

My statistics knowledge is also patchy, but if you know there's one correct answer and a 98% chance of getting the right answer on each run, then there's a 1.0 - 0.02*0.02 = 99.96% chance that at least one of two runs will be correct. That's also roughly the probability (seat-of-the-pants leap of logic here, someone please correct me) that if you do see two identical results, the answer you see is correct.
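That back-of-the-envelope arithmetic is easy to check in code. Here's a quick sketch (my own helper function, not something from the book):

```python
def confidence(p_correct, runs):
    """Probability that at least one of `runs` independent runs
    gives the correct answer, given a per-run success rate."""
    return 1.0 - (1.0 - p_correct) ** runs

# A 98% per-run success rate, repeated twice:
assert abs(confidence(0.98, 2) - 0.9996) < 1e-9
```

Each extra run multiplies the failure probability by 0.02, so confidence climbs very fast with repetition.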

However, there's a big (huge) help when it comes to many common algorithms: it's often hard to find the answer, but easy to verify it. For example, in the Shor chapter of the octopus book, the program produces factors of a number, so simple multiplication will tell you whether or not you need to re-run it.

In Figure 10-8 of the Quantum Search chapter you can also check the answer, but doing so requires possibly getting eaten by a tiger.

Frank Röber wrote:As a QPU programmer, will I have to code that cycles explicitly or is this hidden by the device?

Both. For example, when using the IBM 5Q (as in the Quantum Teleportation chapter of the octopus book) you can specify a number of runs, and it will aggregate the results for you (see Figure 4-5). If you need to modify the program based on output from each previous run, then you'll need to run them one at a time.
2 years ago

Himai Minh wrote:Hi, Eric Johnson,
Would you be interested to be our next speaker of Washington DC quantum computing meetup group  to talk about your new book?
If you are in DC, our group will meet up at Cambridge Quantum Computing office.
If you are not in DC, we can have online event via Brighttalk or Zoom or etc.
Thanks.

Hi Himai,
Thanks for this offer! I need to run it by my employer (with so many IP issues, we're thorough about these things). When you have a chance, please send info regarding the meetup to moosemeet@machinelevel.com
Cheers,
- ej
2 years ago
Hi Chris! Good to hear from you.
There are several "quantum internet" projects underway, and it's a hot topic of research. There are even satellite-based qubit transmission projects. One of the big challenges is that you need to send single photons (or qubits, anyway), and cannot use repeaters. That's the catch; anything which amplifies the signal will crush the quantum state, rendering it useless. The type of repeater normally considered for things like quantum encryption is a "trusted node" (basically a manned military bunker) which decrypts the data and then re-encrypts it.

As a specific answer to your questions, here are some guesses from me.
Note that nothing here claims there will be FTL communication. :]

A Completely Made-up Timeline of Home QPU Technology:

When: Today
Home hardware: Internet-connected PC
Neat things you might be able to do: Classical communication with a computer which controls a QPU. This allows you to write programs, execute them, and receive results. You can do this today, using the book sample code and the IBM Q Experience. This will also allow your mobile phone to transparently make use of QPUs without you realizing it. There's also QRNG (quantum random number generation), which is neat but pretty basic for a quantum computer.

When: Soonish
Home hardware: Optical fiber output (no input) port on your laptop or router, capable of sending 6 different raw single-qubit states (requires premium internet fiber package with extra sports channels)
Neat things you might be able to do: Ability to perform QKD-based quantum encryption. Ability to perform BQC (blind quantum computation) on a remote QPU, which is where the remote QPU does not know what operations you've asked it to run, and runs them anyway. This is a type of security which doesn't make sense on classical computers. A bunch of other QPU applications and experiments cooler than just digital-in/digital-out computation.

When: Also soonish
Home hardware: Hardware upgrade to optical fiber output (no input) port on your laptop or router with a mini-QPU which can generate, entangle, and measure arbitrarily rotated raw photonic qubits.
Neat things you might be able to do: Bi-directional quantum teleportation. Not sure why this is useful outside of a laboratory setting. Qubits may be stored in a spool of optical fiber, for a couple of nanoseconds per foot of fiber. Also enables all-local QRNG, so you can add a whole new level of randomness to your daily life.

When: Not as soon
Home hardware: Optical fiber bidirectional I/O port on your laptop or router, with a mini-QPU which can generate, entangle, and measure raw photonic qubits.
Neat things you might be able to do: Ability to function peer-to-peer, without a server-side QPU.

When: Even less soon
Home hardware: 1K-qubyte Fault-tolerant QPU the size of a desk, connected to laptop via digital and optical ports. $70k or so.
Neat things you might be able to do: Full-on QPU mayhem in the home.

When: 22 years after that
Home hardware: 1M-qubyte Fault-tolerant QPU inside your phone.
Neat things you might be able to do: No idea what this is for. Selfies maybe?

Note that the moment one of the readers of our book has an unexpected insight and invents the Best QPU Application Ever™️, then this table will require revision.
Maybe shape-shifting materials, doors with no hinges, or something from Diamond Age?

2 years ago

Liutauras Vilda wrote:Thank you for answering questions. It has been a very active week for this book.

It's been a pleasure! Thank you all for a very fun discussion. :]
2 years ago

Campbell Ritchie wrote:So that Bristol encryption card would send all possible passwords and some way to work out which the correct password is? Also, if anybody snooped the password in transit, would it become obvious immediately at the other end and allow us to repeat the encryption process?

Ah, nope. This card device implements something called QKD (Quantum Key Distribution).
There's a demo of this in chapter 2 in the book, in the Spy Hunter example.

Here's how it works:
• Using four laser diodes and a random number generator, the card sends a series of single photons to the bank (via fiber optics in the ATM). Each photon is polarized in one of 4 directions, randomly.
• Anyone intercepting the bitstream (even successfully) will receive only random bits. And (as you mentioned) they would be detected, as in the Spy Hunter code sample. If the random bitstream is compromised, then no data of value is sent.
• If the random bitstream arrives intact, without being intercepted, then the card and the bank use it as a private one-time pad, which is just about the best encryption you can get.

• ...so with QKD, no data of value is sent via the quantum channel at all.
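The sifting step in the list above can be sketched as a toy simulation. This is my own illustration, with photons reduced to bit-plus-basis pairs, so it's a demonstration of the idea, not a secure implementation:

```python
import random

def bb84_sift(n_photons, eavesdrop=False, seed=1):
    """Toy BB84: Alice sends random bits in random polarization bases;
    Bob measures in random bases; both keep only the positions where
    their bases happened to match (the "sifted" key)."""
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.choice('+x') for _ in range(n_photons)]
    bob_bases = [rng.choice('+x') for _ in range(n_photons)]

    received = []
    for bit, basis, bob_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdrop:
            eve_basis = rng.choice('+x')
            if eve_basis != basis:       # wrong basis randomizes the bit...
                bit = rng.randint(0, 1)
            basis = eve_basis            # ...and the photon continues in Eve's basis
        # Bob measuring in the wrong basis also gives a random result:
        received.append(bit if bob_basis == basis else rng.randint(0, 1))

    keep = [a == b for a, b in zip(alice_bases, bob_bases)]
    alice_key = [bit for bit, k in zip(alice_bits, keep) if k]
    bob_key = [bit for bit, k in zip(received, keep) if k]
    return alice_key, bob_key

alice_key, bob_key = bb84_sift(1000)
assert alice_key == bob_key       # untapped line: sifted keys agree perfectly

alice_key, bob_key = bb84_sift(1000, eavesdrop=True)
errors = sum(a != b for a, b in zip(alice_key, bob_key)) / len(alice_key)
assert 0.15 < errors < 0.35       # Eve causes ~25% disagreement, exposing the tap
```

Comparing a random sample of the sifted key over a public channel reveals Eve's roughly 25% error rate; if the sample matches, the rest can serve as the one-time pad.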

2 years ago

Claude Moore wrote:
Well, one could argue that from a practical point of view, being able to train a model in a few microseconds would be the same as loading it from a persistent storage, wouldn't it be?

If the "trained" structure is qubit-based, then it can't be loaded, as loading involves making a copy of what's in the persistent storage. You could keep digital data in persistent storage, such as instructions used to train it...

If the "trained" structure is digital data and not qubits then yep, loading it will work.

There's a middle ground: As covered in chapter 9 in the book, one form of QRAM (quantum memory) is digital data which may be accessed in superposition (so you can load from multiple addresses at once into qubit registers). If this is used to store the trained structure, then loading will also work.
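Here's a toy state-vector sketch of that middle ground (my own illustration, with a made-up 4-entry table). Because the lookup is a permutation of basis states, it works on a superposition of addresses in one pass:

```python
# Toy QRAM lookup: the quantum state is a dict mapping (address, data)
# basis states to amplitudes. The lookup |addr>|0> -> |addr>|table[addr]>
# is a permutation of basis states, so it preserves superposition.
table = [5, 3, 7, 1]          # hypothetical classical data held in QRAM

def qram_load(state, table):
    return {(addr, table[addr]): amp for (addr, _), amp in state.items()}

amp = 2 ** -0.5
state = {(0, 0): amp, (2, 0): amp}     # address register: superposition of 0 and 2
state = qram_load(state, table)
assert set(state) == {(0, 5), (2, 7)}  # both loads happened in one pass
```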

...so it depends on the design of your ML system.
2 years ago

Claude Moore wrote: Thanks a lot for your detailed answer. I am very excited about what you said about the superscalar speed up you can achieve on algorithms using QC: I can't stop thinking about what we could achieve in the field of neural networks, and in AI in general...

There's an interesting aspect concerning AI which hasn't really been dealt with yet: Qubits can't be copied. They can be moved and transformed, but there's physically no way to duplicate even one.

So... warning: far-future sci-fi speculation (just for fun)
Suppose (with your far-future-practically-scifi hat firmly in place) you have a complicated quantum machine-learning system, and it took you a year to train it to do a job really well (maybe it can win championship-level games of chess and ping pong simultaneously), and its entire learned state is maintained in fault-tolerant 4 giga-qubit storage. Someone sees your AI demo, and wants to buy 100 of them. Qubits physically cannot be copied, so you can't just save the state and stamp out more just like it; you'll have to train each one and hope it comes out the same. There's no way to back it up, so if the power goes down you'll need to start over. And if over time (and after more input) it gets worse at its job, you'll also have to start over.

Thanks to Nic, Chapter 13 in the book covers Quantum Machine Learning!
It's all very early and experimental; QML systems in the near future will be trained in a few microseconds, use the knowledge, and lose it all a few microseconds later.
2 years ago

Michael Krimgen wrote:However, do you see any applications of the concepts of QC algorithms in other areas like concurrent programming?

Hi Michael,
The tricky thing about treating QC like classical forms of concurrency or parallelism is this: when you do things in parallel on a CPU, more work is done and you're able to save all of the results.
On a QPU, it's a bit different. When you execute multiple tasks in superposition, you can't get all of the results, but you can get things like the sum, or the average, or the parity. In general, the total amount of information you get from doing a million things in superposition is the same as the amount of information you get from just executing one, but you can get different information out, which would have required many runs on a classical computer.

The best illustration of this is known as the Deutsch-Jozsa problem.
Here's (loosely) how it goes:

Imagine you have a function which takes in one bit (0 or 1) and puts out one bit (0 or 1).
The function takes an hour to run. Using it, you answer two questions:
Question 1: If the input is 0, what's the output? - This one is easy. Just give it the input, and an hour later you're done.
Question 2: Is the output the same for 0 and 1, or different? - In this case, you need to run it twice to get an answer to your question.

Using a QPU function, you can run both in superposition. It runs all values simultaneously in a single run, and then you can transform the output to answer either Question 1 or Question 2, but not both. You can find out whether the answers for 0 and 1 are different (Question 2) in one run, making you 2x faster than your classical computer (yay), but in this case you collapse the state so that you don't actually get the specific output values. If you'd just run the classical function twice you'd have both answers.
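If you'd like to poke at this, here's a bare-bones state-vector simulation of the one-bit case (often called Deutsch's algorithm). It's my own sketch, not the book's code sample:

```python
import math

H = 1 / math.sqrt(2)

def hadamard(state, q):
    """Apply a Hadamard to qubit q of a 2-qubit state vector."""
    out = [0.0] * 4
    for i, amp in enumerate(state):
        bit = (i >> q) & 1
        out[i] += amp * H * (1 if bit == 0 else -1)
        out[i ^ (1 << q)] += amp * H
    return out

def oracle(state, f):
    """|x>|y> -> |x>|y XOR f(x)>; qubit 1 is x, qubit 0 is y."""
    out = [0.0] * 4
    for i, amp in enumerate(state):
        x, y = (i >> 1) & 1, i & 1
        out[(x << 1) | (y ^ f(x))] += amp
    return out

def deutsch(f):
    """One oracle call decides constant-vs-balanced (Question 2)."""
    state = [0.0, 1.0, 0.0, 0.0]   # start in |x=0>|y=1>
    state = hadamard(state, 0)     # put y into the |-> state
    state = hadamard(state, 1)     # put x into superposition
    state = oracle(state, f)       # a single evaluation of f
    state = hadamard(state, 1)     # interfere the two branches
    prob_x1 = state[2] ** 2 + state[3] ** 2
    return 'balanced' if prob_x1 > 0.5 else 'constant'

assert deutsch(lambda x: 0) == 'constant'
assert deutsch(lambda x: 1) == 'constant'
assert deutsch(lambda x: x) == 'balanced'
assert deutsch(lambda x: 1 - x) == 'balanced'
```

One call to `oracle` answers Question 2 with certainty, but notice the measurement tells you nothing about the individual values of f(0) or f(1).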

(This is explained a bit better by Mercedes in chapter 14, and there's an accompanying code sample here.)

...so if someone tells you a QPU can "play all possible chess games at once" that's sort of true, but not as useful as it sounds. :]
2 years ago

Jay Ko wrote:I don't know anything about quantum computing, but as I read some of posts, it is not 0 and 1.(If it is wrong, please point out)
Then, do I need new computer in order to quantum computing?
If no, should I have to new OS?
Thanks!

Hi Jay, thanks for posting your question!
The first time you use a QC, you probably won't even know you've done it. Currently, when you use your phone's Maps app to plan a drive from LA to New York, it only takes a few seconds, which is amazing. It's fast because the route planning isn't actually done on your phone. It happens on a computer you never had to buy, in a server room in a building you've never had to think about. Similarly, the first useful QPUs will be in server rooms, performing computation (maybe even road trip planning) for people. They'll just get specific types of jobs done, more efficiently than a normal computer would have.

When you use an ATM, the transaction may be secured by a pair of QCs, just to combat hackers and increase security. At the University of Bristol in the UK, there's even a project where they've taken a quantum device and embedded it inside an ATM card, to send a secure data key directly to a larger QC located in the bank's server room. It's like the little encryption chip in your ATM card now, but cooler.

Eventually, computers, tablets, and phones that you buy will have QPUs built into them (just as they currently have GPUs for graphics), but they'll likely use updated versions of the same operating systems we use now.
2 years ago
Hi Paul! Thanks for this question.
Looking forward to QPUs (quantum processing units) becoming "mainstream", the critical thing to watch is not the number of qubits, but the number of fault-tolerant qubits. This is what separates a mainstream QPU from an advanced science experiment. To illustrate, I'll use Mainak's example...

Mainak Biswas wrote:Authors - how many qubits would be required to do create say a calculator?

I really like this example.

Imagine it's 1950, and you're trying to implement a calculator using a "digital electronic computer" which is new and exciting. It's a big advanced model, so it has 64 bits of RAM. So you've got 8 bytes to work with. Is that enough to implement a calculator? For sure, if just a simple one. You can increment, add, divide-by-2 by bit-shifting, it'll be a fun project. But then you find out that the eight bytes aren't error-corrected or fault-tolerant. You can clear a register to zero, but as the next instructions execute some of the bits start losing their values, and after about 40 instructions it's gone completely random. Suddenly, making a useful calculator becomes a lot more difficult, so this computer might not be ready to be a "mainstream" product.

In 2019, the current state of the art in QC is 50-72 raw (non-FT) qubits, with which building a quantum calculator is challenging (though not impossible).
...so it's really the number of FT qubits we need to watch when deciding when something will be mainstream. These are coming, but it's hard to peg a date until we can see them working.

Still, we can simulate FT Qubits with no problem:

In the book, chapter 5 (right near the beginning) contains everything you need to implement and simulate a calculator on a QPU using very few qubits. Basic math functions which work in quantum superposition are specified and demonstrated, with code samples that run in one click.

For example, Example 5-3 shows how to implement "a = a + b * b" in quantum superposition, and you can run it right now, on the sample code page.
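As a taste of how that works under the hood, here's a toy state-vector sketch (mine, not Example 5-3 itself, and shrunk to 3-bit registers). Because "a += b*b" is reversible, on a QPU it's a permutation of basis states, so it acts on every superposed term at once:

```python
# Toy state vector over (a, b) register pairs, 3 bits each (mod-8 math).
# "a += b*b" is reversible, so on a QPU it's a permutation of basis
# states, applied to every superposed term in a single pass.
def add_b_squared(state):
    return {((a + b * b) % 8, b): amp for (a, b), amp in state.items()}

amp = 2 ** -0.5
state = {(0, 1): amp, (0, 2): amp}     # a = 0, b in superposition of 1 and 2
state = add_b_squared(state)
assert set(state) == {(1, 1), (4, 2)}  # both sums computed at once
```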

This simple example can also be run for free on a physical QC, since the page contains Qiskit and OpenQASM versions. If you do that, you'll notice that non-FT qubits get really fuzzy.

2 years ago

Roger Yates wrote:‘what language is/can be used to develop these native libraries for QPUs? Is this more assembly-like or are there already compilers for higher level abstractions?’

Hi Roger,
As mentioned in the other Java thread, code to use the libraries can be developed in any language.

To develop the libraries themselves, it depends on whether you're doing...
• ...a QPU simulation, in which case any language will do; at work I use C++ and asm to make it fast, but the one I wrote which runs the book samples is all in JavaScript, just so that readers can click one button and see it go. If you want an open-source library to build on top of, then something like Qiskit (which is in Python) might be a great starting point.
• ...or a driver for a physical QPU, in which case the language used at the low level of your library may be constrained by specific needs of your hardware, but you can probably still provide high-level interfaces in any language you like.

2 years ago

Claude Moore wrote:In the authors' opinion, do they think that we need to redefine concepts we used to use in traditional informatics, for example the Turing Machine model, Big O complexity notation and so on, or the same will still be valid to study algorithms from a theorical point of view?

Hi Claude! Great question.
Nope, all of the information theory concepts you're mentioning still apply, exactly as they always have. The only thing QCs change is which operations can be done efficiently.

• Big O: This notation is used all over QC, especially to describe the speedup provided by a quantum algorithm. For example, Grover's Database Search is an application of QCs which allows them to search an unsorted database of n items in O(sqrt(n)) time, when it would normally be O(n) time. Quantum search is covered very well (thanks to Mercedes) in chapter 10 of the book.
• The Turing model still applies as well, as we're still really dealing with sort-of digital data. Where this gets really interesting is when you consider that the program counter in a CPU (the register which points to the next instruction) is actually data, and a QPU might store that in quantum superposition, allowing a program to "both branch and not branch". This is potentially weird, and not well-explored. Although it's too early to exhaustively explore this on a physical QPU, I did a small-scale simulation of it a few years ago. That whitepaper is pretty fun (if not directly useful), and can be found here. (Language warning: the FSM model I based it on, created by Urban Müller, has a cheeky name.)
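To make the O(sqrt(n)) search claim above concrete, here's a toy state-vector simulation of Grover's iteration (phase-flip the marked item, then invert everything about the mean). My own sketch, not the book's sample code:

```python
import math

def grover(n_items, marked):
    """Toy Grover search: state vector over n_items entries."""
    amps = [1 / math.sqrt(n_items)] * n_items      # uniform superposition
    for _ in range(round(math.pi / 4 * math.sqrt(n_items))):
        amps[marked] = -amps[marked]               # oracle: phase-flip the match
        mean = sum(amps) / n_items
        amps = [2 * mean - a for a in amps]        # inversion about the mean
    return amps

# Search 64 items in pi/4 * sqrt(64) = 6 oracle calls, not up to 64:
amps = grover(64, marked=13)
assert max(range(64), key=lambda i: amps[i] ** 2) == 13
```

After those 6 iterations the marked item carries over 90% of the measurement probability; a classical search would need 32 lookups on average.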
2 years ago
Hi Jean-François!

Jean-François Morin wrote:Are the code and the algorithms in those books some sort of emulation of how QC might be achieved on an actual quantum computer?

Yes, exactly. There are several ways to simulate a QC on a classical computer, depending on the desired result.
• State Vector Simulation is the most common: reading ("measuring") one qubit produces one bit, so the sim must track the probability of each possible value which can be read from your qubits. For example, an 8-bit number can have 256 possible values, so the sim internally keeps track of all 256 "terms". These probabilities aren't just 0%-100%, they're complex numbers, which means each one is a floating-point 2D vector. In our book, we draw these 2D vectors using "circle notation", which is something I came up with more than a decade ago, as a way of building my own intuitive understanding.
• Density Operator Simulation is like state vector sim, but with more detail. To do this you have to square the number of circles required. So for 8 qubits, instead of 256 circles you need 256*256=65,536 circles.
• Stabilizer Simulation is much faster, and requires much less memory (O(n) instead of O(2^n)), but it can only simulate a restricted set of quantum operations (the so-called Clifford gates).
• Physical Simulation is where the actual physical hardware is simulated in as much detail as possible. It's like simulating the transistors and capacitors of a digital computer, instead of just the bits and bytes. This may involve FDTD photon propagation using Maxwell's equations, or some other form of detailed physical dynamics. These simulations are typically used by teams building the QPUs, not using them. These sims are very time and resource intensive, and only very simple systems can be simulated.

Jean-François Morin wrote: If so, I tend intuitively to think the speed might not be the same... Am I right?

Frank Röber wrote:If emulation would be fast enough, we wouldn't need QC at all...

You are both exactly correct.
• sometimes much faster: For a state vector sim of small quantum states (4 qubits = 16 terms = 256 bytes of RAM), a digital computer's manipulation of the state vector is much faster than a present-day quantum computer can operate. In fact, up to about 26 qubits, simulations can easily be done in a browser, even on a mobile phone.
• sometimes much slower: For a state vector sim of bigger quantum states (40 qubits = 1.1 trillion terms = 16 terabytes of RAM), a digital computer's manipulation of the state vector is much slower than a present-day quantum computer can operate. Alternatively, a physical simulation of even a simple system may take weeks to complete a sim of a few nanoseconds of the device's activity.
• sometimes impossible: For a state vector sim of a medium-sized near-future QC (200 qubits = 1.6x10^60 terms = 2.4x10^52 GB of RAM, which is more RAM than will ever exist on Earth, which has only about 10^80 atoms total), the simulation can never be done on digital computers as we currently know them.
• There are some techniques to do approximate simulations of up to 70 qubits or so, but these are rough and require supercomputers to run.
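The RAM figures in the list above come straight from counting amplitudes: 2^n complex numbers at 16 bytes each (two 64-bit floats). A quick sketch:

```python
# A full state-vector sim stores 2^n amplitudes, each a complex number
# made of two 64-bit floats (16 bytes).
def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

GIB = 2 ** 30
assert state_vector_bytes(26) == 1 * GIB          # fits in a browser tab
assert state_vector_bytes(40) == 16 * 1024 * GIB  # 16 TiB of RAM
# state_vector_bytes(200) is about 2.6e61 bytes: never going to happen.
```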

Jean-François Morin wrote:In other words, I'm trying to make sense of QC and QC algorithms being implemented and run on a non-quantum computer...

Frank Röber wrote:I expect, that it's especially pretty difficult (tricky?) to simulate the superposition state - I wonder how they do that...

You'll be surprised to learn that the simulation itself is not complicated at all. It's simple enough to understand visually, and even do simple cases in your head or on paper. We use this in the book and it works really well. Using one circle to represent each term in a state vector, many QPU operations are easy to visualize.

For example, the book's explanation of the NOT (also called an X-gate) on a quantum superposition of values looks like a simple swap of the values in the state vector.
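In a toy state-vector simulator, that swap is a one-liner (my own sketch, matching the circle-swap picture in the book):

```python
# NOT (X) on qubit q of an n-qubit state vector: swap each amplitude
# with the amplitude whose q-th address bit is flipped. No arithmetic,
# just a reshuffle of the circles.
def x_gate(state, q):
    return [state[i ^ (1 << q)] for i in range(len(state))]

# Superposition 0.6|00> + 0.8|10>; X on qubit 0 flips the low bit:
assert x_gate([0.6, 0.0, 0.8, 0.0], 0) == [0.0, 0.6, 0.0, 0.8]
```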

As the state vectors get larger, we exaggerate the phase markings and colors to make patterns easier to spot.

Doing it without a computer at all: In 2000, I wrote a whitepaper describing a way to do QC simulation with a pen and paper, which is not that useful but very fun. I still use this to work out details of some QPU programs while I'm sitting on the beach or in the park instead of in front of a screen.

2 years ago

David Sachdev wrote:Does your book go into what and why there are algorithms exclusive to Quantum computing?  Does it have information on if it is the speed of quantum computing, the nature of it, or other reasons that the algorithms are targeted at the platform.  Also, it seems more and more that algorithms aren't taught, but people are introduced to libraries where the low level algorithms are presented.

Hi David, thanks for these questions! Based on these, I think you'll really like the structure of our book. We dive straight into the quantum-specific algorithms such as QFT, Grover, Phase estimation, etc. We present them as library functions, and then go into detail about specifically what they're made of, why they work, and what aspects of a QPU (quantum processing unit) make it suited to them. Then, we take these functions and use them to construct higher-level applications, such as Logical search, Shor, and Quantum Supersampling.

By building it up in this way, we hope to convey not just a top-level understanding of what the functions are called, but what they actually do under the hood, so that readers can think about building primitives and applications of their own.

David Sachdev wrote:What are the languages and platforms that are leading the way when it comes to Quantum computing?

The QPU libraries themselves are accessible from a variety of languages. For example, the book sample code has versions which run in Python, JavaScript, C# and OpenQASM. But really if you have a QPU simulator or a physical QPU, the library functions are likely to be accessible through other languages as well (just as libraries for GPUs are).
2 years ago