# Quantum Computing: Schrödinger’s Cat is SO Clever

*The greatest technological leap forward for humanity in the second quarter of the 21st century will be Quantum Computing (QC). It will accelerate computer processing speeds beyond our imagination. It's a weird world; and yet the first QC-inspired computers are already working on problems of mind-boggling complexity.*

Quantum theory was formulated in Europe in the 1920s by Niels Bohr, Werner Heisenberg, Erwin Schrödinger and others, building on Max Planck's quantum hypothesis of 1900, after Einsteinian relativity had already become mainstream. These two branches of physics have never been fully reconciled with one another. Relativity is the physics of the unspeakably large, while quantum theory is the physics of the unimaginably small. They appear to obey different laws. (Mankind, as Sir Arthur Eddington once ventured, is *halfway between the two*.)

The un-mathematicised nub of quantum theory is that matter (particles such as electrons) can exist in two places at once – or even (sorry about this) anywhere in between. Physicists know this from interference and diffraction experiments, and they can describe the phenomenon mathematically. Quantum objects like electrons are sometimes observed as particles and sometimes as waves – or they can be described as both at the same time.

Conventional computers are based on *bytes* made up of eight *bits*. Computer scientists favour the hexadecimal (base 16) number system because each hexadecimal digit corresponds to exactly four bits, so any byte can be written as just two characters. Each bit – the lowest addressable component of your PC, laptop or iPad – is either ON or OFF. One or zero. Binary. And all the computing power that we experience arises from those trillions of microscopic switches on computer chips that are either on or off at any given moment.
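A quick Python sketch (the variable names are my own) makes the binary-to-hexadecimal correspondence concrete: split a byte into its two 4-bit halves and each half is one hex digit.

```python
byte = 0b10110100              # one byte, written out in binary
print(hex(byte))               # two hex digits stand in for eight bits
print(f"{byte:08b}")           # the same value back in binary

# Split the byte into two 4-bit halves ("nibbles"):
high, low = byte >> 4, byte & 0x0F
print(hex(high), hex(low))     # the two hex digits of the byte, separately
```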

Now just try to imagine – if you can – a breed of computer where each addressable bit is not necessarily ON or OFF but *both at the same time, or somewhere in between*. This would mean that computers are no longer limited to processing information on an on or off binary basis. If this sounds barmy, you are in good company: Einstein himself was deeply uncomfortable with the implications of quantum theory. But the fact is that the mathematics of quantum theory works, even if the philosophical outcomes that arise from it are entirely counterintuitive.

In 1926 the Austrian physicist Erwin Schrödinger (1887-1961) devised a mathematical equation built around the *wave function*, which yields the probabilities that a particular particle will have specific characteristics when it is observed and measured by a physicist in an experiment. This equation, on which *quantum mechanics* was constructed, uses "complex" numbers – that is, numbers which do not lie on the conventional number line. Building on the implications of the wave function, in 1935 Schrödinger formulated the thought experiment often referred to as *the paradox of Schrödinger's cat*.
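For readers who have not met them, "complex" numbers are easy to play with in any modern programming language; a minimal Python illustration:

```python
# Complex numbers extend the number line into a plane:
# i (written 1j in Python) is defined by i * i = -1.
z = 3 + 4j
print(z.real, z.imag)   # -> 3.0 4.0, a point in the plane
print(abs(z))           # -> 5.0, its distance from the origin
print((1j) ** 2)        # -> (-1+0j), the "impossible" square of i
```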

Schrödinger imagined a cat sealed inside a box which contains a radioactive substance. The quantum wave function for the substance gives a 50:50 chance that the substance will decay over a given time and, by so doing, detonate a canister of poison which will kill the poor cat. Now, before we open the lid of the box to peek inside, we ask the question: *is the cat alive or dead?* The standard quantum theorist's view, based on the wave-function mathematics, is that, before we peek, *the cat is simultaneously both alive and dead*. It's only when we open the lid that we get to find out what *really* happened to that unfortunate cat[i].

Going back to the bits and bytes inside your computer, quantum mechanics can describe a bit that is simultaneously one and zero – before we get to test what it actually is. Quantum computation uses quantum bits, which can be in any *superposition* of states. Superposition is the concept in quantum mechanics which says that, for example, Schrödinger's wretched cat could be (a) alive, (b) dead, (c) alive and dead… or (for all I know) (d) neither dead nor alive. Quantum computation also exploits another concept from quantum mechanics called *entanglement* – the idea that two particles (or, by extension, computer bits) may be invisibly connected although they reside in different places.
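A toy classical simulation (not real quantum hardware, and all the names are my own) of what "measuring a superposition" means: a qubit carries two amplitudes, and observation forces a random one-or-zero outcome weighted by their squared magnitudes – rather like opening Schrödinger's box over and over again.

```python
import random
from math import sqrt

def measure(alpha: complex, beta: complex) -> int:
    """Collapse a qubit with amplitudes (alpha, beta) to 0 or 1.
    P(0) = |alpha|^2 and P(1) = |beta|^2."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition - "both at once" until we peek:
alpha = beta = 1 / sqrt(2)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly 5,000 each way
```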

In the 1980s the physicist Richard Feynman observed that simulating quantum systems on conventional computers quickly becomes intractable, and proposed that a computer exploiting these weird quantum properties could do the job naturally. Since then huge advances have been made.

And there is right now a company, based in Vancouver, Palo Alto and Washington DC, which is so far the only company to have sold QC-inspired computers. **D-Wave Systems Inc.**, founded by Geordie Rose in 1999, is a private company backed and currently owned by a group of blue-chip venture capital and private equity firms including Kensington Capital Partners, Harris & Harris Group, Goldman Sachs, In-Q-Tel[ii], Draper Fisher Jurvetson, Bezos Expeditions and others. D-Wave Systems is the global leader in the development and manufacture of superconducting quantum computers, which operate at very low temperatures. Its systems are already being used by top companies and institutions including **Lockheed-Martin (NYSE:LMT)**, **Google-Alphabet (NASDAQ:GOOGL)**, NASA, and the University of Southern California (USC).

Geordie Rose, a Canadian ex-wrestler and academic, founded D-Wave Systems after reading a book on quantum computing by Colin Williams, a NASA scientist and a former research assistant to Professor Stephen Hawking. Colin Williams now works at D-Wave. Together, Rose, Williams and a team of top-flight brains developed a computer architecture based on *quantum annealing*, a form of adiabatic quantum computation. (And my spell-checker didn't even flinch.)

In 2010 D-Wave Systems released its first commercial system, the D-Wave One quantum computer. Three years later it unveiled the 512-qubit D-Wave Two system. And in 2015 D-Wave announced that the 1,000-qubit D-Wave 2X system was available. (A *qubit* is the common term for a quantum bit inside a quantum computer.) In addition, D-Wave Systems is developing layers of software to make the systems more accessible to users.

D-Wave has been granted over 110 US patents and has published over 80 peer-reviewed papers in leading scientific journals. But there's a problem here. The academic world is notoriously sceptical about the advance of science by private corporations. Some academics acknowledge that D-Wave's machines are operating adiabatically, but not at quantum speed. Some think the machine is an advanced conventional computer but not a quantum computer at all. Yet nearly all agree that, quantum or not, it is a game-changing advance.

Google teamed up with NASA in 2013 to acquire their D-Wave machine. It is located in Mountain View, California, a few miles from the *Googleplex*. The machine is literally a three-metre-high black box – largely a freezer with a remarkable chip at its core[iii]. The single computer chip is composed not of silicon but of tiny loops of niobium wire – a metal which becomes superconducting when cooled – held at a temperature well below that of deep space, in fact just above absolute zero. Qubits are highly sensitive little blighters – not only must they be kept at super-low temperatures, but a single molecule of air or a mild vibration can disturb them.

There are also a number of quasi-governmental and academic bodies working on quantum computing. The Russians have set up a body called the Russian Quantum Centre which claims to share research for all[iv]. In the UK the Centre for Quantum Computation (CQC) is an alliance of quantum information research groups at the University of Oxford. It was founded in 1998 by Professor Artur Ekert, a pioneer of entanglement-based quantum cryptography. Until recently, the CQC included research groups at the University of Cambridge, but the Cambridge groups now operate as an independent entity called the Cambridge Centre for Quantum Information and Foundations (CQIF)[v]. Lockheed Martin has teamed up with the University of Southern California to form the Quantum Computation Centre (QCC)[vi]. Moreover, it has been reported that China is preparing to launch a communications satellite which relies on quantum principles, the signals from which are intended to be impossible for any eavesdropper to decode. With all this research underway it is likely that D-Wave Systems will have some competition for its machines soon.

To what use might an increase in processing speed in the order of 100 million times current levels be put?

First, as I have outlined before, medicine might be a major beneficiary by collating and interpreting enormous quantities of medical data quickly, using algorithms of a sophistication that we can barely imagine now. I don't see why the NHS couldn't map everyone's genome in the reasonably near future and then track that genome database against the entire population's health outcomes. You can feed in all your aches and pains; the medics will update all your observed conditions and procedures. Very soon you might get an email on your implanted smartphone *before* you have a heart attack saying *Dear Customer – you are about to have a heart attack – a self-driving ambulance with robot paramedics is on its way*. *Don't panic.*

Second, there are still problems in pure mathematics that cannot be solved by the computers of today because it would take billions of years to crunch the numbers. The general rubric of *optimisation* can be applied to a host of problems – from forecasting stock market prices to predicting the impact of climate change – but some optimisation problems may require entirely new mathematical tools. In my piece on European banks in the March edition of MI magazine I explained how Benoit B Mandelbrot envisioned that financial risk management might be revolutionised by fractal mathematics. We are still waiting for the breakthrough.
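To give a flavour of what an "annealer" does, here is a purely classical simulated-annealing sketch (the cost function and parameters are my own invention): it hunts for the minimum of a bumpy landscape by accepting occasional uphill moves while the "temperature" falls. D-Wave's quantum annealing attacks the same family of optimisation problems, but via quantum tunnelling rather than thermal jiggling.

```python
import math
import random

def cost(x: float) -> float:
    # A bumpy landscape with many local minima.
    return x * x + 10 * math.sin(3 * x)

random.seed(1)
x = 10.0                 # start far from any good solution
temperature = 5.0
while temperature > 1e-3:
    candidate = x + random.uniform(-1.0, 1.0)
    delta = cost(candidate) - cost(x)
    # Always step downhill; step uphill with probability exp(-delta/T),
    # which lets the search escape local minima early on.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.99  # cool slowly

print(x, cost(x))        # ends far below the cost of the starting point
```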

Third, the growing problems of cyber-security – hacking and all that – might at last be resolved definitively. It might be possible to do this by anticipating hacking activity before it actually happens, doing Edward Snowden and his ilk out of a job. But this is intimately tied up with AI.

Fourth, self-driving vehicles and even self-flying aircraft will become standard. Lockheed Martin, in collaboration with NASA, is already using a D-Wave Systems machine to optimise flight control systems for a next-generation supersonic passenger jet to be delivered around 2030.

Most importantly, and fifth, quantum computing will accelerate the arrival of Artificial Intelligence (AI). Geordie Rose thinks that machines will "outpace us" by 2028[vii]. Alphabet/Google is already using the D-Wave 2X machine in its AI programme, for example to test image-recognition software for mobile phones. There are already cases where computers write the algorithms that other computer systems use. Personally, I am a mild AI sceptic, in the sense that I believe, for philosophical reasons, that machines can crunch numbers *ad nauseam*; but can you programme creativity and intuition into things that are incapable of any sensory experience? The human brain, after all, may not be computational at all. That said, I am sure that there will be machines that pass the Turing Test[viii] in operation well within my lifetime. And I am now persuaded that space missions will reach the exoplanets sooner than we think – but with no "humans" on board.

Sometimes it seems that technology is accelerating so rapidly that our brains are being subjected to intolerable G-forces. We’d better get used to it. Schrödinger’s cat is much cleverer than we can grasp – even though it’s probably dead!

[i] See a discussion of the paradox in *Quantum of Solitude* by Jon Cartwright, New Scientist, 16 July 2016, page 30.

[ii] In-Q-Tel is the venture capital arm of the CIA.

[iii] See Clive Thompson’s article in Wired of 20 May 2014 available at: http://www.dwavesys.com/media-coverage/wired-age-quantum-computing-has-almost-arrived

[v] See: http://www.qi.damtp.cam.ac.uk/

[vi] See: http://www.isi.edu/research_groups/quantum_computing/home

[vii] See Metro, Friday, 01 July 2016.

[viii] A machine that could trick you into thinking it was human if obscured behind a curtain passes the *Turing Test* – Alan Turing's criterion for an intelligent machine. (Strictly, a *Turing machine* is something different: Turing's abstract mathematical model of a general-purpose computer.)
