Wednesday, November 1, 2017

A Beginner’s Guide to Quantum Computing (Future Computing)

How can you get more and more out of less and less? The smaller computers get, the more powerful they seem to become: there's more number-crunching ability in a 21st-century cellphone than you'd have found in a room-sized, military computer 50 years ago. Yet, despite such amazing advances, there are still plenty of complex problems that are beyond the reach of even the world's most powerful computers—and there's no guarantee we'll ever be able to tackle them. One problem is that the basic switching and memory units of computers, known as transistors, are now approaching the point where they'll soon be as small as individual atoms. If we want computers that are smaller and more powerful than today's, we'll soon need to do our computing in a radically different way. Entering the realm of atoms opens up powerful new possibilities in the shape of quantum computing, with processors that could work millions of times faster than the ones we use today. Sounds amazing, but the trouble is that quantum computing is hugely more complex than traditional computing and operates in the Alice in Wonderland world of quantum physics, where the "classical," sensible, everyday laws of physics no longer apply. What is quantum computing and how does it work? Let's take a closer look!
You probably think of a computer as a neat little gadget that sits on your lap and lets you send emails, shop online, chat to your friends, or play games—but it's much more and much less than that. It's more, because it's a completely general-purpose machine: you can make it do virtually anything you like. It's less, because inside it's little more than an extremely basic calculator, following a prearranged set of instructions called a program. Like the Wizard of Oz, the amazing things you see in front of you conceal some pretty mundane stuff under the covers.
How Quantum Computers Work.

Conventional computers have two tricks that they do really well: they can store numbers in memory and they can process stored numbers with simple mathematical operations (like add and subtract). They can do more complex things by stringing together the simple operations into a series called an algorithm (multiplying can be done as a series of additions, for example). Both of a computer's key tricks—storage and processing—are accomplished using switches called transistors, which are like microscopic versions of the switches you have on your wall for turning on and off the lights. A transistor can either be on or off, just as a light can either be lit or unlit. If it's on, we can use a transistor to store a number one (1); if it's off, it stores a number zero (0). Long strings of ones and zeros can be used to store any number, letter, or symbol using a code based on binary (so computers store an upper-case letter A as 01000001 and a lower-case one as 01100001). Each of the zeros or ones is called a binary digit (or bit) and, with a string of eight bits, you can represent 256 different values (enough for characters such as A-Z, a-z, 0-9, and most common symbols).

Computers calculate by using circuits called logic gates, which are made from a number of transistors connected together. Logic gates compare patterns of bits, stored in temporary memories called registers, and then turn them into new patterns of bits—and that's the computer equivalent of what our human brains would call addition, subtraction, or multiplication. In physical terms, the algorithm that performs a particular calculation takes the form of an electronic circuit made from a number of logic gates, with the output from one gate feeding in as the input to the next.
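To make those two ideas concrete, here's a tiny Python sketch (our own illustration, not part of the original article): it prints the eight-bit patterns a computer uses to store "A" and "a", and wires XOR and AND together into a half adder, the simplest logic-gate circuit that adds two bits.

```python
def to_bits(ch: str) -> str:
    """Return the 8-bit binary pattern used to store one character."""
    return format(ord(ch), "08b")

def half_adder(a: int, b: int):
    """Add two single bits using logic gates: XOR gives the sum bit, AND gives the carry bit."""
    return a ^ b, a & b

print(to_bits("A"))        # 01000001
print(to_bits("a"))        # 01100001
print(half_adder(1, 1))    # (0, 1): one plus one is binary 10
```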
The trouble with conventional computers is that they depend on conventional transistors. This might not sound like a problem if you go by the amazing progress made in electronics over the last few decades. When the transistor was invented, back in 1947, the switch it replaced (which was called the vacuum tube) was about as big as one of your thumbs. Now, a state-of-the-art microprocessor (single-chip computer) packs hundreds of millions (and up to two billion) transistors onto a chip of silicon the size of your fingernail! Chips like these, which are called integrated circuits, are an incredible feat of miniaturization. Back in the 1960s, Intel co-founder Gordon Moore realized that the power of computers doubles roughly every 18 months—and it's been doing so ever since. This apparently unshakeable trend is known as Moore's Law. It sounds amazing, and it is, but it misses the point. The more information you need to store, the more binary ones and zeros—and transistors—you need to do it. Since most conventional computers can only do one thing at a time, the more complex the problem you want them to solve, the more steps they'll need to take and the longer they'll take to do it. Some computing problems are so complex that they need more computing power and time than any modern machine could reasonably supply; computer scientists call those intractable problems.
As Moore's Law advances, the number of intractable problems diminishes: computers get more powerful and we can do more with them. The trouble is, transistors are just about as small as we can make them: we're getting to the point where the laws of physics seem likely to put a stop to Moore's Law. Unfortunately, there are still hugely difficult computing problems we can't tackle because even the most powerful computers find them intractable. That's one of the reasons why people are now getting interested in quantum computing.
What is quantum computing?
Quantum theory is the branch of physics that deals with the world of atoms and the smaller (subatomic) particles inside them. You might think atoms behave the same way as everything else in the world, in their own tiny little way—but that's not true: on the atomic scale, the rules change and the "classical" laws of physics we take for granted in our everyday world no longer automatically apply. As Richard P. Feynman, one of the greatest physicists of the 20th century, once put it: "Things on a very small scale behave like nothing you have any direct experience about... or like anything that you have ever seen." (Six Easy Pieces, p116.)
If you've studied light, you may already know a bit about quantum theory. You might know that a beam of light sometimes behaves as though it's made up of particles (like a steady stream of cannonballs), and sometimes as though it's waves of energy rippling through space (a bit like waves on the sea). That's called wave-particle duality and it's one of the ideas that comes to us from quantum theory. It's hard to grasp that something can be two things at once—a particle and a wave—because it's totally alien to our everyday experience: a car is not simultaneously a bicycle and a bus. In quantum theory, however, that's just the kind of crazy thing that can happen. The most striking example of this is the baffling riddle known as Schrödinger's cat. Briefly, in the weird world of quantum theory, we can imagine a situation where something like a cat could be alive and dead at the same time!
What does all this have to do with computers? Suppose we keep on pushing Moore's Law—keep on making transistors smaller until they get to the point where they obey not the ordinary laws of physics (like old-style transistors) but the more bizarre laws of quantum mechanics. The question is whether computers designed this way can do things our conventional computers can't. If we can predict mathematically that they might be able to, can we actually make them work like that in practice?
People have been asking those questions for several decades. Among the first were IBM research physicists Rolf Landauer and Charles H. Bennett. Landauer opened the door for quantum computing in the 1960s when he proposed that information is a physical entity that could be manipulated according to the laws of physics. One important consequence of this is that computers waste energy manipulating the bits inside them (which is partly why computers use so much energy and get so hot, even though they appear to be doing not very much at all). In the 1970s, building on Landauer's work, Bennett showed how a computer could circumvent this problem by working in a "reversible" way, implying that a quantum computer could carry out massively complex computations without using massive amounts of energy. In 1981, physicist Paul Benioff from Argonne National Laboratory tried to envisage a basic machine that would work in a similar way to an ordinary computer but according to the principles of quantum physics. The following year, Richard Feynman sketched out roughly how a machine using quantum principles could carry out basic computations. A few years later, Oxford University's David Deutsch (one of the leading lights in quantum computing) outlined the theoretical basis of a quantum computer in more detail. How did these great scientists imagine that quantum computers might work?

What would a quantum computer be like in reality?
Instead of ordinary bits, a quantum computer stores information in qubits (quantum bits): unlike a bit, which is definitely a 1 or definitely a 0, a qubit can exist in a superposition of both states at once. In reality, qubits would have to be stored by atoms, ions (atoms with too many or too few electrons) or even smaller things such as electrons and photons (energy packets), so a quantum computer would be almost like a table-top version of the kind of particle physics experiments they do at Fermilab or CERN! Now you wouldn't be racing particles round giant loops and smashing them together, but you would need mechanisms for containing atoms, ions, or subatomic particles, for putting them into certain states (so you can store information), knocking them into other states (so you can make them process information), and figuring out what their states are after particular operations have been performed.

In practice, there are lots of possible ways of containing atoms and changing their states using laser beams, electromagnetic fields, radio waves, and an assortment of other techniques. One method is to make qubits using quantum dots, which are nanoscopically tiny particles of semiconductors inside which individual charge carriers, electrons and holes (missing electrons), can be controlled. Another method makes qubits from what are called ion traps: you add or take away electrons from an atom to make an ion, hold it steady in a kind of laser spotlight (so it's locked in place like a nanoscopic rabbit dancing in a very bright headlight), and then flip it into different states with laser pulses. In another technique, the qubits are photons inside optical cavities (spaces between extremely tiny mirrors). Don't worry if you don't understand; not many people do! Since the entire field of quantum computing is still largely abstract and theoretical, the only thing we really need to know is that qubits are stored by atoms or other quantum-scale particles that can exist in different states and be switched between them.
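If you'd like to see what "different states at once" means in numbers, here's a toy simulation in Python with NumPy (purely illustrative, run on an ordinary classical computer, and nothing like a real laboratory setup): a single qubit is just a pair of complex amplitudes, a Hadamard gate puts it into an equal superposition of 0 and 1, and measuring it repeatedly gives 0 about half the time and 1 about half the time.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)               # a qubit prepared in the state |0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # gate that creates a superposition

state = hadamard @ ket0              # amplitudes (1/sqrt(2), 1/sqrt(2)): "both 0 and 1 at once"
probabilities = np.abs(state) ** 2   # Born rule: the chance of reading 0 or 1

samples = np.random.choice([0, 1], size=1000, p=probabilities)
print(probabilities)                 # [0.5 0.5]
print(np.bincount(samples))          # roughly 500 zeros and 500 ones
```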

What can quantum computers do that ordinary computers can't?

1) Artificial Intelligence

A primary application for quantum computing is artificial intelligence (AI). AI is based on the principle of learning from experience, becoming more accurate as feedback is given, until the computer program appears to exhibit “intelligence.” This feedback is based on calculating the probabilities for many possible choices, and so AI is an ideal candidate for quantum computation. It promises to disrupt every industry, from automobiles to medicine, and it’s been said AI will be to the twenty-first century what electricity was to the twentieth. For example, Lockheed Martin plans to use its D-Wave quantum computer to test autopilot software that is currently too complex for classical computers, and Google is using a quantum computer to design software that can distinguish cars from landmarks. We have already reached the point where AI is creating more AI, and so its importance will rapidly escalate.

2) Molecular Modeling

Another example is precision modeling of molecular interactions, finding the optimum configurations for chemical reactions. Such “quantum chemistry” is so complex that only the simplest molecules can be analyzed by today’s digital computers. Chemical reactions are quantum in nature as they form highly entangled quantum superposition states. But fully developed quantum computers would not have any difficulty evaluating even the most complex processes. Google has already made forays in this field by simulating the energy of hydrogen molecules. The implications are more efficient products, from solar cells to pharmaceutical drugs, and especially fertilizer production; since fertilizer accounts for 2 percent of global energy usage, the consequences for energy and the environment would be profound.
3) Cryptography
Most online security currently depends on the difficulty of factoring large numbers into primes. While this can presently be accomplished by using digital computers to search through every possible factor, the immense time required makes “cracking the code” expensive and impractical. Quantum computers can perform such factoring exponentially more efficiently than digital computers, meaning such security methods will soon become obsolete. New cryptography methods are being developed, though it may take time: in August 2015 the NSA began introducing a list of quantum-resistant cryptography methods, and in April 2016 the National Institute of Standards and Technology began a public evaluation process lasting four to six years. There are also promising quantum encryption methods being developed using the one-way nature of quantum entanglement. City-wide networks have already been demonstrated in several countries, and Chinese scientists recently announced they successfully sent entangled photons from an orbiting “quantum” satellite to three separate base stations back on Earth.
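To see why factoring protects today's encryption, here's a small classical Python sketch (our own illustration of the brute-force search the paragraph describes, not Shor's algorithm): trial division happily factors small numbers, but the work grows so fast with the size of the number that the several-hundred-digit numbers used in real cryptography are far out of reach for any classical machine.

```python
def trial_factor(n: int):
    """Classical brute force: try every possible divisor up to the square root of n."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_factor(15))      # [3, 5]
print(trial_factor(2021))    # [43, 47]
```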
4) Financial Modeling
Modern markets are some of the most complicated systems in existence. While we have developed increasingly scientific and mathematical tools to address this, the field still suffers from one major difference from other scientific fields: there’s no controlled setting in which to run experiments. To solve this, investors and analysts have turned to quantum computing. One immediate advantage is that the randomness inherent to quantum computers is well matched to the stochastic nature of financial markets. Investors often wish to evaluate the distribution of outcomes under an extremely large number of scenarios generated at random. Another advantage quantum offers is that financial operations such as arbitrage may require many path-dependent steps, the number of possibilities quickly outpacing the capacity of a digital computer.
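For a concrete feel for "evaluating the distribution of outcomes under a large number of scenarios generated at random", here is a purely classical Monte Carlo sketch in Python (our own illustration; the return figures are invented for the example, and no quantum speed-up is involved here).

```python
import numpy as np

rng = np.random.default_rng(seed=1)
# 10,000 simulated years of daily returns for a $1,000 position (made-up parameters)
daily_returns = rng.normal(loc=0.0003, scale=0.01, size=(10_000, 252))
final_values = 1000 * np.prod(1 + daily_returns, axis=1)

print(round(final_values.mean(), 2))              # average outcome across scenarios
print(round(np.percentile(final_values, 5), 2))   # 5th percentile: a "bad year"
```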
5) Weather Forecasting
NOAA Chief Economist Rodney F. Weiher claims that nearly 30 percent of the US GDP ($6 trillion) is directly or indirectly affected by weather, impacting food production, transportation, and retail trade, among others. The ability to better predict the weather would have enormous benefit to many fields, not to mention more time to take cover from disasters. While this has long been a goal of scientists, the equations governing such processes contain many, many variables, making classical simulation lengthy. As quantum researcher Seth Lloyd pointed out, “Using a classical computer to perform such analysis might take longer than it takes the actual weather to evolve!” This motivated Lloyd and colleagues at MIT to show that the equations governing the weather possess a hidden wave nature which is amenable to solution by a quantum computer. Hartmut Neven, director of engineering at Google, also noted that quantum computers could help build better climate models that could give us more insight into how humans are influencing the environment. These models are what we build our estimates of future warming on, and help us determine what steps need to be taken now to prevent disasters. The United Kingdom’s national weather service, the Met Office, has already begun investing in such innovation to meet the power and scalability demands it will face in the 2020-plus timeframe, and has released a report on its own requirements for exascale computing.
6) Particle Physics
Coming full circle, a final application of this exciting new physics might be… studying exciting new physics. Models of particle physics are often extraordinarily complex, confounding pen-and-paper solutions and requiring vast amounts of computing time for numerical simulation. This makes them ideal for quantum computation, and researchers have already been taking advantage of this. Researchers at the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI) recently used a programmable quantum system to perform such a simulation. In work published in Nature, the team used a simple version of a quantum computer in which ions performed logical operations, the basic steps in any computer calculation. The simulation showed excellent agreement with actual experiments of the physics described. “These two approaches complement one another perfectly,” says theoretical physicist Peter Zoller. “We cannot replace the experiments that are done with particle colliders. However, by developing quantum simulators, we may be able to understand these experiments better one day.”

Investors are now scrambling to insert themselves into the quantum computing ecosystem, and it’s not just the computer industry: banks, aerospace companies, and cybersecurity firms are among those taking advantage of the computational revolution. While quantum computing is already impacting the fields listed above, the list is by no means exhaustive, and that’s the most exciting part. As with all new technology, presently unimaginable applications will be developed as the hardware continues to evolve and create new opportunities.
Today's Quantum Computers.
Quantum computers could one day replace silicon chips, just like the transistor once replaced the vacuum tube. But for now, the technology required to develop such a quantum computer is beyond our reach. Most research in quantum computing is still very theoretical. The most advanced quantum computers have not gone beyond manipulating 16 qubits, meaning that they are a far cry from practical application. However, the potential remains that quantum computers one day could perform, quickly and easily, calculations that are incredibly time-consuming on conventional computers. Several key advancements have been made in quantum computing in the last few years. Let's look at a few of the quantum computers that have been developed.

1998

Los Alamos and MIT researchers managed to spread a single qubit across three nuclear spins in each molecule of a liquid solution of alanine (an amino acid used to analyze quantum state decay) or trichloroethylene (a chlorinated hydrocarbon used for quantum error correction). Spreading out the qubit made it harder to corrupt, allowing researchers to use entanglement to study interactions between states as an indirect method for analyzing the quantum information.

2000

In March, scientists at Los Alamos National Laboratory announced the development of a 7-qubit quantum computer within a single drop of liquid. The quantum computer uses nuclear magnetic resonance (NMR) to manipulate particles in the atomic nuclei of molecules of trans-crotonic acid, a simple fluid consisting of molecules made up of six hydrogen and four carbon atoms. The NMR is used to apply electromagnetic pulses, which force the particles to line up. These particles in positions parallel or counter to the magnetic field allow the quantum computer to mimic the information-encoding of bits in digital computers.

In August, researchers at IBM's Almaden Research Center developed what they claimed was the most advanced quantum computer to date. The 5-qubit quantum computer was designed to allow the nuclei of five fluorine atoms to interact with each other as qubits, be programmed by radio frequency pulses, and be detected by NMR instruments similar to those used in hospitals. Led by Dr. Isaac Chuang, the IBM team was able to solve in one step a mathematical problem that would take conventional computers repeated cycles. The problem, called order-finding, involves finding the period of a particular function, a typical aspect of many mathematical problems involved in cryptography.
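For the curious, here's what "finding the period of a particular function" means, written as a tiny classical brute-force sketch in Python (our own illustration, not the quantum method the IBM team used): the order of a modulo N is the smallest r > 0 with a^r mod N = 1, and a classical computer has to step through the exponents one at a time.

```python
def order(a: int, n: int) -> int:
    """Smallest r > 0 such that a**r % n == 1 (a and n must share no common factor)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

print(order(7, 15))    # 4, because 7**4 = 2401 leaves remainder 1 when divided by 15
```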

2001

Scientists from IBM and Stanford University successfully demonstrated Shor's Algorithm on a quantum computer. Shor's Algorithm is a method for finding the prime factors of numbers (which plays an intrinsic role in cryptography). They used a 7-qubit computer to find the factors of 15. The computer correctly deduced that the prime factors were 3 and 5.
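How does knowing a period give you the factors of 15? Here is the classical post-processing step of Shor's algorithm in Python (a sketch of the textbook recipe, not a description of the IBM-Stanford experiment itself): once the period r of a^x mod N is known, here r = 4 for a = 7 and N = 15, the factors fall out of two greatest-common-divisor calculations.

```python
from math import gcd

a, N = 7, 15
r = 4                            # period of 7**x mod 15 (the part a quantum computer finds)
p = gcd(a ** (r // 2) - 1, N)    # gcd(48, 15) = 3
q = gcd(a ** (r // 2) + 1, N)    # gcd(50, 15) = 5
print(p, q)                      # 3 5
```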

2005

The Institute for Quantum Optics and Quantum Information at the University of Innsbruck announced that scientists had created the first qubyte, or series of 8 qubits, using ion traps.

2006

Scientists in Waterloo and Massachusetts devised methods for quantum control on a 12-qubit system. Quantum control becomes more complex as systems employ more qubits.

2007

Canadian startup company D-Wave demonstrated a 16-qubit quantum computer. The computer solved a sudoku puzzle and other pattern matching problems. The company claimed at the time that it would produce practical systems by 2008. Skeptics believe practical quantum computers are still decades away, that the system D-Wave has created isn't scalable, and that many of the claims on D-Wave's Web site are simply impossible (or at least impossible to know for certain given our understanding of quantum mechanics).

If functional quantum computers can be built, they will be valuable in factoring large numbers, and therefore extremely useful for decoding and encoding secret information. If one were to be built today, no information on the Internet would be safe. Our current methods of encryption are simple compared to the complicated methods possible in quantum computers. Quantum computers could also be used to search large databases in a fraction of the time that it would take a conventional computer. Other applications could include using quantum computers to study quantum mechanics, or even to design other quantum computers. But quantum computing is still in its early stages of development, and many computer scientists believe the technology needed to create a practical quantum computer is years away. Quantum computers must have at least several dozen qubits to be able to solve real-world problems, and thus serve as a viable computing method.

How far off are quantum computers?

Three decades after they were first proposed, quantum computers remain largely theoretical. Even so, there's been some encouraging progress toward realizing a quantum machine. There were two impressive breakthroughs in 2000. First, Isaac Chuang (now an MIT professor, but then working at IBM's Almaden Research Center) used five fluorine atoms to make a crude, five-qubit quantum computer. The same year, researchers at Los Alamos National Laboratory figured out how to make a seven-qubit machine using a drop of liquid. Five years later, researchers at the University of Innsbruck added an extra qubit and produced the first quantum computer that could manipulate a qubyte (eight qubits).
These were tentative but important first steps. Over the next few years, researchers announced more ambitious experiments, adding progressively greater numbers of qubits. By 2011, a pioneering Canadian company called D-Wave Systems announced in Nature that it had produced a 128-qubit machine. Three years later, Google announced that it was hiring a team of academics (including University of California at Santa Barbara physicist John Martinis) to develop its own quantum computers based on D-Wave's approach. In March 2015, the Google team announced they were "a step closer to quantum computation," having developed a new way for qubits to detect and protect against errors. In 2016, MIT's Isaac Chuang and scientists from the University of Innsbruck unveiled a five-qubit, ion-trap quantum computer that could calculate the factors of 15; one day, a scaled-up version of this machine might evolve into the long-promised, fully fledged encryption buster! There's no doubt that these are hugely important advances. Even so, it's very early days for the whole field—and most researchers agree that we're unlikely to see practical quantum computers appearing for many years—perhaps even decades.




Bibliography 

1. Explainthatstuff (by Chris Woodford), 2017


2. SingularityHub (by Mark Jackson), 25.06.2017
3. Howstuffworks (by Kevin Bonsor & Jonathan Strickland)

Sources

  • "12-qubits Reached In Quantum Information Quest." Science Daily, May 2006. http://www.sciencedaily.com/releases/2006/05/060508164700.htm
  • Aaronson, Scott. "Shtetl-Optimized." April 10, 2007. http://scottaaronson.com/blog
  • Bone, Simone and Matias Castro. "A Brief History of Quantum Computing." Imperial College, London, Department of Computing. 1997. http://www.doc.ic.ac.uk/~nd/surprise_97/journal/vol4/spb3/
  • Boyle, Alan. "A quantum leap in computing." MSNBC, May 18, 2000. http://www.msnbc.msn.com/id/3077363
  • "Center for Extreme Quantum Information Theory (xQIT), MIT." TechNews, March 2007. http://www.technologynewsdaily.com/node/6280
  • Centre for Quantum Computer Technology http://www.qcaustralia.org/
  • Cory, D.G., et al. "Experimental Quantum Error Correction." American Physical Society, Physical Review Online Archive, September 1998. http://prola.aps.org/abstract/PRL/v81/i10/p2152_1
  • Grover, Lov K. "Quantum Computing." The Sciences, July/August 1999. http://cryptome.org/qc-grover.htm
  • Hogg, Tad. "An Overview of Quantum Computing." Quantum Computing and Phase Transitions in Combinatorial Search. Journal of Artificial Intelligence Research, 4, 91-128 (1996). http://www.cs.cmu.edu/afs/cs/project/jair/pub/volume4/hogg96a-html/node6.html
  • "IBM's Test-Tube Quantum Computer Makes History." IBM Research, December 19, 2001. http://domino.watson.ibm.com/comm/pr.nsf/pages/ news.20011219_quantum.html
  • Institute for Quantum Computing. http://www.iqc.ca
  • Jonietz, Erika. "Quantum Calculation." Technology Review, July 2005. http://www.technologyreview.com/Infotech/14591
  • Maney, Kevin. "Beyond the PC: Atomic QC." USA Today. http://www.amd1.com/quantum_computers.html
  • "Quantum Computing." Stanford Encyclopedia of Philosophy, February 26, 2007. http://plato.stanford.edu/entries/qt-quantcomp
  • Qubit.org http://www.qubit.org
  • Simonite, Tom. "Flat 'ion trap' holds quantum computing promise." NewScientistTech, July 2006. http://www.newscientisttech.com/article/dn9502-flat-ion-trap-holds-quantum-computing-promise.html
  • Vance, Ashlee. "D-Wave qubits in the era of Quantum Computing." The Register, February 13, 2007. http://www.theregister.co.uk/2007/02/13/dwave_quantum
  • West, Jacob. "The Quantum Computer." Computer Science at CalTech, April 28, 2000. http://www.cs.caltech.edu/~westside/quantum-intro.html







