Why quantum computing?

Some companies, such as IBM and Google, claim we might be close to practically useful quantum computers, as they continue to cram more qubits together and build more accurate devices.

Not everybody is convinced that quantum computers are worth the effort. Some mathematicians believe there are obstacles that are practically impossible to overcome, putting quantum computing forever out of reach.

Twenty-seven years ago, Peter Shor showed how a quantum computer could dramatically outpace any known classical method for the problem of factoring integers, thereby breaking the widely used cryptographic codes underlying much of online commerce.
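
To make the connection to factoring concrete, here is a toy Python sketch (my own illustration, not Shor's presentation) of the classical half of the algorithm: if you can find the order r of a random number a modulo N, simple arithmetic usually hands you a factor of N. The brute-force order-finding below is exactly the step a quantum computer would replace.

```python
# Toy illustration of the classical half of Shor's algorithm: knowing the
# order r of a modulo N, gcd(a**(r//2) - 1, N) usually reveals a factor.
# The quantum computer's only job is the order-finding step, done here by
# brute force -- the part that blows up classically for large N.
import math
import random

def find_order_bruteforce(a: int, n: int) -> int:
    """Smallest r > 0 with a**r == 1 (mod n). Exponentially slow in general."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n: int) -> int:
    """Return a nontrivial factor of n (assumes n is odd and composite)."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                      # lucky: a already shares a factor with n
        r = find_order_bruteforce(a, n)   # the step a quantum computer speeds up
        if r % 2 == 1:
            continue                      # need an even order; try another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                      # trivial case; try another a
        return math.gcd(y - 1, n)

print(shor_classical_part(15))  # 3 or 5
print(shor_classical_part(21))  # 3 or 7
```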

We now know how to do the same for a few other problems, but only by exploiting the special mathematical structure in those problems. Compounding the difficulty is that if you want to talk honestly about quantum computing, you also need the conceptual vocabulary of theoretical computer science. How much faster would a quantum computer be? A million times? A billion? Those are the wrong questions, because what matters is how the number of steps scales with the size of the problem, n. A genuine quantum speedup could mean taking a problem where the best classical algorithm needs a number of steps that grows exponentially with n, and solving it using a number of steps that grows only as n². In such cases, for small n, solving the problem with a quantum computer will actually be slower and more expensive than solving it classically.
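
To see why small n favors the classical machine, here is a back-of-the-envelope sketch; the 1,000x per-step overhead assigned to the quantum computer is an arbitrary assumption for illustration.

```python
# Why scaling, not raw speed, is the point. Suppose (hypothetically) each
# quantum step costs 1,000 times more than a classical step. Compare a
# classical algorithm taking 2**n steps with a quantum one taking n**2 steps.
CLASSICAL_COST_PER_STEP = 1          # arbitrary unit
QUANTUM_COST_PER_STEP = 1_000        # assumed overhead, purely illustrative

for n in (10, 20, 30, 40, 50, 60):
    classical = (2 ** n) * CLASSICAL_COST_PER_STEP
    quantum = (n ** 2) * QUANTUM_COST_PER_STEP
    winner = "quantum" if quantum < classical else "classical"
    print(f"n={n:3d}  classical={classical:>22,}  quantum={quantum:>12,}  -> {winner}")
```

Under these made-up constants the quantum machine pulls ahead somewhere between n = 10 and n = 20; the exact crossover is meaningless, but the shape of the comparison is the whole argument.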

Alas, it turns out to be staggeringly hard to prove that problems are hard, as illustrated by the famous P versus NP problem, which asks, roughly, whether every problem with quickly checkable solutions can also be quickly solved. And even if the theoretical speedups are real, there remains the engineering challenge of building the machine at all. That problem, in a word, is decoherence: unwanted interaction between a quantum computer and its environment, such as nearby electric fields, warm objects, and other things that can record information about the qubits.

The only known solution to this problem is quantum error correction: a scheme, proposed in the mid-1990s, that cleverly encodes each qubit of the quantum computation into the collective state of dozens or even thousands of physical qubits. But researchers are only now starting to make such error correction work in the real world, and actually putting it to use will take much longer.
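
For a feel of how such encoding works, here is a minimal sketch of the three-qubit bit-flip repetition code, the simplest ancestor of modern schemes; real codes such as the surface code use far more physical qubits and also correct phase errors, and the amplitudes chosen below are arbitrary.

```python
# Minimal sketch of the 3-qubit bit-flip repetition code: one logical qubit
# spread over three physical qubits, so that a single bit-flip error can be
# detected and undone without ever learning the encoded amplitudes.
import numpy as np

def encode(a: complex, b: complex) -> np.ndarray:
    """Encode the logical qubit a|0> + b|1> as a|000> + b|111>."""
    state = np.zeros(8, dtype=complex)
    state[0b000], state[0b111] = a, b
    return state

def apply_x(state: np.ndarray, qubit: int) -> np.ndarray:
    """Flip `qubit` (0 = leftmost) in every basis state."""
    mask = 1 << (2 - qubit)
    out = np.empty_like(state)
    for i in range(8):
        out[i ^ mask] = state[i]
    return out

def syndrome(state: np.ndarray) -> tuple[int, int]:
    """Parities of (qubit0, qubit1) and (qubit1, qubit2); same in every branch."""
    i = int(np.flatnonzero(np.abs(state) > 1e-12)[0])
    b0, b1, b2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
    return b0 ^ b1, b1 ^ b2

# Prepare a logical qubit, hit one physical qubit with a bit-flip error...
a, b = 0.6, 0.8j                       # arbitrary normalized amplitudes
logical = encode(a, b)
damaged = apply_x(logical, 1)          # error on the middle qubit

# ...then diagnose and repair it without ever reading a or b.
fix = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome(damaged)]
repaired = damaged if fix is None else apply_x(damaged, fix)
print("recovered:", np.allclose(repaired, logical))   # True
```

Even this toy code survives only a single bit flip; real machines must also correct phase errors and must do so continuously, which is why dozens or thousands of physical qubits per logical qubit keep coming up.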

Goldman Sachs recently announced that it could introduce quantum algorithms to price financial instruments in as soon as five years. But why are firms like Goldman taking this leap, especially with commercial quantum computers possibly still years away?

At its core, the digital computer is an arithmetic machine. It made performing mathematical calculations cheap, and its impact on society has been immense. Advances in both hardware and software have made it possible to apply all sorts of computing to products and services.

Without computers, we would never have reached the moon or put satellites in orbit. But the more complicated the code, the more processing power it requires and the longer the processing takes.

What this means is that, for all their advances, from self-driving cars to beating grandmasters at chess and Go, there remain tasks that traditional computing devices struggle with, even when the task is dispersed across millions of machines. A particular weakness is a category of calculation called combinatorics.
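
A toy example of that combinatorial blow-up: brute-forcing the shortest round trip through n points means checking (n − 1)! orderings, which is hopeless long before n gets interesting. The random coordinates below are purely illustrative.

```python
# Combinatorial explosion in miniature: exhaustively searching every route
# through n cities requires (n-1)! permutations.
import itertools
import math
import random

def shortest_tour(cities: list[tuple[float, float]]) -> float:
    """Exhaustively check every ordering of the cities after the first."""
    first, rest = cities[0], cities[1:]
    best = math.inf
    for perm in itertools.permutations(rest):
        route = (first, *perm, first)
        length = sum(math.dist(route[i], route[i + 1]) for i in range(len(route) - 1))
        best = min(best, length)
    return best

random.seed(0)
for n in (5, 8, 10):
    cities = [(random.random(), random.random()) for _ in range(n)]
    print(f"{n} cities -> {math.factorial(n - 1):>9,} routes, best = {shortest_tour(cities):.3f}")
```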

The rocket ship can take you to places you couldn't otherwise go, but if you make navigational errors, you will end up in the wrong location; you won't end up at B. To be very clear: it would be inaccurate to say that a QC runs programs faster than a PC or an x86 server. A "program" for a QC is a very different order of beast than anything ever produced for a binary processor.

The translation of a mathematical problem intelligible to college professors into a binary program, and the translation of the same problem into a QC program, are as different from one another as "20 Questions" is from billiards. There are several fundamental compromises when you move into the realm of quantum computing. Here's one that's daunting just by itself: solutions will rarely be exact or definitive.

A QC is not a deterministic machine; in other words, there is no singular solution for which any other result would be an error. Instead, a QC will tend to render sets of answers with their respective probabilities. If that doesn't discourage you, get prepared for this: The atom-level device that actually performs the quantum calculations will, as a result of its work and as is its nature, self-destruct when it's done.
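
Here is a sketch of what such a set of answers might look like. The amplitudes are invented for illustration, standing in for whatever final state a real quantum program would produce, and the "shots" loop mimics running the machine many times and tallying the outcomes.

```python
# "Sets of answers with their respective probabilities" in practice: a
# hypothetical final state over 2-bit answers, read out by repeated runs.
import collections
import random

amplitudes = {          # made-up final amplitudes, purely illustrative
    "00": 0.1 + 0.0j,
    "01": 0.7 + 0.0j,
    "10": 0.0 + 0.1j,
    "11": 0.0 + 0.7j,
}
# Born rule: probability of reading an answer = |amplitude|**2.
probs = {k: abs(v) ** 2 for k, v in amplitudes.items()}
total = sum(probs.values())
probs = {k: p / total for k, p in probs.items()}   # normalize the toy state

shots = 10_000
counts = collections.Counter(
    random.choices(list(probs), weights=list(probs.values()), k=shots)
)
for answer, count in sorted(counts.items()):
    print(f"{answer}: measured {count}/{shots} times (ideal {probs[answer]:.2f})")
```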

A quantum computing mechanism would actually be a machine that automatically builds the computing device out of atoms (calcium atoms are good candidates), sustains the operating conditions of that device for the duration of its program, applies the program, allows it to execute, looks the other way (because quantum logic gates are shy and will explode if anyone sees them), interprets the final state of its registers as the final probability table of results, then resets itself to rebuild another mechanism all over again.

Imagine if Alan Turing's incredible machine (the one that cracked the Nazi "Enigma" code) was guaranteed to explode after every run. QC engineers prefer the term "collapse," but let's call it what it is: explode. And imagine if Turing, an ingenious engineer, devised an automated manufacturing operation that rebuilt that machine out of new parts, each and every day. Every quantum computer engineer has done more than imagine such a scheme; they have drawn up plans for just such a device at the quantum scale.

Indeed, such hypothetical "on paper" schemes are called Turing machines. Quantum engineers believe their computers can and will work, because their Turing machine experiments give them cause for faith. Are there real-world applications of quantum computing technology, or some derivative of it, that people are putting to good use right now? Put another way, what does quantum actually do, and whom does it directly serve?

Now to the more controversial question: Assume someone built a mechanism that successfully leaps over the hurdles imposed by quantum physics, producing a full quantum computer capable of performing all the tasks currently relegated to the realm of theory and simulation.

What do experts in this field think a quantum computer should be able to do, assuming every phenomenon that physicists have theorized, and that scientists have observed and verified, is ultimately exploitable? The word "computer" here is used in a very basic sense: not a handheld device or a cooled server with a processor and memory.

Think of a computer the way Charles Babbage or John von Neumann considered it: as a mechanism guaranteed to deliver a certain output given a specific set of inputs and a defined configuration. At the deepest microscopic levels of a modern microprocessor, one logic unit is what these fellows would have called a computer. Every classical electronic computer exploits the natural behavior of electrons to produce results in accordance with Boolean logic: for any two specific input states, one certain output state.
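
A tiny sketch of that determinism, using a NAND gate (my choice, since any Boolean circuit can be built from NAND): for each pair of input bits there is exactly one output bit, on every run.

```python
# The classical "computer" in the Babbage/von Neumann sense: for any two
# specific input bits, one certain output bit, every time.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

for a in (0, 1):
    for b in (0, 1):
        print(f"NAND({a}, {b}) = {nand(a, b)}")   # same answer on every run
```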

Here, the basic unit of transaction is the binary digit ("bit"), whose state is either 0 or 1. In a conventional semiconductor, these two states are represented by low and high voltage levels within transistors. In a quantum computer, the structure is radically different. Instead of transistors, a quantum computer obtains its qubits by bombarding atoms with electrical fields at perpendicular angles to one another, which lines up the ions while keeping them conveniently and evenly separated.

When these ions are separated by just enough space, their orbiting electrons become the home addresses, if you will, for qubits. While a conventional computer focuses on voltage, a quantum system is concerned instead with one property of electrons at the quantum level, called spin. Yes, this has to do with the electron's angular momentum.

The reason we use the term "quantum" at the subatomic level of physics is the indivisibility of what we may observe, such as the amount of energy in a photon (a particle of light). Spin is one of these delightfully indivisible components, representing the angular momentum of an electron as it orbits the nucleus of an atom. It's the up or down state of electron spin that corresponds to the '1' and '0' of the typical binary digit. Yet it's here where quantum computing makes a sharp turn into a logical black hole, through a tunnel of white noise, and jettisons us helplessly into a whimsically devious universe whose laws and principles seem concocted by the University of Toontown.

A qubit maintains the quantum state for one electron. When no one is looking at it, it can attain the '1' and '0' states simultaneously. If you look at it, you won't see this happen, and if it was happening before, it immediately stops. This is literally true. Yet the fact that the qubit's electron was spinning in both directions at once is verifiable after the fact.
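
A sketch of that behavior for a single qubit, modeled as two complex amplitudes: the Hadamard gate used here is the standard operation for producing an equal superposition, and the simulated measurement collapses the state so that a second look always agrees with the first.

```python
# One qubit "being 0 and 1 at once," and what looking at it does. The qubit
# is just two complex amplitudes; the Hadamard gate puts |0> into an equal
# superposition; measuring forces it to pick 0 or 1 and stay there.
import numpy as np

rng = np.random.default_rng(42)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def measure(state: np.ndarray) -> tuple[int, np.ndarray]:
    """Sample an outcome with probability |amplitude|**2 and collapse."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if rng.random() < p0 else 1
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0
    return outcome, collapsed

# Prepare |0>, apply H, measure -- repeat many times to see both answers.
counts = [0, 0]
for _ in range(1000):
    qubit = H @ np.array([1.0, 0.0], dtype=complex)   # equal superposition
    outcome, qubit = measure(qubit)
    counts[outcome] += 1
    # A second look gives the same answer: the superposition is gone.
    assert measure(qubit)[0] == outcome
print("zeros:", counts[0], "ones:", counts[1])   # roughly 500 / 500
```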


