Smart Advice About Quantum Computing?

This paper digs into the slow but steady progress toward practical quantum computing and how its benefits and risks could affect humanity, drawing on an analysis of its probable practicality and a survey of the technology available today.

GPUs took the path of parallel processing many years ago and did not immediately suffer when CPU makers began failing to keep pace with Moore’s Law, but GPU development has now slowed as well, suggesting there is also an upper limit to how many parallel stream processors can usefully operate together. So for how long can extra cores simply be tacked on? Will we manage the transition to quantum computing before computer technology hits a genuine stagnation point? Or are we wrong to pursue quantum computing at all, and will some as-yet-unimagined technology be discovered instead? Unfortunately, the answers depend largely on what is most profitable for the industry’s large corporations, which fund much of the research that could unlock the processing platforms of the future.


The aim of this article is to examine the potential of quantum computing and the way it could affect mankind, tracing its history and looking at what awaits in the future.


The analysis consistently points to how important quantum computing will be to all mankind once it is eventually realised.

A desktop PC or cell phone cannot simply be adapted for practical quantum computing. Quantum computing exploits the properties of subatomic particles and the laws of quantum mechanics. Today’s computers hold bits in either a 1 or a 0 state; qubits, however, can be in both states at the same time.

A quantum computer is a new kind of computer that uses the unusual properties of quantum physics to solve problems that are intractable for regular computers. It does this by using qubits instead of bits. Like bits, qubits can represent a one or a zero; what makes them special is that a qubit can also be in a superposition of both. A qubit can thus be one and zero at the same time, which is what gives quantum computers the potential to be exponentially more powerful than their conventional counterparts on certain problems.
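
As a rough illustration of the superposition idea, here is a small Python sketch of my own (not from the original article, and only a toy): it represents a single qubit as a pair of complex amplitudes and simulates measuring it, reproducing the 50/50 statistics of an equal superposition.

```python
# A toy illustration (not a real quantum computer): a qubit's state can be
# written as two amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
# Measuring yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
import random
from math import sqrt

def measure(alpha: complex, beta: complex) -> int:
    """Collapse the superposition alpha|0> + beta|1> to a classical bit."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition: the qubit carries both possibilities until measured.
alpha, beta = 1 / sqrt(2), 1 / sqrt(2)
samples = [measure(alpha, beta) for _ in range(1000)]
print("fraction of 1s:", sum(samples) / len(samples))   # roughly 0.5
```

The point is only that the state holds both outcomes at once, and a measurement collapses it to a single classical bit.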

By exploiting superposition, quantum computers can tackle problems that would be impossible, or would take thousands of years, for classical machines. They are expected to dramatically outperform classical computers on calculations involving large numbers of equally possible solutions.
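
The best-known example of this kind of advantage is Grover’s search algorithm (my choice of example; the article itself does not name it). The sketch below simulates it classically by tracking the full amplitude vector, so it offers no speedup of its own and only works for tiny sizes, but it shows how repeated sign-flip and inversion-about-the-mean steps concentrate probability on the marked item. The function and parameter names are my own.

```python
# A minimal classical simulation of Grover's search over N = 2**n_qubits items.
# It keeps the whole amplitude vector in memory, so it is only feasible for
# small n_qubits and merely illustrates the algorithm's logic.
from math import sqrt, pi, floor

def grover(n_qubits: int, marked: int) -> list:
    n = 2 ** n_qubits
    amp = [1 / sqrt(n)] * n                 # uniform superposition over all items
    for _ in range(floor(pi / 4 * sqrt(n))):
        amp[marked] = -amp[marked]          # oracle: flip the sign of the marked item
        mean = sum(amp) / n
        amp = [2 * mean - a for a in amp]   # diffusion: inversion about the mean
    return [a * a for a in amp]             # measurement probabilities

probs = grover(n_qubits=3, marked=5)
print(probs.index(max(probs)), max(probs))  # item 5 with probability ~0.95
```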

Quantum computers will most probably be applied to breaking codes and optimizing complex systems, on the strength of their ability to analyse vast numbers of combinations. Researchers also expect that quantum computers will be able to accurately model events at the molecular scale, providing a powerful instrument for biology, physics, and chemistry research.

CISC is a CPU design that enables the processor to handle more complex instructions from the software at the cost of speed. All Intel processors for PCs are CISC processors. Complex instruction set computing is one of the two main types of processor design in use today. It has slowly been losing ground to RISC designs; most of the fastest processors in the world today are RISC or RISC-like at their core. The most popular current CISC architecture is the x86, but some 68xx, 65xx, and Z80 processors are also still in use. A CISC processor is intended to execute a relatively large number of different instructions, each taking a different amount of time to execute depending on its complexity. Contrast with RISC.

A complex instruction-set computer (CISC) has a CPU designed with a comprehensive set of assembly instructions, which yields smaller binaries but generally slower execution of each individual instruction.

One important assumption in circuit design is that all circuit elements are ‘lumped’. This means that signal transmission time from one element to another is insignificant: the time it takes for a signal produced at one point in the circuit to propagate to the rest of the circuit is tiny in relation to the times involved in the circuit’s operation.

Electrical signals travel at roughly the speed of light. Suppose a processor runs at 1 GHz, that is, one billion clock cycles per second, so one clock cycle lasts one billionth of a second, or a nanosecond. Light travels about 30 cm in a nanosecond. The circuitry operating at such clock speeds must therefore be much smaller than 30 cm, say around 3 cm at most. Bearing in mind that an actual CPU core is less than 1 cm on a side, this is still fine; but that is only for 1 GHz.

If the clock speed is increased to 100 GHz, a cycle lasts 0.01 nanoseconds, and a signal can only travel 3 mm in that time. The CPU core would then need to be on the order of 0.3 mm in size, and it would be very difficult to cram a CPU core into so small a space. So somewhere between 1 GHz and 100 GHz a physical barrier appears. Moreover, as smaller and smaller transistors are manufactured, another limit looms: the number of electrons per transistor approaches one, and the device’s behaviour comes ever closer to the quantum state of a single electron.
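
The arithmetic behind these figures is easy to check. The short snippet below, added here purely for illustration, computes how far light travels in one clock cycle at a few clock frequencies, which is a hard upper bound on how far any signal can propagate per cycle.

```python
# How far light travels in one clock cycle at various clock frequencies.
C = 299_792_458  # speed of light in m/s

for freq_ghz in (1, 10, 100):
    cycle_s = 1 / (freq_ghz * 1e9)          # one clock period in seconds
    distance_cm = C * cycle_s * 100         # distance light covers, in cm
    print(f"{freq_ghz:>3} GHz: {distance_cm:.2f} cm per cycle")
# 1 GHz -> ~30 cm, 10 GHz -> ~3 cm, 100 GHz -> ~0.3 cm (3 mm)
```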

A working quantum computer is expected to be able to:

Break encrypted secret messages in seconds that classical computers could not crack in a million years.

Create unbreakable encryption systems to shield national security systems, secure Internet transactions, financial transactions, and other systems based on present day encryption schemes.

Advance cryptography to the point where messages can be transmitted and retrieved securely, without conventional encryption and without undetected eavesdropping.

Search large, unsorted databases that have been virtually impenetrable to classical computers.

Improve pharmaceutical research, because a quantum computer could sift through vast numbers of chemical substances and interactions in seconds.

On the other hand, in the wrong hands a quantum computer could:

Cripple national security, the Internet, email systems, defences, and other systems based on current encryption schemes.

Decode secret messages sent out by government employees in seconds versus the millions of years it would take a classical computer.

Break many of the cryptographic systems (e.g., RSA, DSS, LUC, Diffie-Hellman) used to protect encrypted mail, secure Web pages, and many other types of data.

Break cryptographic systems such as public key ciphers or other systems used to protect secure Web pages and email on the Internet.

The idea of quantum computing was first explored in the 1970s and early 1980s by physicists and computer scientists such as Charles H. Bennett of the IBM Thomas J. Watson Research Center, Paul A. Benioff of Argonne National Laboratory in Illinois, David Deutsch of the University of Oxford, and the late Richard P. Feynman of the California Institute of Technology (Caltech). The idea emerged as scientists debated the fundamental limits of computation. They realised that if technology continued to follow Moore’s Law, the continually shrinking circuitry packed onto silicon chips would reach a point where individual elements were no bigger than a few atoms. A problem then arose because, at the atomic scale, the physical laws that govern the behaviour and properties of a circuit are inherently quantum mechanical in nature, not classical. This raised the question of whether a new kind of computer could be devised on the basis of the principles of quantum physics.

Feynman was the first to offer an answer, producing an abstract model in 1982 that showed how a quantum system could be used to perform computations. He also explained how such a machine could serve as a simulator for quantum physics; in other words, a physicist would be able to conduct experiments in quantum physics inside a quantum mechanical computer.

In 1985, Deutsch realised that Feynman’s idea could lead to a general-purpose quantum computer, and he published a crucial theoretical paper showing that any physical process could, in principle, be modelled perfectly by a quantum computer. A quantum computer would therefore have capabilities far beyond those of any traditional classical computer. The search for interesting applications began immediately after Deutsch’s publication.

Unfortunately, all that could be found were a few rather contrived mathematical problems, until 1994, when Shor circulated a preprint of a paper in which he set out a method for using quantum computers to crack an important problem in number theory: factorization. He showed how an ensemble of mathematical operations, designed specifically for a quantum computer, could be organised so that such a machine could factor huge numbers extremely rapidly, far faster than is possible on conventional computers. With this breakthrough, quantum computing was transformed from a mere academic curiosity into a matter of national and worldwide interest.
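
To give a feel for why factoring reduces to period finding, here is a purely classical sketch of the number-theoretic reduction behind Shor’s algorithm (the code and helper names are my own illustration, not Shor’s presentation). The quantum computer’s only job is the period-finding step; here it is done by brute force, so the example is feasible only for tiny numbers such as 15 or 21.

```python
# Classical sketch of the reduction used by Shor's algorithm: once the period r
# of a**x mod n is known, factors of n drop out via greatest common divisors.
from math import gcd
from random import randrange

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 (the step a quantum computer would do)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n: int) -> int:
    """Return a non-trivial factor of an odd composite n."""
    while True:
        a = randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g                      # lucky guess: a already shares a factor with n
        r = find_period(a, n)
        if r % 2 == 1:
            continue                      # need an even period; try another a
        x = pow(a, r // 2, n)
        if x == n - 1:
            continue                      # trivial square root of 1; try another a
        return gcd(x - 1, n)              # non-trivial factor of n

if __name__ == "__main__":
    print(shor_factor(15))   # prints 3 or 5
    print(shor_factor(21))   # prints 3 or 7
```

Shor’s insight was that a quantum Fourier transform finds the period r exponentially faster than the brute-force loop above; once r is known, the factors fall out with a few greatest-common-divisor computations.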

Right now, quantum computers and quantum information technology are still at a pioneering stage, and the obstacles being overcome will provide the knowledge needed to push quantum computers toward becoming the fastest computational machines in existence. This has not been without problems, but the field is nearing a stage where researchers may have the tools they need to build a computer robust enough to withstand the effects of decoherence. There is still good reason for hope in quantum hardware: progress so far suggests it is only a matter of time before the physical and practical breakthroughs arrive that will allow Shor’s and other quantum algorithms to be tested at full scale. Such a breakthrough would leave today’s computers permanently behind. Quantum computation has its origin in highly specialised fields of theoretical physics, but its future undoubtedly lies in the profound effect it will have in shaping and improving mankind’s future.
