An Introduction to Quantum Computing

Classical computers have become the backbone of 21st-century life, and their processing power seems more than adequate for modern problems. Right? Well, not exactly.

Classical computers rely on electrical signals representing digital 1s and 0s, and they store numbers and data in binary (a base-2 number system). This two-digit system has practical benefits: it is easier to work with 2 digits than with the 10 of the decimal system, it lets computers perform true/false logic directly, and it maps naturally onto electrical signals that are simply on or off.
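As a rough illustration (plain Python, not tied to any particular hardware), the same number can be written out as binary digits, each of which is effectively a true/false value:

```python
# A quick look at how a number is stored as binary digits (bits).
number = 42

# Python's bin() shows the base-2 representation of the number.
print(bin(number))   # -> 0b101010

# Each bit is a true/false value, which is what makes Boolean
# (true/false) logic on hardware so straightforward.
bits = [bool(int(b)) for b in format(number, "08b")]
print(bits)          # -> [False, False, True, False, True, False, True, False]
```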

In the past 10 years, though, many tech companies have been pushing beyond this two-digit system. They have found that classical computing has its limits, even in powerful supercomputers. So we now find ourselves on the brink of a new era led by an area of computer science known as quantum computing.

Developed to address the limitations of classical computing, quantum computing uses the principles of quantum theory, which describes the behavior of energy and matter at the atomic and subatomic level. Quantum bits, or qubits, are built from subatomic particles such as electrons or photons, which can exist in multiple states at once. For example, binary requires a piece of information to be explicitly a 1 or a 0, but a qubit can hold both values at the same time, a property known as superposition. In theory, this could allow us to perform calculations that would otherwise take millions of years.
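To make the contrast concrete, here is a minimal sketch (plain Python with NumPy; the vector notation and gate used here are standard textbook conventions, not something from this article) of a single qubit placed into an equal superposition of 0 and 1 using a Hadamard gate:

```python
import numpy as np

# The basis states |0> and |1>, written as 2-element vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = (1 / np.sqrt(2)) * np.array([[1,  1],
                                 [1, -1]])

state = H @ ket0
print(state)           # -> [0.7071 0.7071]

# Measurement probabilities are the squared amplitudes: unlike a
# classical bit, the qubit is "both" 0 and 1 until it is measured.
probabilities = np.abs(state) ** 2
print(probabilities)   # -> [0.5 0.5]
```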

IBM is currently leading the field: it made the first quantum computer publicly available over the cloud back in May 2016. The company already operates more than 20 quantum systems, and Jay Gambetta, vice president of IBM Quantum, is looking to add four more quantum computing systems by 2025.