We hear the word ‘quantum’ everywhere these days. What is quantum, and what is quantum computing? Is it simply a much faster computer, the father of supercomputers?
The answer is no. Think of the difference between a candle and an Edison light bulb: you cannot combine any number of candles to produce what a bulb can. The same is true of existing computing and quantum computing; it is a fundamentally different technology.
So, let’s start our journey by understanding quantum physics.
When we discuss quantum, we are talking about the nature of minuscule particles. It is science at the atomic and subatomic levels: particles, waves, and the electrons that orbit atomic nuclei of protons and neutrons. These are the tiniest building blocks of the known universe. Quantum physics helps us understand the behaviour, or mechanics, of ‘the tiny’. Scientists generally work in a world where events and outcomes are predictable and measurable. When we heat water, we expect it to boil. When we drop something, we know that gravity takes it to the ground. Thousands of years of accumulated scientific evidence have given us a reasonable understanding of the world. To our surprise, however, discoveries throughout the 20th century demonstrated that nature at the quantum scale plays by a different set of rules, exhibiting behaviours that can be confusing, illogical, and unpredictable.
For example, recording the speed and location of a football kicked into the air is quite easy with contemporary instruments. In the quantum world, measuring the speed and location of a particle is not nearly as simple. In fact, we cannot precisely measure both at once: the more accurately we know a particle's position, the less accurately we can know its momentum, and vice versa. The unmeasured quantity can only be described through probability. As scientists probe deeper into quantum mechanics, the discoveries become even more confounding. For example, it was discovered that a wave such as light could behave like a particle, and particles like waves.
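This trade-off is Heisenberg's uncertainty principle, which puts a hard lower bound on the product of the uncertainties in position and momentum (here ħ is the reduced Planck constant):

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```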
This was not intuitive, as waves and particles were thought to occupy different physical domains. Of particular significance to quantum computing is that specific atoms and particles, when forced into a quantum state called a ‘superposition’, were found to maintain multiple conflicting states simultaneously. Unlike classical computing, where a bit is either a one or a zero, a quantum bit, called a qubit, can be a one, a zero, or both at the same time. This is the foundation of why quantum computing, when fully realized, is predicted to be significantly faster at certain computations than our classical computing today. Our discoveries in the quantum realm now help us understand what keeps electrons orbiting around an atom and how atoms bond to produce molecules, and they are beginning to help us unlock the secrets of dark matter and black holes. Quantum physics is helping us make sense of the smallest and largest phenomena in the universe and everything in between.
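In the standard notation of quantum mechanics, a qubit in superposition is written as a weighted combination of the two basis states; the squared magnitudes of the complex amplitudes α and β give the probabilities of measuring a zero or a one:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```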
That said, there is still so much more to know; our greatest and most surprising discoveries are likely still ahead. What we already understand of quantum mechanics enables us to create innovations, notably quantum computing, quantum cryptography, and a possible quantum internet. These game-changing capabilities are a direct result of our understanding of the science of the very small and our ability to begin to leverage its remarkable power.
So, what is quantum computing?
After several decades of success with microchips, we are now approaching their physical limitations. At some point, chips and their transistors will close in on the scale of atoms, where quantum effects take over. Electrons that are supposed to flow predictably will behave unpredictably and may pass through solid barriers, or even appear mysteriously in another part of the chip, in what is called quantum tunnelling. So at some point further miniaturization will simply not be possible. However, this does not mean the end of binary computing for some time; we can still squeeze plenty of performance out of it, and we should anticipate that it will coexist with any new emerging computing platforms.
Quantum computing is a completely different way to do computing. It is different from binary computing: the algorithms are different, and the way you perform computation is different. You need to start training now to be prepared for it, and there are opportunities to learn about quantum computing on a classical computer, so you are ready when a practical quantum computer does arrive. The difference between classical and quantum computing can be superficially summarised as follows. While classical computing is limited to processing one of two states at any given time, a one or a zero (binary bits), quantum computing leverages quantum bits, or qubits. A qubit is an induced property of a subatomic particle that can represent a one, a zero, both, and any value in between, surprisingly, all at the same time. This is a quantum phenomenon called superposition.
To explain this, put on your holiday hat and imagine looking out at the ocean watching waves, or dropping a rock into a pond to watch the ripples. If you drop a second rock into the pond at a different place, you see the waves combine and form superpositions: troughs and crests add to and subtract from each other in a coherent way that can be constructive or destructive. Can we have a computational model that manages a zero, a one, or both at the same time? That model is the qubit. A qubit can hold multiple states simultaneously, which is called superposition: the quantum-mechanical property that allows waves and particles to be in multiple states until measured, at which point they collapse to either a zero or a one. Qubits are created by changing the state of certain atoms and other quantum-scale particles; a nucleus, an electron, or even a photon can be used. We can use electromagnetic fields, laser beams, radio waves, and other techniques to turn an electron into a qubit. Another way to picture the difference between a binary position and a superposition is to imagine a coin. When you flip a coin, you get either heads or tails; that is a classical bit (a binary zero or one). But if you spin the coin so that you see both heads and tails simultaneously, as well as every state in between, the coin is in a superposition.
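To make the coin picture concrete, here is a minimal Python sketch (plain NumPy, not tied to any quantum SDK; the Hadamard gate and state vectors are standard textbook objects) that puts a simulated qubit into an equal superposition and then ‘stops the coin’ by sampling measurements:

```python
import numpy as np

# Basis states: |0> ("heads") and |1> ("tails") as column vectors.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0  # amplitudes [1/sqrt(2), 1/sqrt(2)]

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print("P(0), P(1):", probs)  # ~[0.5, 0.5]

# "Stopping the spinning coin": sample 1000 measurements.
rng = np.random.default_rng(seed=42)
outcomes = rng.choice([0, 1], size=1000, p=probs)
print("zeros:", np.sum(outcomes == 0), "ones:", np.sum(outcomes == 1))
```

Roughly half the samples come out zero and half one; a real qubit behaves the same way at the moment of measurement.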
The other key concept in quantum computing is entanglement. This is when two particles with no physical connection respond to each other as if they know about each other. It is so peculiar and unexplainable that Albert Einstein called it ‘spooky action at a distance’; like many of his physicist colleagues, he was known to be uncomfortable with this unexplained phenomenon. To explain it more clearly, let's go back to our coins. Spin two coins on a table. If we stop one of the coins, the outcome will be heads or tails. If the two coins were entangled, the other coin would also stop, and it would land showing the opposite face to the one we stopped. It is as if both coins know about each other: no matter which one we stop first and which side it rests on, the other will stop and rest on the opposite side. In quantum entanglement, these coins are particles, and they can be any distance from each other; in fact, they could be on opposite sides of the universe. When one particle is observed, the other gives the exact opposite measurement. They are connected without communicating. How? No one knows yet, though there are theories. Perhaps, as some believe, quantum entanglement is in some way holding the universe together.
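Under the same assumptions as the sketch above (plain NumPy, purely illustrative), the coin pair corresponds to the entangled two-qubit state (|01⟩ + |10⟩)/√2, whose measurement outcomes are always opposite:

```python
import numpy as np

# Two-qubit basis ordering: index 0 -> |00>, 1 -> |01>, 2 -> |10>, 3 -> |11>.
# The entangled state (|01> + |10>)/sqrt(2) always yields opposite outcomes.
state = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)

probs = np.abs(state) ** 2          # [0, 0.5, 0.5, 0]
rng = np.random.default_rng(seed=7)

for idx in rng.choice(4, size=8, p=probs):
    a, b = divmod(idx, 2)           # decode the two qubits from the index
    print(f"qubit A = {a}, qubit B = {b}")  # always opposite, like the coins
```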
So how does quantum entanglement factor into quantum computing? Previously, we explored the quantum equivalent of a classical bit, the qubit. A qubit ‘spins’ until observed, then assumes an up or down position, a one or a zero. If two qubits are placed into an entangled state, then the moment one qubit's value is known, we immediately know the other. If we tie together many qubits, their entangled partners are known at the time of measurement. Let's look at how quickly this stacks up. Two qubits represent four states: (0,0), (0,1), (1,0), and (1,1). Three represent eight states: (0,0,0), (0,0,1), (0,1,0), (0,1,1), (1,0,0), (1,0,1), (1,1,0), and (1,1,1). Twenty entangled qubits can represent two to the power of twenty, or over one million, values. That is remarkably efficient. Compared to classical computing, where each bit, a one or a zero, is evaluated in sequence, ‘qubits at scale’ can produce enormous computational outputs in parallel.
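A quick way to feel this exponential growth: the short script below (plain Python, illustrative) prints how many basis states n qubits span, and how much memory a classical simulator would need to hold the full state vector.

```python
# Each extra qubit doubles the number of amplitudes in the state vector.
for n in (2, 3, 10, 20, 30):
    states = 2 ** n
    # A classical simulator stores one complex number (~16 bytes) per amplitude.
    print(f"{n:>2} qubits -> {states:>13,} states ({states * 16:,} bytes to simulate)")
```

By around 50 qubits the state vector alone would need petabytes of classical memory, which is why even simulating modest quantum machines is hard.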
So far, we have talked about all the cool stuff. Now let's discuss a concept called quantum decoherence and what it means. Qubits are created using specific atomic particles, or even photons (particles of light). The electron of a phosphorus atom is one example. To begin the process of creating such a qubit, the atom is placed in a superconducting magnet, which forces the electron into what we call a spin-down position. At normal temperatures the electron's spin direction is not stable, as heat produces energy that forces the electron to move, so the system must be housed in a supercooled refrigerator to create stability. For now, the need for large cooling systems ensures that quantum computers will stay quite big for some time. To get the electron to spin up, a pulse of microwaves is fired at it; stopping the microwaves at a point anywhere between spin-down and spin-up creates a superposition. The result is a stable, coherent qubit. Any change in temperature, light, sound, vibration, or other external factors will impact the qubit's state. Today's big technical challenge is producing those quantum states of matter in the laboratory, and ultimately in quantum hardware, so that we have intrinsically, inherently, more coherent and more stable qubits. The art of keeping qubits stable still needs to be mastered.
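The ‘stop the microwaves partway’ step can be pictured as a rotation of the spin state. In this simplified NumPy sketch (the rotation operator is the standard R_x gate; treating the rotation angle as pulse duration is an idealization), halting at a quarter turn leaves an equal superposition:

```python
import numpy as np

def rx(theta):
    """Rotate a qubit about the X axis by angle theta (standard R_x gate)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s],
                     [-1j * s, c]])

spin_down = np.array([1, 0], dtype=complex)  # start in |0> ("spin-down")

# Sweep the "pulse duration": theta = 0 leaves spin-down, theta = pi flips
# to spin-up, and stopping at theta = pi/2 leaves an equal superposition.
for theta in (0.0, np.pi / 2, np.pi):
    state = rx(theta) @ spin_down
    p_down, p_up = np.abs(state) ** 2
    print(f"theta = {theta:.2f}: P(down) = {p_down:.2f}, P(up) = {p_up:.2f}")
```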
The quantum computer has three main components:
- An area that houses the qubits
- A method for transferring signals to the qubits
- A classical computer to execute a program and send instructions to the qubits
Applications and use cases of quantum computing
Quantum computing (QC) is not a silver bullet for everything, but it has a few specific use cases with the potential to make a huge impact.
Quantum machine learning – ML comes with a high computational cost, especially for complex models, and this has constrained the field's scope and development. A potential solution is quantum software that enables faster machine learning, since quantum computing could process certain complex problems in far less time.
Drug design and development – QC could play a pivotal role in reducing the trial and error of drug development. It could help us understand drugs and their effects on humans, potentially saving a great deal of money and time in drug research.
Cryptography and cybersecurity – QC has the potential to crack the encryption protecting bitcoin (R1) and, on the defensive side, to counter cyber-attacks through encryption that is unbreakable in principle, known as quantum cryptography.
Financial modelling – QC can execute Monte Carlo simulations, which are used to find the right mix for a fruitful investment, delivering high-quality outcomes at low cost. It can also be used for algorithmic trading, automating share dealing.
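For context, here is a tiny classical Monte Carlo sketch (plain NumPy; the lognormal asset model and every parameter are invented purely for illustration) of the kind of expectation-value estimate that quantum algorithms such as amplitude estimation aim to accelerate:

```python
import numpy as np

# Classical Monte Carlo estimate of the expected payoff of a call option
# under a simple lognormal asset model (all parameters are illustrative).
rng = np.random.default_rng(seed=0)

s0, strike = 100.0, 105.0       # spot price and strike
mu, sigma, t = 0.05, 0.2, 1.0   # drift, volatility, horizon (years)

n = 100_000
z = rng.standard_normal(n)
s_t = s0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
payoff = np.maximum(s_t - strike, 0.0)

print(f"expected payoff ~ {payoff.mean():.2f} +/- {payoff.std() / np.sqrt(n):.2f}")
```

A classical estimate's error shrinks as 1/√n with the number of samples; quantum amplitude estimation targets roughly 1/n, a quadratic speed-up.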
Logistics optimization – QC can play a big role in the scheduling and logistics workflows associated with supply-chain management. Quantum annealing is an optimization technique that could optimize these workflows in near real-time.
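Quantum annealing hardware is specialized, but its classical cousin, simulated annealing, conveys the idea: gradually ‘cool’ a random search so it settles into a low-cost solution. A toy sketch (the delivery-route cost function and cooling schedule are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Toy logistics problem: order 8 delivery stops to minimize total travel.
stops = rng.uniform(0, 100, size=(8, 2))  # random (x, y) coordinates

def route_length(order):
    pts = stops[order]
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

order = np.arange(len(stops))
best = route_length(order)

temp = 50.0
while temp > 0.1:
    i, j = rng.choice(len(stops), size=2, replace=False)
    trial = order.copy()
    trial[i], trial[j] = trial[j], trial[i]  # propose swapping two stops
    delta = route_length(trial) - best
    # Always accept improvements; accept worse moves with cooling probability.
    if delta < 0 or rng.random() < np.exp(-delta / temp):
        order, best = trial, best + delta
    temp *= 0.999  # cooling schedule

print("route length after annealing:", round(best, 1))
```

A quantum annealer explores the same kind of cost landscape, but uses quantum effects such as tunnelling rather than thermal jumps to escape poor solutions.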
Weather forecasting – Quantum machine learning could help with weather pattern recognition, which would in turn help us predict extreme weather events.
So, when is quantum computing going to hit the mainstream? The usual answer is that we are about ten years out from mainstream quantum computing; however, that has been said every year for many years already. Recently, Microsoft created a preview offering in Azure called Azure Quantum, along with the Quantum Development Kit, which could be a good starting point for anyone. Microsoft has built a robust ecosystem for quantum computing: Azure Quantum is an open ecosystem providing diverse hardware, software, and solutions from a rich partner ecosystem, and Q# and the Quantum Development Kit provide a way to develop quantum applications on Azure.
Microsoft offers $10,000 in credits (R3) to get started on Azure Quantum and investigate potential use cases.
Quantum computing may be the most fascinating, and potentially the most disruptive, new technology we have ever encountered. From the pure wonder of the physics of the very small to the potential human impact of a massively superior computing platform, this is an area of consequence. Still, a lot of research is needed in this space: from creating more system stability, to implementing fault tolerance, to supporting greater numbers of qubits, to building out the ecosystem of software, hardware, and service providers, our challenges are not trivial. We also need to establish standards. In a nutshell, quantum can play a big role in the fourth industrial revolution.
Azure Quantum resource links:
- Learn more about Azure Quantum
- Start developing with the Quantum Development Kit
- Get an overview of Microsoft quantum computing technology
- Read the Microsoft Quantum blog
- Access quantum learning material
References:
- R1: https://www.notebookcheck.net/Scientists-estimate-that-quantum-computers-may-become-powerful-enough-to-crack-the-Bitcoin-encryption-in-a-decade.597437.0.html
- R2: https://analyticsindiamag.com/top-applications-of-quantum-computing-everyone-should-know-about/
- R3: https://azure.microsoft.com/en-gb/services/quantum/#features