As a Data Scientist and Researcher, I always try to find answers to the problems I come across every day. Working on real-world problems, I have faced many complexities both in time and computation. There have been many cases where the classical machine learning and deep learning algorithms failed to work, and my computer ended up crashing.
During the lockdown, I stumbled upon a cool new sci-fi series called Devs streaming on Hulu. Devs explores quantum computing and scientific research that is actually happening now, around the world. This led me to think about Quantum Theory, how Quantum Computing came to be, and how Quantum Computers can be used to make future predictions.
After researching further I found Quantum Machine Learning (QML), a concept that was pretty new to me at the time. This field is both exciting and useful; it could help resolve issues with computational and time complexities, like those that I faced. Hence, I chose QML as a topic for further research and decided to share my findings with everyone.
Quantum Machine Learning is a theoretical field that’s just starting to develop. It lies at the intersection of Quantum Computing and Machine Learning.

The main goal of Quantum Machine Learning is to speed things up by applying what we know from quantum computing to machine learning. The theory of Quantum Machine Learning takes elements from classical Machine Learning theory and views quantum computing through that lens.
Contents
This post will cover the following main topics:
- Comparison of Classical Programming with Classical Machine Learning and Quantum Machine Learning
- All the Basic Concepts of Quantum Computing
- How Quantum Computing Can Improve Classical Machine Learning Algorithms
Classical Programming vs. Classical Machine Learning vs. Quantum Machine Learning
To compare Classical Programming, Classical Machine Learning, and Quantum Machine Learning, let’s consider the simple problem of determining whether a number is even or odd.
The solution is simple enough: first you need to get a number from the user, then you divide the number by two. If you get a remainder, then that number is odd. If you don’t get a remainder, then that number is even.
If you want to write this particular program using the classical programming approach, you would follow three steps:
- Get the input
- Process the input
- Produce the output
This is the workflow of the classical programming paradigm.

The processing is done through the rules which we have defined for the classification of the number — even or odd.
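The three steps above can be sketched in a few lines of Python (a minimal illustration; the rule is written out explicitly by the programmer):

```python
def classify_number(n: int) -> str:
    """Classify a number as even or odd using an explicit, hand-written rule."""
    # Rule: a number is even exactly when dividing by 2 leaves no remainder.
    return "even" if n % 2 == 0 else "odd"

# Get the input, process it with the rule, produce the output.
print(classify_number(7))   # odd
print(classify_number(42))  # even
```

The key point is that the classification logic never has to be discovered: we encode it directly.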
Similarly, let’s look at how we would solve this particular problem using a Machine Learning approach. In this case things are a bit different. First we create a set of input and output values. Here, the approach would be to feed the input and expected output together to a machine learning model, which should learn the rules. With machine learning we don’t tell the computer how to solve the problem; we set up a situation in which the program will learn to do so itself.
Mathematically speaking, our aim is to find f, given x and y, such that:
y = f(x)
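The learning approach can be sketched with a toy "learner" written from scratch (a minimal sketch for illustration; a real project would use a machine learning library). Here the model is fed inputs and expected outputs together and searches for the bit of the number that best predicts the label, rather than being told the rule:

```python
def learn_parity_rule(examples, width=8):
    """Given (number, label) pairs, find the bit position that best predicts the label."""
    best_bit, best_acc = 0, 0.0
    for b in range(width):
        acc = sum(((n >> b) & 1) == y for n, y in examples) / len(examples)
        if acc > best_acc:
            best_bit, best_acc = b, acc
    return best_bit

# Inputs x and expected outputs y are fed in together; the rule is never hard-coded.
data = [(n, n % 2) for n in range(100)]
bit = learn_parity_rule(data)

def predict(n):
    return (n >> bit) & 1

print(bit)                          # the model discovers that the lowest bit decides parity
print(predict(101), predict(202))  # 1 0 -> odd, even
```

The program ends up with the same even/odd rule as before, but it found the rule itself from examples.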

Let’s move on to Quantum Computing. The word “quantum” might bring to mind an atom or molecule, and quantum computers are built on a similar idea. In a classical computer, processing occurs at the level of bits. A quantum computer, by contrast, is governed by the rules of quantum physics, which provides a variety of tools for describing the interactions between atoms. In a quantum computer, the basic units are called “qubits” (we will discuss these in detail later). A qubit behaves as both a particle and a wave, and a wave distribution can store far more information than a single particle (or bit).
Loss functions are used to keep a check on how accurate a machine learning solution is. While training a machine learning model and getting its predictions, we often observe that all the predictions are not correct. The loss function is represented by some mathematical expression, the result of which shows by how much the algorithm has missed the target.
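One common choice is mean squared error, shown here as a small illustration (an example choice; the post does not name a specific loss function):

```python
def mse_loss(y_true, y_pred):
    """Mean squared error: the average squared distance between targets and predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Perfect predictions give zero loss; misses push the loss up.
print(mse_loss([1, 0, 1], [1, 0, 1]))        # 0.0
print(mse_loss([1, 0, 1], [0.9, 0.2, 0.4]))  # larger, since the last guess is far off
```

Training a model amounts to searching for the parameters that make this number as small as possible.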
A quantum computer can also help minimize a loss function. Quantum annealers exploit a phenomenon called quantum tunneling, which lets the system pass through barriers in the loss landscape rather than climbing over them. This allows the search to escape local minima and potentially find the value where the loss is lowest, and hence where the algorithm performs best, much faster than a classical search.
The Basics of Quantum Computing
Before getting deep into Quantum Machine Learning, readers should be familiar with basic Quantum Computing terminologies, which are discussed here.
Bra-ket Notation
In quantum mechanics and quantum physics, the “Bra-ket” notation or “Dirac” notation is used to write equations. Readers (especially beginners) must know about this because they will come across it when they read research papers involving quantum computing.
The notation uses angle brackets, 〈 〉, and a vertical bar, | , to construct “bras” and “kets”.
A “ket” looks like this: |v〉. Mathematically it denotes a vector, v, in a complex vector space V. Physically, it represents the state of a quantum system.
A “bra” looks like this: 〈f|. Mathematically, it denotes a linear function f: V → C, i.e. a linear map that maps each vector in V to a number in the complex plane C.
Letting a linear function 〈f| act on a vector |v〉 is written as:
〈f|v〉 ∈ C
Wave functions and other quantum states can be represented as vectors in a complex vector space using the Bra-ket notation. Quantum superposition can also be denoted in this notation. Other applications include wave function normalization, and measurements associated with linear operators.
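Bras and kets can be made concrete as arrays. Below is a minimal NumPy sketch (the vectors are chosen arbitrarily for illustration): a ket is a column vector, a bra is the conjugate transpose of a ket, and 〈f|v〉 is an ordinary matrix product that yields a single complex number.

```python
import numpy as np

ket_v = np.array([[1 + 1j], [2 - 1j]])        # |v>, a column vector in C^2
bra_f = np.array([[3 - 2j], [1j]]).conj().T   # <f|, the conjugate transpose of a ket

# <f|v> is a matrix product and yields a single complex number, i.e. <f|v> in C.
inner = (bra_f @ ket_v).item()
print(inner)  # 3j
```

Note the conjugation step: turning a ket into a bra takes the complex conjugate as well as the transpose, which is what makes 〈v|v〉 a real, non-negative number.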
The Concept of “Qubits” and the Superposition States
Quantum Computing uses “qubits” instead of “bits,” which are used by classical computers. A bit refers to a binary digit, and it forms the basis of classical computing. The term “qubit” is short for “quantum bit.” While a bit has only two states, 0 and 1, a qubit can exist in a superposition of both states at once; when measured, it yields 0 or 1 with probabilities determined by the state.
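A qubit state can be written as a two-component complex vector. The minimal NumPy sketch below shows an equal superposition of |0〉 and |1〉, where the squared amplitudes give the probabilities of measuring each outcome (an illustrative example, not code from the post):

```python
import numpy as np

# |psi> = a|0> + b|1>, here an equal superposition of the two basis states.
a = b = 1 / np.sqrt(2)
psi = np.array([a, b])

# Measurement collapses the state to 0 or 1; the squared amplitudes
# give the probability of each outcome, and they must sum to 1.
p0, p1 = abs(psi[0]) ** 2, abs(psi[1]) ** 2
print(p0, p1)  # each is 0.5 (up to floating point)
```

This is exactly the coin-in-the-air situation described next: until we measure, the qubit is not definitely 0 or 1 but a weighted combination of both.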
To better understand this concept, take the analogy of a coin toss. A coin has two sides: Heads (1) or Tails (0). While the coin is spinning in the air, we don’t know which side is facing up until we catch it or it falls to the ground. Look at the coin toss shown below. Can you tell which side it shows? It shows both 0