## Why do we need Quantum Computers?

"To make everything faster and better" you might say.

Before diving into the topic of **Quantum Computers**, let us take a look at **why** the change from our normal "everyday" hardware is *necessary*.

#### Complexity of problems

Specific problems in computing have **limits** in complexity relative to **our capacities**. Examples include finding the right path through a **labyrinth** or **prime factorization**. A prime number is divisible only by 1 and by itself without remainder, so it has exactly two divisors (e.g. 2, 3, 5, 7, 11, ...). For our purpose we can take a number of arbitrary size and keep decomposing it until all of its factors are prime. In the case of 15, the prime factors are 3 and 5, because 3 × 5 = 15. This is an easy example, and as the starting number grows, the problem becomes much more difficult. What if we had a number with 80 digits? A **normal computer**, even the most powerful non-quantum computer, *would take literally millions of years*. That is the first argument for a "new" type of computer, one designed to solve these kinds of tasks faster.
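To get a feel for why this gets hard, here is a minimal sketch of factoring by trial division in Python. The function name and structure are my own illustration; serious factoring of large numbers uses far more advanced algorithms, but the idea of "try dividing by ever larger numbers" is the same:

```python
def prime_factors(n: int) -> list[int]:
    """Decompose n into its prime factors by trial division."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each factor as often as it appears
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains at the end is itself prime
        factors.append(n)
    return factors

print(prime_factors(15))   # [3, 5]
```

Trial division needs roughly √n steps in the worst case, so for an 80-digit number that is on the order of 10^40 divisions, which is exactly why classical computers choke on large factorizations.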

#### Uncertainty on microscopic scales

As you already know, PC hardware has become smaller and smaller over time. This has worked until now, but from here on, **decreasing size creates errors caused by Quantum Mechanics**. __Wait, don't we want to use Quantum Mechanics to build our new computers?__ That is still true, but errors caused by shrinking chip sizes are a different kind of problem. Some components, like transistors, are so tiny that you can fit 18 BILLION of them onto a 2 cm × 2 cm chip. Isn't that crazy?

*18 BILLION*! As components shrink toward the scale of individual particles and electrons, uncertainty grows dramatically.

*"Uncertainty" describes a fundamental limit on how precisely two complementary properties of a particle can be known at the same time: the precisions are inversely proportional. Given location and velocity, you will never know both exactly, only with a certain probability. The more certain we are about the location of a particle, the less certain we are about its velocity, and vice versa. This principle was first proposed by Heisenberg and holds true for microscopic objects, but it shouldn't scare you away from our main point.*
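For the curious, Heisenberg's principle is usually written quantitatively as:

```latex
\Delta x \cdot \Delta p \geq \frac{\hbar}{2}
```

Here Δx is the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant. Because ħ is so tiny (about 10⁻³⁴ J·s), the trade-off is invisible for a basketball but unavoidable for an electron.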

In order to visualize this problem, let us take a basketball. Throwing the ball leads us to take measurements. On *scales* which are **"normal"** to us, *measuring momentum and location is no problem*. But on microscopic scales, we would be able to measure either its momentum **or** its location, *not both at the same time*.

So when transistors make *uncertain decisions*, we receive *uncertain outputs*, and normal tasks become *undoable*. This is not the goal when trying to work with deterministic systems. With the help of **Superposition**, however, we can make use of this so far "negative" characteristic. To understand superposition, I suggest taking a look at my brief explanation of quantum entanglement and superposition.

These are some of the reasons scientists are working on new computers designed to solve even more complex tasks and to overcome the difficulties posed by current hardware. In the upcoming article I will describe how normal computers store data and work with **BITS**, in order to understand what makes a Quantum Computer special. Make sure not to miss the following content by leaving a follow, and be sure to leave questions below!

TL;DR: Normal computers are getting better at solving tasks, and to do so their components are shrinking at an alarming rate. At the tiny scales hardware is evolving toward, problems caused by Quantum Mechanics occur which tamper with deterministic outputs. These are the most prominent challenges everyday computers face, which is why Quantum Computers are being developed: to solve certain tasks faster without suffering from these errors.