In the world of research there are a few distinct categories of information. There is the current “normal” or base literature that is easy to access and mainstream, in the sense that it is being taught or is the starting point for someone learning a field. There is the “future” category of material that, while not yet the standard of the field, is where many perceive it to be heading, and it too is well documented. There is the “advanced” area of research, mostly left to those deep in the subject and composed of high-level papers not yet ready for mainstream use. The final category is of course the bleeding edge, where ideas and experiments are abundant and only a fraction will ever make it into the easier categories or see practical utility. Quantum computation falls under this last category, though recently it has begun to edge into the advanced one as well. While the technology itself will eventually become far more mainstream than most in the field would admit, there is virtually no simple explanation of what it is beyond primitive examples that fail to provide real insight into how it works. I intend to rectify this today as part of what I hope will be a series of articles detailing this marvelous advance in communication and computation, and I will assume some basic knowledge of computing.

Let’s begin where most explanations end. We know that a quantum bit is not like a normal bit. In a normal computer we carry information in bits, and therefore in binary. For example, to store a value like 11, the binary representation would be 1011, or in hex 0B. From there, the primary thing we can do with a computer is add or subtract these numbers. Say I wanted to add 3 to our initial value of 11. While in the conventional decimal system the math would look like 3 + 11 = 14, in binary it would look instead like 1011 + 0011 = 1110. In binary we use powers of two, also called base two, for our number representation. The familiar decimal system, or base ten, can express everything binary can, but it is not friendly to computers: with only two possible physical states available, base two is all a machine can work with directly.
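The arithmetic above is easy to check for yourself. A quick Python sketch of the same sum, using binary literals:

```python
# The example from the text: 11 + 3 = 14, worked in binary.
a = 0b1011   # 11 in binary
b = 0b0011   # 3 in binary
total = a + b

print(format(a, "04b"), "+", format(b, "04b"), "=", format(total, "04b"))  # 1011 + 0011 = 1110
print(hex(a))   # 0xb, the hex value 0B from the text
print(total)    # 14
```

Python’s `0b` prefix and `format(..., "04b")` let you move between decimal and binary views of the same number.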

In quantum computing, though, we are not bound by such a trivial limit. Quantum computing is based on the idea that a quantum bit isn’t just a 1 or a 0, like a normal bit. Before it is measured, it instead carries a probability of being read as a zero or a one. It takes 8 traditional bits of information to represent a single letter of the alphabet, like 10011011. A single quantum bit, on the other hand, could have a small chance of being a zero, a small chance of being a one, be half a zero and half a one (50/50), or any other mix of percentages in between. The more finely you can prepare and work with that probability, the more information you can squeeze into a single bit. In quantum mechanics it is well established that the act of measuring something at that level collapses its probabilities and forces it to choose a state. Using this property in a computer means that, instead of a bit having only two possible states, we can work with bits that have many states!
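A toy model makes the “probability until measured” idea concrete. This is a classical simulation, not real quantum mechanics, and the class name `ToyQubit` is my own invention for illustration: it holds a probability of reading out 1, and the first measurement forces a definite, permanent outcome, mimicking the collapse described above.

```python
import random

# Toy model (classical simulation, not real quantum mechanics): a "qubit"
# holds a probability p_one of being read as 1. Measuring it forces a
# definite outcome, after which the value is fixed -- the "collapse".
class ToyQubit:
    def __init__(self, p_one):
        self.p_one = p_one      # probability of measuring a 1
        self.collapsed = None   # definite value once measured

    def measure(self):
        if self.collapsed is None:
            self.collapsed = 1 if random.random() < self.p_one else 0
            self.p_one = float(self.collapsed)  # probability is now 0 or 1
        return self.collapsed

q = ToyQubit(0.5)             # "half a zero and half a one"
first = q.measure()           # randomly 0 or 1
assert q.measure() == first   # repeated measurements agree after collapse
```

The key behavior is in `measure`: before the first call the outcome is genuinely undecided; after it, every later measurement returns the same answer.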

Another way to describe this is with a yes or a no. A normal bit has two possible states: yes or no. If you have two bits, there are four possible combinations.

- Yes Yes
- Yes No
- No Yes
- No No

You can store a huge amount of information by adding more and more bits. Two bits yield four combinations, or states. Three bits give eight states. Four bits give sixteen, and the count continues to double with every added bit. Most computers now rely on at least 64-bit processors, yielding a staggering 2 to the 64th power possible states.
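The doubling described above is just repeated multiplication by two, which a few lines of Python can confirm:

```python
# Number of distinct states n bits can represent: 2 ** n.
for n in [2, 3, 4]:
    print(n, "bits ->", 2 ** n, "states")

# States for a 64-bit word:
print(2 ** 64)  # 18446744073709551616
```

Each added bit multiplies the state count by two, which is why the 64-bit figure is so enormous.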

As immense as that power is, it is still limited by the fact that it is only two being raised to a power. Now let’s imagine that a quantum bit, or qubit, has three possibilities: probably, maybe, and probably not. Let’s list the possible states two of them can have.

- probably probably
- probably maybe
- probably probably not
- maybe probably
- maybe maybe
- maybe probably not
- probably not probably
- probably not maybe
- probably not probably not
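The list above can be generated mechanically. A short sketch using Python’s `itertools.product` to enumerate every pairing of the three values:

```python
from itertools import product

# Enumerate every pairing of the three values, as in the list above.
values = ["probably", "maybe", "probably not"]
pairs = list(product(values, repeat=2))

print(len(pairs))  # 9
for a, b in pairs:
    print(a, "/", b)
```

Changing `repeat=2` to `repeat=3` would produce the 27 three-qubit combinations discussed next.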

Do you see what happened? Two qubits yield nine combinations. A third qubit would make 27 combinations, four would make 81, and eight would make 6,561, which is over six thousand. Compare that to the meager 256 combinations eight ordinary bits offer. Even a single additional state dramatically increases the possible combinations, and that in turn means more computing power. Why? Because instead of taking eight bits to represent something, say a character, I could do the same task with six three-state qubits (3 to the 6th is 729 combinations, more than enough for 256 characters). That cuts down data space, which cuts down processing time, and increases our total power.
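The counting rule behind these figures is simple: n units with d possible states each give d to the nth power combinations. A quick check of the numbers above:

```python
# Combinations for n units with d possible states each: d ** n.
for n in [2, 3, 4, 8]:
    print(n, "three-state units ->", 3 ** n, "combinations")

print("8 bits ->", 2 ** 8, "combinations")  # the 256 mentioned above

# Six three-state units already cover one 8-bit character:
print(3 ** 6, ">=", 256, "->", 3 ** 6 >= 256)  # 729 >= 256 -> True
```

Swapping the base from 2 to 3 is all it takes to turn exponential growth from doubling into tripling.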

Let’s take this further. Now imagine that instead of a qubit with three possible values, we have a qubit with thousands of possible values. The immense potential here should be dawning on you. With so many possible states, hundreds of qubits could, in principle, match what billions of bits do today. Imagine how much more information you could fit in a small computer. Imagine how much a clever programmer could do with these new possibilities to make the computer even more powerful, especially at specific tasks that involve enormous numbers of combinations. The leap from bit to qubit computing is difficult to overstate. Regular computers keep getting faster because engineers find better ways to pack more tiny wires and transistors into a small chip and read it faster. Quantum computing, however, is a completely different game.

To conclude, there are of course issues to address before this technology is fully realized. The problem with using the probability of a quantum state is that it is painfully difficult to work with at present. Because of the way quantum mechanics works, the second a qubit is fully observed it collapses and is no longer a probability, eliminating the extra states we want. The art of quantum computing, then, is learning how to achieve these states and probabilities consistently so we can perform actual calculations with them. This is why quantum computers will initially be very useful for some things, like intense calculations, but not so much for work that must be error-free. Storage, likewise, is not something that will change from binary for quite some time. Barring a massive advancement in the subject (which I would not rule out), binary has proven a robust and practical method of storing data. Those caveats aside, it can safely be said that quantum computing will be a large part of the future of computers.