Monday, December 16, 2019

Quantum Computing: Drivers of the Hype

Last October, I attended PSIA's SoftCon.ph 2019, where one of the plenary speakers, a Managing Director from Accenture, discussed a new set of emerging technologies following the SMAC (Social, Mobile, Analytics, Cloud) technologies of six years ago. They are abbreviated as the DARQ power: Distributed Ledgers, Artificial Intelligence, Extended Reality, and Quantum Computing.

I first heard of a Quantum Computer in BBC's 2013 documentary, "Defeating the Hackers", and my reaction was a raised eyebrow, because Quantum Mechanics, as I recall it from college around 10 years ago, defies reality as we know it. Looking into the topic now, this branch of physics is still regarded as 'weird' even by physicists themselves.

Despite the uncertainties, and despite being at an early phase of development, people are paying attention and already treating Quantum Computing (which I will refer to as QC from here on) as if it were the only way to move forward.

'Quantum Supremacy'

Just last October of this year, Google announced that they had achieved 'quantum supremacy' with their QC system named Sycamore. The task involved, somewhat like a 'hello world' program for QCs, was to run a random set of quantum instructions and sample the results, which they estimated would take 10,000 years for the world's fastest supercomputer to complete. Sycamore did it in about 200 seconds.
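(A rough back-of-the-envelope sketch, in Python and not from Google's announcement, of why estimates like that are plausible: simulating n qubits on a classical machine means storing 2^n complex amplitudes, and the memory alone blows up long before you reach Sycamore's 53 working qubits. The qubit counts below are purely illustrative.)

# Memory needed just to hold the full quantum state of n qubits
# on a classical machine: 2**n complex amplitudes at 16 bytes each
# (two 64-bit floats per amplitude).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (20, 30, 40, 53):  # 53 = Sycamore's working qubit count
    pb = statevector_bytes(n) / 1e15
    print(f"{n} qubits -> {statevector_bytes(n):,} bytes (~{pb:.3g} petabytes)")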

At this point, there are no practical applications yet, but they compared the feat to the first successful flight of the Wright brothers: the beginning of something that could become very significant.

The Early Buy-ins

Even before Google's announcement, Accenture had already made their move. Last July, they secured a patent for a Machine Learning module that would help business decision-makers determine whether QC would be beneficial to them.

Accenture had already begun researching the feasibility of QC as a business solution back in 2015. In 2017, partnering with 1QBit Information Technologies Inc., they were able to map out more than 150 use cases for the technology, most notably speeding up drug discovery.

With QC's power becoming more evident, more companies and investors are taking interest, including the Trump administration, whose support is driven mainly by cybersecurity concerns. The US government has made quantum research and development a priority.

The Ultimate Driver

Obviously, most of QC's potential applications come from the areas where supercomputers still struggle. Molecular simulation (which is important for chemistry research) and forecasting (for meteorology, finance, logistics, etc.) are exponentially complex problems that classical computers are generally unable to overcome, except through approximations. Although there could still be ways to improve our current supercomputers, we are nearing their physical limits. Expanding the infrastructure will eventually become impractical unless we make every component smaller to maximize the use of space.

But here is the problem...
The heart of a supercomputer is its CPUs (IBM's Summit, the world's fastest supercomputer, has over 200,000 processor cores), and the "building blocks" of a CPU are its transistors. Today, transistors are around 10-20 nanometers (the recently launched AMD Zen 2 is built on a 7nm process), and we are still aiming for 5nm, even 3nm, in the next couple of years. But as transistors get smaller, they will reach a point where they experience quantum tunnelling. We are nearing the end of Moore's law.

Imagine a transistor as a switch with an 'on' and an 'off' state; this is how a computer communicates in the binary language of 1s and 0s. In the 'off' state, a barrier stops the flow of electrons, but with quantum tunnelling, electrons become able to reach the other side despite that barrier, much like a ghost passing through a wall. This means the transistor ceases to function as intended.
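To make the 'ghost through a wall' picture a bit more concrete, here is a rough Python sketch (my own illustration, not from the original post) using the textbook WKB estimate for an electron tunnelling through a thin rectangular barrier. The 1 eV barrier height is an assumption chosen for illustration, not a model of a real transistor gate.

import math

# Simplified WKB estimate of the probability that an electron
# tunnels through a rectangular barrier of width L:
#   T ~ exp(-2 * L * sqrt(2 * m * (V - E)) / hbar)
HBAR = 1.0545718e-34   # reduced Planck constant, J*s
M_E = 9.1093837e-31    # electron mass, kg
EV = 1.6021766e-19     # one electronvolt, in joules

def tunnel_probability(width_nm: float, barrier_ev: float = 1.0) -> float:
    width_m = width_nm * 1e-9
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

for width in (10, 5, 3, 1):  # barrier width in nanometres
    print(f"{width} nm barrier -> leakage probability ~{tunnel_probability(width):.1e}")

The exact numbers do not matter; what matters is the exponential. Shrinking the barrier does not merely double the leakage, it multiplies it by many orders of magnitude, which is why ever-smaller transistors eventually stop behaving like reliable switches.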

In closing...

Still, we do not expect QC to replace classical computers any time soon, or even in the next couple of decades. There is still a long way down the road, and QC is reserved for specific purposes, especially given its expensive setup. It is fundamentally different from classical systems, much as a pencil is different from a pen.

----------------------------------------------------------------------------------------------------------
This article was originally published on LinkedIn
