Artificial intelligence and machine learning are poised to take us into a world beyond comprehension. We are standing at a crossroads where the next big technological leap might land us in the world of science fiction – Star Wars, Star Trek and the like. However, they face a gigantic challenge. One recent estimate puts the amount of new data created at roughly 2.5 quintillion bytes per day. Big data forces us to find faster and more efficient solutions for data storage, retrieval and processing. As the "Internet of Things" becomes commonplace – connecting anything and everything – the demand for efficient solutions only grows. AI needs big data, and big data needs AI; they are in a truly symbiotic relationship. Advances in technology have enabled us to build chips that are exponentially faster than those of a few decades ago, helping us keep pace so far, but we may soon be reaching the limits of classical computing. As transistors get smaller, we start to operate in the realm of quantum physics rather than classical physics. That opens up a world of possibilities, and a host of challenges.

Conventional computers, built on the model of classical physics, fundamentally manipulate and interpret binary bits to produce useful computational results. In classical computing, a bit is a single piece of information that exists in one of two states at any given time – 1 or 0. Quantum computing, on the other hand, uses quantum bits, generally referred to as "qubits", that can exist in more than one state at a time. A qubit can be in state 1 or 0, but it can also be in a superposition of the two.
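To make the contrast concrete, here is a minimal sketch in NumPy (my own illustrative example, not from the original post) of a qubit as a two-amplitude state vector. The amplitudes and the measurement rule shown are the standard textbook formulation:

```python
import numpy as np

# A classical bit holds exactly one of two values.
bit = 0

# A qubit is a normalized vector of two complex amplitudes:
# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of |0> and |1>:
psi = (ket0 + ket1) / np.sqrt(2)

# The Born rule: measuring yields each outcome with probability
# equal to the squared magnitude of its amplitude.
probabilities = np.abs(psi) ** 2
print(probabilities)  # [0.5 0.5]
```

Until it is measured, the qubit genuinely carries both amplitudes at once; measurement collapses it to a single classical outcome.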

The field of quantum computing is founded on the idea that transformations of states in superposition can be exploited to produce the effect of parallel classical calculations – that is, for a fixed amount of physical computational resources, quantum phenomena can permit a larger number of effective computations than is possible in the classical model within the same constraints. A single qubit can perform a calculation on 2 numbers at the same time, 2 qubits can handle 4, 3 can handle 8, and this grows exponentially. As we approach 20+ qubits, quantum computers can potentially perform calculations involving millions of numbers. This is when quantum computers start to outperform the most advanced supercomputers we have today.
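The exponential growth is easy to see in a quick back-of-the-envelope calculation (my own illustration, assuming the standard 16-byte `complex128` representation a classical simulator would use per amplitude):

```python
def amplitudes(n_qubits):
    """A general n-qubit state needs 2**n complex amplitudes."""
    return 2 ** n_qubits

# Each extra qubit doubles the state description: ~20 qubits already
# means a million numbers, and ~50 qubits would swamp any classical
# machine's memory.
for n in (1, 2, 3, 20, 50):
    a = amplitudes(n)
    print(f"{n:2d} qubits -> {a:.3e} amplitudes, {a * 16 / 1e9:.3e} GB")
```

At 20 qubits the simulator is tracking about a million amplitudes; at 50, the memory required runs to tens of millions of gigabytes, which is why classical simulation hits a wall so quickly.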

Researchers around the world and large corporations like Google and IBM are investing significant resources to build the ultimate quantum machine. However, there are significant technical hurdles to overcome. The machine needs high tolerance for noise and corrections for errors. Correcting for errors requires redundancy, and the number of qubits needed quickly mounts. For example, factoring a 2,000-bit number in one day, a task believed to be intractable on classical computers, would take 100 million qubits, even if individual quantum operations failed just once in every 10,000 operations.
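The classical analogue of this redundancy is the repetition code: store each logical bit three times and take a majority vote. The sketch below (my own toy example; real quantum codes such as the surface code are far more involved, since qubits cannot simply be copied) shows how redundancy trades extra bits for a lower error rate:

```python
import random

def encode(bit):
    # Redundancy: replicate the logical bit across three physical bits.
    return [bit, bit, bit]

def noisy_channel(bits, p_flip):
    # Each physical bit flips independently with probability p_flip.
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    # Majority vote corrects any single flip.
    return int(sum(bits) >= 2)

random.seed(0)
p, trials = 0.01, 100_000
raw_errors = sum(noisy_channel([0], p)[0] for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(0), p)) for _ in range(trials))
print(raw_errors / trials, coded_errors / trials)
```

The encoded error rate falls from roughly p to roughly 3p², but only by tripling the bit count – the same cost-of-redundancy dynamic that pushes fault-tolerant factoring estimates into the millions of qubits.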

However, we don't really need to wait for perfection to start harnessing the power of quantum computers. While the dream of a "perfect" quantum computer is probably still years or even decades away, research at Google indicates that even emerging quantum processors have a good chance of solving key computational tasks in the areas of simulation, optimization and sampling, and may even become commercially viable in a few years. The growth of AI and machine learning depends heavily on doing these things much more efficiently.

Modeling chemical reactions and materials is an area that could benefit immensely and bring enormous business and social value. Simulation software based on quantum computing could significantly accelerate research and development pipelines. If robust algorithms are developed, it might be possible to simulate important materials without the overhead of full quantum error correction. For example, algorithms are already known (such as the 'quantum variational eigensolver' approach) that seem to be immune to qubit control errors. This has far-reaching implications for medicine and materials engineering.
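The variational idea is simple enough to sketch classically. In the toy example below (my own construction: a made-up two-level Hamiltonian and a one-parameter trial state, with NumPy standing in for the quantum processor that would evaluate the energy on real hardware), a classical outer loop tunes the parameter until the measured energy stops decreasing:

```python
import numpy as np

# Toy Hamiltonian built from Pauli matrices: H = Z + 0.5 * X.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def ansatz(theta):
    # One-parameter trial state |psi> = cos(t)|0> + sin(t)|1>.
    return np.array([np.cos(theta), np.sin(theta)])

def energy(theta):
    # Expectation value <psi|H|psi>; on hardware this is estimated
    # by repeated measurement rather than computed exactly.
    psi = ansatz(theta)
    return psi @ H @ psi

# Classical outer loop: scan the parameter, keep the lowest energy.
thetas = np.linspace(0.0, np.pi, 1000)
best = min(thetas, key=energy)
exact = np.linalg.eigvalsh(H).min()
print(energy(best), exact)
```

Because only short circuits are needed per energy evaluation, errors have little time to accumulate – the intuition behind why variational approaches tolerate imperfect qubit control.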

In optimization, the most general classical algorithms rely on statistical methods – such as simulated annealing, which uses thermal fluctuations to escape poor local solutions. Assisted by quantum phenomena such as tunneling, quantum processors could help us find rare yet high-quality solutions more efficiently.
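Here is a minimal classical simulated-annealing loop over a made-up rugged landscape (my own illustrative example). Where this loop relies on thermal noise to hop *over* barriers between minima, a quantum annealer would exploit tunneling *through* them:

```python
import math
import random

def energy(x):
    # Hypothetical rugged 1-D landscape with many local minima.
    return 0.1 * x * x + math.sin(5 * x)

random.seed(1)
x, temperature = 4.0, 2.0
for step in range(20_000):
    candidate = x + random.gauss(0, 0.5)
    delta = energy(candidate) - energy(x)
    # Metropolis rule: always accept improvements; accept uphill
    # moves with probability exp(-delta/T), mimicking thermal noise.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.9997  # gradual cooling
print(x, energy(x))
```

Early on, the high temperature lets the search roam freely; as it cools, the walker settles into one of the deep minima rather than the first basin it happened to start in.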

Many AI and machine learning problems involve sampling from probability distributions. Quantum circuits can sample from a larger set of probability distributions than classical circuits can in the same time. This could have interesting applications in areas like inference and pattern recognition in machine learning.
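What "sampling from a circuit" means is easy to show in a classical simulator (my own sketch, using the standard Bell state as the example distribution): the circuit's amplitudes define a probability distribution over output bitstrings, and running the circuit once draws one sample from it.

```python
import numpy as np

rng = np.random.default_rng(42)

# State vector of a 2-qubit circuit (Hadamard then CNOT):
# the Bell state (|00> + |11>) / sqrt(2).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Born rule: outcome probabilities are squared amplitude magnitudes.
probs = np.abs(bell) ** 2

# "Running" the circuit 10,000 times = drawing 10,000 bitstrings.
samples = rng.choice(4, size=10_000, p=probs)
counts = np.bincount(samples, minlength=4)
print(dict(zip(["00", "01", "10", "11"], counts)))
```

Only "00" and "11" ever appear, and in roughly equal numbers – a correlated distribution produced by entanglement. For larger circuits, classically computing `probs` becomes intractable, while the quantum device keeps producing samples natively.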

We are truly in uncharted territory with both quantum computing and AI. Quantum computing may help us solve problems that we haven't even thought of yet. With its potential for performing calculations at staggering speeds, it's hard to imagine what killer apps quantum computers will lead us to.

Who knows, maybe this is the missing link that will finally take us to the holy grail of AI – Artificial General Intelligence.

Interesting times ahead!

References:

https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/45919.pdf

https://research.google.com/pubs/QuantumAI.html

https://www.domo.com/learn/data-never-sleeps-5?aid=ogsm072517_1&sf100871281=1

https://www.rtinsights.com/artificial-intelligence-needs-big-data-and-big-data-needs-ai/