From the Editor's Desk

Saragur M. Srinidhi

This issue of ACC carries an in-depth survey on Quantum Computing (QC), an area that is evolving rapidly and witnessing intense enthusiasm and research activity. The technology in this growing field has moved quickly since the introduction of IBM Q's 5-quantum-bit (qubit) machine in early 2016 to IBM's current 50-qubit offering, closely followed by Intel's 49-qubit machine. The face-off between 50 qubits and the billion-bit classical Turing computer is akin to a David and Goliath confrontation! What matters here is not the number of qubits, but how good they are and how efficient the employed algorithms are. It is expected that by late 2019, Quantum Computers will bring a revolution in computing.

In keeping with our mission of bringing the latest technological advances to our readership, we have planned a three-part series on QC. The first in the series explores the fundamental differences between classical and quantum physics and dwells on essential features of QC such as superposition, entanglement and collapse of the wave function. The next part of the series describes a range of quantum algorithms built on controlled superposition and entanglement of quantum states. Finally, we conclude the series with a focus on quantum measurement and comments on the future of QC.

As a follow-up to the survey on cryptocurrencies published earlier, this issue includes a tutorial on blockchains – the underlying technology that drives cryptocurrencies. This novel technology, often described as a disruptive one, is essentially a new form of database in which every participant can make updates that are shared amongst all participants. What is unique about this technology is that every update is 'etched in stone': it is impossible to change a record in any way, for better or for worse. This digital immutability is the essence of blockchains and is being deployed across a multitude of application domains.
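The 'etched in stone' property comes from linking each record to a hash of the one before it, so that altering any past record breaks the chain. A minimal toy sketch of this idea in Python (illustrative only; the function names and block layout here are hypothetical and greatly simplified compared to any real blockchain):

```python
import hashlib
import json

def block_hash(record, prev_hash):
    # Hash the record together with the previous block's hash,
    # so every block's identity depends on all of its predecessors.
    payload = json.dumps({"record": record, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, record):
    # Link the new block to the hash of the last block (or zeros for the first).
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record,
                  "prev_hash": prev_hash,
                  "hash": block_hash(record, prev_hash)})

def verify(chain):
    # Recompute every hash; any tampered record or broken link fails.
    prev_hash = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev_hash:
            return False
        if block["hash"] != block_hash(block["record"], block["prev_hash"]):
            return False
        prev_hash = block["hash"]
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(verify(chain))          # True: the chain is intact

chain[0]["record"] = "Alice pays Bob 500"   # tamper with history
print(verify(chain))          # False: the altered record no longer matches its hash
```

Real blockchains add consensus protocols, digital signatures and proof-of-work on top of this hash-linking, but the immutability argument is the same: rewriting one record would require recomputing, and getting every participant to accept, every subsequent block.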

Of late, significant research has been taking place in the areas of Cognitive Computing (CQ) and Artificial Intelligence (AI). CQ is a blend of cognitive science and computer science that provides a realistic roadmap towards artificial intelligence. It is the simulation of human intelligence and thought processes using networked computers that can think, reason and learn in order to effect a desired action. CQ is the next step in computing, fuelled by the need for automation and efficiency. This issue carries a high-level introduction to the subject of CQ; we plan to follow it up with in-depth articles in forthcoming issues.

For self-learners, the experiential hands-on series on networking now has the code and example snippets for the current and all previous articles in the series available in a GitHub repository. We hope this will be of immense value to the student community.
In parting, we would like to hear from you, our esteemed readers: your comments and suggestions on ways to improve the format, choice and quality of the content in forthcoming issues, so as to make ACC attractive to a wider readership. And please let us know if you think we are doing fine!

Happy Reading!
Saragur Srinidhi