
The Future of Computing Technology

by Sumeet Swarup

At a recent discussion at the World Economic Forum Annual Meeting, Mr. Antonio Neri, CEO of Hewlett Packard Enterprise, predicted the end of Moore's law under the pressure of Big Data. He said that the digital computer has had the same structure for a long time: a microprocessor, RAM, storage, and electrons moving back and forth between these three components. He pointed out that with the advent of Big Data, this setup faces three major challenges:

  1. Architecture problem: it is becoming increasingly difficult to pack more and more transistors onto a microprocessor (see note 1 below).
  2. Scale problem: the world is generating too much data for current computing power to keep up, and this imbalance will only worsen as the Internet of Things takes off (see note 2 below).
  3. Environmental problem: data centres already consume 6% of global energy, and by 2025 that figure is expected to exceed 10%.

Enter quantum computing and photonic computing. Current digital computing uses bits: streams of electrical or optical pulses representing 0s and 1s, so that all data is ultimately a long string of these binary digits. Quantum computing instead uses quantum bits (qubits), subatomic systems that can be 0 and 1 at the same time. Photonic computing uses photons instead of electrons in computers, and hence consumes much less power.
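The "0 and 1 at the same time" idea can be made concrete with a toy simulation. The sketch below (an illustrative model only, not how real quantum hardware is programmed) represents a single qubit as a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import random

def measure(amplitudes):
    """Collapse a one-qubit state (a, b) to 0 or 1.

    |a|^2 is the probability of reading 0, |b|^2 of reading 1,
    with |a|^2 + |b|^2 = 1 for a valid state.
    """
    a, b = amplitudes
    return 0 if random.random() < abs(a) ** 2 else 1

# A classical bit is definitely 0 or definitely 1:
zero_bit = (1.0, 0.0)
one_bit = (0.0, 1.0)

# A qubit in equal superposition is "both at once" until measured:
superposition = (2 ** -0.5, 2 ** -0.5)

counts = [0, 0]
for _ in range(10_000):
    counts[measure(superposition)] += 1

print(counts)  # roughly [5000, 5000]
```

Each measurement of the superposed qubit yields a random 0 or 1 with equal probability, while the classical states always give the same answer; the power of quantum computing comes from manipulating many such amplitudes in parallel before measuring.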

Mr. Neri thinks the future of computing lies in quantum computing combined with photonic computing: data sits at the centre, qubits are kept in close proximity for computation, and photons carry the traffic between them. This way we use far less energy and cut latency many times over. Such futuristic hardware is being developed by companies such as Google, IBM, Intel, Nvidia and a few startups.

I got a chance to speak to a quantum researcher in India who is doing fundamental research in an academic lab, partially funded by the Central Government. He mentioned that even though the technology is still experimental, it is developing rapidly, and pundits believe it will be in commercial use within 5 to 10 years. Once it is, it will be powerful enough to experiment at the molecular level to develop new drugs and new materials, and possibly even to alter matter enough to create new elements.

Given that data is becoming very important, and that India is generating tons of it, part of which may be stored in India due to data localization laws, it might be a good time for us to think about what we do with that data once we have it: how do we make it useful, and which hardware and software tools should we employ to do so?

Editor notes:

  1. For comparison, the Intel 4004, introduced in 1971, had 2,300 transistors, versus the Intel Core i7, which has over 700 million. An Nvidia chip introduced in 2018 has 9 billion transistors.
  2. 90% of all the data ever generated has been created since 2016 alone. By 2020, an estimated 1.7 MB of data will be generated every second for every person on Earth.
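As a quick sanity check on note 1, the figures there are consistent with the classic Moore's law cadence of a doubling roughly every two years. A short calculation, using only the transistor counts and years cited above:

```python
import math

# Transistor counts from note 1 above.
early_count, early_year = 2_300, 1971          # Intel 4004
late_count, late_year = 9_000_000_000, 2018    # Nvidia chip cited above

# Number of doublings needed to get from 2,300 to 9 billion transistors.
doublings = math.log2(late_count / early_count)

# Average time between doublings over that 47-year span.
years_per_doubling = (late_year - early_year) / doublings

print(round(doublings, 1))           # ~21.9 doublings
print(round(years_per_doubling, 1))  # ~2.1 years per doubling
```

About 22 doublings in 47 years works out to one roughly every two years, matching the usual statement of Moore's law and underlining why sustaining that pace is the "architecture problem" Mr. Neri describes.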
