The future of Artificial Intelligence: quantum, photonics and neuromorphics

Daniel Granados, an expert with the Future Trends Forum at the Bankinter Innovation Foundation, foresees a hybrid future for AI, one in which transducers will hold the key to interconnecting all the technologies involved: analog, digital, quantum, photonic and neuromorphic. These are the big topics on the future of artificial intelligence from our latest conversation.
Quantum artificial intelligence: A dream or a reality?
The idea of quantum artificial intelligence (QAI) sparks debate in the scientific and technological community. Opinions diverge on what exactly it means and whether it will ever materialize.
The #FIBKVoices initiative, which aims to connect the Bankinter Innovation Foundation’s community of experts to a broader audience, has explored the opportunities and challenges of artificial intelligence from the point of view of hardware. Daniel Granados, Executive Director of IMDEA-Nanoscience and Director of the Technological Innovation and Semiconductor Talent Cluster of the Community of Madrid, poses a provocative pair of questions: What is quantum AI? Is it AI that accelerates the development of quantum computing, or AI that runs on a quantum computer? According to Granados, the second definition may be somewhat utopian.
AI, Von Neumann architecture and CMOS technology: A model for review
Granados points out that contemporary AI has been developed on the von Neumann architecture with CMOS technology, supported by FPGAs and graphics processors (GPUs) capable of performing tasks such as pattern recognition or natural language processing (NLP). Although these advances enable the construction of cost-effective data centers with significant computing power, this model is generating a worrisome environmental impact: in Ireland, for example, data centers accounted for 18% of all electricity used in 2022. For Granados, the CMOS-based AI model is unsustainable, which underscores the need for a more efficient AI architecture.
The challenge of quantum artificial intelligence
Quantum computing has been hailed as a solution to the energy efficiency problem. Quantum computers are based on different logic, physics and technologies from conventional ones, resulting in significantly lower energy consumption. However, quantum computers present a unique set of challenges for AI use, a major one being how to map or translate a problem from the analog-digital world to the quantum world. As more qubits are added to a system, the complexity of controlling the individual state of each qubit and getting them to interact with each other grows exponentially. In addition, there is the complexity of accessing the data repositories that an AI constantly needs, because it is not obvious how to implement or map database information onto quantum computers.
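To make the mapping problem concrete, here is a minimal NumPy sketch of amplitude encoding, one commonly proposed way of loading classical data into qubits. The function and its parameters are illustrative and not tied to any quantum SDK: k features fit into the amplitudes of ceil(log2 k) qubits, but actually preparing such an arbitrary state on hardware generally requires circuits whose size grows with the number of amplitudes.

```python
import numpy as np

def amplitude_encode(features):
    """Pack a classical feature vector into the amplitudes of a qubit register.

    k real-valued features fit into ceil(log2(k)) qubits, but preparing
    such an arbitrary state on real hardware generally needs a circuit
    whose gate count grows with the 2**n amplitudes: one concrete face
    of the data-mapping problem described above. Illustrative sketch only.
    """
    features = np.asarray(features, dtype=float)
    n_qubits = max(1, int(np.ceil(np.log2(len(features)))))
    state = np.zeros(2 ** n_qubits)
    state[: len(features)] = features  # unused amplitudes stay zero-padded
    norm = np.linalg.norm(state)       # quantum states must have unit norm
    if norm == 0:
        raise ValueError("cannot encode the all-zero vector")
    return state / norm, n_qubits

# Five features occupy 3 qubits (8 amplitudes).
state, n_qubits = amplitude_encode([0.3, 1.2, 0.8, 0.5, 0.9])
print(n_qubits, np.round(state, 3))
```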
In contrast, in the short term, AI could have a tremendous impact on the quantum world precisely by translating, or mapping, problems from the analog-digital world to the quantum world. “That’s perhaps the first thing we’re going to see,” this expert tells us. AI and NLP will be the interpreters that move us from the analog-digital world to quantum qubits.
New perspectives: towards a hybrid future of computation and AI
Granados envisions a hybrid future for computing technologies in general and for AI in particular, in which different technologies will coexist and communicate with each other. For him, this is the real challenge of the future: designing interpreters that enable the transmission of information between the analog, digital, quantum, photonic and neuromorphic spheres.
These “interpreters” will not only be algorithms, but also hardware components, signal transducers, which will have to be implemented with technologies that, in many cases, have yet to be invented.
Right now, we are in the midst of a transformation in which several branches of computing that had been evolving independently are going to converge. This moment, described as “magical” by Granados, offers a vision of a future computational architecture for AI in which different technologies will be able to work together, complementing each other.
The expert indicates that silicon and the von Neumann computing architecture will continue to have their place in the technological world, with computational accelerators of various types, such as programmable FPGAs and GPUs, complemented by neuromorphic computing, photonic computing and quantum computing. All these elements are part of what he calls the “acceleration of computation”; it is the combination of all of them that will enable us to face hitherto insurmountable computational challenges, such as simulating the physicochemical properties of highly complex systems like proteins.
Sensors, including quantum sensors, must be added to these computing architectures. They play a crucial role: they will give AI systems access to an additional wealth of data with extraordinary precision, becoming a source of real-time data to feed those systems. Sensor technology will play an essential role in the future of AI.
In addition to all this, signal transducers will enable the connection between sensors and different types of processors, and will allow switching from one technology to another between processors in a multiprocessor system. In fact, Granados suggests that many of these elements will most likely be orchestrated by a conventional silicon processor. This “orchestra conductor” will be responsible for deciding on which processor, and with which technology, it is most efficient to solve a computation. For example, if not much precision is needed and a pattern is to be recognized, a neuromorphic processor will be brought into play; if complex path optimization is required, a quantum processor will be brought into play, as sketched below.
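As a thought experiment, here is a minimal Python sketch of such an orchestrator. Every name in it (the Task type, the backend functions, the routing table) is hypothetical; a real system would dispatch through hardware drivers and signal transducers rather than plain function calls.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Task:
    kind: str        # e.g. "pattern_recognition", "path_optimization"
    payload: object  # the data the chosen processor would work on

# Hypothetical backends: each function stands in for the driver and
# transducer that would hand the task to the corresponding hardware.
def neuromorphic_backend(task: Task) -> str:
    return f"neuromorphic processor handled {task.kind}"

def quantum_backend(task: Task) -> str:
    return f"quantum processor (local or cloud QPU) handled {task.kind}"

def gpu_backend(task: Task) -> str:
    return f"conventional GPU handled {task.kind}"

# Routing table mirroring the examples in the text: low-precision pattern
# recognition goes to the neuromorphic chip, path optimization to the
# quantum one; everything else stays on silicon.
ROUTES: dict[str, Callable[[Task], str]] = {
    "pattern_recognition": neuromorphic_backend,
    "path_optimization": quantum_backend,
}

def orchestrate(task: Task) -> str:
    """The silicon 'orchestra conductor': pick the most efficient
    processor for each task, defaulting to the conventional GPU path."""
    return ROUTES.get(task.kind, gpu_backend)(task)

print(orchestrate(Task("pattern_recognition", payload=None)))
print(orchestrate(Task("matrix_multiply", payload=None)))
```

The design point is the default path: anything the specialized processors cannot handle more efficiently stays on conventional silicon.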
The expert anticipates a future in which many chips will work together in a system-on-chip (SoC) combining very diverse technologies and architectures. However, it is not clear whether quantum computing will be part of that ecosystem of chips or whether it will be a cloud service invoked by a module of the SoC; Granados considers the latter more likely.
The great promise: neuromorphic computing
In the field of artificial intelligence, the expert points out that the human brain is still the most energy-efficient system, outperforming current AI in tasks such as pattern identification. Trying to replicate the efficiency of the brain is an area of enormous interest, and there are sizeable financial and scientific investments in the quest for neuromorphic computing.
The brain is so efficient because its neurons work in harmony and, most importantly, the database (the memory neurons) and the processor (the processing neurons) are in the same place. In the von Neumann architecture, data live in one place and the processor sits elsewhere: when performing a calculation, the processor constantly calls the memory to fetch data, a highly energy-intensive process. A number of research and technological efforts aim at in-memory computing, which tries to put the processor as close as possible to the memory. Neuromorphic computing goes one step further: it seeks to understand how the brain works in order to replicate its functioning; the toy neuron below gives the flavor.
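As a taste of what “behaving like a neuron” means computationally, here is a toy leaky integrate-and-fire model in Python. It is a didactic sketch, not a model of any real neuromorphic chip, and its parameters are illustrative: the point is that state and processing live in the same variable, and the output is a sparse, event-driven train of spikes.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Toy leaky integrate-and-fire neuron.

    The membrane voltage is both the memory and the processor: state is
    updated in place, and output is a sparse train of spikes, the
    event-driven style that neuromorphic hardware tries to reproduce.
    """
    v, spikes = 0.0, []
    for i_t in input_current:
        v += (dt / tau) * (i_t - v)  # leak toward rest while integrating input
        if v >= v_thresh:            # energy is spent only when a spike fires
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

rng = np.random.default_rng(seed=0)
out = lif_neuron(rng.uniform(0.0, 2.5, size=200))
print(sum(out), "spikes over 200 time steps")
```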
Solutions will probably come from new materials capable of behaving like neurons. The trade-off for energy efficiency in neuromorphic computing is reduced accuracy, but you don’t need high accuracy to identify a license plate, a car or a person. At some point, AI is going to change hardware: it will no longer be implemented on an FPGA or a CMOS GPU, because their energy consumption when serving a large part of the population is unsustainable. This makes it a fascinating time for the scientists and engineers devising and developing hardware on new materials that enable computation and generative AI processes in a much more energy-efficient, and therefore more scalable and sustainable, way.
The need for technological talent
Granados points to a significant obstacle to this futuristic vision: the lack of technical talent. Universities are not producing engineers fast enough to meet demand, which is limiting economic growth and technological innovation in the aforementioned fields. In the near future, we face a scenario in which economic growth and the welfare state will be constrained by a lack of access to talent and by workforce shortages.
The global challenge: Energy
In addition, Granados points out that energy is the main challenge facing our species, quoting another Future Trends Forum expert, Vaclav Smil. The exponentially growing energy consumption of each new technology we introduce is unsustainable. In this sense, the mass adoption of AI, be it quantum, neuromorphic, photonic or traditional, will be conditioned by our ability to make it economically viable and energetically sustainable. Nanoscience and nanotechnology could play a key role in environmental sustainability.
The future of AI: Sentient machines?
Finally, Daniel Granados foresees a future for AI in which, if we want it to perform sophisticated tasks in changing and unpredictable environments, it will have to become sentient, i.e., capable of perceiving its environment through sensors and of regulating itself according to the data coming from those sensors. Such a sentient AI could have the five senses that humans have, probably with greater ranges of sensitivity, plus additional senses such as the perception of electromagnetic fields, solar storms or ionizing radiation, to mention just a few. If a machine were capable of sensing, it would be capable of being intelligent in the broadest sense of the word, and then, in the words of Antonio Damasio, another Future Trends Forum expert, it could even have consciousness.