A team of researchers at the Italian National Research Council (CNR) in Milan has developed a quantum compiler that outperforms its Google counterpart.
The study, published in the Nature Research journal Communications Physics, shows how the researchers built a compiler that outdoes Google's algorithms.
Applying AI and deep learning
Applying artificial intelligence and deep learning to the compiler opened the way to an algorithm that adapts to any logic-gate-based quantum computer.
The CNR team, led by Enrico Prati, obtained the result in collaboration with Matteo Paris of the University of Milan and Marcello Restelli of Milan Polytechnic.
“Similar to conventional computers, in which bits are subjected to calculations through logic gates, in quantum computers it is necessary to use quantum logic gates, which, however, must be programmed by a sort of operating system that knows which operations can be carried out,” Prati said in the study.
“However, there are many different versions of hardware that provide different achievable operations; like a small deck of playing cards to choose from,” he said.
Lorenzo Moro of CNR said the team therefore used deep learning to develop a compiler able to find the right order “for playing the five to six cards available, including with sequences hundreds of plays long, choosing one by one the right ones to form the entire sequence”.
“After a training phase, which goes from a few hours to a couple of days, the artificial intelligence learns how to build new pieces for every quantum logic gate, starting from the available operations, but taking just a few milliseconds,” he said.
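To make the card-deck analogy concrete, the sketch below states the underlying problem in code: approximate a target gate (here, a Hadamard) as a sequence drawn from a handful of available native operations, scored by gate fidelity. It is a minimal illustration, not the team's patented method; the gate set, the angles, and the brute-force search are assumptions made for readability.

```python
import numpy as np
from itertools import product

# Illustrative native gate set (the "deck of cards"): small rotations
# about the X and Z axes. These gates, angles, and names are
# hypothetical choices for this sketch, not the set used in the study.
THETA = np.pi / 8

def rx(t):
    return np.array([[np.cos(t / 2), -1j * np.sin(t / 2)],
                     [-1j * np.sin(t / 2), np.cos(t / 2)]])

def rz(t):
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]])

NATIVE = {"rx+": rx(THETA), "rx-": rx(-THETA),
          "rz+": rz(THETA), "rz-": rz(-THETA)}

def avg_gate_fidelity(u, v):
    # Standard average gate fidelity for d = 2; insensitive to global phase.
    d = 2
    return (abs(np.trace(u.conj().T @ v)) ** 2 / d + 1) / (d + 1)

def compile_by_search(target, max_len=6):
    """Exhaustively try gate sequences up to max_len and keep the one
    whose composed unitary best matches the target. The CNR compiler
    replaces this brute-force search with a trained deep learning
    agent, which is what makes sequences hundreds of gates long
    tractable in milliseconds."""
    best_seq, best_fid = [], 0.0
    for length in range(1, max_len + 1):
        for seq in product(NATIVE, repeat=length):
            u = np.eye(2, dtype=complex)
            for name in seq:
                u = NATIVE[name] @ u  # apply the chosen gates in order
            f = avg_gate_fidelity(target, u)
            if f > best_fid:
                best_seq, best_fid = list(seq), f
    return best_seq, best_fid

hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
seq, fid = compile_by_search(hadamard)
print("best sequence:", seq)
print("fidelity: %.4f" % fid)  # approaches 1 as max_len grows
```

The exhaustive search blows up exponentially with sequence length, which is precisely the bottleneck the team's deep learning approach is described as removing: after training, the network picks the gates one by one instead of trying every combination.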
Research patented
CNR has patented the research. “Our model surpasses a similar patent by Google, which uses artificial intelligence after training, but only for one logic gate at a time, after which it needs new training,” the researchers said.
Google recently inaugurated its Quantum AI Campus in Santa Barbara, California, where lead engineer Erik Lucero explained why quantum computing will be needed in the coming years.
“Looking ahead 10 years, many of the biggest global challenges, from climate change to the management of the next pandemic, will require a new type of computing,” he said.