Quantum Computing: Forwarding Logistics, Market, and Field Machinery Solutions

In mainstream culture, you hear about quantum computing in science-fiction movies and books. Santiago Nunez-Corrales, Research Scientist at the National Center for Supercomputing Applications (NCSA), University of Illinois Urbana-Champaign, presented the facts about quantum computing at the VISION Conference on Jan. 17-18, 2023, in Glendale, AZ.

Quantum computing is no longer just an idea. Millions of dollars are being invested in research and development, bringing the field to a pre-competitive phase. Nunez-Corrales shared with Global Ag Tech Initiative where quantum computing is poised to take us in the next 10 years if crucial technological challenges are overcome.


What is quantum computing?
Santiago Nunez-Corrales: Quantum computing refers to an evolving collection of theories and technologies intended to allow us to perform computation using a different set of natural laws from those driving traditional computers. While a significant number of technical challenges need to be overcome to reach mainstream commercial significance, we seem to be on track for a major computing revolution in the next decade.

Computers that we currently use, while they benefit from quantum mechanics in the way transistors work, are deterministic, meaning that inputs are transformed by a function into the same outputs. For the same input and operations, you always know you will get the same result. Quantum computers do not operate deterministically, thanks to quantum mechanics, a theory whose effects involve probability and uncertainty. This theory describes phenomena driven by the laws of physics at the scale of atoms, laws which happen to be different from those of the world we usually experience. That is significant for how we define what quantum computing is, so let me give you an example.

If we look at the contents of a file stored on our disk and read it, the bits and bytes will not change. Reading the file will not irreversibly destroy its information. In the case of a quantum bit, the unit of information in the quantum world, observing that unit of information may change its contents, in a way captured mathematically by complex probability amplitudes, and possibly destructively. The act of observing a quantum state is called a measurement, and information may be irreversibly lost.
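
To make the measurement idea concrete, here is a minimal Python sketch using plain numpy (an editorial illustration, not code from the interview; the example state is an arbitrary choice):

```python
import numpy as np

# A qubit's state is a length-2 complex vector of amplitudes for |0> and |1>.
# This particular state is an illustrative choice, normalized so that
# |a0|^2 + |a1|^2 = 1.
state = np.array([3 / 5, 4j / 5], dtype=complex)

# Born rule: each outcome occurs with probability |amplitude|^2.
probs = np.abs(state) ** 2  # -> [0.36, 0.64]

# Measurement samples one outcome and collapses the state onto it.
outcome = np.random.choice([0, 1], p=probs)
post = np.zeros(2, dtype=complex)
post[outcome] = 1.0

print(f"measured |{outcome}>, post-measurement state = {post}")
# The original amplitudes (including their complex phases) are gone:
# that information has been irreversibly lost to this measurement.
```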

Another interesting phenomenon is called superposition. When we store data on a computer, only one possible value may reside in a memory location at any single time. In classical computers, registers are an example of memory locations that live inside the microprocessor. In a quantum computer, you can also have similar units of storage (e.g., quantum registers), with which you can obtain multiple combinations of states representing multiple possible worlds or alternatives at the same time. But there is a catch: reading that unit of memory leaves us with only one possible result, chosen with the probability assigned to it by quantum laws.
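
The catch Nunez-Corrales describes can be shown in a few lines. This sketch (again our numpy illustration) prepares a two-qubit register in an equal superposition and reads it repeatedly; each read returns exactly one value:

```python
import numpy as np
from collections import Counter

# A 2-qubit register has 4 basis states: |00>, |01>, |10>, |11>.
# An equal superposition holds amplitude 1/2 on each of them at once.
register = np.full(4, 0.5, dtype=complex)

# Each read yields exactly ONE of the four values (probability 0.25 each),
# and a fresh superposition must be prepared before every read.
reads = [np.random.choice(4, p=np.abs(register) ** 2) for _ in range(1000)]
print(Counter(f"{r:02b}" for r in reads))  # roughly 250 each of 00, 01, 10, 11
```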

The third interesting property is called entanglement. In classical computer systems, performing computation on data stored in a specific memory location (think of it as a unique address for data) does not alter other pieces of data already present in other memory locations. We hence say that classical computation exhibits locality, or the property ensuring independence of each location when another is transformed in a program. Another way to understand locality is to think of a single state mapping to a single physical location. But quantum computing changes this expectation.

Entanglement describes the existence of correlations between different physical systems that define a single state. We call these correlations non-local, meaning that they are no longer constrained by physical distances. Take the case of Bell states, or two-qubit quantum states created by a procedure that ensures these correlations exist. Even if these qubits are physically separated, transformations performed in one seem to instantaneously propagate to the other. This has been verified by a recent experiment at LMU and Saarland University, where two quantum memories were entangled across 33 kilometers. Succinctly put, we have a single quantum system split across two locations.
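
A Bell state is simple enough to build in a toy simulator. The sketch below (our numpy illustration of the standard Hadamard-plus-CNOT construction) shows the perfect correlations he describes:

```python
import numpy as np
from collections import Counter

# Standard Bell-state construction: Hadamard on qubit 0, then CNOT
# with qubit 0 as control (qubit 0 is the most significant bit here).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4, dtype=complex)
state[0] = 1.0                          # start in |00>
state = CNOT @ np.kron(H, I) @ state    # -> (|00> + |11>) / sqrt(2)

# Jointly, the two qubits always agree (00 or 11), even though each one
# alone looks like a fair coin flip; this is the hallmark of entanglement.
samples = np.random.choice(4, size=1000, p=np.abs(state) ** 2)
print(Counter(f"{s:02b}" for s in samples))  # only '00' and '11' appear
```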

These counter-intuitive properties of quantum mechanics, when cleverly harnessed, allow us to build computers in a whole different way. So, why is that important?

Well, it turns out that there are computational problems that are really hard, with significant practical applications to everyday life. The mathematics behind quantum computing indicates that these new devices may have an advantage for many of them, and that quantum algorithms can exploit properties found in the structure of such problems. Searching large databases quickly, simulating chemistry, finding optimal scheduling routes for logistics and transportation, and predicting stock market changes appear to be challenges where quantum computers will be able to provide an advantage once these devices mature.

For other problems, quantum computers do not provide an evident advantage, at least just yet. Take the following case. You have a pizza delivery person who must deliver pizzas across 10 different cities using the shortest possible route that connects all of them and visits each exactly once. This is called the Traveling Salesman Problem, or TSP. It turns out to be a very difficult problem, since, to solve it exactly, all possible routes need to be explored before deciding which is the best one. When you go from 10 cities to 20 cities, it becomes much harder. And with 40, not even the age of the universe suffices to wait for the solution. You need a lot of computing power if you want to solve it exactly, or you must take shortcuts and accept approximately good solutions. However, even if we can solve this problem on a quantum computer, research so far indicates that doing so does not lead to better solutions than classical algorithms provide.
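
The brute-force approach he alludes to is easy to write down and easy to watch fail. Here is a short Python sketch (our illustration, using random distances rather than any real data):

```python
import itertools
import math
import random

def exact_tsp(dist):
    """Shortest closed tour visiting every city exactly once, by exhaustion."""
    n = len(dist)
    best = float("inf")
    # Fix city 0 as the start so rotations of the same tour aren't re-counted.
    for perm in itertools.permutations(range(1, n)):
        tour = (0, *perm, 0)
        best = min(best, sum(dist[a][b] for a, b in zip(tour, tour[1:])))
    return best

random.seed(0)
n = 10
dist = [[random.random() for _ in range(n)] for _ in range(n)]
print(exact_tsp(dist))      # (10 - 1)! = 362,880 tours: still feasible
print(math.factorial(39))   # tours for 40 cities: ~2 x 10^46, hopeless
```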

Can you explain how real quantum computing is in 2023? Is this something that’s already created or is it a theory?
SNC: The idea of quantum computing appeared circa 1980 with work by Richard Feynman, Yuri Manin, and David Deutsch, who suggested that quantum computers could perform tasks classical computers could not. Up to 2010, the field remained mostly a theoretical and experimental sub-field of physics. From the 2010s onward, however, there has been an explosion of companies, technologies, and markets. Companies in particular have started to invest a significant share of resources, hiring large numbers of people to build and commercialize these quantum devices.

We are at a point where it is no longer a science experiment. There’s significant economic backing from industry: for 2022, the World Economic Forum estimated $35 billion in investments from government, academia, and industry worldwide. There is a lot of research and technology development being poured by companies and startups into moving from current systems, called noisy intermediate-scale quantum (NISQ) devices, to fault-tolerant quantum ones. NISQ machines tend to have a small number of qubits; these decohere easily, and the machines require constant calibration. Reaching fault tolerance would allow using quantum systems similarly to how you would work with a classical computer: you don’t have to worry about scientists needing to calibrate your system before using it every day. Then there is also something called quantum advantage, which defines the point at which a quantum computer will overtake even large classical computers in terms of performance on certain problems, but how and when remains a matter of debate.
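
To give a feel for what "decohere easily" means, here is a toy dephasing model in numpy (an editorial illustration, not a model of any particular device):

```python
import numpy as np

# Toy dephasing: at each time step, a qubit in superposition picks up a
# small random phase error. Averaging e^{i*phase} over many trials
# measures how much usable coherence survives.
rng = np.random.default_rng(1)
trials, steps, sigma = 10_000, 50, 0.2

phase_errors = rng.normal(0.0, sigma, size=(trials, steps))
phases = np.cumsum(phase_errors, axis=1)             # errors accumulate
coherence = np.abs(np.exp(1j * phases).mean(axis=0))

print(coherence[[0, 9, 49]])  # decays from ~1 toward 0 as errors pile up
```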

In terms of available technologies, multiple kinds of quantum computing devices can be found. The most common ones are based on superconducting qubits, such as those developed by IBM, Google, and Rigetti. These operate at temperatures below that of deep space. Another type of system is based on ion traps, where lasers fix the position of individual atoms, cooling them down and performing operations on them; examples include IonQ and the QSCOUT testbed at Sandia National Laboratories. Other companies use photons, particles of light, to implement qubits. And D-Wave, one of the first quantum computing companies to put products on the market, uses quantum annealing devices, which map optimization problems to energy minimization tasks.

Right now, a lot of research and development is happening in a pre-competitive space. I think we are in a similar position with quantum computing today as people were in the 1950s-1960s with classical computers. If we go back to that time, many unforeseen problems were encountered as computers were developed. People experimented with materials for building them, learned to identify and organize components that could be used to perform any calculation (we call this universality), and software as we know it today did not exist. We are in a similar situation now, with the advantage of hindsight from the general principles we have learned about how to build computers, but with unique properties of quantum systems that make that prior experience not necessarily transferable as is. In addition, we are still learning how to integrate classical and quantum computers, a task NCSA will soon start contributing to as well.

I estimate it will take at least 5 to 10 years from now for devices to start delivering performance at a scale that makes them useful at commercial levels. Several challenges need to be met, from the materials science needed to produce stable quantum systems to the science of how far entanglement can scale. Current systems range from having only a handful of qubits to more than 400 qubits of limited reliability. Known quantum algorithms require tens of thousands of qubits to solve useful cases, and there are scientific and technical unknowns on the way to that point. Also, we need to devise ways of programming quantum computers that are more familiar to today’s programmers than to physicists. Part of my own work is now dedicated to creating high-level programming languages for quantum computing and finding useful algorithms that can run even on these noisy devices.

However, one of the key messages is that we need to start right now: understanding how quantum computers work, what they can do now, the limitations that need to be overcome before reaching a point of full commercial viability, and what we can expect from then into the future. Programming quantum systems differs significantly from what we are used to, and the most important barrier to adoption across the board will be having trained professionals. Thus, we need to start thinking about training people across sectors and industry verticals. Those companies and sectors of the economy that can leverage quantum computing in the future and know how to do it are going to profit the most.

What can a quantum computer do that the computers we use today can’t do?
SNC: For the moment, quantum computers remain limited in what they do. But, if everything goes according to plan, we will see this change within a decade. Let’s look at an interesting application for quantum: cryptography.

There is a very emblematic problem, which is number factoring in cryptography. It underpins the encryption used by credit card companies, medical record systems, and national security agencies to secure sensitive information. Data is encrypted using algorithms to make sure nobody can read what’s in there except those with the right authorization.

The reason modern encryption is secure depends on properties of the underlying mathematical problem: the multiplication of prime numbers in asymmetric public-key encryption. If you multiply two prime numbers, obtaining and verifying the answer is really quick. But if you have just a very large number that is the product of two unknown prime factors, it can take many computers in a data center multiple months just to recover those factors. No quantum computer can solve this problem of number factoring exponentially fast right now, but thanks to Shor’s (and other) algorithms, they will likely be able to do so once a sufficient number of quantum bits becomes available.
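
The asymmetry he describes, multiplying being easy while factoring is hard, can be felt even at toy scale. This Python sketch (our illustration, with small primes nowhere near real key sizes) uses naive trial division, the simplest classical attack:

```python
import math

# Easy direction: multiplying two primes is a single operation.
p, q = 999_983, 1_000_003   # illustrative primes, far smaller than RSA keys
n = p * q

def factor(n):
    """Naive trial division: about sqrt(n) steps, hopeless for RSA-sized n."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None  # n is prime

print(factor(n))  # (999983, 1000003): already around a million divisions
```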

For machine learning, and linear algebra more generally, there is a great possibility that we can get exponentially faster performance. What does this mean in practice? It means that something that may take hundreds of years on multiple classical systems may be reduced to hours on a quantum computer. This, again, can be expected with fault-tolerant quantum computing hardware in the future.

How does this apply to agriculture?
SNC: Agriculture poses a series of very interesting problems where quantum computing will likely provide a substantial advantage. For example, image recognition has become really significant for agriculture. The ability to train systems using large databases depends on the speed at which machine learning works. By speeding up machine learning with quantum computing over the next 10 years or more, we might be able to process far larger amounts of data than is possible today.

Another problem quantum computers can help with is predicting the behavior of markets. This is particularly significant for producers, especially considering perishable products with a short shelf life. The ability to predict the behavior of markets can also help optimize entire value chains, the chain of events that goes from production to being on the shelf. There are several problems in graph theory behind the way we route products across the world and the way timing must work for ships to be loaded, moved, and unloaded. Many of these logistics problems translate directly to computational problems for which quantum computing has an advantage.

Finally, as we discussed during the VISION Conference 2023, simulating nature is expected to be one of the tasks at which quantum systems will excel. This has profound implications for how we produce agrochemical compounds, in particular fertilizers.

How accurate do we know quantum computing is for doing these things? Is there any proof, like a percentage of accuracy?
SNC: It is very important to adjust expectations at this precise moment, to remain positively skeptical and constantly informed about advances in the field. We know that quantum algorithms can be developed, and the literature constantly shows progress toward applications. Computational complexity theory, the field of computing concerned with understanding the “hardness” of problems, indicates quantum advantages clearly exist if the devices behave as required by the mathematics. On the hardware front, however, we need to wait until existing quantum devices mature for these advantages to materialize. When talking about quantum computing, we need to remember that the pie-in-the-sky benefits from quantum advantage are 10 years or more in the future.

We know from theory, existing platforms, and preliminary work that benefits from these technologies are possible. This means that we need to start thinking about quantum now, so that we are ready when the platforms arrive.

Is there anything else that you wanted to add?
SNC: One of the main aspects that makes quantum computing relevant is the way it has transformed how we conceptualize what computation is. We tend to think of a computer as a piece of matter with a keyboard and certain devices that work in a certain manner: we input bits and bytes to obtain new bits and bytes, to which we can assign useful meaning in the real world. The way these computers work also evokes a certain series of steps and rules dictating how we build computers and how we write programs for them.

Quantum computing is a revolution in the sense that it changes how we understand what computing itself is. As we expand on that horizon, it can lead us to identify and solve new problems through processes we could not conceptualize as computing before. That will be crucial. For example, we’ll be able to look at the problems that the agriculture sector faces that are tied to how complex coupled human-natural systems evolve spatially and temporally.

I think that the main point to keep in mind is that quantum computing has expanded our concept of what computing is, and with it, the nature and number of the problems that we may be able to solve in the not so distant future.
