August 29, 2023 in Digital Twins

Quantum Business Intelligence and Quantum Augmented Digital Twins

https://doi.org/10.1287/orms.2023.03.07

Business intelligence (BI) is a crucial component of the day-to-day operation of most modern businesses. However, as a concept, it has not evolved much over its relatively long history. At its core, it is a means of making predictions about the future based on history. In other words, we deduce that because we got outcome x for scenario y in multiple instances, it will likely happen again. While this is a proven method of augmenting executive decision-making, the commoditization of data, along with significant advances in computing, now allows systems to store and process data far more effectively and efficiently. 

The first practical use case of “digital twins” (DTs) is often credited to a 2002 presentation by Professor Michael Grieves at the University of Michigan. Grieves defines digital twins as “a set of virtual information constructs that fully describes a potential or actual physical manufactured product from the micro atomic level to the macro geometrical level” [1]. In 2010, NASA announced that it would use what John Vickers termed “digital twin” technology to improve the production and maintenance of spacecraft [1]. To date, digital twins have been used primarily in manufacturing to represent individual components and test them for failure, load capacity and other properties. DTs have grown in popularity in parallel with the increased importance of the Internet of Things (IoT) in industry, whose sensors allow for the constant generation of up-to-date data. However, the simulation algorithms used in DTs are incredibly compute intensive, limiting their current use – particularly, the amount of data the models can process. This creates an opportunity for quantum computing to enhance and evolve this tool. 

Quantum computing is rapidly becoming an enterprise tool that will bring about a dramatic shift in how businesses use computing devices. Quantum computers (QCs) can, by their nature, outperform classical computers on certain classes of problems. We will discuss how digital twin technology can be used in conjunction with quantum computers to create a new generation of business intelligence that is forward-looking, allowing businesses to receive situational representations of simulated activities (or chains of activities) and their respective outcomes while taking into account abundant live, up-to-date data and more complex simulation algorithms. 

Qubits 

To understand why quantum computers can process data rapidly and in abundance, it’s important to understand the qubit, or quantum bit. Classical bits are binary – they are either “0” or “1,” but not both. Qubits, on the other hand, can be “0” and “1,” or any point in between, in a state called superposition. Superposition creates a large state space – 2^n for n qubits – allowing many possible values to be represented, each with its own probability amplitude. The issue with qubits is that when they are “read,” they assume a state of “0” or “1” like a classical bit. But unlike a classical bit, which can be read multiple times, a qubit’s superposition is destroyed by measurement. In other words, it can only be read once. 
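As a minimal illustration (assuming the open-source Qiskit library and its Aer simulator are installed), the following sketch puts a single qubit into an equal superposition and measures it many times. Each measurement collapses the superposition to a definite 0 or 1, with the two outcomes each appearing roughly half the time.

```python
# Minimal sketch (assumes qiskit and qiskit-aer are installed): one qubit in
# superposition, measured repeatedly. Each shot collapses to 0 or 1.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1, 1)
qc.h(0)            # Hadamard gate: equal superposition of |0> and |1>
qc.measure(0, 0)   # measurement collapses the superposition

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)      # roughly {'0': 500, '1': 500}
```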

Because qubits aren’t a static 0 or 1, they can represent multiple values at once, and in turn, these values can be processed at the same time using quantum parallelism. This is a key difference between quantum and classical computers. To solve a common optimization problem, such as the traveling salesman problem, a classical computer has to evaluate every route individually and then pick the most efficient one. A quantum computer, due to the properties of qubits, can in effect evaluate many routes at the same time. 

Entanglement is another property of quantum physics that provides a substantial processing speedup compared with classical bits. Entanglement means that the state of one qubit becomes tied to the state of another. Once two qubits are entangled, manipulating or measuring one of them affects the value obtained from the other. In a pair prepared with opposite states, measuring qubit 2 also tells you the state of qubit 1. Thus, processing one qubit provides the results of two qubits. Measuring one qubit collapses the state of the entangled pair; however, there are ways to extract partial information from a qubit without destroying the rest of the state. (This is outside the scope of this article.) 
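A minimal sketch of this behavior, again assuming Qiskit and its Aer simulator are installed, prepares two qubits as an entangled pair with opposite states. However many times the circuit is run, the two qubits always read out opposite values – measuring one immediately tells you the other.

```python
# Minimal sketch (assumes qiskit and qiskit-aer): an entangled pair whose
# qubits always measure to opposite values.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

bell = QuantumCircuit(2, 2)
bell.h(0)                     # superposition on qubit 0
bell.cx(0, 1)                 # entangle qubit 1 with qubit 0
bell.x(1)                     # flip qubit 1 so the pair is anti-correlated
bell.measure([0, 1], [0, 1])

counts = AerSimulator().run(bell, shots=1000).result().get_counts()
print(counts)                 # only '01' and '10' appear
```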

Quantum Business Intelligence 

As of this writing, there is no method of permanently storing large amounts of data on quantum computers. This is due to the fragile nature of qubits – they must be kept under very precise (and difficult to maintain) conditions, and any noise can degrade the information. Quantum error correction is an important field of research in quantum computing aimed at creating fault-tolerant systems. Until QCs are more fault tolerant, large data sets will still need to be stored on classical hardware (hard drives and solid-state drives). Because this is where business data already resides, organizations would not necessarily have to migrate data to “quantum storage” in the short term. Additionally, it may be a very long time before storing data in quantum storage is cheaper and more effective than classical storage. 

In classical digital twins (i.e., digital twins developed on classical computing hardware), middleware is used to connect the physical and digital worlds. The middleware takes live data from the physical world and transmits it to the digital world, where data-driven simulation algorithms are run to predict outcomes under various circumstances, presented through visualizations and standard analytic tools. This approach has been widely used in manufacturing for process monitoring and for equipment testing and maintenance. The middleware in a quantum digital twin would work similarly, only it would include a full suite of BI tools and utilities (reporting, dashboards, analytics, etc.). It would also be where the user accesses the digital twin. 

The advantage of using quantum computing in this process is the increased amount of data that can be processed. A standard computer can only store one value at a time in a register. A quantum computer can store and work with 2^n values, where n is the number of qubits being used [2]. To put that into perspective, IBM’s most powerful quantum computer available to customers at the time of writing has 127 physical qubits. When data is fed into the quantum computer via the middleware, calculations are completed to create various simulations, analytics and visualizations. The quantum computer itself is not generating these insights. Quantum computers, depending on the algorithm, may be used to speed up specific types of calculations on large amounts of information brought into the quantum processor, leaving the remaining interpretation and calculations to the classical computer. Given a scenario, the quantum computer will calculate an array of potential outcomes. This can be thought of as a remarkably large Monte Carlo simulation. When the calculation is transmitted back to the middleware, the digital twin can be created. 
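To make the 2^n scaling concrete, the short sketch below (assuming Qiskit is installed) builds a five-qubit register in uniform superposition and inspects its statevector: five qubits already carry 2^5 = 32 amplitudes, and every added qubit doubles that count.

```python
# Minimal sketch (assumes qiskit): n qubits in superposition carry 2**n amplitudes.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

n = 5
qc = QuantumCircuit(n)
qc.h(range(n))                            # uniform superposition over all basis states
state = Statevector.from_instruction(qc)  # simulate the full quantum state
print(len(state.data))                    # 32 == 2**5 amplitudes tracked at once
```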

Figure 1. Flow of data movement and processing. 

Essential Quantum Algorithms 

Shor’s Algorithm. Shor’s algorithm uses quantum phase estimation to factor integers by finding the period of a repeating pattern. One of the general concerns raised by the advancement of quantum computing research is the potential for quantum computers to break modern methods of encryption – most notably RSA. Shor’s algorithm, plainly put, finds the prime factors of an integer N in polynomial time. It is this polynomial runtime that gives quantum hardware its computational advantage: the best known classical factoring algorithms run in superpolynomial time, which is what keeps RSA secure today. The algorithm works by finding the period r that solves the equation a^r mod N = 1, where a and N are positive integers [3]. In essence, Shor’s algorithm (built on quantum phase estimation) is a pattern-finding algorithm used to solve problems that can’t reasonably be solved on classical computing hardware. Pattern-finding algorithms like Shor’s will be useful in quantum BI because pattern finding is a core purpose of BI. It is likely that (similarly to the new considerations Shor’s algorithm forced in cryptography) new BI-specific pattern-finding algorithms will be developed, surfacing insights that couldn’t currently be discovered on classical computational hardware. 
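For intuition about what Shor’s algorithm actually computes, the toy sketch below finds the period r by classical brute force for a small case (factoring 15 with base 7) and then uses r to recover the factors. The quantum speedup comes from performing the period-finding step with phase estimation instead of this loop; the example is purely illustrative.

```python
# Toy illustration of the period-finding step behind Shor's algorithm:
# find the smallest r with a**r mod N == 1, then use r to recover factors of N.
# (On real key sizes this brute-force loop is intractable; Shor's algorithm
# replaces it with quantum phase estimation.)
from math import gcd

def find_period(a: int, N: int) -> int:
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

N, a = 15, 7
r = find_period(a, N)             # r = 4, since 7**4 mod 15 == 1
p = gcd(a ** (r // 2) - 1, N)     # gcd(48, 15) = 3
q = gcd(a ** (r // 2) + 1, N)     # gcd(50, 15) = 5
print(r, p, q)                    # 4 3 5
```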

Grover’s Algorithm. Grover’s algorithm is an example of an algorithm that will broadly improve the capabilities of enterprise software (enterprise resource planning (ERP), human capital management (HCM), etc.) and reporting/BI. It is an unstructured search algorithm that performs quadratically better than a comparable algorithm on classical hardware. When searching a list of unsorted data, a classical computer has to check approximately N/2 items on average before finding the target, whereas Grover’s algorithm can find the target in roughly √N steps [4]. This suggests that quantum computing can significantly help in creating a digital twin of an organization by finding data items at speeds that cannot be matched on classical hardware. By speeding up queries, businesses can more efficiently get the data they need and can even begin querying larger data sets with minimal runtime. This will enable more advanced dashboards and queries, and it may be necessary for BI to keep pace with the rate of data generation alone.

In 2021, it was forecast that 79 zettabytes of data and information would be created, captured, copied or consumed that year [5]. That amount is predicted to grow to 181 zettabytes in 2025 – growth of more than 129%. Although these estimates have not been updated since the data set was compiled and 2021 has since passed, they make the point that data is being created at an exceptionally, and perhaps unsustainably, fast pace. In 2023, reporting can already be slow and, in some cases, depending on the data being retrieved, completely unusable (reports and/or analytics fail to process or take an exorbitant amount of time). Given Figure 2, we can realistically expect that without innovation, BI will be of no practical value by 2025 for some businesses – primarily large businesses working with mass amounts of consumer data. 

Figure 2. Volume of data/information created, captured, copied and consumed worldwide from 2010 to 2025 (in zettabytes). Source: Statista.com 

This is not to say that smaller businesses, government entities or hospitals will not struggle with the status quo of BI. For example, due to the COVID-19 pandemic, public health entities are creating and tracking more data than ever before. Compounding this, it is becoming easier to generate health data at home that can be ingested by healthcare providers, thanks to the increased innovation in, and popularity of, personal health tracking devices and services. To get a good picture (or a digital twin, even) of the entire business of a hospital, many branches of data have to be associated. Often, this data comes from a mix of sources, some hosted and some on-premises. It would be a major undertaking for classical computing to generate a digital twin given the amount of unstructured data to search. Quantum algorithms such as Grover’s and Shor’s, among others, hold promise for tying multiple systems together to capture a real-time image of businesses, such as hospitals, that cannot reasonably be captured today. 
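As a minimal illustration of the quadratic speedup Grover’s algorithm offers, the sketch below (assuming Qiskit and its Aer simulator are installed) runs Grover’s algorithm on a search space of four items, marking the state |11>. A single Grover iteration – one oracle call plus one “diffuser” – returns the marked item essentially every time, whereas an unstructured classical search has to check items one by one.

```python
# Minimal Grover sketch (assumes qiskit and qiskit-aer): search space N = 4,
# marked item |11>. One iteration finds the marked item with near certainty.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

grover = QuantumCircuit(2, 2)
grover.h([0, 1])              # uniform superposition over all four items
grover.cz(0, 1)               # oracle: phase-flip the marked state |11>
grover.h([0, 1])              # diffuser: reflect amplitudes about the mean
grover.x([0, 1])
grover.cz(0, 1)
grover.x([0, 1])
grover.h([0, 1])
grover.measure([0, 1], [0, 1])

counts = AerSimulator().run(grover, shots=1000).result().get_counts()
print(counts)                 # ~{'11': 1000}: the marked item every time
```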

Use Cases 

The main use case addressed here is that with QC technology, businesses will have enough compute power to process their data continuously and quickly. In other words, they can simulate different scenarios on live, up-to-date data. Through quantum augmentation of BI tools, artificial intelligence and machine learning, businesses can accurately evaluate the entire risk of decisions being made – whether entering a new segment, acquiring a business, restructuring, simulating the effects of a natural disaster, etc. This is a new degree of business intelligence because it is not just predicting what is likely to happen based on regression or heuristics for one specific indicator (sales, inventory, hiring data, etc.). It is a comprehensive cause-and-effect analysis of an entire organization. An example of a business case is as follows: 

If a product doesn’t meet sales expectations, which means quarterly goals aren’t met, will that affect the number of employees we can hire in the next quarter? 

Given the abundance of sales and people operations data in large companies, this is an insight that would take significantly more time to derive utilizing modern methods.  

How can this help businesses quickly navigate developing and changing situations, such as natural disasters? Currently, leaders have few analytical tools that present live data augmented by external forces. Quantum BI and digital twins will assist leaders in making precise decisions when time is at a premium. For example, suppose a business has a major office in New Orleans, Louisiana, where a hurricane is upgraded to a Category 5 just before landfall, and the business needs to know what resources it will need to navigate the situation and help its employees and other assets in the area. A quantum system can ingest live weather data outlining the severity of the storm at regular intervals. Historical data on people who evacuated or stayed during previous storms can be used to estimate the risk of someone not evacuating. Census and payroll data can create a profile of the average resources an individual has to mitigate the storm’s damage, the number of people in the household and the value of the house. All this data is then fed into a quantum algorithm (or series of algorithms) to create risk profiles for the areas of Louisiana in which employees live. When the results are passed back to the middleware, individual employee data can be compared with the output from the QC, and direct aid can be offered. 
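To ground this scenario, here is a deliberately simplified, purely hypothetical sketch of the risk-profiling step on classical hardware – the field names, weights and ZIP codes are invented for illustration. In the envisioned workflow, the scoring or simulation step would be offloaded to a quantum algorithm; a simple weighted score stands in for it here.

```python
# Hypothetical sketch of the risk-profiling step described above. All field
# names, weights and ZIP codes are illustrative; a quantum algorithm would
# replace the simple scoring function in the envisioned workflow.
from dataclasses import dataclass

@dataclass
class AreaProfile:
    storm_severity: float    # 0-1, from the live weather feed
    stay_probability: float  # 0-1, from historical evacuation data
    avg_resources: float     # 0-1, from census/payroll data (1 = well resourced)

def risk_score(p: AreaProfile) -> float:
    # Higher severity and higher probability of staying raise risk;
    # greater household resources lower it.
    return p.storm_severity * p.stay_probability * (1.0 - p.avg_resources)

areas = {
    "70112": AreaProfile(0.95, 0.40, 0.30),
    "70115": AreaProfile(0.90, 0.25, 0.60),
}
ranked = sorted(areas, key=lambda z: risk_score(areas[z]), reverse=True)
for z in ranked:
    print(z, round(risk_score(areas[z]), 3))   # highest-risk areas first
```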

It should be noted that although quantum computing will bring about a revolution in business intelligence altogether, traditional BI tools will also see enhancements, making them more effective than ever. Time to generate visualizations and reports will decrease dramatically, leading to more dynamic and robust results. Dashboards can also become dynamic, rather than updating upon request or at a set time interval. Executives can get snapshots at a glance, in real time, without waiting for compute-intensive queries. 

Optimization and Quantum Annealing 

Researchers and developers have established that quantum computers are effective at solving combinatorial optimization problems (such as the traveling salesman problem discussed previously). The quantum approximate optimization algorithm (QAOA) is designed to solve these problems on gate-based quantum systems working in conjunction with classical hardware. Alternatively, quantum annealing exploits the tendency of physical systems to settle into their lowest-energy state. The problem and its data are encoded as an energy landscape on the QC, which then returns a solution or range of solutions according to how much energy each requires. Naturally, the optimal solution is the one requiring the minimal amount of energy. 
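The sketch below illustrates the energy-landscape framing that annealers use, on a deliberately tiny, made-up problem. The optimization problem is encoded as a quadratic unconstrained binary optimization (QUBO) energy function over binary variables; here the landscape is brute-forced classically just to show the encoding, whereas an annealer searches it physically.

```python
# Toy QUBO sketch: encode an optimization problem as an energy function over
# binary variables; the lowest-energy assignment is the best solution.
# (Coefficients are made up; an annealer would search this landscape physically,
# while here we simply enumerate all assignments.)
from itertools import product

# E(x) = sum over (i, j) of Q[(i, j)] * x_i * x_j  (diagonal terms are linear)
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -2.0,
     (0, 1): 2.0, (1, 2): 1.0}

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

best = min(product([0, 1], repeat=3), key=energy)
print(best, energy(best))   # (1, 0, 1) -3.0: the lowest-energy assignment
```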

Business intelligence is ultimately used to find an optimal solution to a problem, so businesses will benefit from both QAOA and quantum annealing. The improvement in processing efficiency granted by quantum computing gives leaders more opportunity to explore optimal solutions to common and new business problems. 

Future of Quantum Computing 

Using quantum computers in conjunction with classical computers to redefine business intelligence is just the first iteration of quantum computing’s enhancement of the way businesses gather insight. Rather than using historical data and regression algorithms to define one heuristic solution to one problem, quantum business intelligence can create an up-to-date digital twin of the business, allowing business leaders to see the effects of a decision or circumstance across an entire organization. Experiments can be run to simulate rapid growth or catastrophic events, allowing decision-makers to prepare more robust plans. In instances in which decisions need to be made quickly, leaders can analyze live data to obtain an instant situational representation of the business, rather than running multiple reports built on slow structured query language (SQL) queries. Upon the practical implementation of quantum algorithms, BI will enter its largest period of advancement since it was computerized. 

Additionally, quantum computing will become necessary for efficient data processing and analytics within the decade. By 2025, data generation and consumption are projected to more than double. To maintain the ability to create information from this data, BI tools will need to integrate quantum computing and quantum optimization. 

Quantum computing has a long way to go, but hardware companies are working fast to make it increasingly viable for businesses. Because many jobs are currently sent to few systems, queueing often increases the time needed to complete calculations. As quantum computers mature, this issue will likely go the way of the time-sharing computers popular in the 1970s. Some companies, namely IonQ, promise a future of rack-mounted quantum systems that organizations could host locally. Quantum business intelligence will continue to grow as the underlying hardware improves. 

References 

  1. Grieves, M., 2016, “Origins of the Digital Twin Concept,” Florida Institute of Technology, DOI: 10.13140/RG.2.2.26367.61609. 
  2. Ménard, A., Ostojic, I., Patel, M. and Volz, D., 2020, “A Game Plan for Quantum Computing,” McKinsey Quarterly, February 6, www.mckinsey.com/business-functions/mckinsey-digital/our-insights/a-game-plan-for-quantum-computing. 
  3. Qiskit Textbook, “Shor’s Algorithm,” qiskit.org/textbook/ch-algorithms/shor.html. 
  4. Qiskit Textbook, “Grover’s Algorithm,” qiskit.org/textbook/ch-algorithms/grover.html. 
  5. Holst, A., 2021, “Total Data Volume Worldwide 2010-2025,” Statista, June 7, www.statista.com/statistics/871513/worldwide-data-created/. 

Michael Heiner

Michael Heiner is an independent researcher and consultant at Heiner Innovation Company, specializing in emerging technologies. With a primary focus on quantum computing and digital twins, his research delves into the dynamic relationship between these fields. Michael possesses a strong professional background in people analytics, having applied his expertise to both the education and healthcare industries. His master’s studies concentrated on cutting-edge technologies, including artificial intelligence, blockchain, gamification and quantum computing. Presently, he is working on an innovative project that leverages quantum computing algorithms to redefine modern business intelligence by creating digital twins of entire organizations. 
