The opposite is true for quantum computing. It is futuristic, unseeable, and hard to wrap one’s mind around. Few understand it. After all, it is based on science that even Einstein famously struggled with.
The good news is that advances in quantum computing are expected to have tangible outcomes, namely massively enhanced computing power, and these outcomes can make a big impact on society. In this simplified context, I’ll address how quantum computing can make a difference and drive impact in the water industry.
Quantum’s role in facilitating access to water
When it comes to addressing UN Sustainable Development Goal 6, Clean Water and Sanitation, the targets are achievable via political will, capital availability and affordability. Technology will not solve for political will, but it can help with affordability and the efficient use of capital.
Quantum computing acts as a facilitating technology to increase affordability by way of reducing capital requirements and teasing out operational efficiencies. How? Quantum computing has two primary use cases relevant to the water industry: 1) machine learning and 2) complex system simulation and optimization. There is also an indirect use case related to quantum computing’s impact on advanced material science and chemistry to address water quality issues via development of advanced filtration media and membranes, but I’ll leave this topic aside.
Machine learning
The water industry, for the most part, has had a love-hate relationship with data. Utilities need ever-increasing amounts of it to operate efficiently, meet regulations, and serve their customers well, but it can quickly become overwhelming.
Water utilities generate significant data at their treatment plants and increasingly throughout their geographically dispersed distribution systems. However, in the U.S., the water utility workforce skews older, and a very high proportion of workers lack college degrees. Also, 85% of water utilities in the U.S. have three or fewer employees. The problem is thus twofold: computer literacy and technology adoption are lagging, while a wave of retirements over the next decade threatens the loss of significant institutional knowledge. Utilities want and need the data but lack the capacity to handle it.
A common refrain from utility managers is that data by itself is not helpful. In fact, it can be a liability: if they hold data that could have led to better outcomes but have not acted on it, then they are liable for poor outcomes. What they really need are decision management tools, preferably working in real time. They need all the data fed into a “box” that spits out useful suggestions.
Machine learning can take millions of data points and, through complex algorithms, facilitate optimal real-world decisions. One of the most frequent machine learning applications in water today relates to infrastructure diagnostics. In essence, it helps answer the question of which pipe should be replaced first. It is much cheaper to fix a pipe before it breaks, but replacing it too early wastes capital. A pipe under a school is more important than a pipe crossing a field, just as a pipe acting as the sole source to a system is more important than one with alternative routing options. Some pipes are likely to crack and leak slowly, while others, like a large-diameter prestressed concrete cylinder pipe, could explode, with peripheral damage that turns a street into a river. Variables like age, material, soil characteristics, and proximity to other burst pipes all factor into the algorithms. The more variables used, the better the assessment, but the more computing power required.
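To make the idea concrete, the sketch below shows one way such a prioritization model could be structured. It is purely illustrative: the features, the synthetic failure data, and the consequence weighting are assumptions made for this example, not the workings of any particular utility’s tool or of a quantum algorithm.

```python
# Minimal sketch of a pipe-replacement prioritization model.
# Feature names, synthetic data, and the consequence weighting are
# illustrative assumptions, not any specific utility's data model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5_000  # hypothetical pipe segments

# Illustrative variables: age (years), diameter (mm), soil corrosivity index,
# and count of nearby historical bursts.
X = np.column_stack([
    rng.uniform(0, 100, n),          # age
    rng.choice([100, 200, 600], n),  # diameter
    rng.uniform(0, 1, n),            # soil corrosivity
    rng.poisson(0.5, n),             # nearby bursts
])
# Synthetic label: older pipes in corrosive soil near past bursts fail more often.
p_fail = 1 / (1 + np.exp(-(0.03 * X[:, 0] + 2 * X[:, 2] + 0.8 * X[:, 3] - 4)))
y = rng.random(n) < p_fail

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Consequence score: e.g., a pipe serving a school or acting as a sole feed
# would get a higher weight than one crossing an empty field.
consequence = rng.uniform(1, 10, n)

# Risk = likelihood of failure x consequence; replace the highest-risk pipes first.
risk = model.predict_proba(X)[:, 1] * consequence
print("highest-priority pipe segments:", np.argsort(-risk)[:10])
```

The last two lines capture the business logic described above: the predicted likelihood of failure alone is not enough; it is weighted by the consequence of a failure before pipes are ranked for replacement.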
Simulation & optimization
The water cycle we were taught in elementary school is both simple to comprehend at a high level and incredibly complex to simulate. Integrating the natural cycle (evaporation, condensation, precipitation, infiltration, runoff) with the engineered water systems for municipal and industrial uses (abstraction, water treatment, distribution, wastewater treatment, discharge) offers significant opportunities to improve operations, reduce system stress, and reduce the industry’s contribution to climate change. For example, better predictive modeling of weather events within a watershed can provide insights into potential changes in operation, such as anticipating feedwater quality changes and adjusting the treatment approach, or creating additional available system capacity (e.g., preemptive emptying) to handle stormwater flows with less risk of overflow events.
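As a simplified illustration of the preemptive-emptying idea, the toy model below schedules releases from a single storage tank against a forecast inflow peak. The capacities, the forecast, and the cost weights are invented for this sketch; a real system involves networks of assets, forecast uncertainty, and far more variables, which is exactly where the computational burden grows.

```python
# Toy sketch of "preemptive emptying": given a forecast inflow hydrograph,
# schedule releases so a storage tank does not overflow during the storm peak.
# All numbers (capacity, release limit, forecast) are made up for illustration.
import numpy as np
from scipy.optimize import linprog

T = 8                      # planning horizon (hours)
capacity = 100.0           # storage capacity (ML)
s0 = 80.0                  # current storage (ML)
max_release = 15.0         # max controlled release per hour (ML)
inflow = np.array([2, 2, 5, 30, 40, 20, 5, 2], dtype=float)  # forecast (ML/h)

# Decision vector z = [releases (T), spills (T), storage levels (T)].
c = np.concatenate([np.ones(T), 1000 * np.ones(T), np.zeros(T)])  # spills are very costly

A_eq = np.zeros((T, 3 * T))
b_eq = inflow.copy()
for t in range(T):
    A_eq[t, t] = 1.0            # release_t
    A_eq[t, T + t] = 1.0        # spill_t
    A_eq[t, 2 * T + t] = 1.0    # storage_t
    if t == 0:
        b_eq[t] += s0           # storage balance starts from current level
    else:
        A_eq[t, 2 * T + t - 1] = -1.0  # minus storage_{t-1}

bounds = [(0, max_release)] * T + [(0, None)] * T + [(0, capacity)] * T
plan = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)

print("hourly releases:", np.round(plan.x[:T], 1))
print("spilled volume:", round(plan.x[T:2 * T].sum(), 1))
```

On this toy data, the constraints force releases in the hours before the peak so the tank has headroom when the storm arrives, which is the preemptive emptying described above.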
Future climate is very relevant to designing infrastructure with intended lifespans of a century or more. Water is the visible part of the climate change spectrum. We see climate change through rising seas, more droughts and floods, more severe weather events, and water-quality degradation. Water infrastructure investments made today may not make sense in a world that is 1.5 degrees Celsius or more warmer. Those one-in-100-year storms seem to come every few years these days. Climate models have advanced a lot over the years but are currently constrained by the computing power needed to analyze at finer geographic resolution and to incorporate all the various feedback loops. Quantum computing could advance our ability to understand what the world will one day look like.
Recent years have seen increased adoption of digital twins, which are parallel computer simulations of process industries, such as water utilities, incorporating a vast array of dynamic variables. Think of it like a flight simulator for utilities. They can “operate” risk-free in various hypothetical environments to learn how best to manage real-world situations, thus driving operating efficiencies and improved customer outcomes. For capital optimization, generative design software enables the utility process to be altered virtually, becoming a tool for trialing multiple potential capital investments “for free” within the constraints of the real-world system. In other words, instead of engineering a few alternative design solutions to address new growth or quality regulations or simply fix a dilapidated part of their system, software enables the testing of thousands of alternatives in the virtual realm, thus saving significant time and capital to arrive at the optimal solution. One water industry contact suggests that rapid simulations, powered by quantum computing, may help accelerate adoption of new, more cost-efficient technologies.
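A stripped-down version of that “test many alternatives virtually” workflow might look like the following. The cost curve and capacity check here are stand-ins invented for this example; real generative-design tools evaluate full hydraulic and process models, and the search space quickly grows far beyond what brute-force enumeration on classical hardware can comfortably handle.

```python
# Illustrative sketch of screening many design alternatives in a virtual model.
# The cost and performance functions below are hypothetical stand-ins; a real
# digital twin or generative-design tool would run a full process simulation.
import itertools

pipe_diameters_mm = [200, 300, 400, 500]
tank_volumes_ml = [5, 10, 20, 40]
pump_counts = [1, 2, 3]

def capital_cost(d, v, p):
    # Hypothetical cost curve: bigger pipes, tanks, and more pumps cost more.
    return 0.8 * d + 50 * v + 400 * p

def meets_demand(d, v, p, peak_demand=55):
    # Hypothetical capacity check standing in for a hydraulic model.
    capacity = 0.1 * d + 1.5 * v + 10 * p
    return capacity >= peak_demand

# Enumerate every combination "for free" in the virtual model and keep the
# cheapest design that still meets the performance constraint.
feasible = [
    (capital_cost(d, v, p), d, v, p)
    for d, v, p in itertools.product(pipe_diameters_mm, tank_volumes_ml, pump_counts)
    if meets_demand(d, v, p)
]
best_cost, best_d, best_v, best_p = min(feasible)
print(f"cheapest feasible design: {best_d} mm pipe, {best_v} ML tank, "
      f"{best_p} pump(s), cost {best_cost:.0f}")
```

With only three design variables the enumeration is trivial; with a realistic system model and thousands of interacting variables, the evaluation of each alternative is where faster simulation would pay off.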
While machine learning and complex system simulation and optimization are already used today in the water industry, the advent of quantum computing changes their nature and impact. Much of the analysis run on today’s computing power is still sequenced: gather the data, run the models, spit out answers. The process could take seconds, or it could take hours or days. Data proliferation and the evolving nature of certain variables over time (think climate change) add to the wait. And to speed up models today, analysts might use rules of thumb or simplified engineering algorithms instead of real-time data, which impacts the accuracy and usability of the results. The long waits and data shortcuts limit the usefulness of the analyses.
Quantum computing has the potential to change the nature of the analysis to real-time. The gathering, computing, and application of answers can happen nearly simultaneously. This can lead not only to lower-cost water utility operations but also to reduced harm from natural events that might impact their functioning. It also leads to better capital decisions, which can mean more capital available for more projects. The improvements in customer outcomes and in the affordability of water and sanitation services ultimately generate genuine impact in the world, a world that will face ever more climate challenges, manifested in water.
This article is one section of the report, “Quantum Impact — The Potential for Quantum Computing to Transform Everything.” Click here to learn more and access the full report.
Please see the PDF version of the full report for important disclosures.