The convergence of Artificial Intelligence and Cyber-Physical Systems, including electronic and physical security systems and Digital Twins, offers unparalleled opportunity and unprecedented risk. Adopting a Zero-Trust framework mitigates those risks and enhances the effectiveness of Digital Twins, enabling organizations to exploit cutting-edge technology to the fullest while improving safety and security.

“All models are wrong, some are helpful.”

The line is a friend’s rephrasing of 20th-century statistician George Box’s axiomatic admonishment: “Remember that all models are wrong; the practical question is how wrong do they have to be not to be useful.” Digital Twins, high-fidelity synthetic doppelgangers that model everything from automotive engines to the power grid, are powerful tools. In the context of electronic, physical, and cyber security, they enable security professionals to understand the effects of incidents, from environmental anomalies to physical and cyber intrusion attempts, on operational networks, facilities, and systems. The insights gleaned from these disruptive tools, particularly Artificial Intelligence (AI)-enabled real-time variants, equip planners to identify efficiencies, vulnerabilities, and optimization opportunities across the entire system life cycle, from design to obsolescence.

Despite the risks inherent in Cyber-Physical Systems (CPS) and high-profile attacks by threat actors such as Volt Typhoon, Sandworm, and CyberAv3ngers, the market has embraced the promise of efficiency and cost-saving potential. A recent Gartner report projects the Digital Twin market to grow from $35 billion in 2024 to $379 billion in 2034, more than a tenfold increase over the coming decade. This technical article, the third in a four-part series exploring the convergence of cutting-edge technology and Zero Trust (ZT), examines the importance of getting data security right in the relentless pursuit of cost-effective, reliable, and frictionless security.

In the late 1950s and early 1960s, mathematicians, computer scientists, and strategists began using the phrase “garbage in, garbage out,” or GIGO, to convey that the quality and credibility of a system’s outputs (actions, analyses, recommendations, and decisions) depend on the quality and credibility of its inputs: data. Today, with AI dominating headlines and investment portfolios, GIGO is truer than ever as we rush to innovate at the speed of relevance. AI unleashed on dubious, poisoned, or corrupted data sets can drive equally flawed actions, analyses, and recommendations. The implications are even more dire as we recognize a reality we have lived with for at least two decades and will continue to live with for decades to come: the ubiquity of AI-informed or AI-rendered decisions.

Bad data in = bad data out. A more extended version of that statement might read: questionable data can prompt both indecision, driven by skepticism about its credibility, and overconfidence, driven by the mistaken belief that unprotected data is accurate, credible, and unpoisoned.

In 1992, James Carville admonished, “It’s the economy, stupid!” Thirty-three years later, political pundits may still debate the Ragin’ Cajun’s pithy tirade. Politics aside, though, the debate is over in the context of converged cyber and physical security systems in 2025: it’s the data, stupid! High-fidelity Digital Twins mine petabytes of data generated by myriad edge devices deployed at scale across CPS and industrial control systems (ICS), including supervisory control and data acquisition (SCADA), operational technology (OT), and Industrial Internet of Things (IIoT) sensors (e.g., thermostats, motion sensors, smoke detectors) and actuators (robots, lighting controls, valves, hydraulics, etc.). For today’s connected industrial controls, building automation, and security systems to deliver the promise of a better, safer connected future, it is paramount to protect the data these systems produce, ingest, and analyze, from inception to archive and everywhere in between. Unless that data is sufficiently protected, planners, decision-makers, and operators should treat system-generated recommendations with caution.
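One well-known way to protect data “from inception to archive” is to make tampering detectable rather than merely forbidden. The sketch below, a minimal and entirely illustrative hash chain (the record layout and function names are hypothetical, not any vendor’s protocol), links each telemetry record to its predecessor so that any after-the-fact alteration of archived data breaks the chain:

```python
import hashlib
import json

def chain(records):
    """Link each record to its predecessor via SHA-256."""
    prev = "0" * 64  # genesis value for the first record
    out = []
    for rec in records:
        body = json.dumps(rec, sort_keys=True)
        h = hashlib.sha256((prev + body).encode()).hexdigest()
        out.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return out

def verify(chained):
    """Recompute every link; any edited record breaks the chain."""
    prev = "0" * 64
    for entry in chained:
        body = json.dumps(entry["record"], sort_keys=True)
        if entry["prev"] != prev:
            return False
        if hashlib.sha256((prev + body).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = chain([{"sensor": "temp-01", "value": 21.5},
             {"sensor": "temp-01", "value": 21.7}])
assert verify(log)                 # untouched archive verifies
log[0]["record"]["value"] = 99.9   # tamper with an archived reading
assert not verify(log)             # tampering is detected
```

The design choice here is the GIGO point in miniature: downstream analysis need not blindly trust archived inputs when their integrity can be checked mechanically.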

Digital Twins are generally characterized by three components: the physical or virtual object, entity, or system to be modeled; a digital counterpart to that entity; and, often, a data connection that feeds telemetry from the sensors and actuators in the live system to its twin.
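Those three components can be sketched in a few lines of code. This is a deliberately minimal illustration of the structure described above; every class and identifier is hypothetical, and a real twin would model physics, not just mirror values:

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    sensor_id: str
    value: float

@dataclass
class PhysicalAsset:
    """Component 1: the live entity, which emits telemetry."""
    asset_id: str

    def read_sensors(self) -> list:
        # In a real deployment this would poll OT/IIoT devices.
        return [SensorReading("temp-01", 21.5),
                SensorReading("valve-07", 0.0)]

@dataclass
class DigitalTwin:
    """Component 2: the digital counterpart, mirroring last known state."""
    asset_id: str
    state: dict = field(default_factory=dict)

    def ingest(self, readings: list) -> None:
        # Component 3: the data connection that updates the twin.
        for r in readings:
            self.state[r.sensor_id] = r.value

asset = PhysicalAsset("pump-station-3")
twin = DigitalTwin(asset.asset_id)
twin.ingest(asset.read_sensors())
print(twin.state)  # {'temp-01': 21.5, 'valve-07': 0.0}
```

The point of the sketch is the dependency it makes explicit: the twin is only as current, and as trustworthy, as the data connection feeding it.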

Paradoxically empowered and encumbered by an intrinsic reliance on data, Digital Twin solutions require massive compute power and access to raw data streams from the myriad sensors and actuators of their operational counterparts to realize their full potential. Access to raw data uniquely empowers Digital Twins to synthesize system states and deliver real-time analysis that identifies vulnerabilities, streamlines processes, and highlights inefficiencies that could unlock massive bottom-line savings. But the threat of manipulated, corrupted, or otherwise poisoned data is very real and must be mitigated.

Although Digital Twins and the operational systems they mirror are often ostensibly “air-gapped,” the air gap is, realistically, a myth. Examples of zero-day exploits of “air-gapped” systems are numerous and unsettling, both for their profound impact and for the relative ease of compromise.

So, despite their promise, given their limitations, vulnerabilities, and dependencies, can security professionals trust their data, assets, and personnel to Digital Twins?

Security environments are dynamic, requiring real-time monitoring, rapid response, and adaptive threat mitigation strategies. Digital Twins can bridge cyber and physical security without impacting operational systems, enabling organizations to enhance surveillance, improve threat detection, optimize system performance, and refine response strategies. For both understanding the current state of a security system and maximizing its effectiveness, Digital Twins are game changers. Physical exercises, scenario-based events that pit Red (the Aggressor) against Blue (the Defender), are among the most expensive, resource-intensive, and operationally disruptive events in which security teams and organizations participate. Think of fire drills or active shooter drills on steroids.

Although these exercises have their place and no security policy or plan is complete without them, inherent artificialities, such as working-hour limitations, off-limits areas, and safety restrictions, mean they yield an abysmally poor representation of overall system effectiveness. Furthermore, due to complexity, artificiality, and operational disruptions, even security-conscious organizations are often hard-pressed to execute more than a handful of small-scale events each year. Digital Twins offer high-fidelity, physics-based models against which security professionals can perform thousands of simulations, tweaking variables ranging from Red and Blue capabilities to construction materials, weather, and lighting conditions, to gain a more granular understanding of baseline performance, breaking points, and the effects of potential upgrades. The same principle applies to cyber environments, where Digital Twins can facilitate penetration (pen) testing as both a vector and a model. In short, Digital Twins offer security professionals a more realistic and empirically defensible assessment of their systems’ status and, as such, are invaluable tools.
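The simulation sweep described above can be sketched as a parameter grid run against a scenario model. Everything here is a toy stand-in under stated assumptions: `run_scenario` is a placeholder for a real physics-based Digital Twin call, and the detection formula is invented purely to make the sweep runnable:

```python
import itertools
import random

def run_scenario(red_skill, blue_sensors, lighting, rng):
    # Toy model (illustrative only): Blue's detection probability rises
    # with sensor coverage and lighting, falls with Red capability.
    p_detect = 0.2 * blue_sensors + 0.3 * lighting - 0.2 * red_skill
    return rng.random() < min(max(p_detect, 0.0), 1.0)

def sweep(trials_per_cell=1000, seed=42):
    """Run every Red/sensor/lighting combination thousands of times."""
    rng = random.Random(seed)
    results = {}
    for red, sensors, light in itertools.product(
            [0.5, 1.0],      # Red capability levels
            [1, 2, 3],       # Blue sensor coverage options
            [0.3, 1.0]):     # night vs. day lighting
        wins = sum(run_scenario(red, sensors, light, rng)
                   for _ in range(trials_per_cell))
        results[(red, sensors, light)] = wins / trials_per_cell
    return results

baseline = sweep()
# Each cell approximates Blue's detection rate for one variable mix,
# exposing baseline performance, breaking points, and upgrade payoffs.
```

Running thousands of trials per cell is what a physical exercise cannot do: the same scenario repeats under every lighting, weather, and capability mix, yielding an empirically defensible performance surface rather than a single anecdote.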

While the outlook is promising, practical challenges remain on the path to wide-scale adoption. Organizations often struggle to integrate Digital Twins into environments that mix legacy infrastructure with modern platforms, and to align CPS that were not originally designed to work together. Fortunately, new technologies that leverage ZT to authenticate and encrypt data exchanges among IT, OT, and IIoT devices, such as Prometheus Security Group’s Digitally Encrypted Security Interface (DESI) and Universal Field Panel (UFP), are emerging to bridge these gaps, enabling modernization through attrition that enhances legacy systems rather than driving costly and disruptive rip-and-replace projects.
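The Zero-Trust principle behind such gap-bridging devices, that every field-device message must prove its origin before it is trusted, can be illustrated generically. The sketch below is not DESI or UFP’s actual protocol; it is a standard-library HMAC example with invented device names, and a real deployment would also encrypt traffic in transit (e.g., over TLS):

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret, provisioned at device enrollment.
DEVICE_KEYS = {"field-panel-12": b"per-device-secret-from-enrollment"}

def sign_message(device_id: str, payload: dict) -> dict:
    """Field device authenticates its telemetry with a per-device key."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEYS[device_id], body, hashlib.sha256).hexdigest()
    return {"device_id": device_id, "payload": payload, "tag": tag}

def verify_message(msg: dict) -> bool:
    """Head end verifies before trusting; unknown devices are rejected."""
    key = DEVICE_KEYS.get(msg["device_id"])
    if key is None:
        return False  # Zero Trust: never trust by default
    body = json.dumps(msg["payload"], sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, msg["tag"])

msg = sign_message("field-panel-12", {"sensor": "door-04", "state": "open"})
assert verify_message(msg)           # authentic telemetry accepted
msg["payload"]["state"] = "closed"   # poisoned in transit
assert not verify_message(msg)       # tampering detected and rejected
```

Authenticating each exchange this way is what lets a legacy panel participate in a modern architecture through attrition: the trust boundary moves to the message, not the network segment.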