A digital twin is a digital replica of something real. The real thing could be anything. In NVIDIA Omniverse, the real thing could be everything. The digital replica includes the geometry, physics, and real-time operating conditions of the real thing. Thus, the digital twin provides an excellent way to monitor and manage it. But the magic of a digital twin is that it enables simulation. For example, a particular supply chain could be simulated to find a hidden performance optimization. Or a bridge could be simulated to test failure points and vulnerabilities. Digital twins are enabling us to fundamentally improve every interaction we have with real things.
The earliest digital twins replicated expensive, complex systems like space flight. Digital twins helped NASA’s OSIRIS-REx touch down on the asteroid Bennu and collect a sample. With the increasing availability of IoT sensors and the reduction in the cost of computing power, digital twins are moving into more common physical spaces like hospitals and mission-critical facilities. As costs continue to come down, digital twins will make their appearance in the rest of the built world, where people will use them to assist with space planning, energy utilization, safety and security planning, and a multitude of other functions.
But haven’t we heard this story before?
I became involved with building technologies in an earlier wave of “smart buildings.” Some of those companies were successful and thrive today, but the promise of ubiquitous smart buildings has been elusive. It is reasonable to ask what will be different this time.
Here’s my take:
- IoT Sensors. A digital twin consists of static and dynamic data. The dynamic data comes from IoT sensors. In earlier waves of “smart buildings,” the sensing and control systems were expensive, usually had to be retrofitted into buildings, performed poorly, and were poorly integrated into larger systems of use. Most devices today have a network-enabled option, whether thermostats, lighting controls, or VAV boxes, and the slightly higher product cost is more than offset by easier commissioning, to say nothing of reduced operating costs. It simply does not make sense not to invest in connectivity. To ensure this, the designer needs to make connectivity part of the design intent. This needs to be shared with the owner and built into the specifications so that it does not get value-engineered out somewhere down the line.
- BIM as a Process. Building Information Modeling is the process of creating 3D geometry to assist in all stages of the building lifecycle. We have written before about the Model Gap, the unfortunate fate of many models that aren’t made integral to the construction documents, but we also believe that there is an emerging role for BIM Coordinators or BIM-VDC Specialists who manage the digital assets of the building. While this role is common in the UK, where BIM is required for permitting, it is just now entering more common use elsewhere. This position could be independent, or captive to the owner, contractor, or architect. We advocate for the architect, since they are the agent with the best understanding of the design and the best ability to ensure its execution on behalf of the owner. The most important thing is that the geometry retains high fidelity to the actual building so that it provides an adequate vessel for the digital twin to be used throughout operation.
- Library of Static Data. The digital twin has geometry and dynamic data, but it also has a massive amount of static data that documents the operation of its manufactured constituent parts. This includes chiller plants, elevators, and automatic doors, but might also include pre-assembled or fabricated parts. At a more fundamental level, the BIM is nothing more than a database, and the operating manuals and cut sheets can be tagged so that building engineers can access them during maintenance events. Furthermore, networked devices can be reprogrammed, and firmware upgrades and release notes can become part of the permanent record.
- Dynamic Authorization Model. The foundation of a digital twin is a platform that authorizes people to manage the creation and maintenance of digital assets. By contrast, the last wave of smart buildings assumed a system integrator would add the “smart” elements to the building, and the finished product then became the responsibility of the building maintenance staff. This resulted in smart buildings that no longer reflected the building’s evolution. Building equipment whose sensor and control tags fell out of use required “retrocommissioning” to maintain operations, and O&M manuals sat on shelves in the utility closet collecting dust.
In the digital twin concept, the BIM model is a digital asset owned by someone, with management authorization provided to particular people for various elements. For example, Otis Elevator might be responsible for the maintenance of the O&M manual and firmware release notes for their elevators. A contracted mechanical engineer might be authorized to make a change to a particular chiller plant at the time of a repair and may be required to update the status prior to receiving payment. The sensors might self-audit on a periodic basis to ensure their effective and consistent operation. In this way, maintenance of the twin is not a single thread of development funneled through a single point of authorization.
Concert is the authorization platform for the digital twin. The BIM model, the O&M manual, the firmware – all are maintained by the manufacturers and building maintenance people granted authority to manage their work – but the vehicle that ensures file provenance and permissions is Concert.
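To make the model concrete, here is a minimal sketch of per-asset authorization in code. This is an illustration only, not Concert’s actual API: the asset IDs, party names, and document types are all hypothetical.

```python
# Sketch of a per-asset authorization model for digital-twin records.
# All names below (asset IDs, parties, document types) are hypothetical
# illustrations of the concept, not any real platform's API.

from dataclasses import dataclass, field


@dataclass
class AssetRecord:
    """One piece of building equipment tracked in the digital twin."""
    asset_id: str
    # Document type -> current content (O&M manual, firmware notes, etc.)
    documents: dict = field(default_factory=dict)
    # Document type -> set of parties authorized to update it.
    grants: dict = field(default_factory=dict)

    def authorize(self, party: str, doc_type: str) -> None:
        """Grant a party the right to maintain one document type."""
        self.grants.setdefault(doc_type, set()).add(party)

    def update(self, party: str, doc_type: str, content: str) -> bool:
        """Apply an update only if the party holds a grant; report success."""
        if party not in self.grants.get(doc_type, set()):
            return False
        self.documents[doc_type] = content
        return True


# The elevator manufacturer maintains its own O&M manual and firmware
# notes, while a contracted engineer updates repair status -- no single
# point of authorization.
elevator = AssetRecord("elevator-01")
elevator.authorize("Otis Elevator", "om_manual")
elevator.authorize("Otis Elevator", "firmware_notes")
elevator.authorize("Acme Mechanical", "repair_status")

assert elevator.update("Otis Elevator", "om_manual", "Rev 7")
assert not elevator.update("Acme Mechanical", "om_manual", "not allowed")
```

The point of the sketch is the shape of the data, not the implementation: authority lives with the asset record itself, per document type, rather than with one integrator who hands the finished system over to maintenance staff.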
Assuming these four elements are in place, we foresee an exciting future for digital twins in the built world.