The leap from BIM to Digital Twins

There are many definitions of BIM on the internet, and now the term Digital Twin is taking over; it too is being used and redefined to suit the various consultants and vendors selling their way of delivery.

I prefer to use the Institution of Civil Engineers’ definition:

The ‘Digital Twin’ of a physical asset helps us to understand how assets operate in a wider system and how they interact with other assets. A Digital Twin is a ‘bridge’ between the physical and digital world. They can be predictive and adapt physical systems to reflect changes in the environment or operations. The benefits include harmonisation of operations to deliver optimal user outcomes, clash identification and automated remediation, and ultimately cost/risk reductions.

If you take my original definition of BIM, which likened it to an internet of databases for a specific asset, then building on that to link the data in those servers with the actual physical asset, and with the other assets that interface and interact with it, is a relatively small step! However, if you are still in the world of thinking BIM is a dumb 3D model, then you have a long way to go!

We’ll discuss Digital Twins, how they will drive our industry forward, and the requirement for a “National Digital Twin” in the next chapter.

Digital Twins

The concept of the Digital Twin was first raised in 1991 in David Gelernter’s book, “Mirror Worlds”, and was first publicly introduced at a conference in 2002 by Michael Grieves of the University of Michigan.

This concept consists of three components: the physical asset, the digital representation of the asset, and the connections between them.
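As a minimal sketch only (the asset, property names and values below are invented for illustration, not drawn from any particular Digital Twin platform), the three components might be modelled like this:

    from dataclasses import dataclass, field


    @dataclass
    class DigitalRepresentation:
        """The digital side: the data held about the asset."""
        asset_id: str
        properties: dict = field(default_factory=dict)


    class Connection:
        """The bridge between the physical and digital worlds."""

        def push_reading(self, twin: DigitalRepresentation, key: str, value: float) -> None:
            # Physical -> digital: a measurement from the asset updates the twin.
            twin.properties[key] = value


    # The physical asset itself lives outside the software; the twin only ever
    # sees it through the readings (and commands) carried by the connection.
    twin = DigitalRepresentation(asset_id="pump-01")
    link = Connection()
    link.push_reading(twin, "bearing_temp_c", 61.5)
    print(twin.properties)  # {'bearing_temp_c': 61.5}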

From the beginning, Crossrail’s digital strategy contained the delivery requirement for both a physical and a digital railway. Both have to be in the plan right from the very beginning, at pre-planning, so that information is gathered and connections are made before design or construction is even considered. This is so that the interfaces and impacts on each other are understood, and any sensors or actuators are part of the plan right from the start. This is not always possible, but it should happen as early as practicable.

It must be understood that information will be generated and managed by many different pieces of software, each with its own way of structuring and classifying the data. Any true Digital Twin must take this into account and be able to federate many sources of information so that the user can access it without translating or converting, which would considerably increase the risk (and cost) over the lifecycle of the asset.

The key thing here is to ensure that, before you or your supply chain start to create any information, standards are put in place for how to structure, classify and define it, together with some form of quality assurance plan to enforce those standards for every participant in the creation of the Digital Twin.
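As a very simple illustration of what such a quality assurance check might look like (the field names and rules below are invented for the example and are not taken from any published standard), a gate like this could reject information that does not carry the agreed structure and classification before it ever enters the Digital Twin:

    REQUIRED_FIELDS = {"asset_id", "classification_code", "source_system"}

    def quality_gate(record: dict) -> list[str]:
        """Return a list of problems; an empty list means the record may enter the twin."""
        problems = [f"missing field: {name}" for name in sorted(REQUIRED_FIELDS - record.keys())]
        if not record.get("classification_code"):
            problems.append("classification code is empty or absent")
        return problems

    # A record produced by one supplier's authoring tool, checked before it is federated.
    print(quality_gate({"asset_id": "pump-01", "source_system": "ToolA"}))
    # ['missing field: classification_code', 'classification code is empty or absent']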

Open Standards

No technology vendor can possibly satisfy the vast array of authoring requirements for all the information that will make up a Digital Twin. The only way to achieve a true twin of the world we see around us, delivering everything from social, economic and environmental modelling to structural, transport, health and industrial modelling, is to ensure that the information created is delivered in an open way, so that the old risks of interoperability, translation and conversion don’t tarnish the long-term vision.

Sensors and Actuators

One of the key components of a Digital Twin is the connectivity between the physical and the digital, enabling the passage of data between the two. This data flows not just from sensors embedded in the physical asset, updating information in the digital asset, but also to actuators controlled by the digital asset, which can change things in the physical one. This is a concept spoken about mostly in the context of the Internet of Everything.
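A minimal sketch of that two-way flow, assuming a hypothetical bearing-temperature sensor and a cooling-valve actuator (none of these names or thresholds come from a real system):

    def read_sensor() -> float:
        """Stand-in for a reading pushed up from the physical asset."""
        return 78.4  # degrees C, hypothetical value

    def command_actuator(valve_open_pct: float) -> None:
        """Stand-in for a command sent down to the physical asset."""
        print(f"setting cooling valve to {valve_open_pct:.0f}% open")

    digital_state = {"bearing_temp_c": None}

    # Physical -> digital: the sensor updates the twin's view of the asset.
    digital_state["bearing_temp_c"] = read_sensor()

    # Digital -> physical: the twin reacts and adjusts the asset via the actuator.
    if digital_state["bearing_temp_c"] > 75.0:
        command_actuator(100.0)
    else:
        command_actuator(25.0)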

The automotive racing industry has been linking the physical and the digital together in a live environment for many years. I am proud to have been an ambassador for the original Bloodhound Super Sonic Car team. Many parts of the car, from its engines and steering to its brakes and electrics, have a Digital Twin that is both monitored and tweaked remotely when required, to ensure the car achieves the outcome set by the team: to reach 1005 mph and survive the attempt!

 

The current thinking on sensors is that they, and data storage, are relatively cheap, so we should be liberal in their deployment. The data we collect today may have little value, but when examined in the future, in relation to other interconnected data, it may reveal some very valuable trends, showing weaknesses in the systems and the root causes of operational or maintenance issues.

As our Digital Twins get smarter and we add artificial intelligence to help predict future issues, the ability of the Digital Twin to tweak and control elements of the physical asset will become more valuable, allowing a much more efficient operation.

Taking that a step further and linking it to supply chain systems, whether for products or for skilled people, the Digital Twin can start ordering its own parts when they are cheapest, or advising agencies about the types of skilled worker needed over the next six months to carry out repair or maintenance tasks!

Whether you see this as an extension of BIM, a leap into Digital Twins or just good practice, it is essential that you get the strategy right: not only to suit your current portfolio of assets, but also looking into the future at the outcomes your end users will demand over the next 50 years. This book is all about getting the foundations right before you build the rest of your digital asset.

National Digital Twin

Whereas most organisations will concentrate on getting their own Digital Twin into a position where it both reflects and interacts with the physical asset, governments should be looking at how these twins will interface with each other to create a connected National Digital Twin, which will be able to help in forming a national strategy.

The key point is understanding how each owner’s assets interface with, and depend on, each other. In my opinion this should start with a simple network study at facility level across all assets, forming a framework from which much more in-depth information for individual assets can hang.
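Such a network study can be held as a simple graph of facilities and the interfaces between them; the facilities and dependencies below are purely invented to show the shape of that framework:

    # Each facility maps to the facilities it depends on (illustrative entries only).
    facility_dependencies = {
        "water_treatment_works": ["power_substation"],
        "hospital": ["power_substation", "water_treatment_works", "road_bridge"],
        "power_substation": [],
        "road_bridge": [],
    }

    def affected_by(facility: str) -> set[str]:
        """Facilities that directly or indirectly depend on the given facility."""
        impacted: set[str] = set()
        frontier = [facility]
        while frontier:
            current = frontier.pop()
            for owner, needs in facility_dependencies.items():
                if current in needs and owner not in impacted:
                    impacted.add(owner)
                    frontier.append(owner)
        return impacted

    print(affected_by("power_substation"))  # {'water_treatment_works', 'hospital'}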

This is something that the National Infrastructure Commission, alongside other UK Government departments, is looking at.

Part of the potential solution could be provided by the Infrastructure Assessment methodology pioneered by the UK’s Royal Engineers (see the Prioritising the Digital Twin section). However, once complete, something like this could be a double-edged sword and will need to be secured correctly, allowing access only to the right people, with the right level of security clearance, at the right time!


 
