

Building the digital representation with Digital Twin using Microsoft stack


March 2022
Sujay Nandi, Associate Partner and Executive IT Architect, Data and Technology Transformation at IBM Consulting
Sanjay Panikkar, Lead IoT Architect and Consultant, Data and Technology Transformation at IBM Consulting
Víctor Sánchez Chumillas, Executive Architect, Connected Edge & Digital Twin
What is a Digital Twin?
Figure 1: Digital Twin Context
A digital twin is a dynamic, virtual representation of its physical counterpart, usually maintained across multiple stages of its lifecycle. It uses real-world data combined with engineering, simulation or machine learning models to enhance operations and support human decision making. A digital twin mirrors a unique physical object, process, organisation, person or other abstraction, and can be used to answer what-if questions, present insights in an intuitive way and provide a means of interacting with the physical counterpart.
The key elements in a Digital Twin:
While there are many different views of what a digital twin is and is not, the following list covers the common set of characteristics present in nearly every digital twin:
Connectivity, created by IoT sensors on the physical product to obtain data and integrate it through various technologies. Alternatively, field information about assets can be captured through drone photography or LIDAR scans, followed by 3D reconstruction using different techniques.
Digital Thread, a key enabler interconnecting all relevant systems and functional processes
Homogenisation, which decouples the information from its physical form
Re-programmable and smart, enabling a physical product to be reprogrammed both manually and automatically
Digital traces and modularity, to diagnose where the problem occurred
A Digital Twin is not a single technology play; rather, it is realised through an amalgamation of multiple technologies and essentially consists of functional building blocks that address specific business aspects:
Sensors and assets capable of transmitting telemetry and real-time data gathered from physical assets/objects, relating to their state, conditions and events (a minimal device-side telemetry sketch appears after these lists)
Connectivity between platform and sensors (Internet, 3G/4G, CDMA, LoRa, …)
Data Platform, the core system for a) data management: integrating, persisting, transforming and governing the data collected; b) analytics: using machine learning frameworks and analytics to make real-time decisions based on historical and streaming data; c) user experiences: combining the data and insights to present, advise and interact with the user or other machines
Digital Twin core capabilities that use the managed data provided by the Data Platform to create applications for the use cases, and that create a digital thread based on a semantic model, data dictionary and knowledge graph
Monitoring applications for the operational team maintaining the Digital Twin solution
Enterprise systems, for authentication, security, libraries, utilities etc. reused by the platform
Use cases that realise the value obtained from the Digital Twin; these use cases typically require processes or combinations of the different functional components
Additionally, the following blocks support the underlying digital twin platform:
Workflow and APIs: extract and share data from multiple sources when creating the digital twin and/or infuse the insights into the workflows of the digital twin
Infrastructure: hybrid infrastructure including cloud, edge compute, in-plant infrastructure, network (IT/OT) and other OT infrastructure
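As an illustration of the sensor and connectivity blocks above, the following is a minimal, hypothetical device-side sketch in Python that sends one telemetry message (state, conditions, events) to Azure IoT Hub using the azure-iot-device SDK. The connection string, asset identifier and payload fields are assumptions for illustration only, not part of the proposed architecture.

```python
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

# Device connection string from the IoT Hub device registry (placeholder).
CONNECTION_STRING = "<device-connection-string>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

# Hypothetical telemetry payload covering state, conditions and events.
telemetry = {
    "assetId": "pump-001",
    "state": "RUNNING",
    "conditions": {"temperatureC": 71.4, "vibrationMm": 0.8},
    "events": [],
    "timestamp": time.time(),
}

msg = Message(json.dumps(telemetry))
msg.content_type = "application/json"
msg.content_encoding = "utf-8"
client.send_message(msg)

client.shutdown()
```

In practice the same payload shape could also be sent from a drone or scanning workflow after 3D reconstruction, with the reality-capture output stored separately and referenced by the asset identifier.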
Figure 2: Digital Twin functional blocks
The Digital Architecture of the Digital Twin:
The logical architecture for the digital twin provides the level-2 detail of what has been described in the section above. The objective is to support the functionality to connect, monitor, predict and simulate multiple physical objects, assets and/or processes.
Figure 3: Digital Twin Logical Architecture
Building a Digital Twin with Azure and a hybrid technology stack:
We propose here a digital twin that is primarily based on the Azure stack along with other software:
Figure 4: Digital Twin Technology Architecture with Azure and hybrid software platform
Architecture Layer: Data Platform: Ingestion, Normalisation, Persistence, Management, Integration, AI/ML/Analytics, Microservices
Technology: IoT Hub, Event Hub, PostgreSQL/SQL DB, Cosmos DB, Azure Time Series Insights, Databricks, Data Factory, Azure ML, Azure DevOps, Synapse
Vendor: Azure
Purpose: Ingesting and processing device telemetry securely is one of the key requirements of a digital twin. A combination of IoT Hub and Event Hub allows connecting to heterogeneous data sources, supports multiple protocols and enables streaming ingestion. Databricks and Data Factory are meant to provide consistency of data through validation, enrichment, cataloguing and transformation conforming to the standard, and to integrate with the other systems. Considering the diverse nature of data handled by the digital twin, polyglot storage is recommended; thus a combination of relational, non-relational, time-series, data lake and warehouse stores is required. A cloud-based machine learning platform is used to build, train and deploy models based on the data collected via the data platform, and it also provides the ability to perform analytics on streaming data.
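On the platform side, the sketch below shows minimal streaming consumption from Event Hubs with the azure-eventhub Python SDK. The connection string, event hub name and consumer group are placeholders, and the downstream write to the polyglot stores is only indicated by a comment.

```python
from azure.eventhub import EventHubConsumerClient

# Placeholders for the Event Hubs namespace connection and entity name.
CONNECTION_STRING = "<event-hubs-connection-string>"
EVENTHUB_NAME = "telemetry"

def on_event(partition_context, event):
    payload = event.body_as_json()
    # Here the payload would be validated, enriched and written to the
    # relational, time-series and data lake stores described above.
    print(partition_context.partition_id, payload)
    # In production a checkpoint store (e.g. blob storage) should back this call.
    partition_context.update_checkpoint(event)

consumer = EventHubConsumerClient.from_connection_string(
    CONNECTION_STRING,
    consumer_group="$Default",
    eventhub_name=EVENTHUB_NAME,
)

with consumer:
    # starting_position="-1" reads from the beginning of each partition.
    consumer.receive(on_event=on_event, starting_position="-1")
```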
Architecture Layer: Digital Twin Framework: Digital Twin Engine, Digital Thread
Technology: Azure Digital Twin Service, KITT
Vendor: Azure, IBM
Purpose: The knowledge graph represents a collection of interlinked descriptions of entities such as objects, events or concepts, and puts data in context via linking and semantic metadata. Azure Digital Twins is a platform that enables creating a digital representation of real-world things, places, business processes and people. It is based on an open modelling language, the Digital Twins Definition Language (DTDL), used to create custom domain models of any connected environment. The live execution environment brings digital twins to life in a live graph representation. Input comes from IoT and business systems to connect assets, including IoT devices, using Azure IoT Hub, Logic Apps and REST APIs; output goes to Time Series Insights, storage and analytics via event routes to downstream services, including Azure Synapse Analytics. The IBM asset KITT is a general-purpose knowledge graph that enables the Digital Thread required to link lifecycle information and data together. KITT is proposed to be deployed on Red Hat OpenShift.
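The following sketch shows how a custom DTDL model, a twin instance and a query against the live graph might look with the azure-digitaltwins-core Python SDK. The instance URL, the dtmi identifier and the Pump model are hypothetical examples and not part of the proposed solution.

```python
from azure.identity import DefaultAzureCredential
from azure.digitaltwins.core import DigitalTwinsClient

# Placeholder Azure Digital Twins instance endpoint.
ADT_ENDPOINT = "https://<your-instance>.api.weu.digitaltwins.azure.net"
client = DigitalTwinsClient(ADT_ENDPOINT, DefaultAzureCredential())

# Hypothetical DTDL v2 interface for a pump asset.
pump_model = {
    "@id": "dtmi:com:example:Pump;1",
    "@type": "Interface",
    "@context": "dtmi:dtdl:context;2",
    "displayName": "Pump",
    "contents": [
        {"@type": "Property", "name": "state", "schema": "string"},
        {"@type": "Telemetry", "name": "temperatureC", "schema": "double"},
    ],
}
# Uploading the model fails if the same dtmi version already exists.
client.create_models([pump_model])

# Create or update a twin instance of that model.
client.upsert_digital_twin(
    "pump-001",
    {"$metadata": {"$model": "dtmi:com:example:Pump;1"}, "state": "RUNNING"},
)

# Query the live graph for running pumps.
for twin in client.query_twins("SELECT * FROM digitaltwins WHERE state = 'RUNNING'"):
    print(twin["$dtId"], twin.get("state"))
```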
Architecture Layer: Digital Twin Solutions: Visualisation, Consumption, Intelligent Workflows, Intelligent Operations, Assets and Simulation
Technology: Unity, Reflect, HoloLens, Spatial Anchors, Remote Rendering, Red Hat Process Automation Manager, Logic Apps, IBM Maximo, OpenShift
Vendor: Azure, Unity, Red Hat, IBM
Purpose: This layer is responsible for portals, real-time dashboards and command centres. AR and VR services are required to visualise diagnostics, predictions and recommendations for the physical world, and an extension of dashboard visualisation is also required in the digital twin to provide a 3D/2D view. Workflow management in the digital twin is intended to deal with business processes, simulation and event-based flows; besides Azure Logic Apps we also propose Red Hat Process Automation Manager, which could be hosted on Red Hat OpenShift on Azure. To provide secured access to application APIs and batch files, we propose microservice-based applications built using Node.js/Spring Boot and exposed via an API gateway.
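The article proposes Node.js/Spring Boot for these microservices; purely to illustrate the pattern, the sketch below uses Python with FastAPI (an assumption, not the proposed stack) to expose twin state from Azure Digital Twins behind an API gateway. The endpoint path and instance URL are hypothetical.

```python
from fastapi import FastAPI, HTTPException
from azure.core.exceptions import ResourceNotFoundError
from azure.identity import DefaultAzureCredential
from azure.digitaltwins.core import DigitalTwinsClient

app = FastAPI(title="Digital Twin API")

# Placeholder Azure Digital Twins endpoint; in practice this would be configured.
adt = DigitalTwinsClient(
    "https://<your-instance>.api.weu.digitaltwins.azure.net",
    DefaultAzureCredential(),
)

@app.get("/twins/{twin_id}")
def get_twin(twin_id: str):
    """Return the current state of a twin, for dashboards or other consumers."""
    try:
        return adt.get_digital_twin(twin_id)
    except ResourceNotFoundError:
        raise HTTPException(status_code=404, detail=f"Twin {twin_id} not found")
```

Such a service would sit behind the API gateway of choice, which handles authentication, throttling and routing for the consuming applications.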
Architecture Layer: Governance & Operations: DataOps
Technology: Azure Purview, Azure Databricks, Azure Data Factory (ADF), Azure Data Lake Storage (ADLS)
Vendor: Azure
Purpose: DataOps and data governance are essential elements of a digital twin; they sit atop the big data, which must be available on time, automated and well managed to extract value. Azure Purview with Databricks provides an end-to-end capability to discover, prepare, combine and govern data for analytics and machine learning, making it the right fit for this purpose: ADF for data orchestration, Azure Databricks for cleansing and standardisation, and ADLS for storage.
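As a minimal sketch of the cleansing and standardisation step on Azure Databricks, the PySpark snippet below assumes telemetry JSON already lands in a raw container on ADLS; the container names, column names and abfss paths are assumptions for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical ADLS Gen2 locations; access is assumed to be configured
# (service principal or managed identity) at the workspace level.
RAW_PATH = "abfss://raw@<storage-account>.dfs.core.windows.net/telemetry/"
CURATED_PATH = "abfss://curated@<storage-account>.dfs.core.windows.net/telemetry/"

raw = spark.read.json(RAW_PATH)

clean = (
    raw.dropDuplicates(["assetId", "timestamp"])           # remove replayed messages
       .filter(F.col("temperatureC").isNotNull())          # drop incomplete readings
       .withColumn("temperatureC", F.col("temperatureC").cast("double"))
       .withColumn("ingestDate", F.to_date(F.from_unixtime("timestamp")))
)

(clean.write
      .mode("append")
      .partitionBy("ingestDate")
      .parquet(CURATED_PATH))
```

An ADF pipeline would typically orchestrate this notebook or job on a schedule or on arrival of new raw data.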
Architecture Layer: Operational Processes
Technology: MLOps, DevOps
Vendor: Azure
Purpose: Azure MLOps and DevOps help to build continuous integration/continuous delivery (CI/CD) workflows and fulfil the AI@Scale goals by providing the capability to build, train, deploy and maintain machine learning models in production reliably and efficiently.
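As a sketch of how a training step could be automated from such a pipeline, the snippet below submits a command job with the Azure ML Python SDK v2 (azure-ai-ml); in practice this would be triggered from an Azure DevOps pipeline. The workspace coordinates, data asset, environment, compute cluster and training script are all assumptions.

```python
from azure.ai.ml import MLClient, command, Input
from azure.identity import DefaultAzureCredential

# Placeholder workspace coordinates; normally injected by the DevOps pipeline.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Hypothetical training job: the script, data asset, environment and compute
# cluster names are illustrative and must already exist in the workspace.
job = command(
    code="./src",
    command="python train.py --data ${{inputs.data}}",
    inputs={"data": Input(type="uri_folder", path="azureml:telemetry-curated:1")},
    environment="azureml:<registered-environment>:<version>",
    compute="cpu-cluster",
    experiment_name="digital-twin-anomaly-detection",
)

returned_job = ml_client.jobs.create_or_update(job)
print(returned_job.studio_url)
```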
Architecture Layer: Edge Services, Fog Services, Device Connectivity, Reality Data, Reconstruction and Modelling
Technology: Azure Edge, Cisco, Bentley, Perspective, Siemens, Dassault, Boston Dynamics, PrecisionHawk, …
Vendor: Misc.
Purpose: Digital twin infrastructure extends beyond cloud and on-premises infrastructure. We propose a mixed set of solutions including Azure edge, drone infrastructure and robotics to cover the entire gamut of data sources, such as reality data and modelling data.
Conclusion:
Digital Twin connects the digital and physical worlds and truly takes connected solutions to the next level, with IoT, machine learning, robotics and virtual reality at its foundation. Creating an end-to-end digital twin platform requires far more than a single set of capabilities: it involves a heterogeneous software and hardware stack and multiple architecture layers, with data as the principal theme. There are specialised software vendors for each layer of the architecture; the flexibility comes from adopting a hybrid approach wherein a hyperscaler forms the core part of the solution. While the individual customer environment will determine the choice of hyperscaler, the advantages of Azure, with its strong IoT offering and its distinctive Digital Twin-centric platform, can be leveraged to build powerful digital twin solutions.