By Abhishek Jain
Houston, we have a problem! It was not just a problem, it was a catastrophe in waiting. Major systems had malfunctioned and there was a real danger of losing the astronauts and the vehicle. But NASA had planned for it: a module just like the real one had been built on the ground. This was the “physical twin” of Apollo 13, used extensively for training and ideation. Anything that needed to be changed in the spacecraft would be tested here first. When the problem happened, the backup astronauts quickly came to the rescue, and the rest, as they say, is history.
God forbid we ever need to hear “Delhi, we have a problem!” But India will not worry as much. Scientists from DRDO, ISRO, CDAC and many private companies will switch on the digital twin and study multiple scenarios on the high-performance computers that each organisation has. They will be able to simulate any catastrophe: trajectory deviations, blasts or unwanted separations, leaks, walkthroughs through debris and many more. We will not be limited to a lone backup astronaut, and the best minds will come up with the best solution.
The Digital Twin
Like any buzzword, Digital Twin (DT) is widely misunderstood, vaguely defined across sectors and sometimes defined differently even within the same sector. Take the one from the Defence Acquisition University: “An integrated multiphysics, multiscale, probabilistic simulation of an as-built system, enabled by Digital Thread, that uses the best available models, sensor information, and input data to mirror and predict activities/performance over the life of its corresponding physical twin”. With so many definitions, let me add to the chaos, or perhaps clear it. Let us simply define a DT as a computer program that gives performance, process or visualization results as if we were operating the real physical system.
It should be noted that while the term DT may be recent, the use of computer models for the development of aerospace and defence systems is longstanding. The first such simulations were done in the late 1960s, when the aerodynamic forces on a vehicle re-entering the atmosphere were calculated.
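Even that earliest use case fits our simple definition: a program standing in for the physical vehicle. A minimal sketch of such a calculation might look like the following; every constant here is an illustrative assumption, not data from any real programme.

```python
import math

# Illustrative constants for a hypothetical re-entry vehicle
G = 9.81          # gravity, m/s^2
RHO0 = 1.225      # sea-level air density, kg/m^3
H_SCALE = 8500.0  # atmospheric scale height, m
CD = 1.2          # drag coefficient (assumed)
AREA = 4.0        # reference area, m^2
MASS = 1200.0     # vehicle mass, kg

def reentry_drag_profile(altitude=80_000.0, speed=3000.0, dt=0.1):
    """Integrate a purely vertical ballistic descent with simple Euler
    steps and return the peak aerodynamic drag force experienced."""
    peak_drag = 0.0
    while altitude > 0.0:
        rho = RHO0 * math.exp(-altitude / H_SCALE)   # exponential atmosphere
        drag = 0.5 * rho * CD * AREA * speed ** 2    # drag force, N
        peak_drag = max(peak_drag, drag)
        accel = G - drag / MASS                      # net downward acceleration
        speed += accel * dt
        altitude -= speed * dt
    return peak_drag

print(f"Peak drag: {reentry_drag_profile() / 1000:.1f} kN")
```

Crude as it is, the model already answers a design question — how hard the vehicle is hit by drag as a function of entry speed — without bending a single piece of metal.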
Having defined a DT in a way most people can understand, we notice one obvious and one not-so-obvious thing. The obvious one is that a digital twin means software; the not-so-obvious one is that the software will need high-performance computers (the term supercomputer is rarely used nowadays).
The most fortunate part is that both these capabilities exist in the country in abundance. We have a pool of software engineers building cool stuff for the MNCs, and we have multiple agencies with high-performance computers. What is lacking is the ability to understand defence systems and create software that can model their physical operating characteristics. This requires high-end mathematics and physics in addition to computer programming and hardware.
Why Do We Need It?
Now that we have understood what a DT is and what the basics of making one are, we need to answer the question: why do we need it? The answer lies in the fact that all systems are now designed first on a computer, be it an aircraft’s flight, a missile trajectory, the environment control system of an FICV or a walkthrough inside a ship. The simulation models in aircraft applications are accurate to the third decimal place for many aspects of flight.
But design is not the main reason why we should start making multi-disciplinary digital twins, though it is a starting point. The main reasons why DTs are very useful are:
1. Testing that is expensive or hazardous
2. Training and walkthrough operations
3. Data visualization and performance
4. Root cause analysis and system performance
In many defence systems, testing is extremely expensive or hazardous. These are the places where DTs come in handy. They allow us to run multiple simulations, multiple scenarios and multiple malfunctions before we build the actual equipment and test it.
The other major benefit is training and walkthroughs. Put a human model inside the entire system and it shows where access is difficult, where comfort is compromised, or where a point is downright unreachable. Operations of the system can also be learnt from the digital twin. Imagine a large power plant control room where the operator does not see the plant in operation but gets real-time performance data from the sensors. The operator can press buttons and change the way the system functions. Now imagine all of this being digital, with no power plant behind the system: just software giving feedback on performance when the user gives an input. The beauty is that you can crash the software and no one is at a loss.

Physical systems are equipped with many sensors, and these keep churning out data. The data is huge and makes sense only if analytical software behind it tries to interpret it. DTs are usually fed this data so that the future performance of the DT improves and training becomes more realistic.
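The control-room idea above reduces to a simple loop: the operator’s input goes into a plant model instead of a plant, and a simulated sensor reading comes back. The boiler below, along with its gains and time constant, is entirely invented for illustration — a toy first-order thermal model, not any real plant.

```python
class BoilerTwin:
    """A toy digital twin of a boiler: the 'plant' is a first-order
    thermal model, so the operator gets live feedback with no hardware."""

    def __init__(self, ambient=25.0):
        self.temp = ambient          # simulated sensor reading, deg C
        self.ambient = ambient

    def step(self, fuel_valve, dt=1.0):
        """Advance the simulation; fuel_valve in [0, 1] is the operator input."""
        heating = 300.0 * fuel_valve                 # heat input (assumed gain)
        cooling = 0.05 * (self.temp - self.ambient)  # losses to ambient
        self.temp += (heating * 0.01 - cooling) * dt
        return self.temp

twin = BoilerTwin()
for _ in range(200):                 # operator holds the valve half open
    reading = twin.step(fuel_valve=0.5)
print(f"Simulated boiler temperature: {reading:.1f} C")
```

The operator can slam the valve fully open, watch the “boiler” overheat, and crash nothing but a program — which is exactly the point.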
However, in my opinion, the system-model part is the most useful. If we create a model in which all the systems work together like an orchestra, we can change one player and study the effect on the others. We must keep in mind that the best individual players do not make a great team. With system models, we can even arrive at the specifications for a tender: the DT can tell us what is feasible and what is not.
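The “best player, worse team” point can be made with even a two-component system model. In this hypothetical sketch (every number is an assumption invented for illustration), a more powerful radar on a UAV sees further, but its range grows only with the fourth root of power while its drain on the battery grows linearly — so the strongest radar gives the shortest mission.

```python
def system_endurance(radar_power_w, battery_wh=500.0, base_load_w=80.0):
    """Endurance in hours once the radar's draw is added to the base load."""
    return battery_wh / (base_load_w + radar_power_w)

def radar_range_km(radar_power_w):
    """Detection range grows with the fourth root of power (radar equation)."""
    return 20.0 * (radar_power_w / 50.0) ** 0.25

# Sweep candidate radar powers and expose the trade-off
for p in (50, 100, 200):
    print(f"{p:4d} W -> {radar_range_km(p):5.1f} km range, "
          f"{system_endurance(p):4.2f} h endurance")
```

Quadrupling radar power here buys roughly 40 per cent more range but halves endurance twice over; a tender written from this kind of sweep asks for what the system can actually deliver.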
How to Do It?
We need to make a DT compulsory at the initial stages of development. Entrust a team with engineering change management: at each stage of the product’s life cycle there will be a team that knows the various versions of the product and acts as its guardian. Only people who know the defence system should be allowed to work on the DT. A software engineer with no background in defence is a poor choice to develop or operate the DT; a person with a defence background can be trained to work on the DT, but not vice versa. Finally, to create DTs we need trained manpower, so we will need to tie up with educational institutes to give more focus to this aspect.
The major pitfalls with DTs are the atheists and the devotees. One set thinks nothing can be done on a computer; the other thinks everything is possible digitally. As usual, the truth lies somewhere in between. Both the naysayers and the yes-men are hazardous to product development and testing. The developer must ensure that the limitations and accuracy of the DT are communicated transparently. The bigger pitfall is validation. I can write software in which gravity repels you upwards. There has to be a robust mechanism of checks for a DT to be valid. NASA and the European agencies create physical assets just to validate their DTs, and it is quite clear that they are ahead because of this combined physical-digital approach. We need to follow the same approach: test physically, validate the digital model, and then make the DT the standard.
To conclude, the cost benefits of using a DT are high. The time saved will be huge too, since everything is set up on a computer without any logistics cost. The DT is an idea whose time has come; we just need to embrace it. We will save time, reduce hazard and most probably improve performance too. We will train better. In my previous article I argued that the “future is precise” and that we need guided weapons and UAVs. That will not happen without DTs being developed. Houston! We have a solution. Engage!
- The writer is Vice President, Strategic Partnerships, at Zeus Numerix, a Pune-based defence and space solutions company. The views expressed are personal and do not necessarily reflect the views of Raksha Anirveda.