Just six industry trends have featured on research company Gartner’s list of the 10 most important for each of the past two years. Many of them regularly make headlines – AI and blockchain in particular. ‘Digital twin’ technology, another name on the list, remains little known in mainstream circles, even as it increasingly grabs the attention of business leaders.
Essentially, a digital twin is a software mirror of something that exists in the physical world. The real-world twin collects data, feeds it to the virtual twin in the software world and then, based on this data, businesses can conduct analyses, make predictions and prescribe interventions.
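The loop described above – real-world data in, analysis and prescribed interventions out – can be sketched in a few lines of Python. Everything here (the PumpTwin class, the temperature threshold) is a hypothetical illustration, not taken from any real digital twin product:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PumpTwin:
    """A toy software mirror of a physical pump (illustrative only)."""
    asset_id: str
    max_safe_temp: float = 90.0        # assumed safety threshold, in Celsius
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> Optional[str]:
        """Mirror one sensor reading; return a prescribed intervention, if any."""
        self.history.append(reading)
        if reading["temp_c"] > self.max_safe_temp:
            return f"schedule maintenance for {self.asset_id}"
        return None

twin = PumpTwin("pump-17")
twin.ingest({"temp_c": 72.5})            # normal reading: no action needed
action = twin.ingest({"temp_c": 95.1})   # anomaly: the twin prescribes a fix
print(action)
```

In a real deployment, the readings would stream in continuously from IoT sensors and the analysis would involve far richer models, but the data flow is the same: the physical asset feeds its virtual counterpart, and the virtual counterpart tells the business when to act.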
Digital twins are heavily reliant on sensors to collect vast quantities of data and, therefore, are closely connected to the development of the Internet of Things (IoT). In fact, 48 percent of enterprises that are already employing IoT solutions are using or plan to use digital twin technologies by the end of this year.
Digital twins promise a host of benefits for a wide range of industries, and as the technology becomes better known, their prevalence is only likely to increase as organisations gain access to larger datasets and more advanced analytics tools. By 2021, 50 percent of large industrial firms are expected to be using digital twins, delivering a 10 percent improvement in operational effectiveness.
Currently, the primary users of digital twins are businesses based in the manufacturing and logistics sectors. As these industries make use of complex parts or highly integrated supply chains, the extra visibility provided by digital twins is of particular benefit. Digital twins provide a model that allows employees to monitor and analyse performance in real time. They can identify problem areas, find inefficiencies and implement solutions.
Unsurprisingly, the best method of convincing business leaders that their organisations should employ digital twins is by championing their financial benefits. One of the ways software-based simulations can bring monetary gain is through predictive analytics. According to market intelligence company Aberdeen, 82 percent of businesses have experienced unplanned downtime over the last three years, with each incident costing as much as $260,000 an hour.
Digital twins reduce the likelihood of business failure by giving staff insight into when processes change and regress. Preventative intervention can then take place to ensure revenue isn’t affected. For example, even if the technology is only applied to a small part of an assembly line, if the failure of that component would send an entire business process grinding to a halt, then digital twins could save businesses vast amounts of money.
Although in many cases digital twins are being used in relatively uncomplicated ways – say, by a catering firm to monitor refrigeration and food quality – they have the potential to be involved in any number of complex environments. Stephen Brobst, Chief Technology Officer for Teradata, told The New Economy that digital twin technology is becoming increasingly sophisticated.
“Smart cities are a huge area for IoT and digital twin data because you’re basically simulating traffic patterns, energy consumption, pollution statistics and much more,” Brobst explained. “What you’re seeing in more advanced implementations is that digital twins are being used to make more complex predictions.
“In agriculture, for example, digital twins could tell farmers the optimal time to plant, fertilise and harvest their crops. In healthcare, sensor data could be taken from the human body. What we’re witnessing is an evolution from digital twins being associated with simple machines to being paired with smart buildings, entire organisations, cities and even living things.”
The IoT will result in household appliances, urban infrastructure and many other objects being connected online. The data they create will provide a huge boon to digital twin solutions, and as analytics programs develop further, so too will the advantages provided by this fledgling technology.
While data is the lifeblood that keeps digital twins going, managing the vast quantities of information required to keep simulations up to date and accurate brings its own challenges.
Having the network infrastructure in place to handle the storage, transfer and analysis of this data is not easy, particularly given the sheer volume of data. For example, for every 30 minutes of flight, a Boeing aircraft will produce over 10 terabytes of data.
Brobst explained that digital twins alleviate this burden through a technique known as ‘fog computing’, in which devices at the edge of a network determine which datasets are useful and which are not.
“When you’re training a digital twin system, you need a lot of data,” Brobst said. “However, during normal operation, you cannot deliver all the data to a centralised analytics infrastructure. Instead, you use a technique called ‘edge computing’.
“Once you have the rules for detecting and describing intervention, you push those rules out into the edge devices. For example, if you are monitoring an automobile engine, there are all kinds of data being generated during the normal execution of the car that is not that interesting.
“Although the edge device – the car – listens to this data, there’s no need to bring it back to the centralised brain. This is only done when something is recognised as unusual and there’s an opportunity to be had.”
Brobst notes that for a typical digital twin deployment, once the initial supervised learning stage is complete, less than 0.2 percent of the data is truly interesting. Even so, the rest of the data cannot simply be forgotten about.
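The pattern Brobst describes – rules learned centrally, then pushed out so that edge devices forward only unusual readings – can be sketched as below. The rule thresholds and field names are made-up assumptions standing in for whatever a real deployment would distribute to its devices:

```python
# Rules derived during the central training stage, pushed to the edge device.
# Thresholds are illustrative, not from any real engine specification.
RULES = {"coolant_temp_max": 110.0, "vibration_max": 4.5}

def edge_filter(reading: dict, rules: dict = RULES) -> bool:
    """Return True if a reading is unusual and should be sent upstream."""
    return (reading["coolant_temp"] > rules["coolant_temp_max"]
            or reading["vibration"] > rules["vibration_max"])

# A stream of sensor readings from the monitored engine.
stream = [
    {"coolant_temp": 88.0, "vibration": 1.2},
    {"coolant_temp": 121.5, "vibration": 1.3},   # overheating: worth forwarding
    {"coolant_temp": 90.2, "vibration": 0.9},
]

# Only readings that breach a rule leave the edge device.
forwarded = [r for r in stream if edge_filter(r)]
print(len(forwarded))
```

In this toy stream, only one of three readings reaches the ‘centralised brain’; at production scale, that same filtering is what cuts the interesting fraction down to figures like the 0.2 percent Brobst cites.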
Implementing robust security protocols around all the collected information will be vital if digital twins are to avoid the kind of breach that could undermine the technology.
Businesses that are keen to enjoy the cost benefits of deploying digital twins will, of course, have to ensure they have the right hardware and software in place first. While this may require a certain degree of investment, it is likely that many firms will already have a lot of the infrastructure in place.
“Most companies in the B2B sector are already collecting the necessary data for digital twins, so investing in sensors and network infrastructure is not the main barrier to adopting digital twins,” Brobst noted. “The main barrier is having the sophistication to actually use that data, so I would argue that it is not an economic barrier, but a skill set barrier. As this is a relatively emerging technology, the challenge is in getting the data scientists and staff expertise required to deploy digital twins.”
Moreover, some of the benefits of digital twins are likely to take precedence over financial concerns. Construction and manufacturing are two of the fields most likely to utilise digital twins and also have some of the highest incidences of workplace accidents.
By predicting when a fault may occur, businesses can identify dangerous situations and prioritise employee safety more effectively. While safety issues are always unplanned, by detecting a problem in advance, businesses may be able to intervene before an employee gets hurt. The technology could save a company some money, but it could also save a life.