Predictive Maintenance: Getting the most out of your Digital Twin
In manufacturing, most companies already follow a preventive maintenance schedule, but introducing a predictive maintenance strategy can save time and reduce costs.
As an example, first consider the economy car you may have purchased several months ago. It’s affordable, reliable and “good on gas”! You’d like to keep it in great shape for when you eventually sell it, and the user manual says to get the oil replaced every four months, or 8000 km. Since it’s now been four months, you take your car to the auto shop for that expensive oil change that supposedly keeps your car running in great condition. This is an example of preventive maintenance. This scheduled approach can lead to unnecessary costs and maintenance fixes regardless of the actual state of the vehicle or its parts.
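The scheduled trigger in the car example boils down to a simple rule: service when either the time interval or the distance interval elapses, no matter the oil's actual condition. A minimal sketch, with the interval values taken from the example and the dates purely illustrative:

```python
from datetime import date, timedelta

# Preventive-maintenance rule from the car example: service when EITHER
# the time interval OR the distance interval elapses, regardless of the
# oil's actual condition.
SERVICE_INTERVAL_DAYS = 120    # roughly four months
SERVICE_INTERVAL_KM = 8000

def oil_change_due(last_service: date, today: date, km_since_service: float) -> bool:
    """Preventive trigger: purely calendar/odometer based."""
    overdue_by_time = (today - last_service) >= timedelta(days=SERVICE_INTERVAL_DAYS)
    overdue_by_distance = km_since_service >= SERVICE_INTERVAL_KM
    return overdue_by_time or overdue_by_distance

# Four months have passed, but the car has only driven 2,500 km:
# the rule still says "service it", whatever the oil looks like.
print(oil_change_due(date(2023, 1, 1), date(2023, 5, 1), 2500))  # True
```

A condition-based (predictive) rule would instead look at measured oil quality or engine data, which is exactly the shift the rest of this article describes.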
With system-level modeling tools like MapleSim, the creation of a physics-based Digital Twin can begin alongside the design process. Digital Twins can now be created before a physical product is finalized, providing a powerful test platform to validate product performance earlier than ever. Tools like MapleSim let engineers model the system as a whole, which becomes invaluable in understanding the complex interactions between different subsystems. Engineers can thus drill deeper into each component and optimize the system across domains earlier in their design process.
Using vast amounts of data from sensors on the product, engineers can create predictive models that assist with the overall diagnostics and improvement of the machine. Many factors determine the maintenance schedule for a machine, but the impact of dynamic loading on bearings, gears, and motors is frequently overlooked because it is difficult to predict without a digital twin in place. Predictive maintenance is by far one of the most useful applications of a digital twin: putting the twin through a proposed duty cycle can help determine the loads on these components and the impact on component life. This use of digital twins has long been an important tool in aerospace and automotive applications, but advances in communications technology, sensors, and AI are encouraging expansion of the digital twin approach across other industries.
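To make the link between duty-cycle loads and component life concrete, one standard route is the basic bearing rating life formula from ISO 281, L10 = (C/P)^p million revolutions, with a cubic-mean equivalent load for a variable duty cycle. A minimal sketch, where the duty-cycle segments and the bearing capacity are illustrative values of the kind a digital twin simulation might produce, not data from a real datasheet:

```python
# Sketch: estimating bearing life from duty-cycle loads.
# Basic rating life (ISO 281): L10 = (C / P)^p million revolutions,
# with p = 3 for ball bearings. For a variable duty cycle, the
# equivalent load P is a cubic mean over the load segments.
# All numeric values below are illustrative, not from a real datasheet.

def equivalent_load(loads_and_fractions, p=3.0):
    """Cubic-mean equivalent load for segments of (load_N, time_fraction)."""
    return sum(f * load**p for load, f in loads_and_fractions) ** (1.0 / p)

def l10_million_revs(dynamic_capacity_N, equivalent_load_N, p=3.0):
    """Basic rating life in millions of revolutions."""
    return (dynamic_capacity_N / equivalent_load_N) ** p

# Duty cycle a digital twin might predict: 60% light load, 30% nominal, 10% peak.
duty = [(2000.0, 0.6), (4000.0, 0.3), (9000.0, 0.1)]
P = equivalent_load(duty)
life = l10_million_revs(30000.0, P)   # C = 30 kN, hypothetical bearing
print(f"Equivalent load: {P:.0f} N, L10 life: {life:.0f} million revs")
```

Note how the 10% peak-load segment dominates the equivalent load because of the cubic weighting; this is precisely the dynamic-loading effect that is hard to anticipate without simulating the duty cycle.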
Throughout the operation of a product, there will be some amount of sensor data flowing back into the Digital Twin on a regular basis. The amount of data generated by both the model and the physical product can become massive when dealing with new products, which has sometimes limited the insight that engineers could take away from the data. Now, with better software and more computing power, the industrial landscape is set for products that take full advantage of all the data coming their way.
Major Cost Savings
With preventive maintenance, machine repair is based on elapsed time and the breakdown rates of similar parts and components, so maintenance is carried out according to an expected failure date rather than collected performance data. As we saw with the car example, this can lead to unnecessary costs and downtime. Even with maintenance scheduled, unplanned downtime may still occur when machinery breaks unexpectedly, and breakdowns in critical equipment are costly to manufacturers both in repairs and in lost productivity.
By automating the data collection process and integrating it with a system-level model, manufacturers can develop a better understanding of how their systems work together, as well as how and when they will fail. The ability to predict when maintenance should be performed saves manufacturers valuable time, money, and resources, keeping them competitive in a highly demanding industry.
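In miniature, the shift from calendar-based to condition-based maintenance is a shift from a fixed date to a trigger driven by collected performance data. A minimal sketch, using a rolling mean of a hypothetical vibration sensor against an alert threshold; a real deployment would compare readings against a validated system-level model rather than a bare threshold:

```python
from collections import deque

# Sketch: condition-based maintenance trigger from collected sensor data,
# in contrast with the fixed-calendar rule of preventive maintenance.
# Sensor values and threshold are illustrative.

class ConditionMonitor:
    def __init__(self, window: int, threshold: float):
        self.readings = deque(maxlen=window)   # keep only the latest readings
        self.threshold = threshold

    def update(self, value: float) -> bool:
        """Add a sensor reading; return True if maintenance is indicated."""
        self.readings.append(value)
        rolling_mean = sum(self.readings) / len(self.readings)
        return rolling_mean > self.threshold

monitor = ConditionMonitor(window=5, threshold=2.5)  # e.g. mm/s RMS vibration
healthy = [1.1, 1.0, 1.2, 1.1, 1.0]
degrading = [2.4, 2.8, 3.1, 3.3, 3.6]

for v in healthy + degrading:
    if monitor.update(v):
        print(f"Maintenance indicated at reading {v}")
        break
```

The machine is serviced when the data says so, not when the calendar does, which is where the cost savings described above come from.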
Digital Twin for Virtual Commissioning
The capabilities of modern, physics-based Digital Twins now provide key insight throughout the engineering design process. Using modern system-level modeling tools, industries are creating Digital Twins before they deliver the physical product. To reduce risk during product integration, individual parts can be connected to the Digital Twin, which simulates input and output signals resembling the rest of the physical product, so their performance can be tested before final assembly. With these industrial data-driven processes, virtual commissioning has become another example of how good information can reduce the high costs of physical prototyping.
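The idea of testing a real part against simulated signals can be sketched in miniature. Below, a component under test (a simple drive controller) receives a speed command that stands in for the rest of the machine; both the signal generator and the controller are hypothetical placeholders, not a MapleSim API:

```python
# Sketch: virtual commissioning in miniature. A component under test
# receives simulated input signals standing in for the rest of the
# physical product. All names and values are illustrative.

def simulated_speed_setpoint(t: float) -> float:
    """Stand-in for the rest of the machine: a ramped speed command."""
    return min(100.0, 25.0 * t)  # ramp to 100 rpm over 4 s

class ControllerUnderTest:
    """The real part being integrated, e.g. a drive controller."""
    def __init__(self, gain: float = 0.5):
        self.gain = gain
        self.speed = 0.0

    def step(self, setpoint: float) -> float:
        # First-order tracking of the commanded speed.
        self.speed += self.gain * (setpoint - self.speed)
        return self.speed

ctrl = ControllerUnderTest()
for k in range(10):                      # ten 0.5 s steps
    ctrl.step(simulated_speed_setpoint(k * 0.5))
print(f"Speed after 5 s of simulated commands: {ctrl.speed:.1f} rpm")
```

The payoff is that the controller's behavior can be exercised, and integration risks surfaced, before the physical machine it will drive even exists.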
The increasing utility of Digital Twins is quickly positioning them as a necessity for new product development. With a suite of performance data and a physics-based model in hand, Digital Twins are becoming the modern engineer’s assistant for many tasks. As the technology stands now, however, there is still room to improve. Our ability to handle large sets of data has come a long way, but many companies are only beginning to incorporate system-level modeling techniques into their processes. Generating performance data from sensors can be a very costly endeavour, especially for companies used to compensating for the lack of a realistic system-level model.
The Digital Twin has come a long way since its inception and with the right tools in place, manufacturers can continue to actively adapt their design processes. While the initial investment in predictive maintenance can be costly, it allows machine repairs to be performed only when necessary, helping manufacturers save time, reduce costs and maximize resources.
Charlotte Turnbull is the Editor-in-Chief of VirtualCommissioning.com and Customer Success Engineer at Maplesoft.
Charlotte holds an undergraduate degree in computer science and a master’s degree in biomedical engineering.
Graham Jackson is the Editor-in-Chief of VirtualCommissioning.com.
A nanotechnology engineer by education, he’s got a broad background in new technologies across domains. Graham has spent most of his career communicating technical concepts to those outside of the R&D department, and building support for system-level approaches to modeling and simulation.