Here's why your design framework needs to include a solid digital twin strategy.
Is your cell phone a convenience or a necessity? Here’s how you can tell. Several years ago, before the pandemic, I attended a conference in which the session host led an experiment intended to demonstrate our attachment to, or alternatively our independence from, advanced technology. Here’s how it worked: about a thousand people were seated in theatre-style rows. The host instructed everyone to take out their cell phones and pass them to the person on their right, in many cases someone they didn’t know. Then he asked those people to pass the phone on to someone else on their right. And then once again, to the next person to the right.
The whole experiment lasted just a few minutes, yet what started with nervous laughter quickly turned into noticeable discomfort. You could feel the audience’s anxiety rising as their phones moved further from sight. Cell phones: definitely a necessity. The question is, when did they shift from a convenience to a necessity?
The first smartphones with internet access entered the market in 2001, and by 2007, the year Apple and Google entered the smartphone market, over 1 billion phones had been sold, according to Gartner. This year, according to Statista, the number of smartphone users worldwide reached over 6.6 billion, or 83.7% of the world’s population. Now, not only are the phones themselves a necessity, so are many of the applications they carry, even ones that only recently hit the market.
Accelerated technology development and adoption
Cellular network upgrades like 4G to 5G are now an expectation. We expect service levels to increase without incident, to be always on in any condition, and to deliver on the promise of a wide range of new connections we never had before. The same goes for faster processors, more memory, better battery life, and networked applications that transition seamlessly across our many devices.
We no longer view new technologies as novelties. We listen to the promise, we try them, and if they work, we rapidly adopt them. The more they work, the more we come to rely on them. The transition from novelty to expectation has shrunk from years to months to, sometimes, weeks.
Now, imagine you are a developer filled with the excitement of working on something new that will revolutionize how we live and function. You come up with all sorts of new features, new algorithms, new protocols aimed at automating something that used to be manual and cumbersome. You envision a lot of new use cases; you consider all the operating conditions you need to emulate; you think through your power budget and how much power each feature will consume; you work through every operating mode and user interface considering all the ways in which someone might use your new technology, both as you intended and not.
Your test matrix is now quite large, and every software scrum session, every hardware interface, every operating mode’s impact on battery life needs to be retested every time you make the slightest change. What happens if you miss one or two, and a feature doesn’t work as you intended? Your customers’ expectations drop, and you might find your product never moves beyond novelty. You’ve missed the adoption window.
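To see how quickly that test matrix grows, consider a rough sketch in Python. The dimension names below (features, modes, interfaces, power states) are hypothetical, chosen only to illustrate how a few small sets of variables multiply into a large retest burden:

```python
from itertools import product

# Hypothetical test dimensions for a connected device (names illustrative only)
features = ["telemetry", "ota_update", "low_power_sync"]
operating_modes = ["active", "standby", "sleep"]
hw_interfaces = ["wifi", "bluetooth", "cellular"]
power_states = ["battery_low", "battery_full", "charging"]

# The full cross-product: every combination must be retested after a change
test_matrix = list(product(features, operating_modes, hw_interfaces, power_states))
print(len(test_matrix))  # 3 * 3 * 3 * 3 = 81 cases from just four small dimensions
```

Add one more feature or one more operating mode and the count jumps again, which is why missing even a single combination is so easy.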
As connected technologies continue to evolve at a breakneck pace, our 5G infrastructure will play an increasingly critical role in how much we rely on new technologies. Our expectations for 5G are far greater than what we expected when 4G or 3G networks were introduced. New technologies enabled by the 5G network will move quickly from novelty to convenience to expectation to reliance – if they pass the initial audition.
The validation challenge and digital twin solution
All this—the complexity, the reliability, the interactions, and the faster pace of new technology introduction—presents a new development paradigm for technology companies wanting to push the limits of innovation: a need for farther-reaching digital twin development environments.
Hardware developers have long relied on emulation environments as part of layout before prototyping. Using emulators, or digital twins, reduces the number of design variables by allowing them to measure the impact of different operating environments, conditions, and protocol evolutions against known good references. Similarly, software developers use scrum methods and test in emulation sandboxes to incrementally build and deploy new features in smaller groupings, also to limit the number of variables.
The rising complexity of product interactions—communication protocols that evolve, cloud platforms that evolve, continuous software and firmware updates—poses real challenges for developers, as each represents a slew of new variables. Using continuously updated digital twins wherever possible lets development teams reduce the variables specific to their own design. Reducing design variables against ‘known good’ digital twin references dramatically increases the likelihood that innovations function in practice just as they were envisioned in the developer’s mind. In short, a digital twin methodology speeds a product’s time to market and reduces the risk of missing the adoption window due to product failures.
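The ‘known good’ reference idea can be sketched in a few lines of Python. Everything here is hypothetical—the throughput model, the 2% deviation, and the tolerance are made-up stand-ins—but the shape is the point: a twin supplies the expected behavior for each operating condition, and validation reduces to checking the design under test against it:

```python
# Minimal sketch of validating a design against a 'known good' digital twin
# reference. All names, models, and numbers are hypothetical illustrations.

def reference_twin_throughput(signal_strength_dbm: float) -> float:
    """Known-good twin: expected throughput (Mbps) for a given signal strength."""
    # Hypothetical model: full throughput above -70 dBm, linear degradation below
    if signal_strength_dbm >= -70:
        return 100.0
    return max(0.0, 100.0 + (signal_strength_dbm + 70) * 2.0)

def device_under_test_throughput(signal_strength_dbm: float) -> float:
    """Stand-in for measurements from the new design (hypothetical 2% shortfall)."""
    return reference_twin_throughput(signal_strength_dbm) * 0.98

def validate_against_twin(conditions, tolerance=0.05):
    """Return the operating conditions where the design deviates from the twin."""
    failures = []
    for dbm in conditions:
        expected = reference_twin_throughput(dbm)
        measured = device_under_test_throughput(dbm)
        if expected > 0 and abs(measured - expected) / expected > tolerance:
            failures.append(dbm)
    return failures

print(validate_against_twin(range(-100, -40, 10)))  # [] -> within tolerance everywhere
```

Because the twin is the fixed reference, any failure the sweep flags points at the design change itself rather than at an uncontrolled environment variable.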
Whether you are working on the next IoT device, an autonomous driving vehicle, the next cellular standard, or quantum computing, you only get one chance to make a first impression with your new innovation. Expectations are high, so if you want to become the next necessity, not just another novelty, ensure your design framework includes a solid digital twin strategy to readily meet today’s rapid adoption expectations.
About the Author
Jeff Harris is the Vice President, Global Corporate and Portfolio Marketing, at Keysight Technologies.