You likely have untapped potential waiting to be discovered in your test data. In fact, we’ve found that nearly 90 percent of all test data ends up being stored and never used again. But what if we could use that test data to improve product processes and even product design?
It’s not enough to capture data—we must connect its digital thread in order to collect valuable insights. We’ve witnessed industry leaders like Jaguar Land Rover, Toyota, and Raytheon taking full advantage of their test data. Doing so has allowed them to optimize product processes and improve product design. What’s preventing you from using your data?
In the third episode of Test Talks—our video series that invites experts to weigh in on test, technology, and trends—Shelley Gretlein, vice president of software strategy, invites teams to challenge their perceptions of test data and harness the power waiting to be discovered.
Eight months ago, we were in quarantine, all stuck at home, and the kids were bored. They were driving us crazy. So we decided to send them outside. Well, a little while later, my son comes back like this. I don't know about you, but that doesn't look right. Sure enough, we get him to the ER, take an X-ray, both bones were broken. He is quite the overachiever.
A few months later, with his brace on, he was playing soccer again, PE class at home, and he broke it again, in the same place, both bones. A few months later, you guessed it, he broke it again, on a mulch-filled playground. Now, the doctors each time saw a kid with a broken arm and set the break. But as a parent, I saw a trend, a trend I really wanted to stop. I needed to find the root cause.
These were three incidents and three very important data points. And individually, they were super common. Active kids break bones all the time. Put together, though, they pointed to a potentially more serious issue. And without a robust electronic medical record infrastructure, I was left to ask: are the steroids he's taking for asthma impacting his bone density? Is there a nutritional gap? Should we run some labs?
Instead of the system automatically linking known side effects or flagging that something is worth investigating, there was no one advocating for the data to be connected, for the test to be correlated, or to find a root cause. Now, I would hazard to guess that if you're an engineer watching this, the same thing is true of you and your test teams. You're probably doing a great job treating each individual broken arm. But you're not connecting all of the dots, and therefore, you're not harnessing the full power hidden in your test data.
Now how do I know this? Well, at NI, we've found that nearly 90% of all of the test data ends up being stored in a database on a hard drive somewhere or gets deleted. But there's another way. We've witnessed some of the biggest names in the industry harnessing the power of their test data.
I always remember a really cool story from Jaguar Land Rover, where they went from using hardly any of their test data to being able to analyze 95% of it, simply by standardizing its format, and they felt the impact immediately. They reported analysis 20 times faster than their previous manual methods. Toyota was able to reduce the people hours needed for analysis by 50%. And Raytheon reduced their reporting and analysis time by 95%. The list goes on and on.
Now, I've been working with test engineers and test departments for decades. And too often the leaders I'm working with are frustrated by how the organization views their role. Many design teams see test as a necessary evil, a step that slows innovation and delays getting products to market.
But data, data can flip that misperception on its head. If you can be the one to find the insights and reveal the value in the test data, you can improve your product processes and even the product design can improve. You can feed your test data into simulations and improve their accuracy. You can feed valuable test data into manufacturing operations to find bottlenecks. You can increase your product quality and, paradoxically, you can actually decrease the time to market. Tests can speed things up, not slow them down.
Suddenly, my frustrated test engineering friends are heroes in their companies. They're delivering new insights, new ideas, and new ways of looking at how they can improve their product's performance. Now, this might sound like a nice-to-have to some of you, but I believe the pace of innovation you're dealing with right now makes it required.
The engineering challenges we're all facing, like sample sizes of one, zero defects, and increasingly sophisticated consumer goods manufactured at scale, cannot be met with yesterday's methodologies: test reports run manually and data housed in Excel sheets on a local drive. Those approaches cannot keep up with the outcomes we all want. You can't build a safe autonomous vehicle, get a person to Mars, or build the green cities we all want in the time frame we desire with those old methodologies.
So let's talk about the hidden power in your test data. You can use data for better simulations. That's kind of obvious when you can't reach the environment, like the helicopter that was successfully landed and deployed on Mars. The engineers, of course, couldn't go out and test their prototype to see how it actually performs in the 95% carbon dioxide atmosphere.
They had to rely on simulations derived from known data. How dense or, in this case, how thin is the atmosphere? How much lift will the blades generate in that atmosphere? What will the flight time be in such conditions? All of these have to be answered with data-driven simulations. But simulation isn't just helpful when you can't access the environment.
It's also helpful when you're developing something super complex and time isn't on your side. Let's think about autonomous driving. If we had to rely on test tracks to build the autonomy and figure out the algorithms, it could take up to 100 years to train those algorithms. Simulations can save massive amounts of time while also testing products more thoroughly than ever before. But for this to work, you need accurate simulations, which require good data. And where do you get that data? From test.
Now, the second area that I get pretty excited about is process optimization. And this is really exciting, because usually teams go into these types of projects focusing on one area of efficiency that they want to improve. But if they can zoom out and look at the whole process and all of the data holistically, they usually find even more savings.
My favorite example with process optimization is actually a lighting company we worked with. They found out that an intermittent network error was triggering a series of events that was causing perfectly good products to fail and then be rerouted for rework unnecessarily. They never thought there would be a connection between the IT infrastructure and their manufacturing process. But by identifying this through their test data, they actually saw a 13% increase in their manufacturing efficiency.
Now, the third area I wanted to cover is how test data can improve your product quality. Again, this is intuitive. We know test can improve quality, and you're probably doing elements of this. But are you harnessing the full power of your test data? Are you not just discovering defects quickly, but discovering them before they become a problem? The speed and precision with which you can do this with a fully connected data system is incredible.

So what does this look like in action? At an automotive company, interesting issues kept surfacing with one particular task at one particular station, and this was flagged only because they had a centralized data strategy. As they looked at the data, they were able to trace the issue back to one specific batch of defective screws whose heads were cut too shallow, causing them to fall out and slowing vehicle assembly. They removed the entire batch of screws, addressed it with the supplier, and likely avoided a recall, all because of the test data. Where they might otherwise have just dismissed it as the machine or the tester being off, they were able to get ahead of a potentially pretty rough situation. Again, all by looking at test data.
Now we at NI, we know these benefits are real, not just because of these incredible success stories we hear from our customers, but because we're discovering them for ourselves at our own manufacturing facilities. A few years ago, we saw the opportunity to connect our historically disparate testers to automate more processes and, in the end, improve the overall performance of our products. So our manufacturing team set out to perform our own digital transformation at our manufacturing site in Hungary. I think they call this eating your own dog food, which is disgusting, but it works for the analogy.
Now we knew we had challenges with siloed data sources—think data on actual local hard drives—lagging indicators leading us to reactive responses, and limited visibility into how we were actually using all of our test assets. So the team zoomed in on two areas of test: in-circuit test and our automated optical inspection. Prior to this, we were using a homegrown system that we affectionately called Omni. Well, Omni, it turns out, had a lot of limitations.
Omni was actually adding cost to our test process. Because we couldn't see component-level test data, we were over-indexing on test probe failures: instead of replacing just the 200 worn-out probes, we were replacing thousands of good ones. With that visibility restored, our in-circuit test gained significant efficiencies, increased throughput, and delivered over $100,000 in operational savings.
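To make the probe example concrete, here is a minimal, purely illustrative Python sketch of the idea: with component-level records, you can flag only the probes whose failure rate is actually elevated instead of replacing fixtures wholesale. The record format, probe IDs, and thresholds are invented for this example, not NI's actual system.

```python
# Illustrative sketch: flag only the probes that are actually failing,
# given component-level (per-probe) pass/fail records from in-circuit test.
from collections import defaultdict

def probes_to_replace(records, min_tests=50, max_fail_rate=0.05):
    """records: iterable of (probe_id, passed) tuples; returns probe IDs
    with enough observations and a failure rate above the threshold."""
    tally = defaultdict(lambda: [0, 0])  # probe_id -> [fails, total]
    for probe_id, passed in records:
        tally[probe_id][1] += 1
        if not passed:
            tally[probe_id][0] += 1
    return sorted(
        pid for pid, (fails, total) in tally.items()
        if total >= min_tests and fails / total > max_fail_rate
    )

# Example: probe "P7" fails 10 of 100 contacts; "P1" fails only 1 of 100.
records = [("P7", False)] * 10 + [("P7", True)] * 90 \
        + [("P1", False)] + [("P1", True)] * 99
print(probes_to_replace(records))  # → ['P7']
```

Without the per-probe breakdown, the same logs would only show an overall failure rate, and the cheapest response is to swap every probe; the component-level view is what makes the targeted replacement possible.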
Now we expected to improve the efficiency and our product quality, but the results have already exceeded our expectations. In the first year alone, we saw a 20% reduction in manufacturing cost, hardware returns, and test failures. And these early wins are exciting, but I bet like many of you doing your own digital transformation, we think we're only scratching the surface of the benefits here.
Our end goal is predictive decision making. Rather than just understanding more about the test failures and the process optimizations, our end game is to predict and prevent failures before they ever occur. So how do you get that high-fidelity simulation, find that network error that you weren't even looking for, or track down a faulty batch of screws?
If you are not seeing the benefits of simulation, process optimization, or defect detection right now, your first impulse might be, "I need more data." But maybe not. You likely already collect a lot of data. We know that by 2025, the world's data volume is expected to grow to 175 zettabytes. I'm guessing you capture enough.
But it's not just about acquiring the data. That's kind of the easy part. You have to do something with it. You must connect it. I like to think about actionable insights. The point isn't just to generate data but to identify the right data for the decisions you're trying to make and to put it to work for you efficiently.
So what's preventing you from using your data? Now, I could certainly give a much longer talk, and, in fact, our data scientists have, about the common pitfalls of data, how to use the five pillars of ontology, format, interface, integrity, and harmonization to measure the health of your data and how to ensure standardization, exchange, and quality. But we don't have time for that today.
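For readers who want a concrete handle on those five pillars, here is a minimal, purely illustrative Python sketch of a data-health self-assessment. The pillar names come from the talk; the 0-to-1 scoring scheme and the helper itself are assumptions for illustration, not NI's actual methodology.

```python
# Illustrative sketch: score your test-data health across the five pillars
# named in the talk. The scoring scheme is an invented example.
PILLARS = ("ontology", "format", "interface", "integrity", "harmonization")

def data_health(scores):
    """scores: dict mapping each pillar to a 0.0-1.0 self-assessment.
    Returns the overall average and the weakest pillar to fix first."""
    missing = [p for p in PILLARS if p not in scores]
    if missing:
        raise ValueError(f"unassessed pillars: {missing}")
    overall = sum(scores[p] for p in PILLARS) / len(PILLARS)
    weakest = min(PILLARS, key=lambda p: scores[p])
    return {"overall": round(overall, 2), "weakest_pillar": weakest}

print(data_health({
    "ontology": 0.8, "format": 0.9, "interface": 0.6,
    "integrity": 0.7, "harmonization": 0.4,
}))  # → {'overall': 0.68, 'weakest_pillar': 'harmonization'}
```

Even a rough self-assessment like this makes the conversation actionable: rather than "our data is a mess," the team can say which pillar to invest in first.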
So instead I will leave you with three questions. Does the data you collect lead to actionable insights on a weekly basis? Do you have a centralized function—or mom—looking at the data across your product life cycle? Does everyone who could benefit have access to your data in a meaningful way? If you answered no to any of these, you have untapped potential waiting to be discovered, and I believe some heroes on your team waiting to be made.
Now for the doctors dealing with my son's broken arm, it was really hard for them to see the bigger trend. We had an ER doctor, a pediatrician, an acupuncturist, and an orthopedic surgeon all involved and all experts but not sharing data. And we also know that a little boy with a broken arm is likely the least interesting thing a pediatric ER doctor faces on a given day, especially with the complexities and confusion in a pandemic.
But for me, he's so much more than just a little boy with a broken arm. He's my light. He's my buddy. For the doctors, though, it was hard to see that bigger picture. The tests they took were only used to solve the most immediate trauma before they moved on to the next. Each visit, we had to start the discovery process from scratch, forced to manually, in this case verbally, relay the learnings from each previous event.
This sounds so similar to the conversations I've had with test engineers where they haven't quite gotten to the root of all of their test issues. Now we haven't gotten to the root cause of my son's bone strength yet, but I'm confident that we will. But it's also painfully obvious to me how much easier the last six months would have been if the health care systems treated data as the rich resource it actually is. And I hope you don't make the same mistake. Your test data is a treasure trove of insights if you choose to make it so.
So I'll close with thoughts from one of the ultimate test engineers, in my opinion, Adam Savage, the creator of Mythbusters. He said, "In the spirit of science, there's really no such thing as a failed experiment. Any test that yields valid data is a valid test." And I'll add any process that uses that data to improve your product's performance is opening the door to you being the hero you deserve to be.