The breadth of applications that engineers create with LabVIEW has grown rapidly over the last decade. With strides made in embedded processors, such as field-programmable gate arrays (FPGAs) and multicore processors, and the increased sophistication of large mission-critical applications developed in LabVIEW, the bar has been raised for product stability, code validation, and functionality verification. As such, most of the resources for LabVIEW product development have been dedicated to improving product stability to meet critical application demands and to making the edit-time environment more responsive. The results of this effort are significant and can be seen throughout the entire LabVIEW platform. This document outlines several of the efforts NI R&D made to improve stability, includes benchmark results, and discusses LabVIEW features that address the common issues LabVIEW users experience.
1. Efforts to Improve Stability
NI tracks product problems through a database of corrective action requests (CARs). These CARs are traditionally reported by LabVIEW customers, partners, and even internal developers either online through the NI discussion forums or through the applications engineering department. Then developers prioritize, track, maintain, and ultimately fix these requests. A LabVIEW CAR is not necessarily a problem that results in a software crash—it is any problem that needs to be fixed within the product. This includes documentation, performance issues, incorrect calculations, cosmetic issues, undocumented errors, and high-severity issues such as crashes and hangs.
If it’s not measured, it’s not managed. The NI R&D team prioritizes tracking, documenting, and improving the stability of LabVIEW. Outside of in-house testing procedures, however, tracking improvements and regressions in product stability is a time-consuming process. It requires gathering input from users and collecting anecdotes and qualitative information, and gathering real-world quantitative information can prove difficult. NI monitors forums, online discussions, and one-to-one discussions involving developers, marketers, and local field representatives to better ascertain the effect of the changes.
Figure 1 shows the CARs that were created throughout the development process from July 2008 to June 2012. While LabVIEW users are a source of CARs, the LabVIEW development team uses the same database to track issues that arise during the development process. The drastic decline in this number implies that fewer problems are being introduced with new functionality, so NI developers can spend more time on existing issues from previous releases.
Figure 1. Quantitative Measure of the Number of CARs Created During the Development of the Last Five Versions of LabVIEW
What Are the Users Saying?
The LabVIEW Champions are a select group of advanced LabVIEW users who work closely with LabVIEW R&D to prioritize usability features and CAR fixes as well as heavily test LabVIEW throughout the development cycle. While working with the LabVIEW 2012 beta, they said the following:
“My experience has been excellent. I have some upcoming projects and I will not hesitate to start them off in the 2012 beta.”
– Christian Altenbach, Jules Stein Eye Institute at UCLA (USA)
“I feel that LV 2012 seems more stable than previous beta versions.”
– Benjamin Steinwender, CLD (Austria)
It’s More Than Just Stability
Stability is not the only focus. Improving the edit-time responsiveness of the entire LabVIEW platform is also a priority, and significant strides have been made here as well. Decreasing the time it takes for common actions such as loading VIs, opening property dialogs, and loading help can greatly improve the experience of developing applications using LabVIEW.
With each release of LabVIEW, NI offers an extensive public beta period during which LabVIEW customers and partners can access, develop with, test, and upgrade their code using a stable beta build. The beta program includes a discussion forum where users can interact with the developers working on key features, participate in discussions with other beta testers, and ultimately provide the feedback that is necessary to shape the product to be most useful for the engineering community.
NI Error Reporter (NIER)
With LabVIEW 2011, NI released NIER, an error reporting tool that records information about any crash that occurs on a user’s system. With the help of LabVIEW users who used NIER to send their crash reports to NI for further investigation, NI has gathered valuable information on all occurring crashes and better prioritized the issues that are causing lost productivity.
NI has applied an industry-proven approach to addressing product issues by organizing all crashes into distinct categories and focusing its investigation on the top 20 crash categories, which account for approximately 28 percent of all reported crashes. NI has been working diligently to resolve the top reported crashes in every new version following the release of this technology.
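The bucketing approach described above can be sketched in a few lines: reduce each crash report to a signature, tally the signatures, and measure how much of the total the most frequent categories cover. The report data and signature names below are purely illustrative; NIER's actual categorization is internal to NI.

```python
from collections import Counter

# Hypothetical crash reports, each already reduced to a "signature"
# (for example, the top frame of the stack trace). These names are
# made up for illustration -- they are not real NIER categories.
reports = [
    "paintOval", "drawList", "paintOval", "copyWire",
    "paintOval", "drawList", "heapCheck", "copyWire", "paintOval",
]

counts = Counter(reports)          # tally reports per crash category
top = counts.most_common(2)        # the N most frequent categories

# Fraction of all reported crashes covered by the top categories --
# the same kind of figure as the "top 20 categories, ~28 percent" above.
share = sum(n for _, n in top) / len(reports)
print(top)
print(f"{share:.0%}")
```

Sorting categories by frequency like this is what makes the strategy effective: a small number of buckets typically accounts for a disproportionate share of reports, so fixing the top buckets recovers the most lost productivity per fix.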
NI has also enhanced the LabVIEW platform such that the company can get additional information when a crash occurs, giving NI the ability to better address additional reported crashes.
Because of NIER and the customers who have taken advantage of it, newer versions provide a significantly more stable platform than previous versions. NI will continue to improve the stability of the LabVIEW platform with the help of the crash reports customers submit via NIER.
Traditionally, CAR prioritization depends on three inputs: the existence of a workaround, the potential impact to users, and the amount of development time necessary to implement a fix. For example, with a one-year release cycle for LabVIEW, the required development time has a major impact on whether a solution is implemented in the upcoming version or deferred to a future one. Each of the three inputs is weighed equally in prioritizing a CAR. Moving forward, NIER will provide additional information on how many users are affected by a particular issue, and the potential effect of the issue on customers will play a bigger part in determining which fixes are implemented.
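The three inputs above could be combined into a single ranking score along these lines. The weighting scheme here is a made-up illustration of the idea, not NI's actual formula.

```python
def car_priority(has_workaround: bool, impact: int, dev_days: int) -> float:
    """Illustrative priority score for a CAR; higher means fix sooner.

    The three inputs mirror the ones described above; the specific
    weights are hypothetical, chosen only to show the trade-offs.
    """
    workaround_factor = 0.5 if has_workaround else 1.0  # a workaround lowers urgency
    effort_factor = 1.0 / (1 + dev_days / 30)           # long fixes risk slipping a release
    return impact * workaround_factor * effort_factor   # impact on a 1-10 scale

# A crash with no workaround and a quick fix outranks a minor
# cosmetic issue that users can easily work around.
crash = car_priority(has_workaround=False, impact=9, dev_days=5)
cosmetic = car_priority(has_workaround=True, impact=2, dev_days=1)
```

A scheme like this also shows where NIER fits in: crash-report counts give the `impact` input a real-world measurement instead of an estimate.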
2. NI Has a Proven Track Record of Listening to User Feedback
National Instruments has a long track record of listening to – and acting on – the feedback of the engineers who use NI products. When NI employees say that the success of the company lies in the hands of its customers, you can believe them. In the LabVIEW Idea Exchange, a forum where users can suggest new features for LabVIEW (R&D has implemented 25 features to date), user feedback is more important than ever. Just like new features, knowing where users struggle with LabVIEW or see room for improvement can only help NI enhance the product.
The service pack releases will continue to focus on high-profile, high-impact bugs, while the next major release will continue to focus on product stability, edit-time responsiveness, and features to help you be more productive. Visit ni.com/beta to be sure you’re signed up for new releases of LabVIEW, so you can help shape the stability of LabVIEW moving forward.