Data management for test systems involves more than logging test results. While result collection is a fundamental component, access to parametric data and data analysis are important features to consider as well. If databases are being used to store test-related information, then the technology for connecting to your database should be chosen wisely.
Database connectivity for your test framework will give you the means to dynamically read and write values to and from a database. While there are a variety of ways to interact with your database management system (DBMS), communicating with a database follows five basic steps:
1. Connect to the database.
2. Open database tables.
3. Fetch data from and store data to the open database tables.
4. Close the database tables.
5. Disconnect from the database.
(Note: As you may have noticed, these steps describe a session, much like the session you use when controlling an instrument.)
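The five steps above can be sketched in a few lines of code. This is a minimal illustration, not TestStand's implementation; it uses Python's built-in `sqlite3` module as a stand-in for a production DBMS, and the table and column names are made up for the example.

```python
import sqlite3

# Step 1: Connect to the database (an in-memory SQLite database stands in
# for a production DBMS here).
conn = sqlite3.connect(":memory:")

# Step 2: Open a table to work with (created here so the example is
# self-contained).
conn.execute(
    "CREATE TABLE results (uut_serial TEXT, test_name TEXT, passed INTEGER)"
)

# Step 3: Store data to and fetch data from the open table.
conn.execute(
    "INSERT INTO results (uut_serial, test_name, passed) VALUES (?, ?, ?)",
    ("SN-0001", "voltage_check", 1),
)
rows = conn.execute("SELECT uut_serial, passed FROM results").fetchall()
print(rows)

# Steps 4 and 5: Finish work on the table and disconnect.
conn.commit()
conn.close()
```

Whatever driver technology sits underneath, the session shape stays the same: connect once, do your reads and writes, then tear everything down in reverse order.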
Structured Query Language (SQL) is commonly used to perform steps 2 through 4. That is, once a connection to the database is established, SQL statements can select tables of data and read from and write to the selected tables.
TestStand includes features that relieve you of the need to be an SQL expert. You can configure database logging through a dialog box that walks you through setting up your database connection and your schema, a set of SQL statements that maps TestStand data to your database tables. In addition to the database logging configuration, database step types let you communicate with your databases directly from your test sequences. Database logging and the step types are covered in more detail in the following sections.
Now, what about steps 1 and 5, connecting to and disconnecting from the database? One solution is to use Microsoft ActiveX Data Objects (ADO) or its successor, ADO.NET, which is built on .NET technology. ADO can connect to databases through Object Linking and Embedding for Databases (OLE-DB) providers or Open Database Connectivity (ODBC) drivers. With ADO, you get a standard programming interface to any DBMS for which an OLE-DB provider or an ODBC driver exists.
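In practice, the details of step 1 are usually captured in a connection string. The fragments below are hedged, illustrative examples only; the server, database, and account names are placeholders, and the exact keywords depend on the provider or driver you actually install.

```text
; Hypothetical OLE-DB connection string (SQL Server provider):
Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=TestResults;User ID=tester;Password=secret

; Hypothetical ODBC connection string:
Driver={SQL Server};Server=MyServer;Database=TestResults;Uid=tester;Pwd=secret
```

Because the DBMS-specific details live in the string rather than in your code, switching databases is often a matter of editing the connection string rather than rewriting the test framework.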
Database connectivity and results logging with TestStand are discussed at length in the database logging section of the TestStand Help file. This section also gives you introductory material on databases and the technology TestStand uses to communicate with databases.
Though we are in the midst of a discussion on database connectivity, data management does not have to be synonymous with a database management system. Results can certainly be logged to a database, but you may also want results logged to a file.
Results logging is rarely done for its own sake. Granted, you want a historical record of tests that have been run, but you probably want to log results because you want to use them to debug your systems, to analyze productivity or quality, or to generate reports.
There are pros and cons to using either a database or files. Working with data in a database requires some experience with SQL or a DBMS. On the other hand, the speed of database logging and reading makes near real-time reports and alerts possible. File I/O has its own set of challenges, such as performance, format, and security. One advantage of logging to a file is the inherently chronological nature of log files; if you experience a system crash, log files are a great asset for examining the cause of the failure.
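A minimal sketch of file-based results logging might look like the following. The file name, record fields, and JSON-lines format are all assumptions made for the example, not a TestStand convention; the point is that appending records preserves chronological order.

```python
import json
from datetime import datetime, timezone

LOG_PATH = "results.log"  # hypothetical log file location

def log_result(uut_serial, test_name, passed):
    """Append one timestamped result record as a line of JSON."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "uut_serial": uut_serial,
        "test_name": test_name,
        "passed": passed,
    }
    # Appending keeps the file in chronological order, which is what
    # makes log files so useful when reconstructing a system crash.
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

log_result("SN-0001", "voltage_check", True)
```

One line per record also means a half-written final line after a crash costs you at most one entry; everything logged before it remains readable.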
In the last section, we saw that TestStand has out-of-the-box database logging along with a dialog-based configuration utility. Once configured, you can log all relevant information, including the operator’s name, UUT serial number, test parameters, and results. Whether you build your own database logging features or use the functionality in TestStand, the logging is built on top of your database connectivity implementation.
As an aside, the actual database logging functionality is not native to the TestStand Engine or the TestStand Sequence Editor. The default process model included with TestStand contains customizable sequences that implement the logging features.
Refer to Appendix A, Process Model Architecture, of the TestStand User Manual for additional information.