1. Remote Monitoring
No matter the industry, NI customers tend to produce enormous amounts of data as part of their measurement or automation applications. The data is of critical importance to the engineers and scientists involved, but aggregating and storing large amounts of data is often challenging. This is especially true for measurement or control systems deployed in remote or inaccessible locations, like inside a bridge, atop a tower in the rain forest, or within the chassis of an industrial earth-mover. Typical applications where remote monitoring and sharing of technical data are key include (among others):
- Structural health monitoring
- Environmental monitoring
- Machine condition monitoring
- Power monitoring
- Green energy monitoring (e.g., solar arrays, wind turbines)
In each of these application areas, it is extremely challenging for an engineer or technician to access one of these measurement devices to retrieve stored data, let alone hundreds or thousands of devices. Even if the data can be accessed (either physically or remotely), there still exists the secondary challenge of collecting data from multiple locations in one place and then sharing important data so that others can access it for analysis or reporting.
2. Traditional Solution
In order to aggregate, store and share data, engineers and scientists have historically had one option: build a custom solution internally. Generally, this type of solution consists of either transmitting raw data from a remote system back to a central server via TCP/IP or uploading data files created on the remote system. However, this approach is typically challenging for several reasons.
First, if an NI customer elects to build an on-premise solution, it requires significant time, energy and capital to acquire the servers, databases and network infrastructure needed to build the solution. Even then, the largest cost is often not the initial build but the ongoing maintenance necessary to ensure that the solution remains online and functional.
More recently, companies have gained the option of building their solution in a multi-tenant public cloud from providers like Microsoft, Amazon, IBM or others. While a cloud-based solution can lower the cost of getting started by removing the need to purchase and maintain the data center in which a data aggregation and storage solution would run, it requires new expertise in running an application in the cloud. Additionally, even cloud applications require ongoing maintenance to ensure they work as intended.
To put cloud storage capabilities into the hands of engineers and scientists, so that they can store their engineering or scientific data in the cloud and retrieve it later from any location, NI has introduced a new product called Technical Data Cloud.
3. Advent of the Cloud
Cloud computing is a new generation of computing that uses distant servers to provide services and storage accessed over the Internet (or “cloud”), often on a consumption-based model. Typical cloud providers like Microsoft, IBM, Amazon and Google deliver applications or services online that are accessed from a Web browser while user data is stored on remote servers. In cloud computing, those servers are typically owned and maintained by a third-party provider on a consolidated basis in a remote data center.
The rise in popularity of cloud computing is due to the desire to outsource the maintenance burden of server hardware and software, the ability to scale systems up or down on demand without large capital investments, and the convenience of being able to access your data from anywhere with an Internet connection. Productivity gains for engineers and scientists will likely be seen from the cloud in the next few years in the areas of:
- Data aggregation
- Computing power
- Access anywhere
Since the cloud is simply interconnected computer networks housed in large data centers all over the world, one key benefit is the scalability of resources. Using the cloud, users are not confined by the amount of on-site server space they have at their facility. As data is collected and calculated, it can be stored in the cloud along with other data sets. As the amount of data increases, additional space in the cloud is provisioned accordingly.
4. Technical Data Cloud Overview
Technical Data Cloud (TDC) is a high-availability cloud-based service from National Instruments designed to enable engineers and scientists to securely consolidate, store and share measurement data. TDC is a complex application housed in large, professionally administered third-party cloud data centers that are accessible from anywhere in the world.
Sending data to and retrieving data from TDC is possible from any application running on any PC or intelligent device that is connected to the public Internet via an Ethernet, WiFi, cellular or satellite connection. TDC exposes a set of RESTful web service APIs for sending and receiving data; users have two options for accessing those APIs:
- Use the LabVIEW TDC API. This API is a set of VIs (similar to a toolkit) for opening a connection to a particular TDC project and then reading, writing, updating or deleting data. Using this approach, a LabVIEW user works at a very high level (similar to file I/O) and does not need to understand low-level HTTP commands. This API also supports automatic retries in the event of a lost Internet connection and provides VIs for temporarily caching data on disk so that it is not lost.
- Create low-level HTTP requests in a text-based programming language. TDC projects can also be accessed from languages such as C, C++, C#, and Python by calling the raw RESTful web services directly, because the TDC APIs are based on standard REST web services. However, this approach is more complex and requires knowledge of programming with HTTP and XML.
Image 1. This LabVIEW code snippet illustrates the use of the TDC API to send data to the cloud.
In either case, the HTTP traffic can be encrypted using SSL, the same encryption used by banks and other institutions to secure transactions on their websites.
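As a rough illustration of the low-level approach, the sketch below assembles an HTTPS request for a hypothetical TDC upload endpoint in Python. The base URL, resource paths, authorization scheme, and XML payload layout are all assumptions made up for illustration; they are not the documented TDC API.

```python
import xml.etree.ElementTree as ET

# Hypothetical base URL -- the real TDC endpoints, authentication
# scheme, and payload schema are not documented here.
TDC_BASE_URL = "https://tdc.example.com/api"

def build_upload_request(project_id, tag, samples, api_key):
    """Assemble the URL, headers, and XML body for one data upload.

    Returns a (url, headers, body) tuple that could be handed to any
    HTTP client (urllib.request, the 'requests' library, etc.).
    """
    url = f"{TDC_BASE_URL}/projects/{project_id}/tags/{tag}/values"
    headers = {
        "Content-Type": "application/xml",
        "Authorization": f"Bearer {api_key}",  # assumed auth scheme
    }
    # Serialize the samples as a simple <values><value>...</value></values>
    # document, standing in for whatever schema the service defines.
    root = ET.Element("values")
    for sample in samples:
        ET.SubElement(root, "value").text = str(sample)
    body = ET.tostring(root, encoding="unicode")
    return url, headers, body
```

Actually transmitting the request over SSL is then just a matter of handing the tuple to an HTTPS-capable client; the LabVIEW TDC API hides all of this plumbing behind a handful of VIs.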
In the most typical use case, a LabVIEW programmer will develop a new measurement or control application (or augment an existing one) that programmatically sends data periodically to TDC via the LabVIEW TDC API. Once stored in TDC, this data can then be retrieved programmatically from one or more client applications; these clients may range from another LabVIEW application to a web page to a tablet or smartphone application.
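The store-and-forward behavior this use case relies on (periodic uploads, with data cached locally whenever the connection drops and retransmitted once it returns) can be sketched as follows. The `send` function, cache file location, and data layout are assumptions for illustration, not part of the actual TDC API.

```python
import json
import os
import tempfile

# Illustrative cache location; a real deployment would choose a
# persistent path that survives reboots.
CACHE_FILE = os.path.join(tempfile.gettempdir(), "tdc_cache.json")

def upload_or_cache(samples, send):
    """Try to upload pending batches plus the new one; on failure,
    keep everything in a local cache for the next attempt.

    'send' is a callable standing in for the real upload; it takes
    one batch and returns True on success. Returns the number of
    batches still pending after this attempt.
    """
    pending = []
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            pending = json.load(f)
    pending.append(samples)
    try:
        for batch in pending:
            if not send(batch):
                raise ConnectionError("upload rejected")
        pending = []          # everything sent; clear the backlog
    except Exception:
        pass                  # connection lost; keep backlog on disk
    with open(CACHE_FILE, "w") as f:
        json.dump(pending, f)
    return len(pending)
```

Each periodic acquisition cycle would call `upload_or_cache` with its latest batch of samples, so no data is lost across outages; this mirrors the caching and retry support the LabVIEW TDC API provides out of the box.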
Image 2. Clients communicate with TDC using IT-friendly HTTP(S) commands.
5. Benefits of Technical Data Cloud
TDC offers many important benefits to engineers and scientists who need to consolidate and share information from measurement and control systems located in dispersed or remote physical locations.
- Integration with LabVIEW. LabVIEW users can quickly and easily send data to TDC through a simple G API without having to understand low-level HTTP commands.
- Agility. Getting started with TDC is fast and easy because users aren’t required to set up or configure any servers, databases, or network infrastructure.
- Cost. Although cloud-based products and services are not always cheaper in the long run than on-premise solutions (it depends heavily on the use case), users aren’t required to make any capital investments when using TDC, and they benefit from economies of scale by sharing the same service with a large number of other users.
- Reliability. The datacenters in which TDC runs are professionally operated by companies whose core competency is cloud computing and whose goal is essentially 24/7 uptime. The TDC application itself was developed by IT experts at National Instruments and is continually monitored so that any downtime, however unlikely, is detected quickly.
- Scalability. Cloud datacenters consist of hundreds to thousands of servers, and TDC is designed specifically to take advantage of additional servers when the volume of incoming or outgoing customer requests warrants additional resources.
- Performance. The cloud offers a vast number of processors and an enormous amount of memory for handling incoming and outgoing customer data.
- Security. All data transmitted into and out of TDC can be encrypted using SSL. Once the data is stored in the cloud, the datacenter operators bring to bear their considerable expertise in security to keep the data locked down.
- Maintenance. National Instruments, in combination with the underlying cloud vendor, constantly monitors the status of TDC and addresses any issues that arise within minutes. Additionally, cloud-based applications and the underlying hardware they run on can be upgraded behind the scenes without any impact on the customer.
6. Early Access Product
TDC is currently considered an “Early Access” product targeted at early adopters (especially those developing large, distributed systems). The term “Early Access” is intended to convey to potential customers that TDC may have usability or feature gaps in its current form. NI is actively seeking TDC early adopters, but plans to be selective in terms of allowing access to the technology.
For more information on Technical Data Cloud, visit the Technical Data Cloud homepage.