Machine Vision Systems with LabVIEW Real-Time

Overview

Building your machine vision system with LabVIEW Real-Time provides an environment for complete deterministic control. Because the underlying real-time operating system uses a simplified kernel, there is less chance of latency or delay caused by interrupt service routines. Machine vision systems running LabVIEW Real-Time also integrate easily with other measurement and automation devices: high-speed interconnects between devices in a PXI chassis enable advanced measurement capabilities and let devices communicate without host intervention.

Introduction

Machine vision systems are becoming more powerful and affordable as processing power increases and prices decrease. A quick glance at an automobile manufacturing line reveals dexterous robots responding effortlessly to feedback from mounted cameras. In this high-throughput manufacturing environment, machine vision system reliability means increased production and increased safety: downtime can cost millions of dollars, and an errant move can put workers in peril.

Real-Time System Characteristics


Real-time systems process and respond to data within a fixed upper time limit. A real-time system is a deterministic system, but that does not necessarily mean a fast system. ATMs provide real-time responses, albeit in seconds; that is sufficient for the average consumer but obviously far too long for a servo control application. The two systems have very different timing requirements, yet both are real time as long as their response time is bounded. The defining element of a real-time system is that the process completes within a set amount of time, every time. In other words, a real-time system guarantees an upper bound on response time, and that bound may or may not be relatively fast.
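To make the idea of a bounded response time concrete, the short C sketch below (a plain illustration, not part of the LabVIEW Real-Time toolchain) times each pass of a loop and flags any cycle that exceeds its deadline. The 10 ms deadline and the process_cycle workload are arbitrary assumptions; the point is only that a real-time system is one for which this check never fails, whether the bound is microseconds or seconds.

#include <stdio.h>
#include <time.h>

/* Hypothetical placeholder for the work done each cycle
 * (for example, reading a sensor and updating an output). */
static void process_cycle(void)
{
    volatile long x = 0;
    for (long i = 0; i < 100000; i++)
        x += i;
}

int main(void)
{
    const double deadline_ms = 10.0;   /* assumed upper bound for this sketch */
    struct timespec start, end;

    for (int cycle = 0; cycle < 1000; cycle++) {
        clock_gettime(CLOCK_MONOTONIC, &start);
        process_cycle();
        clock_gettime(CLOCK_MONOTONIC, &end);

        double elapsed_ms = (end.tv_sec - start.tv_sec) * 1e3 +
                            (end.tv_nsec - start.tv_nsec) / 1e6;

        /* A real-time system is one in which this check never fails:
         * the response time is bounded, fast or not. */
        if (elapsed_ms > deadline_ms)
            printf("cycle %d missed its %.1f ms deadline (%.3f ms)\n",
                   cycle, deadline_ms, elapsed_ms);
    }
    return 0;
}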

Building Machine Vision Systems with LabVIEW Real-Time


An embedded machine vision system consists of several components. First, you need a development machine, or host computer. On this computer you do all the necessary software development, including acquiring sample images, prototyping vision strategies, and benchmarking the application. After you are satisfied with your imaging strategy, you deploy the code to a rugged, embedded computer running a real-time operating system. First, generate LabVIEW code from Vision Builder. You can modify this code and integrate additional functionality, such as data acquisition, remote monitoring, or motion control. Next, download the code to the target embedded PXI controller. The PXI system runs autonomously from the host computer, and the real-time operating system on the target ensures system reliability. To monitor the inspection, you can send image data to the host computer over Ethernet, or you can view image data, complete with graphical overlays and text, through the video-out port on the PXI controller.
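For the Ethernet monitoring step, the deployed application simply streams image data to the host; in LabVIEW this is done graphically with the TCP functions. The C sketch below is only a rough text equivalent of that idea, assuming POSIX sockets, an 8-bit grayscale image buffer, and a hypothetical host address and port; it is not the protocol LabVIEW or NI Vision actually uses.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Send one 8-bit grayscale image to a monitoring host: a small
 * header (width, height) followed by the raw pixel data. */
static int send_image(const char *host, uint16_t port,
                      const uint8_t *pixels, uint32_t width, uint32_t height)
{
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    if (sock < 0)
        return -1;

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    inet_pton(AF_INET, host, &addr.sin_addr);

    if (connect(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        close(sock);
        return -1;
    }

    uint32_t header[2] = { htonl(width), htonl(height) };
    send(sock, header, sizeof(header), 0);
    send(sock, pixels, (size_t)width * height, 0);

    close(sock);
    return 0;
}

int main(void)
{
    static uint8_t image[480][640];           /* placeholder image buffer */
    /* "192.168.0.10" and port 5005 are hypothetical host settings. */
    if (send_image("192.168.0.10", 5005, &image[0][0], 640, 480) != 0)
        fprintf(stderr, "could not reach monitoring host\n");
    return 0;
}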


Figure: Real-Time Vision Development System

Inspection System and Data-Driven Functions


LabVIEW Real-Time running data acquisition functions provides an environment for complete deterministic control. In contrast, an embedded machine vision system is not deterministic, because vision algorithms are, by nature, data driven: image content, rather than image size, determines the speed of a particular image-processing step, so the execution time of a data-driven algorithm is not fixed. For example, performing particle analysis on an image containing three particles completes faster than performing the same analysis on an image of the same size (resolution) containing seven particles. This difference in computation time grows as the variance in the image increases and as you perform more time-variant processing and analysis functions.
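The following C sketch (a stand-in for IMAQ Vision, using a naive flood-fill particle counter and two synthetic 640 x 480 images) illustrates the point: both images are the same size and type, but the one with more particles requires more flood-fill work and therefore typically takes longer to analyze.

#include <stdint.h>
#include <stdio.h>
#include <time.h>

#define W 640
#define H 480

/* Count 4-connected white particles in a binary image with an iterative
 * flood fill.  The work depends on how many foreground pixels and
 * particles the image contains, not just on its size.  Note that the
 * image is modified in place (visited pixels are marked). */
static int count_particles(uint8_t img[H][W])
{
    static int stack[W * H];
    int count = 0;

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            if (img[y][x] != 255)
                continue;
            count++;
            int top = 0;
            stack[top++] = y * W + x;
            img[y][x] = 1;                    /* mark visited */
            while (top > 0) {
                int p = stack[--top];
                int py = p / W, px = p % W;
                const int dx[4] = { 1, -1, 0, 0 };
                const int dy[4] = { 0, 0, 1, -1 };
                for (int k = 0; k < 4; k++) {
                    int nx = px + dx[k], ny = py + dy[k];
                    if (nx >= 0 && nx < W && ny >= 0 && ny < H &&
                        img[ny][nx] == 255) {
                        img[ny][nx] = 1;
                        stack[top++] = ny * W + nx;
                    }
                }
            }
        }
    }
    return count;
}

/* Draw a filled square "particle" into a binary image. */
static void add_particle(uint8_t img[H][W], int cx, int cy, int size)
{
    for (int y = cy; y < cy + size && y < H; y++)
        for (int x = cx; x < cx + size && x < W; x++)
            img[y][x] = 255;
}

static void time_count(uint8_t img[H][W], const char *label)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    int n = count_particles(img);
    clock_gettime(CLOCK_MONOTONIC, &t1);
    double ms = (t1.tv_sec - t0.tv_sec) * 1e3 +
                (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("%s: %d particles, %.3f ms\n", label, n, ms);
}

int main(void)
{
    static uint8_t a[H][W], b[H][W];          /* two equal-size images */

    for (int i = 0; i < 3; i++)               /* three particles */
        add_particle(a, 50 + 180 * i, 100, 100);
    for (int i = 0; i < 7; i++)               /* seven particles */
        add_particle(b, 40 + 150 * (i % 4), 250 + 110 * (i / 4), 100);

    time_count(a, "image A (3 particles)");
    time_count(b, "image B (7 particles)");
    return 0;
}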

To accommodate this lack of determinism, benchmark your worst-case scenario: analyze an image containing the maximum expected number of particles with your full set of image processing steps and measure the execution time. This technique establishes the upper bound on the response time of the system.
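A benchmarking harness for this can be as simple as the C sketch below, which runs a placeholder inspect_worst_case_image routine many times and records the maximum observed execution time; in practice, that routine would run your actual processing steps on a stored worst-case image.

#include <stdio.h>
#include <time.h>

/* Placeholder for the full inspection: in practice this would run the
 * actual image-processing steps on a stored worst-case image
 * (maximum expected particle count). */
static void inspect_worst_case_image(void)
{
    volatile long sink = 0;
    for (long i = 0; i < 500000; i++)
        sink += i;
}

int main(void)
{
    const int iterations = 1000;
    double worst_ms = 0.0;
    struct timespec t0, t1;

    for (int i = 0; i < iterations; i++) {
        clock_gettime(CLOCK_MONOTONIC, &t0);
        inspect_worst_case_image();
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ms = (t1.tv_sec - t0.tv_sec) * 1e3 +
                    (t1.tv_nsec - t0.tv_nsec) / 1e6;
        if (ms > worst_ms)
            worst_ms = ms;
    }

    /* The maximum observed time over many runs of the worst-case image
     * is the upper bound to design the rest of the system around. */
    printf("worst-case inspection time over %d runs: %.3f ms\n",
           iterations, worst_ms);
    return 0;
}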

Of the more than 200 functions in IMAQ Vision, some are less variable. These functions, such as threshold, are not entirely data driven and are therefore less time-variant: as long as the image size and image type remain the same, the threshold function operates on the image with significantly less jitter. Many image-processing functions fall into this category because they perform a discrete operation on each pixel in the image, regardless of image content.
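The C sketch below is not the IMAQ Vision threshold function itself, just a plain illustration of why such per-pixel operations are less time-variant: every pixel receives the same fixed amount of work, so execution time depends on image size and type rather than on image content.

#include <stdint.h>
#include <stddef.h>

/* A simple binary threshold: each pixel gets exactly the same, fixed
 * amount of work regardless of image content, so execution time depends
 * only on image size and type, not on what the image shows. */
static void threshold_u8(const uint8_t *src, uint8_t *dst,
                         size_t num_pixels, uint8_t level)
{
    for (size_t i = 0; i < num_pixels; i++)
        dst[i] = (src[i] >= level) ? 255 : 0;
}

int main(void)
{
    enum { W = 640, H = 480 };
    static uint8_t src[W * H], dst[W * H];   /* placeholder 8-bit image */

    threshold_u8(src, dst, (size_t)W * H, 128);
    return 0;
}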

Integration with Other Automation and Measurement Devices


High-speed interconnects between devices in a PXI chassis enable advanced measurement capabilities and facilitate communication between devices without host intervention. An eight-line trigger bus links all PXI slots in a bus segment, so multiple devices can interact and control each other's events through hardware. A local bus, defined to the right and left of each slot, provides 13 lines per device slot for private communication between adjacent slots; pairs of peripherals can pass analog or digital signals back and forth over it. For example, you can pass trigger information among your DAQ, motion, and image acquisition devices to capture voltage, coordinate position, and image data simultaneously.
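As an illustration of hardware triggering, the sketch below uses the NI-DAQmx C API to arm a finite analog acquisition on a rising edge of PXI trigger line 0. Error checking is omitted, and the device name and trigger terminal ("PXI1Slot2", "/PXI1Slot2/PXI_Trig0") are placeholders; an image acquisition or motion device routed to the same trigger line would start on the same hardware edge, without host intervention. The article's workflow would configure the equivalent graphically with the DAQmx VIs in LabVIEW; the C API is shown here only because it can be written out as text.

#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    float64    data[1000];
    int32      samples_read = 0;

    /* Acquire 1000 voltage samples at 10 kS/s, starting on a hardware
     * trigger rather than a software call from the host. */
    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "PXI1Slot2/ai0", "", DAQmx_Val_RSE,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 1000);

    /* Arm the task to start on a rising edge of PXI trigger line 0. */
    DAQmxCfgDigEdgeStartTrig(task, "/PXI1Slot2/PXI_Trig0", DAQmx_Val_Rising);

    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       data, 1000, &samples_read, NULL);
    printf("acquired %d samples after the hardware trigger\n",
           (int)samples_read);

    DAQmxClearTask(task);
    return 0;
}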

Low-Cost, Reliable Systems


Manufacturing processes, many of which still rely on human inspection or expensive proprietary vision systems, can now benefit from low-cost, high-speed, and easy-to-use embedded vision systems.