Developing Real-Time Control for the World's Largest Telescope Using NI LabVIEW With Multicore Functionality

Jason Spyromilio, European Southern Observatory

"NI engineers proved that we can, in fact, use LabVIEW and the LabVIEW Real-Time Module to implement a COTS-based solution and control multicore computation for real-time results."

- Jason Spyromilio, European Southern Observatory

The Challenge:

Using commercial off-the-shelf (COTS) solutions for high-performance computing (HPC) in active and adaptive optics real-time control in extremely large telescopes.

The Solution:

Combining the NI LabVIEW graphical programming environment with multicore processors to develop a real-time control system and prove that COTS technology can control the optics in the European Extremely Large Telescope (E-ELT), which is currently in the design and prototyping phases.

The European Southern Observatory (ESO) is an astronomical research organization supported by 13 European countries. We have experience developing and deploying some of the world’s most advanced telescopes. Our organization currently operates at three sites in the Chilean Andes: the La Silla, Paranal, and Chajnantor observatories. We have always employed highly innovative technology, from the first common-user adaptive optics systems at the 3.6 m telescope on La Silla to the deployment of active optics at La Silla’s 3.5 m New Technology Telescope (NTT) to the integrated operation of the Very Large Telescope (VLT) and the associated interferometer at Paranal. In addition, we are collaborating with our North American and East Asian partners in constructing the Atacama Large Millimeter Array (ALMA), a $1 billion (USD) 66-antenna submillimeter telescope scheduled for completion at the Llano de Chajnantor in 2012.

 

The next project on our design board is the E-ELT. The design for this telescope, with a 42 m diameter primary mirror, is in phase B and has received $100 million (USD) in funding for preliminary design and prototyping. After phase B, construction is expected to start in late 2010.

 

Grand-Scale Active and Adaptive Optics

The 42 m telescope draws on the experience of ESO and the astronomical community with active and adaptive optics and segmented mirrors. Active optics incorporates a combination of sensors, actuators, and a control system so that the telescope can maintain the correct mirror shape and collimation. We actively maintain the correct configuration of the telescope to reduce any residual aberrations in the optical design and to increase efficiency and fault tolerance. These telescopes require active optics corrections every minute of the night so that the images are limited only by atmospheric effects.

 

Adaptive optics uses a similar methodology to monitor the atmospheric effects at frequencies of hundreds of hertz and corrects them using a suitably deformed thin mirror. The turbulence scale length determines the number of actuators on these deformable mirrors. The wave front sensors must run fast enough to sample the atmosphere, and transforming the measured aberrations into mirror commands requires very fast hardware and software.

 

Controlling such a complex system requires an extreme amount of processing capability. For systems deployed in the past, we developed proprietary control systems based on VMEbus real-time control, which can be expensive and time-consuming. We are working with National Instruments engineers to benchmark the control system for the E-ELT primary segmented mirror, called M1, using COTS software and hardware. Together we are also exploring possible COTS-based solutions for real-time control of the telescope’s adaptive mirror, called M4.

 

M1 is a segmented mirror that consists of 984 hexagonal mirrors (Figure 1), each weighing nearly 330 lb with a diameter between 1.5 and 2 m, for a total diameter of 42 m. In comparison, the primary mirror of the Hubble Space Telescope has a 2.4 m diameter. The primary mirror of the E-ELT alone will measure four times the size of that of any optical telescope on Earth, and the telescope will incorporate five mirrors in its design (Figure 2).

 

Defining the Extreme Computational Requirements of the Control System

In M1 operation, adjacent mirror segments may tilt with respect to one another. We monitor this deviation using edge sensors, and each segment sits on actuator legs that can move it in three degrees of freedom when needed. Across the 984 mirror segments, this adds up to 3,000 actuators and 6,000 sensors (Figure 3).

 

The system, controlled by LabVIEW software, must read the sensors to determine the mirror segment locations and, if the segments move, use the actuators to realign them. LabVIEW computes the product of a 3,000 by 6,000 control matrix with a 6,000-element sensor vector and must complete this computation 500 to 1,000 times per second to produce effective mirror adjustments.
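
To make the scale concrete, this control step can be sketched in a few lines of NumPy (Python is used here as a textual stand-in for the LabVIEW implementation; the matrix contents are random placeholders, not the real control matrix):

```python
# Minimal sketch of the M1 control step: y = M @ s, where M is the
# 3,000 x 6,000 control matrix and s holds the 6,000 edge-sensor values.
# Single precision is assumed, matching the benchmarks reported below.
import time
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3_000, 6_000)).astype(np.float32)  # placeholder control matrix
s = rng.standard_normal(6_000).astype(np.float32)           # placeholder sensor vector

t0 = time.perf_counter()
actuator_commands = M @ s          # 3,000 actuator outputs
elapsed = time.perf_counter() - t0

# To keep up with the mirror, this product must finish in 1 to 2 ms
# (500 to 1,000 corrections per second).
print(f"one control step: {elapsed * 1e3:.2f} ms")
```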

 

Sensors and actuators also control the M4 adaptive mirror. However, M4 is a thin deformable mirror, 2.5 m in diameter and spread over 8,000 actuators (Figure 4). This problem is similar to the M1 active control, but instead of retaining the shape, we must adapt the shape based on measured wave front image data. The wave front data maps to a 14,000-value vector, and we must update the 8,000 actuators every few milliseconds, creating a matrix-vector multiply of an 8 k by 14 k control matrix with a 14 k vector. Rounding the computational challenge up to 9 k by 15 k, this requires about 15 times the computation of the large segmented M1 control.

 

We were already working with NI on a high-channel-count data acquisition and synchronization system when NI engineers began working on the math and control problem. They are simulating the layout and designing the control matrix and control loop. At the heart of all these operations is a very large LabVIEW matrix-vector function that executes the bulk of the computation. M1 and M4 control requires enormous computational ability, which we approached with multiple multicore systems. Because M4 control decomposes into fifteen 3 k by 3 k submatrix problems, we require 15 machines, each containing as many cores as possible. Therefore, the control system must command multicore processing, a capability that LabVIEW offers using COTS solutions, making it a very attractive proposition for this problem.
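
A sketch of that decomposition (again NumPy as a stand-in for the LabVIEW implementation): the 9 k by 15 k matrix splits into a 3 by 5 grid of 3 k by 3 k blocks, giving fifteen independent submatrix products, one per machine, whose partial results are then summed per row block:

```python
# Block decomposition of the M4 matrix-vector product: fifteen
# 3k x 3k blocks, each representing one machine's share of the work.
# Matrix contents are random placeholders.
import numpy as np

B = 3_000                                   # block edge length
rng = np.random.default_rng(0)
A = rng.standard_normal((3 * B, 5 * B)).astype(np.float32)  # 9k x 15k (~540 MB)
x = rng.standard_normal(5 * B).astype(np.float32)           # 15k vector

# Each (i, j) pair below is one machine's 3k x 3k submatrix problem.
partial = {
    (i, j): A[i*B:(i+1)*B, j*B:(j+1)*B] @ x[j*B:(j+1)*B]
    for i in range(3) for j in range(5)
}

# Reduction step: sum the five partial vectors in each row block.
y = np.concatenate([sum(partial[i, j] for j in range(5)) for i in range(3)])
assert np.allclose(y, A @ x, rtol=1e-3, atol=1e-3)  # matches the full product
```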

 

Addressing the Problem with LabVIEW Multicore HPC Functionality

Because the control system must be engineered before actual E-ELT construction, the system configuration could affect some of the construction characteristics of the telescope. It was therefore critical that we thoroughly test the solution as if it were running the actual telescope. To meet this challenge, NI engineers implemented not only the control system but also a system that runs a real-time simulation of the M1 mirror to perform a hardware-in-the-loop (HIL) control system test. HIL is a testing method commonly used in automotive and aerospace control design to validate a controller against an accurate, real-time system simulator. NI engineers created an M1 mirror simulator that responds to the control system outputs and validates its performance. The NI team developed the control system and mirror simulation using LabVIEW and deployed them to multicore PCs running the LabVIEW Real-Time Module for deterministic execution.

 

In real-time HPC applications such as this, communication and computation tasks are closely related: failures in the communication system result in whole-system failures. Therefore, the application development process must include the design of the interplay between communication and computation. NI engineers needed a fast, deterministic data exchange at the core of the system and immediately determined that the application could not rely on standard Ethernet for communication because the underlying network protocol is nondeterministic. Instead, they used the time-triggered network feature of the LabVIEW Real-Time Module to exchange data between the control system and the M1 mirror simulator, resulting in a network that moves 36 MB/s deterministically.
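
The 36 MB/s figure is consistent with a simple back-of-the-envelope check (our arithmetic, not an NI-published derivation): 6,000 sensor values plus 3,000 actuator values per cycle, as 4-byte single-precision floats, at a 1 kHz loop rate:

```python
# Back-of-the-envelope check of the 36 MB/s network throughput figure.
# Assumes single-precision values and one control cycle per millisecond.
values_per_cycle = 6_000 + 3_000       # sensor + actuator channels
bytes_per_value = 4                    # single-precision float
loop_rate_hz = 1_000                   # one cycle per millisecond

throughput = values_per_cycle * bytes_per_value * loop_rate_hz
print(f"{throughput / 1e6:.0f} MB/s")  # -> 36 MB/s
```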

 

NI developed the full M1 solution incorporating two Dell Precision T7400 workstations, each with eight cores, plus a notebook that provides an operator interface. It also includes two networks: a standard network that connects both real-time targets to the notebook and a gigabit time-triggered Ethernet network between the real-time targets for exchanging I/O data (Figure 5).

 

As for system performance, the controller receives 6,000 sensor values, executes the control algorithm to align the segments, and outputs 3,000 actuator values during each loop. The NI team built the control system to achieve these results and produced a real-time simulation of the telescope in operation, called “the mirror.” The mirror receives the 3,000 actuator outputs, adds a variable representing atmospheric disturbances such as wind, executes the mirror algorithm to simulate M1, and outputs 6,000 sensor values to complete the loop. The entire control loop completes in less than 1 ms, fast enough to adequately control the mirror (Figure 6).
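
A minimal, purely illustrative sketch of this HIL loop follows; the controller and mirror matrices below are random placeholders scaled for stability, not ESO’s actual M1 model:

```python
# Illustrative HIL loop: a controller and a simulated mirror exchange
# vectors once per cycle. C and P are random placeholders, not the real
# M1 control matrix or mirror response.
import numpy as np

rng = np.random.default_rng(0)
C = rng.standard_normal((3_000, 6_000)).astype(np.float32) * 1e-4  # placeholder controller
P = rng.standard_normal((6_000, 3_000)).astype(np.float32) * 1e-4  # placeholder mirror response

sensors = np.zeros(6_000, dtype=np.float32)
for cycle in range(5):                      # a few simulated 1 ms cycles
    actuators = C @ sensors                 # controller aligns the segments
    wind = 0.01 * rng.standard_normal(6_000).astype(np.float32)  # disturbance
    sensors = P @ actuators + wind          # mirror simulator closes the loop
    rms = float(np.sqrt(np.mean(sensors**2)))
    print(f"cycle {cycle}: sensor RMS = {rms:.4f}")
```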

 

The benchmarks NI engineers established for their matrix-vector multiplications include the following (a comparable measurement is sketched after the list):

  • LabVIEW Real-Time Module on a machine with two quad-core processors, using four cores and single precision: 0.7 ms
  • LabVIEW Real-Time Module on a machine with two quad-core processors, using eight cores and single precision: 0.5 ms
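
For comparison on commodity hardware, a similar measurement can be scripted with NumPy and the threadpoolctl package (a sketch only; absolute timings depend on the machine and BLAS library and will not reproduce NI’s figures):

```python
# Sketch of a comparable matrix-vector benchmark at varying core counts.
# How well the multiply scales with cores depends on the BLAS library.
import time
import numpy as np
from threadpoolctl import threadpool_limits

rng = np.random.default_rng(0)
M = rng.standard_normal((3_000, 6_000)).astype(np.float32)
s = rng.standard_normal(6_000).astype(np.float32)

for cores in (4, 8):
    with threadpool_limits(limits=cores):
        t0 = time.perf_counter()
        for _ in range(100):
            M @ s
        dt = (time.perf_counter() - t0) / 100
    print(f"{cores} cores: {dt * 1e3:.2f} ms per multiply")
```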

 

The M4 compensates for measured atmospheric wave front aberrations, and NI engineers determined the problem could only be solved using a state-of-the-art, multicore blade system. Dell invited the team to test the solution on its M1000, a 16-blade system (Figure 7), and the test results were encouraging. Each of the M1000 blades features eight cores, so engineers distributed the LabVIEW control problem across 128 cores.

 

NI engineers proved that we can, in fact, use LabVIEW and the LabVIEW Real-Time Module to implement a COTS-based solution and control multicore computation for real-time results. Because of this performance breakthrough, our team continues to set benchmarks for both computer science and astronomy in E-ELT implementation, which will further scientific advancements as a whole.

 

For more information on this case study, please contact:

European Southern Observatory

Karl-Schwarzschild-Strasse 2

D-85748 Garching bei München

Tel: +49 89 320060

Fax: 3202362

E-mail: information@eso.org

For more information on LabVIEW for HPC applications, please contact:

Jeff Meisel, LabVIEW Product Manager

Tel: (512) 683-8795

Figure 1: For a size comparison, two humans and a car stand next to the E-ELT. The M1 primary mirror, 42 m in diameter, features segmented mirror construction.
Figure 2: The E-ELT features a total of five mirrors.
Figure 3: LabVIEW software controls the M1 system, composed of 984 segments, each about 1.5 m across with six sensors and three actuator legs that provide three degrees of freedom of movement.
Figure 4: A thin, flexible mirror spread across 8,000 actuators, the M4 can be deformed every few milliseconds to compensate for atmospheric interference.
Figure 5: NI engineers validated the mirror control system (right) with the M1 mirror HIL simulation (left).
Figure 6: To achieve the required loop rates, NI engineers set up a highly deterministic network and called it from an application using timed sequences and timed loops.
Figure 7: The current NI approach to implementing M4. The problem is approximately 15 times more demanding than the M1 controller.