David Catto - Austin Consultants Ltd
Natalie Latham - Zoological Society of London, Interpretation Developer
Zoological Society of London
Founded in 1826, the Zoological Society of London (ZSL) is an international scientific, conservation, and educational charity. As part of the unveiling of a new £2m Centre for Elephant Care at the zoo’s Whipsnade site, ZSL wanted to further enhance public understanding of these magnificent creatures and promote the conservation of the endangered Asian elephant through a demonstration that visualises how elephants communicate. Some elephant communication takes place through grunts and grumbles that fall below the human hearing range. Many of these noises sit within the infrasound spectrum, which elephants, with their sensitive hearing, can pick up but people cannot.
Austin Consultants is an NI Alliance Partner. The company delivers innovative and flexible client-focused engineering design, development, integration, and test solutions to ensure clients fulfill their business and technical goals. From strategy, software development, hardware design, build and integration, to data analytics and machine learning, the company’s smart end-to-end services provide bespoke support at each stage of the product life cycle.
Walk With the Elephants, Talk With the Elephants
Austin Consultants and ZSL agreed to use a sensitive microphone, capable of recording sound down to 1–2 Hz, to capture the noises being made by the elephants. We could then stream live data to a projected display alongside the viewing platform in the centre, visualising and explaining live elephant communications to visitors. The system would also play back pre-recorded communications to ensure that all zoo visitors could experience and understand how the elephants communicate, even when the elephants were out of microphone range.
We also needed to meet a tight deadline for project delivery because of a royal visit by Her Majesty Queen Elizabeth II, accompanied by His Royal Highness The Duke of Edinburgh, to officially unveil the elephants’ new home.
Data Acquisition and Signal Processing
Noise performance and sensitivity were critical aspects of this project, so our team selected LabVIEW and the NI-9250 input module in a CompactDAQ chassis to manage signal processing and data acquisition with prebuilt functions.
We used the NI-9250 to measure signals from the Integrated Electronics Piezoelectric (IEPE) microphones, delivering a wide dynamic range with software-selectable AC/DC coupling and IEPE signal conditioning. When used with NI software, this module provides processing functionality for condition monitoring, including frequency analysis. The PCB Piezotronics IEPE microphone we chose had a working range extending below 2 Hz and a flat frequency response across this infrasound range, making analysis slightly easier within the short timescales. The features of the NI-9250, combined with the ease and speed of configuring LabVIEW to the project requirements, made both products a natural choice.
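Our acquisition was configured in LabVIEW, but as a rough illustration the same kind of IEPE channel setup can be sketched with NI's nidaqmx Python API. The chassis/module name, microphone sensitivity, and sound-pressure level below are placeholders, not our actual settings:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Hypothetical IEPE microphone channel on an NI-9250 in a CompactDAQ chassis.
with nidaqmx.Task() as task:
    task.ai_channels.add_ai_microphone_chan(
        "cDAQ1Mod1/ai0",            # placeholder physical channel name
        mic_sensitivity=50.0,       # mV/Pa -- placeholder, from the mic's datasheet
        max_snd_press_level=120.0,  # dB -- placeholder
        current_excit_val=0.004,    # 4 mA IEPE excitation current
    )
    # Continuous acquisition at 25 kS/s, matching the rate described below.
    task.timing.cfg_samp_clk_timing(25_000.0, sample_mode=AcquisitionType.CONTINUOUS)
    samples = task.read(number_of_samples_per_channel=25_000)  # one second of data
```

Configuring the channel as a microphone (rather than a raw voltage input) lets the driver apply the IEPE excitation and sensitivity scaling so readings come back in pascals.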
We mounted the CompactDAQ system and microphone within the elephant centre, keeping acquisition close to the source to reduce additional electrical noise. We acquired the signals at a sample rate of 25 kHz, fully utilising the anti-aliasing filtering and differential inputs of the NI-9250. We processed and logged the data using code written with LabVIEW on an industrial Windows PC positioned in a plant room within the elephant centre, close to the viewing platform and display.
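To illustrate the kind of frequency analysis involved, the sketch below correlates a synthetic signal against a single frequency to detect an infrasonic component hidden under an audible tone. The signal, rates, and frequencies are invented for illustration; our actual processing was written in LabVIEW and used full spectral analysis rather than single-bin probes:

```python
import math

def bin_magnitude(samples, sample_rate, freq_hz):
    """Magnitude of one DFT bin (single-frequency correlation).

    Enough to probe an individual frequency; a real system would use an FFT.
    """
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq_hz * i / sample_rate)
             for i, s in enumerate(samples))
    return 2 * math.hypot(re, im) / n

rate = 1000  # illustrative rate; the real system acquired at 25 kHz
# One second of synthetic signal: a 14 Hz infrasonic "rumble" under a 440 Hz tone.
signal = [math.sin(2 * math.pi * 14 * t / rate)
          + 0.5 * math.sin(2 * math.pi * 440 * t / rate)
          for t in range(rate)]
rumble = bin_magnitude(signal, rate, 14.0)   # strong infrasound present
tone = bin_magnitude(signal, rate, 440.0)    # the audible component
quiet = bin_magnitude(signal, rate, 7.0)     # no energy at this frequency
```

The 14 Hz component is inaudible to people but stands out clearly in the spectrum, which is exactly what the display makes visible.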
Web-Based Data Display and Migrating to LabVIEW NXG
Migrating our application was a relatively simple process, as we used the built-in migration tool in LabVIEW NXG. We simply supplied the location of our existing project to create a new LabVIEW NXG version. The system generated a conversion report telling us what did and did not migrate correctly, so we could quickly make minor changes to the code and fix any issues.
Austin Consultants’ user experience and web design specialists worked with ZSL to understand the requirements for the display interface, ensuring it fulfilled not only the educational and informational requirements but also the accessibility and cognitive needs of the wide range of zoo visitors.
After reviewing many interface options, we identified a real-time frequency spectrum with approximately 10 seconds of history as the most effective and interesting. The public can hear audible noises, then turn and look at the display to identify any inaudible sounds that may have been made at the same time. The final display runs within a full-screen Chromium browser and harnesses advanced web technology, including 3D WebGL rendering and visualisation engines.
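The scrolling-history idea behind the display can be sketched as a fixed-length rolling buffer of spectrum frames. Class and parameter names here are illustrative, not taken from the actual web application:

```python
from collections import deque

class SpectrumHistory:
    """Fixed-length rolling window of spectrum frames for a scrolling display.

    With one frame pushed per second, frames=10 keeps roughly 10 seconds of
    history on screen.
    """

    def __init__(self, frames=10):
        self._frames = deque(maxlen=frames)

    def push(self, spectrum_frame):
        # The oldest frame drops off automatically once the buffer is full.
        self._frames.append(spectrum_frame)

    def window(self):
        # Frames ordered oldest to newest, ready for rendering.
        return list(self._frames)

history = SpectrumHistory(frames=10)
for second in range(15):
    history.push([float(second)])  # stand-in for a real spectrum frame
```

A `deque` with `maxlen` keeps the memory footprint constant however long the display runs, which matters for an installation that stays on all day.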
Engage and Inform
We faced several challenges in designing this system. We were working in a niche area of the audio spectrum and distilling complex information into a user-friendly, exciting, and inspiring display. Many people have commented on how interesting it is to just stand and watch the display!
The infrasound region is full of other noises such as the movement of buildings, low-end engine noise, and air conditioning systems. Our processing helps remove some of these distracting noises, but deliberately not all of them, to give the public a reference even when the elephants are feeling a little shy.
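One simple way to attenuate known steady interference tones, such as a building's air conditioning hum, while leaving the rest of the spectrum intact is to scale down only the affected bins. This is an illustrative sketch, not our actual LabVIEW processing:

```python
def suppress_interference(spectrum, bin_hz, interference_hz, width_hz=1.0, factor=0.1):
    """Scale down spectrum bins near known steady interference tones.

    spectrum        -- magnitude per bin, where bin k covers k * bin_hz
    interference_hz -- centre frequencies of tones to attenuate (e.g. HVAC hum)
    factor          -- residual gain, so the tones are reduced rather than erased
    """
    out = list(spectrum)
    for centre in interference_hz:
        for k in range(len(out)):
            if abs(k * bin_hz - centre) <= width_hz:
                out[k] *= factor
    return out

# Flat illustrative spectrum with 1 Hz bins; attenuate a known 12 Hz building tone.
cleaned = suppress_interference([1.0] * 30, bin_hz=1.0, interference_hz=[12.0])
```

Keeping `factor` above zero reflects the design decision described above: the background is reduced, not removed, so the display never goes completely blank.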
The new approach with LabVIEW helped ZSL harness technology to deliver new insights and promote scientific understanding of the elephants and their behaviour. As part of the royal visit to officially unveil the centre, the Duke of Edinburgh was very interested to learn about the technology used.
Applying Machine Learning to Interpret the Elephant Noises
To further understand the elephant noises, the Austin Consultants machine learning team are currently working to develop an algorithm that can identify certain types of calls, patterns, and maybe even specific individual elephants. The longer-term ambition of this exciting project is to attempt to interpret these calls, for example, by associating them with activities and behaviours. This could even help us determine when an elephant is ill, but not displaying any visible symptoms.
This also has wide applications for industry. For example, we can apply machine learning algorithms to condition monitoring and predictive maintenance of industrial equipment. By analysing typical models of operation, we can use changes that fall outside normal parameters to provide an early indication of a potential fault, highlighting a maintenance requirement before the equipment fails, reducing downtime, and cutting costs.
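As a minimal sketch of that idea (under our own simplifying assumptions, not the algorithm the team is building), a baseline model of normal operation can be as simple as the mean and standard deviation of a monitored feature, with readings beyond a few standard deviations flagged for maintenance. All values below are invented:

```python
import math

def train_baseline(feature_history):
    """Mean and standard deviation of a healthy-operation feature (e.g. band energy)."""
    n = len(feature_history)
    mean = sum(feature_history) / n
    var = sum((x - mean) ** 2 for x in feature_history) / n
    return mean, math.sqrt(var)

def is_anomalous(value, mean, std, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the baseline."""
    return abs(value - mean) > threshold * std

# Illustrative healthy vibration-energy readings from a machine.
healthy = [1.0, 1.1, 0.9, 1.05, 0.95, 1.0, 1.08, 0.92]
mean, std = train_baseline(healthy)
```

A reading such as `is_anomalous(1.9, mean, std)` then flags a potential fault well before it develops into a failure, while readings within the normal band pass quietly.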
Thanks to LabVIEW and CompactDAQ, we could quickly create a system to record infrasonic noises from the ZSL elephants and display the data to zoo visitors in an engaging way to promote science and wildlife conservation. With the new tools in LabVIEW NXG, we could quickly move all data acquisition, analysis, and control within the LabVIEW environment. We are now finalising the migrated system and hope to deploy it in the near future. Maybe, with the help of machine learning, one day we can improve the welfare of elephants in captivity.
Austin Consultants Ltd
Tel: 0800 772 0795