Microwave ovens aren’t the only things cooking in the 2.4 GHz spectrum. RF signals coming from WiFi devices, Bluetooth devices, ZigBee devices, cordless phones, wireless game pads, toys, PC peripherals, wireless audio devices, and more make the 2.4 GHz frequency band a very crowded kitchen. Designing devices to function in this spectrum raises many challenges. A Bluetooth device, for example, not only has to fight for detection against WiFi and ZigBee, but it also faces challenges such as power consumption and demands for higher data throughput. Figure 1 shows the 2.4 GHz spectrum with only Bluetooth, ZigBee, and WiFi channels.
1. Bluetooth Low Energy
Consider the history of Bluetooth and its fight for recognition in this spectrum. Since its inception, the standard has tried to avoid interference from other signals using the following tactics:
■ 2003—Bluetooth 1.2 with adaptive frequency hopping (AFH) is released so Bluetooth channels can avoid interference from other standards in the 2.4 GHz band.
■ 2004—Bluetooth 2.0 with enhanced data rate (EDR) is released and achieves 2.1 Mbit/s data rates.
■ 2007—Bluetooth 2.1 + EDR is released and allows secure simple pairing. One billion Bluetooth chips are shipped.
■ 2009—Bluetooth HS (3.0) is released and achieves 24 Mbit/s, making it usable for higher rate data streaming. This release also opens the use of 802.11b/g radios for higher speed data.
With Bluetooth 4.0, the most recent version of the standard, comes Bluetooth Low Energy (BLE), originally called Wibree. BLE devices are expected to consume a fraction of the power that classic Bluetooth products consume. Because BLE chips spend most of their time asleep, BLE devices should last more than a year on a button cell battery without recharging. Even sending data takes only a few milliseconds, compared to the hundreds of milliseconds taken by classic Bluetooth.
BLE uses low duty cycles and is optimized to operate in small bursts so the devices consume less power. It offers the following advantages over classic Bluetooth:
■ More intelligent controllers that can keep the devices asleep for longer time periods
■ Ultralow-duty cycles that can be adjusted down to 0.1 percent (compared to 1 percent in classic Bluetooth)
■ Multivendor interoperability
■ Low cost and small size
Figure 1. WLAN, BLE, and ZigBee, among other standards, all occupy the 2.4 GHz spectrum.
2. Channels for Coexistence in the 2.4 GHz Spectrum
Another important change is that BLE uses just 40 channels, while classic Bluetooth uses 79; each BLE channel is 2 MHz wide compared to 1 MHz for classic Bluetooth. As you can imagine, Bluetooth needs to be able to differentiate its signals from others in the same band. Three of the 40 channels, known as “advertising channels,” are positioned in the gaps between the most commonly used WLAN channels, which allows easier device discovery and connection. Bluetooth devices use these channels to broadcast their presence and search for other Bluetooth devices. This enables Bluetooth signals to coexist with WLAN signals.
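The channel arrangement above can be sketched in a few lines of Python. BLE RF channel k (0–39) is centered at 2402 + 2k MHz, and the three advertising channels sit at 2402, 2426, and 2480 MHz; everything else here follows directly from those figures.

```python
# BLE RF channel map (Bluetooth 4.0): 40 channels, each 2 MHz wide,
# starting at 2402 MHz. The advertising channels sit at 2402, 2426,
# and 2480 MHz, in the gaps around WLAN channels 1, 6, and 11.
ADVERTISING_FREQS_MHZ = {2402, 2426, 2480}

def ble_channel_freq_mhz(rf_channel: int) -> int:
    """Center frequency of a BLE RF channel (0-39)."""
    if not 0 <= rf_channel <= 39:
        raise ValueError("BLE defines 40 RF channels (0-39)")
    return 2402 + 2 * rf_channel

# RF channel indices of the advertising channels; the link layer
# numbers these same channels 37, 38, and 39.
advertising = [ch for ch in range(40)
               if ble_channel_freq_mhz(ch) in ADVERTISING_FREQS_MHZ]
print(advertising)  # [0, 12, 39]
```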
3. New Test Requirements
BLE has new testing requirements such as “dirty packets” for sensitivity testing, and PER testing instead of the classic Bluetooth BER testing. The dirty packet concept introduces nonideal packets, changing the carrier frequency offset, modulation index, and symbol timing error to specific value combinations (described in the test specification) every 50 packets. In addition, a frequency drift is added to the signal characteristics. Dirty packets account for a 1 dB to 2 dB difference in the receiver input power, which results in a difference in the number of devices that pass or fail during receiver testing.
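The every-50-packets rotation can be sketched as a simple lookup. The impairment values below are illustrative placeholders, not the actual parameter table from the RF-PHY test specification; only the rotation scheme is taken from the text above.

```python
# Sketch of the dirty-packet scheme: the impairment combination applied
# to the test signal changes every 50 packets. The numeric values here
# are illustrative placeholders, NOT the table from the Bluetooth
# RF-PHY test specification.
DIRTY_COMBOS = [
    {"carrier_offset_khz": 50,  "mod_index": 0.45, "symbol_timing_ppm": 20},
    {"carrier_offset_khz": -50, "mod_index": 0.55, "symbol_timing_ppm": -20},
    {"carrier_offset_khz": 100, "mod_index": 0.50, "symbol_timing_ppm": 50},
]

def impairments_for_packet(packet_index: int) -> dict:
    """Impairment combination for a packet, rotating every 50 packets."""
    return DIRTY_COMBOS[(packet_index // 50) % len(DIRTY_COMBOS)]

# Packets 0-49 share one combination, packets 50-99 the next, and so on.
```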
Figure 2. Advertising channels allow Bluetooth devices to connect to each other even in the presence of WLAN signals.
The PER test requirement for BLE is less than 30.8 percent measured over at least 1,500 packets. For the reference test packet, this equates to a BER of 0.1 percent (the figure used in classic Bluetooth testing).
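The PER and BER figures are linked by the usual assumption of independent bit errors: PER = 1 − (1 − BER)^n for an n-bit packet. The 368-bit length below is chosen because it reproduces the 30.8 percent figure; consult the RF-PHY test specification for the exact reference packet format.

```python
# Relating PER to BER assuming independent bit errors.
def per_from_ber(ber: float, packet_bits: int) -> float:
    """PER = 1 - (1 - BER)**n for an n-bit packet."""
    return 1.0 - (1.0 - ber) ** packet_bits

# A BER of 0.1 percent over a 368-bit packet yields ~30.8 percent PER.
per = per_from_ber(0.001, 368)
print(f"{per:.1%}")  # 30.8%
```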
4. Software-Based Test Instruments
NI offers toolkits for Bluetooth, WLAN, and even cellular standards that you can use with any NI RF vector signal analyzer (VSA), vector signal generator (VSG), or vector signal transceiver (VST). The flexibility to use or upgrade hardware irrespective of the software environment is a valuable asset to test engineers who deal with the changing requirements of these RF protocols. Furthermore, engineers can upgrade test equipment to accommodate higher bandwidths, frequency ranges, or performance while maintaining their software stack. Figure 3 shows the NI PXIe-5644R VST being used with the NI Bluetooth Toolkit, which has an API that works with LabVIEW, NI LabWindows™/CVI, NI Measurement Studio, and Microsoft Visual Studio.
5. Test Equipment for Low-Power RF Signals
Designers of low-power RF receivers and test equipment must be aware of interfering signals. When selecting test equipment, give careful consideration to specifications such as adjacent channel rejection. You can use preselectors and IF filter banks to ensure receivers can detect small signals in the presence of larger ones. An IF filter bank combined with amplifiers and attenuators in the IF chain efficiently rejects large signals and adjusts the power of the signal reaching the A/D converter, making it possible to detect and demodulate the signal of interest. Without IF filters, an unwanted large signal would saturate the A/D converter.
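A back-of-the-envelope model of the IF chain illustrates the saturation problem. The gain, rejection, and full-scale numbers below are illustrative assumptions, not specifications of any particular instrument.

```python
# Sketch (illustrative numbers, not real instrument specs): check
# whether a strong adjacent-channel blocker reaches the A/D converter
# above full scale after the IF filter and gain stages.
ADC_FULL_SCALE_DBM = 0.0  # hypothetical converter full scale

def level_at_adc_dbm(input_dbm: float, if_gain_db: float,
                     filter_rejection_db: float = 0.0) -> float:
    """Signal level reaching the ADC, in dBm."""
    return input_dbm + if_gain_db - filter_rejection_db

desired = level_at_adc_dbm(-80.0, if_gain_db=60.0)       # in-band signal
blocker = level_at_adc_dbm(-10.0, if_gain_db=60.0,
                           filter_rejection_db=70.0)     # filtered blocker
print(desired, blocker)  # -20.0 -20.0, both within range

# Without the IF filter, the same blocker would exceed full scale.
unfiltered = level_at_adc_dbm(-10.0, if_gain_db=60.0)
assert unfiltered > ADC_FULL_SCALE_DBM  # would saturate the ADC
```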
6. A Software-Defined Solution
NI provides not only out-of-the-box software for testing standards in the 2.4 GHz spectrum but also optimized hardware for detecting low-power RF signals in the presence of higher power saturating signals. With the software-defined nature of NI’s solution, you can test your devices to the required specifications defined by the standard, as well as tweak tests to see how your device reacts to real-world signals.
For more details, visit ni.com/rf.
Automated Test Product Marketing Manager
This article first appeared in the Q1 2013 issue of Instrumentation Newsletter.