This article describes the following specifications applicable to both RF generation and analysis:
- Frequency range
- Instantaneous, or real-time, bandwidth
- Tuning speed
- Phase noise
- Voltage standing wave ratio (VSWR)
Note: All RF devices are subject to the same design rules as RF instrumentation.
2. Frequency Range
Frequency range is arguably the most important characteristic of RF instruments. For example, a WiFi test solution requires operation at frequencies up to 2.5 GHz. Similarly, when analyzing a component that operates at 900 MHz, the instrument must cover that frequency range to be useful. A number of components can affect the maximum frequency range of an RF instrument, including mixers, input filters, and local oscillators (LOs). However, configuring the instrument to work at a specific frequency is accomplished mainly by tuning the LO. Some instruments use multiple LOs in series, but the simplified instrument block diagram shown in Figure 2 uses a single LO.
The LO output is mixed with the RF input to convert the RF signal down to an intermediate frequency (IF) signal. The same frequency synthesis techniques also apply to RF signal generators.
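As a minimal sketch of this downconversion step, the helpers below model an ideal mixer, which produces the sum and difference of its two input frequencies; the function names and the 21.4 MHz IF value are illustrative choices, not from the article:

```python
def lo_for_downconversion(f_rf_hz, f_if_hz, high_side=True):
    """Illustrative helper: the LO frequency that mixes an RF input
    down to the desired IF (difference product of an ideal mixer)."""
    # High-side injection places the LO above the RF; low-side, below.
    return f_rf_hz + f_if_hz if high_side else f_rf_hz - f_if_hz

def mixer_products(f_rf_hz, f_lo_hz):
    """An ideal mixer produces the sum and difference frequencies."""
    return f_rf_hz + f_lo_hz, abs(f_rf_hz - f_lo_hz)

# Tune a 900 MHz input down to a 21.4 MHz IF (example values only).
lo = lo_for_downconversion(900e6, 21.4e6)   # 921.4 MHz
total, diff = mixer_products(900e6, lo)
print(lo, diff)                             # difference product is the IF
```

The sum product (here around 1.82 GHz) is rejected by the IF filter; only the difference product reaches the ADC.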
Frequency synthesis is accomplished using either a voltage-controlled oscillator (VCO) or an yttrium iron garnet (YIG) oscillator. Historically, RF instruments used a YIG-based architecture to generate the LO. The YIG is a current-controlled oscillator known for its low phase noise and wide frequency range (up to 20 GHz or higher). However, YIG-based instruments consume significant power and can be quite costly. In addition, tuning a YIG from one frequency to the next requires longer tuning times than other methods. As a result, VCO-based LO architectures have recently become more common. The VCO has a smaller frequency range than the YIG, but its tuning speed is much faster.
3. Instantaneous (Real-Time) Bandwidth
The term instantaneous, or real-time, bandwidth describes the maximum continuous RF bandwidth that an instrument can generate or acquire. For instance, a vector signal generator might generate a signal at a center frequency of 2.45 GHz, but the instantaneous bandwidth of the instrument, or signal bandwidth, might be only 20 MHz wide. This means the instrument can continuously generate (or, for an analyzer, acquire) 20 MHz of RF spectrum without re-tuning the LO.
Instantaneous bandwidth is largely determined by the RF analog front end of the instrument. To better understand the instantaneous bandwidth specification, it is helpful to understand the basic architecture of an RF instrument. Current technology cannot digitize every signal in the gigahertz range. Thus, RF instruments use a series of LOs, mixers, and filters to bring an RF signal into an IF or baseband frequency range. Figure 2 shows the block diagram of a highly simplified vector signal analyzer.
In Figure 2, the vector signal analyzer downconverts a portion of the RF spectrum to an IF that is recognizable to an ADC. The instantaneous bandwidth of an RF instrument is determined by the following two main components:
- Filters implemented in the instrument
- Sample rate and bandwidth of the ADC
The relative importance of an instrument's instantaneous bandwidth largely depends on the application. For example, generating a narrowband FM signal requires only 200 kHz of instantaneous bandwidth. However, generating and analyzing wideband signals, such as IEEE 802.11g (WiFi) signals, requires at least 20 MHz of instantaneous bandwidth. Other applications, such as spectral mask testing, complete more quickly when the instantaneous bandwidth is significantly wider than the signal of interest. If a spectral mask test requires more instantaneous bandwidth than the instrument provides, the instrument must be re-tuned to acquire the frequency information in sections.
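The cost of acquiring a spectral mask in sections can be sketched with a simple count of the required acquisitions; the 100 MHz mask span below is a hypothetical example, not a value from the article:

```python
import math

def acquisitions_needed(mask_span_hz, instantaneous_bw_hz):
    """Number of LO retunes/acquisitions needed to cover a spectral
    mask span that exceeds the instrument's instantaneous bandwidth."""
    return math.ceil(mask_span_hz / instantaneous_bw_hz)

# Example: a 100 MHz mask span measured with 20 MHz of instantaneous
# bandwidth must be stitched together from five separate acquisitions.
print(acquisitions_needed(100e6, 20e6))  # 5
```

Each extra acquisition adds an LO retune and settling period, which is why wider instantaneous bandwidth speeds up mask testing.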
4. Tuning Speed
Tuning speed measures the length of time required for the LO to change from one center frequency to another within a specified accuracy level. When tuning an oscillator to a different frequency, the LO settling time dictates the tuning speed.
In typical systems, when tuning from one frequency to another, the LO slightly overshoots the desired frequency and then settles to it within a certain time period. In most cases, tuning speed is a function of the frequency step size: the greater the frequency step, the longer it takes the LO to settle to within the specified accuracy. Table 1 illustrates the settling time for a YIG-based LO.
Table 1. Tuning Speed of YIG-Based LO
Tuning speed is an important specification in applications such as automated production test of an 802.11g transceiver. Because the 802.11g standard specifies that devices operate on one of 14 channels between 2.4 GHz and 2.48 GHz, RF instruments must test device operation across a variety of frequencies. The quicker the test signal sweeps from one channel to the next, the quicker the receiver is tested.
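The impact of LO settling time on total test time can be sketched as follows; the settling and measurement times are hypothetical values chosen for illustration:

```python
def sweep_time_s(n_channels, settle_time_s, measure_time_s):
    """Total test time for stepping across channels: each step pays the
    LO settling time plus the per-channel measurement time."""
    return n_channels * (settle_time_s + measure_time_s)

# Hypothetical numbers: 14 channels, 10 ms of measurement per channel.
slow = sweep_time_s(14, 5e-3, 10e-3)    # 5 ms settling  -> 0.21 s total
fast = sweep_time_s(14, 0.5e-3, 10e-3)  # faster LO cuts total test time
print(slow, fast)
```

In high-volume production test, shaving milliseconds off LO settling multiplies across every channel and every unit tested.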
5. Phase Noise
Phase noise describes the short-term frequency stability of an RF instrument. Phase noise is caused by small, instantaneous LO phase jitter and results in signal power at frequencies adjacent to the carrier.
An easy way to visualize the effects of phase noise is to analyze a single tone in the frequency domain. Figure 3 represents two simulated carriers—one ideal carrier and the other carrier with phase noise.
The left plot in Figure 3 illustrates single tone generation, which ideally results in a single peak of power concentrated at a very precise frequency. A slightly different result is shown in the right plot where phase noise (essentially time-domain jitter) results in a slight periodic spreading of the signal in the frequency domain.
Phase noise is characterized by measuring the signal amplitude at various offsets from the desired carrier. In the right plot of Figure 3, the measured phase noise is –95 dBc at a 1 kHz offset and –146 dBc at a 10 kHz offset.
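A common use of such spot measurements is to estimate integrated phase noise and RMS jitter by integrating the single-sideband phase noise over the measured offset range. The sketch below assumes the spot values are expressed in dBc/Hz (the usual normalization) and uses simple trapezoidal integration; the 2.45 GHz carrier is an illustrative choice:

```python
import math

def rms_jitter_s(offsets_hz, l_dbc_hz, f_carrier_hz):
    """Estimate RMS jitter by trapezoidally integrating single-sideband
    phase noise L(f), given in dBc/Hz, over the measured offset range.
    The SSB integral is doubled to account for both sidebands."""
    lin = [10 ** (l / 10.0) for l in l_dbc_hz]   # dBc/Hz -> power ratio/Hz
    area = sum((lin[i] + lin[i + 1]) / 2.0 * (offsets_hz[i + 1] - offsets_hz[i])
               for i in range(len(offsets_hz) - 1))
    phi_rms = math.sqrt(2.0 * area)              # RMS phase error, radians
    return phi_rms / (2.0 * math.pi * f_carrier_hz)

# Spot values loosely modeled on the figure: -95 dBc/Hz at a 1 kHz offset
# and -146 dBc/Hz at 10 kHz, for a hypothetical 2.45 GHz carrier.
print(rms_jitter_s([1e3, 10e3], [-95.0, -146.0], 2.45e9))
```

With only two spot points this is a coarse estimate; real characterization integrates many points over a decade-spanning offset range.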
The significance of RF instrument phase noise varies from one application to the next. Low phase noise is required when detecting low-level blocker signals close to a particular signal of interest. When using an LO with significant phase noise, that phase noise is transferred to the resulting IF signal. Figure 4 shows how LO phase noise translates to phase noise in the resulting IF signal.
In this particular application, the phase noise of the two signals interferes with one another, making it more difficult to identify the specific blocker signal characteristics.
Visualizing demodulation of a signal with a constellation plot is another way to illustrate phase noise effects. A signal with significant phase noise shows slight periodic rotations of the constellation plot. Figure 5 compares an ideal 4-phase-shift-keying (4-PSK) signal, whose four transmitted symbols are represented by black dots (left plot), to a signal with significant phase noise (right plot).
Phase noise affects actual measurements by degrading the error vector magnitude (EVM) performance of an RF instrument. For bit error rate (BER) tests, phase noise actually contributes to higher error rates.
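The link between phase noise and EVM can be sketched by rotating ideal 4-PSK symbols with a small random phase error and computing the resulting RMS EVM; the 0.05 rad RMS phase jitter is an illustrative value, not from the article:

```python
import cmath
import math
import random

def evm_rms_percent(ideal, measured):
    """RMS error vector magnitude as a percentage of the RMS
    reference symbol magnitude."""
    err = sum(abs(m - i) ** 2 for i, m in zip(ideal, measured))
    ref = sum(abs(i) ** 2 for i in ideal)
    return 100.0 * math.sqrt(err / ref)

random.seed(0)
# Ideal 4-PSK constellation points on the unit circle (45/135/225/315 deg).
points = [cmath.exp(1j * math.pi * (2 * k + 1) / 4) for k in range(4)]
ideal = [random.choice(points) for _ in range(1000)]
# Model phase noise as a small random rotation of each received symbol
# (0.05 rad RMS is an assumed illustrative value).
noisy = [s * cmath.exp(1j * random.gauss(0.0, 0.05)) for s in ideal]
print(f"EVM ~ {evm_rms_percent(ideal, noisy):.1f}%")
```

For small phase errors the EVM approximately equals the RMS phase jitter in radians, so 0.05 rad of jitter yields roughly 5% EVM; larger rotations eventually push symbols across decision boundaries and raise the bit error rate.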
6. Voltage Standing Wave Ratio (VSWR)
Voltage standing wave ratio (VSWR) is closely related to transmission line theory and becomes more important as the frequency range of an instrument increases. At a high level, VSWR measures signal reflections that occur as a result of impedance mismatch along a transmission line.
In a perfect world, the impedance of an RF instrument (typically 50 Ω) matches the impedance of each of the cables and the input impedance of the device under test. However, various imperfections such as asymmetric signal traces and part-to-part component variation alter the characteristic instrument impedance. As a result, signal reflections occur in the RF transmission and affect the amplitude and phase accuracy of the signal.
The signal reflection amplitude depends both on the properties of the materials used and on the frequency range. VSWR results directly from impedance mismatch in the transmission line and is generally more problematic at higher frequencies. For example, a VSWR of 1:1 represents a perfectly matched system. By contrast, a VSWR of 1.1:1 means that about 5 percent of the signal amplitude is reflected in the transmission line.
Because VSWR is dependent on material properties as well, its value can be calculated from the magnitude of the reflection coefficient, Γ, as shown in the following equation:

VSWR = (1 + |Γ|) / (1 − |Γ|)
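As a minimal sketch, the standard relation VSWR = (1 + |Γ|) / (1 − |Γ|), its inverse, and the reflection coefficient of a load against a 50 Ω reference can be computed as follows; the 55 Ω load value is illustrative only:

```python
def vswr_from_gamma(gamma_mag):
    """VSWR from the reflection coefficient magnitude |Γ|."""
    return (1.0 + gamma_mag) / (1.0 - gamma_mag)

def gamma_from_vswr(vswr):
    """Reflection coefficient magnitude from VSWR (inverse relation)."""
    return (vswr - 1.0) / (vswr + 1.0)

def gamma_from_impedances(z_load, z0=50.0):
    """|Γ| for a load impedance against a reference impedance
    (50 ohms is the typical RF instrument impedance)."""
    return abs((z_load - z0) / (z_load + z0))

# Example: a 55 ohm load on a 50 ohm line reflects about 4.8% of the
# signal voltage, corresponding to a VSWR of about 1.1:1.
g = gamma_from_impedances(55.0)
print(g, vswr_from_gamma(g))
```

Running the example confirms the figure quoted above: a VSWR of 1.1:1 corresponds to |Γ| ≈ 0.048, i.e., roughly 5 percent of the signal amplitude reflected.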
VSWR substantially affects a test signal because it perturbs the phase and amplitude of the signal. Moreover, the generated signal amplitude either increases or decreases depending on the phase of the VSWR reflection. Figure 6 illustrates how VSWR reflections affect signal amplitude.
A reflection that is out of phase with the original signal causes a slight canceling effect; the resulting composite signal in Figure 6 shows slightly reduced amplitude. In most cases, VSWR is reduced through the use of internal or external attenuators. Thus, increasing the instrument reference level, which engages internal attenuation, reduces VSWR.
VSWR is an important specification because it significantly affects the amplitude accuracy of the instrument. Some applications, such as RF filter characterization, require the highest amplitude accuracy possible. Because an RF filter is characterized by measuring the amplitude loss according to the stimulus signal frequency, amplitude accuracy of both the stimulus signal and analysis instrument is paramount.
Understanding RF Instrument Specifications Part 1 provides basic information about relevant RF specifications. Remember that many of the specifications apply to all RF devices and not just to instruments. Thus, you will likely encounter some of the same specifications in your own designs. The next article in this three-part series explains the specifications used to characterize RF generators including frequency tolerance, linearity, power output, 1 dB compression point, and third-order intercept.
Refer to the National Instruments RF Developer Network for more information about making RF measurements.