This article describes the specifications listed below. These specifications are applicable to both RF generation and analysis.
- Frequency range
- Instantaneous (Real-Time) bandwidth
- Tuning speed
- Phase noise
- Voltage standing wave ratio (VSWR)
Note: All RF devices are subject to the same design rules as RF instrumentation.
2. Frequency Range
Frequency range is an important characteristic of an RF instrument. For example, a WiFi test solution requires operation at frequencies of up to 2.5 GHz. Similarly, to analyze a component that operates at 900 MHz, the instrument must cover that frequency range to be useful. A number of components, such as mixers, input filters, and local oscillators (LOs), can affect the maximum frequency range of an RF instrument. However, the instrument is tuned to a specific frequency mainly by adjusting the LO. Some instruments use a series of LOs, but the simplified instrument block diagram shown in Figure 2 uses a single LO.
The LO output is mixed with the RF input to convert the RF signal down to an intermediate frequency (IF). The same frequency synthesis technique also applies to RF signal generators.
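The mixing arithmetic can be sketched in a few lines. This is an illustrative calculation, not instrument driver code; the 25 MHz IF and the high-side-LO convention are assumptions for the example, not values from the article.

```python
# Hypothetical sketch of the downconversion math behind Figure 2:
# the LO is tuned so that the RF band of interest lands on a fixed IF.
def lo_frequency(f_rf_hz: float, f_if_hz: float, high_side: bool = True) -> float:
    """Return the LO frequency that mixes f_rf down to f_if.

    high_side=True places the LO above the RF (f_LO = f_RF + f_IF),
    a common choice because it moves the image frequency further away.
    """
    return f_rf_hz + f_if_hz if high_side else f_rf_hz - f_if_hz

# Tuning a 2.45 GHz WiFi signal down to an assumed 25 MHz IF:
f_lo = lo_frequency(2.45e9, 25e6)   # 2.475 GHz (high-side LO)
f_if = abs(2.45e9 - f_lo)           # mixer difference product: 25 MHz
```

Changing the center frequency of the instrument then amounts to re-tuning `f_lo` while the IF chain stays fixed.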
Frequency synthesis is performed using either a voltage-controlled oscillator (VCO) or a yttrium iron garnet (YIG) oscillator. Historically, RF instruments used a YIG-based architecture to generate the LO. The YIG is a current-controlled oscillator known for its low phase noise and wide frequency range (up to 20 GHz or higher). However, YIG-based instruments consume significant power and are expensive. In addition, tuning a YIG from one frequency to another takes longer than other methods. Hence, VCO-based LO architectures have recently become more common. The VCO covers a smaller frequency range than the YIG but offers a greater tuning speed.
3. Instantaneous (Real-Time) Bandwidth
The term instantaneous, or real-time, bandwidth describes the maximum continuous RF bandwidth that an instrument can generate or acquire. For example, a vector signal generator might generate a signal at a center frequency of 2.45 GHz, but the instantaneous bandwidth of the instrument, or signal bandwidth, might only be 20 MHz wide. This means the instrument can continuously generate or acquire 20 MHz of RF spectrum without re-tuning the LO.
Instantaneous bandwidth is largely determined by the RF analog front end of the instrument. You can better understand the instantaneous bandwidth specification if you know the basic architecture of an RF instrument. Current technology cannot directly digitize every signal in the gigahertz range. Thus, RF instruments use a series of LOs, mixers, and filters to bring an RF signal into an IF or baseband frequency range. Figure 2 shows the block diagram of a highly simplified vector signal analyzer.
In Figure 2, the vector signal analyzer downconverts a portion of the RF spectrum to an IF that an ADC can digitize. The instantaneous bandwidth of an RF instrument is determined by two main components:
- Filters implemented in the instrument
- Sample rate and bandwidth of the ADC
The relative importance of the instantaneous bandwidth of an instrument is largely dependent on the application. For example, generating a narrow-band FM signal requires only 200 kHz instantaneous bandwidth. However, generating and analyzing wide-band signals, such as the IEEE Standard 802.11g (WiFi), requires at least 20 MHz of instantaneous bandwidth. Other applications, such as spectral mask testing, perform faster when the instantaneous bandwidth is significantly wider than the signal of interest. If the spectral mask test requires more instantaneous bandwidth than what the instrument provides, you must re-tune the instrument to acquire the frequency information in sections.
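The cost of a too-narrow instantaneous bandwidth can be made concrete with a quick count of re-tunes. The span and bandwidth figures below reuse the article's examples (20 MHz and 200 kHz); the 100 MHz mask span is an assumed value for illustration.

```python
import math

# Sketch: how many LO re-tunes are needed to cover a spectral-mask span
# that is wider than the instrument's instantaneous bandwidth.
def acquisitions_needed(span_hz: float, inst_bw_hz: float) -> int:
    """Each acquisition captures inst_bw_hz of spectrum without re-tuning."""
    return math.ceil(span_hz / inst_bw_hz)

# An assumed 100 MHz mask measured with 20 MHz of instantaneous bandwidth:
wide = acquisitions_needed(100e6, 20e6)    # 5 stitched acquisitions
# The same mask with a 200 kHz narrow-band front end:
narrow = acquisitions_needed(100e6, 200e3)  # 500 acquisitions -- far slower
```

Every additional acquisition adds an LO re-tune and settling period, which is why mask tests run faster on instruments whose instantaneous bandwidth comfortably exceeds the signal of interest.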
4. Tuning Speed
Tuning speed measures the amount of time required for the LO to change from one center frequency to another within a specified accuracy level. When tuning an oscillator to a different frequency, the LO settling time dictates the tuning speed.
In typical systems, when tuning from one frequency to another, the LO usually slightly overshoots the desired frequency and then settles to it within a certain time period. In most cases, the tuning speed is a function of the frequency step size: the larger the frequency step, the longer the LO takes to settle to within the specified accuracy. Table 1 illustrates the settling time for a YIG-based LO.
Table 1. Tuning Speed of YIG-Based LO
Tuning speed is an important specification in applications such as automated production test of 802.11g transceivers. Because the 802.11g standard specifies that devices must operate on one of 14 channels with center frequencies between 2.412 GHz and 2.484 GHz, RF instruments must test device operation across a variety of frequencies. The faster the test signal sweeps from one channel to the next, the faster the receiver is tested.
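A rough test-time budget shows why settling time dominates this kind of sweep. The settling and dwell times below are illustrative assumptions, not figures from Table 1.

```python
# Back-of-envelope sketch (all numbers assumed): total time to step a test
# signal across the 14 channels of an 802.11g production test.
def sweep_time_s(n_channels: int, settle_s: float, dwell_s: float) -> float:
    """Time to step through n_channels, settling and then measuring on each."""
    return n_channels * (settle_s + dwell_s)

# Assumed 5 ms YIG-style settling vs 0.5 ms VCO-style settling,
# with 1 ms of measurement dwell per channel:
yig_sweep = sweep_time_s(14, 5e-3, 1e-3)    # 0.084 s per sweep
vco_sweep = sweep_time_s(14, 0.5e-3, 1e-3)  # 0.021 s per sweep
```

Multiplied over thousands of units on a production line, the faster-settling LO translates directly into higher test throughput.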
5. Phase Noise
Phase noise describes the short-term frequency stability of an RF instrument. Phase noise is caused by small, instantaneous LO phase jitter and results in signal power at frequencies adjacent to the carrier.
A simple way to visualize the effects of phase noise is to analyze a single tone in the frequency domain. Figure 3 represents two simulated carriers, an ideal carrier and a carrier with phase noise.
The left plot in Figure 3 illustrates single tone generation, which ideally results in a single peak of power, concentrated at a very precise frequency. The right plot in Figure 3 illustrates a slightly different result, where phase noise (essentially time-domain jitter) results in a slight periodic spreading of the signal in the frequency domain.
Phase noise is characterized by measuring the signal amplitude at various offsets from the desired carrier, normalized to a 1 Hz bandwidth. The right plot in Figure 3 illustrates the measurement of a phase noise of –95 dBc/Hz at a 1 kHz offset and –146 dBc/Hz at a 10 kHz offset.
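Such offset measurements can be rolled up into a single RMS phase-jitter figure by integrating the single-sideband noise over the offset range. The sketch below uses simple trapezoidal integration; the offset points and noise levels are assumed values for illustration, not the Figure 3 measurements.

```python
import math

# Illustrative sketch: convert a phase-noise profile (dBc/Hz at a few
# offsets) into RMS phase jitter via trapezoidal integration of the
# single-sideband noise. Offsets and levels below are assumed.
def rms_phase_jitter_rad(offsets_hz, levels_dbc_hz):
    """Integrate L(f) over the offset range; the factor 2 counts both sidebands."""
    lin = [10 ** (l / 10) for l in levels_dbc_hz]   # dBc/Hz -> linear ratio
    area = 0.0
    for i in range(len(offsets_hz) - 1):
        df = offsets_hz[i + 1] - offsets_hz[i]
        area += 0.5 * (lin[i] + lin[i + 1]) * df    # trapezoid per segment
    return math.sqrt(2 * area)

# Assumed profile: -95 dBc/Hz at 1 kHz, -110 at 10 kHz, -120 at 100 kHz.
jitter = rms_phase_jitter_rad([1e3, 10e3, 100e3], [-95, -110, -120])
```

The result (a few milliradians for the assumed profile) is the time-domain jitter that produces the spectral spreading shown in the right plot of Figure 3.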
The significance of RF instrument phase noise depends on the application. Low phase noise is required to detect low-level blocker signals that are close to a particular signal of interest. When an LO with significant phase noise is used, that phase noise transfers to the resulting IF signal. Figure 4 illustrates how LO phase noise translates to phase noise on the resulting IF signal.
In the above application, the phase noise of the two signals interferes with each other, which makes it more difficult to identify the specific blocker signal characteristics.
Visualizing the demodulation of a signal with a constellation plot is another way to illustrate phase noise effects. A signal with significant phase noise shows slight periodic rotations of the constellation plot. Figure 5 compares an ideal 4-phase-shift keying (4-PSK) modulated signal with four symbols, represented by black dots, being transmitted in the left plot to a signal with significant phase noise in the right plot.
Phase noise affects actual measurements by degrading the error vector magnitude (EVM) performance of an RF instrument. For bit error rate (BER) tests, phase noise actually contributes to higher error rates.
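The link between phase jitter and EVM can be estimated geometrically: a phase rotation of θ radians displaces a unit-magnitude symbol by an error vector of length 2·sin(θ/2), which is approximately θ for small angles. This is a back-of-envelope sketch considering phase error alone, ignoring additive noise and other EVM contributors.

```python
import math

# Sketch: EVM floor contributed by RMS phase error alone. A phase rotation
# of theta radians moves a unit-magnitude constellation point along the
# circle by a chord of length 2*sin(theta/2) -- approximately theta when small.
def evm_from_phase_error(theta_rms_rad: float) -> float:
    """EVM (as a fraction of symbol magnitude) due to phase rotation only."""
    return 2 * math.sin(theta_rms_rad / 2)

evm = evm_from_phase_error(0.01)   # 0.01 rad of jitter -> ~1% EVM
evm_percent = 100 * evm
```

This is why the constellation in the right plot of Figure 5 smears tangentially: phase noise rotates symbols around the origin rather than displacing them radially.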
6. Voltage Standing Wave Ratio (VSWR)
Voltage standing wave ratio (VSWR) is closely related to transmission line theory and becomes more important as the frequency range of an instrument increases. At a high level, VSWR measures the signal reflections that occur as a result of impedance mismatch along a transmission line.
In a perfect world, the impedance of an RF instrument (typically 50 Ω) matches the impedance of each of the cables and the input impedance of the device under test. However, various imperfections, such as asymmetric signal traces, and part-to-part component variation, alter the characteristic instrument impedance. As a result, signal reflections occur in the RF transmission and affect the amplitude and the phase accuracy of the signal.
The signal reflection amplitude is dependent both on the properties of the materials used and on the frequency range. Impedance mismatch in the transmission line directly causes VSWR, which is generally more problematic at higher frequencies. For example, while a VSWR of 1:1 represents a perfectly matched system, a VSWR of 1.1:1 means that roughly 5% of the signal amplitude (a reflection coefficient of about 0.048) is reflected in the transmission line.
Because VSWR is also dependent on material properties, its value can be calculated from the reflection coefficient, Γ, as shown in the following equation:

VSWR = (1 + |Γ|) / (1 − |Γ|)
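The standard VSWR/Γ relationship and its inverse can be checked with a short sketch:

```python
# Sketch of the standard relationship between VSWR and the reflection
# coefficient: VSWR = (1 + |Γ|) / (1 - |Γ|), and Γ = (VSWR - 1) / (VSWR + 1).
def vswr_from_gamma(gamma: float) -> float:
    return (1 + abs(gamma)) / (1 - abs(gamma))

def gamma_from_vswr(vswr: float) -> float:
    return (vswr - 1) / (vswr + 1)

gamma_from_vswr(1.0)       # 0.0 -- perfectly matched, no reflection
g = gamma_from_vswr(1.1)   # ~0.048: about 4.8% of the amplitude is reflected
```

A Γ of 1 (total reflection, e.g. an open or short) corresponds to an infinite VSWR, which is why well-matched systems are quoted as ratios close to 1:1.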
VSWR substantially affects a test signal because it causes adjustments in the phase or amplitude of the signal. Moreover, the generated signal amplitude either increases or decreases depending on the VSWR reflection phase. Figure 6 illustrates how VSWR reflections affect signal amplitude.
A reflection that is out of phase with the original signal causes a slight canceling effect. The resulting composite signal in Figure 6 shows slightly reduced amplitude. In most cases, VSWR is reduced through the use of an internal or external attenuator. Increasing the instrument reference level reduces VSWR through internal attenuation.
VSWR is an important specification because it significantly affects the amplitude accuracy of the instrument. Some applications, such as RF filter characterization, require the highest amplitude accuracy possible. Because an RF filter is characterized by measuring the amplitude loss according to the stimulus signal frequency, amplitude accuracy of both the stimulus signal and analysis instrument is of high importance.
Understanding RF Instrument Specifications Part 1 provides basic information about some of the RF specifications. Note that many of the specifications apply to all RF devices and not just to instruments. You may encounter some of the same specifications in your own designs. The next article in this three-part series explains the specifications that are used to characterize RF generators such as frequency tolerance, linearity, power output, 1 dB compression point, and third-order intercept.
Refer to the National Instruments RF Developer Network for more information about making RF measurements.