1 dB Gain Compression Measurement (P1dB)

Publish Date: Feb 02, 2012

Overview

An amplifier maintains a constant gain for low-level input signals. However, at higher input levels, the amplifier goes into saturation and its gain decreases. The 1 dB compression point (P1dB) indicates the power level that causes the gain to drop by 1 dB from its small signal value.

Table of Contents

  1. Measurement Setup
  2. Understanding RF VSA Compression Limits
  3. Choosing the Optimal RF VSA Attenuation Setting
  4. Resources
  5. Related Products

1. Measurement Setup

Measuring the 1 dB gain compression point of a device requires driving the unit under test (UUT) into compression without driving the RF Vector Signal Analyzer (VSA) into compression. This requires proper attenuation at the RF VSA input and a PXI-565x RF Signal Generator capable of compressing the UUT. You may apply attenuation by programming the internal input attenuators or by adding external attenuation.

The 1 dB compression point is derived from the gain relationship between output power and input power. Using the measurement setup shown in the figure below, source amplitude is slowly increased while the UUT output is monitored.


[Figure: Typical 1 dB Gain Compression Setup]


Output power is plotted against input power as shown in the figure below:


[Figure: Gain Compression Plot]


The straight line on this graph is an extrapolation of the small signal gain of the UUT. The input 1 dB compression point is the input power that causes the UUT gain to drop by 1 dB from this small signal value, or approximately –12 dBm in this case.
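
The extraction described above can be illustrated numerically. The following Python sketch uses made-up swept-power data and a made-up soft-compression model (not NI software or a real device): it fits the small-signal gain from the lowest drive levels, then finds the first input power where the gain has fallen 1 dB below that value.

```python
import numpy as np

# Made-up swept-power data: a simulated amplifier with 20 dB small-signal
# gain whose output compresses softly (hypothetical model, for illustration).
pin = np.arange(-30.0, -5.0, 1.0)                  # input powers, dBm
g_pin_mw = 10 ** ((pin + 20.0) / 10.0)             # linear-gain output, mW
pout = 10 * np.log10(g_pin_mw / (1 + g_pin_mw / 10 ** 0.8))  # compressed output, dBm

# Small-signal gain: average gain over the lowest drive levels, where the
# amplifier is assumed linear (the straight-line extrapolation in the plot).
small_signal_gain = np.mean(pout[:5] - pin[:5])

# Input P1dB: first input power where the gain drops 1 dB below that value.
gain = pout - pin
compressed = pin[gain <= small_signal_gain - 1.0]
input_p1db = float(compressed[0]) if compressed.size else None
print(f"small-signal gain {small_signal_gain:.2f} dB, input P1dB {input_p1db} dBm")
```

In practice the `pin`/`pout` arrays would come from stepping the signal generator and reading the RF VSA, as in the setup figure above.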


2. Understanding RF VSA Compression Limits


Like all signal analysis devices, the RF VSA is not completely linear and will eventually reach compression. However, the RF VSA architecture possesses a high degree of linearity, and its compression point is typically 5 dBm or higher.

Ensure accurate UUT compression measurements by limiting the signal at the RF VSA input mixer to at least 20 dB below the compression point listed in the NI PXI-5660 RF Vector Signal Analyzer Specifications document included in your RF VSA kit.


3. Choosing the Optimal RF VSA Attenuation Setting


Choosing the optimal attenuation settings for a UUT compression measurement requires taking two factors into account:

· The maximum output signal of your UUT must be attenuated to 10–20 dB less than the compression point of the RF VSA.
· The resolution bandwidth setting of the RF VSA must be low enough that small signals used to determine the linear gain of the UUT are not overwhelmed with noise from the RF VSA.
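
The first factor reduces to a one-line headroom calculation. As a sketch (all numbers below are hypothetical example values, not specifications):

```python
# Hypothetical example values -- substitute your own UUT estimate and the
# compression point from your RF VSA specifications document.
uut_max_output_dbm = 15.0    # estimated UUT output compression point
vsa_compression_dbm = 5.0    # RF VSA input compression point
backoff_db = 20.0            # desired headroom below the RF VSA compression point

# Attenuate so the RF VSA input never exceeds (compression point - backoff).
required_attenuation_db = max(0.0, uut_max_output_dbm - (vsa_compression_dbm - backoff_db))
print(required_attenuation_db)  # 30.0 dB for these example values
```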

To set the proper RF VSA attenuation level for a compression test on a UUT with a known estimated output compression point and known approximate gain, complete the following steps:

1. Set the RF VSA mixer level to –20 dBm and its reference level to 10 dB above the estimated UUT compression point. When using the RF VSA Demo Panel, mixer level = reference level – attenuation.
2. Set the RF VSA center frequency to your intended testing frequency, its span to 1 MHz, and its resolution bandwidth to 1 kHz.
3. Inject a signal into the UUT small enough that its output level is at least 20 dB below the estimated UUT compression point. If the UUT output signal level is too close to the noise floor of the RF VSA, decrease the RF VSA resolution bandwidth.
4. Increase the input signal to the UUT. If the output signal has reached 5 dB below the RF VSA reference level and compression of the UUT has not been reached, increase the reference level by 10 dB.
5. Repeat step 4 until the UUT reaches compression. The attenuation setting at that point is the optimal attenuation setting.
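
The stepping logic of steps 1 through 5 can be sketched as follows. The UUT here is a made-up simulation (20 dB gain, hypothetical compression model); in a real test the loop would instead drive the signal generator and read the RF VSA.

```python
import math

def uut_output_dbm(pin_dbm):
    # Simulated stand-in for the UUT (hypothetical): 20 dB small-signal gain,
    # output compressing softly at high drive levels.
    g_pin_mw = 10 ** ((pin_dbm + 20.0) / 10.0)
    return 10 * math.log10(g_pin_mw / (1 + g_pin_mw / 10.0))

est_compression_dbm = 10.0                        # estimated UUT output compression
reference_level_dbm = est_compression_dbm + 10.0  # step 1: reference level
pin_dbm = -30.0                                   # step 3: output well below compression
small_signal_gain_db = 20.0                       # assumed approximate UUT gain

while True:
    gain = uut_output_dbm(pin_dbm) - pin_dbm
    if gain <= small_signal_gain_db - 1.0:        # UUT has reached 1 dB compression
        break
    if uut_output_dbm(pin_dbm) >= reference_level_dbm - 5.0:
        reference_level_dbm += 10.0               # step 4: raise reference level 10 dB
    pin_dbm += 1.0                                # steps 4-5: raise drive and repeat
print(f"1 dB compression at Pin = {pin_dbm} dBm, reference level = {reference_level_dbm} dBm")
```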


4. Resources


Common RF and Microwave Measurements
This is the main page of a series of tutorials focused on common RF measurements involving signal generators and analyzers.


5. Related Products


NI PXI-5660 2.7 GHz RF Vector Signal Analyzer
The National Instruments PXI-5660 is a modular 2.7 GHz RF vector signal analyzer with 20 MHz of real-time bandwidth optimized for automated test.
 
NI PXI-5671 2.7 GHz RF Vector Signal Generator
The National Instruments PXI-5671 module is a 3-slot RF vector signal generator that delivers signal generation from 250 kHz to 2.7 GHz, 20 MHz of real-time bandwidth, and up to 512 MB of memory.

NI PXI-5652 6.6 GHz RF and Microwave Signal Generator
The National Instruments PXI-5652 6.6 GHz RF and microwave signal generator is continuous-wave with modulation capability. It is excellent for setting up stimulus response applications with RF signal analyzers.

NI RF Switches
The National Instruments RF switch modules are ideal for expanding the channel count or increasing the flexibility of systems with signal bandwidths greater than 10 MHz to bandwidths as high as 26.5 GHz.

NI LabVIEW
NI LabVIEW is a graphical programming environment used to develop measurement, test, and control applications.

NI Advanced Signal Processing Toolkit
The National Instruments LabVIEW Signal Processing Toolkit is a suite of software tools, example programs, and utilities for time-frequency analysis, time-series analysis, and wavelets. It also includes a full version of the NI LabVIEW Digital Filter Design Toolkit, which is also available separately.

NI Digital Filter Design Toolkit
The National Instruments Digital Filter Design Toolkit extends LabVIEW with functions (LabVIEW VIs that install into the palette) and interactive tools for design, analysis, and implementation of digital filters.
