Figure 11 shows the test setup for the following advanced calibration procedure. Attenuators have been added to the input and output ports of the DUT to reduce error contributions due to impedance mismatches, as discussed previously. Also, the following discussion uses the NI PXIe-5673 RF vector signal generator, NI PXIe-5663 RF vector signal analyzer, and NI USB-5680 RF power meter as the instrumentation in the setup. You can make the following assumptions:
- Good quality SMA attenuators are used with return losses of < -30 dB.
- As before, the mismatch error caused by the impedance mismatch between the RF signal generator and the power meter is ignored, since power sensors normally have return loss < -25 dB; its overall contribution is very small compared with the mismatch error between the RF signal generator and the DUT.
- The return loss of cables and adapters is insignificant (in other words, only the VSWRs/impedances of the RF signal generator and the DUT are considered).
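To see why the attenuators help, recall that the worst-case mismatch error between two ports can be bounded from their return losses as 20·log10(1 ± |Γ1||Γ2|). The sketch below (Python; the function names are our own, used only for illustration) evaluates these bounds:

```python
import math

def reflection_coefficient(return_loss_db):
    """Convert a return loss magnitude in dB to a linear reflection coefficient."""
    return 10 ** (-abs(return_loss_db) / 20)

def mismatch_error_db(return_loss_1_db, return_loss_2_db):
    """Worst-case mismatch error bounds, 20*log10(1 +/- |G1||G2|), in dB."""
    product = (reflection_coefficient(return_loss_1_db)
               * reflection_coefficient(return_loss_2_db))
    return 20 * math.log10(1 + product), 20 * math.log10(1 - product)

# Example: a port with 10 dB return loss facing a 30 dB return-loss attenuator
upper, lower = mismatch_error_db(10, 30)
print(f"mismatch error bounds: +{upper:.4f} dB / {lower:.4f} dB")
```

With 30 dB return loss on both ports, the bounds shrink to roughly ±0.009 dB, which is why good attenuators at the DUT ports suppress the mismatch contribution so effectively.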
Figure 11: Advanced Calibration Procedure Test Setup
This calibration procedure consists of three main steps:
- Transfer the power meter reference to the signal generator at one power level.
- Transfer the signal generator reference to the signal analyzer at various power levels, leveraging the linearity of the NI PXIe-5673 digital-to-analog converter (DAC).
- Transfer the signal analyzer references back to the signal generator across various power levels.
The discussion of these steps includes screenshots of an NI TestStand sequence based on LabVIEW using the NI PXIe-5673, PXIe-5663, and USB-5680, as well as resultant data from this sequence.
Step 1—Transfer Power Meter Reference to Signal Generator
Figure 12 shows the setup for the first step of the calibration process. The purpose of this step is essentially to transfer the power meter accuracy to the signal generator over a range of frequencies at a single power level. In place of the DUT, an SMA female-to-female barrel adapter is attached after the attenuator on the end of the signal generator's signal cable that would normally connect directly to the DUT. The power meter is connected to the other end of the SMA adapter. Assume an ideal adapter with no impedance mismatch or insertion loss.
Figure 12: Step 1 Setup
A power level of 0 dBm is used for this step since the USB-5680 is itself calibrated at 0 dBm. This is essentially the same process described at the beginning of this discussion, except the spectrum/signal analyzer has been replaced with a USB-5680 power meter.
This setup is familiar to many RF instrumentation users who use power meters to calibrate their signal generators. The difference is that many of these users do not need to generate a wide range of power levels and can expect to operate their generators within the range of their power meters, in which case the power meter can serve as the measurement device at all times. These users can simply perform this calibration procedure at each desired frequency and power test point in their test plans. The procedure described here transfers the power meter accuracy to the signal generator at a single power level; the ensuing steps transfer this accuracy back to the signal analyzer at differing power levels. Again, the motivation for this procedure is to develop a method for obtaining power accuracy on par with a power meter across the wide operating range of the signal generator and signal analyzer.
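In loop form, Step 1 simply records the generator's offset (measured minus requested power) at each frequency. The sketch below is a minimal Python illustration with stand-in instrument classes; the class and method names are placeholders, not the actual NI driver API:

```python
REQUESTED_DBM = 0.0  # the USB-5680 is itself calibrated at 0 dBm

class FakeGenerator:
    """Stand-in for the RF signal generator (placeholder API)."""
    def configure(self, frequency_hz, power_dbm):
        self.frequency_hz, self.power_dbm = frequency_hz, power_dbm

class FakePowerMeter:
    """Stand-in power meter simulating a fixed 0.3 dB path loss."""
    def __init__(self, generator, path_loss_db=0.3):
        self.generator, self.path_loss_db = generator, path_loss_db
    def read_dbm(self):
        return self.generator.power_dbm - self.path_loss_db

def build_generator_offsets(frequencies_hz, generator, power_meter):
    """Return {frequency: measured - requested} at the 0 dBm reference level."""
    offsets = {}
    for f in frequencies_hz:
        generator.configure(frequency_hz=f, power_dbm=REQUESTED_DBM)
        offsets[f] = power_meter.read_dbm() - REQUESTED_DBM
    return offsets

gen = FakeGenerator()
offsets = build_generator_offsets([1.8e9, 1.95e9, 2.1e9], gen, FakePowerMeter(gen))
print(offsets)
```

In the real sequence, the recorded offsets per frequency become the correction table that later steps rely on.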
Figure 13 shows a screenshot of the automated NI TestStand calibration sequence instructing the user to attach the USB-5680 to the other end of the SMA-SMA adapter.
Figure 13: NI TestStand Calibration Sequence Dialog—Step 1
Figure 14 shows data acquired from Step 1 over a frequency range of 1.8 GHz to 2.1 GHz.
Figure 14: Step 1 Signal Generator Calibration Data
Step 2—Transfer Reference From Signal Generator to Signal Analyzer
The next step in the procedure is to transfer the reference that has been effectively mapped from the power meter to the signal generator back to the signal analyzer. This is the unique portion of this procedure because it uses the linearity performance of the baseband arbitrary waveform generator’s (AWG’s) DAC to reliably transfer the measured reference to the signal analyzer.
Figure 15: Step 2 Setup
The NI PXIe-5450 dual-channel AWG is the baseband generator that composes one part of the overall NI PXIe-5673 vector signal generator (other components are the NI PXIe-5611 I/Q vector modulator and NI PXI-5652 RF signal generator, which acts as the LO source).
Step 1 provided a reference power measurement of the NI PXIe-5673 at the specific power level of 0 dBm. The measured power levels and associated offsets capture the signal generator uncertainty at each frequency at that output level. An intuitive approach in this step is simply to program the NI PXIe-5673 to sweep across the same frequency range but also across a set of desired power-level test points. However, reusing the same compensation values when the NI PXIe-5673 is programmed to output different power levels ignores the additional uncertainty introduced by nonlinearities of the signal generator output stage and by potentially different attenuator states.
Because the DAC linearity of the NI PXIe-5450 is better than the linearity of the NI PXIe-5673 RF output stage, you can exploit this linearity, together with the NI PXIe-5673 module's "attenuator hold" feature, to perform a very linear output power sweep: the output power is manipulated digitally at the baseband stage rather than at the RF stage via attenuators and RF amplifiers. The NI PXIe-5673 keeps all attenuators fixed in the same state at which the power meter reference measurement was obtained in Step 1 and sweeps the output power downward simply by scaling the NI PXIe-5450 DAC. At each step, the signal analyzer is programmed with a new reference-level setting, so the reference measurement captures the current state of the NI PXIe-5663 attenuator settings as well as the effects of the analyzer-side attenuator and signal cable path loss.
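Numerically, the key point is that with the attenuators held, the true output at each digitally scaled level is simply the Step 1 reference shifted by the DAC gain, so each analyzer reading yields an analyzer correction at that level. A minimal Python sketch of this bookkeeping (the offset and reading values are illustrative, not measured data):

```python
STEP1_OFFSET_DB = -0.32  # illustrative: measured - requested at 0 dBm from Step 1

def analyzer_offsets(digital_gains_db, analyzer_readings_dbm):
    """Map each DAC gain step to the analyzer's offset at that level.

    With attenuator hold active, the analog chain is unchanged, so the true
    output power is the Step 1 reference shifted by the (linear) DAC gain.
    """
    offsets = {}
    for gain_db, reading_dbm in zip(digital_gains_db, analyzer_readings_dbm):
        true_power_dbm = 0.0 + STEP1_OFFSET_DB + gain_db
        offsets[gain_db] = reading_dbm - true_power_dbm
    return offsets

print(analyzer_offsets([0, -10, -20], [-0.10, -10.50, -20.40]))
```

Each entry in the resulting table corresponds to one analyzer reference-level/attenuation state, which is exactly what Step 3 needs.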
The result of this step effectively transfers the power meter reference obtained by the signal generator to the signal analyzer at multiple analyzer reference-level/attenuation settings. At the conclusion of this step, you have a signal analyzer calibrated at various signal levels and a signal generator calibrated at a single power level.
Figure 16 shows a screenshot of the automated NI TestStand calibration sequence instructing the user to attach the NI PXIe-5663 to the other end of the SMA-SMA adapter via the analyzer signal cable and attenuator.
Figure 16: NI TestStand Calibration Sequence Dialog—Step 2
Step 3—Transfer Reference From Signal Analyzer to Signal Generator
The final step in the procedure leaves the setup from Step 2 intact (see Figure 15). Step 3 involves calibrating the signal generator across multiple power levels using the signal analyzer you calibrated at various reference levels in Step 2.
The reason for this step is that you have calibrated the signal generator at a single output setting in Step 1, and, in Step 2, you calibrated the analyzer by varying the signal generator output digitally via the DAC. However, you have yet to calibrate the signal generator when the signal generator is programmed to output different power levels and the RF attenuators are allowed to change state.
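The correction flow of Step 3 can be sketched as follows: for each requested level, the analyzer's own offset (from Step 2) is removed from its raw reading to recover the true output, and the difference from the requested level is the generator's correction at that level. All numbers below are illustrative:

```python
# Analyzer offsets per reference level, as characterized in Step 2 (illustrative)
ANALYZER_OFFSET_DB = {0: 0.22, -10: -0.18, -20: -0.08}

def generator_offsets(requested_levels_dbm, raw_readings_dbm):
    """Return {requested level: generator offset} from corrected analyzer readings."""
    offsets = {}
    for level, raw_dbm in zip(requested_levels_dbm, raw_readings_dbm):
        true_power_dbm = raw_dbm - ANALYZER_OFFSET_DB[level]  # remove analyzer error
        offsets[level] = true_power_dbm - level
    return offsets

print(generator_offsets([0, -10, -20], [0.30, -10.40, -20.15]))
```

After this step, both instruments carry correction tables across the swept power levels, traceable back to the single power meter measurement.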
Figure 17 shows some resultant data of the calibrated system when the generator and analyzer signal cables are connected directly via an SMA-SMA adapter, as shown in Figure 15.
Figure 17: Residual Error After System Calibration Across Different Frequencies at -10 dBm
Stability Over Temperature
An important consideration in all calibration situations is temperature drift. As temperature changes, it physically affects the entire system, changing cable and PWB trace lengths, increasing or decreasing thermal noise contributions, and altering component behavior in other ways. Ideally, the system operates at exactly the same temperature at which it was calibrated. This section looks at the potential uncertainty added to the example test system using the NI 6.6 GHz RF platform and the USB-5680 power meter.
- The NI PXIe-5673 has a thermal drift of <0.2 dB/10 ˚C.
- The NI USB-5680 power meter has a thermal drift of <0.06 dB/50 ˚C.
- The NI PXIe-5663 has a thermal drift of <0.2 dB/10 ˚C.
Because these values are small, you can safely ignore temperature drift for many applications, provided the operating environment stays reasonably close (within ±2 °C) to the calibration temperature.
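As a quick sanity check under that ±2 °C assumption, the worst-case additive drift follows directly from the specifications listed above:

```python
# Per-instrument drift specs from above, converted to dB per degree C
drift_db_per_degc = {
    "PXIe-5673": 0.2 / 10,
    "PXIe-5663": 0.2 / 10,
    "USB-5680": 0.06 / 50,
}
DELTA_T_DEGC = 2.0  # excursion from the calibration temperature

# Additive (fully correlated) worst case; an RSS combination would be smaller
worst_case_db = sum(s * DELTA_T_DEGC for s in drift_db_per_degc.values())
print(f"worst-case additive drift: {worst_case_db:.4f} dB")
```

The result, roughly 0.08 dB, is small compared with the precalibration errors discussed below, supporting the decision to neglect drift within that temperature window.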
Another consideration with respect to temperature is settling time. Every DUT and instrument requires time to settle thermally. Figure 18 shows how the measured power changes over time. The calibration procedure discussed here performs multiple measurements until the difference between two consecutive readings falls below a threshold (set to 0.05 dB by default).
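That settling check amounts to a simple convergence loop. A Python sketch, where `read_power_dbm` stands in for the real measurement call:

```python
def wait_for_settling(read_power_dbm, threshold_db=0.05, max_reads=100):
    """Read power until two consecutive readings agree within threshold_db."""
    previous = read_power_dbm()
    for _ in range(max_reads):
        current = read_power_dbm()
        if abs(current - previous) < threshold_db:
            return current
        previous = current
    raise TimeoutError("power did not settle within max_reads readings")

# Simulated exponential thermal settling toward -10 dBm
readings = (-10.0 + 0.5 * 0.5 ** n for n in range(20))
settled = wait_for_settling(lambda: next(readings))
print(f"settled at {settled:.5f} dBm")
```

A tighter threshold trades calibration time for less residual thermal error; 0.05 dB is the default cited above.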
Figure 18: Thermal Settling—The DUT should be thermally settled before calibration is performed.
Repeatability
The proposed method improves accuracy because the test setup has been characterized. The following experimental results apply at the frequencies and power levels at which the system was characterized.
Before the system is characterized, measuring over all defined power levels and frequencies yields the total error shown in Figure 19.
Figure 19: Precalibrated Data Based on the Simple Approach: Effects due to cable loss, mismatch, and system accuracy are combined in the graph.
The histogram of the Figure 19 data (Figure 20) shows a mean of about -1.5 dB (the average cable loss), a standard deviation of 0.25 dB, and a range of about 1.23 dB.
Figure 20: Histogram of Precalibrated Data
Calculating the distribution of the Figure 17 data yields the histogram in Figure 21.
Figure 21: Distribution of the Post-Calibration (Verification) Errors
This histogram has a mean of 0.008 dB, a standard deviation of 0.01 dB, and a range of 0.145 dB.
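For completeness, the summary statistics reported for these histograms are straightforward to recompute from a list of residual errors. The sketch below uses a synthetic error list, not the actual verification data behind Figure 21:

```python
import statistics

# Synthetic residual errors in dB (illustrative only)
errors_db = [0.008, -0.012, 0.021, 0.003, -0.030, 0.015, 0.041, -0.004]

mean_db = statistics.mean(errors_db)
stdev_db = statistics.stdev(errors_db)      # sample standard deviation
range_db = max(errors_db) - min(errors_db)  # peak-to-peak spread
print(f"mean={mean_db:.4f} dB  stdev={stdev_db:.4f} dB  range={range_db:.4f} dB")
```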
The presented data is the measured composite error of all the underlying variables, each with its own distribution. You can use this model to track which error source contributes most to the uncertainty and to identify problems with different setups.