System Self Calibration
The RTG performs self-calibration so that it can provide accurate delays and attenuations as well as optimized spectral performance.
This calibration is only valid for a given set of parameters:
- VST Center Frequency
- Input Reference Level
- Data Path (if using a coprocessor)
- Coprocessor Minimum Latency (if using a coprocessor)
If any of these parameters is changed, the current calibration will not be used. To achieve the best accuracy, use existing saved calibration data every time an RTG session is restarted. The software automatically applies any saved calibration data that matches the current operational parameters.
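As a minimal sketch of how saved calibration data could be keyed to this parameter set, consider the following; the CalibrationKey type, the saved_calibrations store, and the session.load_calibration call are hypothetical stand-ins for illustration, not part of the RTG API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class CalibrationKey:
    """Hypothetical key: the parameter set for which a calibration is valid."""
    vst_center_frequency_hz: float
    input_reference_level_dbm: float
    data_path: Optional[str]                    # only relevant with a coprocessor
    coprocessor_min_latency_s: Optional[float]  # only relevant with a coprocessor

# Hypothetical store of previously saved calibration data, keyed by parameter set.
saved_calibrations: dict[CalibrationKey, bytes] = {}

def apply_matching_calibration(session, key: CalibrationKey) -> bool:
    """Apply saved calibration data only when it matches the operational parameters."""
    data = saved_calibrations.get(key)
    if data is None:
        return False  # no matching calibration: self-calibration must be rerun
    session.load_calibration(data)  # hypothetical session method
    return True
```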
The different parts of system self-calibration run back-to-back. Each part is described below.
- Static scenarios—Update the offset frequency value either in the UI or through the API. This frequency will be used to apply corrections to all configurations.
- Dynamic scenarios—Configure the RTG with the enable frequency correction option. When it is set to TRUE, the RTG will measure the frequency of the radar pulse and apply the appropriate corrections. When it is set to FALSE, the RTG will use the value entered in the offset frequency parameter to apply the appropriate corrections. The only caveat is that if the desired target delay is less than the time required to measure the frequency and calculate the compensation, the user-entered offset frequency will be used in place of the measured frequency (see the sketch after this list). Refer to Attenuation for more information about on-the-fly (OTF) correction.
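The dynamic-scenario decision rule can be summarized as below. This is a sketch under stated assumptions: the documentation names the enable frequency correction option and the offset frequency parameter but no concrete API, so all parameter names here are illustrative:

```python
def correction_frequency_hz(enable_frequency_correction: bool,
                            target_delay_s: float,
                            measurement_time_s: float,
                            user_offset_frequency_hz: float,
                            measured_frequency_hz: float) -> float:
    """Pick the frequency used to apply corrections (hypothetical helper).

    Mirrors the rule described above: the measured pulse frequency is used
    only when frequency correction is enabled AND the target delay leaves
    enough time to measure the frequency and calculate the compensation.
    """
    if enable_frequency_correction and target_delay_s >= measurement_time_s:
        return measured_frequency_hz
    # Correction disabled, or the target delay is too short to allow the
    # measurement: fall back to the user-entered offset frequency.
    return user_offset_frequency_hz
```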
Related Information
- Attenuation
The Radar Target Generator (RTG) applies and manages attenuation to achieve desired target signal amplitudes while maintaining optimal dynamic range.