By Flavio Falcinelli
Every measuring instrument analyzes a physical quantity according to a specified scale of measurement units.
This is also true for a radio telescope: in fact, a very important and delicate part of its operation concerns calibration. A calibration procedure must be established so that the data at the output of the radio telescope are consistent with an absolute scale of brightness temperatures (or of flux units).
Constructional tolerances and environmental conditions cause changes in the operating parameters of the receiver's components; moreover, each instrument is unique in its response, so it is difficult to compare measurements from different telescopes, or measurements from the same system carried out at different times.
By repeatedly observing a radio source you may notice changes in the intensity of the emission peak. It is important to understand whether these fluctuations are due to real changes in the source flux or to unwanted variations in the response of the instrument: a universal measurement scale is therefore necessary.
The calibration procedure of a radio telescope establishes a relationship between the brightness temperature of the observed scenario [K] and the quantity output by the instrument [ADC count].
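A common way to establish such a relationship is a two-point ("hot/cold") calibration, in which two reference loads of known brightness temperature are observed and a linear response is assumed. The sketch below illustrates the idea; the function name and the numerical values are illustrative assumptions, not part of the RAL10 documentation.

```python
# Hypothetical two-point ("hot/cold") calibration sketch.
# Assumption: the receiver responds linearly, counts = gain * T + offset,
# so observing two loads of known brightness temperature fixes both coefficients.

def fit_linear_calibration(t_cold, counts_cold, t_hot, counts_hot):
    """Return (gain, offset) of the assumed linear response counts = gain*T + offset."""
    gain = (counts_hot - counts_cold) / (t_hot - t_cold)   # [ADC count / K]
    offset = counts_cold - gain * t_cold                   # [ADC count]
    return gain, offset

# Illustrative reference observations (not real RAL10 data):
gain, offset = fit_linear_calibration(t_cold=77.0, counts_cold=1200,
                                      t_hot=290.0, counts_hot=9720)
print(gain, offset)   # -> 40.0 -1880.0
```

Inverting the fitted relation, T = (counts - offset) / gain, then yields the brightness temperature corresponding to any raw reading.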
In-depth documents will describe some calibration procedures for RAL10 receivers.
A radio telescope measures the intensity of the radiation coming from the observed scenario in arbitrary units [ADC count]. These units are the numerical values produced by the analog-to-digital converter (ADC) inside the instrument when it digitizes the analog quantity (the detected radio signal). A calculation then transforms the instrument's response into absolute temperature units [K] using the calibration relation of the radio telescope.
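Once the calibration relation is known, applying it is a simple per-sample calculation. The sketch below assumes a linear relation T = slope * counts + intercept; the coefficient values are hypothetical placeholders, not measured RAL10 parameters.

```python
# Sketch: converting raw ADC counts to brightness temperature [K],
# assuming a previously determined linear calibration relation
# T = slope * counts + intercept (hypothetical coefficient values).

SLOPE = 0.025      # [K / ADC count], illustrative assumption
INTERCEPT = 47.0   # [K], illustrative assumption

def counts_to_kelvin(counts):
    """Apply the linear calibration relation to one ADC reading."""
    return SLOPE * counts + INTERCEPT

readings = [1000, 1200, 1500]                  # raw ADC counts
temps = [counts_to_kelvin(c) for c in readings]
print(temps)   # -> [72.0, 77.0, 84.5]
```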