New Method for Data Treatment Developed at ESO
ESO Press Release, 08/1996
adsabs.harvard.edu/cgi-bin/nph-data_query?bibcode=1996eso..pres...12.&link_type=abstract
How Future Astronomical Observations Will Be Done
The past four centuries have seen dramatic improvements in astronomical equipment: better and larger telescopes, more accurate and sensitive detectors and, not least, advanced space instruments with access to new spectral regions. However, until recently there has been little progress on another, equally important front: quantifying the unavoidable influence of this equipment on the astronomical data it produces.
For a long time, astronomers have wanted an efficient way to remove these `instrumental effects' from their data, in order to gain a clearer understanding of the objects in the Universe and their properties. But it is only now, with the advent of digital imaging techniques and powerful computers, that this fundamental problem can finally be tackled efficiently.
Two researchers at the ESO Headquarters, Michael R. Rosa of the Space Telescope European Co-ordinating Facility (ST/ECF [1]) and Pascal Ballester of the Data Management Division (DMD), are now developing a new approach to this age-old problem. This work is important for the future use of the ESO Very Large Telescope (VLT), the Hubble Space Telescope (HST) and other large facilities as well [2].

The observational process
Observations are crucial to the progress of all natural sciences, including astronomy. Nevertheless, the properties of the observed objects are rarely revealed directly.
First, observational data are gathered at the telescopes with instruments such as cameras and spectrophotometers. Then these `raw' data are processed with advanced computer programmes to produce scientifically meaningful data which are finally scrutinized by the astronomers in order to learn more about the observed celestial objects.
A basic problem in this chain is the influence of the telescopes and instruments on the data they produce. The `raw' observational data carry the marks not only of the celestial objects that are observed, but also of the `recording equipment' and, in the case of ground-based observations, of the atmospheric conditions as well.
These disturbing effects, for example stray light in the telescope and light absorption in the atmosphere, are referred to as the instrumental and atmospheric `signatures'. Only when they have been `removed' from the data can the data be properly interpreted. In fact, unless these effects are completely known, an observation may not result in any new knowledge at all or, even worse, may lead to erroneous results.
The history of astronomy contains many examples of the battle with instrumental effects; see also the Appendix. With the advent of new and advanced astronomical facilities like the VLT and the HST, the need for an efficient solution to this fundamental problem has become particularly acute.

The calibration challenge
Until now, the usual procedure for tackling this common problem has been to observe so-called `reference sources' (celestial objects with well-known properties [3]) with exactly the same instrument and observational mode, and under the same atmospheric conditions, as the celestial object under study, referred to as the `target'.
A comparison between the `raw' observational data recorded for the reference sources and their known properties then makes it possible to determine, more or less accurately, the instrumental and atmospheric signatures. These effects can subsequently be removed, during data processing, from the raw data obtained for the programme targets. This leaves behind - at least in theory - `clean data' which contain only the desired information about the celestial object under investigation. This fundamental observational procedure is known as `calibration'.
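In its simplest form, this comparison amounts to a division: the ratio of the recorded reference data to the known reference properties gives the combined signature, which is then divided out of the target data. The Python sketch below is purely illustrative; all arrays and values are hypothetical stand-ins, and a real reduction pipeline would also correct for bias, dark current and other detector effects.

```python
import numpy as np

# Empirical calibration sketch (all values are hypothetical stand-ins).
# The combined instrumental/atmospheric signature is estimated from a
# reference source with known properties, then divided out of the target.

wavelength = np.linspace(400.0, 700.0, 1024)            # nm, detector wavelength grid
known_reference = np.ones_like(wavelength)              # tabulated true flux of the reference
raw_reference = 0.8 + 0.1 * np.sin(wavelength / 20.0)   # stand-in for recorded counts

# The signature is the ratio of what was recorded to what is really there.
response = raw_reference / known_reference

raw_target = 2.0 * response                             # stand-in for a target observation
clean_target = raw_target / response                    # signature removed: `clean' data
```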
Nevertheless, serious limitations are inherent in such a calibration procedure. In principle, it is only logically valid if the reference source has the same properties as the target and both are observed under identical instrumental and atmospheric conditions. These requirements, however, are never fulfilled in practice. One way around this obstacle is to observe a sufficient number of reference sources whose properties are expected to bracket those of the targets. Likewise, repeated observations must be made whenever the observing conditions change. In this way one hopes to obtain, by interpolation, estimates of the instrumental and atmospheric signatures at the time of the target observation.
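That interpolation step might look like the following sketch, in which all epochs and response values are invented for illustration: the signature measured at a few reference epochs during the night is interpolated to the epoch of the science exposure.

```python
import numpy as np

# Interpolation sketch (hypothetical numbers): the response measured at a
# few reference epochs is interpolated to the time of the science exposure.
reference_times = np.array([0.0, 2.0, 5.0])        # hours into the night
reference_response = np.array([1.00, 1.05, 1.12])  # measured response factors

target_time = 3.5                                  # epoch of the science exposure
estimated = np.interp(target_time, reference_times, reference_response)
print(f"estimated response at t = {target_time} h: {estimated:.3f}")
```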
Until now, this empirical calibration process was the only one available. Unfortunately, it consumes a great deal of valuable telescope time just for repeated observations of the reference sources, significantly diminishing the time available for observations of the scientifically important objects. Moreover, every time the instrument is changed even slightly, or some condition is altered, a new calibration procedure must be carried out.

Maximizing observational efficiency
In just over one year from now, ESO will begin to operate the largest optical telescope ever built, the Very Large Telescope (VLT) at the new Paranal Observatory in Chile. Because of its enormous light-collecting area and superior optical quality, the VLT is destined to make a breakthrough in ground-based observational astronomy.
The demand by astronomers for observing time at this unique facility is overwhelming. Even with the unsurpassed number of clear nights at Paranal, each available minute will be extremely precious and everything must be done to ensure that no time will be lost to unnecessary actions.
This is a major challenge to the scientists. For instance, how long should an exposure last to ensure an optimum of new knowledge about the observed object? In addition, how much time should be spent on defining, in sufficient detail, the `signatures' of the atmosphere, the telescope and the instruments, which must be removed from the `raw' data before the resulting `clean' data can be interpreted in a trustworthy way?
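As one concrete illustration of the exposure-time question, the signal-to-noise ratio of a CCD observation is commonly estimated with the standard CCD equation. The sketch below uses invented numbers; it shows that the signal-to-noise ratio grows only with the square root of the exposure time, which is why every minute diverted to calibration exposures is so costly.

```python
import math

def snr(source_rate, sky_rate, read_noise, npix, t):
    """Standard CCD equation (dark current neglected): S/N of a t-second exposure.

    source_rate: source counts per second (e-/s)
    sky_rate:    sky counts per pixel per second (e-/s/pix)
    read_noise:  read noise per pixel (e- RMS)
    npix:        number of pixels in the photometric aperture
    """
    signal = source_rate * t
    noise = math.sqrt(signal + npix * (sky_rate * t + read_noise ** 2))
    return signal / noise

# Hypothetical numbers: quadrupling the exposure time roughly doubles the S/N.
for t in (60, 240, 960):
    print(f"t = {t:4d} s  ->  S/N = {snr(50.0, 5.0, 10.0, 25, t):.1f}")
```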
In short, how can the scientific return from the VLT and other telescopes such as the HST best be optimised? It is exactly for this reason that astronomers and engineers at ESO are now busy developing new methods of telescope operation and data analysis alongside the VLT instrumental hardware itself.

The new solution by means of models
The appropriate strategy for resolving the inherent conflict between calibration demands and the time available for scientific observations is to obtain a physically correct understanding of the effects that each instrument exerts on the data. In this way, it is possible to decide which calibration data are actually required and on what timescale they must be updated. One can then use computer models of these instruments to predict calibration solutions that are valid for the full range of target properties and that handle environmental conditions properly. Such computer models can also be used to simulate observations.
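A toy forward model along these lines might multiply a theoretical spectrum by a modelled throughput curve and blur it with the instrument's line-spread function, predicting the raw data the instrument would record. Everything in the sketch below is a hypothetical placeholder, not the actual ESO instrument model.

```python
import numpy as np

def simulate_observation(model_spectrum, throughput, lsf_kernel):
    """Toy forward model: attenuate a theoretical spectrum by a modelled
    throughput curve, then blur it with the instrument's line-spread
    function, yielding the predicted 'raw' data."""
    attenuated = model_spectrum * throughput
    return np.convolve(attenuated, lsf_kernel, mode="same")

# Hypothetical inputs: a flat continuum with one emission line, a smoothly
# varying throughput curve, and a Gaussian line-spread function.
n = 512
spectrum = np.ones(n)
spectrum[n // 2] += 50.0                                  # emission line
throughput = 0.8 - 0.3 * np.linspace(-1.0, 1.0, n) ** 2   # modelled efficiency
x = np.arange(-10, 11)
lsf = np.exp(-0.5 * (x / 2.0) ** 2)
lsf /= lsf.sum()                                          # normalize the kernel

predicted_raw = simulate_observation(spectrum, throughput, lsf)
```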
Such simulations bring many benefits for the entire observational process. First, the astronomer can prepare observations and select instrumental modes and exposure times suited for optimal information return. Secondly, they provide confidence in the validity of the calibration process, and therefore in the cleanliness of the corrected data. Finally, once a theory about the target and its properties has been developed, one may simulate observations of a set of theoretical targets whose properties are slightly modified, in order to study their influence on the raw data.
For the observatory there are also advantages. Optimization from the point of view of data analysis can now take place already during instrument design; calibration and data-analysis procedures for any observational mode can be tested before real observations are obtained; and the maintenance staff can make sure that the instrument performs as expected and designed.

How far have we come along this road?
The present project is a close collaboration between the ESO Data Management Division (DMD) and the Space Telescope European Co-ordinating Facility (ST/ECF).