APE segment pattern recognition in new phasing techniques

Computer Science – Performance

Scientific paper

Details

The aperture of future Extremely Large Telescopes will be composed of hundreds of individual segments, which will require the development of new, robust phasing techniques based on the concept of pupil-plane detection. Misalignments of the segments produce amplitude variations at the segment edges as recorded on the phasing camera. To analyze the signals that carry the information about the segmentation errors, the positions of the segment borders on a CCD image must be determined with sub-pixel accuracy. In the framework of the Active Phasing Experiment (APE) carried out at ESO, we have developed two methods to retrieve the segment pattern: one based on the Hough transform, the other on the correlation of the images with a hexagonal pattern. After describing both methods, we present the results achieved so far with simulations. Finally, the performance of the two methods is compared. This project forms part of the ELT Design Study and is supported by the European Commission, within Framework Programme 6, contract No 011863.
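To make the two approaches concrete, the following is a minimal sketch, in Python with NumPy and SciPy, of the generic building blocks they rely on: a straight-line Hough accumulator for locating segment edges, and FFT-based cross-correlation against a synthetic hexagonal edge template, with parabolic interpolation of the correlation peak for sub-pixel localization. The function names, the template construction, and all parameter choices here are illustrative assumptions, not the actual APE implementation.

```python
# Illustrative sketch only: generic versions of the two techniques named
# in the abstract, not the APE pipeline itself. Assumed tools: NumPy, SciPy.
import numpy as np
from scipy.ndimage import binary_erosion

def hough_lines(edge_img, n_theta=180):
    """Vote straight lines (rho, theta) from a binary edge map.

    Peaks in the returned accumulator correspond to the straight
    borders of the segments.
    """
    h, w = edge_img.shape
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(h, w)))
    rhos = np.arange(-diag, diag + 1)
    acc = np.zeros((rhos.size, n_theta), dtype=np.int64)
    ys, xs = np.nonzero(edge_img)
    for i, theta in enumerate(thetas):
        # rho = x*cos(theta) + y*sin(theta), shifted to a valid index
        r = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int) + diag
        np.add.at(acc[:, i], r, 1)
    return acc, rhos, thetas

def hexagon_edge_template(size, apothem):
    """Outline of a regular hexagon with the given apothem: a stand-in
    for the expected segment-border pattern used as the template."""
    y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
    c, s = 0.5, np.sqrt(3) / 2.0
    inside = ((np.abs(x) <= apothem)
              & (np.abs(c * x + s * y) <= apothem)
              & (np.abs(c * x - s * y) <= apothem))
    return (inside & ~binary_erosion(inside)).astype(float)

def correlate_subpixel(image, template):
    """FFT-based cross-correlation; a parabolic fit through the peak and
    its neighbours yields the template position with sub-pixel accuracy."""
    img = image - image.mean()
    tpl = np.fft.fft2(template - template.mean(), s=image.shape)
    corr = np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(tpl)))
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)

    def parabolic(cm, c0, cp):
        d = cm - 2.0 * c0 + cp
        return 0.0 if d == 0 else 0.5 * (cm - cp) / d

    ny, nx = corr.shape
    dy = parabolic(corr[(iy - 1) % ny, ix], corr[iy, ix], corr[(iy + 1) % ny, ix])
    dx = parabolic(corr[iy, (ix - 1) % nx], corr[iy, ix], corr[iy, (ix + 1) % nx])
    return iy + dy, ix + dx
```

In a full pipeline one would, for instance, take the strongest accumulator peaks as candidate segment borders and combine them with the sub-pixel correlation offsets to pin down the segment pattern; both parts above are deliberately stripped down to the core of each technique.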
