Novel Zooming Scale Hough Transform Pattern Recognition Algorithm for the PHENIX Detector

Computer Science – Performance

Scientific paper


Single ultra-relativistic heavy ion collisions at RHIC and the LHC, and multiple overlapping proton-proton collisions at the LHC, present challenges to pattern recognition algorithms for tracking in these high-multiplicity environments. One must satisfy many constraints, including high track-finding efficiency, ghost-track rejection, and CPU time and memory limits. A novel algorithm based on a zooming scale Hough Transform, optimized for efficient high-speed caching and flexible in its implementation, is now available in Ref. [1]. In this presentation, we detail the application of this algorithm to the PHENIX Experiment silicon vertex tracker (VTX) and show initial results from Au+Au collision data at √sNN = 200 GeV taken in 2011. We demonstrate the current algorithmic performance and also show first results for the proposed sPHENIX detector.

[1] A. Dion, "Helix Hough", http://code.google.com/p/helixhough/
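The core idea of a zooming scale Hough Transform is to avoid filling one huge fixed-resolution accumulator: parameter space is instead subdivided recursively, and only bins compatible with enough hits are zoomed into further. The sketch below illustrates this in the simplest possible setting, a 2D straight-line model y = m·x + b, rather than the full five-parameter helix model of Helix Hough; all function names, thresholds, and ranges here are illustrative assumptions, not the reference implementation.

```python
def compatible(point, m_lo, m_hi, b_lo, b_hi):
    """A hit (x, y) is compatible with an (m, b) bin if some line through
    the bin passes through it: b(m) = y - m*x must intersect [b_lo, b_hi]
    for some m in [m_lo, m_hi]."""
    x, y = point
    # b(m) = y - m*x is monotone in m, so its range over the bin is:
    b1, b2 = y - m_lo * x, y - m_hi * x
    lo, hi = min(b1, b2), max(b1, b2)
    return lo <= b_hi and hi >= b_lo

def zoom_hough(points, m_range, b_range, min_votes, depth, max_depth):
    """Recursively subdivide (m, b) space, keeping only bins with at
    least min_votes compatible hits; returns candidate bin centers."""
    m_lo, m_hi = m_range
    b_lo, b_hi = b_range
    votes = [p for p in points if compatible(p, m_lo, m_hi, b_lo, b_hi)]
    if len(votes) < min_votes:
        return []  # prune this bin: too few hits for a track candidate
    if depth == max_depth:
        return [((m_lo + m_hi) / 2, (b_lo + b_hi) / 2)]
    found = []
    m_mid = (m_lo + m_hi) / 2
    b_mid = (b_lo + b_hi) / 2
    # Zoom: split the bin into four children and recurse on each,
    # passing down only the hits that voted for the parent bin.
    for mr in ((m_lo, m_mid), (m_mid, m_hi)):
        for br in ((b_lo, b_mid), (b_mid, b_hi)):
            found.extend(zoom_hough(votes, mr, br, min_votes,
                                    depth + 1, max_depth))
    return found
```

Because each recursion level only re-tests the hits that survived the parent bin, the working set shrinks as the zoom deepens, which is what makes this family of algorithms cache-friendly; hits lying exactly on a bin boundary may produce duplicate nearby candidates, which a real tracker would merge in a post-processing step.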

