Data processing model for the CDF experiment

Physics – Instrumentation and Detectors

Scientific paper


Details

12 pages, 10 figures, submitted to IEEE-TNS


DOI: 10.1109/TNS.2006.881908

The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into specialized physics datasets. The design of the processing control system faces strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement managed by the CDF data handling system to a multi-petaByte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day). It can be readily scaled by increasing CPU and data-handling capacity as required.
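The quoted sustained rate and daily volume are consistent with each other; a minimal sketch of the arithmetic, assuming decimal units (1 TByte = 10^6 MByte), is:

```python
# Sanity check of the throughput figures quoted in the abstract.
# Assumes decimal units (1 TByte = 1e6 MByte), as is conventional for data rates.
SECONDS_PER_DAY = 24 * 60 * 60  # 86400

rate_mb_per_s = 35  # sustained processing speed (MByte/sec)
daily_tb = rate_mb_per_s * SECONDS_PER_DAY / 1e6

print(f"{daily_tb:.2f} TByte/day")  # ~3.02, matching the quoted 3 TByte/day
```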


