
Supercomputer assists in crunching LHC data

Fri, 04/05/2013 - 9:42am

This image of a supersymmetry event shows the transverse momentum imbalance due to dark matter particles escaping the detector (direction indicated by the red arrow). Red and blue rectangles indicate energy deposited in the electromagnetic and hadronic calorimeters, respectively; green tracks in the center show charged particles with transverse momentum larger than 2 GeV. Yellow-outlined triangles indicate jet cones or the presence of subatomic particles called quarks. Image: Matevz Tadel, UC San Diego/CMS

Gordon, the unique supercomputer launched last year by the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, recently completed its most data-intensive task so far: rapidly processing raw data from almost one billion particle collisions as part of a project to help define the future research agenda for the Large Hadron Collider (LHC).

Under a partnership between a team of UC San Diego physicists and the Open Science Grid (OSG), a multidisciplinary research partnership funded by the U.S. Department of Energy (DOE) and the National Science Foundation, Gordon has been providing auxiliary computing capacity by processing massive data sets generated by the Compact Muon Solenoid, or CMS, one of two large general-purpose particle detectors at the LHC used by researchers to find the elusive Higgs particle.

“This exciting project has been the single most data-intensive exercise yet for Gordon since we completed large-scale acceptance testing back in early 2012,” says SDSC Director Michael Norman, who is also an astrophysicist involved in research studying the origins of the universe. “I’m pleased that we were able to make Gordon’s capabilities available under this partnership between UC San Diego, the OSG, and the CMS project.”

The around-the-clock data processing run on Gordon was completed in about four weeks’ time, making the data available for analysis several months ahead of schedule. About 1.7 million core hours—or about 15% of Gordon’s total compute capacity—were dedicated to this task, with more than 125 terabytes of data streaming through Gordon’s nodes and into SDSC’s Data Oasis storage system for further analysis. Just one terabyte of data, or one trillion bytes, equals the information printed on paper made from 50,000 trees.
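The quoted figures hang together on a back-of-the-envelope check. Assuming Gordon's roughly 16,384 compute cores (a launch-era specification not stated in this article), dividing the 1.7 million core hours by four weeks of around-the-clock running gives the average number of cores in use, and that in turn matches the stated 15% share of the machine:

```python
# Sanity check of the figures quoted above.
# GORDON_CORES is an assumption: Gordon was described at launch as
# having 1,024 compute nodes with 16 cores each (~16,384 cores total).
GORDON_CORES = 16_384          # assumed total compute cores
RUN_HOURS = 4 * 7 * 24         # "about four weeks" running around the clock
CORE_HOURS = 1.7e6             # core hours dedicated to the CMS task

avg_cores_used = CORE_HOURS / RUN_HOURS          # average cores busy at any time
share_of_machine = avg_cores_used / GORDON_CORES # fraction of Gordon used

print(f"Average cores in use: {avg_cores_used:.0f}")       # ~2530 cores
print(f"Share of Gordon's capacity: {share_of_machine:.1%}")  # ~15.4%
```

The result, roughly 2,500 cores running continuously, comes out at about 15% of the assumed machine size, consistent with the article's figure.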

“Access to Gordon, and its excellent computing speed due to its flash-based memory, really helped push forward the processing schedule for us,” says Frank Wuerthwein, a professor of physics at UC San Diego and a member of the CMS project. “With only a few weeks’ notice, we were able to gain access to Gordon and complete the runs, making the data available for analysis in time to provide crucial input toward international planning meetings on the future of particle physics.”

“Giving us access to the Gordon supercomputer effectively doubled the data processing compute power available to us,” adds Lothar Bauerdick, OSG’s executive director and the U.S. software and computing manager for the CMS project. “This gives CMS scientists precious months to get to their science analysis of the data reconstructed at SDSC.”

The UC San Diego-OSG collaboration comes as the LHC undergoes a two-year shutdown for upgrades, having gone offline in February 2013. One major activity during the shutdown is developing plans for efficient, effective searches once the LHC is back in operation. To do that—and to have enough time to upgrade equipment—researchers must also sift through massive amounts of stockpiled data to help define future research agendas.

“Unfortunately, the shutdown schedule meant that the parked data would not be available for analysis this summer, and possibly not even for deriving meaningful contributions to planning documents for future upgrades of the experiment that are due this fall,” explains Wuerthwein.

The hunt for dark matter
With the recent discovery and later confirmation in March of the Higgs boson—the last missing piece of the standard model of particle physics—scientists are now setting their sights on discovering new physics beyond the standard model. The next big thing is to search for dark matter, according to Wuerthwein.

“For the Higgs, we knew exactly how to search for it given theoretical predictions based on past experimental results,” says Wuerthwein, who is heading up the search for dark matter for the entire CMS team. “For dark matter, the situation is much more hazy. We hope to produce dark matter at the LHC in cascade decays of a whole spectrum of new fundamental particles, the lowest mass of which is dark matter. But the details of this spectrum of masses are unknown. To have sensitivity to a larger range of possible mass spectra, we needed to write more data to tape so we would be able to carefully analyze it later.”

The origin of this spectrum of new fundamental particles is a new kind of symmetry of nature called Supersymmetry, or SUSY. “Underlying this symmetry is a fascinating but theoretical conjecture with little to no physical evidence so far,” notes Wuerthwein. “It's fascinating because it could provide an ordering principle that allows for all known physical forces to be unified during the earliest times of the ‘Big Bang’ or birth of the universe, while providing an explanation for dark matter, and resolving some of the outstanding questions about details of the Higgs mechanism and mass.”

Source: University of California, San Diego
