Subsystem: Data and Computing Systems
The Data and Computing Systems (DCS) subsystem must acquire the data stream from the LIGO interferometers and share these data with the 1,000-strong LIGO Scientific Collaboration and our sister collaboration, Virgo, with a significant fraction of the data distributed in real time. Additionally, DCS resources must enable the computation necessary to search for gravitational-wave signals.
Different gravitational-wave signal types require different analysis approaches. Consider, for instance, a binary coalescence: two compact objects such as neutron stars or black holes circling each other, drawing closer and finally merging into a single, more massive object such as a black hole. This event should produce a well-defined gravitational-wave signature known as a chirp, sweeping from low frequencies (wherever the instrument has the sensitivity to "hear" them, perhaps at 20 or 50 Hz) up to the frequency of coalescence (depending on the mass, from a few hundred Hz for black holes to around a kHz for neutron stars). The time series below shows a model chirp (referred to as a template) with no noise. LIGO slides a chirp template such as this along the time-series data from the interferometers, looking for a match. The analysis code flags any matches it finds in the data stream, and scientists subject these matches to further detailed scrutiny.
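The template-matching idea can be sketched in a few lines of NumPy. Everything here is illustrative rather than LIGO's actual pipeline: the "chirp" is a toy frequency sweep, not a physical waveform, the noise is white, and real searches whiten the data and filter in the frequency domain.

```python
import numpy as np

# Toy matched-filter search: slide a chirp template along a noisy
# time series and look for the offset where the overlap peaks.

fs = 4096                        # sample rate in Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)    # 1 s template duration

# Toy "chirp": instantaneous frequency sweeping 50 Hz -> 400 Hz.
template = np.sin(2 * np.pi * (50 * t + 175 * t**2))
template /= np.linalg.norm(template)   # unit-normalize the template

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 8 * fs)    # 8 s of white "detector noise"
inject_at = 3 * fs                     # bury a signal 3 s in
data[inject_at:inject_at + len(template)] += 8 * template

# Overlap of the template with the data at every time offset.
snr = np.correlate(data, template, mode="valid")

peak = int(np.argmax(np.abs(snr)))
print(peak, abs(snr[peak]))   # peak offset should land at the injection
```

Because the template is unit-normalized and the noise has unit variance, the correlation at a wrong offset is roughly a standard Gaussian, while at the injection it is near the injected amplitude, which is why a simple threshold separates signal from noise here.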
Because the chirp signature depends on the astrophysical objects' masses, their spins, and the angle from which the detectors view the event, LIGO must run thousands of templates over the data. Happily, the problem can be divided among many computers (CPU cores): LIGO can assign one template to each core. The code computes the overlap of the data stream with that template and returns the result to a master computer, which assembles the results from all of the cores. This is known endearingly as an "embarrassingly parallel" computational problem, meaning that LIGO can attack it with garden-variety CPUs and ordinary communications protocols (gigabit Ethernet). GPUs (graphics processing units) are becoming more numerous in LIGO's computing architecture; these relatively inexpensive processors (prevalent in gaming hardware) can be repurposed for fast parallel computing.
In addition to the pure problem of searching for gravitational-wave signals, DCS must provide a deep and robust detector-characterization capability. Noise sources within LIGO's interferometers can sometimes imitate a gravitational-wave signal. LIGO must continuously search thousands of detector channels alongside the gravitational-wave channel, looking for noise episodes that contaminate the data.
The new DCS program has placed modest clusters of CPUs at the LIGO Hanford (LHO) and LIGO Livingston (LLO) observatories. These are principally used for detector characterization, now a significantly expanded activity given the increased complexity of the advanced detectors' hardware configurations. The main cluster, at Caltech, is devoted to analysis. The LIGO Data Grid combines the aLIGO DCS computers with other clusters around the world to handle the huge computing load of the gravitational-wave analysis from the LIGO and Virgo detectors. LIGO continues to make preparations to utilize computing resources associated with large projects such as the Extreme Science and Engineering Discovery Environment (XSEDE) and the Open Science Grid (OSG). And of course Einstein@Home will remain a key computing asset for LIGO's pulsar data searches. Register for Einstein@Home today!
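One simple detector-characterization operation is a coincidence veto: discard candidate triggers in the gravitational-wave channel that line up in time with loud excursions in an auxiliary channel (say, a seismometer), since such coincidences point to an instrumental origin. The sketch below uses made-up data; the 5-sigma threshold and 0.1 s coincidence window are arbitrary choices for illustration.

```python
import numpy as np

fs = 256                              # sample rate in Hz (illustrative)
rng = np.random.default_rng(2)
gw_chan = rng.normal(0, 1, 60 * fs)   # 60 s of strain-like data
aux_chan = rng.normal(0, 1, 60 * fs)  # e.g. a seismometer channel

# Inject a "glitch" that appears in BOTH channels at t = 20 s:
# instrumental noise, not an astrophysical signal.
glitch = 20 * fs
gw_chan[glitch] += 10
aux_chan[glitch] += 10

def triggers(x, thresh=5.0):
    # Sample indices where the channel crosses the threshold.
    return np.flatnonzero(np.abs(x) > thresh)

window = int(0.1 * fs)                # coincidence window: 0.1 s
aux_times = triggers(aux_chan)

# Veto any gravitational-wave trigger with an auxiliary trigger nearby.
vetoed = [t for t in triggers(gw_chan)
          if np.any(np.abs(aux_times - t) <= window)]

print(vetoed)                         # the injected glitch gets vetoed
```

Real detector characterization does this across thousands of auxiliary channels at once, with statistical safeguards to ensure a veto could not also remove genuine astrophysical signals.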
A DCS cluster
A single DCS rack of nodes
A DCS air conditioner