Southern California Earthquake Data Center

About

History of the Data Center

The SCEDC facility was initiated in October 1991 as part of the Southern California Earthquake Center (SCEC). At that time, our primary mission was to archive the seismograms recorded by the Southern California Seismic Network (SCSN) and to provide a simple mechanism for scientists to retrieve these data online. User access to this archive was initially through direct login to the Data Center machines via research accounts. When the Data Center came online in January 1992, we had 95 scientific users and 214 GB of seismic and geodetic data, which included over 200 GB of short-period network data recorded by the SCSN from April 1981 to 1992 and the TERRAscope triggered waveforms from September 1990 forward.


The SCEDC translated thousands of computer tapes from the Caltech/USGS Seismic Processing (CUSP) system into a custom ASCII database containing earthquake parameter data, with pointers to the waveform data. The triggered seismograms for over 300,000 earthquakes were stored on an Internet-accessible 0.6 TB optical WORM mass-storage system. Parametric data (e.g., hypocenters and phase picks) were made available through the Web; these public interfaces proved unexpectedly popular and reached a much larger audience than anticipated when the Data Center came online in 1992. The demographics of the users accessing the SCEDC changed significantly throughout the 1990s. While access by individual researchers via the Web, "finger quake", and individual research accounts increased steadily, access to the SCEDC from the ".net" and ".com" sectors of the Internet exploded. The Data Center responded to this latter group by providing Web and other interfaces to the most commonly referenced parts of the archive, e.g., earthquake catalogs, seismicity reports, and recent earthquake listings.


The year 1999 was a major period of transition. Waveform data was transferred from the 12" Sony 610 WORM jukebox archive to the new DISC 5 1/4" WORM archive. The dbsort/scecgram database system was replaced by a combined database/archiving system that required the SCEDC to be directly coupled to the seismic processing system, resulting in a more interdependent relationship between the real-time and data-analysis functions of TriNet and the Data Center. The new database system was designed to facilitate a seamless exchange of waveform and parametric data with the Northern California Earthquake Data Center (NCEDC).


Until late 1999, when TriNet, the modern digital broadband array, came online, the SCEDC archive consisted primarily of short-period, 100 sample-per-second waveforms from the SCSN. Since September 1999, the Data Center archive has contained waveforms from over 150 broadband and 200 accelerometer instruments, as well as the original SCSN short-period vertical stations. In the fall of 1999, we began to archive 20 sps waveform data continuously from the TriNet digital stations in SEED format. Data transfer between the monitoring network and the Data Center has also changed significantly since its inception. A time delay of a few days used to be the standard for new data to become available at the SCEDC. With the advent of the TriNet system and changes in the daily operations and archiving of the Data Center, new earthquake data are now available to the community in near real-time.


Since 2003, the Data Center has archived all incoming waveform data, and has converted all historic waveform data, onto RAID magnetic disk storage hosted on Linux systems. The initial motivation for this change in archival hardware was to reduce storage cost, but it also has the advantage of nearly eliminating the time latency associated with retrieving data from a robotic mass-storage system.


In 2003, the SCEDC completed compiling and converting the remainder of the historic seismic data, so that there is now a single source for online access to southern California earthquake data from 1932 to the present. Older parametric data have been loaded into the SCEDC Oracle 9i database, and waveform data have been converted into the modern archival format and are available for download via STP. The CUSP data sets from 1981 to 2000 have been converted from the original VAX system into the modern archival format; the parametric data have been loaded into the SCEDC Oracle database to form a continuous catalog from 1932 to the present, and the waveform data from 1981 onward have been converted to mSEED format. These historic data include events from 1981 and 1983 that have never been accurately timed using the electronic data. Because the data have been converted to the modern format, the analysts work with the same data as the scientists, so the results of their work are immediately available to the scientific community with no translation errors.


Today, the Data Center provides access to a large, well-maintained Oracle database of earthquake data for a very active seismic region, with near real-time access to data approximately 2-3 minutes after an event. The SCEDC currently archives nearly 3,000 data channels from 400 stations, processing and archiving an average of 15,000 earthquakes each year.


The objectives of the Southern California Earthquake Data Center are to:


  • Maintain the primary online, near "real-time" archive of seismological data for southern California and add new data as it becomes available from the seismic network.
  • Archive and distribute the seismic data required for scientists to develop a comprehensive, physics-based understanding of earthquake phenomena in southern California.
  • Add other data to the archive of seismological data as it becomes available.
  • Develop new interfaces, and improve existing interfaces, that integrate data and data products, and integrate with other data centers, such as the Northern California Earthquake Data Center (NCEDC) and the IRIS DMC, through both the standard interfaces currently in place and new ones.


The mission of the SCEDC is to maintain an easily accessible, high-quality, searchable archive of earthquake information for research in seismology and earthquake engineering. The database and archive of earthquake phases will provide insight into the structure of the Earth. The high-fidelity records of ground shaking during earthquakes will help clarify the earthquake source and will document the levels of shaking earthquakes produce and the levels that buildings, both damaged and undamaged, endure, providing the knowledge policy-makers and planners need to build a resilient civil infrastructure. Researchers studying foreshocks and early aftershocks have recently benefited from the triggered archiving of continuous high-frequency broadband recordings for several hours before and after significant events. STP (the Seismic Transfer Program) provides data in a variety of formats, including mSEED for seismologists and COSMOS-V0 for engineers.
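
For readers who want a concrete picture of programmatic access to the archive, the short sketch below retrieves a few minutes of broadband waveform data and writes it out as mSEED. It uses the FDSN web-service interface through the ObsPy Python library rather than STP itself, and the network, station, channel, and time window shown are illustrative assumptions, not prescriptions from the text above.

    # Minimal sketch: fetch SCEDC waveforms via ObsPy's FDSN client (an
    # alternative to STP); station, channel, and time window are illustrative.
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("SCEDC")  # SCEDC's public FDSN web services

    # Ten-minute window starting near the 2019 Ridgecrest M7.1 origin time (UTC).
    t0 = UTCDateTime("2019-07-06T03:19:00")
    st = client.get_waveforms(network="CI", station="PASC", location="*",
                              channel="BH?", starttime=t0, endtime=t0 + 600)

    # Write the data in mSEED, the archive's modern waveform format.
    st.write("ci_pasc_example.mseed", format="MSEED")
    print(st)

The same client also exposes a get_events call for querying the earthquake catalog, which parallels the hypocenter and phase information described above.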


In response to users' requests, the Data Center has put significant effort toward standardizing catalogs and developing uniform access interfaces to other data centers. The goal of this work is to make the data as accessible, and as widely used in the scientific research community, as possible, and the Data Center continues to work cooperatively with other regional data centers toward greater standardization.

