How are earthquakes measured?
The strength of earthquakes is measured using two main scales: the Richter scale and the Moment Magnitude Scale (MMS).
Richter Scale: Developed by Charles F. Richter in 1935, this scale assigns a magnitude based on the amplitude of seismic waves recorded on a seismograph. It is a logarithmic scale, meaning each whole-number increase represents a tenfold increase in measured wave amplitude and roughly 31.6 times more energy release. For example, a magnitude 5.0 earthquake releases approximately 31.6 times more energy than a magnitude 4.0 earthquake. The Richter scale was used for many years but has been largely replaced by the Moment Magnitude Scale.
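The "31.6 times more energy" figure comes from the empirical magnitude-energy relation, in which the logarithm of the released energy grows by about 1.5 for each unit of magnitude, so the energy ratio between two magnitudes is 10 raised to 1.5 times the magnitude difference. A minimal sketch of that arithmetic (the function name is my own):

```python
def energy_ratio(m1: float, m2: float) -> float:
    """Approximate ratio of seismic energy released by a magnitude-m2
    earthquake relative to a magnitude-m1 earthquake, using the
    empirical relation log10(E) ~ 1.5 * M + constant."""
    return 10 ** (1.5 * (m2 - m1))

print(energy_ratio(4.0, 5.0))  # one whole-number step: ~31.6 times the energy
print(energy_ratio(4.0, 6.0))  # two steps: ~1000 times the energy
```

Note that 31.6 is simply 10 to the power 1.5, which is why one magnitude step multiplies the energy by about 31.6 while two steps multiply it by about 1000.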
Moment Magnitude Scale (MMS): The MMS, developed in the 1970s by Thomas Hanks and Hiroo Kanamori, is now the most widely used scale for measuring earthquake magnitudes, especially for medium to large ones. This scale is also logarithmic and is based on the earthquake’s seismic moment, a physical quantity computed from the rigidity of the rock, the area of the fault that slipped, and the average amount of slip. Because the seismic moment is tied directly to these physical characteristics of the rupture, the MMS provides a more accurate and comprehensive estimate of earthquake size, especially for larger quakes.
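The conversion from seismic moment to moment magnitude uses the standard Hanks-Kanamori relation: the magnitude is two-thirds of the base-10 logarithm of the moment (in newton-metres) minus 9.1. A short sketch, with a function name and example value of my own choosing:

```python
import math

def moment_magnitude(m0_newton_metres: float) -> float:
    """Moment magnitude Mw from seismic moment M0 (in N*m), using the
    standard Hanks-Kanamori relation Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# A seismic moment of about 1.1e21 N*m corresponds to roughly Mw 8.0.
print(round(moment_magnitude(1.1e21), 1))  # ~8.0
```

Because the relation is logarithmic, a fault that slips over ten times the area with ten times the displacement produces a moment a hundredfold larger but a magnitude only about 1.3 units higher.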
Both scales are based on the measurement of seismic waves recorded on seismographs. However, the Moment Magnitude Scale tends to provide a more accurate representation of the earthquake’s total energy, especially for very large earthquakes, because the amplitude-based Richter scale saturates and underestimates the biggest events.