Researchers from California Institute of Technology (Caltech) have developed the first computer model of an earthquake-producing fault segment that reproduces the available observations of both the fault's fast and slow behaviour.
The model details the entire history of an earthquake-producing fault and the interaction between the fast and slow deformation phases.
For those who study earthquakes, a major challenge has been understanding the full physics of a fault, both during an earthquake and during periods of "rest", in order to anticipate how a particular region may behave in the future.
"Our study describes a methodology to assimilate geologic, seismologic and geodetic data surrounding a seismic fault to form a physical model of the cycle of earthquakes that has predictive power," explained Sylvain Barbot, post-doctoral scholar in geology at Caltech.
Using previous observations and laboratory findings, the team modelled an active region of the San Andreas Fault called the Parkfield segment in central California.
Parkfield produces magnitude-6 earthquakes every 20 years on average. The researchers successfully generated a series of earthquakes (ranging from magnitude 2 to 6) within the computer model, producing fault slip before, during and after the earthquakes that closely matched the behaviour observed over the past 50 years.
The findings also show that a physical model of fault-slip evolution, based on lab experiments that measure how rock materials deform in the fault core, can explain many aspects of the earthquake cycle.
"Earthquake science is on the verge of building models that are based on the actual response of the rock materials as measured in the lab," the authors wrote.
This implies we are getting closer to understanding the physical laws that govern how earthquakes nucleate, propagate and arrest, the authors added in the paper, which appeared in the journal Science.