New algorithm holds promise for earthquake prediction

January 31, 2010

Kees Vuik is a professor, and Mehfooz ur Rehman is a PhD candidate, at Delft University of Technology. The opinions expressed are their own.

The Haiti earthquake was a truly appalling tragedy, and it is little wonder that the United Nations has described it as the worst humanitarian disaster it has faced in its history.  The 2010 earthquake follows several earlier ones, in 1751, 1770, 1842 and 1946, that have struck the island of Hispaniola (the tenth most populous island in the world), which is shared by Haiti and the Dominican Republic.

While world attention is now rightly focused upon the aid effort in the country, much media coverage has so far obscured the fact that the science of earthquake prediction is improving and holds much promise for the next few years.  While this will be of no solace to the people of Haiti, what it means in practice is that scientists might, in the not-too-distant future, be able to provide warnings for at least some similar events, thus helping to minimise loss of life and wider devastation.

Predicting earthquakes was once thought to be impossible owing to the difficulty of calculating the motion of rocky mantle flows.  However, thanks to an algorithm developed at Delft University of Technology, we now know that it is possible to model these underground streams.

Much of the early experimentation with the new algorithm has been based around the North Anatolian Fault.  This is a major active geologic fault which runs along the tectonic boundary between the Eurasian Plate and the Anatolian Plate.

The fault extends westward from a junction with the East Anatolian Fault at the Karliova Triple Junction in eastern Turkey, across northern Turkey and into the Aegean Sea.  The last time there was a major earthquake along this fault line, at Izmit in Turkey in 1999, around 17,000 people were killed.

Our colleagues in Utrecht are currently applying our algorithm to create a model (consisting of some 100 million underground grid points) of the North Anatolian Fault, covering essentially the subsurface of Greece and Turkey down to a depth of 1,000 kilometers.  What this information allows us to ascertain is where the underground stresses are strongest, often a tell-tale sign of the most dangerous potential earthquake trigger points.

As good as this model is, however, it still needs refinement.  This is because the link between earthquakes and underground flows is complex and hard to compute.  In practice, calculating such flows means solving very large systems of equations that couple millions of pressure and velocity values at all of the underground grid points.
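A minimal sketch, assuming a standard Stokes-flow discretisation (illustrative only, not our production solver), gives a flavour of these coupled pressure-velocity systems:

```python
# Minimal sketch (not the authors' solver).  Discretising slow viscous flow
# typically produces a symmetric indefinite "saddle-point" system
#   [ A  B^T ] [u]   [f]
#   [ B  0   ] [p] = [0]
# where u holds velocity values, p pressure values, A is the viscous block
# and B a divergence-like block.  In practice u and p have millions of
# entries; the toy sizes below just make the script run in a second.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n_u, n_p = 200, 50
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n_u, n_u))  # SPD viscous block
B = sp.eye(n_p, n_u, format="csr")   # stand-in divergence block (full row rank)

K = sp.bmat([[A, B.T], [B, None]], format="csr")   # saddle-point matrix
rhs = np.concatenate([np.ones(n_u), np.zeros(n_p)])

# MINRES handles symmetric indefinite systems; at realistic sizes the art
# lies in preconditioning so that the iteration count stays small.
x, info = spla.minres(K, rhs)
print("converged" if info == 0 else f"not converged (info={info})")
```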

This has given rise to the need for solution times that scale linearly with problem size, meaning that doubling the number of grid points should no more than double the computing time, a feat that researchers had previously found too difficult.  While this scaling has now been achieved, thus making the model more accurate and comprehensive, the project’s complexity has increased considerably.

Nevertheless, after finishing off our work on the North Anatolian Fault, Delft and Utrecht Universities intend to try to model the tectonics of the entire earth — a truly ambitious project that will involve perhaps some 1 billion grid points (what we call our “fine grid”).

Making the computations for these 1 billion points will require surmounting yet another major hurdle: the ‘parallel computing’ problem.  That is, as the number of computers in a system grows, communication and coordination overheads generally mean that each additional machine contributes less.  However, the Utrecht team has already been working with a test sample of 500 computers, and we now believe we have mitigated this problem with our algorithm.
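A textbook way to see why adding machines yields diminishing returns is Amdahl’s law (this simple model is illustrative only, not a description of our algorithm): if a fraction f of the work can run in parallel, then p machines give a speedup of at most 1 / ((1 - f) + f / p), so per-machine efficiency inevitably falls as p grows:

```python
# Illustrative sketch: Amdahl's law, one classic reason why adding machines
# yields diminishing returns.  f is the fraction of work that parallelises.
def speedup(p: int, f: float = 0.99) -> float:
    return 1.0 / ((1.0 - f) + f / p)

for p in (1, 10, 100, 500):
    s = speedup(p)
    print(f"{p:4d} machines: speedup {s:7.1f}, efficiency {s / p:6.1%}")
# Even with 99% parallel work, 500 machines give only ~84x speedup (~17%
# efficiency); well-designed algorithms push f as close to 1 as possible.
```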

Despite this breakthrough, computing on the one billion points of the fine grid is a long-term programme and, in order to push forwards in the meantime, we are also working on a technique called ‘coarse grid’ acceleration, sketched below.  Our coarse grid utilises only a small number of sample points across all of the earth’s various strata; obtaining fast, accurate solutions at these sample points, and using them to correct the fine-grid computation, leads to considerable savings in computer time.
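The essence of a coarse-grid correction can be sketched in a few lines (illustrative only; this omits the details of our actual scheme): restrict the fine-grid residual to the sample points, solve the resulting small system exactly, and map the correction back to the fine grid:

```python
# Two-level coarse-grid correction, illustrative only.
import numpy as np

def coarse_grid_correction(A, b, x, R):
    """One correction step.  R (n_coarse x n_fine) restricts fine-grid
    residuals to the coarse sample points; R.T maps corrections back."""
    r = b - A @ x                      # residual on the fine grid
    A_c = R @ A @ R.T                  # small Galerkin coarse operator
    e_c = np.linalg.solve(A_c, R @ r)  # exact solve is cheap on the coarse grid
    return x + R.T @ e_c               # corrected fine-grid approximation

# Toy usage: a 1-D diffusion-like system, with each coarse point formed by
# averaging two neighbouring fine points.
n, nc = 8, 4
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
R = np.kron(np.eye(nc), np.ones(2)) / 2.0
x = coarse_grid_correction(A, np.ones(n), np.zeros(n), R)
```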

Finally, we also plan to implement the algorithm on video cards (GPUs), which can speed up the computation by a factor of 100 or more.
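To give a flavour of such a migration (a hedged sketch using CuPy, a NumPy-compatible GPU library, which is not necessarily the software stack we will use), the same kind of sparse iterative solve can move onto a graphics card almost unchanged:

```python
# Illustrative sketch only: a sparse iterative solve running on a GPU via
# CuPy.  Krylov solvers are dominated by matrix-vector and vector operations,
# which is precisely the kind of work GPUs accelerate.
import cupy as cp
import cupyx.scipy.sparse as csp
import cupyx.scipy.sparse.linalg as cspla

n = 1_000_000
main = cp.full(n, 4.0)        # diagonally dominant test matrix, so CG converges fast
off = cp.full(n - 1, -1.0)
A = csp.diags([off, main, off], offsets=[-1, 0, 1], format="csr")
b = cp.ones(n)

x, info = cspla.cg(A, b, tol=1e-8)   # conjugate gradients, entirely on the GPU
print("converged" if info == 0 else f"not converged (info={info})")
```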

While much more hard work and innovation lie ahead, this new frontier of seismology is therefore genuinely path-breaking and already achieving exciting results.  However, as the Haiti earthquake has painfully reminded us all, true success will only be achieved when we reach the stage at which human lives are saved by applying our research in practice.

Comments

The authors suggest that earthquake prediction suffers mainly from inadequate computational power for modeling. However, many seismologists feel that prediction is limited by lack of knowledge of the fundamental physical processes that take place within fault systems and drive the earthquake processes. Consider the following report from the Fifth International Workshop on Statistical Seismology, Erice, Sicily, Italy, 31 May to 6 June 2007 (ref: EOS, Vol. 88, No. 30, 24 July 2007, p. 302):

“However, because our understanding of the fundamental physical processes that take place within fault systems and drive the earthquake processes is poor (e.g., what is the appropriate frictional behavior of faults? are the tectonic stresses high or low? how are earthquakes triggered? what is the role of fluids? how do earthquakes start or stop?), physics-based earthquake forecast models are currently generally outperformed by purely data driven, statistical models, and even those models remain rather limited in their predictive power”.

Posted by Steve Numero Uno

This is a reply to the comment by Steve Numero Uno.

I partly agree with the comment that prediction is limited by lack of knowledge of the fundamental physical processes. It appears that earthquake prediction needs information from experiments/measurements, theory/modelling and scientific computing. The proposed method enables us to compute much larger problems than before, so the limitations on the scientific-computing side are pushed further back. This implies that the limitations due to experiments and theory now become the more urgent ones.

Professor Vuik: I would agree that efficient computational methods are needed when sufficient physical understanding of earthquake processes is available. Unfortunately such understanding does not yet exist even with respect to the subject region of your study, the North Anatolian Fault. Consider the following comment in this regard from last week’s issue of EOS (reference below):

“Because of concerns for their earthquakes, continental transforms – particularly California’s San Andreas Fault and Turkey’s North Anatolian fault – have been intensely studied. Nonetheless, the link between the largest and most dangerous earthquakes and fault segments along transforms remains elusive. A better understanding of the structural singularities bounding these segments…may clarify why some of them stop earthquake ruptures while others do not.”

The work that you and your colleagues are doing on computational methods is helpful. But we must make clear to others that progress in earthquake prediction depends on better physical understanding of earthquake processes.

Reference: Seeber, L., C. Sorlien, M. Steckler and M.-H. Cormier, 2010, Continental Transform Basins: Why Are They Asymmetric?, EOS (Transactions, American Geophysical Union), Vol. 91, No. 4, 26 January 2010, pp. 29-30.

Posted by Steve Numero Uno