GPS, AIS, and More: Diversifying Tsunami Prediction Technology
Reducing the Human Impact of Disasters
Nine years have passed since the Great East Japan Earthquake of March 11, 2011. Despite all of Japan’s scientific and technological advancement, nearly 20,000 people lost their lives that day, mostly to the resulting tsunami. As a scientist, I cannot help but feel immense regret at that failure.
Japan is no stranger to disaster. In addition to earthquakes, it has long been subject to powerful typhoons, which once killed thousands. The Isewan Typhoon (Typhoon Vera) of 1959, for example, was the most powerful typhoon to strike Japan since World War II, and it claimed more than 5,000 lives.
However, that disaster sparked a revolution in typhoon forecasting. The deployment of the Mount Fuji Radar System, along with new weather satellites, enabled better advance warning and evacuation, leading to a dramatic drop in typhoon deaths. More recently, around 100 people died when Typhoon Hagibis struck eastern Japan in 2019, but without advance warning from radar and weather satellites, the death toll would likely have been many times higher.
Lessons of the 1960 Chile Earthquake
The number of typhoons that hit Japan each year and the level of flooding they bring have not changed significantly since the 1950s, which shows that improvements in prediction technology have succeeded in saving lives. Improvements in the precision of tsunami prediction could likewise reduce their human toll.
A magnitude 9.5 earthquake, the most powerful ever measured, struck Chile in 1960. The tsunami it caused crossed the Pacific Ocean and reached Japan in about one day. Observers in Hawaii had sent information on the tsunami to the Japan Meteorological Agency, which handles tsunami warnings domestically, immediately after the earthquake. However, the JMA issued a warning only after the wave had actually reached Japan’s coastal areas. The great distance from the epicenter should have given Japan nearly 24 hours to evacuate coastal areas, yet 142 people still lost their lives. This drove home the need for a better warning system.
Japan has since strengthened international cooperation in this area, for example by increasing coordination with the Pacific Tsunami Warning Center in Hawaii and sharing more tsunami data. When an earthquake’s epicenter is in the seas near Japan, though—as was the case in 2011—the problem still remains of how to warn people and reduce casualties during the relatively short time before the wave actually hits.
The Difficulty of Tsunami Prediction
Why is it so difficult to predict tsunamis? The short answer is that it is difficult to spot how they start. If we knew where, and in what manner, a tsunami began, we could use computer simulations to calculate its progression quickly and accurately.
I imagine everyone faced homework problems like this one back in high school:
Q: If a ball is dropped from a certain height, how long will it take to hit the ground, and what will its velocity be when it hits?
Even those unfamiliar with physics would know you cannot answer that question without knowing the initial height of the ball. If you know the height, then calculating the answer is a simple matter of Newtonian physics.
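As a concrete illustration (nothing specific to tsunami modeling, just the standard Newtonian formulas), the calculation can be sketched in a few lines of code:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def falling_ball(height_m: float) -> tuple[float, float]:
    """Return (time to reach the ground in s, impact speed in m/s) for a ball
    dropped from rest at the given height, ignoring air resistance."""
    time_s = math.sqrt(2 * height_m / G)  # from h = (1/2) g t^2
    speed_ms = G * time_s                 # from v = g t
    return time_s, speed_ms

# A ball dropped from 20 meters lands after about 2.0 seconds at roughly 20 m/s.
t, v = falling_ball(20.0)
print(f"time = {t:.2f} s, impact speed = {v:.1f} m/s")
```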
To extend the analogy to tsunami prediction, the tsunami’s initial state plays the role of the ball’s height. If we knew how a tsunami began (in other words, where and by how much the sea surface above the epicenter rose just after the earthquake), we could use accurate simulations to predict its size and movement.
Database Tsunami Prediction
It is no simple matter to spot that beginning, however. Using satellites to keep vast stretches of the sea under constant visual observation, in order to spot a tsunami that could appear anywhere at any time, is a daunting challenge in terms of both technology and cost.
Current prediction approaches also have their limitations. The conventional method of predicting tsunami propagation works backward from an earthquake’s epicenter and magnitude. There are two main types of seismic wave: P waves (compression waves), which travel at roughly 7 kilometers per second, and S waves (shear waves), which travel at roughly 4 kilometers per second. Measuring these waves, and the lag between their arrivals, at multiple observation points allows a quake’s epicenter and magnitude to be calculated within minutes.
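To give a rough sense of the arithmetic involved, here is a single-station sketch using the approximate wave speeds quoted above; real seismic networks combine many stations and far more refined velocity models.

```python
# Single-station sketch: estimating the distance to an epicenter from the lag
# between P-wave and S-wave arrivals, using the approximate speeds quoted above.
V_P = 7.0  # approximate P-wave speed, km/s
V_S = 4.0  # approximate S-wave speed, km/s

def distance_from_sp_lag(lag_seconds: float) -> float:
    """Distance d (km) satisfying d/V_S - d/V_P = observed S-P lag."""
    return lag_seconds * (V_P * V_S) / (V_P - V_S)

# An S-P lag of 30 seconds corresponds to an epicenter roughly 280 km away.
print(f"{distance_from_sp_lag(30.0):.0f} km")
```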
The JMA has estimated potential tsunamis based on tens of thousands of possible locations and magnitude combinations. By comparing the epicenter and magnitude of an actual quake to the information on file, it can offer a prediction of how a tsunami from the event will propagate.
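In outline, this resembles a nearest-match search over precomputed scenarios. The sketch below is purely illustrative: the scenario entries and the crude matching rule are invented, and the JMA’s actual database and criteria are far more detailed.

```python
# Purely illustrative sketch of a scenario-database lookup: choose the
# precomputed tsunami scenario whose epicenter and magnitude best match the
# estimates for an actual quake. The entries and the composite "distance"
# below are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Scenario:
    lat: float
    lon: float
    magnitude: float
    predicted_wave_height_m: float  # precomputed by simulation

SCENARIOS = [
    Scenario(38.0, 143.0, 7.5, 2.0),
    Scenario(38.0, 143.0, 8.0, 4.0),
    Scenario(36.0, 141.5, 7.0, 1.0),
]

def closest_scenario(lat: float, lon: float, magnitude: float) -> Scenario:
    """Return the stored scenario nearest to the estimated epicenter and magnitude."""
    return min(
        SCENARIOS,
        key=lambda s: (s.lat - lat) ** 2 + (s.lon - lon) ** 2 + (s.magnitude - magnitude) ** 2,
    )

# Matches the second scenario above and returns its precomputed wave height.
print(closest_scenario(38.1, 142.9, 7.9))
```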
However, this database approach depends on calculating the epicenter and magnitude from P and S waves, which is accurate only for earthquakes up to around magnitude 8. Since the magnitude of megaquakes like the March 11 disaster cannot be quickly and accurately determined this way, the JMA’s initial tsunami estimate misread the danger, basing its evaluation on a magnitude 8 temblor.
Thus, despite the vast scope of data available to the JMA, it ended up underestimating the earthquake by a full unit of magnitude. A magnitude 9 earthquake releases roughly 32 times the energy of a magnitude 8, so that error made accurate tsunami prediction impossible. In the Great East Japan Earthquake, the failure to measure the magnitude accurately in real time translated directly into an insufficient tsunami warning. Based on the information they had, coastal residents judged that the coming tsunami would not overtop existing seawalls, which slowed evacuation and increased the eventual death toll.
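For reference, that factor of roughly 32 follows from the standard relation between magnitude and radiated seismic energy; a quick check:

```python
# The energy gap between magnitudes follows the standard Gutenberg-Richter
# energy relation: log10(E) = 1.5 * M + 4.8, with E in joules.
def radiated_energy_joules(magnitude: float) -> float:
    return 10 ** (1.5 * magnitude + 4.8)

ratio = radiated_energy_joules(9.0) / radiated_energy_joules(8.0)
print(f"A magnitude 9 quake releases about {ratio:.0f} times the energy of a magnitude 8.")
# -> about 32 times
```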
Tsunami Prediction Post-3/11
Since the Great East Japan Earthquake, the National Research Institute for Earth Science and Disaster Resilience (NIED), a national research and development agency, has established the Japan Trench Earthquake and Tsunami Monitoring Network (S-net), stretching from the seas off Bōsō in Chiba to Tokachi in Hokkaidō, in an effort to prevent future tsunami deaths. The network is already up and running. Its measurement stations combine earthquake and tsunami sensors, and are linked by a total of 5,500 kilometers of undersea fiber-optic cable to share data in real time, 24 hours a day. To prepare for a predicted Nankai Trough earthquake, NIED has also begun deploying the Nankai Trough Seafloor Observation Network for Earthquakes and Tsunamis (N-net) off the coast of Shikoku.
Because these networks place their equipment directly on the seafloor in areas likely to become the epicenters of future earthquakes, they are a particularly powerful tool. However, since the cables must be laid on the seabed and connected directly to the equipment, installation costs are high, and so are maintenance costs.
These cost problems inspired one researcher to devise a way to gauge a tsunami’s state quickly and essentially for free. Associate Professor Inazu Daisuke of the Tokyo University of Marine Science and Technology demonstrated that data from the Automatic Identification System (AIS), which larger vessels are required to carry, can be used to reverse-calculate a tsunami’s initial state, drawing on the changes in ship velocity caused by the passing of the Great East Japan Earthquake tsunami.
AIS transmits information in real time from most ships within a few dozen kilometers of shore, and in the near future satellites should allow this data to be relayed from ships beyond coastal range as well. In other words, nearly every ship at sea could become a tsunami measurement device. The approach may be somewhat imprecise, since the number and distribution of ships are not fixed, but the fact that it uses only existing infrastructure is a real benefit.
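As a highly simplified illustration of the underlying idea, one could flag vessels whose reported speed changes abruptly between AIS position reports. The report format and threshold below are hypothetical, and Professor Inazu’s actual method is a full inverse calculation of the tsunami’s initial state, not a simple anomaly flag.

```python
# Highly simplified illustration: flag ships whose reported speed over ground
# changes abruptly between consecutive AIS reports, which might indicate a
# tsunami current passing beneath them. The report format and threshold are
# hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class AisReport:
    ship_id: str
    timestamp_s: float
    speed_knots: float  # speed over ground reported by the vessel

def flag_speed_anomalies(reports: list[AisReport], threshold_knots: float = 1.5) -> list[str]:
    """Return IDs of ships whose speed jumps by more than the threshold
    between consecutive reports."""
    flagged: list[str] = []
    last_report: dict[str, AisReport] = {}
    for report in sorted(reports, key=lambda r: r.timestamp_s):
        prev = last_report.get(report.ship_id)
        if prev is not None and abs(report.speed_knots - prev.speed_knots) > threshold_knots:
            flagged.append(report.ship_id)
        last_report[report.ship_id] = report
    return flagged
```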
Spotting Tsunami from Space
There is also research, in which I am personally involved, into spotting tsunamis from space rather than at the ocean surface.
The sea-surface bulge of a tsunami pushes the air upward as well, generating slow sound waves in the infrasonic range. Those sound waves reach an altitude of 300 kilometers in about eight minutes.
At that altitude, solar radiation ionizes part of the thin atmosphere, creating a state of matter called plasma. The infrasonic waves generated above the tsunami’s origin set this plasma oscillating as well; if we can measure that oscillation, we can indirectly observe the tsunami’s initial state.
This is where GPS comes in, a technology that has become part of everyday life through car navigation systems and smartphones. GPS satellites orbit at an altitude of 20,200 kilometers, and several are within view of Japanese airspace at any given time. The Geospatial Information Authority of Japan has also set up over 1,300 Continuously Operating Reference Stations for geodetic measurement, which receive constant transmissions from GPS satellites and collate the data at a central location in real time.
The GPS signals received at each reference station are altered as they pass through the plasma layer. Those alterations can be used to measure changes in the plasma, allowing the tsunami’s initial state to be determined accurately. At present it takes around 20 minutes to register a tsunami generated by an earthquake in the magnitude 9 range, and technology to cut this to around 12 minutes is under development.
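To give a flavor of the kind of calculation involved: the two GPS carrier frequencies are delayed by different amounts as they pass through ionospheric plasma, so their differential delay yields the slant total electron content (TEC) along the signal path, and sudden TEC fluctuations can reveal a tsunami-driven disturbance. The sketch below uses the standard dual-frequency TEC formula, but the detection threshold is a placeholder, and operational processing (carrier-phase leveling, bias removal, filtering) is considerably more involved.

```python
# Sketch of the dual-frequency calculation: ionospheric plasma delays the two
# GPS frequencies differently, so the differential delay gives the slant total
# electron content (TEC) along the path from satellite to station.
# The threshold below is a placeholder for illustration only.
F1 = 1575.42e6  # GPS L1 frequency, Hz
F2 = 1227.60e6  # GPS L2 frequency, Hz

def slant_tec(p1_m: float, p2_m: float) -> float:
    """Slant TEC in TEC units (1 TECU = 1e16 electrons/m^2) from the L1 and L2
    pseudoranges in meters (ionospheric delay is ~40.3 * TEC / f^2 meters)."""
    tec_electrons_m2 = (p2_m - p1_m) * (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))
    return tec_electrons_m2 / 1e16

def disturbance_detected(tec_series: list[float], threshold_tecu: float = 0.2) -> bool:
    """Flag a disturbance if TEC changes by more than the threshold between samples."""
    return any(abs(b - a) > threshold_tecu for a, b in zip(tec_series, tec_series[1:]))
```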
The biggest benefit of this technology is its use of currently existing GPS infrastructure. We can establish a system to observe every tsunami in the seas near Japan from space without any initial installation costs. What is more, this technology can also be expanded to cover nations that lack the finances to implement such technological infrastructure on their own. If we can achieve accurate tsunami predictions at low cost, it would be a truly appropriate way for Japan to help repay the vast outpouring of global support it received after its tragic tsunami disaster.
Science, Technology, and the Will to Save Lives
Just as our predecessors advanced storm prediction techniques after the terrible losses of the Isewan Typhoon, we researchers today have a duty to push research and development forward so that, should another massive tsunami strike Japan, the losses of the Great East Japan Earthquake are never repeated.
Since March 2011, many tsunami prediction projects other than the ones described above have been set in motion. All are still under development, but given the difficulty of accurate earthquake and tsunami prediction, it seems clear that a diverse approach mixing different prediction systems, taking advantage of each one’s strengths, would be most effective. Matching complementary systems allows one to cover the shortcomings of another, and thus helps to save even more lives.
At the same time, we must recognize that there is still no “magic bullet” prediction technology. It remains vital for individuals to use their experience, intuition, and daily preparation to help mitigate tsunami disasters. People living in coastal areas should check the location of their nearest elevated safety areas and evacuation centers. If they feel strong shaking, they should be aware of the possibility of a tsunami and evacuate.
Japan is located in an earthquake hotspot, so there is no way to avoid tsunami damage completely. But many scientists are directing their efforts toward reducing the loss of life, even if only by a little.
(Originally published in Japanese. Banner photo: View near Matsukawaura fishing port in Sōma, Fukushima Prefecture, after damage from the March 11, 2011, tsunami. Courtesy of the Fire Science Center.)