The LA Times posted the first news report on this small quake earlier today. What's interesting is that the report was written and posted with the help of Quakebot:
"If that sounds faster than humanly possible, it probably is. While the post appeared under Schwencke's byline, the real author was an algorithm called Quakebot that he developed a little over two years ago."
The LAT had their story up about as fast as the USGS posted the data on their website.
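As described, Quakebot's job is basically to drop the USGS numbers into a story template. Here's a minimal sketch of that idea in Python; the field names follow the USGS GeoJSON earthquake feed format, but the event values and the sentence template are made up for illustration, not taken from the actual Quakebot:

```python
from datetime import datetime, timezone

# One event in the style of the USGS GeoJSON feed (field names assumed
# from that format; the values here are illustrative placeholders).
event = {
    "properties": {"mag": 4.4, "place": "6km NNW of Westwood, California",
                   "time": 1395055077000},               # milliseconds since epoch
    "geometry": {"coordinates": [-118.49, 34.13, 9.9]},  # lon, lat, depth (km)
}

def quake_blurb(ev):
    """Drop the USGS numbers into a fixed sentence template."""
    p = ev["properties"]
    when = datetime.fromtimestamp(p["time"] / 1000, tz=timezone.utc)
    depth_km = ev["geometry"]["coordinates"][2]
    return (f"A magnitude {p['mag']:.1f} earthquake occurred {p['place']} "
            f"at {when:%H:%M UTC}, at a depth of about {depth_km:.0f} km.")

print(quake_blurb(event))
```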
The Atlantic has an article with a bit more detail on Quakebot. To me the real magic isn't Quakebot, which after all just reformats a USGS data release. The more impressive story is how the USGS and cooperating universities and other organizations are able to network all those seismometers and process the data to generate the information Quakebot gets. The software must first recognize that an earthquake has occurred (rather than random noise), pick out the same earthquake on at least two other seismometers, solve a three-dimensional problem to locate the epicenter and depth, compute the magnitude, and do it all in real time.
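The location step boils down to finding the source point and origin time that best fit the arrival times at several stations. Here's a toy sketch of that least-squares idea, assuming a flat 2-D geometry, a constant P-wave speed, and made-up station coordinates and picks; real networks use 3-D velocity models, automated phase pickers, and solve for depth as well:

```python
import math

VP = 6.0  # km/s, assumed constant crustal P-wave speed

# Hypothetical station positions (km) and P-wave arrival-time picks (s)
stations = {"A": (0.0, 0.0), "B": (50.0, 5.0), "C": (20.0, 40.0)}
picks    = {"A": 4.1, "B": 7.5, "C": 4.1}

def misfit(x, y, t0):
    """Sum of squared differences between observed and predicted arrivals."""
    err = 0.0
    for name, (sx, sy) in stations.items():
        travel = math.hypot(x - sx, y - sy) / VP
        err += (picks[name] - (t0 + travel)) ** 2
    return err

# Brute-force grid search over epicenter (x, y) and origin time t0
best = min(
    ((x, y, t0) for x in range(-20, 81, 2)
                for y in range(-20, 81, 2)
                for t0 in [i * 0.2 for i in range(30)]),
    key=lambda p: misfit(*p),
)
print("epicenter ≈ (%.0f km, %.0f km), origin time ≈ %.1f s" % best)
```

With three stations and three unknowns the fit is essentially pinned down; adding more stations over-determines the problem and lets the software reject bad picks, which is part of what the real-time systems have to do automatically.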
As I mentioned upthread, more or less the same approach is used by the tsunami warning centers to decide quickly whether to send out a warning. A couple of years ago I toured the Alaska and West Coast Tsunami Warning Center in Palmer, AK (recently renamed the "National Tsunami Warning Center"). Really big earthquakes shake for several minutes, and one of the staff told me that when the big Japan earthquake happened, they were sending out the initial alert before the ground had stopped shaking in Japan!
Another practical use of this type of technology is to give a short local warning of a big earthquake. It can take 30 seconds or more for the seismic waves to travel from the epicenter to nearby cities, which is enough time for people to duck and cover before the shaking starts. I've read that in Japan, seismometers closer to the epicenter sent warnings to areas farther away, giving people at least some notice.
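The arithmetic behind that lead time is simple: the alert travels at network speed, effectively instantly, while the damaging S waves travel at a few km/s. A rough back-of-the-envelope sketch, using assumed round numbers rather than figures from any particular warning system:

```python
VS_KM_S = 3.5            # assumed S-wave speed, km/s
DETECT_LATENCY_S = 10.0  # assumed time to detect, locate, and issue the alert

def lead_time_s(epicentral_distance_km):
    """Seconds of warning before strong shaking arrives (negative = none)."""
    return epicentral_distance_km / VS_KM_S - DETECT_LATENCY_S

for d in (50, 100, 200, 400):
    print(f"{d:>4} km away: about {lead_time_s(d):5.0f} s of warning")
```

So a city 100 km or so from the epicenter could get a couple of tens of seconds of warning, while places right on top of the fault get little or none.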