Near-Space Study Helping to Predict Storms
Editor’s Note: Every so often there is a point in history where you can mark what life was like before and how different life is after. Such is the case with the “Big Data” movement, which directs the power of high-performance computers and supercomputers to manipulate massive amounts of information, unlocking mysteries across scientific and societal boundaries, from genomics, environmental engineering, and high-energy physics to law enforcement and security. This is one in a series of articles demonstrating how the University of Illinois is a leader in the field of Big Data.
The 2011 Tohoku earthquake that devastated Japan had an epicenter some 19 miles below the earth’s surface and 70 miles from the Japanese mainland. The ensuing tsunami had devastating effects on the east coast of Japan and traveled across the Pacific Ocean. A consortium headed by engineers from the University of Illinois is discovering ways to study these types of events, not by sensing the movement far underground or in the ocean, but by measuring activity in the near-space environment, more than 100 miles above the earth’s surface. The near-space environment consists of both the ionosphere (plasma) and the neutral atmosphere (non-ionized molecules).
“Although the quality of the measurement from a single GPS receiver may not be that high, when you put all the stations together, you start to see some phenomena and structures that could previously only be seen by using seismometers or ocean buoys,” said Jonathan Makela, an associate professor in electrical and computer engineering, who is leading the study on campus. “On the other hand, the imaging systems provide a view of a large swath of the ionosphere from a single instrument, but the interpretation of the data from the imagers can be a bit more complicated.”
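The idea that many individually noisy receivers can reveal a structure no single receiver resolves can be illustrated with a minimal simulation. The numbers below (a 0.05-unit wave buried in noise ten times larger, 1,200 receivers as in the Japanese network mentioned later) are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 3600, 600)            # one hour of data, 6-second cadence

# Hypothetical traveling ionospheric disturbance: weak amplitude,
# 15-minute period (illustrative numbers only).
wave = 0.05 * np.sin(2 * np.pi * t / 900)

def receiver_series():
    """One simulated receiver: the wave buried in noise 10x its amplitude."""
    return wave + 0.5 * rng.standard_normal(t.size)

single = receiver_series()
stack = np.mean([receiver_series() for _ in range(1200)], axis=0)

# Correlation with the true wave: near zero for one receiver,
# close to 1 for the 1,200-receiver stack (noise falls as 1/sqrt(N)).
corr_single = np.corrcoef(single, wave)[0, 1]
corr_stack = np.corrcoef(stack, wave)[0, 1]
print(f"single receiver r = {corr_single:.2f}, stacked r = {corr_stack:.2f}")
```

The design point is simply that averaging N independent receivers shrinks the noise by a factor of sqrt(N), which is what lets a dense network act as one large sensor.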
“The ionosphere is actually the sensor that tells us what is going on below,” Makela explained. “We were able to see waves in the ionosphere two hours before the tsunami got to Hawaii, which is more lead time than provided by the ocean buoys. In the case of the tsunami, the wave front in the airglow layer was in front of the waves of the ocean below.”
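The lead time Makela describes follows from basic tsunami physics: in the shallow-water approximation a tsunami travels at sqrt(g x depth), while a wide-field imager can see the wave's ionospheric signature far offshore, long before the surface wave reaches land. The sketch below assumes a 4,000 m mean Pacific depth and a 1,000 km imager viewing range; both are illustrative values, not figures from the article:

```python
import math

g = 9.81          # gravitational acceleration, m/s^2
depth = 4000.0    # assumed mean Pacific depth, m (illustrative)

# Shallow-water approximation: tsunami phase speed v = sqrt(g * h)
v = math.sqrt(g * depth)                      # roughly 200 m/s

# Assuming an airglow imager resolves the ionospheric perturbation
# ~1,000 km offshore, the warning lead time is distance / speed:
view_range_km = 1000.0
lead_time_h = (view_range_km * 1000.0) / v / 3600.0
print(f"speed ≈ {v:.0f} m/s, lead time ≈ {lead_time_h:.1f} h")
```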
Funded through a grant from an Office of Naval Research Basic Research Challenge, Makela is collaborating with colleagues at the Institut de Physique du Globe de Paris, Northwest Research Associates, and ASTRA, LLC. Together, they are creating a new type of physics-based model coupling the ocean through the atmosphere into the ionosphere. “These three domains have traditionally been treated separately, and coupling them together is quite a challenge,” said Makela. “However, this is a necessary step in terms of developing a model through which we can understand and interpret our data.”
Makela is also leading the deployment of new instrumentation to provide additional data for future tsunamis, including a new imaging system in Tahiti and a new radar system in Hawaii. “Until recently there hasn’t been a wealth of measurements of the necessary parameters to be able to study this phenomenon,” Makela said. “This is especially true in the Pacific Ocean. Whereas the GPS network in Japan has over 1,200 receivers capable of observing the ionosphere, there are only around 50 in Hawaii. These receivers don’t provide the same density of data that our imaging system does. The real power, however, comes from considering both types of data together, allowing them to operate as a single sensing system.”
“This is where Big Data and data science come into play,” Makela said. “Multiprocessors (which allow a high volume of computations) are important in analyzing the various data sets efficiently and eventually assimilating them into models being developed to come to an understanding of the physical process through which the tsunami is visible in the ionosphere. It’s not as computationally burdensome as some of the genomics applications, but our computational demands will grow as we start to exercise more complex models and the number and diversity of sensors explodes. Computational capability is an added concern as we consider a real-time warning system based on ionospheric sensing.”
These types of sensors also provide a wealth of data useful for studying other types of phenomena in the ionosphere system and how it reacts to solar storms. “As you get a large coronal mass ejection coming from the sun, the energy funnels through the earth’s magnetic field and some of the energy is converted into heat in the atmosphere,” Makela explained. “As it heats up, the neutral portion of the atmosphere is going to flow from hot to cold. In storms, we see a large surge in the wind traveling from the polar region over the U.S. to the equator and beyond.”
Funded by the National Science Foundation, Illinois is one of a consortium of research groups that make up the North American Thermosphere-Ionosphere Observing Network (NATION), joining research groups from Clemson University, the Pisgah Astronomical Research Institute in South Carolina, the University of Michigan, Eastern Kentucky University, and Virginia Tech. The most novel data from this network come from Fabry-Perot interferometers, optical instruments with high spectral resolution that reveal the nature of the light emitted by chemical reactions in the upper atmosphere. By studying how wide that spectrum is, researchers can get a measurement of temperature, and by studying the relative Doppler shift (relative frequency shift) of that light, they can get a measurement of the motion of that region of the atmosphere (some 250 kilometers overhead).
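The two Fabry-Perot measurements described above have simple textbook forms: a Doppler shift of the line center gives the line-of-sight wind (v = c Δλ/λ₀), and the thermal Gaussian width of the line gives the neutral temperature. A minimal sketch, using the 630.0 nm atomic-oxygen red line commonly observed by such instruments; the specific shift and width values are illustrative, not measurements from NATION:

```python
import math

C = 2.998e8            # speed of light, m/s
K_B = 1.381e-23        # Boltzmann constant, J/K
M_O = 16 * 1.661e-27   # mass of atomic oxygen, kg (the red-line emitter)
LAMBDA0 = 630.0e-9     # rest wavelength of the OI red line, m

def wind_from_shift(delta_lambda):
    """Line-of-sight wind from the Doppler shift of the line center."""
    return C * delta_lambda / LAMBDA0

def temperature_from_width(fwhm):
    """Neutral temperature from the thermal (Gaussian) line width:
    FWHM = lambda0 * sqrt(8 ln2 * k T / (m c^2)), solved for T."""
    return M_O * C**2 * fwhm**2 / (8 * math.log(2) * K_B * LAMBDA0**2)

# Illustrative inputs: a ~0.2 pm shift corresponds to winds of order
# 100 m/s, and widths of a few picometers to temperatures near 1000 K.
print(f"wind ≈ {wind_from_shift(2.1e-13):.0f} m/s")
print(f"temperature ≈ {temperature_from_width(3.6e-12):.0f} K")
```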
Makela admits that much of the process is a sit-back-and-wait approach until there is an event to study. The stars aligned with an Oct. 2, 2013, storm.
“It wasn’t a special storm, but having the entire NATION network operating allowed us to analyze the data in a way that no one else has been able to do, providing us with some unexpected results, more accurate than previous observations,” said Makela, whose group is close to publishing its findings.
Studying the near-space environment is useful not only for explaining and predicting space weather events, but also for minimizing disturbances to satellite transmissions and malfunctions or anomalies in the power grid.
“A lot of technological infrastructure depends on what’s going on in the ionosphere,” Makela said. “Having this distributed sensing network helps us determine what’s a spatial change and what’s a temporal change, which in turn allows us to understand the physics a lot better and develop improved models. As the number of measurements steadily increases, we should see a step forward in our ability to understand and, more importantly, predict space weather.”
Makela also indicates that there has been some preliminary talk of using GPS capabilities on civilians’ cell phones as crowd-sourced networks to make additional ionospheric measurements in the future. More instruments mean more data to be analyzed. “The capability exists for civilians to collect data that would be useful in these pursuits, but some work needs to be done to bring the cost and power usage down before it would be integrated into cell phones.”
“The days of a single person going out there with their instrument, making a measurement and being able to discover something are ending,” he said. “The ionosphere is the biggest error source when it comes to satellite navigation and can disrupt satellite communication networks. If you don’t have a good model of what the ionosphere is doing, you’ll have some uncertainties. Our research will lead to more resilient and robust satellite communication systems, a better understanding of uncertainties on GPS and the ability to forecast outages, which is important for the airlines, the power grid, military and all the people who rely on radio communication.”