Weather, it's said, is what people discuss when there is nothing else to talk about. Recent events, however, have elevated this time-killing topic to new heights: the recent climate summit in Kyoto, Japan, hosted representatives of 150 nations.
With record hurricanes, evidence of rising global temperatures, marked fluctuations in the ozone layer, uncharacteristic weather patterns, and an almost fanatical focus on the El Niño phenomenon, scientists, politicians, and laymen alike want to know just what the weather will be, and why. Sensors, simulation systems, computer models, and data-acquisition equipment are among the technologies helping answer those questions.
Water-vapor wizard. The University Corporation for Atmospheric Research (UCAR) in Boulder, CO, found a way to exploit the effects of atmospheric distortion on Global Positioning System (GPS) signals to determine the amount of water vapor in the air. "Water vapor changes the atmosphere's index of refraction--it slows down the GPS-satellite transmission's electromagnetic waves," says Chris Rocken, UCAR's head scientist for GPS research. "The more water vapor, the slower the signal."
What's so important about water vapor? Scientists and forecasters consider it a vital link in understanding and predicting weather and climate change. "It's the most important atmospheric constituent for the transfer of energy," says Rocken, "and it has also been one of the most poorly measured and described."
Current methods of gathering water-vapor information include radiosonde balloons and reports from commercial airliners. The first is limited to discrete locations and times; the second provides data only along flight routes. UCAR's GPS-based method offers the potential to gather worldwide data from any point desired.
It works by comparing the theoretical travel time with the actual, measured transmission time for a signal broadcast from a GPS satellite to a known point on earth. Any difference between the two--in the neighborhood of 8.25-8.50 nanoseconds--is due to the refractive effect of the atmosphere. And of that tiny delay, only about an eighth is due to water vapor.
To compute the precipitable water vapor held by the atmosphere in the zenith direction at a tracking station, each station processes multiple GPS satellite signals and applies a little trigonometry. Over a given point, the atmosphere might hold 30-40 cm of water in vapor form. UCAR's system can measure that amount to an accuracy of better than 2 mm.
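The arithmetic behind that zenith computation can be sketched in a few lines. The 0.15 conversion factor between zenith wet delay and precipitable water vapor, and the simple 1/sin(elevation) mapping, are textbook approximations standing in for UCAR's actual multi-satellite processing:

```python
import math

# Rough rule of thumb: precipitable water vapor (PWV) is about 15% of the
# zenith wet delay (the exact factor varies with atmospheric temperature).
# This factor and the flat-atmosphere 1/sin(elevation) mapping are
# illustrative assumptions, not UCAR's actual algorithm.
PWV_PER_WET_DELAY = 0.15
SPEED_OF_LIGHT_M_PER_NS = 0.299792458  # meters per nanosecond

def zenith_pwv_mm(slant_delay_ns: float, wet_fraction: float,
                  elevation_deg: float) -> float:
    """Estimate zenith precipitable water vapor in millimeters.

    slant_delay_ns: measured excess GPS travel time along the slant path
    wet_fraction:   portion of the delay caused by water vapor (~1/8)
    elevation_deg:  satellite elevation above the horizon
    """
    slant_path_m = slant_delay_ns * SPEED_OF_LIGHT_M_PER_NS
    # Map the slant wet delay back to the zenith direction.
    zenith_wet_delay_m = (slant_path_m * wet_fraction
                          * math.sin(math.radians(elevation_deg)))
    return zenith_wet_delay_m * PWV_PER_WET_DELAY * 1000.0  # m -> mm

# An overhead pass with the article's ~8.4 ns total delay, ~1/8 of it wet:
pwv = zenith_pwv_mm(8.4, 1.0 / 8.0, 90.0)
```

Under these assumptions the article's numbers yield a few centimeters of precipitable water, which is why millimeter-level accuracy in the delay measurement matters so much.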
On the ground, each measurement site requires not only a high-end, dual-frequency, phase-correlating GPS receiver, but also an instrument package to gather temperature, humidity, and air-pressure readings used in calculations. Engineers created such a system, called the Climate and Meteorological Sensor package (CLAM). But since UCAR isn't in the hardware business, it passed the technology on to two companies that now offer their own commercial CLAM variations. One, Paroscientific (Redmond, WA), has its Digiquartz® MET3 system. The other is being manufactured by Helsinki, Finland-based Vaisala, which has a U.S. office in Woburn, MA.
Rocken and his crew are currently working on enhancing the system to provide profiles of water-vapor versus altitude. And a related project involving the Jet Propulsion Laboratory and the University of Arizona showed that space-based GPS measurements could provide sensitive temperature readings used to study global warming. But the current capability already can provide scientists with a formidable tool to study the climate. "The ball is in their court," Rocken says, "to see if they can ingest this information and produce better predictions."
Measuring the winds of change. Atmospheric water content is, of course, only one piece of the meteorology puzzle. Global and local winds play a role in not only the weather, but also the propagation of air pollutants and the safety and efficiency of aircraft. Two separate projects--one at the Georgia Institute of Technology, the other headed by NASA--might soon let researchers know which way the wind is blowing a bit more precisely than before.
Georgia Tech researchers have developed a non-Doppler, laser-based method of measuring average wind speed across regions up to several kilometers wide. "You use it when you want to characterize a wind field over a large area," says Michail Belen'kii, principal research scientist.
The sensor's design and operation are simple. It consists of a 4-mW HeNe laser co-located with a Newtonian telescope, electronics, and detectors. The laser projects a beam onto a retroreflective target. Reflected light is gathered by the telescope, sent through a series of optics, and focused onto a photovoltaic silicon quadrant detector.
As the beam passes through the atmosphere it suffers degradation known as residual turbulent scintillation effect. The degradation manifests itself at the detector as shadowy waves moving across the laser beam. "The motion of these waves is directly related to motion and homogeneity of the media," says Belen'kii. "You get a path-integrated total value of the wind field."
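The principle Belen'kii describes--watching the shadow pattern drift across the detector--can be illustrated with a toy cross-correlation between two detector quadrants. The baseline, sample rate, and synthetic signal below are assumptions for illustration, not the Georgia Tech instrument's actual parameters:

```python
import numpy as np

def crosswind_speed(sig_a, sig_b, baseline_m, sample_rate_hz):
    """Estimate transverse wind speed from two scintillation records.

    sig_a, sig_b: intensity time series from two detector quadrants
                  separated by baseline_m along the wind direction.
    The drifting shadow pattern makes sig_b a delayed copy of sig_a;
    the lag of peak cross-correlation gives the transit time.
    """
    a = sig_a - np.mean(sig_a)
    b = sig_b - np.mean(sig_b)
    corr = np.correlate(b, a, mode="full")        # lag of b relative to a
    lag_samples = int(np.argmax(corr)) - (len(a) - 1)
    if lag_samples == 0:
        raise ValueError("no detectable drift between the two signals")
    transit_s = lag_samples / sample_rate_hz
    return baseline_m / transit_s

# Synthetic check: a random shadow pattern crossing a 5-mm baseline in
# 10 samples at 1 kHz corresponds to 0.5 m/s.
rng = np.random.default_rng(0)
pattern = rng.normal(size=2000)
sig_upwind = pattern[10:]       # pattern as seen at the upwind quadrant
sig_downwind = pattern[:-10]    # same pattern, arriving 10 samples later
speed = crosswind_speed(sig_upwind, sig_downwind, 0.005, 1000.0)
```

Because the technique integrates over the whole beam path, a single measurement characterizes the average wind field rather than the wind at one point.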
In the lab, the method has exhibited sensitivity of better than 0.2 m/sec--five times better than anemometers offer and up to ten times better than radar. In addition, Doppler-based methods such as radar and LIDAR can only determine the component of wind speed along the axis of the beam. Several devices are needed to gather two- or three-axis readings, which can be costly.
Using several inexpensive laser reflectors in orthogonal directions, the Georgia Tech system can cover a large area with a single beam. To gather the same data with discrete sensors over an area of complex, non-uniform terrain would require an array of anemometers.
A system based on the technology was expected to be installed at the Idaho National Engineering and Environmental Laboratory (INEEL). The proposed facility was intended to study the effect of hurricanes and tornadoes on full-size buildings and structures with mechanically generated winds up to 200 mph. But a funding cut in late 1997 has put the project on hold. Other applications being investigated include the monitoring of airport terminal areas and, in conjunction with a chemical concentration sensor, the study of air pollution.
Early next century, NASA hopes to send aloft its own laser-based wind-sensing system aboard a New Millennium Space Shuttle mission. Called SPARCLE (Space-Readiness Coherent Lidar Experiment), the project hopes to prove the feasibility of measuring global winds to study climate change, improve hurricane predictions, and help your local weatherman with the daily forecast.
While coherent LIDAR has been used for decades to measure wind speed, only within the past couple of years did a breakthrough in lasers make them spaceworthy. NASA rejected the CO2 lasers commonly used for the task as too bulky and short-lived due to eventual leakage of the gas. To replace them, engineers at NASA Langley Research Center labored to create an eye-safe (2-micron wavelength) solid-state laser of sufficient power. Their work resulted in a holmium-thulium:YLF laser which in spring '97 demonstrated a pulse energy of 600 millijoules. "Just a year earlier the record pulse energy was just 20 millijoules," says Michael Kavaya, principal investigator for SPARCLE. NASA has contracted Coherent, Inc. (Boulder, CO) to produce the flight lasers for the project.
The system will work by aiming the laser down at earth from the Shuttle bay at a 30-degree angle from normal. Six times a second it will fire a pulse. Some of the light's photons will reflect off natural aerosol particles in the air, be gathered by the LIDAR's receiver aperture and sent to an indium-gallium-arsenide detector. There they will be heterodyned with a local oscillator laser to measure the frequency shift due to the speed of the wind (after accounting for the Shuttle's own speed, of course).
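The heterodyne step boils down to a simple relation: a round-trip Doppler shift of 2v/wavelength per unit of line-of-sight speed. The function below is an illustrative sketch of that principle; the names and the simple platform-speed subtraction are assumptions, not NASA's processing chain:

```python
WAVELENGTH_M = 2.0e-6  # SPARCLE's eye-safe 2-micron laser

def line_of_sight_wind(measured_shift_hz: float,
                       platform_shift_hz: float) -> float:
    """Wind speed along the beam from the heterodyne frequency shift.

    A round-trip Doppler shift is f = 2 * v / wavelength, so after
    removing the Shuttle's own contribution, the residual shift maps
    back to the wind's line-of-sight speed.
    """
    residual_hz = measured_shift_hz - platform_shift_hz
    return residual_hz * WAVELENGTH_M / 2.0

# At 2 microns, every 1 MHz of residual shift corresponds to 1 m/s.
wind = line_of_sight_wind(10.0e6, 0.0)  # a 10-MHz shift -> 10 m/s
```

The tiny wavelength is what makes the measurement so sensitive: even a gentle breeze shifts the return by megahertz, which heterodyne detection resolves easily.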
Only about four photons of the initial pulse's 10¹⁸ will be lucky enough to be reflected off the atmosphere, enter the aperture, and survive the processing.
SPARCLE will be able to measure the wind within a 100-km square box in 14 seconds with a vertical resolution of 250 meters from 0 to 20 km high. Along the Shuttle's trajectory, the system will create a seamless quilt of boxes offset two to the left and two to the right.
If all goes well, the demonstration will prove the feasibility of launching an array of wind-speed sensing satellites. "The global wind measurements could stretch the accuracy of current weather predictions at three days out to five days out," says Kavaya.
Gathering accurate data is only half the task facing weather researchers. The other half is organizing, interpreting, and presenting the results.
Simulate the solution. Meteorological data is typically presented like most other numerical information--in tables, charts, and graphs. It is also frequently overlaid on maps. But even this method leaves out the critical dimensions of altitude and time. Recently, work by an engineer at Southwest Research Institute (SwRI) has led to a unique means of creating computer-generated, 30-frame-per-second, photorealistic weather imagery based on real-world data.
Called the Weather Environment Simulation Technology (WEST), the technology was originally developed for aircraft flight simulators. "Simulators have historically not modeled weather well, if at all," says Bruce Montag, SwRI's manager of Advanced Simulation and Training Concepts. "WEST will let you experience, in a simulator, the wind field from a microburst as you fly through it. Nothing else comes close to providing this level of realism."
WEST takes its data from standard sources such as the National Weather Service, radar, and satellites. It converts that data into a grid format that is distributed spatially and temporally. Grid points also contain parameters for temperature, pressure, liquid water content, and wind.
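A data structure in the spirit of that grid might look like the sketch below. The field names, units, and array layout are illustrative assumptions; the article says only that each grid point carries temperature, pressure, liquid water content, and wind, distributed in space and time:

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class WeatherGrid:
    """Spatiotemporal weather grid (a hypothetical layout, not WEST's).

    shape is (n_time, n_levels, n_y, n_x); each scalar field gets one
    value per grid point, and wind gets a 3-component vector per point.
    """
    shape: tuple
    temperature_k: np.ndarray = field(init=False)
    pressure_pa: np.ndarray = field(init=False)
    liquid_water_kg_m3: np.ndarray = field(init=False)
    wind_m_s: np.ndarray = field(init=False)

    def __post_init__(self):
        self.temperature_k = np.zeros(self.shape)
        self.pressure_pa = np.zeros(self.shape)
        self.liquid_water_kg_m3 = np.zeros(self.shape)
        self.wind_m_s = np.zeros(self.shape + (3,))  # u, v, w components

# 24 hourly frames, 10 altitude levels, 64 x 64 horizontal points:
grid = WeatherGrid(shape=(24, 10, 64, 64))
```

Organizing the data this way makes both the time dimension and the altitude dimension first-class, which is exactly what tables and flat maps leave out.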
The software's secret lies in its ability to render realistic images of the data from any viewpoint in real-time. "Clouds and weather are very volumetric," says Montag, "so we focused on developing a fast way of volume rendering." His team leveraged the renowned imaging capabilities of Silicon Graphics (Mountain View, CA) ONYX workstations for the project, but the program can be adapted to run on many other platforms (though not all in real time).
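The core idea behind fast volume rendering can be shown with a toy ray-marching loop using front-to-back alpha compositing. This is a generic sketch of the standard technique, not SwRI's patented method; the density values and extinction coefficient are made up for illustration:

```python
import numpy as np

def ray_march(density, step=1.0, extinction=0.1):
    """Composite a single ray through a 1-D column of cloud density.

    Front-to-back alpha blending: each sample converts density to
    opacity via Beer-Lambert extinction and accumulates color until
    the ray is effectively opaque (early ray termination).
    Returns (accumulated_color, accumulated_alpha).
    """
    color, alpha = 0.0, 0.0
    for rho in density:
        sample_alpha = 1.0 - np.exp(-extinction * rho * step)
        sample_color = rho              # emissive gray, proportional to density
        color += (1.0 - alpha) * sample_alpha * sample_color
        alpha += (1.0 - alpha) * sample_alpha
        if alpha > 0.99:                # ray is opaque; stop marching
            break
    return color, alpha

# A denser column should come out more opaque than a thin one:
_, thin_alpha = ray_march(np.full(16, 0.1))
_, thick_alpha = ray_march(np.full(16, 2.0))
```

A full renderer runs this loop for every pixel, every frame--which is why WEST's 30-frame-per-second performance on real data was a notable achievement.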
Montag sees many applications outside flight simulators. WEST's ability to recreate views of hurricanes, wind shear, and other weather phenomena makes it ideal for TV forecasters and researchers. And he is currently working on applying the powerful--and patented--rendering technique to fire and smoke modeling. "Lots of people said realistic volume visualization of clouds was impossible," says Montag. "It's sort of like a holy grail that lots of people had attempted to do but no one had ever been successful with before."
Engineers are gathering and interpreting weather data by:
- Developing software to accurately model climate change over tens to hundreds of years.
- Engineering new water-vapor and wind sensors.
- Designing simulation programs that can project photo-realistic weather patterns for study or pilot training.
Other applications for climate technology:
- Software to render real-time visualizations of complex, volumetric phenomena, such as fires.
- High-power solid-state lasers at eye-safe wavelengths.
- Non-Doppler, laser-based flow-rate and pollutant sensors for gases or liquids.
- Portable, rugged sensing packages.