Today, JPL Earth scientist Hui Su joins thousands of other bloggers in more than 130 countries around the world for Blog Action Day '09: Climate Change.
Blog Action Day is an annual event that unites the world's bloggers in posting about the same issue on the same day, with the aim of sparking discussion around an issue of global importance. The theme of this year's event, climate change, affects us all and will be the topic of international climate negotiations taking place in Copenhagen, Denmark, this December.
As a world leader in studying Earth's climate, NASA researchers play a vital role in shaping our understanding of global change. In today's post, Su discusses the critical role clouds play in climate, and why learning more about them is a key to predicting how our climate will change in the future. For more information on Blog Action Day, visit: http://www.blogactionday.org .
Clouds are among the most fascinating natural phenomena and have inspired countless works of literature and art. Their ever-changing forms make them a great challenge to atmospheric scientists working to predict how our climate will change in the future in response to increasing greenhouse gases such as carbon dioxide.
Clouds occur at many different heights in our atmosphere and take many different forms. There are three main types of clouds: stratus, cumulus and cirrus. Stratus clouds are low clouds, usually within 2 kilometers (7,000 feet) above the surface. They look like a gray blanket, extending thousands of kilometers across the sky. Cumulus clouds look like puffy cotton balls and extend vertically for large distances. The third type is wispy and feathery-looking cirrus. Cirrus clouds are usually high in the sky, about 7 kilometers (23,000 feet) above the surface. These three types of clouds have different impacts on Earth's climate due to their unique abilities to reflect sunlight and trap heat radiated from Earth's surface.
Stratus clouds can effectively block sunlight from reaching the surface; therefore, they act as an umbrella that cools Earth. Cirrus clouds are relatively transparent to sunlight but can trap terrestrial radiation, just as carbon dioxide does, so they have a net warming effect on Earth. Cumulus clouds can block sunlight and also trap terrestrial radiation. Their net effect varies greatly depending on their actual heights and thicknesses.
Climate scientists have long struggled to quantify how different types of clouds change when global warming occurs. For example, an increase in stratus clouds may cool Earth's surface, compensating for global warming; while an increase in cirrus clouds may further warm Earth's surface, exacerbating global warming. Up to now, scientists have not been able to come to a consensus as to whether stratus, cumulus or cirrus clouds will increase or decrease as global temperatures increase.
A key advancement in cloud studies in recent years has been the availability of global satellite observations of clouds, especially the measurements of clouds at different heights provided by NASA satellites like CloudSat, managed by NASA's Jet Propulsion Laboratory (JPL). These observations are allowing scientists to better simulate clouds in climate models, which are the primary tools climate scientists use to predict future climate change. Up till now, the dynamic nature of clouds has made them very difficult to simulate in current climate models. But by applying space data, we at JPL are working closely with modelers to improve cloud simulations and thereby improve predictions of future climate change.
To learn more about JPL's research in this field and the CloudSat mission, visit: http://cloudsat.atmos.colostate.edu/home .
JPL scientist Bjorn Lambrigtsen goes on hurricane watch every June. He is part of a large effort to track hurricanes and understand what powers them. Lambrigtsen specializes in the field of microwave instruments, which fly aboard research planes and spacecraft, penetrating through thick clouds to see the heart of a hurricane.
While scientists are adept at predicting where these powerful storms will hit land, there are crucial secrets they still need to wrest from these potentially deadly storms.
Here are thoughts and factoids from Lambrigtsen in the field of hurricane research.
1. Pinpointing the moment of birth
Most Atlantic hurricanes start as a collection of thunderstorms off the coast of Africa. These storm clusters move across the Atlantic, ending up in the Caribbean, Gulf of Mexico or Central America. While only one in 10 of these clusters evolves into a hurricane, scientists do not yet know what triggers this powerful transformation.
Pinpointing a hurricane's origin will be a major goal of a joint field campaign in 2010 between NASA and the National Oceanic and Atmospheric Administration (NOAA).
2. Predicting intensity
Another focus of next year's research campaign will be learning how to better predict a storm's intensity. It is difficult for emergency personnel and the public to gauge storm preparations when they don't know whether a storm will be mild or pack tremendous force. NASA's uncrewed Global Hawk will be added to the 2010 research armada. This drone airplane, which can fly for 30 straight hours, will provide an unprecedented long-duration view of hurricanes in action, giving a window into what fuels storm intensity.
3. Deadly force raining down
Think about a hurricane. You imagine high, gusting winds and pounding waves. However, one of the deadliest hurricanes in recent history was one that parked itself over Central America in October 1998 and dumped torrential rain. Even with diminished winds, rain from Hurricane Mitch reached a rate of more than 4 inches per hour. This caused catastrophic floods and landslides throughout the region.
4. Replenishing "spring"
Even though hurricanes can wreak havoc, they also carry out the important task of replenishing the freshwater supply along the Florida coast, the southeastern U.S. coast and the Gulf of Mexico. The freshwater they deposit is good for fish and the broader ecosystem.
5. One size doesn't fit all
Hurricanes come in a huge variety of sizes. Massive ones can cover the entire Gulf of Mexico (about 1,000 miles across), while others are just as deadly at only 100 miles across. Why sizes vary so much is a mystery scientists are still trying to unravel.
NASA and NOAA conduct joint field campaigns to study hurricanes. The agencies use research planes to fly through and above hurricanes, and scientists collect data from NASA spacecraft that fly overhead. NOAA, along with its National Hurricane Center, is the U.S. government agency tasked with hurricane forecasting.
Oxygen, which we breathe as the molecule O2, is a vital component for life on Earth. It is the second most abundant gas in Earth's atmosphere, making up about 21 percent of its volume. Its cousin ozone (O3), on the other hand, makes up less than 0.00001 percent. In fact, if all the ozone in Earth's atmosphere were brought down to the surface, air pressure and temperature conditions would compress it into a layer just three millimeters thick, equivalent to two pennies stacked one on top of the other. Despite its tiny amount, ozone is also a vital ingredient for life on Earth.
Ozone is indeed vital for life on Earth, but it has a "bad" side as well - that is, there is both good and bad ozone out there. Good ozone, which accounts for about 91 percent of the ozone in Earth's atmosphere, is found in the stratosphere, the middle layer of Earth's atmosphere. This portion of ozone is commonly referred to as the "ozone layer." The ozone layer absorbs more than 90 percent of the sun's high-frequency ultraviolet light, which is potentially damaging to life on Earth. Without the ozone layer, this radiation would reach the surface unfiltered, with detrimental health effects for life on Earth. Among the health effects humans could experience from overexposure to ultraviolet radiation are skin cancers, premature aging of the skin and other skin problems, cataracts and other forms of eye damage, and suppression of our bodies' immune systems and our skin's natural defenses.
The troposphere, the part of the atmosphere closest to Earth, contains both good and bad ozone. In the lower troposphere, ozone may serve as an air pollutant since it is a major component of photochemical smog. In the middle troposphere, ozone acts as an atmospheric cleanser, and in the upper troposphere, ozone is a greenhouse gas, which could be bad if concentrations get too high.
The Tropospheric Emission Spectrometer, a science instrument onboard NASA's Aura satellite, is improving our understanding of the good and bad ozone in the troposphere. The spectrometer, which was launched in 2004, provides the first global view of tropospheric ozone, including vertical profiles of its concentration, as well as temperature and other important tropospheric constituents, including carbon monoxide (CO), methane (CH4), water vapor and ammonia (NH3). The instrument has studied the origin and distribution of tropospheric ozone. It has also shed light on how the increasing abundance of ozone in the troposphere is affecting air quality on a global scale, on ozone's role in chemical reactions that "clean" the atmosphere, and on its role in climate change.
These data are used by scientists to determine the degree to which natural sources, like lightning and plant growth, and human-produced sources, like automobiles, industrial pollution, and biomass burning, contribute to ozone production and chemistry. For example, during summertime in the upper troposphere, where ozone acts as a greenhouse gas, lightning generates much greater amounts of ozone than do human activities, thereby having a big impact on regional pollution. Over the last few years, the spectrometer has obtained global data on ozone and the chemicals that participate in ozone formation. Because the instrument can quantify vertical profiles of ozone, it improves our understanding of how reactions taking place at specific heights contribute to ozone chemistry. Like ozone itself, the chemicals that participate in its production exist in tiny amounts, yet the instrument can still measure them. This enables scientists to better understand long-term variations in the quantity, distribution and mixing of many tropospheric gases that have a large impact on climate and air quality.
My role with the instrument is to validate the quality of the most recent ozone measurements, which are taken in a special observation mode called "stare." This mode is used to monitor biomass burning events and volcanic activity. I compare measurements taken by an ozonesonde (a lightweight, balloon-borne instrument that measures ozone, air pressure, temperature and humidity as it ascends through the atmosphere) with measurements from the tropospheric spectrometer. We do this so we can demonstrate the accuracy and precision of the instrument's readings. I am also participating in projects that use the instrument data to better understand the chemistry and transport of pollutants coming from wildfires, such as those that occurred in Australia in December 2006. In the future, I am interested in using the tropospheric spectrometer's ozone and methane data to better quantify the degree to which these gases contribute to global warming and climate change.
Not all oceanographers spend their time out on the seas. As a project scientist for the Physical Oceanography Distributed Active Archive Center here at JPL, I study the world's ocean from my computer, using data from a series of NASA satellites that orbit Earth. These data capture everything from how the ocean changes during an El Nino to how such climatic changes affect local regions like California's coast.
This kind of precise data was impossible to obtain 100 years ago. In fact, scientific and technological advances over the last century have revolutionized the field of oceanography. Today, we gather data both from instruments in the ocean and from satellites in space. These satellite data measure changes in sea surface topography (the ocean surface has changes in elevation, just like the land), ocean surface winds, sea surface temperature and water pressure at the bottom of the ocean. The satellites view the ocean from 700 to 1,300 kilometers (440 to 800 miles) above Earth. Current advanced technologies allow scientists to combine data from different satellites to view ocean conditions in near-real time, only 6 to 12 hours from when a satellite acquires the data. This information can then be sent to researchers and decision makers for use in improving forecasts of everything from hurricanes to the regional and local impacts of ocean phenomena like El Nino and La Nina.
Examples of satellite data can be seen in these images. The view on the left shows sea surface temperatures off the coast of California in September of 1997 (El Nino). The view on the right shows sea surface temperatures from September of 2008 (normal conditions). Notice the warmer temperatures (seen in red) resulting from the 1997-1998 El Nino event. Such temperature changes have direct impacts on local climate and fisheries. These data are leading to a new understanding of how hurricanes get their energy from the ocean. These satellite data also help forecast regional ocean temperatures, which affect local weather.
As technology improves, along with the availability of these data in real time, new opportunities will continue to expand to better understand our planet and its impacts on our lives.
A few hours ago I had the privilege of watching the Orbiting Carbon Observatory launch from Vandenberg Air Force Base. The creativity, effort and dedication of many, many people were sitting on the launch pad. Many of the people who had worked so hard to get the mission to the pad were in attendance, with family and friends there to share in the excitement. The weather was perfect: cold enough to make the stars seem just out of reach, still enough to be pleasant for standing outside waiting for the main event. As launch got closer, hundreds of voices followed along with the magic of the countdown - "10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0 - Liftoff!" The rocket cleared the pad, rising on a column of intense white light. At our distance, it seemed to rise forever before the roar finally reached us. In the dark, clear sky we could watch the various stages burn out, fall back and be replaced by the ignition of the next stage. Everything seemed to be going perfectly.
We got on the buses to leave the viewing area, excited by what we had witnessed and by the mission to come. Neither feeling lasted long. Soon text messages and phone calls started to disturb the darkened buses. Within a few minutes, it was clear that the launch had not gone as well as we thought. By the time we got off the buses, it looked grim. In the next couple of hours, it became clear that the rocket had failed and we had never achieved orbit.
Oddly, hearing that the spacecraft hit the ocean near Antarctica made it worse. I had this vision of the system orbiting the Earth - dead and mute - like a modern-day Flying Dutchman. Knowing that the hardware I helped design and build had been destroyed on impact made the loss real.
Almost 10 years ago, I was working with a scientist who was also supporting the Mars lander that was lost in 1999. The day after it failed, she told me to always try to enjoy the intellectual challenge of designing a mission and the hardware to make it possible. At the end of the day, that might be all you get. Since then, she has been involved in the incredibly successful Mars Exploration Rovers and the Phoenix lander. She is working to prepare the Mars Science Laboratory for its 2011 launch.
I hope that her past is my prologue. I hope that the next 10 years bring a productive series of missions to advance our understanding of the carbon cycle - much as the recent Mars missions have advanced our understanding of our solar system's history.
Imagine if you could scoop exactly one million molecules out of the air in front of you (while being careful not to grab any water vapor). Now, start sorting these molecules into different piles. Start with the two most common molecules and you've sorted 99 percent of your sample -- the nitrogen pile will have about 780,000 molecules, and the oxygen pile will have about 210,000 molecules. Sorting out the third most common molecule, argon, gets you a new pile with about 9,000 molecules. Congratulations, you've sorted 99.9 percent of the molecules into just three piles. The remaining 1,000 molecules are called "trace gases." The most famous and most common trace gas is carbon dioxide, or CO2. Out of the million you had at the beginning, you'll count about 385 CO2 molecules.
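The sorting exercise above takes only a few lines of code to check. This is just a sketch of the arithmetic, using the approximate gas fractions quoted in the post:

```python
# Approximate dry-air fractions by volume, as given in the post.
fractions = {
    "nitrogen": 0.780,
    "oxygen":   0.210,
    "argon":    0.009,
    "CO2":      385e-6,   # about 385 parts per million (circa 2009)
}

sample = 1_000_000  # one million molecules scooped from the air

# Sort the sample into piles, one per gas.
piles = {gas: round(sample * f) for gas, f in fractions.items()}

# The first three piles already account for 99.9 percent of the sample.
sorted_so_far = piles["nitrogen"] + piles["oxygen"] + piles["argon"]

print(piles)                   # {'nitrogen': 780000, 'oxygen': 210000, 'argon': 9000, 'CO2': 385}
print(sorted_so_far / sample)  # 0.999
```

The remaining thousand molecules per million are the trace gases, with CO2 contributing only about 385 of them.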
Now, imagine repeating this experiment 12 times per second while flying over Earth at more than 16,000 miles per hour. Each of those counts needs to be accurate enough to note the addition or subtraction of one molecule of CO2 per one million of air. This is the experiment that a group of scientists and engineers at NASA's Jet Propulsion Laboratory conceived almost 10 years ago. We call it the Orbiting Carbon Observatory, and it is now at the launch pad waiting for its ride into space.
The heart of the mission is a very accurate instrument -- called a "spectrometer" -- tuned to sense the presence of CO2. A spectrometer is a type of camera that splits incoming light into hundreds of different colors and then measures the amount of light in each of these colors. In the case of this mission, the spectrometer measures sunlight that has passed through the atmosphere twice: once on the way down to the surface, and then again on the way up to the orbiting spacecraft. When the light passes through air containing CO2, certain colors are absorbed. The spectrometer creates an image with dark bands where the sunlight is partially or completely missing. This image looks similar to a barcode. Encoded in that barcode is the information to infer how many CO2 molecules the sunlight encountered on its way to the spacecraft.
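The barcode idea can be made concrete with a toy calculation. This is not the actual OCO retrieval algorithm -- just a minimal Beer-Lambert sketch with made-up line positions and strengths -- but it shows how the depth of the dark bands encodes how much CO2 the sunlight passed through:

```python
import numpy as np

# Toy spectrum near 1.6 microns (a real CO2 band region); the line
# centers and widths below are invented for illustration only.
wavelengths = np.linspace(1.59, 1.62, 300)
continuum = np.ones_like(wavelengths)        # idealized flat sunlight

line_centers = [1.595, 1.601, 1.608, 1.614]  # hypothetical absorption lines
cross_section = sum(np.exp(-((wavelengths - c) / 0.0008) ** 2)
                    for c in line_centers)

# Beer-Lambert law: more CO2 along the path means deeper dips.
column = 2.0                                 # arbitrary CO2 column amount
observed = continuum * np.exp(-column * cross_section)

# Reading the "barcode": recover the column from the deepest dip,
# since dip depth = exp(-column * sigma) at the strongest line.
sigma_max = cross_section.max()
inferred = -np.log(observed.min()) / sigma_max
print(round(inferred, 3))   # 2.0 -- matches the column we put in
```

The real retrieval fits hundreds of colors at once and must account for scattering, clouds and instrument noise, but the core principle is the same: dip depth encodes absorber amount.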
I joined the project in early 2001 as the lead engineer for the spectrometer. In the eight years that have followed, we've gone from an idea to a fully built and tested system sitting on top of a rocket, ready for launch. Along the way, a group of talented people has put in countless hours designing, building, and testing the system. When doing something for the first time, there are always issues that come up -- some of which look insurmountable at the time. It's been a challenge, but the hard work and creativity of our team saw us through all of them.
Now we are waiting for the payoff -- the first data from space. We've done everything we can to be ready. Now, launch awaits ...
For those of us living in southern California, the risk of earthquakes is a constant fact of life. In fact, small earthquakes occur daily; we simply may not notice them. It’s the larger, more damaging earthquakes that are cause for concern. The infamous San Andreas fault twists its way through much of California, posing significant risk to both southern and northern California -- and as many scientists have said, it’s not a question of if a large earthquake will occur, but when.
Even though the risk of earthquakes is always present, I am sure most people are not thinking about this on the way to work, or as they are watching TV at night, or just generally going about their daily lives. Establishing an earthquake preparedness plan probably doesn’t even come to mind, except possibly when there is a major earthquake elsewhere, or a minor earthquake nearby.
We here at JPL are working on ways to extend our ability to forecast earthquakes. We are combining state-of-the-art high-performance computing resources and modeling software with satellite observations of small-scale ground motion on Earth. This will enhance our understanding of fundamental earthquake processes. With projects like NASA/JPL’s QuakeSim, which aims to improve our ability to forecast earthquakes much as we do the weather, we will also be able to help prepare ourselves for the inevitable.
Unfortunately, should a large earthquake catch us unprepared -- and remember, it's not a question of if, but when -- the consequences could be disastrous. According to FEMA, the annualized loss due to earthquakes is $5.3 billion, with 66 percent ($3.5 billion) of that concentrated in the state of California alone. A moderate-sized earthquake in the metropolitan Los Angeles region could lead to the loss of vital infrastructure -- water via the aqueduct, freeways, possibly even the ports or the airports -- rendering us isolated and without resources not for days, but possibly for months.
We are told to be prepared in case of an earthquake with 72 hours’ worth of water and food and other necessary emergency provisions. That will certainly see us through the first few days, but if the vital infrastructural resources like our water distribution, sewers, freeways, and other pipelines are taken out, we could be looking at much more than 72 hours without proper services, especially water and power. Are you prepared for such a circumstance?
On November 13, 2008, the United States Geological Survey will lead a disaster preparedness scenario called "The Great Southern California ShakeOut." It will be based on a magnitude 7.8 earthquake along the southern San Andreas fault. Shaking from an earthquake of this size is projected to last up to two minutes, and modeling predicts that sediments in the various basins around the Los Angeles area will trap seismic waves and amplify ground motions, much as occurred in the Northridge earthquake. (To learn more about the "Great ShakeOut," please visit: www.shakeout.org)
This earthquake scenario will also be the basis for the statewide emergency response exercise, Golden Guardian 2008. These complementary exercises are meant to demonstrate our ability to deal with an earthquake scenario in which there would be 1,800 deaths, 50,000 injuries and $200 billion in damage. An earthquake of this magnitude could produce destruction on the scale of the recent Gulf Coast hurricanes or worse.
One thing to keep in mind, though, is that we need to be proactive, rather than simply reactive. That way, when the inevitable moderate to large earthquake does hit, we will be as ready as we can be to deal with it. Exercises like the ShakeOut certainly help to keep the community more aware of the ever-present risk of earthquakes, but we as individuals also need to take the time to make sure that we are disaster prepared as well. That way we can be not only prepared, but resilient.
Remember the warning to beware of yellow snow? Well, what’s true in your backyard is true on a much larger scale too. (For those from warmer climates, yellow-tinted snow is a sign that a dog or other animal has recently “paid a visit.”)
Snow at Earth’s north and south poles can also be tainted. Certain molecules — ones that can eventually damage our protective ozone layer in the stratosphere, affect the air down in the troposphere where we live, and possibly contribute to climate change — are being deposited into the snow.
Just how is this happening? Start with the fact that air at lower latitudes circulates toward the poles. This air carries ozone-damaging molecules picked up in industrial, highly populated areas. Once over the poles, some of these molecules are deposited onto the snowpack, where they migrate to thin liquid films in snow. Once sunlight hits the snow, the light energy breaks down these molecules, which are then released back into the atmosphere, giving the area over the poles a double hit of ozone-damaging molecules.
Scientists are finding that snow has unique properties that make these chemical reactions happen much faster than we used to believe. We don’t fully understand why, but we know that the combination of sun (an energy source) and snow brings about the release of these ozone-damaging molecules into the atmosphere much faster than in areas without snow.
Many of the polluting molecules that remain in the snow eventually get incorporated in the polar food chain. When the snow melts into the sea, the molecules may be ingested by sea creatures. Not all of them are unhealthy, but some of them are.
Why care about reactions going on in distant, frozen expanses at Earth’s poles? Those regions are a beacon of climate change, where we see chemical processes that may play a large role in the planet’s future.
My wife likes to gamble. She's no high roller or anything, but give her a hundred dollars, a spare weekend and a room full of slot machines and she's happy.
Not me, though. Somewhere along the way, I guess I took one too many math classes and betting against the house just isn't much fun anymore.
But I understand why she likes it. It's the ups and downs of gambling that are fun. You lose, lose, lose and then every once in a while you win a great big jackpot. Maybe you even win enough to make up for the last 30 or 40 bets you lost. But like any game in the casino, the odds are stacked against you. If you play long enough, you will eventually lose.
Global warming and climate change work in much the same way. Wait long enough and odds are, the Earth will be warmer. But will tomorrow be warmer than today? Who knows! There are plenty of things about the atmosphere and ocean that can't be predicted. Over a period of days or weeks, we call these unpredictable changes "the weather."
No one can predict the weather more than a few days in advance, any more than they can predict which slot the roulette ball will land in before the croupier spins it. Weather, like roulette, is essentially random.
But a little randomness doesn't stop casino owners from taking your bet at the roulette table. They know the odds, and they know if enough bets are laid they will eventually come out ahead. Climate scientists know that, too.
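That casino arithmetic is easy to check with a quick simulation. Here is a sketch assuming even-money bets on red in American roulette, where 18 of the 38 slots win -- the specific bet type is my choice of illustration, not something from the post:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def session(num_bets, bet=1.0):
    """Net result of a session of even-money red bets (18/38 chance to win)."""
    net = 0.0
    for _ in range(num_bets):
        net += bet if random.random() < 18 / 38 else -bet
    return net

# A weekend of short sessions: wins and losses swing back and forth.
short_runs = [session(40) for _ in range(5)]
print(short_runs)

# A very long run: the average loss per bet settles near the house
# edge of 2/38, or about -0.053 per dollar bet.
long_run = session(1_000_000)
print(long_run / 1_000_000)
```

The short sessions are the fun "ups and downs"; the long run is the house edge asserting itself -- the same distinction the post draws between weather and climate.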
Random events happen in the atmosphere and oceans all the time. Not just the weather, but things like El Nino, La Nina and huge volcanic eruptions can make the planet warm up or cool down for years at a time. There could even be a few others that we haven't discovered yet.
Still, for all its short-term ups and downs, Earth's average temperature has risen dramatically over the last one hundred years. That's no accident. Like the house edge at the roulette table, human-made greenhouse gases have tilted the odds in favor of a warming planet.
Sometimes it's easy to forget that fact when new science results come out. Like the recreational gambler, we often find it more fun to focus on the ups and downs: a short-term cooling period, a warm year during a big El Nino.
But for climate change and casino owners, it's important to remember the big picture. The roulette player might win three or four bets in a row, but that doesn't change the odds. Eventually the casino will win. Likewise, as long as humans continue to add carbon dioxide to the atmosphere, the planet will continue to warm.
So whenever people ask me about the latest warming or cooling in the climate record, I'm always reminded of my wife and her slot machines. By the end of the weekend her hundred dollars is almost always gone, but the thrill of the ups and downs keeps her entertained the entire time. "Did you win?" people ask. She always flashes her sly smile and says, "Sometimes!"
What is Kepler?
Kepler is a mission that is designed to find Earth-sized planets outside our solar system. Specifically, it will look for these rocky planets in the "habitable zone" near their stars — meaning at a distance where liquid water could exist on the surface.
Kepler will accomplish this by monitoring a large set of stars (approximately 100,000) and looking for the signature dip in brightness that indicates that a planet has crossed between the spacecraft and the star. The instrument that detects this dip is called a photometer — literally, a "light meter." It is basically a large telescope that funnels the light from the stars onto a CCD array (similar to the ones used in digital cameras).
By surveying such a large number of stars using this "transit" method, Kepler will be able to determine the frequency of Earth-sized (and larger) planets around a wide variety of stars.
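The transit signal described above can be sketched with a toy light curve. The numbers here (period, dip duration, noise level) are arbitrary illustration values, not Kepler's actual parameters; the point is that folding and averaging repeated transits pulls a tiny dip out of the noise:

```python
import numpy as np

# Toy light curve: flat star brightness plus photometric noise,
# with a periodic 0.01% dip where a hypothetical planet transits.
rng = np.random.default_rng(0)

n, period, duration, depth = 5000, 500, 10, 1e-4
flux = 1.0 + rng.normal(0.0, 1e-5, n)         # noisy, otherwise constant star
for start in range(100, n, period):            # one transit every "period" samples
    flux[start:start + duration] -= depth      # planet blocks a sliver of light

# Fold the light curve at the known period and average the repeats;
# averaging N transits reduces the noise by roughly sqrt(N), so the
# dip stands out even though it is smaller than the per-sample noise.
folded = flux[:(n // period) * period].reshape(-1, period).mean(axis=0)

print(folded.argmin())               # phase of the transit (samples 100-109 here)
print(round(1.0 - folded.min(), 5))  # recovered dip depth, close to the 1e-4 input
```

Kepler's real problem is much harder -- the period is unknown in advance, and the photometer must hold its precision over months -- but this is the essence of the transit method.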
What do I think is cool about this mission?
I love the fact that the Kepler approach - looking for the dips in stellar brightness that occur when a planet passes between the photometer and a star - is so straightforward. It is such a wonderfully simple way to look for planets! Of course in practice, there are plenty of complicating factors that make this a challenging mission to execute. The change in brightness that we are looking for is very small (on the order of 0.01 percent). To make sure we can detect that, we have to carefully control noise in the system - things like electronic noise from reading out the CCDs, smear from tiny motions of the spacecraft, etc. These and other aspects of the mission have provided plenty of challenges to keep things interesting for the design team.
One of my favorite things about the Kepler mission is that the patch of sky we will be surveying is near a particular group of highly recognizable constellations. The stars Kepler will look at are in the area of what is known as the Summer Triangle, a group of constellations - Aquila, Cygnus and Lyra - that are overhead at midnight when viewed from northern latitudes in the summer months. When the science team starts identifying planets in our field of view, anyone will be able to go outside, point toward the Summer Triangle and say "they've just discovered a planet over there." To me, that will make the discoveries that much more personal.
I am also a huge sci-fi fan and I have always been particularly fascinated by books and movies about how humans might some day colonize other worlds in the galaxy. I think it is fantastic to get to work on a mission that will be looking for planets outside our solar system that are Earth-sized and in a range around their stars that could be habitable; places where such colonization could one day take place... I can't wait to see what we find!
What do I do?
I am a member of the Project System Engineering Team at JPL. This team is responsible for a wide variety of tasks on Kepler, aimed at ensuring the project meets its driving scientific and technological objectives. This often involves checking that the interfaces between the different elements of the project work smoothly. For example, one of our responsibilities is to conduct end-to-end tests of the mission's information system. In this test, we check that the right commands are generated to collect data, that the data is collected by the spacecraft hardware, and that the data then flows correctly through the ground data system. This lets us verify that the entire data flow chain functions as it should before we launch.
My particular focus has been ensuring that we work out all of the details associated with executing each of the mission phases (the launch phase, the on-orbit checkout period that we call the commissioning phase, and the main data-gathering portion of the mission, which is the science phase). I work closely with my colleagues at NASA Ames, Ball Aerospace and JPL to identify and resolve open issues associated with planning for, testing and eventually executing the activities associated with these phases.
What is happening on the project right now?
The project is in what is known as the Assembly, Test and Launch Operations phase. Right now, the assembled spacecraft and instrument (known collectively as the flight system) is in the middle of the environmental testing campaign at Ball. This involves many hours of running the flight system and monitoring its performance while exposing it to the types of temperatures, pressures and other conditions that it will see in space. The system that will collect and distribute the data is undergoing integrated testing as well, with teams of people working to push test data through all of the various ground interfaces. The operations team — the people who will be responsible for generating and testing commands, monitoring the health and safety of the spacecraft and ensuring that data is collected from it by the Deep Space Network — are undergoing training and getting ready for upcoming mission phase rehearsals that we call "operational readiness tests." Even though we are still several months away from launch, it is a very busy time on the project!
Who is involved?
The principal investigator and the science office that will lead the scientific data analysis are at the NASA Ames Research Center in Mountain View, Calif. The spacecraft and photometer were built at Ball Aerospace & Technologies Corporation in Boulder, Colo. The mission operations center is located at the Laboratory for Atmospheric and Space Physics at the University of Colorado at Boulder. The mission is managed here at the Jet Propulsion Laboratory in Pasadena, Calif.