Discover opportunities to engage students in science, technology, engineering and math (STEM) with lessons and resources inspired by the latest happenings at NASA.

› Learn more and explore the collection



A glowing, orange ring outlines a black hole.

In the News

Accomplishing what was previously thought to be impossible, a team of international astronomers has captured an image of a black hole’s silhouette. Evidence for the existence of black holes – mysterious places in space where nothing, not even light, can escape – has existed for quite some time, and astronomers have long observed the effects these phenomena have on their surroundings.

In the popular imagination, it was thought that capturing an image of a black hole was impossible because an image of something from which no light can escape would appear completely black. For scientists, the challenge was how, from thousands or even millions of light-years away, to capture an image of the hot, glowing gas falling into a black hole. An ambitious team of international astronomers and computer scientists has managed to accomplish both.

Working for well over a decade to achieve the feat, the team improved upon an existing radio astronomy technique for high-resolution imaging and used it to detect the silhouette of a black hole – outlined by the glowing gas that surrounds its event horizon, the precipice beyond which light cannot escape. Learning about these mysterious structures can help students understand gravity and the dynamic nature of our universe, all while sharpening their math skills.

How They Did It

Though scientists had theorized they could image black holes by capturing their silhouettes against their glowing surroundings, the ability to image an object so distant still eluded them. A team formed to take on the challenge, creating a network of telescopes known as the Event Horizon Telescope, or the EHT. They set out to capture an image of a black hole by improving upon a technique that allows for the imaging of far-away objects, known as Very Long Baseline Interferometry, or VLBI.

Telescopes of all types are used to see distant objects. The larger the diameter, or aperture, of a telescope, the more light it can gather and the higher its resolution (its ability to image fine details). To see details in objects that are far away and appear small and dim from Earth, we need to gather as much light as possible at very high resolution, which means we need a telescope with a very large aperture.

That’s why the VLBI technique was essential to capturing the black hole image. VLBI works by creating an array of smaller telescopes that can be synchronized to focus on the same object at the same time and act as one giant virtual telescope. In some cases, the individual stations are themselves arrays of multiple telescopes. This technique has been used to track spacecraft and to image distant cosmic radio sources, such as quasars.

More than a dozen antennas pointing forward sit on barren land surrounded by red and blue-purple mountains in the distance.

Making up one piece of the EHT array of telescopes, the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile has 66 high-precision antennas. Image credit: NRAO/AUI/NSF | + Expand image

The aperture of a giant virtual telescope such as the Event Horizon Telescope is as large as the distance between its two farthest-apart stations – for the EHT, those two stations are at the South Pole and in Spain, creating an aperture that’s nearly the same as the diameter of Earth. Each telescope in the array focuses on the target, in this case the black hole, and collects data from its location on Earth, providing a portion of the EHT’s full view. The more widely spaced telescopes the array contains, the better the image resolution.
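For students who want to run the numbers, here is a back-of-the-envelope sketch in Python using the standard diffraction-limit formula, θ ≈ 1.22λ/D, which connects aperture size to resolution. Two values below are assumptions not stated in this article: the EHT’s observing wavelength of about 1.3 millimeters, and Earth’s diameter standing in for the longest baseline.

```python
import math

def angular_resolution_rad(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution: theta ~ 1.22 * lambda / D."""
    return 1.22 * wavelength_m / aperture_m

wavelength = 1.3e-3   # meters; EHT observes near 1.3 mm (assumed, not from the article)
baseline = 1.2742e7   # meters; approximate diameter of Earth

theta_rad = angular_resolution_rad(wavelength, baseline)
theta_microarcsec = math.degrees(theta_rad) * 3600 * 1e6
print(f"~{theta_microarcsec:.0f} microarcseconds")  # ~25 microarcseconds
```

That result, on the order of 25 microarcseconds, is roughly the apparent size of an orange on the Moon – the comparison Katie Bouman makes below.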

This video shows the global network of radio telescopes in the EHT array that performed observations of the black hole in the galaxy M87. Credit: C. Fromm and L. Rezzolla (Goethe University Frankfurt)/Black Hole Cam/EHT Collaboration | Watch on YouTube

To test VLBI for imaging a black hole, along with a number of computer algorithms for sorting and synchronizing data, the Event Horizon Telescope team decided on two targets, each offering unique challenges.

The closest supermassive black hole to Earth, Sagittarius A*, interested the team because it is in our galactic backyard – at the center of our Milky Way galaxy, 26,000 light-years (156 quadrillion miles) away. (An asterisk is the astronomical standard for denoting a black hole.) Though not the only black hole in our galaxy, it is the black hole that appears largest from Earth. But its location in the same galaxy as Earth meant the team would have to look through “pollution” caused by stars and dust, leaving more data to filter out when processing the image. Nevertheless, because of the black hole’s local interest and relatively large size, the EHT team chose Sagittarius A* as one of its two targets.

An image showing a smattering of orange stars against the black backdrop of space with a small black circle in the middle and a rectangle identifying the location of the M87 black hole.

A close-up image of the core of the M87 galaxy, imaged by the Chandra X-ray Observatory. Image credit: NASA/CXC/Villanova University/J. Neilsen | + Expand image

A blue jet extends from a bright yellow point surrounded by smaller yellow stars.

This image from NASA's Hubble Space Telescope shows a jet of subatomic particles streaming from the center of M87*. Image credits: NASA and the Hubble Heritage Team (STScI/AURA) | + Expand image

The second target was the supermassive black hole M87*. One of the largest known supermassive black holes, M87* is located at the center of the gargantuan elliptical galaxy Messier 87, or M87, 53 million light-years (318 quintillion miles) away. Substantially more massive than Sagittarius A*, which contains 4 million solar masses, M87* contains 6.5 billion solar masses. One solar mass is equivalent to the mass of our Sun, approximately 2×10^30 kilograms. In addition to its size, M87* interested scientists because, unlike Sagittarius A*, it is an active black hole, with matter falling into it and spewing out in the form of jets of particles that are accelerated to velocities near the speed of light. But its distance made it even more of a challenge to capture than the relatively local Sagittarius A*. As described by Katie Bouman, a computer scientist with the EHT who led development of one of the algorithms used to sort telescope data during the processing of the historic image, it’s akin to capturing an image of an orange on the surface of the Moon.

By 2017, the EHT was a collaboration of eight sites around the world – and more have been added since then. Before the team could begin collecting data, they had to find a time when the weather was likely to be conducive to telescope viewing at every location. For M87*, the team tried for good weather in April 2017 and, of the 10 days chosen for observation, a whopping four days were clear at all eight sites!

Each telescope used for the EHT had to be highly synchronized with the others to within a fraction of a millimeter using an atomic clock locked onto a GPS time standard. This degree of precision makes the EHT capable of resolving objects about 4,000 times better than the Hubble Space Telescope. As each telescope acquired data from the target black hole, the digitized data and time stamp were recorded on computer disk media. Gathering data for four days around the world gave the team a substantial amount of data to process. The recorded media were then physically transported to a central location because the amount of data, around 5 petabytes, exceeded what current internet speeds could handle. At this central location, data from all eight sites were synchronized using the time stamps and combined to create a composite set of images, revealing the never-before-seen silhouette of M87*’s event horizon. The team is also working on generating an image of Sagittarius A* from additional observations made by the EHT.
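A rough calculation shows why shipping disks beat the internet. The 5-petabyte figure comes from the passage above; the sustained one-gigabit-per-second link speed is an illustrative assumption.

```python
# Why the EHT shipped disks: moving ~5 petabytes over a network takes months.
data_bytes = 5e15   # ~5 petabytes of recorded telescope data (from the article)
link_bps = 1e9      # assume a sustained 1 gigabit-per-second connection

transfer_days = data_bytes * 8 / link_bps / 86400
print(f"~{transfer_days:.0f} days of continuous transfer")  # ~463 days
```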

This zoom video starts with a view of the ALMA telescope array in Chile and zooms in on the heart of M87, showing successively more detailed observations and culminating in the first direct visual evidence of a supermassive black hole’s silhouette. Credit: ESO/L. Calçada, Digitized Sky Survey 2, ESA/Hubble, RadioAstron, De Gasperin et al., Kim et al., EHT Collaboration. Music: Niklas Falcke | Watch on YouTube

As more telescopes are added and the rotation of Earth is factored in, more of the image can be resolved, and we can expect future images to be higher resolution. But we might never have a complete picture, as Katie Bouman explains here (under “Imaging a Black Hole”).

To complement the EHT findings, several NASA spacecraft were part of a large effort to observe the black hole using different wavelengths of light. As part of this effort, NASA’s Chandra X-ray Observatory, Nuclear Spectroscopic Telescope Array (NuSTAR) and Neil Gehrels Swift Observatory space telescope missions – all designed to detect different varieties of X-ray light – turned their gaze to the M87 black hole around the same time as the EHT in April 2017. NASA’s Fermi Gamma-ray Space Telescope was also watching for changes in gamma-ray light from M87* during the EHT observations. If the EHT observed changes in the structure of the black hole’s environment, data from these missions and other telescopes could be used to help figure out what was going on.

Though NASA observations did not directly trace out the historic image, astronomers used data from Chandra and NuSTAR satellites to measure the X-ray brightness of M87*’s jet. Scientists used this information to compare their models of the jet and disk around the black hole with the EHT observations. Other insights may come as researchers continue to pore over these data.

Why It's Important

Learning about mysterious structures in the universe provides insight into physics and allows us to test observation methods and theories, such as Einstein’s theory of general relativity. Massive objects deform spacetime in their vicinity, and although general relativity has been directly proven accurate for smaller-mass objects, such as Earth and the Sun, it has not yet been directly proven for black holes and other regions containing dense matter.

One of the main results of the EHT black hole imaging project is a more direct calculation of a black hole’s mass than ever before. Using the EHT, scientists were able to directly observe and measure the radius of M87*’s event horizon, or its Schwarzschild radius, and compute the black hole’s mass. That estimate was close to the one derived from a method that uses the motion of orbiting stars – thus validating it as a method of mass estimation.
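Here is a minimal sketch of that relationship in Python, run in the forward direction: given a mass, the Schwarzschild radius formula from general relativity, r_s = 2GM/c², predicts the size of the event horizon. (The EHT effectively worked backward, from measured size to mass.) The constants are standard values, using the article’s approximation of 2×10^30 kilograms per solar mass.

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
SOLAR_MASS = 2e30  # kg, the approximation used in the article
AU = 1.496e11      # meters per astronomical unit

m87_mass = 6.5e9 * SOLAR_MASS    # M87* is about 6.5 billion solar masses
r_s = 2 * G * m87_mass / c**2    # Schwarzschild radius, meters

print(f"r_s ~ {r_s:.2e} m, or about {r_s / AU:.0f} AU")  # ~1.9e13 m, ~130 AU
```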

The size and shape of a black hole, which depend on its mass and spin, can be predicted from general relativity equations. General relativity predicts that this silhouette would be roughly circular, but other theories of gravity predict slightly different shapes. The image of M87* shows a circular silhouette, thus lending credibility to Einstein’s theory of general relativity near black holes.

An illustration of a black hole surrounded by a bright, colorful swirl of material. Text describes each part of the black hole and its surroundings.

This artist’s impression depicts a rapidly spinning supermassive black hole surrounded by an accretion disc. Image credit: ESO | + Expand image

The data also offer some insight into the formation and behavior of black hole structures, such as the accretion disk that feeds matter into the black hole and plasma jets that emanate from its center. Scientists have hypothesized about how an accretion disk forms, but they’ve never been able to test their theories with direct observation until now. Scientists are also curious about the mechanism by which some supermassive black holes emit enormous jets of particles traveling at near light-speed.

These questions and others will be answered as more data is acquired by the EHT and synthesized in computer algorithms. Be sure to stay tuned for that and the next expected image of a black hole – our Milky Way’s own Sagittarius A*.

Teach It

Capture your students’ enthusiasm about black holes by challenging them to solve these standards-aligned math problems.

Model black-hole interaction with this NGSS-aligned lesson:

Explore More


Check out these related resources for students from NASA’s Space Place

TAGS: Black Hole, Teachable Moments, Science, K-12 Education, Teachers, Educators

  • Ota Lutz

Illustration of spacecraft against a starry background

Update: March 15, 2019 – The answers to the 2019 NASA Pi Day Challenge are here! View the illustrated answer key.


In the News

The excitement of Pi Day – and our annual excuse to chow down on pie – is upon us! The holiday celebrating the mathematical constant pi arrives on March 14, and with it comes the sixth installment of the NASA Pi Day Challenge from the Jet Propulsion Laboratory’s Education Office. This challenge gives students in grades 6-12 a chance to solve four real-world problems faced by NASA scientists and engineers. (Even if you’re done with school, they’re worth a try for the bragging rights.)

https://www.jpl.nasa.gov/edu/teach/activity/pi-in-the-sky-6/

Visit the "Pi in the Sky 6" lesson page to explore classroom resources and downloads for the 2019 NASA Pi Day Challenge. Image credit: NASA/JPL-Caltech/Kim Orr | + Expand image

Why March 14?

Pi, the ratio of a circle’s circumference to its diameter, is what is known as an irrational number. As an irrational number, its decimal representation never ends, and it never repeats. Though it has been calculated to trillions of digits, we use far fewer at NASA. In fact, 3.14 is a good approximation, which is why March 14 (or 3/14 in U.S. month/day format) came to be the date that we celebrate this mathematical marvel.
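A quick calculation shows why 3.14 is good enough for many purposes. The Earth-sized circle below (diameter roughly 12,742 kilometers) is an illustrative value, not a figure from the article.

```python
import math

diameter_km = 12_742  # approximate diameter of Earth (assumed value)

c_approx = 3.14 * diameter_km   # circumference with the two-decimal pi
c_full = math.pi * diameter_km  # circumference with full-precision pi

print(f"difference: {c_full - c_approx:.1f} km")  # ~20 km out of ~40,000 km
```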

The first-known Pi Day celebration occurred in 1988. In 2009, the U.S. House of Representatives passed a resolution designating March 14 as Pi Day and encouraging teachers and students to celebrate the day with activities that teach students about pi.

The 2019 Challenge

This year’s NASA Pi Day Challenge features four planetary puzzlers that show students how pi is used at the agency. The challenges involve weathering a Mars dust storm, sizing up a shrinking storm on Jupiter, estimating the water content of a rain cloud on Earth and blasting ice samples with lasers!

› Take on the 2019 NASA Pi Day Challenge!

The Science Behind the Challenge

In late spring of 2018, a dust storm began stretching across Mars and eventually nearly blanketed the entire planet in thick dust. Darkness fell across Mars’ surface, blocking the vital sunlight that the solar-powered Opportunity rover needed to survive. It was the beginning of the end for the rover’s 15-year mission on Mars. At its height, the storm covered all but the peak of Olympus Mons, the largest known volcano in the solar system. In the Deadly Dust challenge, students must use pi to calculate what percentage of the Red Planet was covered by the dust storm.

The Terra satellite, orbiting Earth since 1999, uses the nine cameras on its Multi-Angle Imaging SpectroRadiometer, or MISR, instrument to provide scientists with unique views of Earth, returning data about atmospheric particles, land-surface features and clouds. Estimating the amount of water in a cloud, and the potential for rainfall, is serious business. Knowing how much rain may fall in a given area can help residents and first responders prepare for emergencies like flooding and mudslides. In Cloud Computing, students can use their knowledge of pi and geometric shapes to estimate the amount of water contained in a cloud.

Jupiter’s Great Red Spot, a giant storm that has been fascinating observers since the early 19th century, is shrinking. The storm has been continuously observed since the 1830s, but measurements from spacecraft like Voyager, the Hubble Space Telescope and Juno indicate the storm is getting smaller. How much smaller? In Storm Spotter, students can determine the answer to that very question faced by scientists.

Scientists studying ices found in space, such as comets, want to understand what they’re made of and how they interact and react with the environment around them. To see what molecules may form in space when a comet comes into contact with solar wind or sunlight, scientists place an ice sample in a vacuum and then expose it to electrons or ultraviolet photons. Scientists have analyzed samples in the lab and detected molecules that were later observed in space on comet 67P/Churyumov-Gerasimenko. To analyze the lab samples, an infrared laser is aimed at the ice, causing it to explode. But the ice will explode only if the laser is powerful enough. Scientists use pi to figure out how strong the laser needs to be to explode the sample – and students can do the same when they solve the Icy Intel challenge.
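For teachers who want a preview of the math, here is a hedged sketch of where pi enters each of the four puzzlers. None of the shapes, inputs or thresholds below come from the actual challenge; they are simplifying assumptions for illustration only.

```python
import math

def storm_coverage_fraction(planet_radius_km, clear_radius_km):
    """Deadly Dust: fraction of a sphere covered when all but one small,
    roughly circular region (treated as flat) is blanketed in dust."""
    total_area = 4 * math.pi * planet_radius_km**2
    clear_area = math.pi * clear_radius_km**2
    return 1 - clear_area / total_area

def cloud_water_liters(radius_m, height_m, water_g_per_m3):
    """Cloud Computing: model the cloud as a cylinder, then multiply its
    volume by an assumed liquid water content (grams per cubic meter)."""
    volume_m3 = math.pi * radius_m**2 * height_m
    return volume_m3 * water_g_per_m3 / 1000  # grams of water -> liters

def storm_area_km2(major_axis_km, minor_axis_km):
    """Storm Spotter: approximate the Great Red Spot as an ellipse so two
    epochs of measurements can be compared."""
    return math.pi * (major_axis_km / 2) * (minor_axis_km / 2)

def laser_power_watts(spot_radius_m, threshold_w_per_m2):
    """Icy Intel: power needed for a laser spot of area pi*r^2 to reach an
    assumed intensity threshold that makes the ice sample explode."""
    return threshold_w_per_m2 * math.pi * spot_radius_m**2
```

The real challenge problems supply the measured values; these functions just show the geometry.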

Explore More

Participate

Join the conversation and share your Pi Day Challenge answers with @NASAJPL_Edu on social media using the hashtag #NASAPiDayChallenge

Blogs and Features

Related Activities

Multimedia

Facts and Figures

Missions and Instruments

Websites

TAGS: Pi Day, K-12, STEM, Science, Engineering, Technology, Math, Pi, Educators, Teachers, Informal Education, Museums

  • Lyle Tavernier

In the News

This summer, a global dust storm encircled Mars, blocking much of the vital solar energy that NASA’s Opportunity rover needs to survive. After months of listening for a signal, the agency has declared that the longest-lived rover to explore Mars has come to the end of its mission. Originally slated for a three-month mission, the Opportunity rover lived a whopping 14.5 years on Mars. Opportunity beat the odds many times while exploring the Red Planet, returning an abundance of scientific data that paved the way for future exploration.

Scientists and engineers are celebrating this unprecedented mission success, still analyzing data collected during the past decade and a half and applying lessons learned to the design of future spacecraft. For teachers, this historic mission provides lessons in engineering design, troubleshooting and scientific discovery.

How They Did It

Launched in 2003, the twin Mars Exploration Rovers, Spirit and Opportunity, landed on Mars in early 2004. They were the second spacecraft of their kind to land on our neighboring planet.

Preceded by the small Sojourner rover in 1997, Spirit and Opportunity were substantially larger, each with a mass of about 185 kilograms – weighing about 400 pounds on Earth but only about 150 pounds under Mars’ weaker gravity – and standing about 5 feet tall. The solar-powered rovers were designed for a mission lasting 90 sols, or Mars days, during which they would look for evidence of water on the seemingly barren planet.
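A quick check of those weight figures, using standard surface-gravity values (not given in the article): weight scales with gravity while mass stays the same.

```python
mass_kg = 185     # approximate rover mass
g_earth = 9.81    # m/s^2
g_mars = 3.71     # m/s^2
N_PER_LB = 4.448  # newtons per pound-force

print(f"Earth: ~{mass_kg * g_earth / N_PER_LB:.0f} lb")  # ~408 lb
print(f"Mars:  ~{mass_kg * g_mars / N_PER_LB:.0f} lb")   # ~154 lb
```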

Dust in the Wind

Scientists and engineers always hope a spacecraft will outlive its designed lifetime, and the Mars Exploration Rovers did not disappoint. Engineers at NASA’s Jet Propulsion Laboratory in Pasadena, California, expected the lifetime of these sun-powered robots to be limited by dust accumulating on the rovers’ solar panels. As expected, power input to the rovers slowly decreased as dust settled on the panels and blocked some of the incoming sunlight. However, the panels were “cleaned” accidentally when seasonal winds blew off the dust. Several times during the mission, power levels were restored to pre-dusty conditions. Because of these events, the rovers were able to continue their exploration much longer than expected with enough power to continue running all of their instruments.

Side-by-side images of Opportunity on Mars, showing dust on its solar panels and then relatively clean solar panels

A self-portrait of NASA's Mars Exploration Rover Opportunity taken in late March 2014 (right) shows that much of the dust on the rover's solar arrays was removed since a similar portrait from January 2014 (left). Image Credit: NASA/JPL-Caltech/Cornell Univ./Arizona State Univ. | › Full image and caption

Terrestrial Twin

To troubleshoot and overcome challenges during the rovers’ long mission, engineers would perform tests on a duplicate model of the spacecraft, which remained on Earth for just this purpose. One such instance was in 2005, when Opportunity got stuck in the sand. Its right front wheel dug into loose sand, reaching to just below its axle. Engineers and scientists worked for five weeks to free Opportunity, first using images and spectroscopy obtained by the rover’s instruments to recreate the sand trap on Earth and then placing the test rover in the exact same position as Opportunity. The team eventually found a way to get the test rover out of the sand trap. Engineers tested their commands repeatedly with consistent results, giving them confidence in their solution. The same commands were relayed to Opportunity through NASA’s Deep Space Network, and the patient rover turned its stuck wheel just the right amount and backed out of the trap that had ensnared it for over a month, enabling the mission to continue.

Engineers test moves on a model of the Opportunity rover in the In-Situ Instrument Laboratory at JPL

Inside the In-Situ Instrument Laboratory at JPL, rover engineers check how a test rover moves in material chosen to simulate some difficult Mars driving conditions. | › Full image and caption

A few years later, in 2009, Spirit wasn’t as lucky. Having already sustained some wheel problems, Spirit got stuck on a slope in a position that would not be favorable for the Martian winter. Engineers were not able to free Spirit before winter took hold, denying the rover adequate sunlight for power. Its mission officially ended in 2011. Meanwhile, despite a troubled shoulder joint on its robotic arm that first started showing wear in 2006, Opportunity continued exploring the Red Planet. It wasn’t until a dust storm completely enveloped Mars in the summer of 2018 that Opportunity finally succumbed to the elements.

The Final Act

animation showing a dust storm moving across Mars

This set of images from NASA’s Mars Reconnaissance Orbiter (MRO) shows a giant dust storm building up on Mars in 2018, with rovers on the surface indicated as icons. Image credit: NASA/JPL-Caltech/MSSS | › Full image and caption

simulated views of the sun as the 2018 dust storm darkened from Opportunity's perspective on Mars

This series of images shows simulated views of a darkening Martian sky blotting out the Sun from NASA’s Opportunity rover’s point of view in the 2018 global dust storm. Each frame corresponds to a tau value, or measure of opacity: 1, 3, 5, 7, 9, 11. Image credit: NASA/JPL-Caltech/TAMU | › Full image and caption

Dust storm season on Mars can be treacherous for solar-powered rovers because if they are in the path of the dust storm, their access to sunlight can be obstructed for months on end, longer than their batteries can sustain them. Though several dust storms occurred on Mars during the reign of the Mars Exploration Rovers, 2018 brought a large, thick dust storm that covered the entire globe and blocked Opportunity’s access to sunlight for four months. Only the caldera of Olympus Mons, the largest known volcano in the solar system, peeked out above the dust.

The opacity, or “thickness,” of the dust in Mars’ atmosphere is denoted by the Greek letter tau. The higher the tau, the less sunlight is available to charge a surface spacecraft’s batteries. An average tau for Opportunity’s location is 0.5. The tau at the peak of the 2018 dust storm was 10.8. This thick dust was imaged and measured by the Curiosity Mars rover on the opposite side of the planet. (Curiosity is powered by a radioisotope thermoelectric generator rather than solar panels.)
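Tau behaves like an optical depth, so the direct sunlight reaching the surface falls off roughly as e^(-tau). That exponential model is a standard simplification, not something stated above, but it shows why the jump from 0.5 to 10.8 was fatal for a solar-powered rover.

```python
import math

# Fraction of direct sunlight that penetrates dust of opacity tau: ~e^(-tau)
for tau in (0.5, 10.8):  # typical vs. peak-of-storm values from the article
    print(f"tau = {tau:>4}: {math.exp(-tau):.6f} of direct sunlight gets through")
# tau 0.5 lets through ~61%; tau 10.8 lets through ~0.002% -- effectively night
```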

Since the last communication with Opportunity on June 10, 2018, NASA has sent more than 1,000 commands to the rover that have gone unanswered. Each of these commands was an attempt to get Opportunity to send back a signal saying it was alive. A last-ditch effort to reset the rover’s mission clock was met with silence.

Why It’s Important

The Mars Exploration Rovers were designed to give a human-height perspective of Mars, using panoramic cameras approximately 5 feet off the surface, while their science instruments investigated Mars’ surface geology for signs of water. Spirit and Opportunity returned more than 340,000 raw images conveying the beauty of Mars and leading to scientific discoveries. The rovers brought Mars into classrooms and living rooms around the world. From curious geologic formations to dune fields, dust devils and even their own tracks on the surface of the Red Planet, the rovers showed us Mars in a way we had never seen it before.

tracks on Mars with a patch of white soil showing

This mosaic shows an area of disturbed soil made by the Spirit rover's stuck right front wheel. The trench exposed a patch of nearly pure silica, with the composition of opal. Image credit: NASA/JPL-Caltech/Cornell | › Full image and caption

Mineral vein on the surface of Mars

This color view of a mineral vein was taken by the Mars rover Opportunity on Nov. 7, 2011. Image credit: NASA/JPL-Caltech/Cornell/ASU | › Full image and caption

The rovers discovered that Mars was once a warmer, wetter world than it is today and was potentially able to support microbial life. Opportunity landed in a crater and almost immediately discovered deposits of hematite, a mineral that typically forms in the presence of water. During its travels across the Martian surface, Spirit found rocks rich in magnesium and iron carbonates that likely formed when Mars was warm and wet, and sustained a near-neutral pH environment hospitable to life. At one point, while dragging its malfunctioning wheel, Spirit excavated 90 percent pure silica lurking just below the sandy surface. On Earth, this sort of silica usually exists in hot springs or hot steam vents, where life as we know it often finds a happy home. Later in its mission, near the rim of Endeavour Crater, Opportunity found bright-colored veins of gypsum in the rocks. These veins likely formed when water flowed through underground fractures in the rocks, leaving the calcium-rich mineral behind. All of these discoveries led scientists to believe that Mars was once more hospitable to life than it is today, and they laid the groundwork for future exploration.

Imagery from the Mars Reconnaissance Orbiter and Mars Odyssey, both orbiting the Red Planet, has been combined with surface views and data from the Mars Exploration Rovers for an unprecedented understanding of the planet’s geology and environment.

Not only did Spirit and Opportunity add to our understanding of Mars, but the rovers also set the stage for future exploration. Following in their tracks, the Curiosity rover landed in 2012 and is still active, investigating the planet’s surface chemistry and geology, and confirming the presence of past water. The next Mars rover, currently named Mars 2020, launches in 2020 and will be able to analyze soil samples for signs of past microbial life. It will carry a drill that can collect samples of interesting rocks and soils, and set them aside in a cache on the surface of Mars. In the future, those samples could be retrieved and returned to Earth by another mission. Mars 2020 will also do preliminary research for future human missions to the Red Planet, including testing a method of producing oxygen from Mars’ atmosphere.

It’s thanks to three generations of surface-exploring rovers coupled with the knowledge obtained by orbiters and stationary landers that we have a deeper understanding of the Red Planet’s geologic history and can continue to explore Mars in new and exciting ways.

Teach It

Use these standards-aligned lessons and related activities to get students doing engineering, troubleshooting and scientific discovery just like NASA scientists and engineers!

Explore More

Try these related resources for students from NASA’s Space Place

TAGS: K-12 Education, Teachers, Educators, Students, Opportunity, Mars rover, Rovers, Mars, Lessons, Activities, Missions

  • Ota Lutz

The supermoon lunar eclipse captured as it moved over NASA’s Glenn Research Center on September 27, 2015.

In the News

Looking up at the Moon can create a sense of awe at any time, but those who do so on the evening of January 20 will be treated to the only total lunar eclipse of 2019. Visible for its entirety in North and South America, this eclipse is being referred to by some as a super blood moon – “super” because the Moon will be closest to Earth in its orbit during the full moon (more on supermoons here) and “blood” because the total lunar eclipse will turn the Moon a reddish hue (more on that below). This is a great opportunity for students to observe the Moon – and for teachers to make connections to in-class science content.

How It Works

Eclipses can occur when the Sun, the Moon and Earth align. Lunar eclipses can happen only during a full moon, when the Moon and the Sun are on opposite sides of Earth. At that point, the Moon can move into the shadow cast by Earth, resulting in a lunar eclipse. However, most of the time, the Moon’s slightly tilted orbit brings it above or below Earth’s shadow.

Watch on YouTube

The time period when the Moon, Earth and the Sun are lined up and on the same plane – allowing for the Moon to pass through Earth’s shadow – is called an eclipse season. Eclipse seasons last about 34 days and occur just shy of every six months. When a full moon occurs during an eclipse season, the Moon travels through Earth’s shadow, creating a lunar eclipse.

Graphic showing the alignment of the Sun, Earth and Moon when a full moon occurs during an eclipse season versus a non-eclipse season

When a full moon occurs during an eclipse season, the Moon travels through Earth's shadow, creating a lunar eclipse. Credit: NASA/JPL-Caltech | + Enlarge image

Unlike solar eclipses, which require special glasses to view and can be seen only for a few short minutes in a very limited area, a total lunar eclipse can be seen for about an hour by anyone on the nighttime side of Earth – as long as skies are clear.

What to Expect

The Moon passes through two distinct parts of Earth’s shadow during a lunar eclipse. The outer part of the cone-shaped shadow is called the penumbra. The penumbra is less dark than the inner part of the shadow because it’s penetrated by some sunlight. (You have probably noticed that some shadows on the ground are darker than others, depending on how much outside light enters the shadow; the same is true for the outer part of Earth’s shadow.) The inner part of the shadow, known as the umbra, is much darker because Earth blocks additional sunlight from entering the umbra.

At 6:36 p.m. PST (9:36 p.m. EST) on January 20, the edge of the Moon will begin entering the penumbra. The Moon will dim very slightly for the next 57 minutes as it moves deeper into the penumbra. Because this part of Earth’s shadow is not fully dark, you may notice only some dim shading (if anything at all) on the Moon near the end of this part of the eclipse.

Graphic showing the positions of the Moon, Earth and Sun during a partial lunar eclipse

During a total lunar eclipse, the Moon first enters into the penumbra, or the outer part of Earth's shadow, where the shadow is still penetrated by some sunlight. Credit: NASA | + Enlarge image

At 7:33 p.m. PST (10:33 p.m. EST), the edge of the Moon will begin entering the umbra. As the Moon moves into the darker shadow, significant darkening of the Moon will be noticeable. Some say that during this part of the eclipse, the Moon looks as if it has had a bite taken out of it. That “bite” gets bigger and bigger as the Moon moves deeper into the shadow.

The Moon as seen during a partial lunar eclipse

As the Moon starts to enter into the umbra, the inner and darker part of Earth's shadow, it appears as if a bite has been taken out of the Moon. This "bite" will grow until the Moon has entered fully into the umbra. Credit: NASA | + Enlarge image

At 8:41 p.m. PST (11:41 p.m. EST), the Moon will be completely inside the umbra, marking the beginning of the total lunar eclipse. The moment of greatest eclipse, when the Moon is halfway through the umbra, occurs at 9:12 p.m. PST (12:12 a.m. EST).

Graphic showing the Moon inside the umbra

The total lunar eclipse starts once the Moon is completely inside the umbra. And the moment of greatest eclipse happens when the Moon is halfway through the umbra, as shown in this graphic. Credit: NASA | + Enlarge image

As the Moon moves completely into the umbra, something interesting happens: The Moon begins to turn reddish-orange. The reason for this phenomenon? Earth’s atmosphere. As sunlight passes through it, the small molecules that make up our atmosphere scatter blue light, which is why the sky appears blue. This leaves behind mostly red light that bends, or refracts, into Earth’s shadow. We can see the red light during an eclipse as it falls onto the Moon in Earth’s shadow. This same effect is what gives sunrises and sunsets a reddish-orange color.

The Moon as seen during a total lunar eclipse at the point of greatest eclipse

As the Moon moves completely into the umbra, it turns a reddish-orange color. Credit: NASA | + Enlarge image

A variety of factors affect the appearance of the Moon during a total lunar eclipse. Clouds, dust, ash, photochemical droplets and organic material in the atmosphere can change how much light is refracted into the umbra. Additionally, the January 2019 lunar eclipse takes place when the full moon is at or near the closest point in its orbit to Earth – a time popularly known as a supermoon. This means the Moon is deeper inside the umbra shadow and therefore may appear darker. The potential for variation provides a great opportunity for students to observe and classify the lunar eclipse based on its brightness. Details can be found in the “Teach It” section below.

At 9:43 p.m. PST (12:43 a.m. EST), the edge of the Moon will begin exiting the umbra and moving into the opposite side of the penumbra. This marks the end of the total lunar eclipse.

At 10:50 p.m. PST (1:50 a.m. EST), the Moon will be completely outside the umbra. It will continue moving out of the penumbra until the eclipse ends at 11:48 p.m. PST (2:48 a.m. EST).
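As a classroom exercise, students can verify the timeline above by computing the length of each phase from the listed PST times – for example, totality lasts just over an hour, matching the earlier claim.

```python
from datetime import datetime

# Eclipse milestones in PST, taken from the timeline above.
events = [
    ("enters penumbra", datetime(2019, 1, 20, 18, 36)),
    ("enters umbra", datetime(2019, 1, 20, 19, 33)),
    ("totality begins", datetime(2019, 1, 20, 20, 41)),
    ("totality ends", datetime(2019, 1, 20, 21, 43)),
    ("exits umbra", datetime(2019, 1, 20, 22, 50)),
    ("eclipse ends", datetime(2019, 1, 20, 23, 48)),
]

for (name_a, t_a), (name_b, t_b) in zip(events, events[1:]):
    minutes = (t_b - t_a).total_seconds() / 60
    print(f"{name_a} -> {name_b}: {minutes:.0f} min")
# totality begins -> totality ends: 62 min, a little over an hour
```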

What if it’s cloudy where you live? Winter eclipses always bring with them the risk of poor viewing conditions. If your view of the Moon is obscured by the weather, explore options for watching the eclipse online, such as the Time and Date live stream.

Why It’s Important

Lunar eclipses have long played an important role in understanding Earth and its motions in space.

In ancient Greece, Aristotle noted that the shadows on the Moon during lunar eclipses were round, regardless of where an observer saw them. He realized that only if Earth were a spheroid would its shadows be round – a revelation that he and others had many centuries before the first ships sailed around the world.

Earth wobbles on its axis like a spinning top that’s about to fall over, a phenomenon called precession. Earth completes one wobble, or precession cycle, over the course of 26,000 years. Greek astronomer Hipparchus made this discovery by comparing the position of stars relative to the Sun during a lunar eclipse to those recorded hundreds of years earlier. A lunar eclipse allowed him to see the stars and know exactly where the Sun was for comparison – directly opposite the Moon. If Earth didn’t wobble, the stars would appear to be in the same place they were hundreds of years earlier. When Hipparchus saw that the stars’ positions had indeed moved, he knew that Earth must wobble on its axis!
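A one-line estimate shows why records separated by centuries were enough for Hipparchus: at one full wobble per 26,000 years, the drift adds up to several degrees over a few hundred years – noticeable even with ancient instruments. The three-century baseline below is illustrative.

```python
degrees_per_year = 360 / 26_000  # precession rate from the 26,000-year wobble
baseline_years = 300             # illustrative gap between star observations
print(f"~{degrees_per_year * baseline_years:.1f} degrees of drift")  # ~4.2 degrees
```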

Lunar eclipses are also used for modern-day science investigations. Astronomers have used ancient eclipse records and compared them with computer simulations. These comparisons helped scientists determine the rate at which Earth’s rotation is slowing.

Teach It

Ask students to observe the lunar eclipse and evaluate the Moon’s brightness using the Danjon Scale of Lunar Eclipse Brightness. The Danjon scale illustrates the range of colors and brightness the Moon can take on during a total lunar eclipse, and it’s a tool observers can use to characterize the appearance of an eclipse. View the lesson guide below. After the eclipse, have students compare and justify their evaluations of the eclipse.

Use these standards-aligned lessons and related activities to get your students excited about the eclipse, Moon phases and Moon observations:

TAGS: Lunar Eclipse, Moon, Teachers, Educators, K-12 Education, Astronomy

  • Lyle Tavernier

This illustration shows the position of NASA's Voyager 1 and Voyager 2 probes, outside of the heliosphere, a protective bubble created by the Sun that extends well past the orbit of Pluto.

In the News

The Voyager 2 spacecraft, launched in 1977, has reached interstellar space, a region beyond the heliosphere – the protective bubble of particles and magnetic fields created by the Sun – where the only other human-made object is its twin, Voyager 1.

The achievement means new opportunities for scientists to study this mysterious region. And for educators, it’s a chance to get students exploring the scale and anatomy of our solar system, plus the engineering and math required for such an epic journey.

How They Did It

Launched just 16 days apart, Voyager 1 and Voyager 2 were designed to take advantage of a rare alignment of the outer planets that only occurs once every 176 years. Their trajectory took them by the outer planets, where they captured never-before-seen images. They were also able to steal a little momentum from Jupiter and Saturn that helped send them on a path toward interstellar space. This “gravity assist” gave the spacecraft a velocity boost without expending any fuel. Though both spacecraft were destined for interstellar space, they followed slightly different trajectories.
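The velocity boost can be sketched with the idealized, one-dimensional “slingshot” limit: in the planet’s frame the flyby speed is unchanged, so in the Sun’s frame the spacecraft can depart with up to twice the planet’s orbital speed added. The numbers below are illustrative, not Voyager’s actual flyby geometry, which gains less than this maximum.

```python
# Idealized head-on flyby: relative speed is preserved in the planet's frame,
# so the heliocentric departure speed approaches v_spacecraft + 2 * v_planet.
v_spacecraft = 10.0  # km/s, heliocentric approach speed (illustrative)
v_planet = 13.1      # km/s, roughly Jupiter's orbital speed

v_rel = v_spacecraft + v_planet  # speed relative to the planet on approach
v_out = v_rel + v_planet         # back in the Sun's frame after the turnaround
print(f"departs at ~{v_out:.1f} km/s, no propellant spent")  # ~36.2 km/s
```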

Illustration of the trajectories of Voyager 1 and 2

An illustration of the trajectories of Voyager 1 and Voyager 2. Image credit: NASA/JPL-Caltech | + Expand image

Voyager 1 followed a path that enabled it to fly by Jupiter in 1979, discovering the gas giant’s rings. It continued on for a 1980 close encounter with Saturn’s moon Titan before a gravity assist from Saturn hurled it above the plane of the solar system and out toward interstellar space. After Voyager 2 visited Jupiter in 1979 and Saturn in 1981, it continued on to encounter Uranus in 1986, where it obtained another assist. Its last planetary visit before heading out of the solar system was Neptune in 1989, where the gas giant’s gravity sent the probe in a southward direction toward interstellar space. Since the end of its prime mission at Neptune, Voyager 2 has been using its onboard instruments to continue sensing the environment around it, communicating data back to scientists on Earth. It was this data that scientists used to determine Voyager 2 had entered interstellar space.

How We Know

Interstellar space, the region between the stars, is beyond the influence of the solar wind – charged particles emanating from the Sun – but not yet within the influence of another star’s stellar wind. One hint that Voyager 2 was nearing interstellar space came in late August, when the Cosmic Ray Subsystem, an instrument that measures cosmic rays coming from the Sun and galactic cosmic rays coming from outside our solar system, measured an increase in galactic cosmic rays hitting the spacecraft. Then on November 5, the instrument detected a sharp decrease in high-energy particles from the Sun. That downward trend continued over the following weeks.

The data from the cosmic ray instrument provided strong evidence that Voyager 2 had entered interstellar space because its twin had returned similar data when it crossed the heliopause, the outer boundary of the heliosheath. But the most compelling evidence came from its Plasma Science Experiment – an instrument that had stopped working on Voyager 1 in 1980. Until recently, the space surrounding Voyager 2 was filled mostly with plasma flowing out from our Sun. This outflow, called the solar wind, creates a bubble, the heliosphere, that envelops all the planets in our solar system. Voyager 2’s Plasma Science Experiment can detect the speed, density, temperature, pressure and flux of that solar wind. On the same day that the spacecraft’s cosmic ray instrument detected a steep decline in the number of solar energetic particles, the plasma science instrument observed a decline in the speed of the solar wind. Since that date, the plasma instrument has observed no solar wind flow in the environment around Voyager 2, which makes mission scientists confident the probe has entered interstellar space.

graph showing data from the cosmic ray and plasma science instruments on Voyager 2

This animated graph shows data returned from Voyager 2's cosmic ray and plasma science instruments, which provided the evidence that the spacecraft had entered interstellar space. Image credit: NASA/JPL-Caltech/GSFC | + Expand image

Though the spacecraft have left the heliosphere, Voyager 1 and Voyager 2 have not yet left the solar system, and won't be leaving anytime soon. The boundary of the solar system is considered to be beyond the outer edge of the Oort Cloud, a collection of small objects that are still under the influence of the Sun's gravity. The width of the Oort Cloud is not known precisely, but it is estimated to begin at about 1,000 astronomical units from the Sun and extend to about 100,000 AU. (One astronomical unit, or AU, is the distance from the Sun to Earth.) It will take about 300 years for Voyager 2 to reach the inner edge of the Oort Cloud and possibly 30,000 years to fly beyond it. By that time, both Voyager spacecraft will be completely out of the hydrazine fuel used to point them toward Earth (to send and receive data) and their power sources will have decayed beyond their usable lifetime.
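An order-of-magnitude check on the ~300-year figure, assuming Voyager 2 travels at roughly 15.4 kilometers per second relative to the Sun – a commonly cited speed, not stated in the article:

```python
AU_KM = 1.496e8    # kilometers per astronomical unit
speed_km_s = 15.4  # assumed heliocentric speed of Voyager 2

seconds = 1_000 * AU_KM / speed_km_s   # to the inner edge of the Oort Cloud
years = seconds / (3600 * 24 * 365.25)
print(f"~{years:.0f} years")           # ~300 years
```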

Why It’s Important

In the more than 40 years since the Voyager spacecraft launched, no other NASA missions have encountered as many planets (some of which had never before been visited) or continued making science observations from such great distances. Other spacecraft, such as New Horizons and Pioneer 10 and 11, will eventually make it to interstellar space, but we will have no data from them to confirm their arrival or explore the region because their instruments either have already shut off or will shut off before then.

Watch on YouTube

Interstellar space is a region that’s still mysterious because until 2012, when Voyager 1 arrived there, no spacecraft had visited it. Now, data from Voyager 2 will help add to scientists’ growing understanding of the region. Scientists are hoping to continue using Voyager 2’s plasma science instrument to study the properties of the ionized gases, or plasma, that exist in the interstellar medium by making direct measurements of the plasma density and temperature. This new data may shed more light on the evolution of our solar neighborhood and will most certainly provide a window into the exciting unexplored region of interstellar space, improving our understanding of space and our place in it.

As power wanes on Voyager 2, scientists will have to make tough choices about which instruments to keep turned on. Further complicating the situation is the freezing cold temperature at which the spacecraft is currently operating – perilously close to the freezing point of its hydrazine fuel. But for as long as both Voyager spacecraft are able to maintain power and communication, we will continue to learn about the uncharted territory of interstellar space.

Teach It

Use these standards-aligned lessons and related activities to get students doing math and science with a real-world (and space!) connection.

Explore More

TAGS: Teachers, Educators, Science, Engineering, Technology, Solar System, Voyager, Spacecraft, Educator Resources, Lessons, Activities

  • Ota Lutz

Animation showing InSight landing on Mars

Tom Hoffman, InSight Project Manager, NASA JPL, left, and Sue Smrekar, InSight deputy principal investigator, NASA JPL, react after receiving confirmation InSight is safe on the surface of Mars

This is the first image taken by NASA's InSight lander on the surface of Mars.

The Instrument Deployment Camera (IDC), located on the robotic arm of NASA's InSight lander, took this picture of the Martian surface on Nov. 26

UPDATE: Nov. 27, 2018 – The InSight spacecraft successfully touched down on Mars just before noon on Nov. 26, 2018, marking the eighth time NASA has succeeded in landing a spacecraft on the Red Planet. This story has been updated to reflect the current mission status. For more mission updates, follow along on the InSight Mission Blog, JPL News, as well as Facebook and Twitter (@NASAInSight, @NASAJPL and @NASA).


In the News

NASA’s newest mission to Mars, the InSight lander, touched down just before noon PST on Nov. 26. So while some people were looking for Cyber Monday deals, scientists and engineers at NASA’s Jet Propulsion Laboratory were monitoring their screens for something else: signals from the spacecraft that it successfully touched down on the Red Planet.

InSight spent nearly seven months in space, kicked off by the first interplanetary launch from the West Coast of the U.S. Once it arrived at the Red Planet, InSight had to perform its entry, descent and landing, or EDL, to safely touch down on the Martian surface. This was perhaps the most dangerous part of the entire mission because it required that the spacecraft withstand temperatures near 1,500 degrees Celsius (about 2,700 degrees Fahrenheit), quickly put on its brakes by using the atmosphere to slow down, then release a supersonic parachute and finally lower itself to the surface using 12 retrorockets.

When NASA’s InSight descends to the Red Planet on Nov. 26, 2018, it is guaranteed to be a white-knuckle event. Rob Manning, chief engineer at NASA’s Jet Propulsion Laboratory, explains the critical steps that must happen in perfect sequence to get the robotic lander safely to the surface. | Watch on YouTube

But even after that harrowing trip to the surface, InSight will have to overcome one more challenge before it can get to the most important part of the mission, the science. After a thorough survey of its landing area, InSight will need to carefully deploy each of its science instruments to the surface of Mars. It may sound like an easy task, but it’s one that requires precision and patience.

It’s also a great opportunity for educators to engage students in NASA’s exploration of Mars and the importance of planetary science while making real-world connections to lessons in science, coding and engineering. Read on to find out how.

How It Works: Deploying InSight’s Instruments

InSight is equipped with three science investigations with which to study the deep interior of Mars for the first time. The Seismic Experiment for Interior Structures, or SEIS, is a seismometer that will record seismic waves traveling through the interior of Mars.

These waves can be created by marsquakes, or even meteorites striking the surface. The Heat Flow and Physical Properties Package, or HP3, will investigate how much heat is still flowing out of Mars. It will do so by hammering a probe down to a depth of up to 16 feet (about 5 meters) underground. The Rotation and Interior Structure Experiment, or RISE, will use InSight’s telecommunications system to precisely track the movement of Mars through space. This will shed light on the makeup of Mars’ iron-rich core.

But to start capturing much of that science data, InSight will have to first carefully move the SEIS and HP3 instruments from its stowage area on the lander deck and place them in precise locations on the ground. Among its many firsts, InSight will be the first spacecraft to use a robotic arm to place instruments on the surface of Mars. Even though each instrument will need to be lowered only a little more than three feet (1 meter) to the ground, it’s a delicate maneuver that the team will rehearse to make sure they get it right.

InSight’s robotic arm is nearly 6 feet (about 2 meters) long. At the end of the arm is a five-fingered grappler that is designed to grab SEIS and HP3 from the deck of the lander and place them on the ground in front of the lander in a manner similar to how a claw game grabs prizes and deposits them in the collection chute. But on Mars, it has to work every time.

InSight will be the first mission on another planet to use a robotic arm to grasp instruments and place them on the surface. While it may look like an arcade machine, this space claw is designed to come away with a prize every time. | Watch on YouTube

Before the instruments can be set down, the area where they will be deployed – commonly referred to as the work space – must be assessed so SEIS and HP3 can be positioned in the best possible spots to meet their science goals. InSight is designed to land with its solar panels in an east-west orientation and its robotic arm facing south. The work space covers about three square meters to the south of the lander. Because InSight is a three-legged lander and not a six-wheeled rover, science and engineering teams must find the best areas to deploy the instruments within the limited work space at InSight’s landing spot. That is why choosing the best landing site (which for InSight means one that is very flat and has few rocks) is so important.

Just as having two eyes gives us the ability to perceive depth, InSight will use a camera on its robotic arm to take what are known as stereo-pair images. These image pairs, made by taking a photo and then moving the camera slightly to the side for another image, provide 3D elevation information that’s used by the science and engineering teams. With this information, they can build terrain maps that show roughness and tilt, and generate something called a goodness map to help identify the best location to place each instrument. Evaluating the work space is expected to take a few weeks.
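The depth recovery behind stereo-pair imaging can be sketched with the standard pinhole-camera relation: depth = focal length × baseline / disparity. The numbers below are placeholders for illustration, not actual parameters of InSight’s arm camera.

```python
focal_length_px = 1000.0  # focal length expressed in pixels (placeholder)
baseline_m = 0.05         # how far the camera moved between shots (placeholder)
disparity_px = 25.0       # how far a surface feature shifted between images

depth_m = focal_length_px * baseline_m / disparity_px
print(f"feature is ~{depth_m:.1f} m away")  # 2.0 m
```

The closer a feature is, the more it shifts between the two images – which is why a small sideways move of the arm camera is enough to map the work space in 3D.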

Once the team has selected the locations where they plan to deploy the instruments, the robotic arm will use its grapple to first grab SEIS and lower it to the surface. When the team confirms that the instrument is on the ground, the grapple will be released and images will be taken. If the team decides they like where the instrument is placed, it will be leveled, and the seismic sensor will be re-centered so it can be calibrated to collect scientific data. If the location is deemed unsuitable, InSight will use its robotic arm to reposition SEIS.

But wait, there’s more! SEIS is sensitive to changes in air pressure, wind and even local magnetic fields. In fact, it is so sensitive that it can detect ground movement as small as half the radius of a hydrogen atom! So that the instrument isn’t affected by the wind and changes in temperature, the robotic arm will have to cover SEIS with the Wind and Thermal Shield.

After SEIS is on the ground and covered by the shield, and the deployment team is satisfied with their placement, the robotic arm will grab the HP3 instrument and place it on the surface. Just as with SEIS, once the team receives confirmation that HP3 is on the ground, the grapple will be released and the stability of the instrument will be confirmed. The final step in deploying the science instruments is to release the HP3 self-hammering mole from within the instrument so that it will be able to drive itself into the ground. The whole process from landing to final deployment is expected to take two to three months.

Why It’s Important

For the science instruments to work – and for the mission to be a success – it’s critical that the instruments are safely deployed. So while sending a mission to another planet is a huge accomplishment and getting pictures of other worlds is inspiring, it’s important to remember that science is the driver behind these missions. As technologies advance, new techniques are discovered and new ideas are formulated. Opportunities arise to explore new worlds and revisit seemingly familiar worlds with new tools.

Using its science instruments, SEIS and HP3, plus the radio-science experiment (RISE) to study how much Mars wobbles as it orbits the Sun, InSight will help scientists look at Mars in a whole new way: from the inside.

SEIS will help scientists understand how tectonically active Mars is today by measuring the power and frequency of marsquakes, and it will also measure how often meteorites impact the surface of Mars.

HP3 and RISE will give scientists the information they need to determine the size of Mars’ core and whether it’s liquid or solid; the thickness and structure of the crust; the structure of the mantle and what it’s made of; and how warm the interior is and how much heat is still flowing through.

Answering these questions is important for understanding Mars, and on a grander scale, it is key to forming a better picture of the formation of our solar system, including Earth.

Teach It

Use these resources to bring the excitement of NASA’s newest Mars mission and the scientific discovery that comes with it into the classroom.

Explore More

Follow Along

Resources and Activities

Feature Stories and Podcasts

Websites and Interactives

TAGS: InSight, Landing, Mars, K-12 Educators, Informal Educators, Engineering, Science, Mission Events

  • Lyle Tavernier

Satellite images of the 2018 Carr and Ferguson wildfires in California

Update – August 8, 2018: This feature, originally published on August 23, 2016, has been updated to include information on 2018 fires and current fire research.

Once again, it’s fire season in the western United States, with many citizens finding themselves shrouded in wildfire smoke. Late summer in the West brings heat, low humidity and wind – optimal conditions for fire. These critical conditions have resulted in the Mendocino Complex Fire, the largest fire in California’s recorded history. Burning concurrently in California are numerous other wildfires, including the Carr Fire, the 12th largest in California history.

Because of their prevalence and effects on a wide population, wildfires will remain a seasonal teachable moment for decades to come. Follow these links to learn about NASA’s fire research and see images of current fires from space. Check out the information and lessons below to help students learn how NASA scientists use technology to monitor and learn about fires and their impacts.


In the News

You didn’t need to check social media, read the newspaper or watch the local news to know that California wildfires were making headlines this summer. Simply looking up at a smoke-filled sky was enough for millions of people in all parts of the state to know there was a fire nearby.

Fueled by high temperatures, low humidity, high winds and five years of vegetation-drying drought, more than 4,800 fires have engulfed 275,000-plus acres across California already this year. And the traditional fire season – the time of year when fires are more likely to start, spread and consume resources – has only just begun.

With wildfires starting earlier in the year and continuing to ignite throughout all seasons, fire season is now a year-round affair not just in California, but also around the world. In fact, the U.S. Forest Service found that fire seasons have grown longer in 25 percent of Earth's vegetation-covered areas.

For NASA's Jet Propulsion Laboratory, which is located in Southern California, the fires cropping up near and far are a constant reminder that its efforts to study wildfires around the world – from space, from the air and on the ground – are as important as ever.

JPL uses a suite of Earth satellites and airborne instruments to help better understand fires and aid in fire management and mitigation. By looking at multiple images and types of data from these instruments, scientists compare what a region looked like before, during and after a fire, as well as how long the area takes to recover.

Animation of the FireSat network of satellites capturing wildfires on Earth

This animation shows how FireSat would use a network of satellites around the Earth to detect fires faster than ever before. | + Expand image

While the fire is burning, scientists watch its behavior from an aerial perspective to get a big-picture view of the fire itself and the air pollution it is generating in the form of smoke filled with carbon monoxide and carbon dioxide.

Natasha Stavros, a wildfire expert at JPL, joined Zach Tane with the U.S. Forest Service during a Facebook Live event (viewable below) to discuss some of these technologies and how they're used to understand wildfire behavior and improve wildfire recovery.

Additionally, JPL is working with a San Francisco startup called Quadra Pi R2E to develop FireSat, a global network of satellites designed to detect wildfires and alert firefighting crews faster. Once completed, the network's array of more than 200 satellites will use infrared sensors to detect fires around the world much faster than is possible today. Working 24 hours a day, the satellites will be able to automatically detect fires as small as 35 to 50 feet wide within 15 minutes of when they begin. And within three minutes of a fire being detected, the FireSat network will notify emergency responders in the area.

Using these technologies, NASA scientists are gaining a broader understanding of fires and their impacts.

Why It's Important

One of the ways we often hear wildfires classified is by how much area they have burned. Though this is certainly of some importance, of greater significance to fire scientists is the severity of the fire. Wildfires are classified as burning at low, moderate or high severity. Severity is a function of the fire's intensity – how hot it burns – and its spread rate, the speed at which it travels. A high-severity fire is going to do some real damage. (Severity is measured by the damage left after the fire, but it can be estimated during a fire event by calculating the spread rate and measuring flame height, which indicates intensity.)
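To get a feel for how flame height relates to intensity, students comfortable with exponents can experiment with a relationship fire scientists often use, Byram's flame-length equation. The Python sketch below inverts that equation to estimate fireline intensity from flame length. The severity bins are hypothetical, included only to illustrate the idea; as noted above, real severity classifications are based on damage assessed after the fire.

```python
def byram_intensity(flame_length_m):
    """Invert Byram's flame-length relation, L = 0.0775 * I**0.46,
    to estimate fireline intensity I (kW/m) from flame length L (m)."""
    return (flame_length_m / 0.0775) ** (1 / 0.46)

def classify(flame_length_m):
    """Hypothetical severity bins for illustration only -- real severity
    classifications are based on damage measured after the fire."""
    intensity = byram_intensity(flame_length_m)
    if intensity < 350:
        return "low", intensity
    if intensity < 1700:
        return "moderate", intensity
    return "high", intensity

for length in (0.5, 1.5, 3.0):
    label, intensity = classify(length)
    print(f"{length:.1f} m flames -> ~{intensity:,.0f} kW/m ({label})")
```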

Google Earth image showing fire severity
This image, created using data imported into Google Earth, shows the severity of the 2014 King Fire. Green areas are unchanged by the fire; yellow equals low severity; orange equals moderate severity; and red equals high severity. A KMZ file with this data is available in the Fired Up Over Math lesson linked below. Credit: NASA/JPL-Caltech/E. Natasha Stavros.

The impacts of wildfires range from the immediate and tangible to the delayed and less obvious. The potential for loss of life, property and natural areas is one of the first threats that wildfires pose. From a financial standpoint, fires can lead to a downturn in local economies due to loss of tourism and business, high costs related to infrastructure restoration, and impacts to federal and state budgets.

The release of greenhouse gases like carbon dioxide and carbon monoxide is also an important consideration when thinking about the impacts of wildfires. Using NASA satellite data, researchers at the University of California, Berkeley, determined that between 2001 and 2010, California wildfires emitted about 46 million tons of carbon, around five to seven percent of all carbon emitted by the state during that time period.

Animation showing carbon monoxide levels rising from the Station Fire in Southern California.
This animation from NASA's Eyes on the Earth visualization program shows carbon monoxide rising (red is the highest concentration) around Southern California as the Station Fire engulfed the area near JPL in 2009. Image credit: NASA/JPL-Caltech

In California and the western United States, longer fire seasons are linked to changes in spring rains, vapor pressure and snowmelt – all of which have been connected to climate change. Wildfires also act as part of a climate feedback loop: some of their effects, namely the release of carbon dioxide and carbon monoxide, contribute to climate change, thereby enhancing the very conditions that make fire seasons longer and stronger.

While this may seem like a grim outlook, it's worth noting that California forests still act as carbon sinks – natural environments capable of absorbing carbon dioxide from the atmosphere. In certain parts of the state, each hectare of redwood forest can store the equivalent of the annual greenhouse gas output of 500 Americans.

Studying and managing wildfires is important for maintaining resources; protecting people, property and ecosystems; and reducing air pollution. That's why JPL, NASA and other agencies continue to study these threats and develop technologies to better understand and manage them.

Teach It

Have your students try their hands at solving some of the same fire-science problems NASA scientists do with these two lessons, which get students in grades 3 through 12 using NASA data, algebra and geometry to approximate burn areas, fire-spread rates and fire intensity:
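As a taste of the approach those lessons take, the sketch below estimates a burn area by counting "burned" pixels in a classified satellite image and multiplying by the ground area each pixel covers. The tiny grid and the 30-meter pixel size are made-up stand-ins for real imagery.

```python
# Estimate burn area from a classified satellite image: count burned
# pixels (1 = burned, 0 = unburned) and scale by each pixel's footprint.
# The grid below is a made-up stand-in for real classified imagery.
PIXEL_SIZE_M = 30.0  # assumed Landsat-like resolution, meters per pixel

burn_mask = [
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]

burned_pixels = sum(sum(row) for row in burn_mask)
area_m2 = burned_pixels * PIXEL_SIZE_M ** 2
print(f"{burned_pixels} burned pixels -> {area_m2 / 1e4:.2f} hectares "
      f"({area_m2 / 4046.86:.2f} acres)")
```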

Explore More


Lyle Tavernier was a co-author on this feature.

TAGS: teachable moments, wildfires, science

  • Ota Lutz

In the News

A pair of Earth orbiters designed to keep track of the planet's water resources and evolving water cycle is scheduled to launch this month – no earlier than May 22, 2018. The Gravity Recovery and Climate Experiment Follow-On mission, or GRACE-FO, will pick up where its predecessor, GRACE, left off when it completed its 15-year mission in 2017. By measuring changes in Earth's gravity, the mission will track water movement around the globe, identifying risks such as droughts and floods, and revealing how land ice and sea level are evolving. The GRACE-FO mission is a great way to get students asking, and answering, questions about how we know what we know about some of the major components of Earth's water cycle: ice sheets, glaciers, sea level, and groundwater resources.

How It Works

The GRACE-FO mission, a partnership between NASA and the German Research Centre for Geosciences (GFZ), will measure small variations in Earth’s mass to track how and where water is moving across the planet. This is no easy task, as water can be solid, liquid or gas; it can be in plain sight (as in a lake or glacier); it can be in the atmosphere or hidden underground; and it’s always on the move. But one thing all this water has in common, regardless of what state of matter it is in or where it is located, is mass.

Everything that has mass exerts a gravitational force. It is this gravitational force that GRACE-FO measures to track the whereabouts of water on Earth. Most of Earth's gravitational force, more than 99 percent, does not change from one month to the next because it is exerted by Earth’s solid surface and interior. GRACE-FO is sensitive enough to measure the tiny amount that does change – mostly as a result of the movement of water within the Earth system.

GRACE-FO works by flying two spacecraft in tandem around Earth – one trailing the other at a distance of about 137 miles (220 kilometers). By pointing their microwave ranging instruments at each other, the satellites can measure tiny changes in the distance between them – changes as small as one micron, smaller than the diameter of a red blood cell – caused by variations in Earth's gravitational field. Scientists can then use those measurements to map Earth's global gravitational field and calculate local mass variations.

As the forward spacecraft travels over a region that has more or less mass than the surrounding areas, such as a mountain or low valley, the gravitational attraction of that mass will cause the spacecraft to speed up or slow down, slightly increasing or decreasing the relative distance between it and its trailing companion. As a result of this effect, GRACE-FO will be able to track water as it moves into or out of a region, changing the region’s mass and, therefore, its gravity. In fact, the previous GRACE spacecraft measured a weakening gravity field over several years in Central California, enabling an estimate of aquifer depletion, and in Greenland, providing accurate measurements of ice melt over more than 15 years.
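To see why micron-level ranging is necessary, consider a toy model – a sketch under simplifying assumptions, not how the mission actually processes data. Treat a patch of moved water as a point mass below the satellites' path and compare the along-track pull it exerts on the leading and trailing spacecraft. The altitude, anomaly size and flat-Earth geometry below are all illustrative:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
ALT = 490e3    # assumed orbit altitude, m (illustrative, not a mission value)
SEP = 220e3    # leader-trailer separation from the article, m

def along_track_pull(x_sat, m_anom, x_anom=0.0):
    """Along-track acceleration (m/s^2) on a satellite at along-track
    position x_sat from a point-mass anomaly on the surface at x_anom,
    using simple flat-Earth geometry for illustration."""
    dx, dz = x_anom - x_sat, ALT
    r = np.hypot(dx, dz)
    return G * m_anom * dx / r**3  # along-track component only

# Hypothetical anomaly: 0.4 m of water over a 100 km x 100 km region
m_water = 1000.0 * 0.4 * (100e3 * 100e3)  # density * depth * area, kg

for x_lead in np.linspace(-600e3, 600e3, 7):
    a_lead = along_track_pull(x_lead, m_water)
    a_trail = along_track_pull(x_lead - SEP, m_water)
    print(f"leader at {x_lead / 1e3:+6.0f} km: differential accel "
          f"{a_lead - a_trail:+.2e} m/s^2")
```

The differential accelerations that come out are on the order of a ten-billionth of Earth's surface gravity, which is why the mission measures distance changes smaller than a blood cell rather than trying to sense the tug directly.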

Find out more about how the mission works in the video below, from JPL's "Crazy Engineering" video series:

Why It’s Important

Tracking changes in our water resources and the water cycle is important for everyone. The water cycle is one of the fundamental processes on Earth that sustains life and shapes our planet, moving water between Earth's oceans, atmosphere and land. Over thousands of years, we have developed our civilizations around that cycle, placing cities and agriculture near rivers and the sea, building reservoirs and canals to bring water to where it is needed, and drilling wells to pump water from the ground. We depend on this cycle for the water resources that we need, and as those resources change, communities and livelihoods are affected. For example, too much water in an area causes dangerous floods that can destroy property, crops and infrastructure. Too little water causes shortages, which require us to reduce how much water we use. GRACE-FO will provide monthly data that will help us study those precious water resources.

Graphic showing the amount of water in aquifers across Earth as measured by GRACE

A map of groundwater storage trends for Earth's 37 largest aquifers using GRACE data shows depletion and replenishment in millimeters of water per year. Twenty-one aquifers have exceeded sustainability tipping points and are being depleted, and 13 of these are considered significantly distressed, threatening regional water security and resilience. Image credit: NASA/JPL-Caltech

Changes to Earth’s water over multiple years are an important indicator of how Earth is responding in a changing climate. Monitoring changes in ice sheets and glaciers, surface and underground water storage, the amount of water in large lakes and rivers, as well as changes in sea level and ocean currents, provides a global view of how Earth’s water cycle and energy balance are evolving. As our climate changes and our local water resources shift, we need accurate observations and continuous measurements like those from GRACE and GRACE Follow-On to be able to respond and plan.

As a result of the GRACE mission, we have a much more accurate picture of how our global water resources are evolving in both the short and long term. GRACE-FO will continue the legacy of GRACE, yielding up-to-date water and surface mass information and allowing us to identify trends over the coming years.

Teach It

Have students interpret GRACE data for themselves:
Get students learning about global water resources:
Teach students to read, interpret and compare “heat map” representations of Earth science data:
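For students new to "heat map" representations, here is a minimal sketch that builds one from synthetic gridded data in the style of the GRACE trend map above. The values are randomly generated stand-ins, not real measurements:

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up gridded "water storage trend" values (cm/year) standing in
# for real GRACE data, on a coarse 10-degree latitude/longitude grid.
rng = np.random.default_rng(0)
trend = rng.normal(loc=0.0, scale=2.0, size=(18, 36))

plt.imshow(trend, cmap="RdBu", origin="lower",
           extent=[-180, 180, -90, 90], vmin=-5, vmax=5)
plt.colorbar(label="water storage trend (cm/yr, synthetic)")
plt.xlabel("longitude")
plt.ylabel("latitude")
plt.title("Synthetic heat map in the style of GRACE trend maps")
plt.show()
```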

Explore More

Try these related resources for students from NASA's Space Place:

TAGS: Earth Science, Teach, In the News, GRACE, Climate Change, Water, Water Cycle

  • Ota Lutz

In the News

A spacecraft designed to study seismic activity on Mars, or “marsquakes,” is scheduled to lift off on a nearly seven-month journey to the Red Planet on May 5, 2018.

NASA’s InSight Mars lander is designed to get the first in-depth look at the “heart” of Mars: its crust, mantle and core. In other words, it will be the Red Planet’s first thorough checkup since it formed 4.5 billion years ago. The launch, from Vandenberg Air Force Base in Central California, also marks a first: It will be the first time a spacecraft bound for another planet lifts off from the West Coast. It’s a great opportunity to get students excited about the science and math used to launch rockets and explore other planets.

How It Works

NASA usually launches interplanetary spacecraft from the East Coast, at Cape Canaveral in Florida, to give them a momentum boost from Earth's easterly rotation. It's similar to how running in the direction you are throwing a ball gives the ball a momentum boost. If a spacecraft launches without that extra earthly boost, the rocket engine must make up the difference. Because InSight is a small, lightweight spacecraft, its rocket can easily supply the extra energy needed to send it on its way without help from Earth's rotation.
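For a back-of-the-envelope sense of the boost at stake, students can compute Earth's eastward surface speed at each launch site's latitude. The latitudes below are approximate, and this is an illustration, not mission math:

```python
import math

R_EQ = 6378.137e3     # Earth's equatorial radius, m
T_SIDEREAL = 86164.1  # length of one sidereal day (one rotation), s

def eastward_speed(lat_deg):
    """Earth's surface rotation speed at a given latitude, m/s."""
    return 2 * math.pi * R_EQ * math.cos(math.radians(lat_deg)) / T_SIDEREAL

print(f"Cape Canaveral (~28.5 N): {eastward_speed(28.5):.0f} m/s")
print(f"Vandenberg     (~34.7 N): {eastward_speed(34.7):.0f} m/s")
```

Either way the free ride is roughly 400 meters per second – helpful, but a small fraction of the speed a rocket must provide to reach Mars, which is why a light spacecraft can forgo it.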

Scheduled to launch no earlier than 4:05 a.m. PDT on May 5, InSight will travel aboard an Atlas V 401 launch vehicle on a southerly trajectory over the Pacific Ocean. (Here's how to watch the launch in person or online.) If the weather is bad or there are any mechanical delays, InSight can launch the next day. In fact, InSight can launch any day between May 5 and June 8, a time span known as a launch period, which has multiple launch opportunities during a two-hour launch window each day.

Regardless of the date when InSight launches, its landing on Mars is planned for November 26, 2018, around noon PST. Mission controllers can account for the difference in planetary positions between the beginning of the launch period and the end by varying the amount of time InSight spends in what's called a parking orbit – a temporary orbit a spacecraft enters before moving on to its final orbit or trajectory. For InSight, the Atlas V 401 will boost the spacecraft into a parking orbit, where it will coast until it is in the proper position for an engine burn that will send it toward Mars. The parking orbit will last 59 to 66 minutes, depending on the date and time of the launch.
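To put those coast times in context, Kepler's third law gives the period of a circular parking orbit. The sketch below assumes a typical low-Earth parking-orbit altitude of 185 kilometers, which the article doesn't specify; at that height, one full lap takes about 88 minutes, so a 59-to-66-minute coast is roughly two-thirds to three-quarters of an orbit.

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # mean Earth radius, m

def parking_orbit_period_min(altitude_m):
    """Period of a circular orbit from Kepler's third law:
    T = 2 * pi * sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_m  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 60.0

# 185 km is an assumed, typical parking-orbit altitude, not a mission value
print(f"{parking_orbit_period_min(185e3):.1f} minutes per orbit")  # ~88
```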

Why It’s Important

Previous missions to Mars have investigated the history of the Red Planet’s surface by examining features like canyons, volcanoes, rocks and soil. However, many important details about the planet's formation can only be found by studying the planet’s interior, far below the surface. And to do that, you need specialized instruments and sensors like those found on InSight.

The InSight mission, designed to operate for one Mars year (approximately two Earth years), will use its suite of instruments to investigate the interior of Mars and uncover how a rocky body forms and becomes a planet. Scientists hope to learn the size of Mars’ core, what it’s made of and whether it’s liquid or solid. InSight will also study the thickness and structure of Mars’ crust, the structure and composition of the mantle and the temperature of the planet’s interior. And a seismometer will determine how often Mars experiences tectonic activity, known as “marsquakes,” and meteorite impacts.

Together, the instruments will measure Mars’ vital signs: its "pulse" (seismology), "temperature" (heat flow), and "reflexes" (wobble). Here’s how they work:

Illustration of the InSight Mars lander on the Red Planet - Labeled

This labeled artist's concept depicts the NASA InSight Mars lander at work studying the interior of Mars.

InSight’s seismometer is called SEIS, or the Seismic Experiment for Interior Structure. By measuring seismic vibrations across Mars, it will provide a glimpse into the planet’s internal activity. The volleyball-size instrument will sit on the Martian surface and wait patiently to sense the seismic waves from marsquakes and meteorite impacts. These measurements can tell scientists about the arrangement of different materials inside Mars and how the rocky planets of the solar system first formed. The seismometer may even be able to tell us if there's liquid water or rising columns of hot magma from active volcanoes underneath the Martian surface.

The Heat Flow and Physical Properties Probe, HP3 for short, will burrow almost 16 feet (five meters) into Mars' surface – deeper than any spacecraft arm, scoop, drill or probe has gone before. Like studying the heat leaving a car engine, HP3 will measure the heat coming from Mars' interior to reveal how much heat is flowing out and what its source is. This will help scientists determine whether Mars formed from the same material as Earth and the Moon, and give them a sneak peek into how the planet evolved.

InSight's Rotation and Interior Structure Experiment, or RISE, will track tiny variations in the location of the lander. Even though InSight sits stationary on the surface, its position in space will wobble slightly along with Mars itself as the planet spins on its axis. Scientists can use what they learn about the Red Planet's wobble to determine the size of Mars' iron-rich core, whether the core is liquid, and which other elements, besides iron, may be present.

When InSight lifts off, along for the ride in the rocket will be two briefcase-size satellites, or CubeSats, known as MarCO, or Mars Cube One. They will take their own path to Mars behind InSight, arriving in time for landing. If all goes as planned, as InSight enters the Martian atmosphere, MarCO will relay data to Earth about entry, descent and landing operations, potentially faster than ever before. InSight will also transmit data to Earth the way previous Mars spacecraft have, by using NASA’s Mars Reconnaissance Orbiter as a relay. MarCO will be the first test of CubeSat technology at another planet, and if successful, it could provide a new way to communicate with spacecraft in the future, providing news of a safe landing – or any potential problems – sooner.

Thanks to the Mars rovers, landers and orbiters that have come before, scientists know that Mars has low levels of geological activity – but a lander like InSight can reveal what might be lurking below the surface. And InSight will give us a chance to discover more not just about the history of Mars, but also about our own planet's formation.

Teach It

When launching to another planet, we want to take the most efficient route, using the least amount of rocket fuel possible. To take this path, we must launch during a specific span of time, called a launch period. Use this advanced algebra lesson to estimate the launch period for the InSight lander and future Mars missions.

SEIS will record the times at which marsquake surface waves arrive at the lander. Try your hand at using these arrival times, a little algebra and the mathematical constant π to determine the timing and location of a marsquake, just like NASA scientists do! (A minimal worked example in code follows this list.)

Take students on a journey to Mars with this set of 19 standards-aligned STEM lessons that can be modified to fit various learning environments, including out-of-school time.

Build, test and launch your very own air-powered rocket to celebrate the first West Coast interplanetary spacecraft launch!
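Here is one way the surface-wave math referenced above can play out in code – a sketch under simplifying assumptions (a single wave speed, waves traveling both directions around a great circle, and made-up arrival times), not the procedure NASA scientists actually use:

```python
import math

# Mars' circumference from its mean radius (~3,389.5 km): C = 2 * pi * r
CIRCUMFERENCE_KM = 2 * math.pi * 3389.5  # ~21,297 km

def locate_marsquake(t1, t2, t3):
    """Recover wave speed, epicentral distance and quake origin time from
    the first three surface-wave arrivals at one station: the minor arc
    (t1), the major arc (t2), and the minor arc plus one full lap (t3)."""
    v = CIRCUMFERENCE_KM / (t3 - t1)             # a full lap takes t3 - t1
    d = (CIRCUMFERENCE_KM - v * (t2 - t1)) / 2   # minor-arc distance, km
    t0 = t1 - d / v                              # quake origin time, s
    return v, d, t0

# Hypothetical arrival times in seconds after an arbitrary clock zero
v, d, t0 = locate_marsquake(t1=1500.0, t2=4700.0, t3=7500.0)
print(f"speed ~{v:.2f} km/s, distance ~{d:.0f} km, origin ~{t0:.0f} s")
```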

Explore More

Try these related resources for students from NASA's Space Place:

TAGS: InSight, Lessons, K-12, Activities, Teaching, STEM, Mars

  • Ota Lutz