Ota Lutz is the manager of the K-12 Education Group at NASA’s Jet Propulsion Laboratory. When she’s not writing new lessons or teaching, she’s probably cooking something delicious, volunteering in the community, or dreaming about where she will travel next.


Icons and overlays showing an orbital path, heat map, and cat's heart rate are shown over an image of an orange tabby cat lying on a gray couch and looking intently off to the side.

Find out how the now-famous video beamed from space, showing a cat chasing a laser, marked a milestone for space exploration, and find resources to engage students in related STEM learning.


You may have seen in the news last month that NASA beamed a cat video from space. It was all part of a test of new technology known as Deep Space Optical Communications. While the video went down in cat video history, the NASA technology used to transmit the first ultra-high-definition video from deep space also represented a historic advancement for space exploration – the potential to stream videos from the Moon, Mars, and beyond.

Read on to learn how this new technology will revolutionize space communications. Then, explore STEM learning resources that will get students using coding, math, and engineering to explore more about how NASA communicates with spacecraft.

Why did NASA beam a cat video from space?

Communicating with spacecraft across the solar system means sending data – such as commands, images, measurements, and status reports – over enormous distances, with travel times limited by the speed of light. NASA spacecraft have traditionally used radio signals to transmit information to Earth via the Deep Space Network, or DSN. The DSN is made up of an array of giant antennas situated around the globe (in California, Spain, and Australia) that allow us to keep in contact with distant spacecraft as Earth rotates.
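
To get a sense of the distances involved, you can estimate a signal's one-way travel time by dividing distance by the speed of light. Here's a minimal sketch in Python, using rounded, illustrative distances:

```python
# One-way light travel time for a signal (illustrative, rounded distances)
C_KM_PER_S = 299_792  # speed of light in km/s

distances_km = {
    "Moon": 384_400,                            # average Earth-Moon distance
    "Mars at close approach": 54_600_000,
    "Psyche, Dec. 2023 DSOC test": 30_600_000,  # ~19 million miles
}

for target, d_km in distances_km.items():
    t = d_km / C_KM_PER_S
    print(f"{target}: {t:,.1f} seconds one way (~{t/60:.1f} minutes)")
```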

When scientists and engineers want to send commands to a spacecraft in deep space, they turn to the Deep Space Network, NASA’s international array of giant antennas. | Watch on YouTube

Although sending transmissions using radio frequencies works well, advances in spacecraft technology mean we're collecting and transmitting a lot more data than in the past. The more data a spacecraft collects and needs to transmit to Earth, the more time it takes to transmit that data. And with so many spacecraft waiting to take their turn transmitting via the DSN's antennas, a sort of data traffic jam is on the horizon.

This interactive shows a real-time simulated view of communications between spacecraft and the DSN. Explore more on DSN Now

To alleviate the potential traffic jam, NASA is testing technology known as optical communications, which allows spacecraft to send and receive data at a higher information rate so that each transmission takes less of the DSN’s time.

The technology benefits scientists and engineers – or anyone who is fascinated by space – by allowing robotic spacecraft exploring planets we can't yet visit in person to send high-definition imagery and stream video to Earth for further study. Optical communications could also play an important role in upcoming human missions to the Moon and eventually to Mars, which will require a lot of data transmission, including video communication.

But why transmit a video of a cat? For a test of this kind, engineers would normally send randomly generated test data. But, in this case, to mark what was a significant event for the project, the team at NASA's Jet Propulsion Laboratory worked with the center's DesignLab to create a fun video featuring the pet of a JPL employee – a now-famous orange tabby named Taters – chasing a laser. The video was also a nod to the project's use of lasers (more on that in a minute) and the first television test broadcast in 1928 that featured a statue of the cartoon character Felix the Cat.

This 15-second ultra-high-definition video featuring a cat named Taters was streamed via laser from deep space by NASA on Dec. 11, 2023. | Watch on YouTube

How lasers improve spacecraft communications

The NASA project designed to test this new technology is known as Deep Space Optical Communications, or DSOC. It aims to prove that we can indeed transmit data from deep space at a higher information rate.

To improve upon the rate at which data flows between spacecraft and antennas on Earth, DSOC uses laser signals rather than the radio signals currently used to transmit data. Radio signals and laser signals are both part of the electromagnetic spectrum and travel at the same speed – the speed of light – but they have different wavelengths. The DSOC lasers transmit data in the near-infrared portion of the electromagnetic spectrum, so their wavelength is shorter than that of radio waves, and their frequency is higher.

Each type of wave on the electromagnetic spectrum is represented with a wavy line. Each wave – radio, microwave, infrared, visible, ultraviolet, X-ray, and gamma ray – falls within a range of wavelengths that get shorter (from more than 100,000,000 nm to less than 0.01 nm) and frequencies that get higher (from less than 3×10⁹ Hz to more than 3×10¹⁹ Hz) from left to right. Visible light makes up a relatively tiny portion of the full spectrum.

This chart compares the wavelength and frequency range of each kind of wave on the electromagnetic spectrum. Note: The graphic representations are not to scale. Image credit: NASA/JPL-Caltech | + Expand image | › Download low-ink version for printing
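
Because every electromagnetic wave travels at the speed of light, wavelength and frequency are linked by c = λf. The quick sketch below converts between the two. The 1,550-nanometer value is a representative near-infrared laser wavelength chosen for illustration, and the ~3.6-centimeter value approximates the DSN's X-band radio; neither is an official DSOC specification:

```python
# Converting wavelength to frequency using c = wavelength * frequency
C_M_PER_S = 299_792_458  # speed of light in m/s

def frequency_hz(wavelength_m):
    """Return the frequency of a wave with the given wavelength."""
    return C_M_PER_S / wavelength_m

print(f"Near-infrared laser (1,550 nm): {frequency_hz(1550e-9):.2e} Hz")  # ~1.9e14
print(f"X-band radio (~3.6 cm):         {frequency_hz(0.036):.2e} Hz")    # ~8.3e9
```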

Since infrared light has a much shorter wavelength than radio waves, more wave cycles – and therefore more encoded data – fit within a particular distance. And since infrared and radio waves both travel at the speed of light, this also means that more data can be sent in the same length of time using infrared.

As a result, DSOC’s maximum information rate is around 267 megabits per second (Mbps), faster than many terrestrial internet signals. At that high data rate, the 153.6 megabit cat video took only 0.58 seconds to transmit and another 101 seconds to travel the 19 million miles to Earth at the speed of light. By contrast, if we had sent the cat video using Psyche's radio transmitter, which has a data rate of 360 kilobits per second, it would have taken 426 seconds to transmit the video, plus the same speed-of-light travel time to reach Earth.
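
You can reproduce that arithmetic yourself: the transmission time is the size of the data divided by the information rate, while the speed-of-light travel time is the same for both signal types. A minimal sketch:

```python
# Comparing how long the cat video takes at laser vs. radio data rates
VIDEO_MEGABITS = 153.6
DISTANCE_MILES = 19_000_000
C_MILES_PER_S = 186_282  # speed of light

travel_time = DISTANCE_MILES / C_MILES_PER_S  # ~102 seconds either way

for label, rate_mbps in [("DSOC laser (267 Mbps)", 267),
                         ("Psyche radio (0.36 Mbps)", 0.36)]:
    transmit_time = VIDEO_MEGABITS / rate_mbps
    print(f"{label}: {transmit_time:.2f} s to transmit "
          f"+ {travel_time:.0f} s of light travel time")
```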

Here's how DSOC aims to revolutionize deep space communications. | Watch on YouTube

This kind of spacecraft communication isn't without its challenges, however. Chief among them is accurately pointing the narrow laser beam.

DSOC consists of a "flight laser transceiver" aboard the Psyche spacecraft – which is currently on its journey to study the asteroid 16 Psyche – and a receiving station on Earth. The flight transceiver is a 22-centimeter-diameter apparatus that can both transmit and receive signals. Its maximum transmitter power is just 4 watts. For the December 2023 test, a 160-watt beacon signal was transmitted to the DSOC flight transceiver by a 1-meter telescope located at JPL's Table Mountain facility near Wrightwood, California. This beacon signal was used by the Psyche spacecraft as a pointing reference so it could accurately aim the DSOC transceiver at the Earth receiving station – the 5-meter Hale Telescope at Caltech’s Palomar Observatory near San Diego.

This animation shows how DSOC's laser signals are sent between the Psyche spacecraft and ground stations on Earth – first as a pointing reference to ensure accurate aiming of the narrow laser signal and then as a data transmission to the receiving station. | Watch on YouTube

When the DSOC laser beam encounters Earth, it is much narrower than a radio signal transmitted from the same distance. In fact, the laser beam is only a few hundred kilometers wide when it reaches Earth, in sharp contrast with an approximately 2.5-million-kilometer-wide radio signal. This narrow beam must be pointed accurately enough so it not only intersects Earth, but also overlaps the receiving station. To ensure that the beam will be received at Palomar Observatory, the transmission must be aimed not directly at Earth, but at a point where Earth will be in its orbit when the signal arrives after traveling the great distance from the spacecraft.
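
A rough back-of-the-envelope calculation shows why that lead matters. Using Earth's average orbital speed and the signal travel time from the December 2023 test (the beam footprint below is an assumed "few hundred kilometers" value):

```python
# Why DSOC must aim ahead of Earth (approximate, illustrative values)
EARTH_ORBITAL_SPEED_KM_S = 29.8   # Earth's average speed around the Sun
LIGHT_TRAVEL_TIME_S = 101         # from the Dec. 2023 test at 19 million miles
BEAM_FOOTPRINT_KM = 300           # assumed "few hundred kilometers" beam width

earth_motion_km = EARTH_ORBITAL_SPEED_KM_S * LIGHT_TRAVEL_TIME_S
print(f"Earth moves ~{earth_motion_km:,.0f} km while the signal is in flight –")
print(f"about {earth_motion_km / BEAM_FOOTPRINT_KM:.0f} times the beam footprint.")
```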

What's next for laser communications

Engineers will do additional tests of the DSOC system as the Psyche spacecraft continues its 2.2-billion-mile (3.6-billion-kilometer) journey to its destination in the asteroid belt beyond Mars. Over the next couple of years, DSOC will make weekly contacts with Earth. Visit NASA's DSOC website to follow along as NASA puts the system through its paces to potentially usher in a new means of transmitting data through space.

How does the cat video relate to STEM learning?

The DSOC project provides a wonderful opportunity to help students understand the electromagnetic spectrum and learn about real-world applications of STEM in deep space communications. Try out these lessons and resources to get students engaged.

Educator Resources

Student Resources

Explore More

Multimedia

Interactives

Downloads

Websites

Articles

TAGS: K-12 Education, Educators, Students, Learning Resources, Teaching Resources, DSOC, DSN, Deep Space Network

  • Ota Lutz

A rectangular box-shaped spacecraft with long arms extending from either side. Above the arms are wing-like solar panels extending in the opposite direction. The curvature of Earth and wispy clouds are depicted just below the spacecraft.

Explore how and why the SWOT mission will take stock of Earth's water budget, what it could mean for assessing climate change, and how to bring it all to students.

Update: Dec. 15, 2022 – NASA, the French space agency, and SpaceX are now targeting 3:46 a.m. PST (6:46 a.m. EST) on Friday, Dec. 16, for the launch of the Surface Water and Ocean Topography (SWOT) satellite. Visit NASA's SWOT launch blog for the latest updates.


NASA is launching an Earth-orbiting mission that will map the planet’s surface water resources better than ever before. Scheduled to launch on Dec. 16 from Vandenberg Space Force Base in California, the Surface Water and Ocean Topography, or SWOT, mission is the latest international collaboration designed to monitor and report on our home planet. By providing us with a highly detailed 3D view of rivers, lakes, and oceans, SWOT promises to improve our understanding of Earth’s water cycle and the role oceans play in climate change, as well as help us better respond to drought and flooding.

Read on to find out why we're hoping to learn more about Earth's surface water, get to know the science behind SWOT's unique design, and follow along with STEM teaching and learning resources.

Why It's Important

Observing Earth from space provides scientists with a global view that is important for understanding the whole climate system. In the case of SWOT, we will be able to monitor Earth’s surface water with unprecedented detail and accuracy. SWOT will provide scientists with measurements of water volume change and movement that will inform our understanding of fresh water availability, flood hazards, and the mechanisms of climate change.

Scientists and engineers provide an overview of the SWOT mission. Credit: NASA/JPL-Caltech | Watch on YouTube

Water Flow

Scientists use a variety of methods to track Earth’s water. These include stream and lake gauges and even measurements from space such as sea surface altimetry and gravitational measurements of aquifer volumes. Monitoring of river flow and lake volume is important because it can tell us how much freshwater is readily available and at what locations. River flow monitoring can also help us make inferences about the downstream environmental impact. But monitoring Earth’s surface water in great detail with enough frequency to track water movement has proven challenging. Until now, most monitoring of river flow and lake levels has relied on water-flow and water-level gauges placed across Earth, which requires that they be accessible and maintained. Not all streams and lakes have gauges, and previous space-based altimetry and gravitational measurements, though useful for large bodies of water, have not been able to adequately track the constant movement of water through smaller rivers or lakes.

Here's why understanding Earth’s "water budget" is an important part of understanding our planet and planning for future water needs.

SWOT will be able to capture these measurements across the globe in 3D every 21 days. The mission will monitor how much water is flowing through hundreds of thousands of rivers wider than 330 feet (100 meters) and keep a close watch on the levels of more than a million lakes larger than 15 acres (6 hectares). Data from the mission will be used to create detailed maps of rivers, lakes, and reservoirs that will enable accurate monitoring to provide a view of freshwater resources that is not reliant on physical access. Meanwhile, SWOT’s volumetric measurements of rivers, lakes, and reservoirs will help hydrologists better track drought and flooding impacts in near-real-time.

Coastal Sea Level Rise

SWOT will measure our oceans with unprecedented accuracy, revealing details of ocean features as small as 9 miles (15 kilometers) across. SWOT will also monitor sea levels and tides. Though we have excellent global sea level data, we do not have detailed sea level measurements near coastlines. Coastal sea levels vary across the globe as a result of ocean currents, weather patterns, land changes, and other factors. Sea levels are rising faster than ever, and higher sea levels also mean that hurricane storm surges will reach farther inland than ever before, causing substantially more damage than the same category of hurricanes in the past. SWOT will be able to monitor coastal sea level variations and fill gaps in the observations we currently have from other sources.

What is sea level rise and what does it mean for our planet? | › View Transcript

Ocean Heat Sinks

Further contributing to our understanding of the role Earth’s oceans play in climate change, SWOT will explore how the ocean absorbs atmospheric heat and carbon, moderating global temperatures and climate change. Scientists understand ocean circulation on a large scale and know that ocean currents are driven by temperature and salinity differences. However, scientists do not currently have a good understanding of fine-scale ocean currents, where most of the ocean's motion-related energy is stored and lost. Circulation at these fine scales is thought to be responsible for transporting half of the heat and carbon from the upper ocean to deeper layers. Such downward ocean currents have helped to mitigate the decades-long rise in global air temperatures by absorbing and storing heat and carbon away from the atmosphere. Knowing more about this process is critical for understanding the mechanisms of global climate change.

JPL scientist Josh Willis uses a water balloon to show how Earth's oceans are absorbing most of the heat being trapped on our warming world. | › Related lesson

These fine-scale ocean currents also transport nutrients to marine life and circulate pollutants such as crude oil and debris. Understanding nutrient transport helps oceanographers assess ocean health and the productivity of fisheries. And tracking pollutants aids in natural hazard assessment, prediction, and response.

How It Works

A joint effort between NASA and the French space agency – with contributions from the Canadian and UK space agencies – SWOT will continue NASA’s decades-long record of monitoring sea surface height across the globe. But this mission will add a level of detail never before achieved.

SWOT will measure more than 90% of Earth’s surface water, scanning the planet between 78°N and 78°S latitude with an accuracy of 1 centimeter and retracing the same path every 21 days. Achieving this level of accuracy from a spacecraft height of 554 miles (891 kilometers) requires that the boom that uses radar to measure water elevation remain stable to within 2 microns – about 3% of the thickness of a human hair.

This visualization shows ocean surface currents around the world during the period from June 2005 through December 2007. With its new, high resolution wide-swath measurements, SWOT will be able to observe eddies and current features at greater resolution than previously possible. Credit: NASA Scientific Visualization Studio | Watch on YouTube

Prior to SWOT, spacecraft have used conventional nadir, or straight-down, altimetry to measure sea surface height. Conventional nadir altimetry sends a series of radar or laser pulses down to the surface and measures the time it takes for each signal to return to the spacecraft, thus revealing distances to surface features. To acquire more detailed information on surface water, SWOT will use an innovative instrument called the Ka-band Radar Interferometer, or KaRIn, to measure water height with exceptional accuracy. Ka-band is a portion of the microwave part of the electromagnetic spectrum. SWOT uses microwaves because they can penetrate clouds to return data about water surfaces.

A radar signal is sent straight down from the SWOT spacecraft as it flies over Earth. Beams are shown bouncing back to receivers on either side of the spacecraft. The section of Earth measured by the spacecraft is shown as two side-by-side tracks colored in as a heatmap. The camera zooms out to show these tracks crisscrossing the planet and eventually covering a majority of the surface.

SWOT will track Earth's surface water in incredible detail using an innovative instrument called the Ka-band Radar Interferometer, or KaRIn. Image credit: NASA/JPL-Caltech | + Expand image

The KaRIn instrument uses the principles of synthetic aperture radar combined with interferometry to measure sea surface height. A radar signal is emitted from the end of the 10-meter-wide boom on the spacecraft. The reflected signal is then received by antennas on both ends of the boom, capturing data from two 30-mile (50-kilometer) wide swaths on either side of the spacecraft. The received signals will be slightly out of sync, or phase, from one another because they will travel different distances to return to the receivers on either end of the boom. Knowing the phase difference, the distance between the antennas, and the radar wavelength allows us to calculate the distance to the surface.

The first of three images shows two paths of different lengths extending diagonally from a point on Earth’s surface to receivers on either side of the SWOT spacecraft. A second image shows the paths as light waves that are slightly out of phase. The third image shows a line drawn directly from the rightmost receiver to the path leading to the leftmost receiver, such that the intersected paths from Earth are equal in length. The upper triangle formed by this intersection has a short leg, highlighted in yellow, that represents the remaining length of the leftmost path. The yellow short leg represents the range difference between the two paths from Earth.

Radar signals bounced off the water’s surface will be received by antennas on both ends of SWOT's 10-meter-wide boom. The received signals will be slightly out of phase because they will travel different distances as they return to the receivers. Scientists use this phase difference and the radar wavelength to calculate the distance to the surface. Image credit: NASA/JPL-Caltech | + Expand image
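
The geometry in the caption above can be sketched in a few lines of code. This is a simplified, illustrative model rather than actual KaRIn processing, and the ~8-millimeter Ka-band wavelength is an assumed value consistent with the roughly 1-centimeter figure mentioned in the next paragraph:

```python
import math

# Simplified radar interferometry: phase difference -> surface height
WAVELENGTH_M = 0.0084   # assumed Ka-band wavelength (~8 mm)
BASELINE_M = 10.0       # separation between SWOT's two antennas
ALTITUDE_M = 891_000    # spacecraft altitude above the surface

def surface_height(phase_diff_rad, slant_range_m):
    """Estimate surface height from an (unwrapped) phase difference."""
    # Phase difference -> range difference between the two antennas
    range_diff = phase_diff_rad * WAVELENGTH_M / (2 * math.pi)
    # Range difference -> look angle off nadir (range_diff ~ B * sin(theta))
    look_angle = math.asin(range_diff / BASELINE_M)
    # Height = altitude minus the vertical component of the slant range
    return ALTITUDE_M - slant_range_m * math.cos(look_angle)

# Example: a point seen ~3 degrees off nadir; real processing must first
# resolve the 2*pi ambiguity in the raw phase measurement
print(f"{surface_height(phase_diff_rad=391.4, slant_range_m=892_223):.1f} m")  # ~0 m
```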

The observations acquired by the two antennas can be combined into what is known as an interferogram. An interferogram is a pattern of wave interference that can reveal more detail beyond the 1-centimeter resolution captured by the radar. To explain how it works, we'll recall a couple of concepts from high school physics. When out-of-phase waves from the two antennas are combined, constructive and destructive interference patterns result in some wave crests being higher and some wave troughs being lower than those of the original waves. The patterns that result from the combination of the waves reveal more detail with resolution better than the 1-centimeter wavelength of the original Ka-band radar waves because the interference occurs over a portion of a wavelength. An interferogram can be coupled with elevation data to reveal a 3D representation of the water’s surface.

A diagram illustrating the swaths of data that SWOT will collect, including labels for the following: 10 m baseline between SWOT's receivers; a distance of 891 km between the surface and Interferometer Antenna 1; Interferometer Left Swath resulting in ocean topography with an H-Pol swath of 10-60 km; Interferometer Right Swath resulting in surface water topography with a V-Pol of 10-60 km; a straight-down Nadir Altimeter path directly below the spacecraft in the gap between the swaths; a cross-track resolution from 70m to 10m.

The KaRIn instrument illuminates two parallel tracks of approximately 50 kilometers on either side of a nadir track from a traditional altimeter. The signals are received by two antennas 10 meters apart and are then processed to yield interferometry measurements. Image credit: NASA/JPL-Caltech | + Expand image

This highly accurate 3D view of Earth’s surface water is what makes SWOT so unique and will enable scientists to more closely monitor the dynamics of the water cycle. In addition to observing ocean currents and eddies that will inform our understanding of the ocean’s role in climate change, SWOT's use of interferometry will allow scientists to track volumetric changes in lakes and quantify river flooding, tasks that cannot yet be done on a wide scale in any other way.

A colorful swath of yellows, oranges, magentas, and purples is overlaid horizontally on a satellite image of a desert landscape with thin yellow and red lines cutting diagonally across the image. On the center-left of the image, the colors fan out like a rainbow sprinkler. On the left side of the swath is a cluster of yellow dots.

This interferogram of the magnitude 7.2 Baja California earthquake of April 4, 2010, was captured by the airborne UAVSAR instrument. The interferogram is overlaid atop a Google Earth image of the region. Image credit: NASA/JPL/USGS/Google | › Learn more

Follow Along

SWOT is scheduled to launch no earlier than Dec. 16, 2022, on a SpaceX Falcon 9 rocket from Vandenberg Space Force Base in California. Tune in to watch the launch on NASA TV.

After launch, the spacecraft will spend six months in a calibration and validation phase, during which it will make a full orbit of Earth every day at an altitude of 533 miles (857 kilometers). Upon completion of this phase, SWOT will increase its altitude to 554 miles (891 kilometers) and assume a 21-day repeat orbit for the remainder of its mission.

Visit the mission website to follow along as data are returned and explore the latest news, images, and updates as SWOT provides a new view on one of our planet's most important resources.

Teach It

The SWOT mission is the perfect opportunity to engage students in studying Earth’s water budget and water cycle. Explore these lessons and resources to get students excited about the STEM involved in studying Earth’s water and climate change from space.

Educator Resources

Student Activities

Explore More

Activities for Kids

Websites

Facts & Figures

Videos

Interactives

Image Gallery

Articles

Podcast

TAGS: K-12 Education, Teachers, Educators, Earth Science, Earth, Climate Change, Climate, Satellites, Teachable Moments, Climate TM

  • Ota Lutz

A slightly oblong donut-shaped ring of glowing warm dust especially bright at spots on the top, left, and right surrounds a black hole.

Find out how scientists captured the first image of Sagittarius A*, why it's important, and how to turn it into a learning opportunity for students.


Our home galaxy, the Milky Way, has a supermassive black hole at its center, but we’ve never actually seen it – until now. The Event Horizon Telescope, funded by the National Science Foundation, has released the first image of our galactic black hole, Sagittarius A* (pronounced “Sagittarius A-star” and abbreviated Sgr A*).

Read on to find out how the image was acquired and learn more about black holes and Sagittarius A*. Then, explore resources to engage learners in the exciting topic of black holes.

How Black Holes Work

A black hole is a location in space with a gravitational pull so strong that nothing, not even light, can escape it. A black hole’s outer edge, called its event horizon, defines the spherical boundary where the velocity needed to escape exceeds the speed of light. Matter and radiation fall in, but they can’t get out. Because not even light can escape, a black hole is literally black. Contrary to their name’s implication, black holes are not empty. In fact, a black hole contains a great amount of matter packed into a relatively small space. Black holes come in various sizes and can exist throughout space.

We can surmise a lot about the origin of black holes from their size. Scientists know how some types of black holes form, yet the formation of others is a mystery. There are three different types of black holes, categorized by their size: stellar-mass, intermediate-mass, and supermassive black holes.

Stellar-mass black holes are found throughout our Milky Way galaxy and have masses less than about 100 times that of our Sun. They comprise one of the possible endpoints of the lives of high-mass stars. Stars are fueled by the nuclear fusion of hydrogen, which forms helium and other elements deep in their interiors. The outflow of energy from the central regions of the star provides the pressure necessary to keep the star from collapsing under its own weight.

A bubble of gas is sucked into a swirl of glowing dust and gas around a black hole as hair-like wisps extend from the top and bottom of the swirl.

This illustration shows a binary system containing a stellar-mass black hole called IGR J17091-3624. The strong gravity of the black hole, on the left, is pulling gas away from a companion star on the right. This gas forms a disk of hot gas around the black hole, and the wind is driven off this disk. Image credit: NASA/CXC/M.Weiss | › Full image and caption

Once the fuel in the core of a high-mass star has completely burned out, the star collapses, sometimes producing a supernova explosion that releases an enormous amount of energy, detectable across the electromagnetic spectrum. If the star’s mass is more than about 25 times that of our Sun, a stellar-mass black hole can form.

Intermediate-mass black holes have masses between about 100 and 100,000 times that of our Sun. Until recently, the existence of intermediate-mass black holes had only been theorized. NASA’s Chandra X-ray Observatory has identified several intermediate-mass black hole candidates by observing X-rays emitted by the gas surrounding the black hole. The Laser Interferometer Gravitational-Wave Observatory, or LIGO, funded by the National Science Foundation, detected the merger of two stellar-mass black holes with masses 66 and 85 times that of our Sun, forming an intermediate-mass black hole of 142 solar masses. (Some of the mass was converted to energy, and about nine solar masses were radiated away as gravitational waves.)

Supermassive black holes contain between a million and a billion times as much mass as a stellar-mass black hole. Scientists are uncertain how supermassive black holes form, but one theory is that they result from the combining of stellar-mass black holes.

A scale on the bottom shows mass (relative to the Sun) from 1 to 1 million and beyond. Stellar-mass black holes are shown on the left side of the scale between about 10 and 100 solar masses, followed on the right by intermediate-mass black holes from 100 to over 100,000 solar masses, followed by supermassive black holes from about 1 million on.

This chart illustrates the relative masses of super-dense cosmic objects, ranging from white dwarfs to the supermassive black holes encased in the cores of most galaxies. | › Full image and caption

Our local galactic center’s black hole, Sagittarius A*, is a supermassive black hole with a mass of about four million suns, which is fairly small for a supermassive black hole. NASA’s Hubble Space Telescope and other telescopes have determined that many galaxies have supermassive black holes at their center.

A bright-white collection of stars is surrounded by a berry colored swirl of stellar dust and stars.

This image shows the center of the Milky Way galaxy along with a closer view of Sagittarius A*. It was made by combining X-ray images from NASA's Chandra X-ray Observatory (blue) and infrared images from the agency's Hubble Space Telescope (red and yellow). The inset shows Sgr A* in X-rays only, covering a region half a light year wide. Image credit: X-ray: NASA/UMass/D.Wang et al., IR: NASA/STScI | › Full image and caption

Why They're Important

Black holes hold allure for everyone from young children to professional astronomers. For astronomers, in particular, learning about Sagittarius A* is important because it provides insights into the formation of our galaxy and black holes themselves.

Understanding the physics of black hole formation and growth, as well as their surrounding environments, gives us a window into the evolution of galaxies. Though Sagittarius A* is more than 26,000 light-years (152 quadrillion miles) away from Earth, it is our closest supermassive black hole. Its formation and physical processes influence our galaxy as galactic matter continually crosses the event horizon, growing the black hole’s mass.

Studying black holes also helps us further understand how space and time interact. As one gets closer to a black hole, the flow of time slows down compared with the flow of time far from the black hole. In fact, according to Einstein’s theory of general relativity, the flow of time slows near any massive object. But it takes an incredibly massive object, such as a black hole, to make an appreciable difference in the flow of time. There's still much to learn about what happens to time and space inside a black hole, so the more we study them, the more we can learn.

How Scientists Imaged Sagittarius A*

Black holes, though invisible to the human eye, can be detected by observing their effects on nearby space and matter. As a result of their enormous mass, black holes have extremely high gravity, which pulls in surrounding material at rapid speeds, causing this material to become very hot and emit X-rays.

This video explains how Sagittarius A* appears to still have the remnants of a blowtorch-like jet dating back several thousand years. Credit: NASA | Watch on YouTube

X-ray-detecting telescopes such as NASA’s Chandra X-ray Observatory can image the material spiraling into a black hole, revealing the black hole’s location. NASA’s Hubble Space Telescope can measure the speed of the gas and stars orbiting a point in space that may be a black hole. Scientists use these measurements of speed to determine the mass of the black hole. Hubble and Chandra are also able to image the effects of gravitational lensing, or the bending of light that results from the gravitational pull of black holes or other high-mass objects such as galaxies.

A bright central blob is surrounded by blue halos and wisps forming a sort of target pattern.

The thin blue bull's-eye patterns in this Hubble Space Telescope image are called "Einstein rings." The blobs are giant elliptical galaxies roughly 2 to 4 billion light-years away. And the bull's-eye patterns are created as the light from galaxies twice as far away is distorted into circular shapes by the gravity of the giant elliptical galaxies. | › Full image and caption

To directly image the matter surrounding a black hole, thus revealing the silhouette of the black hole itself, scientists from around the world collaborated to create the Event Horizon Telescope. The Event Horizon Telescope harnesses the combined power of numerous telescopes around the world that can detect radio-wave emissions from the sky to create a virtual telescope the size of Earth.

Narrated by Caltech’s Katie Bouman, this video explains how she and her fellow teammates at the Event Horizon Telescope project managed to take a picture of Sagittarius A* (Sgr A*), a beastly black hole lying 27,000 light-years away at the heart of our Milky Way galaxy. Credit: Caltech | Watch on YouTube

In 2019, the team released the first image of a black hole's silhouette when they captured the glowing gases surrounding the M87* galactic black hole nearly 53 million light-years (318 quintillion miles) away from Earth. The team then announced that one of their next endeavors was to image Sagittarius A*.

A warm glowing ring surrounds an empty blackness.

Captured by the Event Horizon Telescope in 2019, this image of the glowing gases surrounding the M87* black hole was the first image ever captured of a black hole. Image credit: Event Horizon Telescope Collaboration | + Expand image

To make the newest observation, the Event Horizon Telescope focused its array of observing platforms on the center of the Milky Way. A telescope array is a group of telescopes arranged so that, as a set, they function similarly to one giant telescope. In addition to the telescopes used to acquire the M87* image, three additional radio telescopes joined the array to acquire the image of Sagittarius A*: the Greenland Telescope, the Kitt Peak 12-meter Telescope in Arizona, and the NOrthern Extended Millimeter Array, or NOEMA, in France.

This image of the center of our Milky Way galaxy representing an area roughly 400 light years across, has been translated into sound. Listen for the different instruments representing the data captured by the Chandra X-ray Observatory, Hubble Space Telescope, and Spitzer Space Telescope. The Hubble data outline energetic regions where stars are being born, while Spitzer's data captures glowing clouds of dust containing complex structures. X-rays from Chandra reveal gas heated to millions of degrees from stellar explosions and outflows from Sagittarius A*. Credit: Chandra X-ray Observatory | Watch on YouTube

The distance from the center of Sagittarius A* to its event horizon, a measurement known as the Schwarzschild radius, is enormous at seven million miles (12,000,000 kilometers or 0.08 astronomical units). But its apparent size when viewed from Earth is tiny because it is so far away. The apparent Schwarzschild radius for Sagittarius A* is 10 microarcseconds, about the angular size of a large blueberry on the Moon.
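
Both of those figures can be checked with the Schwarzschild radius formula, r_s = 2GM/c², and the small-angle approximation. A quick sketch with approximate physical constants:

```python
import math

# Schwarzschild radius of Sagittarius A* and its apparent size from Earth
G = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8               # speed of light, m/s
SUN_MASS_KG = 1.989e30
LIGHT_YEAR_M = 9.461e15
METERS_PER_MILE = 1_609.34

mass_kg = 4e6 * SUN_MASS_KG                # ~4 million suns
r_s = 2 * G * mass_kg / C**2               # Schwarzschild radius, meters
distance_m = 26_000 * LIGHT_YEAR_M

angle_rad = r_s / distance_m               # small-angle approximation
microarcsec = math.degrees(angle_rad) * 3600 * 1e6

print(f"Schwarzschild radius: {r_s / METERS_PER_MILE:.2e} miles")  # ~7 million
print(f"Apparent size: {microarcsec:.0f} microarcseconds")         # ~10
```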

Acquiring a good image of a large object that appears tiny when viewed from Earth requires a telescope with extraordinarily fine resolution, or the ability to detect the smallest possible details in an image. The better the resolution, the better the image and the more detail the image will show. Even the best individual telescopes or array of telescopes at one location do not have a good enough resolution to image Sagittarius A*.

A dense field of stars like grains of sand is surrounded by wispy clouds of glowing gas and dust.

This image captured by NASA's Hubble Space Telescope shows the star-studded center of the Milky Way towards the constellation of Sagittarius. Even though you can't see our galaxy's central black hole directly, you might be able to pinpoint its location based on what you've learned about black holes thus far. Image credit: NASA, ESA, and G. Brammer | › Full image and caption

The addition of the 12-meter Greenland Telescope, though a relatively small instrument, widened the diameter, or aperture, of the Event Horizon Telescope to nearly the diameter of Earth. And NOEMA – itself an array of twelve 15-meter antennas with maximum separation of 2,500 feet (760 meters) – helped further increase the Event Horizon Telescope’s light-gathering capacity.

Altogether, when combined into the mighty Event Horizon Telescope, the virtual array obtained an image of Sagittarius A* spanning about 50 microarcseconds, or about 1/13th of a billionth the span of the night sky.

A slightly oblong donut-shaped ring of glowing warm dust especially bright at spots on the top, left, and right surrounds a black hole.

Sagittarius A* is more than 26,000 light-years (152 quadrillion miles) away from Earth and has the mass of 4 million suns. Image credit: Event Horizon Telescope | › Full image and caption

While the Event Horizon Telescope was busy capturing the stunning radio image of Sagittarius A*, an additional worldwide contingent of astronomical observatories was also focused on the black hole and the region surrounding it. The aim of the team, known as the Event Horizon Telescope Multiwavelength Science Working Group, was to observe the black hole in other parts of the electromagnetic spectrum beyond radio. As part of the effort, X-ray data were collected by NASA’s Chandra X-ray Observatory, Nuclear Spectroscopic Telescope Array (NuSTAR), and Neil Gehrels Swift Observatory, additional radio data were collected by the East Asian Very Long-Baseline Interferometer (VLBI) network and the Global 3 millimeter VLBI array, and infrared data were collected by the European Southern Observatory’s Very Large Telescope.

The data from these multiple platforms will allow scientists to continue building their understanding of the behavior of Sagittarius A* and to refine their models of black holes in general. The data collected from these multiwavelength observations are crucial to the study of black holes, such as the Chandra data revealing how quickly material falls in toward the disk of hot gas orbiting the black hole’s event horizon. Data such as these will hopefully help scientists better understand black hole accretion, or the process by which black holes grow.

Teach It

Check out these resources to bring the real-life STEM of black holes into your teaching, plus learn about opportunities to involve students in real astronomy research.

Explore More

Articles

Educator Guides

Student Activities

Check out these related resources for students from NASA’s Space Place

Across the NASA-Verse


This Teachable Moment was created in partnership with NASA’s Universe of Learning. Universe of Learning materials are based upon work supported by NASA under award number NNX16AC65A to the Space Telescope Science Institute, working in partnership with Caltech/IPAC, Center for Astrophysics | Harvard & Smithsonian, and the Jet Propulsion Laboratory.

TAGS: Black hole, Milky Way, galaxy, universe, stars, teachers, educators, lessons, Teachable Moments, K-12, science

  • Ota Lutz

Just beyond the wing of a plane, the edge of a tall glacier is visible through the plane's window. At the bottom of the glacier, bits of ice surround an elliptical pool of brown water at the glacier's edge.

Explore how the OMG mission discovered more about what's behind one of the largest contributors to global sea level rise. Plus, learn what it means for communities around the world and how to get students engaged.


After six years investigating the effects of warming oceans on Greenland's ice sheet, the Oceans Melting Greenland, or OMG, mission has concluded. This airborne and seaborne mission studied how our oceans are warming and determined that ocean water is melting Greenland’s glaciers as much as warm air is melting them from above.

Read on to learn more about how OMG accomplished its goals and the implications of what we learned. Then, explore educational resources to engage students in the science of this eye-opening mission.

Why It's Important

Global sea level rise is one of the major environmental challenges of the 21st century. As oceans rise, water encroaches on land, affecting populations that live along shorelines. Around the world – including U.S. regions along the Gulf of Mexico and Eastern Seaboard and in Alaska – residents are feeling the impact of rising seas. Additionally, freshwater supplies are being threatened by encroaching saltwater from rising seas.

Sea level rise is mostly caused by melting land ice (primarily glaciers), which adds water to the ocean, as well as thermal expansion, the increase in volume that occurs when water heats up. Both ice melt and thermal expansion result from rising global average temperatures on land and in the sea – one facet of climate change.

This short video explains why Greenland's ice sheets are melting and what it means for our planet. Credit: NASA/JPL-Caltech | Watch more from the Earth Minute series

Greenland’s melting glaciers contribute more freshwater to sea level rise than any other source, which is why the OMG mission set out to better understand the mechanisms behind this melting.

How We Did It

The OMG mission used a variety of instruments onboard airplanes and ships to map the ocean floor, measure the behemoth Greenland glaciers, and track nearby water temperature patterns.

Join JPL scientist Josh Willis as he and the NASA Oceans Melting Greenland (OMG) team work to understand the role that ocean water plays in melting Greenland’s glaciers. Credit: NASA/JPL-Caltech | Watch on YouTube

An animation shows a ship passing over the ocean directly in front of a glacier and scanning the sea floor, followed by a plane flying overhead and scanning the air.

This animation shows how the OMG mission created a map of the ocean floor, known as a bathymetric map, to determine the geometry around Greenland's glaciers. Image credit: NASA/JPL-Caltech | + Expand image

An animation shows a plane flying over a glacier and scanning the ground below, followed by a plane flying over the ocean shelf next to the glacier and dropping probes into the water.

This animation shows how the OMG mission used radar to measure changes in the thickness and retreat of Greenland's glaciers as well as probes to measure ocean temperature and salinity. Credit: NASA/JPL-Caltech | + Expand image

Early on, the mission team created a map of the ocean floor, known as a bathymetric map, by combining multibeam sonar surveys taken from ships and gravity measurements taken from airplanes. Interactions among glaciers and warming seas are highly dependent on the geometry of the ocean floor. For example, continental shelf troughs carved by glaciers allow pathways for water to interact with glacial ice. So understanding Greenland's local bathymetry was crucial to OMG's mission.

To locate the edges of Greenland's glaciers and measure their heights, the mission used a radar instrument known as the Glacier and Ice Surface Topography Interferometer. Every spring during the six-year OMG mission, the radar was deployed on NASA’s Gulfstream III airplane that flew numerous paths over Greenland’s more than 220 glaciers. Data from the instrument allowed scientists to determine how the thickness and area of the glaciers are changing over time.

Finally, to measure ocean temperature and salinity patterns, scientists deployed numerous cylindrical probes. These probes dropped from an airplane and fell through the water, taking measurements from the surface all the way to the ocean floor. Each probe relayed its information back to computers onboard the plane where ocean temperatures and salinity were mapped. Then, scientists took this data back to their laboratories and analyzed it for trends, determining temperature variations and circulation patterns.

What We Discovered

Prior to the OMG mission, scientists knew that warming air melted glaciers from above, like an ice cube on a hot day. However, glaciers also flow toward the ocean and break off into icebergs in a process called calving. Scientists suspected that warmer ocean waters were melting the glaciers from below, causing them to break off more icebergs and add to rising seas. It wasn’t until they acquired the data from OMG that they discovered the grim truth: Glaciers are melting from above and below, and warming oceans are having a significant effect on glacial melt.

This narrated animation shows how warm ocean water melts glaciers from below, causing their edges to break off in a process called calving. Credit: NASA | Watch on YouTube

What this means for our Earth's climate is that as we continue burning fossil fuels and contributing to greenhouse gas accumulation, the oceans, which store more than 90% of the heat that is trapped by greenhouse gases, will continue to warm, causing glaciers to melt faster than ever. As warming ocean water moves against glaciers, it eats away at their base, causing the ice above to break off. In other words, calving rates increase and sea level rises even faster.

Our oceans control our climate and affect our everyday lives, whether or not we live near them. With the pace of the melt increasing, our shorelines and nearby communities will be in trouble sooner than previously expected. And it’s not just the beaches that will be affected. If Greenland’s glaciers all melt, global sea levels will rise by over 24 feet (7.4 meters), bringing dramatic change to the landscapes of major cities around the world.
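
That figure is straightforward to sanity-check: divide the ice sheet's water equivalent by the surface area of the global ocean. The volume and area used below are commonly cited approximations, not OMG measurements:

```python
# Back-of-the-envelope estimate of sea level rise if Greenland's ice melted
ICE_VOLUME_KM3 = 2.9e6     # approximate volume of the Greenland ice sheet
OCEAN_AREA_KM2 = 3.61e8    # approximate surface area of Earth's oceans
ICE_TO_WATER = 0.917       # ice is less dense than liquid water

rise_m = ICE_VOLUME_KM3 * ICE_TO_WATER / OCEAN_AREA_KM2 * 1000
print(f"Estimated rise: {rise_m:.1f} m ({rise_m * 3.281:.0f} ft)")  # ~7.4 m / ~24 ft
```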

› Read more about OMG’s findings and how scientists are continuing their research through ongoing initiatives and projects.

Teach It

Check out these resources to bring the real-life STEM behind the mission into your teaching. With lessons for educators and student projects, engage students in learning about the OMG mission and NASA climate science.

Educator Guides

Student Projects

Articles

Explore More

Websites

Facts & Figures

Videos

Interactives

Image Gallery

Articles

Podcast

TAGS: Teachable Moment, Climate, Earth Science, Glaciers, Greenland, Ice, Sea Level Rise, Teachers, Educators, Parents, Lessons, Missions, Earth, Climate TM

  • Ota Lutz

Satellite Image of smoke above the Western U.S.

Data overlaid on a satellite image of the United States shows a thick cloud of aerosols over the western U.S.

Animated satellite image of Earth

Update: Sept. 14, 2020 – This feature, originally published on Aug. 23, 2016, has been updated to include information on the 2020 fires and current fire research.


In the News

Once again, it’s fire season in the western United States with many citizens finding themselves shrouded in wildfire smoke. Late summer in the West brings heat, low humidity, and wind – optimal conditions for fire. These critical conditions have resulted in the August Complex Fire, the largest fire in California's recorded history. Burning concurrently in California are numerous other wildfires, including the SCU Lightning Complex fire, the third-largest in California history.

Fueled by high temperatures, low humidity, high winds, and years of vegetation-drying drought, more than 7,700 fires have engulfed over 3 million acres across California already this year. And the traditional fire season – the time of year when fires are more likely to start, spread, and consume resources – has only just begun.

Because of their prevalence and effects on a wide population, wildfires will remain a seasonal teachable moment for decades to come. Keep reading to find out how NASA studies wildfires and their effects on climate and communities. Plus, explore lessons to help students learn more about fires and their impacts.

How It Works

With wildfires starting earlier in the year and continuing to ignite throughout all seasons, fire season is now a year-round affair not just in California, but also around the world. In fact, the U.S. Forest Service found that fire seasons have grown longer in 25 percent of Earth's vegetation-covered areas.

Animation of the FireSat network of satellites capturing wildfires on Earth

This animation shows how FireSat would use a network of satellites around the Earth to detect fires faster than ever before. | + Expand image

For NASA's Jet Propulsion Laboratory, which is located in Southern California, the fires cropping up near and far are a constant reminder that its efforts to study wildfires around the world from space, the air, and on the ground are as important as ever.

JPL uses a suite of Earth satellites and airborne instruments to help better understand fires and aid in fire management and mitigation. By looking at multiple images and types of data from these instruments, scientists compare what a region looked like before, during, and after a fire, as well as how long the area takes to recover.

While the fire is burning, scientists watch its behavior from an aerial perspective to get a big-picture view of the fire itself and the air pollution it is generating in the form of smoke filled with carbon monoxide and carbon dioxide.

Natasha Stavros, a wildfire expert at JPL, joined Zach Tane with the U.S. Forest Service during a Facebook Live event to discuss some of these technologies and how they're used to understand wildfire behavior and improve wildfire recovery.

Additionally, JPL worked with a startup in San Francisco called Quadra Pi R2E to develop FireSat, a global network of satellites designed to detect wildfires and alert firefighting crews faster. 

Using these technologies, NASA scientists are gaining a broader understanding of fires and their impacts.

Why It's Important

One of the ways we often hear wildfires classified is by how much area they have burned. Though this is certainly of some importance, of greater significance to fire scientists is the severity of the fire. Wildfires are classified as burning at different levels of severity: low, medium, and high. Severity is a function of intensity, or how hot the fire was, and its spread rate, or the speed at which it travels. A high-severity fire is going to do some real damage. (Severity is measured by the damage left after the fire, but it can be estimated during a fire event by calculating the spread rate and measuring flame height, which indicates intensity.)
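
Students can work with these quantities directly. The sketch below, in the spirit of the Fired Up Over Math lesson linked at the end of this article, estimates spread rate from distance and time and infers fireline intensity from observed flame length using Byram's empirical relation (the example numbers are invented for illustration):

```python
# Estimating fire spread rate and intensity (illustrative example values)

def spread_rate_km_per_day(distance_km, days):
    """How fast the fire front has traveled."""
    return distance_km / days

def fireline_intensity_kw_per_m(flame_length_m):
    """Byram's empirical relation between flame length and intensity."""
    return (flame_length_m / 0.0775) ** (1 / 0.46)

# Example: a fire front that advanced 5 km in 2 days with 3-meter flames
print(f"Spread rate: {spread_rate_km_per_day(5, 2):.1f} km/day")
print(f"Intensity:   {fireline_intensity_kw_per_m(3.0):,.0f} kW/m")
```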

Google Earth image showing fire severity
This image, created using data imported into Google Earth, shows the severity of the 2014 King Fire. Green areas are unchanged by the fire; yellow equals low severity; orange equals moderate severity; and red equals high severity. A KMZ file with this data is available in the Fired Up Over Math lesson linked below. Credit: NASA/JPL-Caltech/E. Natasha Stavros.

The impacts of wildfires range from the immediate and tangible to the delayed and less obvious. The potential for loss of life, property, and natural areas is one of the first threats that wildfires pose. From a financial standpoint, fires can lead to a downturn in local economies due to loss of tourism and business, high costs related to infrastructure restoration, and impacts to federal and state budgets.

The release of greenhouse gases like carbon dioxide and carbon monoxide is also an important consideration when thinking about the impacts of wildfires. Using NASA satellite data, researchers at the University of California, Berkeley, determined that between 2001 and 2010, California wildfires emitted about 46 million tons of carbon, around five to seven percent of all carbon emitted by the state during that time period.

Animation showing carbon monoxide levels rising from the Station Fire in Southern California.
This animation from NASA's Eyes on the Earth visualization program shows carbon monoxide rising (red is the highest concentration) around Southern California as the Station Fire engulfed the area near JPL in 2009. Image credit: NASA/JPL-Caltech

In California and the western United States, longer fire seasons are linked to changes in spring rains, vapor pressure, and snowmelt – all of which have been connected to climate change. Wildfires serve as a climate feedback loop, meaning certain effects of wildfires – the release of CO2 and CO – contribute to climate change, thereby enhancing the factors that contribute to longer and stronger fire seasons.

While this may seem like a grim outlook, it’s worth noting that California forests still act as carbon sinks – natural environments that are capable of absorbing carbon dioxide from the atmosphere. In certain parts of the state, each hectare of redwood forest is able to store the annual greenhouse gas output of 500 Americans.

Studying and managing wildfires is important for maintaining resources, protecting people, properties, and ecosystems, and reducing air pollution, which is why JPL, NASA, and other agencies are continuing their study of these threats and developing technologies to better understand them.

Teach It

Have your students try their hands at solving some of the same fire-science problems that NASA scientists do with these two lessons, which get students in grades 3 through 12 using NASA data, algebra, and geometry to approximate burn areas, fire-spread rates, and fire intensity:

Explore More


Lyle Tavernier contributed to this feature.

TAGS: teachable moments, wildfires, science, Earth Science, Earth, Climate Change, Climate TM

  • Ota Lutz

In the News

On Jan. 30, 2020, the venerable Spitzer Space Telescope mission will officially come to an end as NASA makes way for a next-generation observatory. For more than 16 years, Spitzer has served as one of NASA’s four Great Observatories, surveying the sky in infrared. During its lifetime, Spitzer detected planets and signs of habitability beyond our solar system, returned stunning images of regions where stars are born, spied light from distant galaxies formed when the universe was young, and discovered a huge, previously unseen ring around Saturn. Read on to learn more about this amazing mission and gather tools to teach your students that there truly is more than meets the eye in the infrared universe!

How It Worked

Human eyes can see only the portion of the electromagnetic spectrum known as visible light. This is because the human retina can detect only certain wavelengths of light through special photoreceptors called rods and cones. Everything we see with our eyes either emits or reflects visible light. But visible light is just a small portion of the electromagnetic spectrum. To "see" things that emit or reflect other wavelengths of light, we must rely on technology designed to sense those portions of the electromagnetic spectrum. Using this specialized technology allows us to peer into space and observe objects and processes we wouldn’t otherwise be able to see.

Infographic showing the electromagnetic spectrum and applications for various wavelengths.

This diagram shows wavelengths of light on the electromagnetic spectrum and how they're used for various applications. Image credit: NASA | + Expand image

Infrared is one of the wavelengths of light that cannot be seen by human eyes. (It can sometimes be felt by our skin as heat if we are close enough to a strong source.) All objects with a temperature above absolute zero emit light at many wavelengths, and the warmer they are, the more light they emit. Most things in the universe are warm enough to emit infrared radiation, and that light can be seen by an infrared-detecting telescope. Because Earth’s atmosphere absorbs most infrared radiation, infrared observations of space are best conducted from outside the planet's atmosphere.
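
Wien's displacement law makes this concrete: the wavelength at which an object emits most strongly is inversely proportional to its temperature, which is why cooler objects show up best in the infrared. A quick illustration:

```python
# Wien's displacement law: peak emission wavelength vs. temperature
WIEN_CONSTANT_UM_K = 2898  # micrometers * kelvin

def peak_wavelength_um(temperature_k):
    """Wavelength of peak emission for a blackbody at this temperature."""
    return WIEN_CONSTANT_UM_K / temperature_k

print(f"Sun's surface (~5,800 K): {peak_wavelength_um(5800):.2f} um (visible)")
print(f"Human body (~310 K):      {peak_wavelength_um(310):.1f} um (infrared)")
print(f"Cold dust cloud (~30 K):  {peak_wavelength_um(30):.0f} um (far-infrared)")
```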

Learn more about the infrared portion of the electromagnetic spectrum and how NASA uses it to explore space. Credit: NASA/JPL-Caltech | Watch on YouTube

So, to get a look at space objects that were otherwise hidden from view, NASA launched the Spitzer Space Telescope in 2003. Cooled by liquid helium and capable of viewing the sky in infrared, Spitzer launched into an Earth-trailing orbit around the Sun, where it became part of the agency's Great Observatories program along with the visible-light and near-infrared-detecting Hubble Space Telescope, the Compton Gamma-Ray Observatory, and the Chandra X-ray Observatory. (Keeping the telescope cold reduces the chances of heat, or infrared light, from the spacecraft interfering with its astronomical observations.)

Over its lifetime, Spitzer has been used to detect light from objects and regions in space where the human eye and optical, or visible-light-sensing, telescopes may see nothing.

Why It's Important

NASA's Spitzer Space Telescope has returned volumes of data, yielding numerous scientific discoveries.

Vast, dense clouds of dust and gas block our view of many regions of the universe. Infrared light can penetrate these clouds, enabling Spitzer to peer into otherwise hidden regions of star formation, newly forming planetary systems and the centers of galaxies.

A wisp of orange and green dust bows out beside a large blue star among a field of smaller blue stars.

The bow shock, or shock wave, in front of the giant star Zeta Ophiuchi shown in this image from Spitzer is visible only in infrared light. The bow shock is created by winds that flow from the star, making ripples in the surrounding dust. Image credit: NASA/JPL-Caltech | › Full image and caption

Infrared astronomy also reveals information about cooler objects in space, such as smaller stars too dim to be detected by their visible light, planets beyond our solar system (called exoplanets) and giant molecular clouds where new stars are born. Additionally, many molecules in space, including organic molecules thought to be key to life's formation, have unique spectral signatures in the infrared. Spitzer has been able to detect those molecules when other instruments have not.

Bursts of reds, oranges, greens, blues and violets spread out in all directions from a bright center source. Reds and oranges dominate the left side of the image.

Both NASA's Spitzer and Hubble space telescopes contributed to this vibrant image of the Orion nebula. Spitzer's infrared view exposed carbon-rich molecules, shown in this image as wisps of red and orange. Image credit: NASA/JPL-Caltech/T. Megeath (University of Toledo) & M. Robberto (STScI) | › Full image and caption

Stars are born from condensing clouds of dust and gas. These newly formed stars are optically visible only once they have blown away the cocoon of dust and gas in which they were born. But Spitzer has been able to see infant stars as they form within their gas and dust clouds, helping us learn more about the life cycles of stars and the formation of solar systems.

A blanket of green- and orange-colored stellar dust surrounds a grouping of purple, blue and red stars.

Newborn stars peek out from beneath their natal blanket of dust in this dynamic image of the Rho Ophiuchi dark cloud from Spitzer. The colors in this image reflect the relative temperatures and evolutionary states of the various stars. The youngest stars are shown as red while more evolved stars are shown as blue. Image credit: NASA/JPL-Caltech/Harvard-Smithsonian CfA | › Full image and caption

Infrared emissions from most galaxies come primarily from stars as well as interstellar gas and dust. With Spitzer, astronomers have been able to see which galaxies are furiously forming stars, locate the regions within them where stars are born and pinpoint the cause of the stellar baby boom. Spitzer has given astronomers valuable insights into the structure of our own Milky Way galaxy by revealing where all the new stars are forming.

A bright band of crimson-colored dust stretches across the center of this image, which is covered in tiny specks of light from hundreds of thousands of stars.

This Spitzer image, which covers a horizontal span of 890 light-years, shows hundreds of thousands of stars crowded into the swirling core of our spiral Milky Way galaxy. In visible-light pictures, this region cannot be seen at all because dust lying between Earth and the galactic center blocks our view. Image credit: NASA/JPL-Caltech | › Full image and caption

Spitzer marked a new age in the study of planets outside our solar system by being the first telescope to directly detect light emitted by these so-called exoplanets. This has made it possible for us to directly study and compare these exoplanets. Using Spitzer, astronomers have been able to measure temperatures, winds and the atmospheric composition of exoplanets – and to better understand their potential habitability. The discoveries have even inspired artists at NASA to envision what it might be like to visit these planets.

Collage of exoplanet posters from NASA

Thanks to Spitzer, scientists are learning more and more about planets beyond our solar system. These discoveries have even inspired a series of posters created by artists at NASA, who imagined what future explorers might encounter on these faraway worlds. Image credit: NASA/JPL-Caltech | › Download posters

Data collected by Spitzer will continue to be analyzed for decades to come and is sure to yield even more scientific findings. It's certainly not the end of NASA's quest to get an infrared window into our stellar surroundings. In the coming years, the agency plans to launch its James Webb Space Telescope, with a mirror more than seven times the diameter of Spitzer's, to see the universe in even more detail. And NASA's Wide Field Infrared Survey Telescope, or WFIRST, will continue infrared observations in space with improved technology. Stay tuned for even more exciting infrared imagery, discoveries and learning!

Teach It

Use these lessons, videos and online interactive features to teach students how we use various wavelengths of light, including infrared, to learn about our universe:


Explore More

Also, check out these related resources for kids from NASA’s Space Place:

TAGS: Teachable Moments, science, astronomy, K-12 education, teachers, educators, parents, STEM, lessons, activities, Spitzer, Space Telescope, Missions, Spacecraft, Stars, Galaxies, Universe, Infrared, Wavelengths, Spectrum, Light


Side-by-side satellite and data images of soil moisture, flooding, temperature, a snowstorm, a wildfire and a hurricane

In the News

An extreme weather event is something that falls outside the realm of normal weather patterns. It can range from superpowerful hurricanes to torrential downpours to extended hot, dry weather and more. Extreme weather events are troublesome in themselves, but their effects – including damaging winds, floods, drought and wildfires – can be devastating.

NASA uses airborne and space-based platforms, in conjunction with those from the National Oceanic and Atmospheric Administration, or NOAA, to monitor these events and the ways in which our changing climate is contributing to them. Together, the agencies are collecting more detailed data on weather and climate than ever before, improving society's ability to predict, monitor and respond to extreme events.

NASA makes this data available to the public, and students can use it to understand extreme weather events happening in their regions, learn more about weather and climate in general, and design plans for resilience and mitigation. Read on for a look at the various kinds of extreme weather, how climate change is impacting them, and ways students can use NASA data to explore science for themselves.

How It Works

Global climate change, or the overall warming of our planet, has had observable effects on the environment. Glaciers have shrunk, ice on rivers and lakes is breaking up and melting earlier in the year, precipitation patterns have changed, plant and animal habitat ranges have shifted, and trees are flowering sooner, exposing fruit blossoms to damaging erratic spring hail and deadly late frost. Effects that scientists had predicted in the past are now occurring: loss of sea ice, accelerated sea level rise, shifting storm patterns and longer, more intense heat waves.

Some of the most visible and disruptive effects of global climate change are extreme weather and resulting disasters such as wildfires and flooding. These events vary by geographic location, with many regions, such as the Southwest United States and parts of Central and South America, Asia, Europe, Africa and Australia, experiencing more heat, drought and insect outbreaks that contribute to increased wildfires. Other regions of the world, including coastal areas of the United States and many island nations, are experiencing flooding and salt water intrusion into drinking water wells as a result of sea level rise and storm surges from intense tropical storms. And some areas of the world, such as the Midwestern and Southern United States, have been inundated with rain that has resulted in catastrophic flooding.

Side-by-side images showing the river on a typical day and the river flooded

This pair of images shows the northeast side of Tulsa, Oklahoma, in May 2018 (left) and in May 2019 (right) after the Caney and Verdigris rivers flooded. Image credit: NASA/USGS | › Full image and caption

Temperatures, rainfall, droughts, high-intensity hurricanes and severe flooding events are all increasing and are projected to continue doing so as the world's climate warms, according to the National Climate Assessment. Weather is dynamic, and various types of weather can interact to produce extreme outcomes. Here's how climate change can play a role in some of these weather extremes.

High Temperatures

This color-coded map displays a progression of changing global surface temperature anomalies from 1880 through 2018. Higher-than-normal temperatures are shown in red and lower-than-normal temperatures are shown in blue. The final frame represents the five-year global temperature average for 2014 through 2018. Scale is in degrees Celsius. Credit: NASA's Scientific Visualization Studio. Data provided by Robert B. Schmunk (NASA/GSFC GISS). | Watch on YouTube

Eighteen of the 19 warmest years on record have occurred since 2001. September 2019 tied as the planet's hottest September on record. Since the 1880s, the average global surface temperature has risen about 2 degrees Fahrenheit (1 degree Celsius). As a result of warming temperatures, global average sea level has risen nearly 7 inches (178 millimeters) over the past 100 years. Data show this warming of the Earth system has been driven in large part by increased emissions of carbon dioxide and other greenhouse gases created by human activities. And as temperatures continue to rise, we can expect more extreme weather.

Drought and Wildfires

Side-by-side images showing red areas throughout Alaska representing hotter than usual temperatures and a satellite image showing smoke and clouds coming from the same areas

The image on the left shows air temperatures during a record-breaking June 2019 heat wave in Alaska. Around the same time, a cluster of lightning-triggered wildfires broke out in the same area. Smoke from the wildfires can be seen in the image on the right. Image credit: NASA | › Full image and caption

High temperatures alone can lead to drought. Drought can cause problems for humans, animals and crops dependent on water and can weaken trees, making them more susceptible to disease and insect attacks. High temperatures combined with low humidity, dry vegetation and hot, dry, fast winds typify what is known as "fire weather" or "fire season." During fire season, wildfires are more likely to start, spread rapidly and be difficult to extinguish.

A satellite image of Northern California showing a dark reddish brown section with smoke emanating from it

The Operational Land Imager on the Landsat 8 satellite captured this image of the Walker Fire in Northern California on Sept. 8, 2019. Image credit: NASA/USGS | › Full image and caption

In California, where climate change has brought hotter, drier weather, residents are plagued by two fire seasons – one lasting from June through September that is driven primarily by high heat, low humidity and dry vegetation, and another lasting from October through April that is generally more volatile because it is fueled by high winds. This 11-month fire season is longer than in past years, and in recent years, California has also seen an increase in destructive wildfires. Weather extremes and climate change are partly to blame, even in relatively wet years: In California, wet years mean more plant growth – and potentially more fuel for fires when those plants dry out in the fall and the winds arrive.

Wildfires have some fairly obvious effects on people and property. Beyond the visible destruction, wildfires can dramatically decrease air quality, pushing carbon into the air while destroying the plants and trees that help sequester it. Large-scale biomass destruction, such as what is happening in the Amazon rainforest, will have a lasting impact on important Earth processes.

Hurricanes

Satellite image of a hurricane heading towards Japan

This image, acquired on Oct. 11, 2019, by the Moderate Resolution Imaging Spectroradiometer, or MODIS, on NASA's Aqua satellite, shows Typhoon Hagibis as its outer cloud bands neared Japan. Image credit: NASA | › Full image and caption

Since the 1980s, regions of the world prone to hurricanes, cyclones and typhoons have witnessed an increase in the intensity, frequency and duration of these destructive storms. All three are intense tropical storms that form over oceans. (The different names refer to where on Earth they occur.) They are all fueled by available heat energy in warm ocean water, so warmer oceans provide more energy to passing storms, meaning hurricanes can form more quickly and reach higher speeds.

Typhoon Hagibis, which recently left a trail of destruction in Japan, was described as the worst storm to hit the region in decades. Growing unusually quickly from a tropical storm to a Category 5 storm in less than a day, Hagibis was so intense it was called a super typhoon. In 2018, the second-strongest cyclone to hit a U.S. territory and the largest typhoon of that year, Super Typhoon Yutu, caused catastrophic destruction on the Mariana Islands, an archipelago in the North Pacific Ocean. More intense storms and rising sea levels make storm surge – ocean water that is pushed toward the shore by strong winds – even worse than in the past. Typhoons can wreak havoc on infrastructure and compromise fresh water reserves, and it can take months or even years for a hard-hit region to recover.

Snowstorms

Satellite image of white snow clouds and snow over the Mid-Atlantic U.S.

The MODIS instrument aboard NASA's Terra satellite captured the low-pressure area near New England that brought heavy snow and thundersnow to the Mid-Atlantic and Northeastern U.S. in January 2011. Image credit: NASA Goddard/MODIS Rapid Response Team | › Full image and caption

Like any other weather event, extreme cold weather events such as blizzards and unusually heavy snowfall can be, but are not always, linked to climate change. Just as warmer ocean water increases the intensity of a warm tropical storm, warmer-than-average winter ocean temperatures in the Atlantic feed additional energy and moisture into cold storms, influencing the severity of snowfall once a storm comes ashore in the Eastern United States. Natural variability, such as the presence of El Niño conditions, can also lead to severe snowstorms in the region. But natural variability isn't enough to fully explain the increase in major snowstorms in the U.S. In fact, the frequency of extreme snowstorms in the eastern two-thirds of the contiguous U.S. has increased dramatically over the last century, with approximately twice as many extreme snowstorms occurring in the latter half of the 20th century as in the first half.

Why It's Important

Because of the risk to lives and property, monitoring the increasing number of extreme weather events is more important now than ever before. And a number of NASA satellites and airborne science instruments are doing just that.

Artist's concept of dozens of satellites circling Earth with a glare from the Sun in the background

This graphic shows NASA's fleet of Earth-science satellites designed to monitor weather and climate across the globe. Image credit: NASA | › Full image and caption

A large global constellation of satellites operated by NASA and NOAA, combined with a small fleet of planes operated by the U.S. Forest Service, helps detect and map the extent, spread and impact of forest fires. As technology has advanced, so has the value of remote sensing, the science of scanning Earth from a distance using satellites and high-flying airplanes. Wildfire data from satellites and aircraft provide information that firefighters and command centers can use to issue evacuation orders and decide where to deploy crews to best arrest a fire's progress.

The agencies' satellites and airborne instruments also work in conjunction with those from international partners to provide data about hurricanes to decision makers at the National Hurricane Center, where predictions and warnings are issued so evacuations can be coordinated among the public and local authorities. Visible imagery from NASA satellites helps forecasters understand whether a storm is brewing or weakening based on changes to its structure. Other instruments on NASA satellites can measure sea surface characteristics, wind speeds, precipitation, and the height, thickness and inner structure of clouds.

Three side-by-side data images of the hurricane from different perspectives with colors overlayed to represent various science data

Three images of Hurricane Dorian, as seen by a trio of NASA's Earth-observing satellites in August 2019. The data sent by the spacecraft revealed in-depth views of the storm, including detailed heavy rain, cloud height and wind. Image credit: NASA/JPL-Caltech | › Full image and caption

NASA's airborne instruments, such as those aboard the Global Hawk aircraft, provide data from within the storm that cannot be otherwise obtained. Global Hawk can fly above a storm in a back-and-forth pattern and drop instruments called dropsondes through the storm. These instruments measure winds, temperature, pressure and humidity on their way to the surface. This detailed data can be used to characterize a storm, informing scientists of shifting patterns and potential future developments.

NASA missions will continue to study both weather and climate phenomena – whether they be droughts, floods, wildfires, hurricanes or other extremes – returning data for analysis. New airborne instruments aboard the satellite-simulating ER-2 and cloud-penetrating P-3 aircraft will fly missions starting in 2020 to study Atlantic coast-threatening snowstorms. Data from these flights will be combined with ground-based radar measurements and satellite measurements to better understand storms and their potential impact. Meanwhile, climate science instruments and satellites will continue to collect data that can inform everyone about the many aspects of our changing planet.

Teach It

Weather and climate data isn't just for meteorologists. Explore the resources and standards-aligned lessons below to get students analyzing local weather patterns, understanding wildfire monitoring and modeling global climate!

Precipitation and Clouds

Wildfires and Temperature

Sea Level

Satellites and Data

Climate

For Students

Explore More

Resources for Students

TAGS: Earth, Earth science, climate change, weather, extreme weather, hurricane, wildfire, typhoons, drought, flood, sea level rise, Climate TM


Buzz Aldrin stands on the Moon in his puffy, white spacesuit next to a rippling American flag. The lunar module casts a long, dark shadow nearby.

In the News

This year marks the 50th anniversary of humans landing on the Moon. Now NASA is headed to the Moon once again, using it as a proving ground for a future human mission to Mars. Use this opportunity to get students excited about Earth's natural satellite, the amazing feats accomplished 50 years ago and plans for future exploration.

How They Did It

When NASA was founded in 1958, scientists were unsure whether the human body could even survive orbiting Earth. Space is a demanding environment: It lacks air for breathing, can be very cold or hot depending on where you are, and can have dangerous levels of radiation. Additionally, the physics of space travel make everything inside a space capsule feel weightless even while it's hurtling through space. Floating around inside a protective spacecraft may sound fun – and it is – but it can also have detrimental effects on the human body. Plus, it can be dangerous, with the hostile environment of space lurking on the other side of a thin metal shell.

In 1959, NASA's Jet Propulsion Laboratory began the Ranger project, a mission designed to impact the Moon – in other words, make a planned crash landing. During its descent, the spacecraft would take pictures that could be sent back to Earth and studied in detail. These days, aiming to merely impact a large solar system body sounds rudimentary. But back then, engineering capabilities and course-of-travel, or trajectory, mathematics were being developed for the first time. A successful impact would be a major scientific and mathematical accomplishment. In fact, it took until July 1964 to achieve the monumental task, with Ranger 7 becoming the first U.S. spacecraft to impact the near side of the Moon, capturing and returning images during its descent.

Side-by-side images of a model of the Ranger 7 spacecraft in color and a black and white image of the Moon taken by Ranger 7.

These side-by-side images show a model of the Ranger 7 spacecraft (left) and an image the spacecraft took of the Moon (right) before it impacted the surface. Image credit: NASA/JPL-Caltech | › + Expand image

After the successful Ranger 7 mission, two more Ranger missions were sent to the Moon. Then, it was time to land softly. For this task, JPL partnered with Hughes Aircraft Company to design and operate the Surveyor missions between 1966 and 1968. Each of the seven Surveyor landers was equipped with a television camera – with later landers carrying scientific instruments, too – aimed at obtaining up-close lunar surface data to assess the Moon's suitability for a human landing. The Surveyors also demonstrated in-flight maneuvers and in-flight and surface-communications capabilities.

Side-by-side image of an astronaut next to the Surveyor 3 lander and a mosaic of images from Surveyor 3

These side-by-side images show Apollo 12 Commander Charles Conrad Jr. posing with the Surveyor 3 spacecraft on the Moon (left) and a mosaic of images taken by Surveyor 3 on the lunar surface (right). Image credits: NASA/JPL-Caltech | › + Expand image

In 1958, at the same time JPL was developing the technological capabilities to get to the Moon, NASA began the Mercury program to see if it was possible for humans to function in space. The success of the single-passenger Mercury missions, with six successful flights that placed two astronauts into suborbital flight and four astronauts into Earth orbit, kicked off the era of U.S. human spaceflight.

Cutaway illustration of the Mercury capsule with a single astronaut inside.

The success of the single-passenger Mercury capsule, shown in this illustrated diagram, proved that humans could live and work in space, paving the way for future human exploration. Image credit: NASA | › Full image and caption

Beginning in 1965, NASA's Gemini program proved that a larger capsule containing two humans could orbit Earth, allowing astronauts to work together to accomplish science in orbit on longer-duration missions (up to two weeks in space) and laying the groundwork for a human mission to the Moon. With the Gemini program, scientists and engineers learned how spacecraft could rendezvous and dock while in orbit around Earth. They were also able to perfect re-entry and landing methods and began to better understand the effects of longer spaceflights on astronauts. After the successful Gemini missions, it was time to send humans to the Moon.

Cutaway illustration of the Gemini spacecraft with two astronauts inside.

The Gemini spacecraft, shown in this illustrated cutaway, paved the way for the Apollo missions. Image credit: NASA | › Full image and caption

The Apollo program shifted into high gear after President John F. Kennedy directed NASA in May 1961 to place humans on the Moon by the end of the decade. This was a formidable task, as no hardware existed at the time that could accomplish the feat. NASA needed to build a giant rocket, a crew capsule and a lunar lander. And each component needed to function flawlessly.

Rapid progress was made, involving numerous NASA and contractor facilities and hundreds of thousands of workers. A crew capsule was designed, built and tested for spaceflight and landing in water by the NASA contractor North American Aviation, which eventually became part of Boeing. A lunar lander was developed by the Grumman Corporation. Though much of the astronaut training took place at or near the Manned Spacecraft Center, now known as NASA’s Johnson Space Center, in Texas, astronauts practiced lunar landings here on Earth using simulators at NASA's Dryden (now Armstrong) Flight Research Center in California and at NASA's Langley Research Center in Virginia.

The enormous Saturn V rocket was a marvel of complexity. Its first stage was developed by NASA's Marshall Space Flight Center in Alabama. The upper-stage development was managed by the Lewis Flight Propulsion Center, now known as NASA's Glenn Research Center, in Ohio in partnership with North American Aviation and Douglas Aircraft Corporation, while Boeing integrated the whole vehicle. The engines were tested at what is now NASA's Stennis Space Center in Mississippi, and the rocket was transported in pieces by water for assembly at Cape Kennedy, now NASA's Kennedy Space Center, in Florida.

As the Saturn V was being developed and tested, NASA also developed a smaller, interim vehicle known as the Saturn I and started using it to test Apollo hardware. A Saturn I first flew the Apollo command module design in 1964.

Unfortunately, one crewed test of the Apollo command module turned tragic in February 1967, when a fire erupted in the capsule and killed all three astronauts who had been designated as the prime crew for what became known as Apollo 1. The command module design was altered in response, delaying the first crewed Apollo launch by 21 months. In the meantime, NASA flew several uncrewed Apollo missions to test the Saturn V. The first crewed Apollo launch became Apollo 7, flown on a Saturn IB, and proved that the redesigned command module would support its crew while remaining in Earth orbit. Next, Earth-Moon trajectories were calculated for this large capsule, and the Saturn V powered Apollo 8 set off for the Moon, proving that the calculations were accurate, orbiting the Moon was feasible and a safe return to Earth was possible. Apollo 8 also provided the first TV broadcast from lunar orbit. The next few Apollo missions further proved the technology and allowed humans to practice procedures that would be needed for an eventual Moon landing.

On July 16, 1969, a Saturn V rocket launched three astronauts to the Moon on Apollo 11 from Cape Kennedy. The Apollo 11 spacecraft had three parts: a command module, called "Columbia," with a cabin for the three astronauts; a service module that provided propulsion, electricity, oxygen and water; and a lunar module, "Eagle," that provided descent to the lunar surface and ascent back to the command and service modules.

Collage of three images showing the lunar module during its descent to the Moon, on the lunar surface and during its ascent.

In this image collage, the Apollo 11 lunar module is shown on its descent to the Moon (left), on the lunar surface as Buzz Aldrin descends the stairs (middle), and on its ascent back to the command module (right). Image credit: NASA | › View full image collection

On July 20, while astronaut and command module pilot Michael Collins orbited the Moon, Neil Armstrong and Buzz Aldrin landed Eagle on the Moon and set foot on the surface, accomplishing a first for humankind. They collected regolith (surface "dirt") and rock samples, set up experiments, planted an American flag and left behind medallions honoring the Apollo 1 crew and a plaque that read, "We came in peace for all mankind."

Collage of images showing Buzz Aldrin doing various activities on the Moon.

This collage of images from the Apollo 11 Moon landing shows Buzz Aldrin posing for a photo on the Moon (left) and setting up the solar wind and seismic experiments (middle). The image on the right shows the plaque the team placed on the Moon to commemorate the historic event. Image credit: NASA | › View full image collection

After 21.5 hours on the lunar surface, Armstrong and Aldrin rejoined Collins in the Columbia command module and, on July 21, headed back to Earth. On July 24, after jettisoning the service module, Columbia entered Earth's atmosphere. With its heat shield facing forward to protect the astronauts from the extreme heat of re-entry, the craft slowed, and a series of parachutes deployed. The module splashed down in the South Pacific Ocean, 380 kilometers (210 nautical miles) south of Johnston Atoll.

Because scientists were uncertain about contamination from the Moon, the astronauts donned biological-isolation garments delivered by divers from the recovery ship, the aircraft carrier USS Hornet. The astronauts boarded a life raft and then the USS Hornet, where the outside of their biological-isolation suits was washed down with disinfectant. To be sure no contamination had been brought back from the Moon, the astronauts were quarantined until Aug. 10, when scientists determined there was little risk that biological contaminants or microbes had returned with them. Columbia was also disinfected and is now part of the National Air and Space Museum in Washington, D.C.

On the left, a capsule floats in the ocean while astronauts sit in a raft in gray suits. On the right, the three astronauts smile while looking out of a small window as Nixon faces them with a microphone in front of him.

These side-by-side images show the Apollo 11 astronauts leaving the capsule in their biological isolation garments after successfully splashing down in the South Pacific Ocean (left). At right, President Richard M. Nixon welcomes the Apollo 11 astronauts, (left to right) Neil A. Armstrong, Michael Collins and Buzz Aldrin, while they peer through the window of the Mobile Quarantine Facility aboard the USS Hornet. Image credit: NASA | › View full image collection

The Apollo program continued with six more missions to the Moon over the next three years. Astronauts placed seismometers to measure "moonquakes" and other science instruments on the lunar surface, performed science experiments, drove a carlike moon buggy on the surface, planted additional flags and returned more lunar samples to Earth for study.

Why It's Important

Apollo started out as a demonstration of America's technological, economic and political prowess, which it accomplished with the first Moon landing. But the Apollo missions accomplished even more in the realm of science and engineering.

Some of the earliest beneficiaries of Apollo research were Earth scientists. The Apollo 7 and 9 missions, which stayed in Earth orbit, took photographs of Earth in different wavelengths of light, highlighting things that might not be seen on the ground, like diseased trees and crops. This research led directly to the joint NASA-U.S. Geological Survey Landsat program, which has been studying Earth's resources from space for more than 45 years.

Samples returned from the Moon continue to be studied by scientists around the world. As new tools and techniques are developed, scientists can learn even more about our Moon, discovering clues to our planet's origins and the formation of the solar system. Additionally, educators can be certified to borrow lunar samples for use in their classrooms.

The Apollo 11 astronauts crowd around a lunar sample contained in a protective case.

The Apollo 11 astronauts take a closer look at a sample they brought back from the Moon. Image credit: NASA | › View full image collection

Perhaps the most important scientific finding came from comparing similarities in the composition of lunar and terrestrial rocks and then noting differences in the amount of specific substances. This suggested a new theory of the Moon's formation: that it accreted from debris ejected from Earth by a collision with a Mars-size object early in our planet's 4.5-billion-year history.

The 12 astronauts who walked on the Moon are the best-known faces of the Apollo program, but in numbers, they were also the smallest part of the program. About 400,000 men and women worked on Apollo, building the vehicles, calculating trajectories, even making and packing food for the crews. Many of them worked on solving a deceptively simple question: "How do we guide astronauts to the Moon and back safely?" Some built the spacecraft to carry humans to the Moon, enable surface operations and safely return astronauts to Earth. Others built the rockets that would launch these advanced spacecraft.

In doing all this, NASA engineers and scientists helped lead the computing revolution from transistors to integrated circuits, the forebears of the microchip. An integrated circuit – a miniaturized electronic circuit that is used in nearly all electronic equipment today – is lighter, smaller and able to function on less power than the older transistors and capacitors it replaced. To suit the needs of the space capsule, NASA developed integrated circuits for use in the capsule's onboard computers. Additionally, computing advancements provided NASA with software that worked exactly as it was supposed to every time. That software led to the development of the systems used today in retail credit-card swipe devices.

Some lesser-known benefits of the Apollo program include the technologies that commercial industries would then further advance to benefit humans right here on Earth. These "spinoffs" include technology that improved kidney dialysis, modernized athletic shoes, improved home insulation, advanced commercial and residential water filtration, and developed the freeze-drying technique for preserving foods.

Apollo was succeeded by missions that have continued to build a human presence in space and advance technologies on Earth. Hardware developed for Apollo was used to build America's first Earth-orbiting space station, Skylab. After Skylab, during the Apollo-Soyuz Test Project, American and Soviet spacecraft docked together, laying the groundwork for international cooperation in human spaceflight. American astronauts and Russian cosmonauts later worked together aboard the Russian space station Mir, performing science experiments and learning about the effects of long-term space travel on the human body. Eventually, the U.S. and Russia, along with 13 other nations, partnered to build and operate the International Space Station, a world-class science laboratory orbiting 400 kilometers (250 miles) above Earth and completing an orbit every 90 minutes.

Graphic showing a possible configuration for the future lunar gateway

Although the configuration is not final, this infographic shows the current lineup of parts comprising the lunar Gateway. Image credit: NASA | › Full image and caption

And the innovations continue today. NASA is planning its Artemis program to put humans on the Moon again in 2024 with innovative new technologies and the intent of establishing a permanent human presence. Working in tandem with commercial and international partners, NASA will develop the Space Launch System launch vehicle, the Orion crew capsule, a new lunar lander and other operations hardware. The lunar Gateway – a small spaceship orbiting the Moon that will include living quarters for astronauts, a lab for science and research, and ports for visiting spacecraft – will provide access to more of the lunar surface than ever before. While at the Moon, astronauts will research ways to use lunar resources for survival and further technological development. The lessons and discoveries from Artemis will eventually pave a path for a future human mission to Mars.

Teach It

Use these standards-aligned lessons to help students learn more about Earth's only natural satellite:

As students head out for the summer, get them excited to learn more about the Moon and human exploration using these student projects:

Explore More

TAGS: K-12 Education, Teachers, Educators, Classroom, Engineering, Science, Students, Projects, Moon, Apollo, Summer


A glowing, orange ring outlines a black hole.

Find out how scientists created a virtual telescope as large as Earth itself to capture the first image of a black hole's silhouette.


Accomplishing what was previously thought to be impossible, a team of international astronomers has captured an image of a black hole’s silhouette.

Evidence for the existence of black holes – mysterious places in space where nothing, not even light, can escape – has been around for quite some time, and astronomers have long observed the effects of these phenomena on their surroundings. In the popular imagination, it was thought that capturing an image of a black hole was impossible because an image of something from which no light can escape would appear completely black. For scientists, the challenge was how, from thousands or even millions of light-years away, to capture an image of the hot, glowing gas falling into a black hole.

An ambitious team of international astronomers and computer scientists has managed to accomplish both. Working for well over a decade to achieve the feat, the team improved upon an existing radio astronomy technique for high-resolution imaging and used it to detect the silhouette of a black hole – outlined by the glowing gas that surrounds its event horizon, the precipice beyond which light cannot escape. Learning about these mysterious structures can help students understand gravity and the dynamic nature of our universe, all while sharpening their math skills.

How They Did It

Though scientists had theorized they could image black holes by capturing their silhouettes against their glowing surroundings, the ability to image an object so distant still eluded them. A team formed to take on the challenge, creating a network of telescopes known as the Event Horizon Telescope, or the EHT. They set out to capture an image of a black hole by improving upon a technique that allows for the imaging of far-away objects, known as Very Long Baseline Interferometry, or VLBI.

Telescopes of all types are used to see distant objects. The larger the diameter, or aperture, of a telescope, the more light it can gather and the higher its resolution, or ability to image fine details. To see details in objects that are far away and appear small and dim from Earth, we need to gather as much light as possible at very high resolution, so we need a telescope with a very large aperture.

That’s why the VLBI technique was essential to capturing the black hole image. VLBI works by creating an array of smaller telescopes that can be synchronized to focus on the same object at the same time and act as a giant virtual telescope. In some cases, the smaller telescopes are also an array of multiple telescopes. This technique has been used to track spacecraft and to image distant cosmic radio sources, such as quasars.

More than a dozen antennas pointing forward sit on barren land surrounded by red and blue-purple mountains in the distance.

Making up one piece of the EHT array of telescopes, the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile has 66 high-precision antennas. Image credit: NRAO/AUI/NSF | + Expand image

The aperture of a giant virtual telescope such as the Event Horizon Telescope is as large as the distance between the two farthest-apart telescope stations – for the EHT, those two stations are at the South Pole and in Spain, creating an aperture that’s nearly the same as the diameter of Earth. Each telescope in the array focuses on the target, in this case the black hole, and collects data from its location on Earth, providing a portion of the EHT’s full view. The more telescopes in the array that are widely spaced, the better the image resolution.
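
To get a rough sense of the power of this approach, we can plug approximate EHT numbers into the standard diffraction-limit formula, θ ≈ 1.22 λ/D. The Python sketch below is our own back-of-the-envelope illustration, assuming the EHT's roughly 1.3-millimeter observing wavelength and a baseline close to Earth's diameter:

```python
import math

# Back-of-the-envelope estimate of the EHT's angular resolution using the
# diffraction limit: theta ~ 1.22 * wavelength / aperture (in radians).
# Values are approximate, for illustration only.

wavelength_m = 1.3e-3  # EHT observes radio light near 1.3 millimeters
baseline_m = 1.27e7    # farthest-apart stations span roughly Earth's diameter

theta_rad = 1.22 * wavelength_m / baseline_m

# Convert radians to microarcseconds.
theta_microarcsec = theta_rad * (180 / math.pi) * 3600 * 1e6
print(f"~{theta_microarcsec:.0f} microarcseconds")  # on the order of 25
```

An angular resolution of a few tens of microarcseconds is what makes imaging a black hole's silhouette feasible at all; no single physical dish could come close.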

This video shows the global network of radio telescopes in the EHT array that performed observations of the black hole in the galaxy M87. Credit: C. Fromm and L. Rezzolla (Goethe University Frankfurt)/Black Hole Cam/EHT Collaboration | Watch on YouTube

To test VLBI for imaging a black hole and a number of computer algorithms for sorting and synchronizing data, the Event Horizon Telescope team decided on two targets, each offering unique challenges.

The closest supermassive black hole to Earth, Sagittarius A*, interested the team because it is in our galactic backyard – at the center of our Milky Way galaxy, 26,000 light-years (156 quadrillion miles) away. (An asterisk is the astronomical standard for denoting a black hole.) Though not the only black hole in our galaxy, it is the black hole that appears largest from Earth. But its location in the same galaxy as Earth meant the team would have to look through “pollution” caused by stars and dust to image it, meaning there would be more data to filter out when processing the image. Nevertheless, because of the black hole’s local interest and relatively large size, the EHT team chose Sagittarius A* as one of its two targets.

An image showing a smattering of orange stars against the black backdrop of space with a small black circle in the middle and a rectangle identifying the location of the M87 black hole.

A close-up image of the core of the M87 galaxy, imaged by the Chandra X-ray Observatory. Image credit: NASA/CXC/Villanova University/J. Neilsen | + Expand image

A blue jet extends from a bright yellow point surrounded by smaller yellow stars.

This image from NASA's Hubble Space Telescope shows a jet of subatomic particles streaming from the center of M87*. Image credits: NASA and the Hubble Heritage Team (STScI/AURA) | + Expand image

The second target was the supermassive black hole M87*. One of the largest known supermassive black holes, M87* is located at the center of the gargantuan elliptical galaxy Messier 87, or M87, 53 million light-years (318 quintillion miles) away. Substantially more massive than Sagittarius A*, which contains 4 million solar masses, M87* contains 6.5 billion solar masses. One solar mass is equivalent to the mass of our Sun, approximately 2x10^30 kilograms. In addition to its size, M87* interested scientists because, unlike Sagittarius A*, it is an active black hole, with matter falling into it and spewing out in the form of jets of particles that are accelerated to velocities near the speed of light. But its distance made it even more of a challenge to capture than the relatively local Sagittarius A*. As described by Katie Bouman, a computer scientist with the EHT who led development of one of the algorithms used to sort telescope data during the processing of the historic image, it’s akin to capturing an image of an orange on the surface of the Moon.
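
To put those numbers in perspective, the sketch below – our own illustration using the round figures quoted above – computes the Schwarzschild radius of M87* from its mass, and then the tiny angle its event horizon subtends as seen from Earth:

```python
# Size and apparent angular size of M87*, using approximate values from
# this article. For illustration only.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m/s
SOLAR_MASS_KG = 2e30   # approximate mass of the Sun, kg
LIGHT_YEAR_M = 9.461e15

mass_kg = 6.5e9 * SOLAR_MASS_KG   # M87* holds about 6.5 billion solar masses
distance_m = 53e6 * LIGHT_YEAR_M  # about 53 million light-years away

# Schwarzschild radius: r_s = 2GM / c^2
r_s = 2 * G * mass_kg / C**2
print(f"Schwarzschild radius: {r_s:.1e} m")  # ~1.9e13 m, wider than Neptune's orbit

# Angular diameter of the event horizon from Earth (radians to microarcseconds).
angle_microarcsec = (2 * r_s / distance_m) * 2.063e11
print(f"Apparent size: ~{angle_microarcsec:.0f} microarcseconds")  # ~16
```

Gravitational lensing makes the dark silhouette in the EHT image appear roughly 2.6 times larger than the event horizon itself – still only about 40 microarcseconds across, which is why Earth-scale resolution was required.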

By 2017, the EHT was a collaboration of eight sites around the world – and more have been added since then. Before the team could begin collecting data, they had to find a time when the weather was likely to be conducive to telescope viewing at every location. For M87*, the team tried for good weather in April 2017 and, of the 10 days chosen for observation, a whopping four days were clear at all eight sites!

Each telescope used for the EHT had to be highly synchronized with the others to within a fraction of a millimeter using an atomic clock locked onto a GPS time standard. This degree of precision makes the EHT capable of resolving objects about 4,000 times better than the Hubble Space Telescope can. As each telescope acquired data from the target black hole, the digitized data and time stamp were recorded on computer disk media. Gathering data for four days around the world gave the team a substantial amount of data to process. The recorded media were then physically transported to a central location because the amount of data – around 5 petabytes – exceeds what current internet speeds can handle. At this central location, data from all eight sites were synchronized using the time stamps and combined to create a composite set of images, revealing the never-before-seen silhouette of M87*’s event horizon. The team is also working on generating an image of Sagittarius A* from additional observations made by the EHT.
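
Just how impractical would sending that much data over the internet have been? A quick estimate – ours, assuming a generous 1-gigabit-per-second connection – shows why shipping disks won out:

```python
# Rough estimate of how long 5 petabytes would take to upload.
# The 1 Gbps link speed is an assumption for illustration.

data_bytes = 5e15             # about 5 petabytes of recorded observations
link_bits_per_second = 1e9    # assumed 1 Gbps connection

transfer_seconds = data_bytes * 8 / link_bits_per_second
print(f"~{transfer_seconds / 86400:.0f} days of nonstop transfer")  # ~463 days
```

More than a year of continuous uploading versus a few days of shipping: the hard drives won easily.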

This zoom video starts with a view of the ALMA telescope array in Chile and zooms in on the heart of M87, showing successively more detailed observations and culminating in the first direct visual evidence of a supermassive black hole’s silhouette. Credit: ESO/L. Calçada, Digitized Sky Survey 2, ESA/Hubble, RadioAstron, De Gasperin et al., Kim et al., EHT Collaboration. Music: Niklas Falcke | Watch on YouTube

As more telescopes are added and the rotation of Earth is factored in, more of the image can be resolved, and we can expect future images to be higher resolution. But we might never have a complete picture, as Katie Bouman explains here (under “Imaging a Black Hole”).

To complement the EHT findings, several NASA spacecraft were part of a large effort to observe the black hole using different wavelengths of light. As part of this effort, NASA’s Chandra X-ray Observatory, Nuclear Spectroscopic Telescope Array (NuSTAR) and Neil Gehrels Swift Observatory space telescope missions – all designed to detect different varieties of X-ray light – turned their gaze to the M87 black hole around the same time as the EHT in April 2017. NASA’s Fermi Gamma-ray Space Telescope was also watching for changes in gamma-ray light from M87* during the EHT observations. If the EHT observed changes in the structure of the black hole’s environment, data from these missions and other telescopes could be used to help figure out what was going on.

Though NASA observations did not directly trace out the historic image, astronomers used data from the Chandra and NuSTAR satellites to measure the X-ray brightness of M87*’s jet. Scientists used this information to compare their models of the jet and the disk around the black hole with the EHT observations. Other insights may come as researchers continue to pore over these data.

Why It's Important

Learning about mysterious structures in the universe provides insight into physics and allows us to test observation methods and theories, such as Einstein’s theory of general relativity. Massive objects deform spacetime in their vicinity, and although general relativity has been directly confirmed for smaller-mass objects, such as Earth and the Sun, it had not yet been directly tested for black holes and other regions containing extremely dense matter.

One of the main results of the EHT black hole imaging project is a more direct calculation of a black hole’s mass than ever before. Using the EHT, scientists were able to directly observe and measure the radius of M87*’s event horizon, or its Schwarzschild radius, and compute the black hole’s mass. That estimate was close to the one derived from a method that uses the motion of orbiting stars – thus validating it as a method of mass estimation.
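
In equation form – a standard general-relativity relation, included here for reference – the measured horizon radius converts directly to a mass:

\[
r_s = \frac{2GM}{c^2} \quad\Longrightarrow\quad M = \frac{c^2 r_s}{2G}
\]

Plugging in an event horizon radius of roughly 1.9 x 10^13 meters yields about 6.5 billion solar masses, consistent with the estimate from stellar motions.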

The size and shape of a black hole, which depend on its mass and spin, can be predicted from general relativity equations. General relativity predicts that this silhouette would be roughly circular, but other theories of gravity predict slightly different shapes. The image of M87* shows a circular silhouette, thus lending credibility to Einstein’s theory of general relativity near black holes.

An illustration of a black hole surrounded by a bright, colorful swirl of material. Text describes each part of the black hole and its surroundings.

This artist’s impression depicts a rapidly spinning supermassive black hole surrounded by an accretion disc. Image credit: ESO | + Expand image

The data also offer some insight into the formation and behavior of black hole structures, such as the accretion disk that feeds matter into the black hole and plasma jets that emanate from its center. Scientists have hypothesized about how an accretion disk forms, but they’ve never been able to test their theories with direct observation until now. Scientists are also curious about the mechanism by which some supermassive black holes emit enormous jets of particles traveling at near light-speed.

These questions and others will be answered as more data is acquired by the EHT and synthesized in computer algorithms. Be sure to stay tuned for that and the next expected image of a black hole – our Milky Way’s own Sagittarius A*.

Update: May 12, 2022 – Scientists have released the first image of Sagittarius A* captured by the Event Horizon Telescope. › Read more from Teachable Moments

Teach It

Capture your students’ enthusiasm about black holes by challenging them to solve these standards-aligned math problems.

Model black-hole interaction with this NGSS-aligned lesson:

Explore More


Check out these related resources for students from NASA’s Space Place:

TAGS: Black Hole, Teachable Moments, Science, K-12 Education, Teachers, Educators, Universe
