Brittney Cooper stands in a sandy area holding a controller attached to a rover

Brittney Cooper loves studying weather – and she's taking that passion all the way to Mars. A graduate student at York University in Toronto, Cooper has spent the past two years working with the science team for NASA's Mars rover Curiosity. In January, she authored her first science paper on a study she designed with the Curiosity team that looked at how clouds scatter light and what that tells us about the shapes of their ice crystals. Despite her involvement in the Curiosity mission, the Canada native has never actually been to a NASA center. But that's about to change this summer when she'll embark on her first internship at JPL in Pasadena, California. We caught up with Cooper to find out what she's looking forward to most about her internship and how she's planning to take her studies of Martian clouds even farther.

You're currently earning your master's at York University in Toronto. What are you studying and what got you interested in that field?

I'm doing my master's in Earth and space science. But if you really want an interesting story [laughs] … I've always been interested in astronomy, space and science, but I also really love art. Coming to the end of high school, I realized that maybe it was going to be too hard for me to pursue science. Maybe I was a little scared and I didn't really think I was going to be able to do it. So I went to university for photography for two years. After two years, I realized photography wasn't challenging me in the right ways and wasn't what I wanted to do for the rest of my life. So I left. I did night school to get credits for calculus and all the grade-12 physics and chemistry that I needed to pursue a degree in atmospheric science, which is not even remotely astronomy, but I've also always loved weather – pretty much anything in the sky. I still had a passion for astronomy, so I started volunteering at the Allan I. Carswell observatory at York. There, I met a professor who I ended up doing research with for many years. He told me, "There's the field called planetary science, where you can study the atmospheres of other planets and you can kind of marry those two fields that you're interested in [astronomy and atmospheric science]." So I ended up adding an astronomy major.

Brittney Cooper stands in the snow surrounded by pine trees and holds out a device to measure the flux of solar radiation

Cooper measures the downward flux of solar radiation during a winter snow survey. Image courtesy Brittney Cooper

Later, I started doing research with this professor, John Moores, as an undergrad. In my last year, there was a Ph.D. student who was a participating scientist on NASA's Mars Science Laboratory mission and he was graduating. John had said something along the lines of, "There's an opening, and I know it's always been your dream to work in mission control, so do you want to be on the mission?" And I was, like, "Yes, I definitely do!" I couldn't believe it. And I was never intending to do a master's, but then I realized I really loved the work I was doing, working on constraining physical properties of Martian water-ice clouds using the Mars Curiosity rover. We got to design this observation, which ran on the rover, and then I got to work with the data from it, which was really cool. So I stayed on to do my master's, and I'm still on the mission, which is pretty awesome.

In January you authored your first science paper on that research. Tell me more about that.

A black and white animated image showing light, wispy clouds moving across the Martian sky

Wispy clouds float across the Martian sky in this accelerated sequence of images from NASA's Curiosity Mars rover. Image credit: NASA/JPL-Caltech/York University

My research focuses on the physical scattering properties of Martian water-ice clouds. A lot of people don't even realize that there are clouds on Mars, which I totally get because Mars doesn't have much of an atmosphere. But it does have enough of an atmosphere to create very thin, wispy, almost cirrus-like clouds similar to the ones we have on Earth. They're made up of small, water-ice crystals. These kinds of clouds do have a noticeable impact on Earth's climate, so we have now started thinking about what these clouds are doing in Mars' climate. The scattering properties can tell us a bit about that. They can tell us how much radiation is scattered back to space by these clouds or kept in Mars' atmosphere and whether or not we can see really fun things like halos, glories and different types of optical phenomena that we can see here on Earth.

We designed this observation that uses the Navcam imager on Curiosity. The engineering folks with the mission helped us design it. I got to present at a science discussion, which was super scary, but everyone was so kind. And then the observation was approved to run on Mars once a week from September 2017 to March 2018. During this observation window, Curiosity would take images of the sky to capture clouds at as many different scattering angles as possible. Once we got all the data back, we were able to constrain the dominant ice crystal shapes in the clouds based upon this thing called the phase function, which tells you how these clouds scatter light and radiation. I was the lead author on the research paper that came from that, and it got accepted. We started working on this right when I was really new to the mission, and it was my first paper. I couldn't believe everyone wasn't, like, "Who the heck are you? Why are we going to let you do anything?" But everyone was so kind, and it was just such a great experience.

What was the hardest part about writing that first paper?

The hardest part was probably just getting over the fear of thinking people aren't going to listen to you or you aren't going to be smart enough or you won't be able to answer questions. It was really just getting over my own fears and worries and not holding myself back because of them. I have a really great mentor who pushed me to do all these things, so I was able to suck it up and say, "If he believes in me and he thinks I can do it, maybe he's right." Every time I did a presentation or I would talk about the observation or try to advocate for it, I was just met with such positivity that I was, like, "OK, these fears are rooted in nothing."

In July, you're coming to JPL for your first internship here. What will you be working on?

Yes, I'm so excited! I'll be working with two scientists, Michael Mischna and Manuel de la Torre Juarez. We're going to be working with the Rover Environmental Monitoring Station, or REMS, which is an instrument on Curiosity that measures the temperature, relative humidity and pressure around the rover on Mars. From those measurements, we're going to try to infer the presence of clouds at night. So far, the way we've used Curiosity to study clouds is with optical instruments [or cameras]. So we take pictures of the clouds. But that's not really something we can do at night. So using REMS and its temperature sensors at night, we can try to see if clouds around the rover are emitting infrared radiation, heating up the atmosphere around the rover. We can try to detect them that way. So that's what we're going to try to do – look for some patterns and see what we can come up with. We'll also be comparing what we find with data from NASA's Mars Climate Sounder, which is in orbit around Mars and takes nighttime measurements of the atmosphere.

What are you most excited about coming to JPL?

I would be lying if I said it wasn't just getting to come to a NASA center – especially as a Canadian. It's every little space enthusiast's dream. I'm also excited to meet all the people who I've been working with for the last two years. The people are such an awesome part of this mission that I've been a part of. So I'm looking forward to meeting them in person and working with them in a closer way.

What do you see as the ultimate goal of your research?

We're just trying to better understand Mars. It's kind of a crazy place. There is a lot of evidence that shows us that there's a lot more going on than we know now and it's just about trying to put the pieces of the puzzle together. There are also a lot of similarities to Earth. So we can try to take what we learn about Mars and apply it to our planet as well.

What's your ultimate career goal?

What I would really love is to work in spacecraft operations. I absolutely love working in science and working with data, but getting a chance to be a part of this mission and do operations – be part of a team and do multidisciplinary work – it's so exciting, and it's something that I never thought that I'd get to experience. And now that I've had a bit of a taste, I'm wanting more. So that's what I'm hoping for in the future.

Do you ever think about how you moved away from studying photography but are using photography to do science on Mars?

Yes! Every once in a while, that hits me, and I think to myself, "That's so cool." It's just very, very cool. Ten years ago, I never thought I'd be where I am now. But also just to know that there's that connection, that I'm working with visual data, with optical data – I don't think it's a coincidence. I really love working with images, so I think it's pretty cool that I get to do that.

Just one last fun question: If you could travel to any place in space, where would you go and what would you do there?

Without a doubt, it would have to be [Saturn's moon] Titan. I actually would probably go there to study the atmosphere. The first research project that I ever did was trying to find methane and ethane fog on Titan and the surface data was quite limited, so I would like to go there. I want to see water-ice rocks. I want to see methane lakes and methane rain, set up a little vacation spot there [laughs].


Explore JPL’s summer and year-round internship programs and apply at: https://www.jpl.nasa.gov/edu/intern

The laboratory’s STEM internship and fellowship programs are managed by the JPL Education Office. Extending the NASA Office of Education’s reach, JPL Education seeks to create the next generation of scientists, engineers, technologists and space explorers by supporting educators and bringing the excitement of NASA missions and science to learners of all ages.

TAGS: Higher Education, College, Internships, Interns, Students, Science, Mars, Rovers, Weather

  • Kim Orr

The Millennium Falcon takes on TIE fighters in a scene from 'Star Wars: The Force Awakens.'

This feature was originally published on May 3, 2016.


In the News

What do "Star Wars," NASA's Dawn spacecraft and Newton's Laws of Motion have in common? An educational lesson that turns science fiction into science fact using spreadsheets – a powerful tool for developing the scientific models addressed in the Next Generation Science Standards.

The TIE (Twin Ion Engine) fighter is a staple of the "Star Wars" universe. Darth Vader flew one in "A New Hope." Poe Dameron piloted one in "The Force Awakens." And many, many Imperial pilots met their fates in them. While the fictional TIE fighters in "Star Wars" flew a long time ago in a galaxy far, far away, ion engines are a reality in this galaxy today – and have a unique connection to NASA’s Jet Propulsion Laboratory.

The first spacecraft to use an ion engine was Deep Space 1, launched in 1998, which flew by asteroid 9969 Braille and comet Borrelly. Fueled by the success of Deep Space 1, engineers at JPL set forth to develop the next spacecraft that would use ion propulsion. This mission, called Dawn, would take ion-powered spacecraft to the next level, going into orbit twice – around the two largest objects in the asteroid belt: Vesta and Ceres.

How Does It Work?

Ion engines rely on two principles that Isaac Newton first described in 1687. First, a positively charged atom (ion) is pushed out of the engine at a high velocity. Newton’s Third Law of Motion states that for every action there is an equal and opposite reaction, so a small force pushes back on the spacecraft in the opposite direction – forward! According to Newton’s Second Law of Motion, there is a relationship between the force (F) exerted on an object, its mass (m) and its acceleration (a). The equation F=ma describes that relationship and tells us that the small force applied to the spacecraft by the exiting atom provides a small amount of acceleration to the spacecraft. Push enough atoms out, and you'll get enough acceleration to really speed things up.
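To make those numbers concrete, here is a quick sketch in Python. The thrust and spacecraft mass below are rough, Dawn-like values chosen for illustration – they are assumptions, not official mission figures.

```python
# Illustrative numbers only: thrust and mass are rough, Dawn-like values,
# not official mission figures.
thrust_n = 0.09      # ion engine thrust in newtons (~90 millinewtons)
mass_kg = 1200.0     # spacecraft mass in kilograms

# Newton's Second Law, rearranged: a = F / m
accel = thrust_n / mass_kg          # m/s^2
print(f"acceleration: {accel:.2e} m/s^2")

# The acceleration is tiny, but applied continuously it adds up.
# Velocity gained over 30 days of constant thrust: dv = a * t
seconds = 30 * 24 * 3600
dv = accel * seconds
print(f"speed gained in 30 days: {dv:.0f} m/s")
```

Even though each exiting ion contributes almost nothing, a month of continuous thrust at these assumed values yields a speed change of a few hundred meters per second – the "speed things up" effect described above.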


Why Is It Important?

Compared with traditional chemical rockets, ion propulsion is faster, cheaper and safer:

  • Faster: Spacecraft powered by ion engines can reach speeds of up to 90,000 meters per second (more than 201,000 mph!)
  • Cheaper: When it comes to fuel efficiency, ion engines can reach more than 90 percent fuel efficiency, while chemical rockets are only about 35 percent efficient.
  • Safer: Ion thrusters are fueled by inert gases. Most of them use xenon, which is a non-toxic, chemically inert (no risk of exploding), odorless, tasteless and colorless gas.

These properties make ion propulsion a very attractive solution when engineers are designing spacecraft. While not every spacecraft can use ion propulsion – some need greater rates of acceleration than ion propulsion can provide – the number and types of missions using these efficient engines are growing. In addition to being used on the Dawn spacecraft and communication satellites orbiting Earth, ion propulsion could be used to boost the International Space Station into higher orbits and will likely be a part of many future missions exploring our own solar system.

Teach It

Newton’s Laws of Motion are an important part of middle and high school physical science and are addressed specifically by the Next Generation Science Standards as well as Common Core Math standards. The lesson "Ion Propulsion: Using Spreadsheets to Model Additive Velocity" lets students study the relationship between force, mass and acceleration as described by Newton's Second Law as they develop spreadsheet models that apply those principles to real-world situations.
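The lesson's spreadsheet model boils down to adding the velocity gained during each time step, row by row. Here is a minimal Python sketch of that same additive-velocity idea; the thrust and mass values are made up for illustration and are not taken from the lesson.

```python
# A minimal stand-in for the lesson's spreadsheet model: each "row" adds
# the velocity gained during one time step (v = v + a * dt).
# Thrust and mass values are illustrative, not from the lesson.
thrust_n = 0.09
mass_kg = 1200.0
dt = 86400.0                 # one-day time step, in seconds
accel = thrust_n / mass_kg   # Newton's Second Law: a = F / m

velocity = 0.0
rows = []
for day in range(1, 11):          # model 10 days of continuous thrust
    velocity += accel * dt        # additive velocity, one row per day
    rows.append((day, round(velocity, 2)))

print(rows[-1])   # (10, 64.8)
```

Each tuple plays the role of one spreadsheet row, which is exactly the structure students build when they model additive velocity cell by cell.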

This lesson meets the following Next Generation Science and Common Core Math Standards:

NGSS Standards:

  • MS-PS2-2: Plan an investigation to provide evidence that the change in an object’s motion depends on the sum of the forces on the object and the mass of the object.
  • HS-PS2-1: Analyze data to support the claim that Newton’s second law of motion describes the mathematical relationship among the net force on a macroscopic object, its mass, and its acceleration.
  • HS-PS2-2: Use mathematical representations to support the claim that the total momentum of a system of objects is conserved when there is no net force on the system.

Common Core Math Standards:

  • Grade 8: Expressions and Equations A.4: Perform operations with numbers expressed in scientific notation, including problems where both decimal and scientific notation are used. Use scientific notation and choose units of appropriate size for measurements of very large or very small quantities (e.g., use millimeters per year for seafloor spreading). Interpret scientific notation that has been generated by technology.
  • High School: Algebra CED.A.4: Rearrange formulas to highlight a quantity of interest, using the same reasoning as in solving equations.
  • High School: Functions LE.A: Construct and compare linear, quadratic, and exponential models and solve problems.
  • High School: Functions BF.A.1: Write a function that describes a relationship between two quantities.
  • High School: Statistics and Probability ID.C: Interpret linear models.
  • High School: Number and Quantity Q.A.1: Use units as a way to understand problems and to guide the solution of multi-step problems; choose and interpret units consistently in formulas; choose and interpret the scale and the origin in graphs and data displays.

Explore More

TAGS: May the Fourth, Star Wars Day, F=ma, ion propulsion, Dawn, Deep Space 1, lesson, classroom activity, NGSS, Common Core Math

  • Lyle Tavernier

A glowing, orange ring outlines a black hole.

In the News

Accomplishing what was previously thought to be impossible, a team of international astronomers has captured an image of a black hole’s silhouette. Evidence for the existence of black holes – mysterious places in space where nothing, not even light, can escape – has been around for quite some time, and astronomers have long observed the effects of these phenomena on their surroundings. In the popular imagination, it was thought that capturing an image of a black hole was impossible because an image of something from which no light can escape would appear completely black. For scientists, the challenge was how, from thousands or even millions of light-years away, to capture an image of the hot, glowing gas falling into a black hole. An ambitious team of international astronomers and computer scientists has managed to overcome both obstacles. Working for well over a decade, the team improved upon an existing radio astronomy technique for high-resolution imaging and used it to detect the silhouette of a black hole – outlined by the glowing gas that surrounds its event horizon, the precipice beyond which light cannot escape. Learning about these mysterious structures can help students understand gravity and the dynamic nature of our universe, all while sharpening their math skills.

How They Did It

Though scientists had theorized they could image black holes by capturing their silhouettes against their glowing surroundings, the ability to image an object so distant still eluded them. A team formed to take on the challenge, creating a network of telescopes known as the Event Horizon Telescope, or the EHT. They set out to capture an image of a black hole by improving upon a technique that allows for the imaging of far-away objects, known as Very Long Baseline Interferometry, or VLBI.

Telescopes of all types are used to see distant objects. The larger the diameter, or aperture, of the telescope, the more light it can gather and the higher its resolution (or ability to image fine details). To see details in objects that are far away and appear small and dim from Earth, we need to gather as much light as possible with very high resolution, so we need a telescope with a large aperture.

That’s why the VLBI technique was essential to capturing the black hole image. VLBI works by creating an array of smaller telescopes that can be synchronized to focus on the same object at the same time and act as a giant virtual telescope. In some cases, the smaller telescopes are also an array of multiple telescopes. This technique has been used to track spacecraft and to image distant cosmic radio sources, such as quasars.

More than a dozen antennas pointing forward sit on barren land surrounded by red and blue-purple mountains in the distance.

Making up one piece of the EHT array of telescopes, the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile has 66 high-precision antennas. Image credit: NRAO/AUI/NSF

The aperture of a giant virtual telescope such as the Event Horizon Telescope is as large as the distance between the two farthest-apart telescope stations – for the EHT, those two stations are at the South Pole and in Spain, creating an aperture that’s nearly the same as the diameter of Earth. Each telescope in the array focuses on the target, in this case the black hole, and collects data from its location on Earth, providing a portion of the EHT’s full view. The more telescopes in the array that are widely spaced, the better the image resolution.
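To see why a planet-sized aperture matters, we can estimate the diffraction-limited resolution with the standard formula θ ≈ 1.22λ/D. The values below are rough assumptions: the EHT observed at a wavelength of about 1.3 millimeters, and the virtual aperture is taken to be roughly Earth's diameter.

```python
import math

# Diffraction-limited angular resolution: theta ≈ 1.22 * wavelength / aperture.
# Approximate values: EHT observing wavelength ~1.3 mm; virtual aperture
# roughly equal to Earth's diameter.
wavelength_m = 1.3e-3
aperture_m = 1.2742e7          # Earth's diameter in meters

theta_rad = 1.22 * wavelength_m / aperture_m
theta_microarcsec = math.degrees(theta_rad) * 3600 * 1e6
print(f"resolution: ~{theta_microarcsec:.0f} microarcseconds")
```

A few tens of microarcseconds – fine enough, in principle, to resolve the event-horizon-scale silhouette of M87*, which no single physical telescope could do.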

This video shows the global network of radio telescopes in the EHT array that performed observations of the black hole in the galaxy M87. Credit: C. Fromm and L. Rezzolla (Goethe University Frankfurt)/Black Hole Cam/EHT Collaboration

To test VLBI for imaging a black hole and a number of computer algorithms for sorting and synchronizing data, the Event Horizon Telescope team decided on two targets, each offering unique challenges.

The closest supermassive black hole to Earth, Sagittarius A*, interested the team because it is in our galactic backyard – at the center of our Milky Way galaxy, 26,000 light-years (156 quadrillion miles) away. (An asterisk is the astronomical standard for denoting a black hole.) Though not the only black hole in our galaxy, it is the black hole that appears largest from Earth. But its location in the same galaxy as Earth meant the team would have to look through “pollution” caused by stars and dust to image it, meaning there would be more data to filter out when processing the image. Nevertheless, because of the black hole’s local interest and relatively large size, the EHT team chose Sagittarius A* as one of its two targets.

An image showing a smattering of orange stars against the black backdrop of space with a small black circle in the middle and a rectangle identifying the location of the M87 black hole.

A close-up image of the core of the M87 galaxy, imaged by the Chandra X-ray Observatory. Image credit: NASA/CXC/Villanova University/J. Neilsen

A blue jet extends from a bright yellow point surrounded by smaller yellow stars.

This image from NASA's Hubble Space Telescope shows a jet of subatomic particles streaming from the center of M87*. Image credits: NASA and the Hubble Heritage Team (STScI/AURA)

The second target was the supermassive black hole M87*. One of the largest known supermassive black holes, M87* is located at the center of the gargantuan elliptical galaxy Messier 87, or M87, 53 million light-years (318 quintillion miles) away. Substantially more massive than Sagittarius A*, which contains 4 million solar masses, M87* contains 6.5 billion solar masses. One solar mass is equivalent to the mass of our Sun, approximately 2x10^30 kilograms. In addition to its size, M87* interested scientists because, unlike Sagittarius A*, it is an active black hole, with matter falling into it and spewing out in the form of jets of particles that are accelerated to velocities near the speed of light. But its distance made it even more of a challenge to capture than the relatively local Sagittarius A*. As described by Katie Bouman, a computer scientist with the EHT who led development of one of the algorithms used to sort telescope data during the processing of the historic image, it’s akin to capturing an image of an orange on the surface of the Moon.
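From a black hole's mass, the size of its event horizon follows from the Schwarzschild radius formula, R_s = 2GM/c². A quick check using the masses quoted above (the physical constants are rounded):

```python
# Schwarzschild radius: R_s = 2 * G * M / c^2.
# Uses the masses quoted in the text (1 solar mass ≈ 2e30 kg);
# constants are rounded.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
solar_mass_kg = 2e30

m87_mass = 6.5e9 * solar_mass_kg     # 6.5 billion solar masses
r_s = 2 * G * m87_mass / c**2
print(f"M87* Schwarzschild radius: {r_s:.1e} m")
```

The result, on the order of 10^13 meters, is enormous in absolute terms – yet at 53 million light-years away it subtends only tens of microarcseconds on the sky, which is why imaging it took a planet-sized virtual telescope.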

By 2017, the EHT was a collaboration of eight sites around the world – and more have been added since then. Before the team could begin collecting data, they had to find a time when the weather was likely to be conducive to telescope viewing at every location. For M87*, the team tried for good weather in April 2017 and, of the 10 days chosen for observation, a whopping four days were clear at all eight sites!

Each telescope used for the EHT had to be highly synchronized with the others to within a fraction of a millimeter using an atomic clock locked onto a GPS time standard. This degree of precision makes the EHT capable of resolving objects about 4,000 times better than the Hubble Space Telescope. As each telescope acquired data from the target black hole, the digitized data and time stamp were recorded on computer disk media. Gathering data for four days around the world gave the team a substantial amount of data to process. The recorded media were then physically transported to a central location because the amount of data, around 5 petabytes, exceeds what current internet speeds can handle. At this central location, data from all eight sites were synchronized using the time stamps and combined to create a composite set of images, revealing the never-before-seen silhouette of M87*’s event horizon. The team is also working on generating an image of Sagittarius A* from additional observations made by the EHT.
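A rough calculation shows why flying disks around the world beat uploading the data. The link speed below is an assumption chosen for illustration; the 5-petabyte figure comes from the text.

```python
# Back-of-the-envelope check: how long would 5 petabytes take to upload?
# Assumes a hypothetical sustained 1 gigabit-per-second link.
data_bytes = 5e15                  # ~5 petabytes
link_bps = 1e9                     # 1 Gbit/s, assumed

seconds = data_bytes * 8 / link_bps
days = seconds / 86400
print(f"~{days:.0f} days of continuous transfer")
```

Over a year of nonstop transfer at that assumed rate – whereas a crate of disks crosses the globe in days, a classic case of "never underestimate the bandwidth of a shipping container."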

This zoom video starts with a view of the ALMA telescope array in Chile and zooms in on the heart of M87, showing successively more detailed observations and culminating in the first direct visual evidence of a supermassive black hole’s silhouette. Credit: ESO/L. Calçada, Digitized Sky Survey 2, ESA/Hubble, RadioAstron, De Gasperin et al., Kim et al., EHT Collaboration. Music: Niklas Falcke

As more telescopes are added and the rotation of Earth is factored in, more of the image can be resolved, and we can expect future images to be higher resolution. But we might never have a complete picture, as Katie Bouman explains here (under “Imaging a Black Hole”).

To complement the EHT findings, several NASA spacecraft were part of a large effort to observe the black hole using different wavelengths of light. As part of this effort, NASA’s Chandra X-ray Observatory, Nuclear Spectroscopic Telescope Array (NuSTAR) and Neil Gehrels Swift Observatory space telescope missions – all designed to detect different varieties of X-ray light – turned their gaze to the M87 black hole around the same time as the EHT in April 2017. NASA’s Fermi Gamma-ray Space Telescope was also watching for changes in gamma-ray light from M87* during the EHT observations. If the EHT observed changes in the structure of the black hole’s environment, data from these missions and other telescopes could be used to help figure out what was going on.

Though NASA observations did not directly trace out the historic image, astronomers used data from the Chandra and NuSTAR satellites to measure the X-ray brightness of M87*’s jet. Scientists used this information to compare their models of the jet and disk around the black hole with the EHT observations. Other insights may come as researchers continue to pore over these data.

Why It's Important

Learning about mysterious structures in the universe provides insight into physics and allows us to test observation methods and theories, such as Einstein’s theory of general relativity. Massive objects deform spacetime in their vicinity, and although general relativity has been directly proven accurate for smaller-mass objects, such as Earth and the Sun, it has not yet been directly proven for black holes and other regions containing dense matter.

One of the main results of the EHT black hole imaging project is a more direct calculation of a black hole’s mass than ever before. Using the EHT, scientists were able to directly observe and measure the radius of M87*’s event horizon, or its Schwarzschild radius, and compute the black hole’s mass. That estimate was close to the one derived from a method that uses the motion of orbiting stars – thus validating it as a method of mass estimation.

The size and shape of a black hole, which depend on its mass and spin, can be predicted from general relativity equations. General relativity predicts that this silhouette would be roughly circular, but other theories of gravity predict slightly different shapes. The image of M87* shows a circular silhouette, thus lending credibility to Einstein’s theory of general relativity near black holes.

An illustration of a black hole surrounded by a bright, colorful swirl of material. Text describes each part of the black hole and its surroundings.

This artist’s impression depicts a rapidly spinning supermassive black hole surrounded by an accretion disc. Image credit: ESO

The data also offer some insight into the formation and behavior of black hole structures, such as the accretion disk that feeds matter into the black hole and plasma jets that emanate from its center. Scientists have hypothesized about how an accretion disk forms, but they’ve never been able to test their theories with direct observation until now. Scientists are also curious about the mechanism by which some supermassive black holes emit enormous jets of particles traveling at near light-speed.

These questions and others will be answered as more data is acquired by the EHT and synthesized in computer algorithms. Be sure to stay tuned for that and the next expected image of a black hole – our Milky Way’s own Sagittarius A*.

Teach It

Capture your students’ enthusiasm about black holes by challenging them to solve these standards-aligned math problems.

Model black-hole interaction with this NGSS-aligned lesson:

Explore More


Check out these related resources for students from NASA’s Space Place

TAGS: Black Hole, Teachable Moments, Science, K-12 Education, Teachers, Educators

  • Ota Lutz

Illustration of spacecraft against a starry background

Update: March 15, 2019 – The answers to the 2019 NASA Pi Day Challenge are here! View the illustrated answer key


In the News

The excitement of Pi Day – and our annual excuse to chow down on pie – is upon us! The holiday celebrating the mathematical constant pi arrives on March 14, and with it comes the sixth installment of the NASA Pi Day Challenge from the Jet Propulsion Laboratory’s Education Office. This challenge gives students in grades 6-12 a chance to solve four real-world problems faced by NASA scientists and engineers. (Even if you’re done with school, they’re worth a try for the bragging rights.)

https://www.jpl.nasa.gov/edu/teach/activity/pi-in-the-sky-6/

Visit the "Pi in the Sky 6" lesson page to explore classroom resources and downloads for the 2019 NASA Pi Day Challenge. Image credit: NASA/JPL-Caltech/Kim Orr

Why March 14?

Pi, the ratio of a circle’s circumference to its diameter, is what is known as an irrational number. As an irrational number, its decimal representation never ends, and it never repeats. Though it has been calculated to trillions of digits, we use far fewer at NASA. In fact, 3.14 is a good approximation, which is why March 14 (or 3/14 in U.S. month/day format) came to be the date that we celebrate this mathematical marvel.
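One way to see how good an approximation 3.14 is: compare the circumference of an Earth-sized circle computed with 3.14 against one computed at full double precision. The radius below is approximate.

```python
import math

# How much error does truncating pi introduce? For a circle the size of
# Earth's equator (radius ~6,378 km), compare the two circumferences.
radius_m = 6.378e6

c_full = 2 * math.pi * radius_m
c_approx = 2 * 3.14 * radius_m
error_km = abs(c_full - c_approx) / 1000
print(f"error using 3.14: ~{error_km:.0f} km")
```

A couple dozen kilometers of error on a planet-sized circle – negligible for a classroom estimate, which is why a few digits of pi go a long way.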

The first-known Pi Day celebration occurred in 1988. In 2009, the U.S. House of Representatives passed a resolution designating March 14 as Pi Day and encouraging teachers and students to celebrate the day with activities that teach students about pi.

The 2019 Challenge

This year’s NASA Pi Day Challenge features four planetary puzzlers that show students how pi is used at the agency. The challenges involve weathering a Mars dust storm, sizing up a shrinking storm on Jupiter, estimating the water content of a rain cloud on Earth and blasting ice samples with lasers!

› Take on the 2019 NASA Pi Day Challenge!

The Science Behind the Challenge

In late spring of 2018, a dust storm began stretching across Mars and eventually nearly blanketed the entire planet in thick dust. Darkness fell across Mars’ surface, blocking the vital sunlight that the solar-powered Opportunity rover needed to survive. It was the beginning of the end for the rover’s 15-year mission on Mars. At its height, the storm covered all but the peak of Olympus Mons, the largest known volcano in the solar system. In the Deadly Dust challenge, students must use pi to calculate what percentage of the Red Planet was covered by the dust storm.

The Terra satellite, orbiting Earth since 1999, uses the nine cameras on its Multi-Angle Imaging SpectroRadiometer, or MISR, instrument to provide scientists with unique views of Earth, returning data about atmospheric particles, land-surface features and clouds. Estimating the amount of water in a cloud, and the potential for rainfall, is serious business. Knowing how much rain may fall in a given area can help residents and first responders prepare for emergencies like flooding and mudslides. In Cloud Computing, students can use their knowledge of pi and geometric shapes to estimate the amount of water contained in a cloud.
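The heart of the Cloud Computing idea is a volume-times-density estimate. Here is a toy version in Python that models the cloud as a cylinder; every number below is invented for illustration and is not the challenge's data.

```python
import math

# Toy cloud-water estimate: model the cloud as a cylinder and multiply its
# volume by a liquid water content. All values are made up for illustration.
radius_m = 1000.0          # assumed cloud radius
height_m = 500.0           # assumed cloud thickness
lwc_g_per_m3 = 0.5         # assumed liquid water content, grams per cubic meter

volume_m3 = math.pi * radius_m**2 * height_m   # cylinder volume: pi * r^2 * h
water_kg = volume_m3 * lwc_g_per_m3 / 1000     # grams -> kilograms
print(f"estimated water: ~{water_kg:.2e} kg")
```

Hundreds of tonnes of water in even a modest cloud – the same pi-and-geometry reasoning students apply, with real data, in the challenge itself.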

Jupiter’s Great Red Spot, a giant storm that has been fascinating observers since the early 19th century, is shrinking. The storm has been continuously observed since the 1830s, but measurements from spacecraft like Voyager, the Hubble Space Telescope and Juno indicate the storm is getting smaller. How much smaller? In Storm Spotter, students can determine the answer to that very question faced by scientists.
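
One way students might tackle a Storm Spotter-style comparison is to approximate the storm as an ellipse (area = π × a × b) at two different times. The dimensions below are round, illustrative figures, not official measurements:

```python
import math

# Illustrative dimensions (assumptions): the Great Red Spot as an ellipse,
# with rough historical and modern width/height figures for the sketch.
old_width_km, old_height_km = 40_000, 14_000   # ~19th-century estimate
new_width_km, new_height_km = 16_000, 12_000   # ~modern estimate

old_area = math.pi * (old_width_km / 2) * (old_height_km / 2)  # ellipse area = pi*a*b
new_area = math.pi * (new_width_km / 2) * (new_height_km / 2)

shrinkage = 100 * (old_area - new_area) / old_area
print(f"The storm's area shrank by about {shrinkage:.0f}%")
```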

Scientists studying ices found in space, such as comets, want to understand what they’re made of and how they interact and react with the environment around them. To see what molecules may form in space when a comet comes into contact with solar wind or sunlight, scientists place an ice sample in a vacuum and then expose it to electrons or ultraviolet photons. Scientists have analyzed samples in the lab and detected molecules that were later observed in space on comet 67P/Churyumov-Gerasimenko. To analyze the lab samples, an infrared laser is aimed at the ice, causing it to explode. But the ice will explode only if the laser is powerful enough. Scientists use pi to figure out how strong the laser needs to be to explode the sample – and students can do the same when they solve the Icy Intel challenge.
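
The underlying geometry is that a laser’s punch depends on the energy it delivers per unit area of its circular beam spot, which is where pi comes in. Every number in this sketch is a made-up placeholder, not a real lab value:

```python
import math

# Hypothetical values for illustration only: a laser pulse's fluence
# (energy per unit area) must exceed some threshold for the ice to explode.
pulse_energy_j = 0.001        # 1 millijoule per pulse (assumed)
spot_radius_m = 50e-6         # 50-micron beam spot (assumed)
threshold_j_per_m2 = 100      # hypothetical explosion threshold

spot_area = math.pi * spot_radius_m**2   # area of the circular beam spot
fluence = pulse_energy_j / spot_area     # energy delivered per unit area

print(f"Fluence: {fluence:.3g} J/m^2 -> sample explodes: {fluence > threshold_j_per_m2}")
```

Shrinking the spot radius raises the fluence quadratically, which is why focusing the beam matters as much as raw pulse energy.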

Explore More

Participate

Join the conversation and share your Pi Day Challenge answers with @NASAJPL_Edu on social media using the hashtag #NASAPiDayChallenge


TAGS: Pi Day, K-12, STEM, Science, Engineering, Technology, Math, Pi, Educators, Teachers, Informal Education, Museums

  • Lyle Tavernier

In the News

This summer, a global dust storm encircled Mars, blocking much of the vital solar energy that NASA’s Opportunity rover needs to survive. After months of listening for a signal, the agency has declared that the longest-lived rover to explore Mars has come to the end of its mission. Originally slated for a three-month mission, the Opportunity rover lived a whopping 14.5 years on Mars. Opportunity beat the odds many times while exploring the Red Planet, returning an abundance of scientific data that paved the way for future exploration.

Scientists and engineers are celebrating this unprecedented mission success, still analyzing data collected during the past decade and a half and applying lessons learned to the design of future spacecraft. For teachers, this historic mission provides lessons in engineering design, troubleshooting and scientific discovery.

How They Did It

Launched in 2003, the twin Mars Exploration Rovers, Spirit and Opportunity, landed in early 2004 and were the second generation of rovers to explore our neighboring planet.

Preceded by the small Sojourner rover in 1997, Spirit and Opportunity were substantially larger, weighing about 400 pounds, or 185 kilograms, on Earth (150 pounds, or 70 kilograms, on Mars) and standing about 5 feet tall. The solar-powered rovers were designed for a mission lasting 90 sols, or Mars days, during which they would look for evidence of water on the seemingly barren planet.

Dust in the Wind

Scientists and engineers always hope a spacecraft will outlive its designed lifetime, and the Mars Exploration Rovers did not disappoint. Engineers at NASA’s Jet Propulsion Laboratory in Pasadena, California, expected the lifetime of these sun-powered robots to be limited by dust accumulating on the rovers’ solar panels. As expected, power input to the rovers slowly decreased as dust settled on the panels and blocked some of the incoming sunlight. However, the panels were “cleaned” accidentally when seasonal winds blew off the dust. Several times during the mission, power levels were restored to pre-dusty conditions. Because of these events, the rovers were able to continue their exploration much longer than expected with enough power to continue running all of their instruments.

Side-by-side images of Opportunity on Mars, showing dust on its solar panels and then relatively clean solar panels

A self-portrait of NASA's Mars Exploration Rover Opportunity taken in late March 2014 (right) shows that much of the dust on the rover's solar arrays was removed since a similar portrait from January 2014 (left). Image Credit: NASA/JPL-Caltech/Cornell Univ./Arizona State Univ. | › Full image and caption

Terrestrial Twin

To troubleshoot and overcome challenges during the rovers’ long mission, engineers would perform tests on a duplicate model of the spacecraft, which remained on Earth for just this purpose. One such instance was in 2005, when Opportunity got stuck in the sand. Its right front wheel dug into loose sand, reaching to just below its axle. Engineers and scientists worked for five weeks to free Opportunity, first using images and spectroscopy obtained by the rover’s instruments to recreate the sand trap on Earth and then placing the test rover in the exact same position as Opportunity. The team eventually found a way to get the test rover out of the sand trap. Engineers tested their commands repeatedly with consistent results, giving them confidence in their solution. The same commands were relayed to Opportunity through NASA’s Deep Space Network, and the patient rover turned its stuck wheel just the right amount and backed out of the trap that had ensnared it for over a month, enabling the mission to continue.

Engineers test moves on a model of the Opportunity rover in the In-Situ Instrument Laboratory at JPL

Inside the In-Situ Instrument Laboratory at JPL, rover engineers check how a test rover moves in material chosen to simulate some difficult Mars driving conditions. | › Full image and caption

A few years later, in 2009, Spirit wasn’t as lucky. Having already sustained some wheel problems, Spirit got stuck on a slope in a position that would not be favorable for the Martian winter. Engineers were not able to free Spirit before winter took hold, denying the rover adequate sunlight for power. Its mission officially ended in 2011. Meanwhile, despite a troubled shoulder joint on its robotic arm that first started showing wear in 2006, Opportunity continued exploring the Red Planet. It wasn’t until a dust storm completely enveloped Mars in the summer of 2018 that Opportunity finally succumbed to the elements.

The Final Act

animation showing a dust storm moving across Mars

This set of images from NASA’s Mars Reconnaissance Orbiter (MRO) shows a giant dust storm building up on Mars in 2018, with rovers on the surface indicated as icons. Image credit: NASA/JPL-Caltech/MSSS | › Full image and caption

simulated views of the sun as the 2018 dust storm darkened from Opportunity's perspective on Mars

This series of images shows simulated views of a darkening Martian sky blotting out the Sun from NASA’s Opportunity rover’s point of view in the 2018 global dust storm. Each frame corresponds to a tau value, or measure of opacity: 1, 3, 5, 7, 9, 11. Image credit: NASA/JPL-Caltech/TAMU | › Full image and caption

Dust storm season on Mars can be treacherous for solar-powered rovers because if they are in the path of a dust storm, their access to sunlight can be obstructed for months on end, longer than their batteries can sustain them. Though several dust storms occurred on Mars during the reign of the Mars Exploration Rovers, 2018 brought a large, thick dust storm that covered the entire globe and cut off Opportunity’s access to sunlight for four months. Only the caldera of Olympus Mons, the largest known volcano in the solar system, peeked out above the dust.

The opacity, or “thickness,” of the dust in Mars’ atmosphere is denoted by the Greek letter tau. The higher the tau, the less sunlight is available to charge a surface spacecraft’s batteries. An average tau for Opportunity’s location is 0.5. The tau at the peak of the 2018 dust storm was 10.8. This thick dust was imaged and measured by the Curiosity Mars rover on the opposite side of the planet. (Curiosity is powered by a radioisotope thermoelectric generator, so it does not depend on sunlight.)
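
To get a feel for what those tau values mean, one can apply the standard exponential attenuation law (I = I₀e^−τ). This is a simplified model of how much direct sunlight penetrates the dust, not the mission’s exact calculation:

```python
import math

# Direct sunlight transmitted through dust of opacity tau, using the standard
# exponential attenuation law I = I0 * exp(-tau). A simplification: it ignores
# light scattered back into view by the dust itself.
for tau in (0.5, 10.8):
    fraction = math.exp(-tau)
    print(f"tau = {tau:>4}: {fraction:.2%} of direct sunlight gets through")
```

At tau 0.5 roughly 60 percent of direct sunlight still reaches the ground; at tau 10.8 the fraction drops to a few thousandths of a percent, far too little for a solar-powered rover.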

Since the last communication with Opportunity on June 10, 2018, NASA has sent more than 1,000 commands to the rover that have gone unanswered. Each of these commands was an attempt to get Opportunity to send back a signal saying it was alive. A last-ditch effort to reset the rover’s mission clock was met with silence.

Why It’s Important

The Mars Exploration Rovers were designed to give a human-height perspective of Mars, using panoramic cameras approximately 5 feet off the surface, while their science instruments investigated Mars’ surface geology for signs of water. Spirit and Opportunity returned more than 340,000 raw images conveying the beauty of Mars and leading to scientific discoveries. The rovers brought Mars into classrooms and living rooms around the world. From curious geologic formations to dune fields, dust devils and even their own tracks on the surface of the Red Planet, the rovers showed us Mars in a way we had never seen it before.

tracks on Mars with a patch of white soil showing

This mosaic shows an area of disturbed soil made by the Spirit rover's stuck right front wheel. The trench exposed a patch of nearly pure silica, with the composition of opal. Image credit: NASA/JPL-Caltech/Cornell | › Full image and caption

Mineral vein on the surface of Mars

This color view of a mineral vein was taken by the Mars rover Opportunity on Nov. 7, 2011. Image credit: NASA/JPL-Caltech/Cornell/ASU | › Full image and caption

The rovers discovered that Mars was once a warmer, wetter world than it is today and was potentially able to support microbial life. Opportunity landed in a crater and almost immediately discovered deposits of hematite, a mineral that typically forms in the presence of water. During its travels across the Martian surface, Spirit found rocks rich in magnesium and iron carbonates that likely formed when Mars was warm and wet, and sustained a near-neutral pH environment hospitable to life. At one point, while dragging its malfunctioning wheel, Spirit excavated 90 percent pure silica lurking just below the sandy surface. On Earth, this sort of silica usually exists in hot springs or hot steam vents, where life as we know it often finds a happy home. Later in its mission, near the rim of Endeavour Crater, Opportunity found bright-colored veins of gypsum in the rocks. These veins likely formed when water flowed through underground fractures in the rocks, leaving behind deposits of calcium sulfate. All of these discoveries led scientists to believe that Mars was once more hospitable to life than it is today, and they laid the groundwork for future exploration.

Imagery from the Mars Reconnaissance Orbiter and Mars Odyssey, both orbiting the Red Planet, has been combined with surface views and data from the Mars Exploration Rovers for an unprecedented understanding of the planet’s geology and environment.

Not only did Spirit and Opportunity add to our understanding of Mars, but also the rovers set the stage for future exploration. Following in their tracks, the Curiosity rover landed in 2012 and is still active, investigating the planet’s surface chemistry and geology, and confirming the presence of past water. Launching in 2020 is the next Mars rover, currently named Mars 2020. Mars 2020 will be able to analyze soil samples for signs of past microbial life. It will carry a drill that can collect samples of interesting rocks and soils, and set them aside in a cache on the surface of Mars. In the future, those samples could be retrieved and returned to Earth by another mission. Mars 2020 will also do preliminary research for future human missions to the Red Planet, including testing a method of producing oxygen from Mars’ atmosphere.

It’s thanks to three generations of surface-exploring rovers coupled with the knowledge obtained by orbiters and stationary landers that we have a deeper understanding of the Red Planet’s geologic history and can continue to explore Mars in new and exciting ways.

Teach It

Use these standards-aligned lessons and related activities to get students doing engineering, troubleshooting and scientific discovery just like NASA scientists and engineers!

Explore More

Try these related resources for students from NASA’s Space Place

TAGS: K-12 Education, Teachers, Educators, Students, Opportunity, Mars rover, Rovers, Mars, Lessons, Activities, Missions

  • Ota Lutz

2019 Los Angeles Regional Science Bowl winners

After a full day of intense competition, a team of students from University High School in Irvine, California, earned first place in a regional round of the U.S. Department of Energy National Science Bowl on Jan. 26, 2019. This is the second consecutive year that the school has placed first in the regional round, and it's the 27th year that NASA's Jet Propulsion Laboratory in Pasadena, California, has hosted the competition.

› Read the full story on JPL News


TAGS: High School, Science Bowl, Student Competitions, Science, Events


The supermoon lunar eclipse captured as it moved over NASA’s Glenn Research Center on September 27, 2015.

In the News

Looking up at the Moon can create a sense of awe at any time, but those who do so on the evening of January 20 will be treated to the only total lunar eclipse of 2019. Visible for its entirety in North and South America, this eclipse is being referred to by some as a super blood moon – “super” because the Moon will be closest to Earth in its orbit during the full moon (more on supermoons here) and “blood" because the total lunar eclipse will turn the Moon a reddish hue (more on that below). This is a great opportunity for students to observe the Moon – and for teachers to make connections to in-class science content.

How It Works

Eclipses can occur when the Sun, the Moon and Earth align. Lunar eclipses can happen only during a full moon, when the Moon and the Sun are on opposite sides of Earth. At that point, the Moon can move into the shadow cast by Earth, resulting in a lunar eclipse. However, most of the time, the Moon’s slightly tilted orbit brings it above or below Earth’s shadow.

Watch on YouTube

The time period when the Moon, Earth and the Sun are lined up and on the same plane – allowing for the Moon to pass through Earth’s shadow – is called an eclipse season. Eclipse seasons last about 34 days and occur just shy of every six months. When a full moon occurs during an eclipse season, the Moon travels through Earth’s shadow, creating a lunar eclipse.

Graphic showing the alignment of the Sun, Earth and Moon when a full moon occurs during an eclipse season versus a non-eclipse season

When a full moon occurs during an eclipse season, the Moon travels through Earth's shadow, creating a lunar eclipse. Credit: NASA/JPL-Caltech | + Enlarge image

Unlike solar eclipses, which require special glasses to view and can be seen only for a few short minutes in a very limited area, a total lunar eclipse can be seen for about an hour by anyone on the nighttime side of Earth – as long as skies are clear.

What to Expect

The Moon passes through two distinct parts of Earth’s shadow during a lunar eclipse. The outer part of the cone-shaped shadow is called the penumbra. The penumbra is less dark than the inner part of the shadow because it’s penetrated by some sunlight. (You have probably noticed that some shadows on the ground are darker than others, depending on how much outside light enters the shadow; the same is true for the outer part of Earth’s shadow.) The inner part of the shadow, known as the umbra, is much darker because Earth blocks additional sunlight from entering the umbra.

At 6:36 p.m. PST (9:36 p.m. EST) on January 20, the edge of the Moon will begin entering the penumbra. The Moon will dim very slightly for the next 57 minutes as it moves deeper into the penumbra. Because this part of Earth’s shadow is not fully dark, you may notice only some dim shading (if anything at all) on the Moon near the end of this part of the eclipse.

Graphic showing the positions of the Moon, Earth and Sun during a partial lunar eclipse

During a total lunar eclipse, the Moon first enters into the penumbra, or the outer part of Earth's shadow, where the shadow is still penetrated by some sunlight. Credit: NASA | + Enlarge image

At 7:33 p.m. PST (10:33 p.m. EST), the edge of the Moon will begin entering the umbra. As the Moon moves into the darker shadow, significant darkening of the Moon will be noticeable. Some say that during this part of the eclipse, the Moon looks as if it has had a bite taken out of it. That “bite” gets bigger and bigger as the Moon moves deeper into the shadow.

The Moon as seen during a partial lunar eclipse

As the Moon starts to enter into the umbra, the inner and darker part of Earth's shadow, it appears as if a bite has been taken out of the Moon. This "bite" will grow until the Moon has entered fully into the umbra. Credit: NASA | + Enlarge image

At 8:41 p.m. PST (11:41 p.m. EST), the Moon will be completely inside the umbra, marking the beginning of the total lunar eclipse. The moment of greatest eclipse, when the Moon is halfway through the umbra, occurs at 9:12 p.m. PST (12:12 a.m. EST).

Graphic showing the Moon inside the umbra

The total lunar eclipse starts once the Moon is completely inside the umbra. The moment of greatest eclipse happens when the Moon is halfway through the umbra, as shown in this graphic. Credit: NASA | + Enlarge image

As the Moon moves completely into the umbra, something interesting happens: The Moon begins to turn reddish-orange. The reason for this phenomenon? Earth’s atmosphere. As sunlight passes through it, the small molecules that make up our atmosphere scatter blue light, which is why the sky appears blue. This leaves behind mostly red light that bends, or refracts, into Earth’s shadow. We can see the red light during an eclipse as it falls onto the Moon in Earth’s shadow. This same effect is what gives sunrises and sunsets a reddish-orange color.
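
That preferential scattering of blue light is Rayleigh scattering, whose strength scales roughly as 1/wavelength⁴. A quick comparison of representative blue and red wavelengths shows how lopsided the effect is:

```python
# Rayleigh scattering strength scales as 1/wavelength^4, so shorter (bluer)
# wavelengths are scattered out of sunlight much more strongly than longer
# (redder) ones. Wavelengths are representative values in nanometers.
blue_nm = 450
red_nm = 650

ratio = (red_nm / blue_nm) ** 4
print(f"Blue light is scattered about {ratio:.1f}x more strongly than red")
```

With most of the blue removed along the long path through Earth’s atmosphere, the light that refracts into the umbra and onto the Moon is predominantly red.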

The Moon as seen during a total lunar eclipse at the point of greatest eclipse

As the Moon moves completely into the umbra, it turns a reddish-orange color. Credit: NASA | + Enlarge image

A variety of factors affect the appearance of the Moon during a total lunar eclipse. Clouds, dust, ash, photochemical droplets and organic material in the atmosphere can change how much light is refracted into the umbra. Additionally, the January 2019 lunar eclipse takes place when the full moon is at or near the closest point in its orbit to Earth – a time popularly known as a supermoon. This means the Moon is deeper inside the umbra shadow and therefore may appear darker. The potential for variation provides a great opportunity for students to observe and classify the lunar eclipse based on its brightness. Details can be found in the “Teach It” section below.

At 9:43 p.m. PST (12:43 a.m. EST), the edge of the Moon will begin exiting the umbra and moving into the opposite side of the penumbra. This marks the end of the total lunar eclipse.

At 10:50 p.m. PST (1:50 a.m. EST), the Moon will be completely outside the umbra. It will continue moving out of the penumbra until the eclipse ends at 11:48 p.m. PST (2:48 a.m. EST).

What if it’s cloudy where you live? Winter eclipses always bring with them the risk of poor viewing conditions. If your view of the Moon is obscured by the weather, explore options for watching the eclipse online, such as the Time and Date live stream.

Why It’s Important

Lunar eclipses have long played an important role in understanding Earth and its motions in space.

In ancient Greece, Aristotle noted that the shadows on the Moon during lunar eclipses were round, regardless of where an observer saw them. He realized that only if Earth were a spheroid would its shadows be round – a revelation that he and others had many centuries before the first ships sailed around the world.

Earth wobbles on its axis like a spinning top that’s about to fall over, a phenomenon called precession. Earth completes one wobble, or precession cycle, over the course of 26,000 years. Greek astronomer Hipparchus made this discovery by comparing the position of stars relative to the Sun during a lunar eclipse to those recorded hundreds of years earlier. A lunar eclipse allowed him to see the stars and know exactly where the Sun was for comparison – directly opposite the Moon. If Earth didn’t wobble, the stars would appear to be in the same place they were hundreds of years earlier. When Hipparchus saw that the stars’ positions had indeed moved, he knew that Earth must wobble on its axis!

Lunar eclipses are also used for modern-day science investigations. Astronomers have used ancient eclipse records and compared them with computer simulations. These comparisons helped scientists determine the rate at which Earth’s rotation is slowing.

Teach It

Ask students to observe the lunar eclipse and evaluate the Moon’s brightness using the Danjon Scale of Lunar Eclipse Brightness. The Danjon scale illustrates the range of colors and brightness the Moon can take on during a total lunar eclipse, and it’s a tool observers can use to characterize the appearance of an eclipse. View the lesson guide below. After the eclipse, have students compare and justify their evaluations of the eclipse.

Use these standards-aligned lessons and related activities to get your students excited about the eclipse, Moon phases and Moon observations:

TAGS: Lunar Eclipse, Moon, Teachers, Educators, K-12 Education, Astronomy

  • Lyle Tavernier

This illustration shows the position of NASA's Voyager 1 and Voyager 2 probes, outside of the heliosphere, a protective bubble created by the Sun that extends well past the orbit of Pluto.

In the News

The Voyager 2 spacecraft, launched in 1977, has reached interstellar space, a region beyond the heliosphere – the protective bubble of particles and magnetic fields created by the Sun – where the only other human-made object is its twin, Voyager 1.

The achievement means new opportunities for scientists to study this mysterious region. And for educators, it’s a chance to get students exploring the scale and anatomy of our solar system, plus the engineering and math required for such an epic journey.

How They Did It

Launched just 16 days apart, Voyager 1 and Voyager 2 were designed to take advantage of a rare alignment of the outer planets that only occurs once every 176 years. Their trajectory took them by the outer planets, where they captured never-before-seen images. They were also able to steal a little momentum from Jupiter and Saturn that helped send them on a path toward interstellar space. This “gravity assist” gave the spacecraft a velocity boost without expending any fuel. Though both spacecraft were destined for interstellar space, they followed slightly different trajectories.
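
The essence of a gravity assist can be seen in a one-dimensional “slingshot” toy model: in the planet’s reference frame an ideal flyby conserves the spacecraft’s speed, so transforming back into the Sun’s frame leaves the spacecraft moving faster. The speeds below are approximate or assumed, and the geometry is far simpler than Voyager’s real trajectory:

```python
# One-dimensional gravity-assist toy model (an illustration, not Voyager's
# actual flyby geometry). In the planet's frame, an ideal head-on flyby
# reverses the spacecraft's velocity while conserving its speed.
v_planet = 13.1   # Jupiter's orbital speed in km/s (approximate)
v_in = 10.0       # spacecraft speed toward the planet, Sun's frame (assumed)

v_rel_in = v_in + v_planet    # speed relative to the planet on approach
v_rel_out = v_rel_in          # speed conserved in the planet's frame
v_out = v_rel_out + v_planet  # back in the Sun's frame, moving with the planet

print(f"Speed after flyby: {v_out:.1f} km/s (a gain of {v_out - v_in:.1f} km/s)")
```

In this idealized head-on case the spacecraft gains twice the planet’s orbital speed; the planet loses a correspondingly tiny amount of momentum, which is the “stealing” the article describes.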

Illustration of the trajectories of Voyager 1 and 2

An illustration of the trajectories of Voyager 1 and Voyager 2. Image credit: NASA/JPL-Caltech | + Expand image

Voyager 1 followed a path that enabled it to fly by Jupiter in 1979, discovering the gas giant’s rings. It continued on for a 1980 close encounter with Saturn’s moon Titan before a gravity assist from Saturn hurled it above the plane of the solar system and out toward interstellar space. After Voyager 2 visited Jupiter in 1979 and Saturn in 1981, it continued on to encounter Uranus in 1986, where it obtained another assist. Its last planetary visit before heading out of the solar system was Neptune in 1989, where the gas giant’s gravity sent the probe in a southward direction toward interstellar space. Since the end of its prime mission at Neptune, Voyager 2 has been using its onboard instruments to continue sensing the environment around it, communicating data back to scientists on Earth. It was this data that scientists used to determine Voyager 2 had entered interstellar space.

How We Know

Interstellar space, the region between the stars, is beyond the influence of the solar wind, charged particles emanating from the Sun, and before the influence of the stellar wind of another star. One hint that Voyager 2 was nearing interstellar space came in late August when the Cosmic Ray Subsystem, an instrument that measures cosmic rays coming from the Sun and galactic cosmic rays coming from outside our solar system, measured an increase in galactic cosmic rays hitting the spacecraft. Then on November 5, the instrument detected a sharp decrease in high energy particles from the Sun. That downward trend continued over the following weeks.

The data from the cosmic ray instrument provided strong evidence that Voyager 2 had entered interstellar space because its twin had returned similar data when it crossed the boundary of the heliosheath. But the most compelling evidence came from its Plasma Science Experiment – an instrument that had stopped working on Voyager 1 in 1980. Until recently, the space surrounding Voyager 2 was filled mostly with plasma flowing out from our Sun. This outflow, called the solar wind, creates a bubble, the heliosphere, that envelopes all the planets in our solar system. Voyager 2’s Plasma Science Experiment can detect the speed, density, temperature, pressure and flux of that solar wind. On the same day that the spacecraft’s cosmic ray instrument detected a steep decline in the number of solar energetic particles, the plasma science instrument observed a decline in the speed of the solar wind. Since that date, the plasma instrument has observed no solar wind flow in the environment around Voyager 2, which makes mission scientists confident the probe has entered interstellar space.

graph showing data from the cosmic ray and plasma science instruments on Voyager 2

This animated graph shows data returned from Voyager 2's cosmic ray and plasma science instruments, which provided the evidence that the spacecraft had entered interstellar space. Image credit: NASA/JPL-Caltech/GSFC | + Expand image

Though the spacecraft have left the heliosphere, Voyager 1 and Voyager 2 have not yet left the solar system, and won't be leaving anytime soon. The boundary of the solar system is considered to be beyond the outer edge of the Oort Cloud, a collection of small objects that are still under the influence of the Sun's gravity. The width of the Oort Cloud is not known precisely, but it is estimated to begin at about 1,000 astronomical units from the Sun and extend to about 100,000 AU. (One astronomical unit, or AU, is the distance from the Sun to Earth.) It will take about 300 years for Voyager 2 to reach the inner edge of the Oort Cloud and possibly 30,000 years to fly beyond it. By that time, both Voyager spacecraft will be completely out of the hydrazine fuel used to point them toward Earth (to send and receive data) and their power sources will have decayed beyond their usable lifetime.
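
Those travel-time figures are consistent with Voyager 2’s cruise speed of roughly 3.3 AU per year, which makes for a quick arithmetic check using the distances given above:

```python
# Sanity-checking the travel-time figures. Voyager 2's heliocentric speed
# is roughly 3.3 AU per year (an approximate value).
speed_au_per_year = 3.3
inner_oort_au = 1_000     # estimated inner edge of the Oort Cloud
outer_oort_au = 100_000   # estimated outer extent of the Oort Cloud

years_to_inner = inner_oort_au / speed_au_per_year
years_to_outer = outer_oort_au / speed_au_per_year

print(f"~{years_to_inner:.0f} years to the inner edge, "
      f"~{years_to_outer:.0f} years to pass beyond the outer edge")
```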

Why It’s Important

Since the Voyager spacecraft launched more than 40 years ago, no other NASA missions have encountered as many planets (some of which had never been visited) or continued making science observations from such great distances. Other spacecraft, such as New Horizons and Pioneer 10 and 11, will eventually make it to interstellar space, but we will have no data from them to confirm their arrival or explore the region because their instruments either have already shut off or will have shut off by then.

Watch on YouTube

Interstellar space is a region that’s still mysterious because until 2012, when Voyager 1 arrived there, no spacecraft had visited it. Now, data from Voyager 2 will help add to scientists’ growing understanding of the region. Scientists are hoping to continue using Voyager 2’s plasma science instrument to study the properties of the ionized gases, or plasma, that exist in the interstellar medium by making direct measurements of the plasma density and temperature. This new data may shed more light on the evolution of our solar neighborhood and will most certainly provide a window into the exciting unexplored region of interstellar space, improving our understanding of space and our place in it.

As power wanes on Voyager 2, scientists will have to make tough choices about which instruments to keep turned on. Further complicating the situation is the freezing cold temperature at which the spacecraft is currently operating – perilously close to the freezing point of its hydrazine fuel. But for as long as both Voyager spacecraft are able to maintain power and communication, we will continue to learn about the uncharted territory of interstellar space.

Teach It

Use these standards-aligned lessons and related activities to get students doing math and science with a real-world (and space!) connection.


TAGS: Teachers, Educators, Science, Engineering, Technology, Solar System, Voyager, Spacecraft, Educator Resources, Lessons, Activities

  • Ota Lutz

Michelle Vo poses for a photo in front of a full-size model of the Curiosity Mars rover at JPL.

Michelle Vo poses for a photo in front of a full-size model of the Curiosity Mars rover at JPL.

Until she discovered game development, Michelle Vo’s daydreams were a problem. She couldn’t focus in her computer science classes. Her grades were dipping. She wondered whether she was cut out to be a programmer or for school at all. So she took a break to make something just for fun, a self-help game. And help her, it did. Now focusing on virtual and augmented reality, Vo is back at school, studying not just computer science, but also cognitive science, linguistics and digital humanities. It’s a lot, but to create a virtual world, she says one has to first understand how people navigate the real one. This summer, at NASA’s Jet Propulsion Laboratory, the UCLA student applied her talents to VR and AR experiences that help scientists explore a totally different world, Mars. While Vo’s tendency to daydream hasn’t gone away, she now knows how to use the distractions for good; she turns them into VR inspiration.

What are you working on at JPL?

I've been working on multiple things. I work on this project called OnSight, which just won NASA Software of the Year, which we're really excited about. It's an AR project. ...

[A hummingbird flies past us and Michelle stops to point it out.]

Sorry, just got distracted. This is me on a daily basis, just distracted by everything, which is kind of how I work. I get distracted so easily, and it always takes my full attention. So if I get distracted by my work, it holds all my attention. I found out this year I have ADHD, which probably explains why I struggled so much in school. Oh gosh, this is distracting from the interview. [Laughs.] No, this is good for visibility. I struggled a lot in school because I was always distracted by my own daydreams. ADHD is often undetected in girls, since it’s not so much exhibited as fidgeting but, for me at least, spacing out and daydreaming. It was always really hard for me to focus if I wasn’t engaged. Thankfully, I’ve finally found a career where I can actually utilize that skill. My daydreaming is how I come up with my VR design ideas, and I’m so glad I can use it to help others. Maybe I didn't perform too well in school, but hey, look where I am now!

That's so great that you were able to channel it in that way. How did you go from struggling in school to doing VR?

When I first tried on a VR headset, I was like, "This is the future. I need to do whatever I can to learn about this." I decided to study computer science, but it was such a huge struggle. Not a lot of people know this, but I was on academic probation for a while. I had a 1.8 GPA at one point, because I was too shy to ask for help. I would get distracted and, overall, I felt discouraged. So I stopped studying computer science for a little bit.

When I took a break from school, I decided I wanted to try making a game. I wanted to do something just for fun, and I was determined to fix my bad habits. So with some friends, I created a self-help game at AthenaHacks, a women’s hackathon. For 24 hours, I was just immersed in my work. I had never felt that way about anything in my life, where I was just zoned in, in my own world, doing the thing I love. And that's when I realized, I think it's game development. I think this is what I want.

So I spent the year teaching myself [game development], and I got a lot more comfortable using the Unity game engine. I knew I eventually wanted to be a VR developer, so I saved up and invested in myself to learn the skills at Make School’s VR Summer Academy. That smaller learning environment opened up the world for me. It boosted my confidence more than anything to have the support I needed. I was like, "Maybe my grades aren’t so great, but I know how to make VR." And the world needs VR right now.

So when I went back to my university, I thought, "I'll try again. I'm going to go back to computer science." And so far so good. I'm into my fourth year at UCLA studying cognitive science, linguistics, computer science and digital humanities. It sounds like a lot, but they're all related in the sense that they're all connected to VR, because VR is mainly a study of the mind and how we perceive reality. It’s not so much about computer science; you really have to know more about humans to create good VR.

So I went from literally the worst student in the class to killing it at NASA. [Laughs.]

Sorry, ugh, that was a lot. Haha, I look at a bird and go off on a tangent. That's my life.

So going back to your JPL internship, how are you using your VR skills to help scientists and engineers?


Michelle Vo poses for a photo with InSight testbed lead Marleen Sundgaard. Image courtesy Michelle Vo

I’m interning in the Ops Lab, and the project I've been working on primarily is called OnSight. OnSight uses Microsoft’s HoloLens [mixed-reality headset] to simulate walking on Mars. Mars scientists use it to collaborate with each other. We had “Meet on Mars” this morning, actually. On certain days, Mars scientists will put on their headsets and hang out virtually on Mars. They see each other. They talk. They look at Mars rocks and take notes. It's based on images from the Curiosity Mars rover. We converted those images to 3-D models to create the virtual terrain, so through VR, we can simulate walking on Mars without being there.

For a few weeks, I worked on another project with the InSight Mars lander mission. We took the terrain model that's generated from images of [the landing site] and made it so the team could see that terrain on top of their testbed [at JPL] with a HoloLens. For them, that's important because they're trying to recreate the terrain to … Wait, I recorded this.

[Michelle quickly scans through the photo library on her phone and pulls up a video she recorded from JPL’s In-Situ Instruments Laboratory. Pranay Mishra, a testbed engineer for the InSight mission, stands in a simulated Mars landscape next to a working model of the lander and explains:]

“When InSight reaches Mars, we're going to get images of the terrain that we land on. The instruments will be deployed to that terrain, so we will want to practice those deployments in the testbed. One of the biggest things that affects our deployment ability is the terrain. If the terrain is tilted or there are rocks in certain spots, that all has a strong effect on our deployment accuracy. To practice it here, we want the terrain in the testbed to match the terrain on Mars. The only things we can view from Mars are the images that we get back [from the lander]. We want to put those into the HoloLens so that we can start terraforming, or “marsforming,” the testbed terrain to match the terrain on Mars. That way, we can maybe get a rough idea of what the deployment would look like on Mars by practicing it on Earth.”
Michelle Vo stands in the InSight testbed at JPL with testbed engineers Drew Penrod (left) and Pranay Mishra (right). Image courtesy Michelle Vo

› Learn more about how scientists and engineers are creating a version of InSight's Mars landing site on Earth

They already gave us photos of Mars, which they turned into a 3-D model. I created an AR project where you look through the HoloLens – looking at the real world – and the 3-D model is superimposed on the testbed. So the [testbed team] will shovel through and shape the terrain to match what it’s like on Mars, at InSight’s landing site.
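The idea behind "marsforming" the testbed can be sketched in miniature: compare a target terrain model against the current sandbox surface and see where sand needs to be added or removed. This is purely an illustrative sketch, not OnSight code; the height values and the 1-centimeter tolerance are made up for the example.

```python
# Hypothetical 2x2 height maps in meters (not real InSight data):
# `target` is the terrain from the landing-site model,
# `testbed` is the current state of the sandbox.
target = [[0.00, 0.05], [0.10, 0.02]]
testbed = [[0.02, 0.05], [0.00, 0.08]]

# Height to add (positive) or dig out (negative) at each point
# to make the testbed match the Mars terrain.
delta = [[t - b for t, b in zip(trow, brow)]
         for trow, brow in zip(target, testbed)]

# Flag points that are off by more than a 1 cm tolerance
# as still needing shaping.
needs_work = [[abs(d) > 0.01 for d in row] for row in delta]
```

In the real workflow, the HoloLens overlays the target model on the physical sandbox so the engineers can see these differences directly, rather than computing them point by point.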

Did you know that this was an area that you could work in at JPL before interning here?

OnSight was a well-known project in the VR/AR space, since it was the first project to use the Microsoft HoloLens. I remember being excited to see a talk on the project at the VRLA conference. So when I finally got on board with the team, I was super excited. I also realized that there’s room for improvement, and that’s OK. That’s why I'm here as an intern; I can bring in a fresh look.

One of the things I did on this project was incorporate physical controllers. My critique when I first started was, "This is really hard to use." And if it's hard for me to use as a millennial, how is this going to be usable for people of all ages? I'm always thinking in terms of accessibility for everybody. Through lots of testing, I realized that people need to be touching things, physical things. That's what OnSight lacked, a physical controller. There were a lot of things that I experimented with, and eventually, it came down to a keyboard that allows you to manipulate the simulated Mars rovers. So now with OnSight, you can drive the [simulated] rovers around with a keyboard controller and possibly in the future, type notes within the application. Previously, you had to tap in the air to use an AR keyboard, and that's not intuitive. We still need to touch the physical world.

How has this project compared with other ones that you've done elsewhere?

Well, I was the only girl developer intern on the team. I’m always battling stereotypes wherever I go and usually on other projects, I'm fighting for my place and fighting to fit in. But at JPL, everyone’s here because they love what they do. People are secure in themselves, and they see me as an equal. They're like, "Michelle has good ideas. Let's bring her to the table." Right off the bat, I felt accepted and, for the first time ever, the imposter-syndrome voice went away. I felt like I could just be myself and actually have a voice to contribute. You know, I might be small, I might be the shortest one, but I'm mighty. It’s been such a positive and supportive environment. I've had an incredible internship and learned so much.

What has been the most unique experience that you've had at JPL?

Working in the Ops Lab has been such a unique experience. Every day, we’re tinkering with cutting-edge technology in AR and VR. I am so thankful to have my mentors, Victor Luo and Parker Abercrombie, who give me the support and guidance I need to grow and learn. Outside of the Ops Lab, I also had the unique opportunity to meet astronaut Kate Rubins and talk about VR with her. I had lunch with NASA Administrator Jim Bridenstine when he visited JPL. And working with the InSight mission and Marleen Sundgaard, the mission’s testbed lead, was especially cool. I can't believe I was able to use my skills for something the Mars InSight mission needed. Being able to say that is something I'm really proud of. And seeing how far I came, from knowing nothing to being here, makes me feel happy. If I can transform, anyone can do this too, if they choose to work hard, follow their own path and see it in themselves to take a risk.

What advice do you have for others looking to follow your path?

Listen to your gut. Your gut knows. It’s easy to feel discouraged. But trust me, you’re not alone. You’ve always got to stay optimistic about finding a solution. I've always been someone who has experimented with a lot of things, and I think learning is something you should definitely experiment with. If the classroom setting is not for you, try teaching yourself, try a bootcamp, try asking a friend – just any alternative. You just have to know how you learn best.

My biggest inspiration is the future. I think about it on a daily basis. The future is so cool. I know I have a very cheery, idealistic view on life, but I think, "What's wrong with that?" as long as you can bring it back to reality, which I think I’ve been able to do.

Speaking of that, what is your ultimate dream for your career and your future?

I grew up in Santa Clara, in the Bay Area, so the tech culture of Silicon Valley was inescapable. I love Silicon Valley, but we have our problems. We have a huge homelessness issue. I’ve always thought, “We have the brightest engineers and scientists doing the most amazing, crazy things, yet we still can't alleviate homelessness.” Everybody deserves a place to sleep and shower. People need to have their basic needs met. I’d love to see some sort of VR wellness center that could help people train for a job, overcome fears and treat mental health.

That's my idealistic dream, but back to present-day dreams: I'm actually doing a 180. I'm leaving tech for a little bit, and I’m taking Fall quarter off. I'll start back at UCLA in January, but I'm taking a leave to explore being an artist. I'm writing a science-fiction play about Vietnamese-American culture. I was inspired by my experience here at JPL. I feel really optimistic about the future of technology, which is funny because science fiction usually likes to depict tech as something crazy, like an apocalypse or the world crashing down. But I'm like, “Vietnamese people survived an actual war, and they’re still here.” For my parents and grandparents, their country as they knew it came crashing down on them when they were just about my age. They escaped Vietnam by boat and faced many hardships as immigrants who came to America penniless and without knowing English. For them to have survived all of that and sacrificed so much to make it possible for me to be here is incredible. I think it’s a testament to how, despite the worst things, there's always good that continues. I’m so grateful and thankful for my family. I wouldn’t be here living my dream without them, and I want to create a play about that.

It's funny. Before I used to be so shy, so shy. I used to be that one kid who would never talk to anybody. So it's kind of nice to see what happens when the introvert comes out of her shell. And this is what happens. All of this. [Laughs.]


Explore JPL’s summer and year-round internship programs and apply at: https://www.jpl.nasa.gov/edu/intern

The laboratory’s STEM internship and fellowship programs are managed by the JPL Education Office. Extending the NASA Office of Education’s reach, JPL Education seeks to create the next generation of scientists, engineers, technologists and space explorers by supporting educators and bringing the excitement of NASA missions and science to learners of all ages.

TAGS: Women in STEM, Higher Education, College, Students, STEM, VR, AR, Technology, Mars, InSight, Curiosity

Kim Orr