
You can help NASA train Mars rovers for the Red Planet

A selfie taken by NASA's Perseverance rover, which landed on Mars on Feb. 18, 2021. (Image credit: NASA/JPL-Caltech/MSSS)

You can help NASA make its Mars rovers even better explorers.

NASA is asking citizen scientists to label "features of scientific interest in imagery taken by NASA's Perseverance Mars rover." The project, called AI4Mars and hosted on Zooniverse, continues work begun last year with imagery from the Curiosity rover, which ultimately produced an object-classifying algorithm.

"Images from Perseverance will further improve [the algorithm] by expanding the kinds of identifying labels that can be applied to features on the Martian surface," officials at NASA's Jet Propulsion Laboratory (JPL) in Southern California, which manages the missions of both Curiosity and Perseverance, wrote in a statement (opens in new tab)

"AI4Mars now provides labels to identify more refined details, allowing people to choose options like float rocks (“islands” of rocks) or nodules (BB-size balls, often formed by water, of minerals that have been cemented together)," the officials added.


The tool produced from the Curiosity imagery is called SPOC (Soil Property and Object Classification). It was built on work in which people labeled nearly half a million images, outlining features such as sand and rock. SPOC now identifies those features correctly about 98% of the time, and rover drivers are already using it to plan Red Planet routes, JPL officials said.
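JPL doesn't detail how that 98% figure is measured, but the basic idea is to compare SPOC's predicted terrain classes against the human-supplied labels. A rough, illustrative sketch in Python, using made-up tiles rather than mission data:

```python
from collections import Counter, defaultdict

def accuracy(predicted, labeled):
    """Overall and per-class agreement between model output and human labels.

    `predicted` and `labeled` are equal-length sequences of class names
    (e.g. "sand", "bedrock"); each element could stand for one image tile.
    """
    assert len(predicted) == len(labeled)
    correct = sum(p == t for p, t in zip(predicted, labeled))
    per_class_total = Counter(labeled)
    per_class_correct = defaultdict(int)
    for p, t in zip(predicted, labeled):
        if p == t:
            per_class_correct[t] += 1
    overall = correct / len(labeled)
    per_class = {c: per_class_correct[c] / n for c, n in per_class_total.items()}
    return overall, per_class

# Toy example -- not real mission data.
pred  = ["sand", "sand", "bedrock", "nodule", "sand"]
truth = ["sand", "soil", "bedrock", "nodule", "sand"]
print(accuracy(pred, truth))
# (0.8, {'sand': 1.0, 'soil': 0.0, 'bedrock': 1.0, 'nodule': 1.0})
```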

Perseverance carries 23 cameras and sends dozens to hundreds of images to Earth each day. Mission team members would like to shorten the time between when those images arrive and when new instructions are uplinked to the rover. The turnaround can take hours, because engineers and geologists must comb through the photos for specific features of interest as well as terrain that could be hazardous for the rover to traverse.

"It's not possible for any one scientist to look at all the downlinked images with scrutiny in such a short amount of time, every single day," Vivian Sun, a JPL scientist who helps coordinate Perseverance's daily operations and consulted on the AI4Mars project, said in the same statement. "It would save us time if there was an algorithm that could say, 'I think I saw rock veins or nodules over here,' and then the science team can look at those areas with more detail."

Further in the future, labeling such geologic features could also aid the ongoing search for life on Mars, which involves a range of rovers, orbiters and sample-return spacecraft expected to reach the Red Planet in the coming decade.


Elizabeth Howell, Ph.D., has been a contributing writer for Space.com since 2012. As a proud Trekkie and Canadian, she tackles topics like spaceflight, diversity, science fiction, astronomy and gaming to help others explore the universe. Elizabeth's on-site reporting includes two human spaceflight launches from Kazakhstan and embedded reporting from a simulated Mars mission in Utah. She holds a Ph.D. and M.Sc. in Space Studies from the University of North Dakota, and a Bachelor of Journalism from Canada's Carleton University. Her latest book, NASA Leadership Moments, is co-written with astronaut Dave Williams. Elizabeth first got interested in space after watching the movie Apollo 13 in 1996, and still wants to be an astronaut someday.