Of all the technical foes that assailed Ingenuity over its 72 flights on Mars, few proved as vexing as the planet's own dull scenery: mile after mile of sand and rock with no sharp visual features, which confounded a vision-based navigation system that estimates motion by tracking ground textures. NASA's latest software effort, called Extended Robust Aerial Autonomy, aims to eliminate that blind spot, letting future flyers travel over low-texture landscapes without losing their sense of position.

To help validate the system, JPL engineers turned to some of Earth's most Mars-like environments. With their barren expanses, the Mesquite Flat Sand Dunes in Death Valley National Park were a near-perfect analogue for the featureless Martian surfaces that had troubled Ingenuity. Three research drones flew there under extreme conditions in late April and early September, with temperatures reaching 45°C, to test navigation algorithms and assess how different camera filters affected ground tracking. The team also trialed safe-landing routines in cluttered terrain at Mars Hill, a site historically used to prepare for the Viking landers. "We want future vehicles to be more versatile and not have to worry about flying over challenging areas like these sand dunes," says Roland Brockers, JPL researcher and drone pilot.
Additional flights at the Dumont Dunes in the Mojave Desert added rippled sand surfaces that further complicated the vision-based challenge. Such analogue testing matters because vision-based navigation and SLAM algorithms struggle when texture cues are scarce: feature extraction for motion estimation becomes unreliable, and the result is drift or outright loss of localization. The Extended Robust Aerial Autonomy system integrates refined image-processing pipelines and adaptive landing algorithms to overcome these limitations.
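The texture problem can be illustrated with a toy experiment. The sketch below is not NASA's pipeline; it simply counts strong-gradient pixels as a rough proxy for trackable features, and the gradient threshold and minimum-feature cutoff are illustrative assumptions.

```python
import numpy as np

def feature_count(img: np.ndarray, grad_thresh: float = 0.2) -> int:
    """Count pixels with strong local gradients -- a crude proxy for
    how many trackable features a vision-based navigator could find."""
    gy, gx = np.gradient(img.astype(float))
    magnitude = np.hypot(gx, gy)
    return int((magnitude > grad_thresh).sum())

rng = np.random.default_rng(0)

# Rocky, high-texture ground: strong intensity variation everywhere.
textured = rng.random((64, 64))

# Featureless dune field: nearly uniform brightness, faint noise only.
flat = 0.5 + 0.01 * rng.random((64, 64))

n_textured = feature_count(textured)
n_flat = feature_count(flat)

print(f"trackable features: textured={n_textured}, flat={n_flat}")
if n_flat < 50:  # hypothetical minimum for a reliable motion estimate
    print("low-texture terrain: motion estimate would drift")
```

On the synthetic dune image almost no pixels clear the gradient threshold, which is exactly the regime where feature-based motion estimation starts to drift.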
While aerial autonomy is advancing, NASA's Mars Exploration Program is also investing in next-generation ground robotics. At White Sands National Park, the LASSIE-M robot demonstrated proprioceptive terrain sensing: its leg actuators measure contact forces and surface compliance, enabling it to classify sandy, crusty, or rocky substrates in real time. That capability is vital for avoiding hazards that have historically immobilized rovers, such as the wheel-sinkage incidents that plagued Spirit. Proprioceptive sensing, which directly measures the robot's mechanical interaction with the terrain, complements vision-based classification, which can be fooled by lighting changes or dust cover.
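In spirit, proprioceptive classification maps per-footfall mechanical measurements to a substrate label. The sketch below is a minimal illustration under assumed thresholds and readings; it is not LASSIE-M's actual sensing pipeline.

```python
# Toy proprioceptive terrain classifier. Thresholds and the example
# stiffness/sinkage readings are illustrative assumptions.

def classify_substrate(stiffness_n_per_mm: float, sinkage_mm: float) -> str:
    """Label terrain from leg-contact measurements:
    high stiffness, little sinkage -> rocky
    moderate stiffness             -> crusty (firm shell over loose fill)
    low stiffness, deep sinkage    -> sandy
    """
    if stiffness_n_per_mm > 50 and sinkage_mm < 2:
        return "rocky"
    if stiffness_n_per_mm > 10:
        return "crusty"
    return "sandy"

# One reading per footfall: (stiffness in N/mm, sinkage in mm)
footfalls = [(80, 1), (20, 5), (3, 25)]
labels = [classify_substrate(k, s) for k, s in footfalls]
print(labels)  # ['rocky', 'crusty', 'sandy']
```

The appeal of this direct mechanical signal is that it keeps working when lighting or dust cover would defeat a camera.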
Research has underlined the importance of correct terrain classification, showing that Mars rovers should automatically distinguish among sandy terrain (ST), hard terrain (HT), and gravel terrain (GT) to optimize mobility without causing damage. By combining color, texture, and geometric features, random forest classifiers reached 94.66% accuracy across these types, outperforming support vector machines and k-nearest-neighbor approaches. Misclassifying either ST or GT as HT can lead to dangerous routing decisions, raising the risk of immobilization or wheel damage. Combining proprioceptive data with such classifiers lets robots like LASSIE-M adjust both gait and path dynamically to mitigate those risks.
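The shape of such a classifier is straightforward to sketch with scikit-learn. The synthetic clusters below stand in for the real color, texture, and geometric features the study fused; the 94.66% figure comes from that study, not from this toy data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic 3-D feature vectors standing in for fused color, texture,
# and geometric descriptors; each terrain class gets its own cluster.
def make_class(center, n=200):
    return np.asarray(center) + 0.3 * rng.standard_normal((n, 3))

X = np.vstack([
    make_class([0.0, 0.0, 0.0]),  # ST: sandy terrain
    make_class([2.0, 2.0, 2.0]),  # HT: hard terrain
    make_class([0.0, 2.0, 0.0]),  # GT: gravel terrain
])
y = np.repeat(["ST", "HT", "GT"], 200)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"test accuracy: {accuracy:.2%}")
```

Because the ensemble votes over many decorrelated trees, it handles heterogeneous feature types without the careful kernel or distance-metric tuning that SVM and k-NN approaches need.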
On the aerial side, NASA's Langley Research Center is developing the Mars Electric Reusable Flyer (MERF), a winged vehicle with twin vertical-lift propellers designed for long-range reconnaissance. At full scale, MERF will span roughly the length of a small school bus yet stay light enough to fly in Mars' thin atmosphere, trading fuselage mass for aerodynamic efficiency to achieve high-speed surface mapping with instruments slung under its belly. Testing of a half-scale prototype in Virginia has concentrated on validating aerodynamic stability and the durability of lightweight materials through repeated cycles of vertical takeoff and landing.
The interplay between air and ground is central to NASA's vision of independent Mars research. Extended Robust Aerial Autonomy would let drones fly over featureless plains to identify science targets, while robots such as LASSIE-M pick their way across treacherous terrain to reach them. Terrain classification built on high-accuracy random forest classifiers and multi-modal sensing will let both flying and driving units operate with little human involvement. Testing in analogue environments like Death Valley and White Sands is more than a rehearsal; it is a proving ground where algorithms meet the physical realities of extreme terrain, validating the engineering that will one day head to Mars.

