Planetary exploration, by its nature, requires operation in highly unstructured environments. The lack of structure on the surfaces of Mars or the Moon challenges established machine perception techniques: the irregular terrain found on planetary surfaces does not satisfy standard constraints on shape (e.g., symmetry) or surface properties (e.g., smoothness), nor does it permit the standard tricks of controlled lighting or fixtured objects.
Over the past five years, we have addressed this challenge by developing and validating laser rangefinder systems for two outdoor rovers.
Over the past year, we have shifted our focus from Mars to the Moon, as well as from missions driven exclusively by science objectives to missions that jointly satisfy entertainment and scientific purposes. Given these changes, we have undertaken a transition from walkers to rolling robots, and from range to video sensing.
Despite all these changes, perception remains one of the critical technologies enabling mobile robots to operate autonomously for extended periods of time in rugged, natural, unstructured environments. Thus, we continue to investigate new approaches to the old problems of avoiding obstacles and fixing position, all to enable robotic exploration of planetary surfaces.
In this paper, we present new approaches and results in stereo driving and in position estimation. Both approaches are entirely passive; they do not require scanning or transmission of electromagnetic radiation into the environment. In addition, both are consistent with our aim for maximal rover self-reliance; they diminish requirements for close human supervision. We are experimentally validating these approaches in outdoor trials on rovers, including the Ratler and the HMMWV (Figure 1). Future efforts will transfer the developed technology into Lunar Rover demonstration and flight programs.
In Section 2, we present an innovative stereo driving approach that bypasses traditional three-dimensional reconstruction. In Section 3, we develop a position estimation approach that computes the latitude and longitude of a camera based on its observations of the Sun or other celestial bodies. In Section 4, we summarize the contributions and discuss future directions.
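The core geometry behind Sun-based position estimation is the classical navigational triangle: a single measured Sun altitude and azimuth, combined with the Sun's declination and Greenwich hour angle from an almanac, constrains the observer's latitude and longitude. The following sketch illustrates that computation only; it is not the method developed in Section 3, and the function name and the use of almanac values as inputs are assumptions made here for illustration.

```python
import math

def latlon_from_sun(alt_deg, az_deg, dec_deg, gha_deg):
    """Illustrative single-sighting fix from the navigational triangle.

    alt_deg, az_deg : measured Sun altitude and azimuth (degrees;
                      azimuth clockwise from true north)
    dec_deg, gha_deg: Sun declination and Greenwich hour angle taken
                      from an almanac (degrees) -- assumed given here
    Returns (latitude_deg, east_longitude_deg).
    """
    h, A = math.radians(alt_deg), math.radians(az_deg)
    dec = math.radians(dec_deg)

    # Navigational triangle: sin(dec) = sin(lat)sin(h) + cos(lat)cos(h)cos(A).
    # Rewrite as a*sin(lat) + b*cos(lat) = c and solve for lat.
    a, b, c = math.sin(h), math.cos(h) * math.cos(A), math.sin(dec)
    r = math.hypot(a, b)
    psi = math.atan2(b, a)
    # Principal branch; degenerate geometries admit a second solution,
    # which additional sightings would disambiguate in practice.
    lat = math.asin(max(-1.0, min(1.0, c / r))) - psi

    # Local hour angle H (west positive) from the same triangle:
    # sin(H) = -sin(A)cos(h)/cos(dec), cos(H) from the altitude equation.
    sinH = -math.sin(A) * math.cos(h) / math.cos(dec)
    cosH = (math.sin(h) - math.sin(lat) * math.sin(dec)) / (
        math.cos(lat) * math.cos(dec))
    H = math.atan2(sinH, cosH)

    # East longitude satisfies H = GHA + lon; wrap to (-180, 180].
    lon = (math.degrees(H) - gha_deg + 180.0) % 360.0 - 180.0
    return math.degrees(lat), lon
```

For example, an observer at roughly 40°N, 80°W seeing the Sun at about 49.9° altitude and 229.9° azimuth (declination 10°, Greenwich hour angle 110°) recovers those coordinates to within the precision of the sighting. Measurement noise in the altitude and azimuth maps directly into position error, which is why averaging over many sightings matters for a practical system.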