Underwater vehicles are used extensively to explore the ocean depths, but to enhance their utility for marine scientists and other end-users, new navigation capabilities must be developed. For determining vehicle location, vision-based mosaicking is a promising new technology with several inherent benefits: it is inexpensive, uses any existing camera on board the vehicle, and does not require extensive set-up or calibration.

While vision sensing is an attractive option, it also has its challenges. The vehicle must remain within visual range of the ocean floor to make motion measurements, which limits navigation to the near-bottom environment. Current techniques for vision-based navigation are inaccurate and unreliable because they rely on dead reckoning, in which the absolute vehicle position is estimated by integrating the vision-based relative motion measurements along the vehicle path. Even though the relative motion measurements are precise, small errors accrue over the length of the path, so the error in absolute position grows without bound. These unbounded navigational errors are the focus of this work.

The purpose of this thesis is to develop a map-based approach to visual navigation, in which composite-image mosaics serve as reference maps. Once the map is constructed, the live image from the vehicle can be compared directly to the goal location within the map. Since the relative displacement between the current and desired vehicle locations can be measured directly in map coordinates, the navigational error is bounded, regardless of the accuracy of absolute position estimation with respect to some global coordinate system.

Theoretical and experimental results of this work are presented, including demonstrations of the complete navigation system on both the OTTER autonomous underwater vehicle in the test tank and the Ventana remotely operated vehicle in the ocean. This work was performed as a joint effort between the Aerospace Robotics Laboratory (ARL) at Stanford University and the Monterey Bay Aquarium Research Institute (MBARI).
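The contrast between unbounded dead-reckoning error and bounded map-based error can be illustrated with a small simulation. This is a hypothetical sketch, not part of the thesis experiments: the per-measurement noise level, path length, and trial count are illustrative assumptions.

```python
import random
import math

random.seed(0)

N_STEPS = 1000   # number of relative motion measurements along the path (illustrative)
SIGMA = 0.01     # per-measurement noise, in arbitrary units (illustrative)
TRIALS = 500     # Monte Carlo trials for estimating RMS error

def dead_reckoning_error(n_steps):
    # Integrate noisy relative displacements. The true motion is taken as
    # zero, so the running sum is exactly the accumulated position error.
    err = 0.0
    for _ in range(n_steps):
        err += random.gauss(0.0, SIGMA)
    return err

# RMS position error after integrating the whole path: grows like SIGMA * sqrt(N).
dr_rms = math.sqrt(sum(dead_reckoning_error(N_STEPS) ** 2
                       for _ in range(TRIALS)) / TRIALS)

# Map-based navigation: one direct measurement against the mosaic,
# so the error stays near SIGMA regardless of path length.
map_rms = math.sqrt(sum(random.gauss(0.0, SIGMA) ** 2
                        for _ in range(TRIALS)) / TRIALS)

print(f"dead-reckoning RMS error: {dr_rms:.3f}")
print(f"map-based RMS error:      {map_rms:.3f}")
```

With these assumed numbers the integrated error is roughly thirty times larger than the single-measurement error, matching the square-root growth of a random walk.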