Event Title

Rover Localization and Analog Astronaut Detection Using Machine Vision

Presenter Information

Bradley Hoffman


Location

Clifford Hall, Room 210

Document Type

Presentation

Start Date

9-5-2019 2:00 PM

End Date

9-5-2019 2:15 PM

Description

Human-robotic teams are becoming increasingly prevalent in space exploration. Autonomous robotics can benefit human spaceflight and crewed planetary missions by offsetting crew task loads. Current crewed mockups and analog habitat mission studies provide a platform for developing technologies for human-robotic interaction. In this study, the test bed of an Inflatable Lunar/Mars Analog Habitat (ILMAH) is used to develop and test an object detection machine vision algorithm for analog astronaut identification and rover localization using horizon features.

By applying Speeded-Up Robust Features (SURF) for feature extraction in tandem with camera triangulation, accurate descriptions of an analog space suit, fiducial markers, and the horizon were identified. Using SURF features and Random Sample Consensus (RANSAC), image stitching techniques were developed to generate a 180° panoramic image of the analog habitat environment. Combining feature matching with object detection allowed successful identification of an analog astronaut within the panoramic view of the habitat. A zero-horizon line was detected within the panorama, allowing interpolation between matched horizon pixel positions. The pixel position of the detected analog astronaut is then mapped to an angle within the 0° to 180° view. This angle will be used in future studies for autonomous rover movement.
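The final step described above, converting a detection's horizontal pixel position into a bearing across the 180° panorama, amounts to a linear interpolation. The sketch below is a hypothetical illustration of that mapping only; the function name, image width, and detection coordinates are assumptions, not values from the study.

```python
def pixel_to_bearing(pixel_x: float, panorama_width: int,
                     fov_degrees: float = 180.0) -> float:
    """Map a horizontal pixel position in a stitched panorama to a
    bearing angle.

    Assumes the panorama spans `fov_degrees` uniformly, with the
    leftmost column at 0 degrees and the rightmost at `fov_degrees`.
    """
    if not 0 <= pixel_x < panorama_width:
        raise ValueError("pixel position lies outside the panorama")
    # Linear interpolation over the panorama's field of view.
    return pixel_x / (panorama_width - 1) * fov_degrees

# Hypothetical example: an astronaut detected near the middle column
# of a 3840-pixel-wide panorama lies roughly straight ahead (~90 deg).
bearing = pixel_to_bearing(1920, 3840)
```

A rover controller could feed such a bearing directly into a heading command, which is presumably the intent of the future autonomous-movement work mentioned in the abstract.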

This document is currently not available here.
