US Patent Application 18224414. SYSTEM AND METHOD FOR CONCURRENT ODOMETRY AND MAPPING simplified abstract

From WikiPatents

SYSTEM AND METHOD FOR CONCURRENT ODOMETRY AND MAPPING

Organization Name

Google LLC


Inventor(s)

Esha Nerurkar of Mountain View CA (US)

Simon Lynen of Mountain View CA (US)

Sheng Zhao of Mountain View CA (US)

SYSTEM AND METHOD FOR CONCURRENT ODOMETRY AND MAPPING - A simplified explanation of the abstract

This abstract first appeared for US patent application 18224414 titled 'SYSTEM AND METHOD FOR CONCURRENT ODOMETRY AND MAPPING'.

Simplified Explanation

- The patent application describes an electronic device that tracks its motion in an environment and builds a three-dimensional visual representation of the environment.
- The device uses feature descriptors, which are visual representations of spatial features of objects in the environment, to estimate its poses.
- A mapping module combines the feature descriptors and estimated poses to create a three-dimensional visual representation of the environment.
- This representation is then used by a localization module to identify correspondences between stored and observed feature descriptors.
- The localization module performs a loop closure by minimizing discrepancies between matching feature descriptors to compute a localized pose.
- The localized pose corrects drift in the estimated pose generated by the motion tracking module.
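The correspondence step above, matching observed feature descriptors against those stored in the map, can be sketched as a nearest-neighbour search over descriptor vectors. This is an illustrative sketch only, not the patent's implementation: the descriptor dimensionality, the Euclidean distance metric, and the `max_dist` threshold are all assumptions chosen for clarity.

```python
import numpy as np

def match_descriptors(stored, observed, max_dist=0.5):
    """Nearest-neighbour matching between stored map descriptors and
    descriptors observed in the current frame.

    stored:   (N, D) array of descriptors held in the map.
    observed: (M, D) array of descriptors from the current frame.
    Returns a list of (stored_idx, observed_idx) pairs whose Euclidean
    distance falls below max_dist.
    """
    matches = []
    for j, obs in enumerate(observed):
        # Distance from this observation to every stored descriptor.
        dists = np.linalg.norm(stored - obs, axis=1)
        i = int(np.argmin(dists))
        if dists[i] < max_dist:
            matches.append((i, j))
    return matches

# Toy 2-D descriptors: two observations match stored entries 0 and 2.
stored = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
observed = np.array([[0.1, 0.0], [4.9, 5.1]])
print(match_descriptors(stored, observed))  # → [(0, 0), (2, 1)]
```

Real systems typically add a ratio test or mutual-consistency check to reject ambiguous matches before they are fed into loop closure.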


Original Abstract Submitted

An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used to correct drift in the tracked motion. A motion tracking module estimates poses of the electronic device based on feature descriptors corresponding to the visual appearance of spatial features of objects in the environment. A mapping module builds a three-dimensional visual representation of the environment based on a stored plurality of maps, and feature descriptors and estimated device poses received from the motion tracking module. The mapping module provides the three-dimensional visual representation of the environment to a localization module, which identifies correspondences between stored and observed feature descriptors. The localization module performs a loop closure by minimizing the discrepancies between matching feature descriptors to compute a localized pose. The localized pose corrects drift in the estimated pose generated by the motion tracking module.
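One standard way to realize the loop-closure step the abstract describes (minimizing discrepancies between matched features to compute a localized pose) is a least-squares rigid alignment of matched landmark positions, for which the Kabsch algorithm gives a closed-form solution. The patent does not specify this method; the sketch below is an assumption-laden illustration of how a corrective rotation and translation could be recovered from correspondences.

```python
import numpy as np

def localized_pose(map_pts, obs_pts):
    """Least-squares rigid alignment (Kabsch) of observed landmark
    positions onto their stored map counterparts.

    map_pts, obs_pts: (N, 3) arrays of matched 3-D points.
    Returns (R, t) such that R @ obs + t best aligns with map, i.e. the
    corrective transform a loop closure would apply to a drifted pose.
    """
    mu_m, mu_o = map_pts.mean(axis=0), obs_pts.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (obs_pts - mu_o).T @ (map_pts - mu_m)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_m - R @ mu_o
    return R, t

# Simulated drift: observations are the map points rotated 90° about z
# and shifted; the recovered (R, t) undoes that drift exactly.
map_pts = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
obs_pts = (Rz @ map_pts.T).T + np.array([0.5, -0.2, 1.0])
R, t = localized_pose(map_pts, obs_pts)
print(np.allclose((R @ obs_pts.T).T + t, map_pts))  # → True
```

In a full system this corrective transform would be propagated back through the motion tracker's pose estimates, pulling the drifted trajectory onto the map, typically via pose-graph optimization rather than a single rigid fit.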