17955841. HARDWARE-BASED FEATURE TRACKER FOR AUTONOMOUS SYSTEMS AND APPLICATIONS simplified abstract (NVIDIA Corporation)
Contents
- 1 HARDWARE-BASED FEATURE TRACKER FOR AUTONOMOUS SYSTEMS AND APPLICATIONS
- 1.1 Organization Name
- 1.2 Inventor(s)
- 1.3 HARDWARE-BASED FEATURE TRACKER FOR AUTONOMOUS SYSTEMS AND APPLICATIONS - A simplified explanation of the abstract
- 1.4 Simplified Explanation
- 1.5 Potential Applications
- 1.6 Problems Solved
- 1.7 Benefits
- 1.8 Potential Commercial Applications
- 1.9 Possible Prior Art
- 1.10 Original Abstract Submitted
HARDWARE-BASED FEATURE TRACKER FOR AUTONOMOUS SYSTEMS AND APPLICATIONS
Organization Name
NVIDIA Corporation
Inventor(s)
Zoran Nikolic of Sugarland TX (US)
Eric Viscito of Shelburne VT (US)
HARDWARE-BASED FEATURE TRACKER FOR AUTONOMOUS SYSTEMS AND APPLICATIONS - A simplified explanation of the abstract
This abstract first appeared for US patent application 17955841 titled 'HARDWARE-BASED FEATURE TRACKER FOR AUTONOMOUS SYSTEMS AND APPLICATIONS'.
Simplified Explanation
The abstract describes techniques for tracking image features in autonomous or semi-autonomous systems using dedicated hardware: one or more processors, such as optical flow accelerators and vision processors, determine flow vectors and feature point locations in images.
- Hardware feature trackers used in autonomous systems
- Processors determine flow vectors and feature point locations in images
- Optical flow accelerators and vision processors are utilized
- Lookup tables are stored in hardware units for determining feature point locations
- Subpixel locations are considered in determining feature point locations
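The core idea in the bullets above can be sketched in a few lines: given per-pixel flow vectors and a feature's subpixel location in one image, predict where the feature lands in the next image. This is a minimal illustrative sketch, not the patent's actual hardware method; the bilinear interpolation of flow at the subpixel location is an assumption made for the example.

```python
import numpy as np

def predict_feature_location(flow, point):
    """Predict a feature's location in the next image.

    flow  : (H, W, 2) array of per-pixel flow vectors (dx, dy),
            as an optical flow accelerator (OFA) might produce.
    point : (x, y) subpixel location of the feature in the first image.

    The flow at the subpixel location is bilinearly interpolated from
    the four surrounding pixel-grid flow vectors, then added to the
    current location. The interpolation scheme is an illustrative
    choice, not necessarily what the patented hardware does.
    """
    x, y = point
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0  # subpixel phase in [0, 1)

    # Bilinear weights for the four neighboring flow vectors.
    w = np.array([(1 - fx) * (1 - fy), fx * (1 - fy),
                  (1 - fx) * fy,       fx * fy])
    neighbors = np.stack([flow[y0, x0],     flow[y0, x0 + 1],
                          flow[y0 + 1, x0], flow[y0 + 1, x0 + 1]])
    dx, dy = (w[:, None] * neighbors).sum(axis=0)
    return (float(x + dx), float(y + dy))

# Uniform flow of (+2.0, -1.0) pixels everywhere.
flow = np.tile(np.array([2.0, -1.0]), (8, 8, 1))
print(predict_feature_location(flow, (3.25, 4.5)))  # -> (5.25, 3.5)
```

In the abstract's split-processor variant, the flow field would come from the OFA and the prediction step would run on a vision processor.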
Potential Applications
The technology described in the patent application could be applied in various fields such as:
- Autonomous vehicles
- Robotics
- Surveillance systems
- Augmented reality
Problems Solved
The technology addresses the following issues:
- Accurate tracking of features in images
- Improved performance of autonomous systems
- Enhanced object recognition capabilities
Benefits
The technology offers the following benefits:
- Increased efficiency in image processing
- Enhanced accuracy in determining feature point locations
- Improved overall performance of autonomous or semi-autonomous systems
Potential Commercial Applications
The technology could be commercially applied in:
- Automotive industry for self-driving cars
- Security and surveillance systems
- Virtual reality and augmented reality applications
Possible Prior Art
Possible prior art includes:
- Traditional optical flow algorithms used for feature tracking in images
- Vision processors used in autonomous systems for object recognition
What are the specific hardware units used in the technology described?
The technology uses an optical flow accelerator (OFA) containing a hardware unit that stores a lookup table used to determine feature point locations; in some examples, a separate vision processor determines the feature point location in the second image.
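The abstract does not say what the lookup table contains, but a common hardware pattern is to precompute interpolation weights indexed by quantized subpixel phase, so no multiplies are needed per lookup at tracking time. The sketch below is a hypothetical software model of that pattern; the table size and weight scheme are illustrative assumptions, not details from the patent.

```python
import numpy as np

PHASES = 32  # assume the subpixel phase is quantized to 1/32-pixel steps

# Precompute bilinear interpolation weights for every (fx, fy) phase pair,
# as a fixed-function hardware unit might bake them into a ROM/LUT.
LUT = np.empty((PHASES, PHASES, 4))
for i in range(PHASES):
    for j in range(PHASES):
        fx, fy = i / PHASES, j / PHASES
        LUT[i, j] = [(1 - fx) * (1 - fy), fx * (1 - fy),
                     (1 - fx) * fy,       fx * fy]

def lut_weights(x, y):
    """Fetch the stored interpolation weights for subpixel location (x, y)."""
    i = int((x - np.floor(x)) * PHASES)  # quantized horizontal phase
    j = int((y - np.floor(y)) * PHASES)  # quantized vertical phase
    return LUT[i, j]

print(lut_weights(3.5, 4.25))  # weights for phase (0.5, 0.25)
```

The appeal of this design is that the per-feature work reduces to a table index and four weighted adds, which maps naturally onto a small hardware unit.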
How does the technology improve the accuracy of feature tracking in images?
The technology improves the accuracy of feature tracking in images by considering flow vectors associated with pixel locations and subpixel locations of feature points in the images, leading to more precise determination of feature point locations in subsequent images.
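Why keeping the subpixel location matters can be shown with a toy comparison (illustrative only, not from the patent): a tracker that rounds the feature location to whole pixels every frame can lose slow motion entirely, while one that carries the subpixel location accumulates it correctly.

```python
# A point drifts 0.3 px per frame. Compare an integer-only tracker,
# which rounds the location after every frame, against a tracker that
# keeps the subpixel location.
true_flow = 0.3  # horizontal motion per frame, in pixels

int_x, sub_x = 10, 10.0
for _ in range(10):
    sub_x += true_flow                # carry subpixel precision forward
    int_x = round(int_x + true_flow)  # snap to the nearest pixel each frame

# Subpixel tracker reaches ~13.0; integer tracker is stuck at 10,
# having rounded away the 0.3 px motion on every single frame.
print(round(sub_x, 6), int_x)
```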
Original Abstract Submitted
In various examples, techniques for using hardware feature trackers in autonomous or semi-autonomous systems are described. Systems and methods are disclosed that use a processor(s) to determine flow vectors associated with pixel locations in a first image. The systems also use the processor(s) to determine a location of a feature point in a second image based at least on one or more of the flow vectors and a subpixel location of the feature point in the first image. In some examples, the processor(s) may include an optical flow accelerator (OFA) that includes a hardware unit storing a lookup table that is used to determine the location of the feature point in the second image. In some examples, the processor(s) may include an OFA to determine the flow vectors and a vision processor to determine the location of the feature point in the second image.