NVIDIA Corporation (20240176017). SENSOR FUSION USING ULTRASONIC SENSORS FOR AUTONOMOUS SYSTEMS AND APPLICATIONS simplified abstract

From WikiPatents

SENSOR FUSION USING ULTRASONIC SENSORS FOR AUTONOMOUS SYSTEMS AND APPLICATIONS

Organization Name

NVIDIA Corporation

Inventor(s)

David Weikersdorfer of Mountain View CA (US)

Qian Lin of Berkeley CA (US)

Aman Jhunjhunwala of Toronto (CA)

Emilie Lucie Eloïse Wirbel of Nogent-sur-Marne (FR)

Sangmin Oh of San Jose CA (US)

Minwoo Park of Saratoga CA (US)

Gyeong Woo Cheon of San Jose CA (US)

Arthur Henry Rajala of Greenville OH (US)

Bor-Jeng Chen of San Jose CA (US)

SENSOR FUSION USING ULTRASONIC SENSORS FOR AUTONOMOUS SYSTEMS AND APPLICATIONS - A simplified explanation of the abstract

This abstract first appeared for US patent application 20240176017 titled 'SENSOR FUSION USING ULTRASONIC SENSORS FOR AUTONOMOUS SYSTEMS AND APPLICATIONS'.

Simplified Explanation

The abstract describes techniques for sensor-fusion based object detection and free-space detection using ultrasonic sensors. Systems process sensor data to generate input data representing object locations, then feed this data into neural networks that output maps of the environment.

  • Ultrasonic sensors are used for object detection and free-space detection.
  • Sensor data is processed to generate input data representing object locations.
  • Neural networks are trained to output maps of the environment, such as height and occupancy maps.
  • The machine uses these outputs to perform operations.
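To make the first two steps concrete, here is a minimal sketch of how raw ultrasonic echoes might be rasterized into a bird's-eye-view (BEV) input grid of object locations. The sensor poses, grid dimensions, and cell size are illustrative assumptions, not details from the patent application.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def bev_input_from_ultrasonic(echo_times_s, sensor_poses, grid_size=64, cell_m=0.25):
    """Rasterize ultrasonic echoes into a BEV input grid (hypothetical layout).

    echo_times_s : round-trip echo time per sensor, in seconds (None = no echo)
    sensor_poses : (x_m, y_m, heading_rad) per sensor, in the machine frame
    Returns a (grid_size, grid_size) float array centered on the machine;
    1.0 marks an estimated object location, 0.0 elsewhere.
    """
    grid = np.zeros((grid_size, grid_size), dtype=np.float32)
    half = grid_size * cell_m / 2.0  # grid half-width in meters
    for t, (sx, sy, heading) in zip(echo_times_s, sensor_poses):
        if t is None:
            continue  # no return: this sensor contributes no object cell
        rng = t * SPEED_OF_SOUND / 2.0          # round trip -> one-way range
        ox = sx + rng * np.cos(heading)         # object position along sensor axis
        oy = sy + rng * np.sin(heading)
        col = int((ox + half) / cell_m)
        row = int((oy + half) / cell_m)
        if 0 <= row < grid_size and 0 <= col < grid_size:
            grid[row, col] = 1.0
    return grid

# Example: two forward-facing sensors; one hears an echo from ~2 m away
grid = bev_input_from_ultrasonic(
    echo_times_s=[2 * 2.0 / SPEED_OF_SOUND, None],
    sensor_poses=[(0.0, 0.0, 0.0), (0.0, 0.5, 0.0)],
)
print(grid.sum())  # one occupied cell
```

A grid like this (possibly alongside the raw sensor data) would then be fed to the trained networks described above to produce height and occupancy maps.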

Potential Applications

The technology can be applied in autonomous vehicles for obstacle detection and navigation, in industrial settings for object detection and collision avoidance, and in robotics for environment mapping and path planning.

Problems Solved

This technology solves the problem of accurate object detection and free-space detection using ultrasonic sensors, improving the overall safety and efficiency of machines operating in various environments.

Benefits

The benefits of this technology include enhanced object detection capabilities, improved navigation and collision avoidance, and better understanding of the environment for autonomous systems.

Potential Commercial Applications

Potential commercial applications of this technology include autonomous vehicles, industrial automation systems, robotics platforms, and smart infrastructure for smart cities.

Possible Prior Art

One possible piece of prior art is the use of lidar sensors for environment mapping and object detection in autonomous vehicles and robotics systems. Another is the use of radar sensors for obstacle detection in industrial settings.

Unanswered Questions

How does this technology perform in different environmental conditions, such as extreme temperatures or weather conditions?

The article does not provide information on how the technology performs in various environmental conditions, which could impact its reliability and effectiveness in real-world applications.

What are the limitations of using ultrasonic sensors for object detection and free-space detection compared to other sensor technologies?

The article does not discuss the limitations of ultrasonic sensors in detail, such as range limitations or interference from other sources, which could affect the overall performance of the system.


Original Abstract Submitted

In various examples, techniques for sensor-fusion based object detection and/or free-space detection using ultrasonic sensors are described. Systems may receive sensor data generated using one or more types of sensors of a machine. In some examples, the systems may then process at least a portion of the sensor data to generate input data, where the input data represents one or more locations of one or more objects within an environment. The systems may then input at least a portion of the sensor data and/or at least a portion of the input data into one or more neural networks that are trained to output one or more maps or other output representations associated with the environment. In some examples, the map(s) may include a height, an occupancy, and/or height/occupancy map generated, e.g., from a birds-eye-view perspective. The machine may use these outputs to perform one or more operations.
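The abstract's final step — the machine using the output maps to perform operations — can be sketched as a simple query against a BEV occupancy map. The region bounds and probability threshold below are illustrative assumptions; the patent does not specify how downstream operations consume the maps.

```python
import numpy as np

def free_space_ahead(occupancy_map, row_range, col_range, threshold=0.5):
    """Check a rectangular region of a BEV occupancy map for free space.

    occupancy_map : (H, W) array of per-cell occupancy probabilities, as a
                    network head like the one described might emit.
    Returns True if every cell in the region is below the threshold.
    """
    region = occupancy_map[row_range[0]:row_range[1], col_range[0]:col_range[1]]
    return bool((region < threshold).all())

# Toy 8x8 occupancy map with a single high-probability obstacle cell
occ = np.zeros((8, 8))
occ[2, 4] = 0.9
print(free_space_ahead(occ, (0, 2), (0, 8)))  # rows before the obstacle: free
print(free_space_ahead(occ, (2, 3), (0, 8)))  # row containing the obstacle: not free
```

A planner might run a check like this over the region in front of the machine before committing to a maneuver such as parking or lane entry.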