17862821. METHOD AND APPARATUS WITH LANE GENERATION simplified abstract (SAMSUNG ELECTRONICS CO., LTD.)


METHOD AND APPARATUS WITH LANE GENERATION

Organization Name

SAMSUNG ELECTRONICS CO., LTD.

Inventor(s)

NAYEON Kim of Suwon-si (KR)

MOONSUB Byeon of Seoul (KR)

DOKWAN Oh of Hwaseong-si (KR)

DAE HYUN Ji of Hwaseong-si (KR)

METHOD AND APPARATUS WITH LANE GENERATION - A simplified explanation of the abstract

This abstract first appeared for US patent application 17862821, titled 'METHOD AND APPARATUS WITH LANE GENERATION'.

Simplified Explanation

The abstract describes a method for generating three-dimensional lane information using a pipeline of neural networks. Here is a simplified explanation of the abstract (a code sketch of the pipeline follows the list):

  • A first neural network generates a lane probability map from an input image.
  • A second neural network takes the lane probability map and produces lane feature information and depth feature information.
  • A third neural network generates depth distribution information from the depth feature information.
  • Spatial information is generated by combining the lane feature information with the depth distribution information.
  • A fourth neural network is applied to the spatial information to generate offset information, which represents the displacement between a lane position and a reference line.
  • Finally, three-dimensional (3D) lane information is generated using the offset information.
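
The first stages of the pipeline can be pictured with a minimal PyTorch sketch. Everything here is illustrative: the module names, layer choices, channel counts, and the outer-product combination of lane features with the depth distribution are assumptions for readability, not details taken from the patent.

    import torch
    import torch.nn as nn

    class LaneProbNet(nn.Module):
        """First network (assumed form): input image -> per-pixel lane probability map."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 1, 1), nn.Sigmoid(),
            )

        def forward(self, image):                 # (B, 3, H, W)
            return self.net(image)                # (B, 1, H, W) lane probabilities

    class FeatureNet(nn.Module):
        """Second network (assumed form): probability map -> lane and depth features."""
        def __init__(self, ch=32):
            super().__init__()
            self.trunk = nn.Conv2d(1, ch, 3, padding=1)
            self.lane_head = nn.Conv2d(ch, ch, 1)
            self.depth_head = nn.Conv2d(ch, ch, 1)

        def forward(self, prob_map):
            x = torch.relu(self.trunk(prob_map))
            return self.lane_head(x), self.depth_head(x)   # two (B, C, H, W) maps

    class DepthDistNet(nn.Module):
        """Third network (assumed form): depth features -> per-pixel distribution over depth bins."""
        def __init__(self, ch=32, n_bins=48):
            super().__init__()
            self.head = nn.Conv2d(ch, n_bins, 1)

        def forward(self, depth_feat):
            return self.head(depth_feat).softmax(dim=1)    # (B, D, H, W), sums to 1 over D

    # "Spatial information" from lane features + depth distribution: one
    # plausible reading is an outer product that lifts the 2D lane features
    # into a per-depth-bin volume (this combination rule is an assumption).
    image = torch.randn(1, 3, 64, 128)
    prob_map = LaneProbNet()(image)
    lane_feat, depth_feat = FeatureNet()(prob_map)
    depth_dist = DepthDistNet()(depth_feat)
    spatial = lane_feat.unsqueeze(2) * depth_dist.unsqueeze(1)  # (B, C, D, H, W)
    print(spatial.shape)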

Potential applications of this technology:

  • Autonomous driving systems: The generated lane information can be used by autonomous vehicles to accurately detect and navigate within lanes.
  • Advanced driver assistance systems (ADAS): The technology can be used to enhance ADAS features such as lane departure warning and lane keeping assist.
  • Traffic management: The generated lane information can be used to monitor and analyze traffic patterns, aiding in traffic flow optimization and congestion management.

Problems solved by this technology:

  • Accurate lane detection: The neural network-based approach improves the accuracy of lane detection compared to traditional methods, even in challenging conditions such as poor lighting or occlusions.
  • Depth estimation: By incorporating depth feature information, the method can estimate the distance of the detected lanes from the vehicle, providing additional contextual information.
  • 3D lane representation: The technology enables the generation of three-dimensional lane information, which can be useful for various applications such as path planning and obstacle avoidance.

Benefits of this technology:

  • Improved safety: Accurate lane information can enhance the safety of autonomous vehicles and ADAS by enabling precise lane keeping and warning systems.
  • Enhanced situational awareness: The depth information and 3D lane representation provide a more comprehensive understanding of the road environment, aiding in decision-making for autonomous systems.
  • Robust performance: The use of neural networks allows the system to handle various challenging scenarios, making it more reliable and adaptable in real-world conditions.


Original Abstract Submitted

A method of generating lane information using a neural network includes generating a lane probability map based on an input image, generating lane feature information and depth feature information by applying the lane probability map to a second neural network, generating depth distribution information by applying the depth feature information to a third neural network, generating spatial information based on the lane feature information and the depth distribution information, generating offset information including a displacement between a position of a lane and a reference line by applying the spatial information to a fourth neural network, and generating three-dimensional (3D) lane information using the offset information.
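
The last step of the abstract, turning offset information into 3D lane information, can be illustrated with a toy sketch. The anchor-line parameterization, the axes, and the flat-ground assumption are all hypothetical choices made here for illustration, not the patent's construction.

    import torch

    n_pts = 5
    y = torch.linspace(5.0, 45.0, n_pts)       # longitudinal sample positions (m)
    ref_x = torch.zeros(n_pts)                 # reference line assumed at x = 0
    z = torch.zeros(n_pts)                     # flat-ground assumption

    # Hypothetical lateral offsets (m) as the fourth network might predict them.
    offsets = torch.tensor([0.10, 0.20, 0.35, 0.50, 0.70])

    # 3D lane points: displace the reference line laterally by the offsets.
    lane_3d = torch.stack([ref_x + offsets, y, z], dim=1)  # (n_pts, 3)
    print(lane_3d)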