Patent Application 18605343 - Method for Determining a Cleaning Information - Rejection
Title: Method for Determining a Cleaning Information, Method for Training of a Neural Network Algorithm, Control Unit, Camera Sensor System, Vehicle, Computer Program and Storage Medium
Application Information
- Invention Title: Method for Determining a Cleaning Information, Method for Training of a Neural Network Algorithm, Control Unit, Camera Sensor System, Vehicle, Computer Program and Storage Medium
- Application Number: 18605343
- Submission Date: 2025-05-15
- Effective Filing Date: 2024-03-14
- Filing Date: 2024-03-14
- National Class: 348
- National Sub-Class: 148000
- Examiner Employee Number: 87380
- Art Unit: 2422
- Tech Center: 2400
Rejection Summary
- 102 Rejections: 1
- 103 Rejections: 1
Cited Patents
The following patents were cited in the rejection:
- U.S. Patent Application Publication 2020/0094784 (Herman et al.)
- U.S. Patent Application Publication 2020/0139936 (Yamauchi et al.)
Office Action Text
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 3/14/2024 was filed after the mailing date of the claims on 3/14/2024. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Objections

Claim 15 is objected to because of the following informalities: "non-transient" should be changed to "non-transitory". Appropriate correction is required.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

1. Claim(s) 1-5, 7-8, 10-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by U.S. Patent Application 2020/0094784 to Herman et al. (hereinafter Herman).

2. Regarding Claim 1, Herman discloses a method for determining a cleaning information for an at least partially blocked camera sensor (Fig. 1; [0048], if the point specified by coordinates x′, y′ on the exterior surface 250 of the sensor 130 is partially or fully blocked, e.g., by a rain drop, fog, etc., the image data for the pixel x″, y″ of the camera sensor 130 may be incorrect, i.e., may not reflect the light beams received from the point x, y, z), which comprises a blockage on a transparent camera sensor component ([0048], "partially blocked," in the present context, means that the occlusion is translucent, e.g., a rain drop allows light beams to pass through, but affects received image data from the sensor 130) in an optical path of the camera sensor (optical path 230 in Fig. 2A; [0052], transparency of the sensor 130), wherein the method comprises:

— controlling the camera sensor to capture at least one camera image ([0047], captured by a camera sensor 130),

— processing, by a computing device, the at least one camera image with a neural network algorithm ([0058], a "neural network" (NN) is a computing system implemented in software and/or hardware that is inspired by biological neural networks; a neural network learns to perform tasks by studying examples generally without being programmed with any task-specific rules, and can be a software program that can be loaded in memory and executed by a processor included in a computer, for example the computer 110), wherein the neural network algorithm is configured to determine as an output, from the at least one camera image, a degree of camera sensor blockage by segmentation of at least a part of the at least one camera image ([0071], blocking coefficient) and a blockage class ([0056], "the occlusion class may include sub-classes such as rain drop, fog, smudge, etc."; the example image 600 illustrates occlusions 510a, 510b classified as the rain drop sub-class) of the camera sensor blockage from a plurality of blockage classes by classification of at least a part of the at least one camera image,

— determining cleaning information in dependency of the degree of camera sensor blockage and the blockage class of the camera sensor blockage, wherein the cleaning information describes that a cleaning of the camera sensor is required, if a cleaning criterion is assigned to the blockage class of the camera sensor blockage ([0070], "a first cleaning duration D, e.g., 1 second, associated with a sub-class rain drop may be shorter than a cleaning duration D, e.g., 3 seconds, associated with a sub-class insect, bird byproduct, etc.") and if at least one degree threshold is exceeded by the determined degree of camera sensor blockage ([0071], threshold), and

— transmitting the cleaning information to a cleaning device associated with the camera sensor in order to clean the camera sensor according to the cleaning information ([0072], the computer 110 may be programmed to actuate the cleaning actuator 120 based on one or more score thresholds Th.sub.1, Th.sub.2, Th.sub.3; for example, the computer 110 may be programmed to actuate the cleaning actuator 120 upon determining that the score s(x, y, t) of at least one location on the surface 250 exceeds the threshold Th.sub.1 and the vehicle 100 speed is 0 (zero)).

3.
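As mapped above, the claimed cleaning decision depends on two neural-network outputs: the segmented degree of blockage and the classified blockage class, with cleaning required only when the class carries a cleaning criterion and a degree threshold is exceeded. A minimal sketch of that decision logic follows; all class names and threshold values are hypothetical illustrations, not taken from Herman or the application.

```python
# Hypothetical sketch of the claimed cleaning decision: cleaning is
# required only if (a) the classified blockage class has a cleaning
# criterion assigned and (b) the segmented blockage degree exceeds the
# degree threshold for that class. All names and values are illustrative.

# Blockage classes with an assigned cleaning criterion (degree threshold
# in [0, 1]); a class without an entry (e.g. "unblocked") never triggers.
CLEANING_CRITERIA = {
    "droplet": 0.10,
    "soiling": 0.05,
    "condensation": 0.20,
}

def determine_cleaning_information(blockage_class: str, blockage_degree: float) -> dict:
    """Map the two neural-network outputs to a cleaning information record."""
    threshold = CLEANING_CRITERIA.get(blockage_class)
    cleaning_required = threshold is not None and blockage_degree > threshold
    return {
        "cleaning_required": cleaning_required,
        "blockage_class": blockage_class,
        "blockage_degree": blockage_degree,
    }

# A droplet covering 15% of the image exceeds the 10% droplet threshold.
assert determine_cleaning_information("droplet", 0.15)["cleaning_required"] is True
# No cleaning criterion is assigned to the unblocked class.
assert determine_cleaning_information("unblocked", 0.99)["cleaning_required"] is False
```

The returned record could then be transmitted to the cleaning device, which is the final step the claim recites.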
Regarding Claim 2, Herman discloses the method according to claim 1, wherein the neural network algorithm comprises at least one of a semantic segmentation algorithm ([0053], a semantic segmentation of sensor 130 image 500 data) or an algorithm comprising both a binary segmentation and a classifier model.

4. Regarding Claim 3, Herman discloses the method according to claim 1, wherein the plurality of blockage classes comprises at least one unblocked class to which no cleaning criterion is assigned (Fig. 5: the area of image 500 received from the vehicle camera sensor 130 where there is no occlusion 510a, 510b).

5. Regarding Claim 4, Herman discloses the method according to claim 1, wherein the plurality of blockage classes comprises at least one of a soiling class, a droplet class, or a condensation class, to each of which a cleaning criterion is assigned ([0056], the occlusion class may include sub-classes such as rain drop, fog, smudge, etc.).

6. Regarding Claim 5, Herman discloses the method according to claim 1, wherein the cleaning information is determined by a further algorithm, wherein the cleaning information describes a cleaning strategy for cleaning the camera sensor ([0062], programmed to select a cleaning plan), wherein the cleaning strategy is determined from a plurality of cleaning strategies in dependency of at least one of the determined blockage class (claimed in the alternative) or by comparison of the determined degree of camera sensor blockage to two (claimed in the alternative) or more different degree thresholds assigned to each cleaning strategy ([0069], a high score s(x, y), e.g., 1 (one), may indicate that a cleaning plan for the surface 250 is warranted (or needed); [0070], e.g., including a time of cleaning t.sub.c and/or duration D of cleaning, based on the determined score s(x, y, t)).

8.
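The claim-5 alternative addressed above selects one of several cleaning strategies by comparing the blockage degree against two or more thresholds, each assigned to a strategy. A sketch of that selection, with entirely hypothetical strategy names and threshold values (not from Herman or the application):

```python
# Hypothetical strategy table, ordered from the highest degree threshold
# down: the first threshold the blockage degree exceeds wins.
STRATEGY_THRESHOLDS = [
    (0.50, "intensive liquid and wiper cycle"),
    (0.20, "short liquid spray"),
    (0.05, "air blast only"),
]

def select_cleaning_strategy(blockage_degree):
    """Return the strategy for the highest exceeded threshold, or None."""
    for threshold, strategy in STRATEGY_THRESHOLDS:
        if blockage_degree > threshold:
            return strategy
    return None  # below every threshold: no cleaning strategy needed

assert select_cleaning_strategy(0.60) == "intensive liquid and wiper cycle"
assert select_cleaning_strategy(0.30) == "short liquid spray"
assert select_cleaning_strategy(0.01) is None
```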
Regarding Claim 7, Herman discloses the method according to claim 1, wherein the cleaning information comprises cleaning commands for at least one of a liquid-based cleaning ([0083], sprayer), an air-based cleaning (claimed in the alternative) or an actuator-based cleaning ([0054], wiper).

9. Regarding Claim 8, Herman discloses a method for training of a neural network algorithm ([0058], a "neural network" (NN) is a computing system implemented in software) for use in a method according to claim 1, comprising:

— providing a plurality of training camera images captured by at least one unblocked camera sensor ([0001], a vehicle may include one or more optical or image sensors such as camera sensors),

— augmenting at least some of the training camera images by superimposing a blockage mask on each training camera image ([0060], FIG. 7 shows an image 700 that illustrates priorities of points of the example image 500 superimposed on the image 500), wherein the blockage mask is assigned to a blockage class of the plurality of blockage classes (Table 1: the occlusion class includes any partial or full blockage within the optical path, e.g., on and/or in the lens, protective surface, windshield, etc., with sub-classes rain drop, fog, dust, smudge, insect, bird byproduct), wherein the blocking mask blocks a portion of the training camera image according to a blocking degree of the blockage mask (Fig. 5; [0001], a transparent surface such as a camera lens is typically subject to environmental conditions, e.g., dust, smudge, rain, fog, etc., that can impair visibility), wherein the blocking degree is determined stochastically for each training camera image ([0052], FIG. 5 shows an example rain drop occlusion 510a and an example insect occlusion 510b),

— associating a label to each training camera image ([0056], features may be learnt based on a labeled data set with a machine learning algorithm), wherein the label describes the blockage class ([0056], an occlusion class) and the blocking degree of the blockage mask augmented to the training camera image ([0075], a larger occlusion 510a, 510b),

— generating an output of the neural network algorithm for each augmented training camera image by processing the augmented camera image through one or more network layers of the neural network algorithm ([0058], a neural network typically includes a plurality of layers) in accordance with parameters associated with the one or more network layers ([0080], the computer 110 may be programmed to detect the occlusion 510a, 510b based on the output of a trained neural network),

— comparing the generated output ([0071], a rain drop may have a lower blocking coefficient blockage.sub.coeff compared to a blocking coefficient blockage.sub.coeff of an insect, bird byproduct, etc.) for each augmented camera image with the label associated ([0056], the "features" include points, lines, edges, corners, color, textures, and/or other geometric entities found in the image 500; features may be learnt based on a labeled data set with a machine learning algorithm, e.g., a neural network) with the augmented camera image using an objective function ([0052], optical attributes may include focal length, lens 240 distortion model parameters), and

— updating the parameters ([0057], the computer 110 may be programmed to perform the segmentation of the image 500 data based on an output of a neural network trained to detect multiple feature classes including at least an occlusion class) associated with the one or more network layers ([0058], a neural network typically includes a plurality of layers) based on the comparison ([0071], a rain drop may have a lower blocking coefficient blockage.sub.coeff compared to a blocking coefficient blockage.sub.coeff of an insect, bird byproduct, etc.; in one example, the computer 110 may be programmed to determine that an occlusion 510a, 510b exists at location coordinates x, y of the surface 250, upon determining that the blocking coefficient blockage.sub.coeff at the location coordinates x, y exceeds a specified threshold, e.g., 0.1).

11. Regarding Claim 10, Herman discloses the method according to claim 9, wherein the at least one further parameter is varied within one or more intervals associated with the blockage class ([0055], FIG. 6; the computer 110 may be programmed to detect feature(s) in the received image 500 data and classify the features based on specified feature classes) assigned to the blockage mask ([0029], selecting a cleaning plan for the surface based on the detected occlusion).

12. Regarding Claim 11, Herman discloses a control unit comprising a computer, wherein the control unit is configured to carry out a method according to claim 1 (Fig. 1; [0055], the computer 110 may be programmed to detect feature(s)).

13.
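The claim-8 training method, as mapped above, augments clean training images by superimposing a blockage mask with a stochastically chosen blocking degree and labels each image with the mask's class and degree. A rough sketch of that augmentation step follows; the rectangular mask shape, class names, and degree range are all hypothetical illustrations, not details from Herman or the application.

```python
import random

# Hypothetical blockage classes for the superimposed masks.
BLOCKAGE_CLASSES = ["droplet", "soiling", "condensation"]

def augment_with_blockage(image, rng):
    """Superimpose a rectangular blockage mask of stochastic size on a
    clean image (a list of pixel rows) and return the augmented image
    together with its (class, degree) label."""
    h, w = len(image), len(image[0])
    blockage_class = rng.choice(BLOCKAGE_CLASSES)
    degree = rng.uniform(0.05, 0.4)       # stochastic blocking degree
    side = degree ** 0.5                  # square-ish mask, area ~= degree
    mask_h = max(1, int(h * side))
    mask_w = max(1, int(w * side))
    top = rng.randrange(0, h - mask_h + 1)
    left = rng.randrange(0, w - mask_w + 1)
    augmented = [row[:] for row in image]  # keep the clean original intact
    for r in range(top, top + mask_h):
        for c in range(left, left + mask_w):
            augmented[r][c] = 0            # opaque blockage pixel
    label = {"blockage_class": blockage_class, "blocking_degree": degree}
    return augmented, label

rng = random.Random(0)                     # seeded for reproducibility
clean = [[255] * 64 for _ in range(64)]    # synthetic all-white image
augmented, label = augment_with_blockage(clean, rng)
assert any(0 in row for row in augmented)  # some pixels are now blocked
assert label["blockage_class"] in BLOCKAGE_CLASSES
```

The (image, label) pairs produced this way would then feed the forward pass, objective-function comparison, and parameter update that the remaining claim limitations recite.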
Regarding Claim 12, Herman discloses a camera sensor system comprising the at least one camera sensor ([0052], camera sensor 130), the cleaning device associated with the camera sensor, and a control unit according to claim 11 ([0053], select a cleaning plan for the surface 250 based on the detected occlusion 510a, 510b, map data, and/or vehicle 100 route data).

14. Regarding Claim 13, Herman discloses a vehicle comprising the camera sensor system ([0052], camera sensor 130) according to claim 12.

15. Regarding Claim 14, Herman discloses a computer program comprising computer program instructions which, when executed by a computer ([0084], a computer readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein), control the computer to carry out the method according to claim 1 ([0085], a computer readable medium includes any medium that participates in providing data (e.g., instructions), which may be read by a computer).

16. Regarding Claim 15, Herman discloses a non-transient storage medium comprising the computer program ([0084], computer 105 is generally a collection of data stored on a computer readable medium, such as a storage medium, a random-access memory) according to claim 14.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

17.
Claim(s) 6 is rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Application 2020/0094784 to Herman et al. (hereinafter Herman) in view of U.S. Patent Application 2020/0139936 to Yamauchi et al. (hereinafter Yamauchi).

18. Regarding Claim 6, Herman discloses the method according to claim 5, wherein the cleaning strategy is determined ([0062], programmed to select a cleaning plan). However, Herman does not explicitly disclose that the cleaning strategy is determined additionally in dependence of at least one cleaning device state information which describes a current state of the cleaning device.

Yamauchi teaches determining cleaning additionally in dependence of at least one cleaning device state information which describes a current state of the cleaning device ([0090], a vehicle cleaning system that detects a level of washer fluid and that cleans when there is a sufficient level of washer fluid in a washer tank; see applicant's specification [0030], "The cleaning device state information may be for instance a level of a cleaning fluid in a container").

It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to incorporate the level sensor taught by Yamauchi into the cleaning plan taught by Herman in order to determine the fluid level as part of the current state of the cleaning device, thereby helping alert the user to the amount of fluid available in the reservoir of the cleaning device.

Allowable Subject Matter

Claim 9 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to OMER KHALID, whose telephone number is (571) 270-5997. The examiner can normally be reached Monday-Friday, 9am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, John Miller, can be reached at (571) 272-7353. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/OMER KHALID/
Examiner, Art Unit 2422

/JOHN W MILLER/
Supervisory Patent Examiner, Art Unit 2422