Patent Application 18610781 - Graphically Adaptive Vehicle User Interface - Rejection
Title: Graphically Adaptive Vehicle User Interface Apparatus and Method
Application Information
- Invention Title: Graphically Adaptive Vehicle User Interface Apparatus and Method
- Application Number: 18610781
- Submission Date: 2025-04-07
- Effective Filing Date: 2024-03-20
- Filing Date: 2024-03-20
- National Class: 345
- National Sub-Class: 156000
- Examiner Employee Number: 73943
- Art Unit: 2619
- Tech Center: 2600
Rejection Summary
- 102 Rejections: 0
- 103 Rejections: 1
Cited Patents
The following patents were cited in the rejection:
- WATANABE et al. (JP 2023006618)
- Faulkner et al. (US 2021/0096726)
Office Action Text
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over WATANABE et al. (JP 2023006618) in view of Faulkner et al. (US 2021/0096726).

As to claims 1-2, 14, and 17-20: Watanabe discloses a user interface (Figs. 27-29), a method for operating it, and a computer program in a memory (Fig. 3C, 1108, 1109, 1170) comprising instructions executed by a processor (Fig. 3C, image control units 1110, 1160), for a vehicle (see paragraph 3 of the BEST MODE FOR CARRYING OUT THE INVENTION), comprising: an imaging device (Fig. 3C, the spatial floating image display device 1000) configured to display a virtual image in an image plane (Figs. 3A, 3B, 3C, element 3) in an environment of the user interface (see Figs. 27-29); a sensing device configured to sense a user input in relation to the image plane (Fig. 3C, 1351); and a data processing device (Fig. 3C, 1110, 1160) adapted to control the imaging device and to obtain the user input from the sensing device (Fig. 3C, 1110, 1160). The imaging device (Fig. 3, 1000) comprises a display device (Figs. 3A, 3B, element 11; Fig. 3C, 1102) adapted to emit light in response to at least one control signal from the data processing device (Fig. 3C, 1160: "The image display unit 1102 is a display unit that modulates transmitted light and generates an image based on a video signal that is input under the control of the image control unit 1160"), and an optics device (Figs. 3A, 3B, elements 21, 101; Fig. 3C, 1104) configured to display the virtual image (Figs. 3A, 3B, 3C, element 3) based on the emitted light ("The light guide 1104 guides the light generated by the light source 1105"). The data processing device (Fig. 3C, image control units 1110, 1160) is adapted to control the display device to graphically adapt the virtual image (Figs. 27-29, element 3) in response to the user input (Figs. 27-29, virtual shadow 1510 display processing). Further, Watanabe discloses that graphically adapting the virtual image comprises at least one of the group consisting of a scaling of the virtual image, a distortion of the virtual image, and a displacement of the virtual image (see Figs. 27-29, virtual shadow 1510: "A virtual shadow 1510 simulating the shadow of the finger 210 formed by the light emitted from the virtual light source 1500 is displayed in the floating image 3. In the example of FIGS. 27-29, virtual shadow 1510 is displayed to the left of finger 210. This virtual shadow 1510 assists the user in performing a touch operation."). However, Watanabe does not specifically disclose distortion of the virtual image.
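For illustration only (no such code appears in Watanabe or Faulkner, and every function name and constant below is a hypothetical assumption), the distance-dependent shadow displacement described above, together with the threshold condition on the finger-to-plane distance discussed below for claims 3, 5, and 15, could be sketched as:

```python
# Hypothetical sketch: the virtual shadow's tip is displaced laterally from
# the fingertip by an amount that grows with the fingertip's distance from
# the floating-image plane, and the image is adapted once that distance
# falls below a threshold. Names and constants are illustrative assumptions.

def shadow_offset(finger_z: float, plane_z: float = 0.0, slope: float = 0.5) -> float:
    """Lateral offset dx of the virtual shadow's tip from the fingertip.

    dx shrinks to 0 as the finger reaches the image plane, so the user
    can judge the remaining distance (cf. dx1 > dx2 in Figs. 27-28).
    """
    dz = max(finger_z - plane_z, 0.0)  # fingertip depth in front of the plane
    return slope * dz


def adapt_virtual_image(finger_z: float, touch_eps: float = 0.002,
                        near_threshold: float = 0.03) -> str:
    """Threshold condition on the sensed finger-to-plane distance (meters).

    Returns which graphical adaptation would be applied at that distance.
    """
    if finger_z <= touch_eps:
        return "activate"   # touch registered on the image plane
    if finger_z <= near_threshold:
        return "highlight"  # near the plane: emphasize the target
    return "idle"           # far away: no adaptation
```

The monotonic offset and the two-level threshold are one plausible reading of the cited figures, not the references' actual implementation.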
Faulkner discloses graphically adapting the virtual image [0165, 0168, 0169, 0194, 0200], including simulating manipulation of a shape representative of a graphical user interface mechanism in response to a user input (Figs. 7G, 7O, 7P) [0185, 0193, 0207], comprising a distortion of the virtual image shape (Figs. 7G-7P; [0138, 0199, 0203, 0285, 0292]). It would have been obvious to one of ordinary skill in the art at the time of filing to apply the distortion of the virtual image, as taught by Faulkner, in the device of Watanabe, since this helps the user quickly reorient himself/herself when he/she feels unsure about his/her body position in the physical environment, without completely exiting the immersive experience [0204].

As to claims 3, 5, and 15: Watanabe further discloses that the data processing device is adapted to graphically adapt the virtual image (virtual shadow 1510, Figs. 27-29) in dependence on a threshold condition relating to a distance between the image plane and the user input (Figs. 27A (dx1), 27B (dz1), 28A (dx2), 28B (dz2)).

As to claims 4, 7, and 16: Watanabe further discloses that the data processing device is adapted to graphically adapt the virtual image in real time as the user input is sensed ("In the state of FIG. 27(B), compared with the states of FIG. 28(B) and FIG. 29(B), the finger 210 is furthest away. Therefore, in FIG. 27A, the tip of the virtual shadow 1510 is formed at the position farthest in the horizontal direction from the first button BUT1 to be touched, compared to the states of FIGS. 28A and 29A. Therefore, in FIG. 27A, the horizontal distance between the tip of the finger 210 and the tip of the virtual shadow 1510 when the display surface 3a of the floating image 3 is viewed from the front is the largest compared to the states of FIGS. 28(A) and 29(A). In FIG. 27A, the distance between the tip of the finger 210 and the tip of the virtual shadow 1510 in the horizontal direction of the display surface 3a of the floating image 3 is dx1").

As to claim 6: Watanabe further discloses that the data processing device is adapted to control the display device to graphically adapt the virtual image corresponding to a lateral distance between the image plane and the user input (see Fig. 5, and "it is possible to detect not only the coordinates in the planar direction of the object but also the coordinates in the depth direction and the direction and speed of movement of the object").

As to claims 8-11: Watanabe further discloses that the sensing device is adapted to sense a position of one or more fingers, and that the data processing device interprets the position and the movement as the user input ("it is possible to detect not only the coordinates in the planar direction of the object but also the coordinates in the depth direction and the direction and speed of movement of the object").

As to claim 12: Watanabe further discloses that the virtual image is context-related (see Figs. 27-29 and 37-39).

As to claim 13: Watanabe further discloses that the virtual image is dynamic or time-dependent ("FIG. 27 shows the state at the first point in time when the user tries to touch the first button BUT1 on the display surface 3a of the floating image 3 with the finger 210, and FIG. 29 shows the state at the third point in time when the finger 210 touches the first button BUT1 on the display surface 3a of the floating image 3 in space").

Response to Arguments

Applicant's arguments with respect to the claims, filed 01/02/2025, have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action.
Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to RICARDO OSORIO, whose telephone number is (571) 272-7676. The examiner can normally be reached M-F, 9 AM-5:30 PM.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, LunYi Lao, can be reached at 571-272-7671. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RICARDO OSORIO/
Primary Examiner, Art Unit 2619