
Patent Application 18369502 - Devices Methods and Graphical User Interfaces - Rejection

From WikiPatents


Title: Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments

Application Information

  • Invention Title: Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
  • Application Number: 18369502
  • Submission Date: 2025-05-14
  • Effective Filing Date: 2023-09-18
  • Filing Date: 2023-09-18
  • National Class: 345
  • National Sub-Class: 633000
  • Examiner Employee Number: 90162
  • Art Unit: 2616
  • Tech Center: 2600

Rejection Summary

  • 102 Rejections: 0
  • 103 Rejections: 1

Cited Patents

The following references were cited in the rejection:

  • U.S. Patent Application Publication 2022/0397988 A1 (WHELAN et al.)

Office Action Text



    DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment
The amendment filed on November 27, 2023 has been entered.
In view of the amendment to the specification, the clean substitute specification is acknowledged.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows: 
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 9 is rejected under 35 U.S.C. 101 as being directed to a nonstatutory computer-readable medium.

Claim 9 is directed to non-statutory subject matter in the form of a “computer-readable storage medium”.  The claims fall outside the scope of patent-eligible subject matter at least because the claimed computer-readable storage medium is broad enough to encompass transitory embodiments (e.g., one of ordinary skill in the art could reasonably be expected to interpret the computer-readable storage medium as a carrier wave onto which instructions could be coded). The broadest reasonable interpretation of a claim drawn to a computer readable medium typically covers forms of non-transitory tangible media and transitory propagating signals per se in view of the ordinary and customary meaning of computer readable media. See MPEP 2106. When the broadest reasonable interpretation of a claim covers a signal per se, the claim must be rejected under 35 U.S.C. § 101 as covering non-statutory subject matter.
See also the Official Gazette Notice 1351 OG 212 February 23, 2010 “Subject Matter Eligibility of Computer Readable Media” which states in relevant part “[i]n an effort to assist the patent community in overcoming a rejection or potential rejection under 35 U.S.C. § 101 in this situation, the USPTO suggests the following approach.  A claim drawn to such a computer readable media that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments to avoid a rejection under 35 U.S.C. § 101 by adding the limitation ‘non-transitory’ to the claim.”   
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b)  CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.


The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.


Claims 6-8 and 15-17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA, the applicant) regards as the invention.

Claims 6, 7, and 8 depend upon independent claim 1. Claim 1 recites limitations for detecting a first input. However, claim 6 recites limitations for detecting a third input; claim 7 recites limitations for detecting a fourth input; and claim 8 recites limitations for detecting a fifth input. The issue is that a person of ordinary skill in the art reading the specification is not able to understand how to detect the third, fourth, and fifth inputs without a second input being detected. Therefore, the examiner deems the claims indefinite as they fail to particularly point out and distinctly claim what Applicant regards as the invention. Accordingly, the claims are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph.

Claims 15, 16, and 17 depend upon independent claim 10. Claim 10 recites limitations for detecting a first input. However, claim 15 recites limitations for detecting a third input; claim 16 recites limitations for detecting a fourth input; and claim 17 recites limitations for detecting a fifth input. The issue is that a person of ordinary skill in the art reading the specification is not able to understand how to detect the third, fourth, and fifth inputs without a second input being detected. Therefore, the examiner deems the claims indefinite as they fail to particularly point out and distinctly claim what Applicant regards as the invention. Accordingly, the claims are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.  Patentability shall not be negated by the manner in which the invention was made.


Claims 1, 9 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over WHELAN et al (U.S. Patent Application Publication 2022/0397988 A1).

	Regarding claim 1, WHELAN discloses a method, comprising:
at a computer system (Paragraph [0044], FIG. 2 shows a block diagram of a system 200 of a touch device that is configured for improvements and enhancements in UIs, e.g., for pen-specific user interface controls. System 200 is an embodiment of touch device 104 in system 100A of FIG. 1A ...) that includes or is in communication with a display generation component (Paragraph [0045], system 200 may also include UIs and menus 218) and one or more input devices (Paragraph [0045], an input/output (I/O) interface(s) 222; paragraph [0048], I/O interface(s) 222 may comprise hardware and/or software and may support any number of input devices ...): 
while displaying via the display generation component an application user interface (Paragraph [0049], UIs and menus 218 may include, for example, user interfaces and menus displayed to users via output devices described herein that may be interacted with via input devices described herein. UIs and menus 218 may comprise portions of any types of software applications, such as apps 228 ... 3-D software/virtual environments ...), detecting a first input to an input device of the one or more input devices (Paragraph [0052], Input detector 210 may be configured to receive inputs from one or more input interfaces of I/O interface(s) 222 ...); and 
in response to detecting the first input to the input device (Paragraph [0052], Input detector 210 may be configured to determine characterization information or characteristics of the contact instrument interaction with the touch interface, and to identify commands associated with the input for pen-specific user interface controls, such as when a specific activator of a touch pen is activated (e.g., tail activator 116 described above for FIGS. 1A-1C). As an example, one or more of communication signal 114 may be received by a digitizer of system 200, such as digitizer 106, that are indicative of tail activator 116 being activated; paragraph [0065], turning now to FIGS. 4A, 4B, 4C, 4D, and 4E, diagrams of user interfaces with pen menus for pen-specific user interface controls are shown, in accordance with example embodiments. The UIs of FIGS. 4A, 4B, 4C, 4D, and/or 4E may be embodiments of system 200 and UIs and menus 218 of FIG. 2, and may be presented/displayed via one or more displays described herein, such as display 130 and/or display 132 of FIG. 1D, which may be included in system 200 of FIG. 2): 
in accordance with a determination that the application user interface is in a first mode of display (FIG. 4D; paragraph [0073], application UI 424 that is displayed in display 132 for another executing software application. Thus, the top display 132 of FIG. 4D can be interpreted as a first mode of display), wherein the first mode of display comprises an immersive mode in which only content of the application user interface is displayed (Paragraph [0073], the display 132 only displays content of application UI 424; paragraph [0049], UIs and menus 218 may comprise portions of any types of software applications, such as 3-D software/virtual environments), displaying via the display generation component the application user interface in a second mode of display (Paragraph [0074], When tail activator 116 of a touch pen is activated in the illustrated embodiment for FIG. 4D, touch pen menu 416 is caused to be displayed on display 132 above application UI 424. Thus, the bottom display 132 of FIG. 4D can be interpreted as a second mode of display), wherein the second mode of display comprises a non-immersive mode in which content of the application user interface (Paragraph [0074], the bottom display 132 displays content of application UI 424) and other content are concurrently displayed (Paragraph [0074], the bottom display 132 displays content of touch pen menu 416); and 
in accordance with a determination that the application user interface is in the second mode of display (As shown in FIG. 4D, the bottom display 132), replacing display of at least a portion of the application user interface by displaying a home menu user interface via the display generation component (Paragraph [0049], UIs and menus 218 may include, for example, user interfaces and menus displayed to users via output devices described herein that may be interacted with via input devices described herein ... UIs and menus 218 may include a touch pen menu 220 having pen-specific user interface controls, e.g., selectable controls to launch pen-specific applications; paragraph [0074], touch pen menu 416 is caused to be displayed on display 132 above application UI 424).
It is noted that WHELAN does not use the terms “immersive” and “non-immersive” for the first and second display modes. However, the claim describes the “immersive” mode as displaying only content of the application user interface and the “non-immersive” mode as displaying content of the application user interface together with other content. WHELAN describes that the top display 132 of FIG. 4D displays only “application UI 424” and the bottom display 132 of FIG. 4D displays “application UI 424” and “touch pen menu 416”. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention that WHELAN discloses the “immersive” and “non-immersive” modes as specified in claim 1.
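The conditional behavior recited in claim 1 can be summarized as a simple state machine: a first input in immersive mode switches the UI to non-immersive mode, while the same input in non-immersive mode replaces part of the application UI with a home menu. The following minimal Python sketch is purely illustrative; all names (DisplayMode, AppUI, handle_input) are hypothetical and appear in neither the application nor WHELAN, which disclose no source code.

```python
from enum import Enum, auto

class DisplayMode(Enum):
    IMMERSIVE = auto()      # only application content is displayed
    NON_IMMERSIVE = auto()  # application content and other content shown concurrently

class AppUI:
    """Hypothetical model of the display-mode logic recited in claim 1."""

    def __init__(self, mode=DisplayMode.IMMERSIVE):
        self.mode = mode
        self.home_menu_visible = False

    def handle_input(self):
        """Respond to the 'first input' of claim 1."""
        if self.mode is DisplayMode.IMMERSIVE:
            # Immersive -> non-immersive: application content is now
            # displayed concurrently with other content.
            self.mode = DisplayMode.NON_IMMERSIVE
        else:
            # Already non-immersive: replace at least a portion of the
            # application UI with a home menu user interface.
            self.home_menu_visible = True
```

On this reading, the same input is interpreted differently depending on the current display mode, which is the crux of the claimed conditional limitations.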

	Regarding claim 9, WHELAN discloses a computer-readable storage medium storing one or more programs configured to be executed by one or more processors of a computer system (Paragraph [0044], FIG. 2 shows a block diagram of a system 200 of a touch device that is configured for improvements and enhancements in UIs, e.g., for pen-specific user interface controls. System 200 is an embodiment of touch device 104 in system 100A of FIG. 1A ...; paragraphs [0045]-[0046], system 200 and computing device 202 include one or more of a processor (“processor”) 204, one or more of a memory and/or other physical storage device (“memory”) 206 ... Memory 206 is configured to store such computer program instructions/code ...) that is in communication with a display generation component (Paragraph [0045], system 200 may also include UIs and menus 218) and one or more input devices (Paragraph [0045], an input/output (I/O) interface(s) 222; paragraph [0048], I/O interface(s) 222 may comprise hardware and/or software and may support any number of input devices ...), the one or more programs including instructions (Paragraph [0051], ... In software implementations, one or more components of menu manager 208 may be stored in memory 206 and are executed by processor 204) for:
while displaying via the display generation component an application user interface (Paragraph [0049], UIs and menus 218 may include, for example, user interfaces and menus displayed to users via output devices described herein that may be interacted with via input devices described herein. UIs and menus 218 may comprise portions of any types of software applications, such as apps 228 ... 3-D software/virtual environments ...), detecting a first input to an input device of the one or more input devices (Paragraph [0052], Input detector 210 may be configured to receive inputs from one or more input interfaces of I/O interface(s) 222 ...); and 
in response to detecting the first input to the input device (Paragraph [0052], Input detector 210 may be configured to determine characterization information or characteristics of the contact instrument interaction with the touch interface, and to identify commands associated with the input for pen-specific user interface controls, such as when a specific activator of a touch pen is activated (e.g., tail activator 116 described above for FIGS. 1A-1C). As an example, one or more of communication signal 114 may be received by a digitizer of system 200, such as digitizer 106, that are indicative of tail activator 116 being activated; paragraph [0065], turning now to FIGS. 4A, 4B, 4C, 4D, and 4E, diagrams of user interfaces with pen menus for pen-specific user interface controls are shown, in accordance with example embodiments. The UIs of FIGS. 4A, 4B, 4C, 4D, and/or 4E may be embodiments of system 200 and UIs and menus 218 of FIG. 2, and may be presented/displayed via one or more displays described herein, such as display 130 and/or display 132 of FIG. 1D, which may be included in system 200 of FIG. 2): 
in accordance with a determination that the application user interface is in a first mode of display (FIG. 4D; paragraph [0073], application UI 424 that is displayed in display 132 for another executing software application. Thus, the top display 132 of FIG. 4D can be interpreted as a first mode of display), wherein the first mode of display comprises an immersive mode in which only content of the application user interface is displayed (Paragraph [0073], the display 132 only displays content of application UI 424; paragraph [0049], UIs and menus 218 may comprise portions of any types of software applications, such as 3-D software/virtual environments), displaying via the display generation component the application user interface in a second mode of display (Paragraph [0074], When tail activator 116 of a touch pen is activated in the illustrated embodiment for FIG. 4D, touch pen menu 416 is caused to be displayed on display 132 above application UI 424. Thus, the bottom display 132 of FIG. 4D can be interpreted as a second mode of display), wherein the second mode of display comprises a non-immersive mode in which content of the application user interface (Paragraph [0074], the bottom display 132 displays content of application UI 424) and other content are concurrently displayed (Paragraph [0074], the bottom display 132 displays content of touch pen menu 416); and 
in accordance with a determination that the application user interface is in the second mode of display (As shown in FIG. 4D, the bottom display 132), replacing display of at least a portion of the application user interface by displaying a home menu user interface via the display generation component (Paragraph [0049], UIs and menus 218 may include, for example, user interfaces and menus displayed to users via output devices described herein that may be interacted with via input devices described herein ... UIs and menus 218 may include a touch pen menu 220 having pen-specific user interface controls, e.g., selectable controls to launch pen-specific applications; paragraph [0074], touch pen menu 416 is caused to be displayed on display 132 above application UI 424).
It is noted that WHELAN does not use the terms “immersive” and “non-immersive” for the first and second display modes. However, the claim describes the “immersive” mode as displaying only content of the application user interface and the “non-immersive” mode as displaying content of the application user interface together with other content. WHELAN describes that the top display 132 of FIG. 4D displays only “application UI 424” and the bottom display 132 of FIG. 4D displays “application UI 424” and “touch pen menu 416”. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention that WHELAN discloses the “immersive” and “non-immersive” modes as specified in claim 9.

	Regarding claim 10, WHELAN discloses a computer system (Paragraph [0044], FIG. 2 shows a block diagram of a system 200 of a touch device that is configured for improvements and enhancements in UIs, e.g., for pen-specific user interface controls. System 200 is an embodiment of touch device 104 in system 100A of FIG. 1A ...) that is in communication with a display generation component (Paragraph [0045], system 200 may also include UIs and menus 218) and one or more input devices (Paragraph [0045], an input/output (I/O) interface(s) 222; paragraph [0048], I/O interface(s) 222 may comprise hardware and/or software and may support any number of input devices ...), the computer system comprising:
one or more processors (Paragraph [0045], system 200 and computing device 202 include one or more of a processor (“processor”) 204); and 
memory (Paragraph [0045], system 200 and computing device 202 include one or more of a memory and/or other physical storage device (“memory”) 206,) storing one or more programs (Paragraph [0046], memory 206 is configured to store such computer program instructions/code, as well as to store other information and data described in this disclosure including, without limitation, UIs and menus 218, profile and state information 224, apps 228, etc.) configured to be executed by the one or more processors, the one or more programs including instructions (Paragraph [0051], one or more components of menu manager 208 may be stored in memory 206 and are executed by processor 204) for: 
while displaying via the display generation component an application user interface (Paragraph [0049], UIs and menus 218 may include, for example, user interfaces and menus displayed to users via output devices described herein that may be interacted with via input devices described herein. UIs and menus 218 may comprise portions of any types of software applications, such as apps 228 ... 3-D software/virtual environments ...), detecting a first input to an input device of the one or more input devices (Paragraph [0052], Input detector 210 may be configured to receive inputs from one or more input interfaces of I/O interface(s) 222 ...); and 
in response to detecting the first input to the input device (Paragraph [0052], Input detector 210 may be configured to determine characterization information or characteristics of the contact instrument interaction with the touch interface, and to identify commands associated with the input for pen-specific user interface controls, such as when a specific activator of a touch pen is activated (e.g., tail activator 116 described above for FIGS. 1A-1C). As an example, one or more of communication signal 114 may be received by a digitizer of system 200, such as digitizer 106, that are indicative of tail activator 116 being activated; paragraph [0065], turning now to FIGS. 4A, 4B, 4C, 4D, and 4E, diagrams of user interfaces with pen menus for pen-specific user interface controls are shown, in accordance with example embodiments. The UIs of FIGS. 4A, 4B, 4C, 4D, and/or 4E may be embodiments of system 200 and UIs and menus 218 of FIG. 2, and may be presented/displayed via one or more displays described herein, such as display 130 and/or display 132 of FIG. 1D, which may be included in system 200 of FIG. 2): 
in accordance with a determination that the application user interface is in a first mode of display (FIG. 4D; paragraph [0073], application UI 424 that is displayed in display 132 for another executing software application. Thus, the top display 132 of FIG. 4D can be interpreted as a first mode of display), wherein the first mode of display comprises an immersive mode in which only content of the application user interface is displayed (Paragraph [0073], the display 132 only displays content of application UI 424; paragraph [0049], UIs and menus 218 may comprise portions of any types of software applications, such as 3-D software/virtual environments), displaying via the display generation component the application user interface in a second mode of display (Paragraph [0074], When tail activator 116 of a touch pen is activated in the illustrated embodiment for FIG. 4D, touch pen menu 416 is caused to be displayed on display 132 above application UI 424. Thus, the bottom display 132 of FIG. 4D can be interpreted as a second mode of display), wherein the second mode of display comprises a non-immersive mode in which content of the application user interface (Paragraph [0074], the bottom display 132 displays content of application UI 424) and other content are concurrently displayed (Paragraph [0074], the bottom display 132 displays content of touch pen menu 416); and 
in accordance with a determination that the application user interface is in the second mode of display (As shown in FIG. 4D, the bottom display 132), replacing display of at least a portion of the application user interface by displaying a home menu user interface via the display generation component (Paragraph [0049], UIs and menus 218 may include, for example, user interfaces and menus displayed to users via output devices described herein that may be interacted with via input devices described herein ... UIs and menus 218 may include a touch pen menu 220 having pen-specific user interface controls, e.g., selectable controls to launch pen-specific applications; paragraph [0074], touch pen menu 416 is caused to be displayed on display 132 above application UI 424).
It is noted that WHELAN does not use the terms “immersive” and “non-immersive” for the first and second display modes. However, the claim describes the “immersive” mode as displaying only content of the application user interface and the “non-immersive” mode as displaying content of the application user interface together with other content. WHELAN describes that the top display 132 of FIG. 4D displays only “application UI 424” and the bottom display 132 of FIG. 4D displays “application UI 424” and “touch pen menu 416”. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention that WHELAN discloses the “immersive” and “non-immersive” modes as specified in claim 10.

Allowable Subject Matter
Claims 2-5 and 11-14 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Dependent claims 2-3 depend upon independent claim 1 and recite additional limitations directed to the “non-immersive” mode.

Dependent claims 11-12 depend upon independent claim 10 and recite additional limitations directed to the “non-immersive” mode.
However, the search failed to show the obviousness of the claims as a whole. None of the prior art of record, alone or in combination, teaches or suggests the limitations recited in claims 2-3 and 11-12.
Dependent claims 4-5 and 13-14 depend from dependent claims 3 and 12, respectively, and contain allowable subject matter for the same reasons stated above.

In addition, the examiner identified the following closest prior art references:
The prior art reference RUDMAN et al (U.S. Patent Application Publication 2024/0119682 A1) discloses systems and methods for enabling user interface display mode toggling. More specifically, the method presents information in a first display region; presents a second display region beyond the predefined boundaries of the first display region via a wearable extended reality appliance; provides a control for altering a location of the user interface as displayed in a first mode or in a second mode; enables toggling between the first mode and the second mode via the control (As shown in FIG. 9). 
The prior art reference Jagannathan et al (U.S. Patent Application Publication 2022/0319108 A1) discloses a method of provisioning a virtual experience of a building based on user preference. More specifically, the method receives an identity data associated with an identity of a user; retrieves a user profile data based on the identity data; analyzes the user profile data using a machine learning model; determines one preference data based on the analyzing; identifies one virtual utility object based on the at least one preference data; generates an interactive 3D model data comprising the at least one virtual utility object; transmits the interactive 3D model data to a user device configured to present the interactive 3D model data; receives a reaction data from the user device and updates the interactive 3D model data based on the reaction data (As shown in FIG. 18). 
The prior art reference PARK et al (U.S. Patent Application Publication 2017/0115728 A1) discloses a system including a mobile terminal capable of controlling a head-mounted display (HMD), and a method of controlling the same. More specifically, the method outputs a preset first region of the virtual space on the display; outputs a second region different from the first region on the display; and outputs the first region on the display when the main body of the mobile terminal is moved in a preset movement while the second region is output (As shown in FIG. 6).

Examiner’s Comment
Claims 6-8 and 15-17 have no art rejection but are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph. A final determination of patentability, after further search, will be made upon resolution of the above 35 U.S.C. 112 rejection.

Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Xilin Guo whose telephone number is (571)272-5786. The examiner can normally be reached Monday - Friday 9:00 AM-5:30 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Daniel Hajnik can be reached at 571-272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.





/XILIN GUO/Primary Examiner, Art Unit 2616

