Patent Application 18029768 - Rejection

Title: POSITION ESTIMATION SYSTEM, POSITION ESTIMATION METHOD, AND COMPUTER PROGRAM

Application Information

  • Invention Title: POSITION ESTIMATION SYSTEM, POSITION ESTIMATION METHOD, AND COMPUTER PROGRAM
  • Application Number: 18029768
  • Submission Date: 2025-05-16
  • Effective Filing Date: 2023-03-31
  • Filing Date: 2023-03-31
  • National Class: 382
  • National Sub-Class: 103000
  • Examiner Employee Number: 91293
  • Art Unit: 2669
  • Tech Center: 2600

Rejection Summary

  • 102 Rejections: 1
  • 103 Rejections: 4

Cited Patents

The following patents were cited in the rejection:

  • Vlaskamp (US 2022/0391013 A1)
  • Miettinen et al. (US 2019/0138094 A1)
  • Greenwald (US 9,898,082 B1)
  • Lam et al. (US 11,579,451 B1)

Office Action Text


    DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-14 are currently pending in U.S. Patent Application No. 18/029,768 and an Office action on the merits follows.

Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art.  The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
 (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; 
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as "configured to" or "so that"; and 
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. 
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means/step”, but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use the generic placeholders “at least one processor that is configured to execute instructions” and “reflecting unit”, which are coupled with functional language without reciting sufficient structure to perform the recited function (see MPEP 2181).  Example limitations include, in particular, the “at least one processor” configured to perform the functional language of the entire computer-implemented method, and the “reflecting unit”, for claims 1-14.  With reference to MPEP 2181, subsection (I)(A):
The following is a list of non-structural generic placeholders that may invoke 35 U.S.C 112(f):  "mechanism for," "module for," "device for," "unit for," "component for," "element for," "member for," "apparatus for," "machine for," or "system for." Welker Bearing Co., v. PHD, Inc., 550 F.3d 1090, 1096, 89 USPQ2d 1289, 1293-94 (Fed. Cir. 2008); Mass. Inst. of Tech. v. Abacus Software, 462 F.3d 1344, 1354, 80 USPQ2d 1225, 1228 (Fed. Cir. 2006); Personalized Media, 161 F.3d at 704, 48 USPQ2d at 1886–87; Mas-Hamilton Group v. LaGard, Inc., 156 F.3d 1206, 1214-1215, 48 USPQ2d 1010, 1017 (Fed. Cir. 1998). Note that there is no fixed list of generic placeholders that always result in 35 U.S.C 112(f) interpretation, and likewise there is no fixed list of words that always avoid 35 U.S.C 112(f) interpretation. Every case will turn on its own unique set of facts.

With reference to MPEP 2181 sub-section (II)(B) Computer-Implemented Means-Plus-Function Limitations:
For a computer-implemented 35 U.S.C 112(f) claim limitation, the specification must disclose an algorithm for performing the claimed specific computer function, or else the claim is indefinite under 35 U.S.C 112(b). See Net MoneyIN, Inc. v. Verisign. Inc., 545 F.3d 1359, 1367, 88 USPQ2d 1751, 1757 (Fed. Cir. 2008). See also In re Aoyama, 656 F.3d 1293, 1297, 99 USPQ2d 1936, 1939 (Fed. Cir. 2011) ("[W]hen the disclosed structure is a computer programmed to carry out an algorithm, ‘the disclosed structure is not the general purpose computer, but rather that special purpose computer programmed to perform the disclosed algorithm.’") (quoting WMS Gaming, Inc. v. Int’l Game Tech., 184 F.3d 1339, 1349, 51 USPQ2d 1385, 1391 (Fed. Cir. 1999)).

In cases involving a special purpose computer-implemented means-plus-function limitation, the Federal Circuit has consistently required that the structure be more than simply a general purpose computer or microprocessor and that the specification must disclose an algorithm for performing the claimed function. See, e.g., Noah Systems Inc. v. Intuit Inc., 675 F.3d 1302, 1312, 102 USPQ2d 1410, 1417 (Fed. Cir. 2012); Aristocrat, 521 F.3d at 1333, 86 USPQ2d at 1239.

Because one or more claim limitation(s) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may:  (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure (to include algorithmic structure) to perform the claimed function – see also Computer-Implemented Means-Plus-Function limitations as described in MPEP 2181(II)(B); stated differently, even housing the modules within the structure of a generic processor/memory combination would not be sufficient structure at prong C – the Federal Circuit has stated that "a microprocessor can serve as structure for a computer-implemented function only where the claimed function is ‘coextensive’ with a microprocessor itself." EON Corp. IP Holdings LLC v. AT&T Mobility LLC, 785 F.3d 616, 622, 114 USPQ2d 1711, 1714 (Fed. Cir. 2015), citing In re Katz Interactive Call Processing Patent Litigation, 639 F.3d 1303, 1316, 97 USPQ2d 1737, 1747 (Fed. Cir. 2011). "‘It is only in the rare circumstances where any general-purpose computer without any special programming can perform the function that an algorithm need not be disclosed.’" EON Corp., 785 F.3d at 621, 114 USPQ2d at 1714, quoting Ergo Licensing, LLC v. CareFusion 303, Inc., 673 F.3d 1361, 1365, 102 USPQ2d 1122, 1125 (Fed. Cir. 2012).); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.


Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b)  CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.


The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.


Claim(s) 5-6 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 5, line 4, recites the limitation “of the images for estimation”.  There is insufficient antecedent basis for this limitation in the claim.  Claim 1 recites the acquisition of only a single image (“an image”), and while the language in question recites “a plurality” preceding “the images”, that phrase merely quantifies a subset (or all) of “the images” being referenced; it does not itself establish antecedent basis for “the images”.  While the acquisition of additional images may not necessarily be precluded by the claims, the claims fail to establish basis for “the images” being referenced.  Dependent claim 6 is similarly rejected, as it inherits and fails to cure the deficiency identified above for intervening claim 5.


Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.


Claim(s) 1-14 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception, in particular an Abstract Idea falling under the (a) mathematical concepts category (mathematical relationships, formulas or equations, and/or calculations), not ‘integrated into a practical application’ at Prong Two of Step 2A and without ‘significantly more’ at Step 2B. 
Step 1:  The claim(s) in question are directed primarily to a computer-implemented method/process for determining a target/display pose (following the ‘Yes’ path at Step 1).  Corresponding system and non-transitory CRM claim(s) are congruent in scope, and while featuring generic computer hardware considered under the ‘apply it’ considerations of MPEP 2106.05(f), these claims are also understood to be directed to a machine, manufacture and/or composition of matter for the purposes of analysis at Step 1. (Step 1: Yes).
Step 2A, Prong One: This part of the eligibility analysis evaluates whether the claim recites a judicial exception. As explained in MPEP 2106.04, subsection II, a claim “recites” a judicial exception when the judicial exception is “set forth” or “described” in the claim.  Representative claim(s) 1/9/10 recite(s) –  “estimat[ing] a first relative position…” and “estimating a second relative position… on the basis of the first relative position”, drawn to the mathematical concepts Abstract Idea grouping.  The first step of “acquiring an image” is an ‘additional element’ that constitutes data gathering considered in accordance with MPEP 2106.05(g), to be discussed below at Prong Two.  Concerning the mathematical concepts Abstract Idea grouping, Applicant may see MPEP 2106.04(a)(2), and subsection (C) Mathematical Calculations more specifically.  Claim 5 further recites that integrated relative position calculation, similarly drawn to the mathematical concepts Abstract Idea grouping.  Reference may also be made to the July 17, 2024 PEG identifying various process steps as being drawn to the mathematical concepts Abstract Idea grouping – e.g. Example 47 claim 2 step(s) (b) (at page 7 describing the recited ‘discretizing’ as encompassing a mathematical concept e.g. rounding data values (that may also be performed mentally)) and (c) (interpreted so as to include mathematical calculations such as performing backpropagation and gradient descent algorithm(s)), in addition to Example 48 claim(s) 1 and 2 steps (b) (a ‘converting’ involving a mathematical operation using an STFT), (c) (an ‘embedding’ on the basis of an explicitly recited formula), and (e) (‘applying binary masks’) (see page 23 of the PEG – available at https://www.uspto.gov/sites/default/files/documents/2024-AI-SMEUpdateExamples47-49.pdf).  MPEP 2106.04(a)(2)(C):
A mathematical calculation is a mathematical operation (such as multiplication) or an act of calculating using mathematical methods to determine a variable or number, e.g., performing an arithmetic operation such as exponentiation. There is no particular word or set of words that indicates a claim recites a mathematical calculation. That is, a claim does not have to recite the word "calculating" in order to be considered a mathematical calculation. For example, a step of "determining" a variable or number using mathematical methods or "performing" a mathematical operation may also be considered mathematical calculations when the broadest reasonable interpretation of the claim in light of the specification encompasses a mathematical calculation.

  (Step 2A, Prong One: Yes).
Step 2A, Prong Two:  This part of the eligibility analysis evaluates whether the claim as a whole integrates the recited judicial exception into a practical application of the exception. This evaluation is performed by (1) identifying whether there are any ‘additional elements’ recited in the claim beyond the judicial exception, and (2) evaluating those additional elements individually and in combination to determine whether the claim as a whole integrates the exception into a practical application.  See MPEP 2106.04(d).  Examiner notes for consideration at Prong Two of 2A that MPEP 2106.05(a), (b), (c), and (e) generally concern limitations that are indicative of integration, whereas 2106.05(f), (g), and (h) generally concern limitations that are not indicative of integration.  As an additional note, ‘additional elements’ are generally limitations excluded from interpretation under the Abstract Idea groupings, and may comprise portions of limitations otherwise identified as falling under those Abstract Idea groupings of the 2019 PEG (e.g. any ‘determination’ that may be made mentally accompanied by the use of a neural network and/or generic computer hardware considered under the ‘apply it’ considerations of 2106.05(f)).  Any ‘providing’/outputting broadly, and ‘collection’ of data (i.e. image acquisition(s)), be they images for training any learning model and/or data/images visually observable/evaluated by a user/operator, also fail(s) to integrate at least in view of MPEP 2106.05(g) (extra-solution data gathering/output) and/or 2106.05(h) as ‘generally linking’ the exception to a field of use involving machine learning and/or imagery so acquired.  The same determination holds for dependent claims that serve to limit the collection of data/images (by means of what is collected based on recited conditions (e.g. claim(s) 4, 6-8)) and/or introduce limitations generally linking to a field of use (claims 2-3, 7, etc.).  None of the instant claims appear to explicitly/clearly capture/recite any disclosed improvement in technology (see MPEP 2106.05(a)), and any ‘additional elements’, even when considered in combination, fail to integrate at Prong Two of Step 2A accordingly.  The claim(s) in question rest with that final position/pose calculation/estimation, which in itself is not/cannot be a ‘practical application’.  Integration in view of subsection (a) requires an identification of the manner in which the improvement is achieved, to be explicitly and specifically (not at a high level of generality) recited in the claims, as ‘additional elements’ precluded from interpretation under any of the Abstract Idea groupings (since the improvement cannot be to the exception itself).  In view of MPEP 2106.05(f), the improvement cannot be merely/broadly automating what is otherwise the exception, nor can it be e.g. a ‘novel’ pose/position calculation per se.  With reference to MPEP 2106.05(a):
It is important to note, the judicial exception alone cannot provide the improvement. The improvement can be provided by one or more additional elements. See the discussion of Diamond v. Diehr, 450 U.S. 175, 187 and 191-92, 209 USPQ 1, 10 (1981))

  Even when viewed in combination, the ‘additional elements’ present do not integrate the recited judicial exception into a practical application (Step 2A, Prong Two: No), and the claims are directed to the judicial exception. (Revised Step 2A: Yes → Step 2B).
Step 2B:  This part of the eligibility analysis evaluates whether the claim as a whole amounts to ‘significantly more’ than the recited exception, i.e., whether any ‘additional element’, or combination of additional elements, adds an inventive concept to the claim.  The considerations of Step 2A Prong Two and Step 2B overlap, but differ in that 2B also requires considering whether the claims feature any “specific limitation(s) other than what is well-understood, routine, conventional activity in the field” (WURC) (MPEP 2106.05(d)).  Such a limitation, if specifically recited, must still be excluded from interpretation under any of the Abstract Idea groupings.  Step 2B further requires a re-evaluation of any additional elements drawn to extra-solution activity in Step 2A (e.g. gathering imagery) – however, no limitations appear directed to any novel collection per se (HMDs regularly acquire images of the eyes for gaze tracking purposes).  Limitations not indicative of an inventive concept/‘significantly more’ include those that are not specifically recited (instead recited at a high level of generality), those that are established as WURC, and/or those that are not ‘additional elements’ by nature of their analysis at Prong One (i.e. reciting the exception).  Reference may also be made to the 2024 PEG describing that an improvement/inventive concept (for ‘significantly more’ determination(s)) cannot be to the judicial exception itself.  The claim(s) in question recite little beyond those limitations recited at a high level of generality and falling under the mathematical concepts Abstract Idea grouping – as the claim(s) rest with a target/display/HMD pose determination itself, and would monopolize the exception accordingly.  (Step 2B: No).


Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

1.	Claims 1-2 and 8-12 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Vlaskamp (US 2022/0391013 A1).

As to claim 1, Vlaskamp discloses a position estimation system (Fig. 6, Fig. 7A, Fig. 18, etc.,) comprising:
at least one memory that is configured to store instructions ([0173], [0324], [0328], etc.,); and
at least one processor that is configured to execute the instructions (Fig. 6 612, Fig. 7A, [0006-0009] “The augmented reality system comprises a head-mounted display configured to present virtual content by outputting light to a user, an imaging device configured to capture images of eyes of the user, and at least one processor communicatively coupled to the head-mounted display and the imaging device”, etc.,) to
acquire an image for estimation including a target that is disposed out of an imaging range of an imaging unit, by imaging a reflecting unit that reflects a light and that is disposed in the imaging range of the imaging unit (camera 324/402 FoV oriented towards the eye and not the target/display 220, [0006-0009] “an imaging system configured to capture images of eyes of the user”, [0181] “engine 334, may be coupled to the eye cameras 324 via communication link 274, and be coupled to a projecting subsystem 318 (which may project light into user's eyes 302, 304 via a scanned laser arrangement in a manner similar to a retinal scanning display) via the communication link 272. The rendering engine 334 may also be in communication with other processing units such as, e.g., the sensor pose processor 332 and the image pose processor 336 via links 276 and 294 respectively”, [0216] “As shown in FIG. 6, head-mounted display system 600 may include an eye tracking system including a camera 324 that captures images of a user's eye 610. If desired, the eye tracking system may also include light sources 326a and 326b (such as light emitting diodes "LED"s). The light sources 326a and 326b may generate glints (i.e., reflections off of the user's eyes that appear in images of the eye captured by camera 324). The positions of the light sources 326a and 326b relative to the camera 324 may be known and, as a consequence, the positions of the glints within images captured by camera 324 may be used in tracking the user's eyes”, [0223] “Registration observer 620 may use information from eye tracking module 614 to identify whether the head-mounted unit 602 is properly positioned on a user's head. As an example, the eye tracking module 614 may provide eye location information, such as the positions of the centers of rotation of the user's eyes, indicative of the three-dimensional position of the user's eyes relative to camera 324 and head-mounted unit 602 and the eye tracking module 614 may use the location information to determine if display 220 is properly aligned in the user's field of view, or if the head-mounted unit 602 (or headset) has slipped or is otherwise misaligned with the user's eyes”; as an interpretation note the ‘reflecting unit’ is not limited solely to but instead may comprise/include user eye(s) in addition to one or more portions of the HMD/wearable device);
estimate a first relative position that is a position of the reflecting unit with respect to the imaging unit, on the basis of the image for estimation (eye/‘reflecting unit’ position/pose/axis information as determined by interocular axis estimation module 740, Fig. 18 1806, [0009] “The at least one processor is configured to determine an interocular axis of the user that extends between the user's left and right eyes based at least in part on one or more images captured by the imaging system”, [0182], [0213] “As the eye 500 moves to look toward different objects, the eye pose will change relative to the natural resting direction 520. The current eye pose may be determined with reference to an eye pose direction 524, which is a direction orthogonal to the surface of the eye (and centered in within the pupil 516) but oriented toward the object at which the eye is currently directed”, [0228], [0229-0231], [0239], etc.,); and
estimate a second relative position that is a position (HMD/display pose) of the target (HMD as a whole and/or display portions thereof to which one or more cameras 324/462 imaging the eyes 410/500/610 are fixed with a known/predetermined position relationship – see Applicant’s Specification at 0071-0072 wherein the ‘target’ is a display) with respect to the imaging unit, on the basis of the image for estimation and the first relative position ([0009] “determine an orientation of the HMD relative to the interocular axis of the user; and provide the user with feedback based on the determined orientation of the HMD relative to the interocular axis of the user”, [0223] “In general, registration observer 620 may be able to determine if head-mounted unit 602, in general, and displays 220, in particular, are properly positioned in front of the user's eyes. In other words, the registration observer 620 may determine if a left-eye display in display system 220 is appropriately aligned with the user's left eye and a right-eye display in display system 220 is appropriately aligned with the user's right eye. The registration observer 620 may determine if the head-mounted unit 602 is properly positioned by determining if the head-mounted unit 602 is positioned and oriented within a desired range of positions and/or orientations relative to the user's eyes”, [0288] “As an example, the registration observer 620 may use an inward-facing imaging system 462, which may include an eye tracking system, to determine how relevant parts of the wearable system 200 are spatially oriented with respect to the user and, in particular, the user's eyes, ears, mouth, or other parts that interface with the wearable system 200”, etc.,).
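
For illustration only, the following is a minimal sketch (in Python, using OpenCV) of a generic two-step relative-pose pipeline of the kind the claim recites: a first relative pose of the reflecting unit is recovered from detected marker points in the image for estimation, and a second relative pose of the target is composed from the first. The marker geometry, camera intrinsics, and reflector-to-target transform are all assumed inputs; this is not Vlaskamp's disclosed algorithm or Applicant's implementation.

```python
# Illustrative sketch only -- a generic two-step relative-pose estimation of
# the kind claim 1 recites, not the algorithm of any cited reference.
import numpy as np
import cv2

def estimate_first_relative_position(marker_pts_3d, marker_pts_2d,
                                     camera_matrix, dist_coeffs):
    """Pose of the reflecting unit with respect to the imaging unit,
    from known 3D marker geometry and its detected 2D image positions."""
    ok, rvec, tvec = cv2.solvePnP(marker_pts_3d, marker_pts_2d,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("first relative position estimation failed")
    rot, _ = cv2.Rodrigues(rvec)   # 3x3 rotation, reflector -> camera frame
    return rot, tvec

def estimate_second_relative_position(rot_cr, t_cr, rot_rt, t_rt):
    """Pose of the target with respect to the imaging unit, composed from the
    first relative pose (reflector -> camera) and a target -> reflector
    transform recovered from the reflected image of the target."""
    rot_ct = rot_cr @ rot_rt              # compose rotations
    t_ct = rot_cr @ t_rt + t_cr           # compose translations
    return rot_ct, t_ct
```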

As to claim 2, Vlaskamp discloses the system of claim 1.
Vlaskamp further discloses the system wherein a marker for detecting the reflecting unit from the image for estimation is attached to the reflecting unit (Figs. 1 and 3, IR sources 326 are attached to frame 230, and wherein IR glints serve as markers for detecting eye pose information, [0177] “With continued reference to FIG. 3, a pair of scanned-laser shaped-wavefront (e.g., for depth) light projector modules with display mirrors and optics configured to project light 338 into the eyes 302, 304 are shown. The depicted view also shows two miniature infrared cameras 324 paired with infrared light sources 326 (such as light emitting diodes "LED"s), which are configured to be able to track the eyes 302, 304 of the user to support rendering and user input. The cameras 324 may be part of the inward-facing imaging system 462 shown in FIG. 4.”, [0216] “In yet other embodiments, there may be one or more cameras 324 and one or more light sources 326 associated with one or each of a user's eyes 610. As a specific example, there may be two light sources 326a and 326b and one or more cameras 324 associated with each of a user's eyes 610. As another example, there may be three or more light sources such as light sources 326a and 326b and one or more cameras 324 associated with each of a user's eyes 610”, [0230-0231], etc.; see also that interpretation note identified for the case of claim 1 above – as claim 2 does not require markers/fiducials affixed to the user’s eye, as the reflecting unit as a whole comprises but is not limited solely to one or more eyes – see claim(s) 7-8).

As to claim 8, Vlaskamp discloses the system of claim 1.
Vlaskamp further discloses the system wherein the reflecting unit is an eyeball (cameras 324/462 imaging the eyes 410/500/610 reflecting glint(s) in addition to content/patterns rendered on corresponding left and right eye displays, [0009] “an imaging system configured to capture images of eyes of the user”, etc.,).

As to claim 9, this claim is the method claim corresponding to the system of claim 1 and is rejected accordingly.  

As to claim 10, this claim is the non-transitory CRM claim corresponding to the system of claim 1 and is rejected accordingly.  

As to claim 11, Vlaskamp discloses the system of claim 1.
Vlaskamp further discloses the system wherein the at least one processor is configured to execute the instructions to correct a distortion in the image for estimation based on a shape of the marker in the image for estimation ([0227] “Image preprocessing module 710 may receive images from an eye camera such as eye camera 324 and may perform one or more preprocessing (i.e., conditioning) operations on the received images. As examples, image preprocessing module 710 may apply a Gaussian blur to the images, may down sample the images to a lower resolution, may applying an unsharp mask, may apply an edge sharpening algorithm, or may apply other suitable filters that assist with the later detection, localization, and labelling of glints, a pupil, or other features in the images from eye camera 324. The image preprocessing module 710 may apply a low-pass filter or a morphological filter such as an open filter, which may remove high-frequency noise such as from the pupillary boundary 516a (see FIG. 5), thereby removing noise that may hinder pupil and glint determination. The image preprocessing module 710 may output preprocessed images to the pupil identification module 712 and to the glint detection and labeling module 714”).
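
As a point of technical context, distortion correction keyed to the observed shape of a marker whose true shape is known is commonly realized with a homography; the sketch below is illustrative only (all names and inputs assumed) and is not the preprocessing pipeline of Vlaskamp's module 710.

```python
# Illustrative sketch only: perspective-distortion correction from a marker's
# observed shape, given its known canonical (fronto-parallel) shape.
import numpy as np
import cv2

def correct_distortion(image, observed_corners, canonical_corners, out_size):
    # Homography mapping the distorted marker corners onto their known shape;
    # corners are Nx2 float arrays in corresponding order.
    H, _ = cv2.findHomography(np.asarray(observed_corners, dtype=np.float32),
                              np.asarray(canonical_corners, dtype=np.float32))
    return cv2.warpPerspective(image, H, out_size)
```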

As to claim 12, Vlaskamp discloses the system of claim 1.
Vlaskamp further discloses the system wherein the first relative position is estimated based on a size of the reflecting unit and a size of the reflecting unit in the image for estimation (Fig. 7A 614 utilizing ‘available data’ comprising 702, 704 and 706, 704 including assumed eye dimensions, [0226] “As examples, eye tracking module 614 may utilize available data including eye tracking extrinsics and intrinsics, such as the geometric arrangements of the eye-tracking camera 324 relative to the light sources 326 and the headmounted-unit 602; assumed eye dimensions 704 such as a typical distance of approximately 4.7 mm between a user's center of cornea curvature and the average center of rotation of the user's eye or typical distances between a user's center of rotation and center of perspective; and per-user calibration data 706 such as a particular user's interpupillary distance. Additional examples of extrinsics, intrinsics, and other information that may be employed by the eye tracking module 614 are described in U.S. patent application Ser. No. 15/497,726, filed Apr. 26, 2017 (Attorney Docket No. MLEAP.023A 7), which is incorporated by reference herein in its entirety”).
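
For context, estimating range from a known physical size and its apparent image size follows the pinhole relation distance = focal length × real size / image size; the sketch below is illustrative only, with assumed example values, and is not taken from Vlaskamp's modules 702-706.

```python
# Illustrative pinhole-model sketch of size-based range estimation of the
# kind the claim-12 limitation describes; values below are assumed examples.
def distance_from_size(focal_length_px, real_size_m, image_size_px):
    return focal_length_px * real_size_m / image_size_px

# e.g., an assumed ~24 mm eyeball diameter imaged at 120 px with an 800 px
# focal length implies the reflecting unit is roughly 0.16 m from the camera.
print(distance_from_size(800.0, 0.024, 120.0))  # 0.16
```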


Claim Rejections - 35 USC § 103
	The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.  Patentability shall not be negated by the manner in which the invention was made.


1.	Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Vlaskamp (US 2022/0391013 A1) in view of Miettinen et al. (US 2019/0138094 A1).

As to claim 3, Vlaskamp discloses the system of claim 2.
Vlaskamp further discloses the system wherein the marker is in a predetermined shape or in a predetermined color (glints have a predetermined shape as detected by module(s) 712/714, both individually and in conjunction for the case(s) of a plurality of light sources 326a-d, and glints have a false but predetermined color (for the case of IR source(s)), [0230-0231] “Glint detection module 714 may use this data to detect and/or identify glints (i.e., reflections off of the user's eye of the light from light sources 326) within regions of the preprocessed images that show the user's pupil. As an example, the glint detection module 714 may search for bright regions within the eye tracking image, sometimes referred to herein as "blobs" or local intensity maxima, that are in the vicinity of the user's pupil. In at least some embodiments, the glint detection module 714 may rescale (e.g., enlarge) the pupil ellipse to encompass additional glints. The glint detection module 714 may filter glints by size and/or by intensity. The glint detection module 714 may also determine the 2D positions of each of the glints within the eye tracking image. In at least some examples, the glint detection module 714 may determine the 2D positions of the glints relative to the user's pupil, which may also be referred to as the pupil-glint vectors”, etc.,).
While even a circular ‘blob’ shape as taught/suggested by Vlaskamp may read on a predetermined marker/fiducial shape (particularly for the case of e.g. 4 of such glints with known/expected relative positions), Miettinen further evidences the obvious nature of gaze tracking (Fig. 5) using non-circular lights/markers with a predetermined shape/pattern (Figs. 3B-D, individual sources having Shape “V” and in conjunction a predetermined shape e.g. Fig. 3F, [0152] “FIG. 3D is a schematic illustration of an image 304 of a user's eyes 306 and 308 and reflections of some non-circular light sources from the user's eyes 306 and 308, in accordance with an embodiment of the present disclosure. By way of the image 304, 8 of the 10 non-circular light sources (of FIG. 3B) from where the reflections originated can be identified, based upon shapes, rotational orientations and relative positions of the reflections. This allows for differentiating the reflections of 8 non-circular light sources from visual artifacts 310 and 312”, [0156], [0163] “At a step 506, at least one of the plurality of non-circular light sources from where at least one of the reflections originated is identified, based upon shapes, rotational orientations and relative positions of the reflections of the plurality of non-circular light sources, to differentiate said reflections from visual artifacts”, etc.,).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Vlaskamp such that the glint based markers/fiducials comprise a plurality of markers with a known configuration/ relative position (shape(s) both individually and in the aggregate) as taught/suggested by Vlaskamp and also Miettinen, the motivation as similarly taught/suggested therein that such predetermined marker characteristics/shape(s) may serve to facilitate more accurate marker/glint detection robust to visual artifacts that may otherwise be detected as markers/glints (see Miettinen Figs. 3E-F and related disclosure).  While not relied upon/required for the rejection of claim 3, see also Kim et al. (US 2020/0192097 A1).
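
To illustrate the kind of shape-based filtering Miettinen suggests (distinguishing known marker shapes from visual artifacts), the following is a hypothetical sketch; the threshold and all names are assumed, not taken from the references.

```python
# Illustrative sketch only: keep detected blobs whose contour shape matches a
# known marker/glint template, filtering out artifact-like detections.
import cv2

def keep_marker_like_blobs(binary_image, template_contour,
                           max_dissimilarity=0.2):  # assumed threshold
    contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours
            if cv2.matchShapes(c, template_contour,
                               cv2.CONTOURS_MATCH_I1, 0.0) < max_dissimilarity]
```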


2.	Claims 4 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Vlaskamp (US 2022/0391013 A1) in view of Miettinen et al. (US 2019/0138094 A1) and Greenwald (US 9,898,082 B1).

As to claim 4, Vlaskamp discloses the system of claim 1.
Vlaskamp further discloses the system ([0181], [0297], [0299], [0302], [0303] “As shown in FIG. 16, the HMD may provide different respective alignment markers to the user's left and right eyes in order to demonstrate any left-right vertical misalignment. For example, the HMD may display screen 1600a to a user's left eye and may display screen 1600b to the user's right eye. Screen 1600a may include a left-eye horizontal alignment marker 1502 and a vertical alignment marker 1506, while screen 1600b may include a right-eye horizontal alignment marker 1504 and a vertical alignment marker 1506”).  Vlaskamp further discloses a condition in which drawing patterns are displayed on 220 to facilitate a correction/calibration of misaligned HMD/display(s) ([0297] “It will be appreciated that the horizontal portions of the alignment markers are may take the form of other mirror image shapes or arrangements of lines which do not completely overlap when vertically misaligned”, [0299] “the alignment markers may take any suitable shape or form. Using the alignment markers, a user may be able to quickly perceive if the HMD is tilted on their head, including in which direction and by how much. Thus, the user can correct the tilt of the HMD until the HMD is properly leveled”, [0302] “the alignment markers make take the form of the letter "T" laying sideways. It will be appreciated, however, that the alignment markers may take other shapes or forms”, etc.), however these are understood to be patterns that are not necessarily relied upon for determining those eye pose related parameters upon which HMD/display pose determination is based prior to providing the user feedback/alignment markers via display 220.  Stated differently, while a disclosed condition, it is not a condition understood to exist at the time of capture for that/those ‘images for estimation’.
Miettinen however evidences the obvious nature of gaze tracking (Fig. 5) wherein the image for estimation is an image that is captured in a condition in which a pattern that varies depending on a display position is displayed (see Fig. 3F, and Fig. 4, wherein 3F varies depending on display position in view of those variously oriented “V” and the dotted line therebetween).
Greenwald further evidences the obvious nature of gaze tracking wherein the image for estimation is an image that is captured in a condition in which a pattern that varies depending on a display position is displayed on a target/display (Abs “A gaze tracking system uses visible light to track the pose of an eye. A display screen displays a display image, while a camera captures images of the eye . The images captured by the camera include a reflection of the display screen, reflecting from the cornea of the eye. The position of the reflection depends on the orientation of the eye”, Fig. 10A, camera 1010 capturing eye 900 and display image of 1000 reflected therein (camera image 1020), Fig. 11, Fig. 14 1406 on the basis of 1402, col 2 lines 1-10, etc.,).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Vlaskamp such that the image for estimation is an image that is captured in a condition in which a drawing pattern that varies depending on a display position is displayed on the target/display as taught/suggested by Miettinen and/or Greenwald, the motivation as taught/suggested in Miettinen that such a varied pattern may allow for better detection distinguished from visual artifacts, and also/alternatively as suggested in Greenwald that such a varied pattern/displayed image may be used to induce a particular/desired gaze/eye pose (or sequence of poses) that may in response thereto be more efficiently detected/validated.

As to claim 13, Vlaskamp in view of Miettinen and Greenwald teaches/suggests the system of claim 4.
Vlaskamp in view of Miettinen and Greenwald further teaches/suggests the system wherein the second relative position is estimated based on the first relative position (see Vlaskamp disclosure as identified above for the case of claim 1) and a size of the drawing pattern on the target and a size of the drawing pattern in the image for estimation (Vlaskamp 702, 704 and 706 in further view of Greenwald Fig. 11, col 10 lines 10-20, 20-30 “When considered from the camera's perspective, as the eye moves, both the position of the Purkinje image of the display screen, and the portion of the reflection that is visible, may change. If the eye is near the same position, many of the same features may be visible, but slightly offset and distorted; whereas if the eye is in a very different position, very few features may match – since they may not be visible or distorted beyond recognition … the position, orientation, and shape of reflection 1012 of the displayed star pattern, as it appears in camera image 1020, depends on the gaze direction of eye 900 when camera image 1020 is captured”, col 11,  col 19 mapping P accounting for scale, etc.,).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to further modify the system and method of Vlaskamp in view of Greenwald such that the first and accordingly second relative position are determined based at least in part on a displayed size of a pattern (e.g. star pattern of display image(s) of Greenwald Fig. 11 left column) and a size of the pattern in the camera image (Fig. 11 right column) as taught/suggested by Greenwald in view of e.g. a projective mapping therebetween accounting for scale, the motivation as similarly taught/suggested therein (Greenwald col 10) that such a size correspondence is but one of a number of different features that may serve to indicate a corresponding gaze direction characterized by a reasonable expectation of success.

As to claim 14, Vlaskamp in view of Miettinen and Greenwald teaches/suggests the system of claim 4.
Vlaskamp in view of Miettinen and Greenwald further teaches/suggests the system wherein the at least one processor is configured to execute the instructions to estimate a position of the drawing pattern in the reflecting unit based on the drawing pattern on the target and the drawing pattern in the image for estimation that is captured in a horizontally inverted condition (Vlaskamp e.g. Fig. 8E wherein patterns/glints of Vlaskamp as modified by Miettinen and Greenwald are horizontally inverted as the image patterns as captured from the eyes are reflections/mirrored, and the configuration thereof (in terms of relative position(s) while mirrored) matches that of the source(s) (be they light sources separate from the display and/or part thereof as is the case for the proposed modification) – see Greenwald Fig. 11, and col 10 identified above for the case of claim 13 “the position, orientation and shape of the Purkinje image changes as the orientation of the eye changes”).
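
For context on the horizontally inverted condition, a corneal/specular reflection of a displayed pattern appears mirrored to the camera, so matching can be performed against a horizontally flipped copy of the displayed pattern; the sketch below is an assumed illustration, not the method of Vlaskamp, Miettinen, or Greenwald.

```python
# Illustrative sketch only: locate a displayed drawing pattern within the
# reflection by matching against its horizontally inverted (mirrored) copy.
import cv2

def locate_reflected_pattern(displayed_pattern, reflected_patch):
    mirrored = cv2.flip(displayed_pattern, 1)  # flipCode=1: horizontal flip
    scores = cv2.matchTemplate(reflected_patch, mirrored,
                               cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    return best_loc, best_score  # estimated position of the pattern
```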


3.	Claims 5-6 are rejected under 35 U.S.C. 103 as being unpatentable over Vlaskamp (US 2022/0391013 A1).
As to claim 5, Vlaskamp discloses the system of claim 1.
Vlaskamp further discloses the system wherein the at least one processor is configured to execute the instructions to integrate a plurality of the first relative positions estimated from a plurality of the images for estimation to calculate an integrated relative position ([0237] “As an example, the CoR estimation module 724 may estimate the CoR by finding the average point of intersection of optical axes determined for various different eye poses over time. As additional examples, module 724 may filter or average estimated CoR positions over time, may calculate a moving average of estimated CoR positions over time, and/or may apply a Kalman filter and known dynamics of the eyes and eye tracking system to estimate the CoR positions over time”, [0238] “Module 726 may employ various techniques, such as those discussed in connection with CoR estimation module 724, to increase the accuracy of the estimated IPD. As examples, IPD estimation module 724 may apply filtering, averaging over time, weighted averaging including assumed IPD distances, Kalman filters, etc. as part of estimating a user's IPD in an accurate manner”).  Vlaskamp fails to explicitly disclose that a plurality of second/final/HMD/display poses are integrated, but does disclose such an integration for a plurality of first/eye poses estimated from a plurality of the images for estimation so as to calculate an integrated relative position.  The teaching of Vlaskamp for the case of that first relative position may be readily extended to that of the second, for the same purposes, which are known/readily recognized by PHOSITA – namely, using a previous HMD/display pose determination as a candidate pose for refinement by implementing, e.g., a commonly relied upon moving average filter, a Kalman filter, etc.  Such an integrated relative position may serve to eliminate anomalies/outlier pose determinations arising from noisy sensor data, periodic motions (user breathing), etc.
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Vlaskamp such that e.g. a moving average filtering, Kalman filtering, etc., as implemented for those first/eye poses are similarly implemented for the case of a second/HMD/display pose deduced therefrom, the motivation(s) being that/those identified above, that such a filtering/integrated position determination may serve to minimize the impact of anomalous sensor data. 
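
As an illustration of the temporal integration discussed above (moving average, Kalman filtering, etc.), the following is a minimal sketch of a moving-average integrator over per-image pose estimates; the window size and all names are assumed.

```python
# Illustrative sketch only: integrate a plurality of per-image relative
# position estimates into a single smoothed ("integrated") relative position.
from collections import deque
import numpy as np

class PoseIntegrator:
    def __init__(self, window=10):               # assumed window size
        self.history = deque(maxlen=window)

    def update(self, relative_position):
        """Add one per-image estimate; return the integrated (averaged)
        relative position, damping outliers from noisy sensor data."""
        self.history.append(np.asarray(relative_position, dtype=float))
        return np.mean(np.stack(self.history), axis=0)
```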

As to claim 6, Vlaskamp teaches/suggests the system of claim 5.
Vlaskamp further discloses the system wherein the plurality of the images for estimation are images that are captured in a condition in which the first relative position varies (images are captured under a condition in which the eye pose(s) vary(ies), [0237] “In at least some embodiments, the CoR estimation module 724 may refine its estimate of the center of rotation of each of the user's eyes over time. As an example, as time passes, the user will eventually rotate their eyes (to look somewhere else, at something closer, further, or sometime left, right, up, or down) causing a shift in the optical axis of each of their eyes. CoR estimation module 724 may then analyze two (or more) optical axes identified by module 722 and locate the 3D point of intersection of those optical axes. The CoR estimation module 724 may then determine the center of rotation lies at that 3D point of intersection. Such a technique may provide for an estimate of the center of rotation, with an accuracy that improves over time”, etc.,).


4.	Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Vlaskamp (US 2022/0391013 A1) in view of Lam et al. (US 11,579,451 B1).

As to claim 7, Vlaskamp discloses the system of claim 1.
Vlaskamp fails to explicitly disclose the system wherein the reflecting unit includes glass, metal, acrylic, and polycarbonate.
Lam however evidences the obvious nature of a pose estimation system wherein the reflecting unit includes glass, metal, acrylic and polycarbonate (Figs. 1-4, prism array 180/redirection structure 510 in conjunction with eye 150, serve to reflect 172 and redirect 174 illumination/source light 170 from source 155 to camera assembly 160 of eye tracking system 115, col 6 lines 15-35 “The prisms 182 are made of a material that is substantially transparent to light in the first band. The prisms 182 may be composed of, e.g., glass, polymer, some other material that is substantially transparent to light in the first band, or some combination thereof. In one embodiment, the light in the second band 172 scattered from the eye is normally incident on the prism array 180. In some embodiments, some or all of the facets of the prisms 172 are coated with a dichroic material (i.e., a hot mirror). The dichroic material reflects light in the second band, but substantially transmits light in the first band. The dichroic material may be, e.g., thin metal films (e.g., gold), indium tin oxide, zinc oxide, some other material that is transparent in the first band of light and reflective in the second band of light, or some combination thereof”; Examiner notes while the language recited is in the conjunctive, and Lam discloses ‘polymer, some other material that is substantially transparent to light in the first band’ (which Examiner understands to be the broader Genus, where plastics such as those named are a species of polymer), Official Notice may be taken to the manner in which both acrylic and polycarbonate are known/commonly used transparent plastics/substrates, while differing in characteristics such as impact resistance, affordability, scratch resistance, clarity, etc., one or more may serve as Obvious to Try alternatives as a design choice constraint accounting for the abovementioned characteristics (cost vs performance in particular situations) not central to Applicant’s invention (see MPEP 2143 Rationales A, B and E)) and further characterized by a reasonable expectation of success.
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Vlaskamp such that the reflecting unit comprises a prism array/redirection structure in conjunction with user eyes wherein said array/structure further comprises glass, metal, and polymer/plastic substrates such as acrylic and polycarbonate as taught/suggested by Lam, the motivation as similarly taught/suggested therein and recognized by PHOSITA that such a redirection structure would enable optimized placement of cameras reducing overall HMD/wearable form factor and/or facilitating a camera field of view not occluded by structures such as user eyelids/eyelashes.  See also Kim et al. (US 2020/0192097 A1).


Additional References
Prior art made of record and not relied upon that is considered pertinent to applicant's disclosure:
Additionally cited references (see attached PTO-892) otherwise not relied upon above have been made of record in view of the manner in which they evidence the general state of the art.  See in particular Chida US 2023/0338832 A1 [0035] “The tracking information detector 22 includes a device for calculating the position and orientation of the HMD 20, i.e., the position and orientation of the head of the player. … These sensors can also be used to calculate the position of the HMD 20. Further, in addition to or instead of the above sensors, the tracking information detector 22 may include a sensor that directly detects the eye motion of the player, for example, a sightline detection sensor that emits near-infrared light to an iris and detects its reflected light”, Lewkowski US 2023/0172507 A1 Fig. 5D, [0101], [0115] “Known camera and/or infra-red LED positions on the apparatus may be used to determine, predict and/or estimate motion and/or location of a patient's eye(s)/pupil (s). For example, known camera and/or infra-red LED positions on the apparatus may provide known reference positions, such as distance and/or angle of separation between camera and LED(s) on the apparatus or relative to the eye(s)/pupil(s)”, and Publicover et al. US 2013/0114850 A1 [0009] “First, cameras directed at the eye from different angles may be used in a “range-finder” mode to track the locations of glints produced by illumination sources. The positions of a glint viewed from multiple angles may be used to determine camera position relative to the surface of an eye. This may be particularly useful to account for initial positions and movements of eyewear or headwear during use”, etc., all evidencing the well-understood, routine and/or conventional nature of determining a camera/device/HMD/display etc. pose based in part on a first pose determined based at least in part on reflected imagery.


Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IAN L LEMIEUX whose telephone number is (571)270-5796. The examiner can normally be reached Mon - Fri 9:00 - 6:00 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chan Park can be reached on 571-272-7409. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.




/IAN L LEMIEUX/Primary Examiner, Art Unit 2669