Patent Application 18177564 - ULTRASOUND SYSTEM AND CONTROL METHOD OF ULTRASOUND SYSTEM - Rejection
Title: ULTRASOUND SYSTEM AND CONTROL METHOD OF ULTRASOUND SYSTEM
Application Information
- Invention Title: ULTRASOUND SYSTEM AND CONTROL METHOD OF ULTRASOUND SYSTEM
- Application Number: 18177564
- Submission Date: 2025-05-13
- Effective Filing Date: 2023-03-02
- Filing Date: 2023-03-02
- National Class: 600
- National Sub-Class: 440000
- Examiner Employee Number: 81884
- Art Unit: 3798
- Tech Center: 3700
Rejection Summary
- 102 Rejections: 0
- 103 Rejections: 2
Cited Patents
No patents were cited in this rejection.
Office Action Text
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The rejection under 35 U.S.C. 103 has been withdrawn in light of the amendment to the claims filed on 25 March 2025.

Claim Rejections - 35 USC § 112(a)

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

In particular, Claim 1 now recites “a site specifying unit that specifies the epidermis and the region of interest in the ultrasound image based on an instruction by the user”; however, the instant specification fails to show possession of specifying both the region of interest AND the epidermis via user input because the instant specification does not describe “the claimed invention with all of its limitations using such descriptive means as words, structures, figures, diagrams, and formulas that fully set forth the claimed invention” as laid out in MPEP § 2163.02. More specifically, the instant specification discloses in ¶ [0052] that “[T]he site specifying unit 62 can specify at least one site in the ultrasound image on the basis of an instruction input from the user. Further, an image analysis unit that analyzes the ultrasound image may be provided, and the site specifying unit 62 may specify at least one site in the ultrasound image on the basis of an analysis result of the ultrasound image by the image analysis unit,” (emphasis added). Moreover, ¶ [0021] discloses that “It is preferable that the ultrasound system further includes an input device that receives an instruction input from a user, in which the site specifying unit specifies at least one of the region of interest or the epidermis in the ultrasound,” (emphasis added). However, the instant specification fails to provide explicit and/or inherent support for performing both actions via an instruction input from a user.
Therefore, the instant specification does not convey with reasonable clarity to one of ordinary skill in the art that, as of the filing date sought, the inventor was in possession of “a site specifying unit that specifies the epidermis and the region of interest in the ultrasound image based on an instruction by the user”. Accordingly, Claim 1 fails to meet the written description requirement of 35 U.S.C. 112(a). Claim 14 recites similar limitations and is rejected under the same rationale as Claim 1. Dependent claims are rejected by virtue of their dependency on the abovementioned claims.

Similarly, Claim 11 also recites “an image analysis unit that analyzes the ultrasound image, wherein the site specifying unit specifies at least one of the region of interest or the epidermis in the ultrasound image on the basis of an analysis result of the ultrasound image”; however, as noted directly above, the instant specification is completely silent with regard to the functionality of the image analysis or site specification. Accordingly, one of ordinary skill in the art would not deem the instant specification to have sufficient detail to understand how the inventor intended the function to be performed.

Similarly, with regards to Claim 12, the claim recites “wherein the site specifying unit has a determination model that has learned, using learning ultrasound images including a region of interest of a breast of a subject as teacher data, a relationship between the learning ultrasound image and the region of interest and epidermis included in the learning ultrasound image, and the determination model uses the ultrasound image as an input, and specifies at least one of the region of interest or the epidermis in the ultrasound image”; however, the instant specification fails to explain the steps/procedure for “learn[ing] a relationship between the learning ultrasound image and the region of interest and epidermis included in the learning ultrasound image” and “specif[ying] at least one of the region of interest or the epidermis in the ultrasound image”, i.e. the computer function, in sufficient detail so that one of ordinary skill in the art would understand how the inventor intended the function to be performed. More specifically, ¶ [0065] of the instant specification briefly describes the determination model as claimed but fails to explain explicit details regarding the trained model {e.g. type of learning model, structure of learning model, type/structure/annotation of training data, etc.} that would inform one of ordinary skill in the art of the functionality of the algorithm as intended by the inventor. Accordingly, the aforementioned claim fails to meet the written description requirement under 35 U.S.C. 112(a).

Claim 11 is also rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, because the specification, while being enabling for the site specifying unit in which a site is manually specified via user input OR a site is automatically specified via the image analysis unit, does not reasonably provide enablement for the site specifying unit in which a site is manually specified via user input AND a site is automatically specified via the image analysis unit. The specification does not enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to use the invention commensurate in scope with these claims.
In particular, the instant specification discloses in ¶ [0052] that “[T]he site specifying unit 62 can specify at least one site in the ultrasound image on the basis of an instruction input from the user. Further, an image analysis unit that analyzes the ultrasound image may be provided, and the site specifying unit 62 may specify at least one site in the ultrasound image on the basis of an analysis result of the ultrasound image by the image analysis unit,” (emphasis added). Moreover, ¶ [0021] discloses that “It is preferable that the ultrasound system further includes an input device that receives an instruction input from a user, in which the site specifying unit specifies at least one of the region of interest or the epidermis in the ultrasound,” (emphasis added). However, the instant specification fails to provide explicit and/or inherent support for both actions. Accordingly, the instant specification does not teach those skilled in the art how to use the full scope of the claimed invention without undue experimentation; therefore, the instant specification is not commensurate with the scope of protection sought by the claims.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-8, 10-12, 14-18 & 20 are rejected under 35 U.S.C. 103 as being unpatentable over Robinson et al. (US PGPUB 20160155247; hereinafter "Robinson") in view of Xu et al. (“Medical breast ultrasound image segmentation by machine learning,” Ultrasonics, Volume 91, January 2019, Pages 1-9; hereinafter "Xu").

With regards to Claim 1, Robinson discloses an ultrasound system (image recording & tissue mapping system 10; see Robinson FIG. 31 & ¶ [0139]) comprising: an ultrasound probe (hand-held imaging probe 14; see Robinson FIG. 31 & ¶ [0141]); a position sensor that outputs a position detection signal for detecting a position of the ultrasound probe in a three-dimensional space (position sensors 32 a-32 c, which are affixed to hand-held imaging probe 14; see Robinson FIG. 31 & ¶ [0141]); an image generation unit that generates an ultrasound image including epidermis and a region of interest of a breast of a subject (FIG. 12 of Robinson clearly illustrates the skin being specified in the US image, while FIG. 25 of Robinson illustrates a specified lesion), from a reception signal obtained by performing transmission and reception of an ultrasound beam with respect to the region of interest using the ultrasound probe, on the epidermis above the region of interest (display module/controller 40 for processing the image data; see Robinson ¶ [0140]); a position acquisition unit (position tracking module 22; see Robinson FIG. 31 & ¶ [0139]) that acquires a first position of the ultrasound probe on a nipple of the subject and a second position of the ultrasound probe on the epidermis above the region of interest in the three-dimensional space, which are detected on the basis of the position detection signal (see Robinson FIG. 25, which illustrates two scan directions of a lesion: one from the nipple and a second from an inferior position; see also Robinson ¶ [0122-0124]; identifying the location of image frames of the ROI relative to identified landmarks, e.g. the nipple on the surface of the skin; see Robinson ¶ [0078]; it should be appreciated that the location of the nipple scan and the scan at the second position are known relative to the location of the ROI, i.e. target/lesion); a site specifying unit that specifies (to measure all other image structures within an image set relative to a reference point {i.e. nipple and/or lesion}, such set being obtained over a period of time, and using that reference to map those image structures relative to each other; see Robinson ¶ [0048 & 0050] along with FIG. 31; the user specifies the location of the landmarks {i.e. based on user instruction} corresponding to reference points; see Robinson ¶ [0078-0079]). Robinson discloses that another aspect describes a device and method for obtaining a reference point {e.g. nipple and/or lesion} from which “to measure all other image structures within an image set, such set being obtained over a period of time, and using that reference to map those image structures relative to each other” (see Robinson ¶ [0048 & 0050]). While Robinson discloses mapping the entire breast (see Robinson FIG. 17 & ¶ [0089-0090]), including determining coordinate locations for all pixel locations based on the position tracking module 22 (see Robinson ¶ [0067-0072]) and relying on Pythagorean calculations to determine pixel locations relative to adjacent images and for image transformation (see Robinson ¶ [0076 & 0103]), along with the location of reference locations such as the nipple and a vector from the nipple to the middle of the breast on the chest wall (see Robinson ¶ [0085]), it appears that Robinson may be silent as to the struck-through limitations directly above. However, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Robinson to provide at least the struck-through limitations directly above.
Doing so would amount to combining prior art elements according to known methods to yield predictable results because Robinson already discloses mapping the entire breast region along with recording reference locations of a ROI, nipple, and center of chest wall, along with the location of the probe for each scanned image and the corresponding reconstruction, and Robinson also teaches “computing a distance between the first and second projections, and constructing the idealized map based on the relative angle and distance computed,” such as, for example, the two projections as illustrated in FIG. 25 (see Robinson ¶ [0016] & FIG. 22). Therefore, triangulating {i.e. via the Pythagorean Theorem} any point {i.e. first/second/ROI positions and a distance calculation therebetween as claimed, e.g. first/second/third calculation units} relative to the ROI, nipple, chest wall centroid, or any distance therebetween, including a distance between two projections such as those illustrated in FIG. 25, would be obvious to one of ordinary skill in the art because Robinson already teaches triangulating various positions relative to the same reference points.

While Robinson clearly illustrates the epidermis being imaged in FIG. 12, as cited above, it appears that Robinson does not specify the epidermis with a site specifying unit. However, Xu teaches a machine learning based method using a CNN to automatically segment 3D breast ultrasound images including the skin, fibroglandular tissue, mass, and fatty tissue (see Xu Abstract). Modified Robinson and Xu are both considered to be analogous to the claimed invention because they are in the same field of breast ultrasound, more specifically breast cancer diagnosis in ultrasound imaging. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have further modified Robinson to incorporate the above teachings of Xu to provide at least specifying the epidermis. Doing so would “provide an objective reference for radiologists on breast image segmentation, so as to help breast cancer diagnosis and breast density assessments” (see Xu pg. 8, ¶ 5). It should be appreciated that the same logic pattern and rationale are applied to Claim 14 as applied to Claim 1.

With regards to Claim 2, modified Robinson teaches wherein in a case where (indicating a contingent limitation), by a first straight line from the first position to the second position, a second straight line from the region of interest to the epidermis above the region of interest in the ultrasound image, and a third straight line from the nipple to the region of interest in the ultrasound image in the three-dimensional space, a right triangle in which an angle formed by the first straight line and the second straight line is substantially a right angle is formed, the third distance calculation unit uses the Pythagoras' theorem to calculate the third linear distance L3 on the basis of the first linear distance L1 and the second linear distance L2 (as noted above, Robinson discloses mapping the entire breast region along with recording reference locations of a ROI, nipple, and center of chest wall, along with the location/orientation of the probe for each scanned image and the corresponding reconstruction. Therefore, triangulating {i.e. via the Pythagorean Theorem as taught in Robinson ¶ [0050, 0076, & 0103]} any point {i.e. first/second/third straight lines & third linear distance L3} relative to the ROI, nipple, chest wall centroid, or any point along the skin surface would be obvious to one of ordinary skill in the art because Robinson already teaches triangulating various positions relative to the same reference points; furthermore, Claim 2 is a contingent limitation and one of ordinary skill in the art would understand that the Robinson system is capable of performing the triangulation calculation via the Pythagorean Theorem, as disclosed by Robinson).

With regards to Claim 3, modified Robinson teaches wherein in a case where (indicating a contingent limitation), by a first straight line from the first position to the second position, a second straight line from the region of interest to the epidermis above the region of interest in the ultrasound image, and a third straight line from the nipple to the region of interest in the ultrasound image in the three-dimensional space, a right triangle in which an angle formed by the second straight line and the third straight line is substantially a right angle is formed, the third distance calculation unit uses the Pythagoras' theorem to calculate the third linear distance L3 on the basis of the first linear distance L1 and the second linear distance L2 (as noted above, Robinson discloses mapping the entire breast region along with recording reference locations of a ROI, nipple, and center of chest wall, along with the location/orientation of the probe for each scanned image and the corresponding reconstruction, and mapping all other structures relative to the reference point {e.g. nipple and/or lesion}; see Robinson ¶ [0048 & 0050]. Therefore, triangulating {i.e. via the Pythagorean Theorem as taught in Robinson ¶ [0076 & 0103]} any point {i.e. first/second/third straight lines & third linear distance L3} relative to the ROI, nipple, chest wall centroid, or any point along the skin surface would be obvious to one of ordinary skill in the art because Robinson already teaches triangulating all other structures relative to the same reference points; furthermore, Claim 3 is a contingent limitation and one of ordinary skill in the art would understand that the Robinson system is capable of performing the triangulation calculation via the Pythagorean Theorem, as disclosed by Robinson).
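For reference, the right-triangle configurations recited in the contingent limitations of Claims 2 and 3 can be written out as below. This is only an illustrative reading of the claimed geometry (treating the recited straight lines as the sides of the triangle whose right angle the claim identifies); the formulas are not taken from Robinson or from the instant specification.

```latex
% Claim 2 reading: the first and second straight lines (lengths L1 and L2) meet at
% substantially a right angle, so the nipple-to-ROI line L3 is the hypotenuse:
\[
  L_3 = \sqrt{L_1^{2} + L_2^{2}}
\]
% Claim 3 reading: the second and third straight lines (lengths L2 and L3) meet at
% substantially a right angle, so the first straight line L1 is the hypotenuse:
\[
  L_3 = \sqrt{L_1^{2} - L_2^{2}}
\]
```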
With regards to Claim 4, modified Robinson teaches wherein the position sensor outputs an angle detection signal for detecting an angle of the ultrasound probe on the epidermis above the region of interest with respect to a vertical direction in the three-dimensional space (a position tracking system can further include at least one position and/or orientation sensor configured to provide data corresponding to the position and/or three-dimensional orientation of the manual image scanning device; see Robinson ¶ [0010]; the sensors may also provide orientation data such as pitch, roll, and yaw {i.e. probe angle}. Such sensors may be position and/or orientation sensors that detect either position and/or orientation data. In some cases, a position sensor may only detect position. The system may then derive the undetected orientation information if needed {i.e. the Robinson position/orientation sensors track the angle of the probe during the capture of the two scans illustrated in FIG. 25, which result in angles θ1 & θ2}; see Robinson ¶ [0145]), the position acquisition unit acquires a first angle θ1 of the ultrasound probe on the nipple with respect to the vertical direction and a second angle θ2 of the ultrasound probe on the epidermis above the region of interest with respect to the vertical direction in the three-dimensional space, which are detected on the basis of the angle detection signal (FIGS. 6, 7, & 10 of Robinson clearly illustrate an Anterior-Posterior axis of reference, i.e. the vertical direction), and in a case where (indicating a contingent limitation), by a first straight line from the first position to the second position, a fourth straight line extending from the nipple toward an inside of the subject at the first angle θ1, and a fifth straight line extending from the epidermis above the region of interest toward the inside of the subject at the second angle θ2, an isosceles triangle in which distances of the fourth straight line and the fifth straight line are equal is formed, and by the third straight line, a sixth straight line extending perpendicularly from the nipple to the fifth straight line, and a seventh straight line from an intersection between the fifth straight line and the sixth straight line to the region of interest, a right triangle in which an angle formed by the sixth straight line and the seventh straight line is a right angle is formed, the third distance calculation unit uses the Pythagoras' theorem to calculate the third linear distance L3 on the basis of the first linear distance L1, the second linear distance L2, the first angle θ1, and the second angle θ2 (as noted above, Robinson discloses mapping the entire breast region along with recording reference locations of a ROI, nipple, and center of chest wall, along with the location/orientation of the probe for each scanned image and the corresponding reconstruction, and mapping all other structures relative to the reference point {e.g. nipple and/or lesion}; see Robinson ¶ [0048 & 0050]. Therefore, triangulating {i.e. via the Pythagorean Theorem as taught in Robinson ¶ [0076 & 0103]} any point {i.e. first/second/third/fourth/fifth/sixth/seventh straight lines & third linear distance L3 along with angles θ1 & θ2} relative to the ROI, nipple, chest wall centroid, or any point along the skin surface would be obvious to one of ordinary skill in the art because Robinson already teaches mapping via triangulation all other structures relative to said reference points; furthermore, the corresponding limitation is a contingent limitation and one of ordinary skill in the art would understand that the Robinson system is capable of performing the triangulation calculation via the Pythagorean Theorem, as disclosed by Robinson).

With regards to Claim 5, modified Robinson teaches wherein the position sensor is a magnetic sensor, a GPS sensor, or an optical sensor (any suitable sensor may be used to provide location and position data; for example, magnetic sensors, optical markers (e.g. to be imaged by cameras), infrared markers, and ultraviolet sensors are examples of suitable options; see Robinson ¶ [0142]).

With regards to Claim 6, modified Robinson teaches further comprising: a monitor (display 3/17; see Robinson FIG. 31); and a display control unit that displays information on the third linear distance L3 on the monitor by superimposing the information on the ultrasound image including the region of interest (in some embodiments, the tissue mapping system takes the recorded images, and associated locations {i.e. linear distance L3 & ROI}, and displays them on an idealized breast profile; see Robinson ¶ [0091]).

With regards to Claim 7, modified Robinson teaches wherein the display control unit further displays information on the second linear distance L2 on the monitor by superimposing the information on the ultrasound image including the region of interest (in some embodiments, the tissue mapping system takes the recorded images, and associated locations {i.e. linear distance L2 & ROI}, and displays them on an idealized breast profile {i.e. superimposing}; see Robinson ¶ [0091]).

With regards to Claim 8, modified Robinson teaches wherein the site specifying unit specifies a pectoralis major muscle or chest wall of the subject in the ultrasound image, the ultrasound system further comprises a fourth distance calculation unit that calculates a fourth linear distance L4 from the region of interest to the pectoralis major muscle or chest wall in the ultrasound image (as noted above, Robinson discloses mapping the entire breast region along with recording reference locations of a ROI, nipple, and center of chest wall, along with the location/orientation of the probe for each scanned image and the corresponding reconstruction. Therefore, triangulating {i.e. via the Pythagorean Theorem as taught in Robinson ¶ [0076 & 0103]} any point {i.e. first/second/third straight lines & third linear distance L3} relative to the ROI, nipple, chest wall centroid, or any point along the skin surface would be obvious to one of ordinary skill in the art because Robinson already teaches triangulating various positions relative to the same reference points; furthermore, the corresponding limitation is a contingent limitation and one of ordinary skill in the art would understand that the Robinson system is capable of performing the triangulation calculation via the Pythagorean Theorem, as disclosed by Robinson), and the display control unit further displays information on the fourth linear distance L4 on the monitor by superimposing the information on the ultrasound image including the region of interest (in some embodiments, the tissue mapping system takes the recorded images, and associated locations {i.e. linear distance L4 & ROI}, and displays them on an idealized breast profile; see Robinson ¶ [0091]).

With regards to Claim 10, modified Robinson teaches further comprising: an input device that receives an instruction input from the user (allowing the user to establish {i.e. user input} a single reference mark for all of the recorded scan tracks; see Robinson ¶ [0079]; other control 11; see Robinson FIG. 31 & ¶ [0139]; it should be appreciated that FIG. 31 clearly illustrates what one of ordinary skill in the art would recognize as a keyboard/input device of the hand-held imaging monitor console 18), wherein the site specifying unit specifies at least one of the region of interest or the epidermis in the ultrasound image on the basis of the instruction input from the user (some embodiments allow the user to establish a single reference mark {e.g. nipple} for all of the recorded scan tracks, create a unique reference point for each scan track, or create a set of unique reference points, each point serving as a reference for a grouping of scan tracks; see Robinson ¶ [0079]; it should be appreciated that one of ordinary skill in the art would understand that for a user to establish the reference marks they would have to interact with the hand-held imaging monitor console 18 via the clearly illustrated keyboard/input device).
With regards to Claim 11, modified Robinson teaches further comprising: an image analysis unit that analyzes the ultrasound image (the Image Recording and Tissue Mapping System includes a processor or controller that performs the reconstruction functions or actions as described; the processor, controller, or computer may execute software or instructions for this purpose; see Robinson ¶ [0146]), wherein the site specifying unit specifies the region of interest in the ultrasound image on the basis of an analysis result of the ultrasound image (in ¶ [0148] Robinson incorporates US Ser. No. 13/854,800, which teaches software for the detection of suspicious lesions in corresponding ¶ [0076]; see also Robinson Claim 10, which discloses that the image recording and mapping system is configured to identify at least two reference marks, e.g. detecting suspicious lesions and/or the nipple, in the tissue based on the position and location information received from the position tracking system; the user specifies the location of the landmarks {i.e. based on user instruction} corresponding to reference points; see Robinson ¶ [0078-0079]).

With regards to Claim 12, modified Robinson teaches wherein the site specifying unit has a determination model that has learned, using learning ultrasound images including a region of interest of a breast of a subject as teacher data, a relationship between the learning ultrasound image and the region of interest and epidermis included in the learning ultrasound image (a machine learning based method using a CNN to automatically segment 3D breast ultrasound images including the skin, fibroglandular tissue, mass, and fatty tissue; see Xu Abstract), and the determination model uses the ultrasound image as an input, and specifies at least one of the region of interest or the epidermis in the ultrasound image (a machine learning based method using a CNN to automatically segment 3D breast ultrasound images including the skin, fibroglandular tissue, mass, and fatty tissue; see Xu Abstract).

With regards to Claim 15, modified Robinson teaches wherein the position sensor is a magnetic sensor, a GPS sensor, or an optical sensor (any suitable sensor may be used to provide location and position data; for example, magnetic sensors, optical markers (e.g. to be imaged by cameras), infrared markers, and ultraviolet sensors are examples of suitable options; see Robinson ¶ [0142]).

With regards to Claim 16, modified Robinson teaches further comprising: a monitor (display 3/17; see Robinson FIG. 31); and a display control unit that displays information on the third linear distance L3 on the monitor by superimposing the information on the ultrasound image including the region of interest (in some embodiments, the tissue mapping system takes the recorded images, and associated locations {i.e. linear distance L3 & ROI}, and displays them on an idealized breast profile; see Robinson ¶ [0091]).

With regards to Claim 17, modified Robinson teaches wherein the display control unit further displays information on the second linear distance L2 on the monitor by superimposing the information on the ultrasound image including the region of interest (in some embodiments, the tissue mapping system takes the recorded images, and associated locations {i.e. linear distance L2 & ROI}, and displays them on an idealized breast profile {i.e. superimposing}; see Robinson ¶ [0091]).
With regards to Claim 18, modified Robinson teaches wherein the site specifying unit specifies a pectoralis major muscle or chest wall of the subject in the ultrasound image, the ultrasound system further comprises a fourth distance calculation unit that calculates a fourth linear distance L4 from the region of interest to the pectoralis major muscle or chest wall in the ultrasound image (as noted above, Robinson discloses mapping the entire breast region along with recording reference locations of a ROI, nipple, and center of chest wall, along with the location/orientation of the probe for each scanned image and the corresponding reconstruction. Therefore, triangulating {i.e. via the Pythagorean Theorem as taught in Robinson ¶ [0076 & 0103]} any point {i.e. first/second/third straight lines & third linear distance L3} relative to the ROI, nipple, chest wall centroid, or any point along the skin surface would be obvious to one of ordinary skill in the art because Robinson already teaches triangulating various positions relative to the same reference points; furthermore, the corresponding limitation is a contingent limitation and one of ordinary skill in the art would understand that the Robinson system is capable of performing the triangulation calculation via the Pythagorean Theorem, as disclosed by Robinson), and the display control unit further displays information on the fourth linear distance L4 on the monitor by superimposing the information on the ultrasound image including the region of interest (in some embodiments, the tissue mapping system takes the recorded images, and associated locations {i.e. linear distance L4 & ROI}, and displays them on an idealized breast profile; see Robinson ¶ [0091]).

With regards to Claim 20, modified Robinson teaches further comprising: an input device that receives an instruction input from the user (the user specifies the location of the landmarks {i.e. based on user instruction} corresponding to reference points; see Robinson ¶ [0078-0079]; other control 11; see Robinson FIG. 31 & ¶ [0139]; it should be appreciated that FIG. 31 clearly illustrates what one of ordinary skill in the art would recognize as a keyboard/input device of the hand-held imaging monitor console 18), wherein the site specifying unit specifies at least one of the region of interest or the epidermis in the ultrasound image on the basis of the instruction input from the user (some embodiments allow the user to establish a single reference mark {e.g. nipple} for all of the recorded scan tracks, create a unique reference point for each scan track, or create a set of unique reference points, each point serving as a reference for a grouping of scan tracks; see Robinson ¶ [0079]; it should be appreciated that one of ordinary skill in the art would understand that for a user to establish the reference marks they would have to interact with the hand-held imaging monitor console 18 via the clearly illustrated keyboard/input device).

Claims 9, 13, & 19 are rejected under 35 U.S.C. 103 as being unpatentable over Robinson, as applied to Claim 1, and in further view of Caluser et al. (US PGPUB 20150051489; hereinafter "Caluser").

With regards to Claim 9, modified Robinson teaches further comprising: an ultrasound diagnostic apparatus (mapping system 10; see Robinson FIG. 31 & ¶ [0139]); and wherein the ultrasound diagnostic apparatus includes the ultrasound probe (hand-held imaging probe 14; see Robinson FIG. 31 & ¶ [0141]), the position sensor (position sensors 32 a-32 c, which are affixed to hand-held imaging probe 14; see Robinson FIG.
31 & ¶ [0141]), and the image generation unit (display module/controller 40 for processing the image data; see Robinson ¶ [0140]), and (Robinson discloses mapping the entire breast region along with recording reference locations of a ROI, nipple, and center of chest wall along with the location/orientation of the probe for each scanned image and the corresponding reconstruction. Therefore, triangulating {i.e. via Pythagorean Theorem as taught in Robinson ¶ [0076 & 0103]} any point {i.e. first/second/third straight lines & third linear distance L3} relative to the ROI, nipple, chest wall centroid, or any point along the skin surface [by the any one of the first/second/third calculating units] would be obvious to one of ordinary skill in the art because the Robinson already teaches of triangulating various position relative to the same reference points). While modified Robinson discloses all of the limitations of intervening claim 1 and the limitations as shown directly above, it appears that modified Robinson may be silent to the struck-through limitations directly above However, Caluser teaches of 3D mapping display (TDMD) 20 which can map coordinates of targets in ultrasound images based on a 3D position sensor 52 disposed on the probe 34 (see Caluser ¶ [0078 & 0081]). In particular, Caluser teaches of: a server (the target position can also be determined at a later time in the same TDMD computer or a remote computer {i.e. server} with the TDMD software {i.e. server includes 3D positional calculation}; see Caluser ¶ [0117]), the server includes the third distance calculation unit (TDMD software {i.e. server includes 3D positional calculation}; see Caluser ¶ [0117]). Modified Robinson and Caluser are both considered to be analogous to the claimed invention because they are in the same field of ultrasound mapping of breast targets. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have further modified Robinson to incorporate the above teachings of Caluser to provide at least the struck-through limitations above. Doing so would aid in remote viewing and interpretation (see Caluser ¶ [0147-0148]). With regards to Claim 131, wherein information on the third linear distance L3 is transmitted to a picture archiving and communication system, and is displayed on a display device of the picture archiving and communication system (the target position can also be determined at a later time in the same TDMD computer or a remote computer {i.e. server} with the TDMD software {i.e. server includes 3D positional calculation} for remote viewing and interpretation, the target positional information can be displayed at the time of the ultrasound examination or at a later date, and it also can be printed and stored in digital format {i.e. PACS} at any time after the acquisition; see Caluser ¶ [0117 & 0147-0148]). With regards to Claim 192, further comprising: an ultrasound diagnostic apparatus (mapping system 10; see Robinson FIG. 31 & ¶ [0139]); and a server (the target position can also be determined at a later time in the same TDMD computer or a remote computer {i.e. server} with the TDMD software {i.e. server includes 3D positional calculation}; see Caluser ¶ [0117]), wherein the ultrasound diagnostic apparatus includes the ultrasound probe (hand-held imaging probe 14; see Robinson FIG. 31 & ¶ [0141]), the position sensor (position sensors 32 a-32 c, which are affixed to hand-held imaging probe 14; see Robinson FIG. 
31 & ¶ [0141]), and the image generation unit (display module/controller 40 for processing the image data; see Robinson ¶ [0140]), and the server includes the third distance calculation unit (TDMD software {i.e. server includes 3D positional calculation}; see Caluser ¶ [0117]). Response to Arguments Applicant's arguments with regards to the rejection under 35 U.S.C. 112(a) have been fully considered but they are not persuasive. With regards to Claims 1, Applicant contends that “the features of amended independent claims 1 and 14 are described in the present specification in such a way as to reasonably convey to one skilled in the art that the inventor, at the time the application was filed, has possession of the claimed subject matter.” Without conceding to Applicant’s arguments, the rejection under 35 U.S.C. 112(a) of Claims 1 & 14 has been withdrawn because manually specifying the site via manual input by the user would be well understood by one of ordinary skill in the art in light of the instant specification. However, the act of using an “image analysis unit” to automatically specify a site through a function of computer-aided diagnosis, as in Claims 11-12, does not meet the standards under 35 U.S.C. 112(a) as laid out in MPEP § 2161. To establish the Office’s position, the Office will address Applicant’s arguments with respect to Claim 1 and 14 regarding the automatic detection embodiment of the site specifying unit. In particular, Applicant argues that “Paragraph [0051] of the present application describes that the site specifying unit 62 specifying at least one site in the ultrasound image on the basis of an instruction input from the user. The site specifying unit of the present application is to set, as a region of interest, a region specified by an operator of an ultrasound image and a region automatically detected through a function of a computer-aided diagnosis with an image analyzing unit.” The Office respectfully disagrees. To meet the requirements of 35 U.S.C. 112(a) as laid out in cited MPEP § 2161, the instant specification must describe the intended functionality of said computer-aided diagnosis (CAD) via the image analyzing unit. While ¶ [0052] introduces a determination model, the instant specification fails to provide any details regarding how said determination model functions let alone any details regarding its architecture. Applicant is reminded that simply restating the function recited in the claim is not necessarily sufficient. Neither the cited ¶ [0051-0052] nor the remaining instant specification provide any explicit details regarding how the automatic detection is performed by the image analysis unit. In the field of computer-aided diagnosis, one of ordinary skill in the art would understand that CAD algorithms are highly dependent upon the feature of interest and highly complicated and require explicit description of its architecture for one of ordinary skill in the art to understand is intended functionality. In other words, if one of ordinary skill in the art were to develop a determination model, they would have to achieve the identical output as Applicant based on an identical input. This feat would be virtually impossible in light of the instant specification. Therefore, one of ordinary skill in the art would not be able to discern the functionality of the site specifying unit for automatically specifying an epidermis and/or a region of interest as claimed. 
It should be appreciated that the instant specification establishes two embodiments of the site specifying unit in which a site is manually specified via user input OR a site is automatically specified via the image analysis unit. The instant specification does not establish that the same site is specified both by user input and automatically via image analysis. Therefore, the scope of Claim 11 is not enabled. ¶ [0052] discloses that: “The site specifying unit 62 can specify at least one site in the ultrasound image on the basis of an instruction input from the user. Further, an image analysis unit that analyzes the ultrasound image may be provided, and the site specifying unit 62 may specify at least one site in the ultrasound image on the basis of an analysis result of the ultrasound image by the image analysis unit,” (emphasis added). The corresponding disclosure only establishes that “at least one site” may be specified via user input and “at least one site” may be specified via image analysis. The instant specification does not disclose, explicitly or inherently, that the specifying unit specifies the same “at least one site” via both user input and via image analysis, which is also highlighted in ¶ [0021] as alternative embodiments.

With regards to the rejection under 35 U.S.C. 103, Applicant's arguments have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. In particular, Applicant contends that independent Claim 1 is not obvious in view of Robinson. To support their position, Applicant argues that “Robinson fails to disclose that the position and the position data provided by the position tracking module 22 are a specific position and position data such as the first position of the ultrasound probe on a nipple of the subject and the second position of the ultrasound probe on the epidermis above the region of interest acquired by the position acquisition unit recited in claim 1.” Applicant then goes on to cite ¶ [0078] to establish that “Robinson fails to disclose that the first position of the ultrasound probe is a nipple, and the second position of the ultrasound probe is the position of the epidermis above the region of interest of the breast which may possibly be a lesion site.” However, Applicant never establishes a logical correlation between the citation of Robinson and why that citation establishes that Robinson cannot teach the position acquisition unit. Therefore, Applicant's argument amounts to a general allegation that the claims define a patentable invention without specifically pointing out how the language of the claims patentably distinguishes them from the references. Merely citing a portion of Robinson does not satisfy this standard and, for at least this reason, Applicant's arguments are not persuasive. Regardless, one of ordinary skill in the art would understand, according to Robinson's description of the position tracking module 22 of the tissue mapping system 10 as disclosed in ¶ [0139], and especially in view of ¶ [0122-0124], that the breast is mapped based on the position of each image. This is clearly illustrated in FIGS. 6-12, which illustrate the mapping procedure. Therefore, the two scan positions, reproduced below for ease of reference, clearly illustrated in FIG.
25 disclose a first position of the ultrasound probe on a nipple of the subject and a second position of the ultrasound probe on the epidermis above the region of interest in the three-dimensional space.

[Reproductions of the two scan positions from Robinson FIG. 25 (two grayscale images) omitted.]

For at least this reason, Applicant's argument is not persuasive.

Applicant also argues that “Robinson does not describe a relationship between the specified skin and the specified lesion part nor describe how to specify these parts at all.” More specifically, Applicant argues that “Robinson is silent as to the position of skin of breast above ROI with respect to the position of ROI in the three-dimensional space, i.e., the second position. Therefore, accurate calculation of a first linear distance L1 from the first position to the second position by the first distance calculation unit, accurate calculation of a second linear distance L2 from the ROI to the epidermis of breast above the ROI by the second distance calculation unit, and accurate calculation of a third linear distance L3 from the nipple to the ROI by the third distance calculation unit.” The Office respectfully disagrees. FIG. 25, as reproduced above, clearly illustrates the “position of skin of breast above ROI with respect to the position of ROI in the three-dimensional space” as the second position. Triangulating any position within a mapped breast is obvious in view of Robinson because Robinson teaches: “Determining the distance of the features from the reference points using the method of Pythagoras (taking the square root of the sum of the squares of the distances of the x, y, and z locations of the features from the x′, y′, and z′ locations of the references and transforming those distances to new locations (x″, y″, and z″) relative to the standard reference of the idealized breast (x0, y0, and z0))” (emphasis added). Since the entire breast is mapped, any new marked “reference point” can be triangulated based on simple arithmetic {i.e. the Pythagorean theorem}, which is well within the means of one of ordinary skill in the art and clearly obvious in view of Robinson's triangulation disclosure.

Applicant also argues that “since Robinson is silent as to the intention and significance of calculation of the third linear distance from the nipple to the ROI by the third distance calculation unit, the subject matter according to claim 1 is not obvious over Robinson.” Applicant purports that “The problem addressed and solved by the present application is described in paragraphs [0008]-[0011] of the present application. Prior techniques do not disclose how to obtain the linear distance from the nipple to the region of interest of the breast, which is important as the positional information of the region of interest of the breast in the ultrasound image.” The Office respectfully disagrees. Robinson teaches “computing a distance between the first and second projections, and constructing the idealized map based on the relative angle and distance computed,” such as, for example, the two projections as illustrated in FIG. 25 (see Robinson ¶ [0016] & FIG. 22).
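For clarity, the method-of-Pythagoras passage quoted above amounts to the ordinary three-dimensional Euclidean distance between a feature and a reference point; the notation below is an illustrative restatement and is not taken verbatim from Robinson.

```latex
% Distance of a feature at (x, y, z) from a reference point at (x', y', z'):
\[
  d = \sqrt{(x - x')^{2} + (y - y')^{2} + (z - z')^{2}}
\]
% Applying the same computation to the first position (nipple), the second
% position (epidermis above the ROI), and the ROI would yield the claimed
% linear distances L1, L2, and L3 discussed in the rejection of Claim 1.
```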
Applicant also argues that the Office is “relying on improper hindsight in an attempt to support a legal conclusion of obviousness” because “Robinson does not recognize the significance of calculating a third linear distance from the nipple to the region of interest by a third calculation unit.” Applicant cites ¶ [0009-0010] of the instant specification to provide the problem solved by the claimed invention. As noted above, Robinson does disclose measuring the third distance, a distance between two projections, e.g. the two projections illustrated in FIG. 25. It should be appreciated that “The reason or motivation to modify the reference may often suggest what the inventor has done, but for a different purpose or to solve a different problem. It is not necessary that the prior art suggest the combination to achieve the same advantage or result discovered by applicant. See, e.g., In re Kahn, 441 F.3d 977, 987, 78 USPQ2d 1329, 1336 (Fed. Cir. 2006),” see MPEP § 2144(IV). One of ordinary skill in the art would understand, in light of the cited paragraphs of the instant specification, that Robinson addresses a similar problem of “provid[ing] the user with a sense of where the lesion is positioned in the actual breast relative to the reference point” with their idealized breast reconstruction (see Robinson ¶ [0016]). Since Applicant fails to disclose an explicit definition of the error, one of ordinary skill in the art can reasonably interpret Robinson's “sense of where the lesion is positioned” to amount to the instant problem of minimizing positional error of the region of interest on the skin.

Applicant also argues that “Applicant submits that Robinson does not disclose specifying the epidermis above the ROI. Therefore, the Examiner's assertion appears to be based on an overinterpretation.” Without conceding to Applicant's arguments, the Office introduces Xu as cited by previously cited Zhuang. Xu teaches image segmentation of breast ultrasound using a CNN {i.e. image analysis unit} to identify: skin {i.e. epidermis}, fibrous gland, mass, and adipose (see Zhuang pg. 3, ¶ 2).

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ASHISH S. JASANI whose telephone number is (571)272-6402. The examiner can normally be reached M-F 8:00 am - 4:00 pm (CST). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Keith M. Raymond can be reached on (571) 270-1790. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /ASHISH S JASANI/Examiner, Art Unit 3798 /KEITH M RAYMOND/Supervisory Patent Examiner, Art Unit 3798