Patent Application 18638079 - ARTIFICIAL INTELLIGENCE AI AND MACHINE LEARNING - Rejection
Title: ARTIFICIAL INTELLIGENCE (AI) AND MACHINE LEARNING (ML) BASED GRAPHICAL USER INTERFACE (GUI) SYSTEM FOR EARLY DETECTION OF DEPRESSION SYMPTOMS USING FACIAL EXPRESSION RECOGNITION AND ELECTROENCEPHALOGRAM
Application Information
- Invention Title: ARTIFICIAL INTELLIGENCE (AI) AND MACHINE LEARNING (ML) BASED GRAPHICAL USER INTERFACE (GUI) SYSTEM FOR EARLY DETECTION OF DEPRESSION SYMPTOMS USING FACIAL EXPRESSION RECOGNITION AND ELECTROENCEPHALOGRAM
- Application Number: 18638079
- Submission Date: 2025-05-16
- Effective Filing Date: 2024-04-17
- Filing Date: 2024-04-17
- National Class: 600
- National Sub-Class: 408000
- Examiner Employee Number: 86979
- Art Unit: 3797
- Tech Center: 3700
Rejection Summary
- 102 Rejections: 0
- 103 Rejections: 2
Cited Patents
The following patents were cited in the rejection:
- US 2024/0346552 (Khalid)
Office Action Text
DETAILED ACTION

Notice of AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claims 1, 9, and 15 are objected to because of the following informalities: "a emotion" should read "an emotion". Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION. The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.

Regarding claims 1, 9, and 15, the past-tense term "requested" renders the claims unclear. It is unclear why, in response to the identification, the data is considered to have been "requested".

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.

Claims 1-4, 6-10, 12-14, 16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over "EEG Signal and Video Analysis Based Depression Indication" by Y. Katyal et al., IEEE Int. Conf. Adv. Comm. Ctrl. Comput. Tech., 2014 (Katyal), in view of "Multimodal Affective State Assessment Using fNIRS + EEG and Spontaneous Facial Expression" by Y. Sun et al., Brain Sciences, 10, 85, Feb. 6, 2020 (Sun).

Regarding claims 1, 9, and 15, Katyal discloses a method, system, and operations comprising: receiving image data of a biological subject under examination for depression; analyzing the image data to identify an emotional state conveyed by a face of the biological subject; identifying the emotional state as an emotion indicative of depression in the biological subject; acquiring electroencephalogram data for the biological subject using a machine learning model ("neural network relevant to the application being considered (i.e. classification of EEG data) will be employed for designing classifiers"); receiving, from the biological subject, the electroencephalogram data; analyzing the electroencephalogram data to identify depression in the biological subject; and providing the emotional state identified to a graphical user interface (Abstract: "combining both EEG signal analysis and facial emotion recognition through video analysis"; "Using both EEG and Facial results gives a higher percentage output than the individual results").

Katyal does not explicitly disclose that a facial emotional state is identified by way of a first machine learning model, that the electroencephalogram data is acquired in response to an indication of depression, or that a severity level of depression is identified.

However, Sun teaches using a trained Hidden Markov Model to identify a facial-based emotional state of a person ("It utilizes Regional Hidden Markov Model (RHMM) as its classifier to train the states of three face regions..."), measuring a severity level of depression (Fig. 7(e)), and that joint utilization of facial expression and wearable neuroimaging, which includes EEG, improved overall emotional analysis (Abstract; Section 5, Conclusions). Additionally, Katyal teaches that "visual attributes work as a cross check in the study", which suggests to one of ordinary skill that a sequential detection and verification of depression indicators may contribute to a more accurate detection of depression. Katyal also discloses a group consisting of "not depressed" (Fig. 16), which is a measure of severity. Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to apply the identification and classification of mental state as taught by Sun to the depression indication of Katyal, so as to provide robust bimodal detection of depression.
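For illustration only (not part of the Office action or the cited references): the claim mapping above recites a two-stage flow in which facial analysis runs first and electroencephalogram data is requested only after a depression-indicative emotion is identified. A minimal sketch of that flow follows; the facial_model, eeg_model, acquire_eeg, and gui objects are hypothetical placeholders, not code from Katyal, Sun, or the application.

```python
# Illustrative sketch of the two-stage screening flow recited in claims 1, 9, and 15.
# All objects passed in are hypothetical stand-ins, not code from the cited references.

DEPRESSION_INDICATIVE = {"sadness"}  # emotions treated as indicative of depression (assumed set)

def screen_subject(image_data, facial_model, eeg_model, acquire_eeg, gui):
    # First machine learning model: emotional state conveyed by the subject's face.
    emotion = facial_model.predict(image_data)
    gui.display(emotion=emotion)
    if emotion not in DEPRESSION_INDICATIVE:
        return None  # no indication of depression, so EEG data is never requested
    # The identified emotion indicates possible depression: EEG data is requested in response.
    eeg_data = acquire_eeg()
    # Second machine learning model: severity level of depression from the EEG data.
    severity = eeg_model.predict(eeg_data)
    gui.display(emotion=emotion, severity=severity)
    return severity
```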
Regarding claims 2 and 3, while Katyal does not explicitly disclose the use of a random forest model, a random forest model is a well-known and conventionally used machine learning model in the art, and thus it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to apply a random forest model of machine learning, so as to provide a robust and appropriate machine learning algorithm.

Regarding claims 4, 10, and 16, Katyal discloses that the emotional state is sadness (Abstract: "depression").

Regarding claims 6, 12, and 18, Katyal discloses that the image data are provided as one of still images or video images ("detect the face in the given image or video sequence").

Regarding claims 7, 13, and 19, Katyal does not explicitly disclose generating a treatment plan for the biological subject based on the severity level identified by the second machine learning model. However, Katyal states, "Results shown can detect even a small change in the patient's mind and his level of depression which can help in early medication and saving the patient, if at all, from having suicidal tendencies", which suggests to one of ordinary skill that it would have been obvious to generate a treatment plan based on a measured severity level.

Regarding claims 8, 14, and 20, Katyal discloses a group consisting of "not depressed" (Fig. 16).

Claims 5, 11, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over "EEG Signal and Video Analysis Based Depression Indication" by Y. Katyal et al., IEEE Int. Conf. Adv. Comm. Ctrl. Comput. Tech., 2014 (Katyal), in view of "Multimodal Affective State Assessment Using fNIRS + EEG and Spontaneous Facial Expression" by Y. Sun et al., Brain Sciences, 10, 85, Feb. 6, 2020 (Sun), as applied to claims 1, 9, and 15 above, and further in view of Khalid (US 2024/0346552).

Regarding claims 5, 11, and 17, neither Katyal nor Sun explicitly discloses that the emotional state identified by the first machine learning model and the severity level identified by the second machine learning model are provided to a graphical user interface in real time. However, Khalid teaches performing expression analysis in real time ([0040]). Thus, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to apply the real-time feed as taught by Khalid to the system of Katyal and Sun, so as to provide robust real-time feedback.
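As an illustration of the conventional classifier referred to in the rejection of claims 2 and 3 above, the following minimal sketch trains a random forest on synthetic placeholder features using scikit-learn; the feature dimensions and labels are assumptions for demonstration, not data from Katyal, Sun, or the application.

```python
# Illustrative sketch only: a conventional random forest classifier of the kind
# referred to for claims 2 and 3. Features and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))     # 16 hypothetical facial/EEG-derived features per sample
y = rng.integers(0, 2, size=200)   # synthetic labels: 0 = not depressed, 1 = depressed

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```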
Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jason Ip, whose telephone number is (571) 270-5387. The examiner can normally be reached Monday - Friday, 9 a.m. - 5 p.m. PST.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Christopher Koharski, can be reached at (571) 272-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JASON M IP/
Primary Examiner, Art Unit 3793