Patent Application 18170294 - POSITION TRACKING CLASSIFICATION AND METHODS FOR USE THEREWITH - Rejection
Title: POSITION TRACKING CLASSIFICATION AND METHODS FOR USE THEREWITH
Application Information
- Invention Title: POSITION TRACKING CLASSIFICATION AND METHODS FOR USE THEREWITH
- Application Number: 18170294
- Submission Date: 2025-05-15T00:00:00.000Z
- Effective Filing Date: 2023-02-16T00:00:00.000Z
- Filing Date: 2023-02-16T00:00:00.000Z
- National Class: 340
- National Sub-Class: 541000
- Examiner Employee Number: 84375
- Art Unit: 2686
- Tech Center: 2600
Rejection Summary
- 102 Rejections: 0
- 103 Rejections: 2
Cited Patents
The following patents were cited in the rejection:
- US 2011/0207471 A1
- US 2020/0319721 A1 (Erivantcev)
Office Action Text
DETAILED ACTION Notice of Pre-AIA or AIA Status The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA . To expedite the prosecution of this application, Examiner is not restricting claim(s) 1-25. Claim(s) G1: 1-5, 9, 15-21, and 25 are directed to an invention independent of the claim(s) G2: 1, 6, 7, 8, 10.16, 17, and 22-24, G3: claims 1, 11-14, 16, and 17. Further amendments to claim(s) 1-25 or their dependent claims may cause an undue burden at the US Patent Office, and if so, the claim(s) will be restricted in the subsequent office action under the statute 35 USC 121. Claim Objections Claim 1 is objected to because of the following informalities: Claim 1 recites, “the output signal” in line 8 and it is unclear whether “the output signal” is referring to the “an output” in line 3. Applicant is requested to maintain proper antecedent basis in the claims. Appropriate correction is required. Claim Interpretation The following is a quotation of 35 U.S.C. 112(f): (f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph: An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function. Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 
112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function. Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “an interface configured to interface and communicate with a network” in claim 1, is interpreted to be “interfaces include, for example, Bluetooth transceivers, Ultra-Wideband (UWB) transceivers, WiFi transceivers (802.11a—802.11xx), 4G, 5G or other cellular data transceivers, WIMAX transceivers, ZigBee transceivers, Z-wave transceivers, 6LoWPAN transceivers, IPV6 transceivers based on THREAD, or other wired or wireless communication interfaces.” See ¶ 0023; “the sensor unit is configured to measure and report an output” in claims 1 is interpreted to be “mechanical switches, magnetic switches, thermocouples, impedance devices, accelerometers, gyroscopes, magnometers, passive infrared sensors, thermal sensors, image sensors, manometers, voltmeters, acoustic sensors, acoustic wave sensors, flow sensors, pressure sensors, force sensors, compression load sensors, compression sensors, vibration sensors etc.” See ¶ 027; the “processing circuitry is configured to execute the operational instructions to” in claims 1, 8, 10, 15, is interpreted to be, “a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.” See ¶ 0090; “another sensor unit is configured to sense one or more additional changes to the environment” in claim 4, is interpreted to be same as sensor unit, See ¶ 0027, “the another sensor unit is configured to measure and report events associated with” in claim 5, is interpreted to be same as sensor unit, See ¶ 0027; “the network is configured to aggregate the notification with…” in claim 11, is interpreted to be “[t]he network 125 can include the Internet or other wide area network, a home network, a virtual private network or other private network, a 
personal area network and/or other data communication network including wired, optical and/or wireless links.” See ¶ 0025; “sensing, by a sensor unit … changes to an environment” in claim 17 is interpreted to be same as sensor unit, See ¶ 0027; “the sensor unit is configured to measure and report events associated with” in claim 19, is interpreted to be same as sensor unit, See ¶ 0027; “sensing, by another sensor …, one or more additional changes to the environment,” in claim 20 is interpreted to be same as sensor unit, See ¶ 0027; “another sensor unit is configured to measure and report events associated with” in claim 21, is interpreted to be same as sensor unit, See ¶ 0027; “inertial measurement unit is configured to measure and report events associated with…” in claim 25 is interpreted to be “inertial measurement unit 500 comprising, for example, gyroscopes (502-2, 502-3 and 502-1) and accelerometers (504-2, 504-3 and 504-1))” See ¶ 0038. Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. Claim Rejections - 35 USC § 112 The following is a quotation of the first paragraph of 35 U.S.C. 112(a): (a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention. The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112: The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention. Claim 25 is rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA ), first paragraph, as failing to comply with the enablement requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to enable one skilled in the art to which it pertains, or with which it is most nearly connected, to make and/or use the invention. Claim 25 recites, “inertial measurement unit, wherein the inertial measurement unit is configured to measure and report events associated with … temperature.” The specification fails to enable one skilled in the art to use an inertial measurement unit to measure and report temperature. 
Please point to the section of the specification that enables one skilled in the art to measure and report temperature using an inertial measurement unit. The following is a quotation of 35 U.S.C. 112(b): (b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention. The following is a quotation of 35 U.S.C. 112 (pre-AIA ), second paragraph: The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention. Claims 3, 19, and 25 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA ), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 3 recites, “wherein the motion states include … temperature associated with the sensor apparatus”. It is unclear to the examiner the relation between the temperature associated with the sensor apparatus and the motion states. Examiner does not have an educated guess as to the subject matter the inventor or a joint inventor regards as the invention. Claim 19 recites an identical limitation and is rejected for the same reasons. Claim 25 recites “substrate” and “memory;” however, it fails to tie these elements to the rest of the claimed limitations, and it is unclear to the examiner the purpose of these elements in the claimed invention. Therefore, it is unclear to the Examiner the subject matter the inventor or a joint inventor regards as the invention. Claim 25 recites, “inertial measurement unit, wherein the inertial measurement unit is configured to measure and report events associated with … temperature.” It is unclear to the examiner the relationship between the IMU and the temperature, and, therefore, it is unclear to the Examiner the subject matter the inventor or a joint inventor regards as the invention. Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claim(s) 1-24 are rejected under 35 U.S.C. 103 as being unpatentable over Murray (US 2011/0207471 A1). Consider claim 1, Donegan teaches, a sensor apparatus, “detecting and identifying security events using signals from a distributed group of sensor devices within a premises” See ¶ 0004, comprising: an interface (228/862) configured to interface and communicate with a network (114), Donegan teaches, “[t]he centralized hub 102 can be in communication via network(s) 114 with sensor devices 108A-N and sensors 112A-N. The centralized hub 102, sensor devices 108A-N, and/or sensors 112A-N can also be in communication via network(s) 114 with a security system for the building 100. 
Communication can be wired and/or wireless (e.g., BLUETOOTH, WIFI, ETHERNET, etc.). Communication can also be through a home network.” See ¶ 0048, “communication interface 228 can be configured to provide communication between the sensor devices 108A-N and the components,” See ¶ 0080, “communication interface 862 can facilitate communication (e.g., wired or wireless) with the other components, 830 and 840, comprising the sensor device 108A. The communication interface 862 can also facilitate communication between the sensor device 108A, the centralized hub 102,” See ¶ 0226; a sensor unit (222, 872, See Figs. 2A, 8A), wherein the sensor unit is configured to measure and report an output representative of a motion state for the sensor apparatus, Donegan teaches, “the sensor devices 108A-N can include a suite of sensors and other components, including but not limited to processor(s) 214, light sensor 216, sound sensor 218, temperature sensor 220, motion sensor 222, image sensor 224, output device(s) 226, communication interface 228,” See ¶ 0076 “motion signals indicating sudden movement near a window” See ¶ 0028, “The motion detection information can indicate the paths that occupants frequently use while moving within the building 1100. Paths can be identified by time and location of detected motion, as well as direction of motion as indicated by successive motion detection data points. For example, first, second and third motion sensors may detect motion at first, second, and third time points that are one second apart, indicating that a user moved between locations associated with the first, second, and third motion sensors.” See ¶ 0249; memory (1204/1206) that stores operational instructions; and processing circuitry (1202) operably coupled to the interface and to the memory, wherein the processing circuitry is configured to execute the operational instructions, Donegan teaches, “processor 1202 can process instructions for execution within the computing device 1200, including instructions stored in the memory 1204 or on the storage device 1206” See ¶ 0262, to: receive the output signal from the sensor unit, wherein the output signal is representative of one of a plurality of motion states for the sensor unit, wherein the output signal includes information sufficient to determine one or more changes to an environment associated with the sensor apparatus, Donegan teaches, “Sensor devices can be positioned throughout a premises and can provide sensed information about the environment within the premises to a central computing device and/or system… [t]he device located within the premises can process the signals locally and/or in combination with a remote computer system (e.g., cloud-based computing system). The central computing device and/or system can generate specific outputs that are appropriate for detected events and conditions within the premises, such as transmitting alerts and/or messages regarding related events to user devices associated with the premises” See ¶ 0005, “The motion detection information can indicate the paths that occupants frequently use while moving within the building 1100. Paths can be identified by time and location of detected motion, as well as direction of motion as indicated by successive motion detection data points. 
For example, first, second and third motion sensors may detect motion at first, second, and third time points that are one second apart, indicating that a user moved between locations associated with the first, second, and third motion sensors.” See ¶ 0249; classify, via an artificial intelligence model [specification states, “Examples of such AI include… machine learning techniques” See ¶ 0098,], the output signal according to previously classified events to produce a classified output, Donegan teaches, “[t]he central monitoring system can also classify the security event using one or more machine learning models that were trained to identify a type of the security event using training data that correlates information about conditions detected on premises with different types of security events, and generate, based on the classified security event, instructions to produce audio or visual output at the plurality of sensor devices.” See ¶ 0013, “expected decibel readings for the room 106B can be based on previous decibel readings for the room at the same or similar time as time=1… Therefore, if the detected signals at time=1 is a sudden increase in decibel readings that deviates from the expected signals at time=1, the centralized hub 102 can determine that the detected signal likely represents some type of security event.” See ¶ 0065, “models can be trained using deep learning (DL) neural networks, convolutional neural networks (CNNs), and/or one or more other types of machine learning techniques, methods, and/or algorithms. The models can also be trained using training data that includes signals that have been detected by the sensor devices 108A-N and/or the sensors 112A-N in the building 100. The models can be trained to identify security events based on detected signals and expected conditions of the particular building 100. The models can also be trained to identify security events based on signals and expected conditions in a variety of different buildings.” See ¶ 0068; and determine whether to transmit a notification to the network, Donegan teaches, “The detected conditions can be received by a centralized hub (e.g., centralized computer device/system), which can determine whether the detected conditions exceed expected threshold levels at a particular time when the conditions are detected and/or whether the detected conditions exceed normal conditions for the building that are learned over time. The centralized hub can use the detected conditions along with relative timing of the detected conditions and a physical relationship of the sensor devices to each other in the building to determine and classify a type of security event, a severity of the event, a location of the event, and/or other event related information.” See ¶ 0047, “the sensor device 108C may only transmit the notification to the emergency response personnel based on determining that a detected severity level of the fire 518 exceeds some threshold reporting out level.” See ¶ 0194, See Fig. 3 step 306 and 320. It would have been obvious to one of ordinary skilled in the art at the time of invention (effective filing date for AIA application) to modify the embodiment (Fig. 2A) of Donegan with the embodiment (Fig. 
8A) of Donegan so “the sensor devices may perform local processing of the detected audio and, once one or more conditions and/or patterns have been detected, transmit audio information to the centralized computer device/system detailing that detected security related event.” As suggest by Donegan, See ¶ 0009, “so that when events are detected by the sensor devices, timestamps can be attached to that information across a normalized timescale. Therefore, the centralized computer device/system can identify relative timing of detected events across the different sensor devices.” Consider claim 2, the sensor apparatus of claim 1, wherein the sensor unit includes least one of an accelerometer, a gyroscope, a temperature sensor and a magnetometer, Donegan teaches, “Such sensor devices can include a collection of sensors that are configured to detect various conditions, such as microphones to detect sound, cameras to detect visual changes, light sensors to detect changes in lighting conditions, motion sensors to detect nearby motion, temperature sensors to detect changes in temperature, accelerometers to detect movement of the devices themselves, and/or other sensors.” See ¶ 0004, Consider claim 3, the sensor apparatus of claim 1, wherein the motion states include any two of inclination, acceleration, rotation, rotational polarity, vibration and temperature associated with the sensor apparatus, Donegan teaches, “motion sensors to detect nearby motion, temperature sensors to detect changes in temperature, accelerometers to detect movement of the devices themselves” See ¶ 0004, “The computer system can also automatically receive the signals whenever a sensor detects a change in state (e.g., a deviation in passively monitored signals) in the building. Moreover, the computer system can receive the signals upon transmitting requests for any sensed signals from the sensors. Also as described throughout this disclosure, the sensors can include the sensor devices 108A-N and/or the sensors 112A-N positioned throughout the building. The receive signals can include but are not limited to audio (e.g., decibels), visual (e.g., video feed data, image data), light, motion, temperature, and/or smoke signals.” See ¶ 0124. Consider claim 4, the sensor apparatus of claim 1, further comprising: another sensor unit, wherein the another sensor unit is configured to sense one or more additional changes to the environment, Donegan teaches, “Such sensor devices can include a collection of sensors that are configured to detect various conditions, such as microphones to detect sound, cameras to detect visual changes, light sensors to detect changes in lighting conditions, motion sensors to detect nearby motion, temperature sensors to detect changes in temperature, accelerometers to detect movement of the devices themselves, and/or other sensors.” See ¶ 0004. 
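For illustration of the successive-detection reasoning quoted above from Donegan ¶ 0249 (first, second, and third motion sensors firing roughly one second apart, indicating a user moved between the associated locations), the following Python sketch chains timestamped detections into a likely occupant path. It is an editorial example only; the MotionEvent structure, the infer_path function, and the two-second gap parameter are assumptions and do not appear in the application or the cited reference.

```python
# Editorial illustration only: chaining successive motion detections into a path,
# in the manner of the passage cited from ¶ 0249. All names here are hypothetical.
from dataclasses import dataclass


@dataclass
class MotionEvent:
    sensor_id: str    # which motion sensor fired
    location: str     # room or zone associated with that sensor
    timestamp: float  # seconds since an arbitrary reference


def infer_path(events: list[MotionEvent], max_gap: float = 2.0) -> list[str]:
    """Order detections by time and chain locations whose successive detections
    fall within max_gap seconds, yielding a likely occupant path."""
    path: list[str] = []
    prev_time: float | None = None
    for ev in sorted(events, key=lambda e: e.timestamp):
        if prev_time is not None and ev.timestamp - prev_time > max_gap:
            break  # too long a gap to treat as one continuous movement
        if not path or ev.location != path[-1]:
            path.append(ev.location)
        prev_time = ev.timestamp
    return path


# Detections one second apart suggest movement hallway -> kitchen -> garage.
events = [
    MotionEvent("m1", "hallway", 0.0),
    MotionEvent("m2", "kitchen", 1.0),
    MotionEvent("m3", "garage", 2.0),
]
print(infer_path(events))  # ['hallway', 'kitchen', 'garage']
```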
Consider claim 5, the sensor apparatus of claim 4, wherein the another sensor unit is configured to measure and report events associated with at least one of acoustic energy, pressure, compression, compression load, force, fluid movement, torque, chemical properties, vapor, mass flow of a gas, or humidity, “security events can include, but are not limited to, break-ins, burglary, theft, natural disasters, fire, carbon monoxide, flood, gas leaks, and other types of emergencies.” See ¶ 0048, Donegan teaches, “the emergency detection devices 112A-N can be of various configurations, such as a smoke detector and a heat sensor (e.g., a temperature sensor, an infrared sensor, etc.).” See ¶ 0090. Consider claim 6, the sensor apparatus of claim 1, wherein the classified output is associated with a single event, Donegan teaches, “the emergency indication information 266 can indicate location(s) of single or multiple emergencies within the building” See ¶ 0092. “in a scenario where the centralized hub 102 goes down or a connection between the sensor devices 108A-N and the centralized hub 102 goes down, the sensor devices 108A-N can communicate with each other to determine a state of activity in the building 100 and whether a security event is detected.” See ¶ 0050. Consider claim 7, the sensor apparatus of claim 1, wherein the classified output is associated with changes to the environment over a period of time T, Donegan teaches, “the emergency indication information 266 can indicate location(s) of single or multiple emergencies within the building” See ¶ 0092. “a temperature signal that increases enough to exceed the expected threshold temperature condition over a longer period of time can be identified, by the computer system, as having a lower severity level.” See ¶ 0138. “The sensor device 108C can transmit a notification to the other sensor devices 108A-N asking the other sensor devices 108A-N if they detected any anomalous signals over some predetermined period of time. The notification can also request the other sensor devices 108A-N to provide the sensor device 108C with any signals that were detected over some predetermined period of time.” See ¶ 0163. Consider claim 8, the sensor apparatus of claim 1, wherein the processing circuitry is further configured to execute the operational instructions to: compare the classified event output to a plurality of classified events in the memory, “the centralized hub 102 can compare the detected signals to historic signals that correspond to the building 100 and/or the room in which the signal was detected. For the sharp increase in decibel readings in the room 106B, the centralized hub 102 can compare this increase to expected decibel readings for the room 106B. The expected decibel readings for the room 106B can be based on previous decibel readings for the room at the same or similar time as time=1.”See ¶ 0065; and when the classified event output compares favorably to a classified event of the plurality of classified events, determine to transmit the notification to the network, Donegan teaches, “For example, if time=1 is at 8:30 in the morning, the expected decibel readings can be a historic spread of decibel readings that were taken at 8:30 in the morning over a certain number of days. At 8:30 in the morning, historic changes in decibel readings can be very low because building occupants may still be asleep at that time. 
Therefore, if the detected signals at time=1 is a sudden increase in decibel readings that deviates from the expected signals at time=1, the centralized hub 102 can determine that the detected signal likely represents some type of security event.” See ¶ 0065. Consider claim 9, the sensor apparatus of claim 1, wherein the one or more changes are associated with inertial events, Donegan teaches, “Such sensor devices can include a collection of sensors that are configured to detect various conditions, such as microphones to detect sound, cameras to detect visual changes, light sensors to detect changes in lighting conditions, motion sensors to detect nearby motion, temperature sensors to detect changes in temperature, accelerometers to detect movement of the devices themselves, and/or other sensors.” See ¶ 0004. Consider claim 10, the sensor apparatus of claim 1, wherein the processing circuitry is further configured to execute the operational instructions to: receive another output signal from the sensor unit, Donegan teaches, “centralized computer device/system can also be trained to determine a severity of this identified security event based on the linked signals.” See ¶ 0028; classify, via an artificial intelligence model, the another output signal according to previously detected events to produce another classified event output, Donegan teaches, “Correlating the signals can include linking signals that deviate from the expected threshold values to piece together and identify the security event. As described herein, the centralized hub 102 can correlate different types of signals to piece together the security event.” See ¶ 0067, “the computer system can classify the security event in 312. As described in reference to FIGS. 2A-C, the computer system can apply one or more machine learning models to the linked signals in order to classify the security event. Classifying the security event can include determining a type of security event (314).” See ¶ 0137 ; and store information representative of the classified event output and the another classified event output in the memory, Donegan teaches, “the normal conditions determiner 244 can take a historic spread of detected signals in order to determine what is normal for the building.” See ¶ 0106, “the more signals that can be linked to create a robust story of the security event can improve confidence and/or ability of the computer system to detect future security events.” See ¶ 0136, “[t]he computer system can be trained to identify a variety of security event types from different combinations of signals and deviations in signals. The computer system can also be trained to identify the type of security event based on a variety of factors, such as how much the signals deviate from the expected threshold conditions, what type of signals have been detected, where the detected signals were located in the building, whether occupants were present or near the signals when detected, etc.” See ¶ 0137. 
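As an editorial illustration of the comparison described in Donegan ¶ 0065 and relied on for claim 8 (a detected reading compared against a historic spread of readings for the same time of day, with a notification sent only when the deviation and an estimated severity warrant it), the following Python sketch shows one possible form of that check. The function names, the three-standard-deviation rule, and the severity threshold are assumptions, not details taken from the reference.

```python
# Editorial sketch of a historic-spread deviation check followed by a reporting
# decision. Names and thresholds are hypothetical.
import statistics


def deviates_from_expected(reading: float, historic_readings: list[float],
                           num_stdevs: float = 3.0) -> bool:
    """True when the reading falls outside the historic spread for this time slot."""
    mean = statistics.mean(historic_readings)
    stdev = statistics.pstdev(historic_readings) or 1e-9  # guard against a zero spread
    return abs(reading - mean) > num_stdevs * stdev


def decide_to_notify(reading: float, historic_readings: list[float],
                     severity: float, severity_threshold: float) -> bool:
    """Transmit a notification only when the reading is anomalous AND the estimated
    severity exceeds a reporting threshold (cf. the cited severity-level discussion)."""
    return deviates_from_expected(reading, historic_readings) and severity >= severity_threshold


# Quiet 8:30 am baseline in decibels, then a sudden loud reading with high severity.
baseline = [32.0, 30.5, 31.2, 29.8, 33.1]
print(decide_to_notify(85.0, baseline, severity=0.9, severity_threshold=0.5))  # True
```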
Consider claim 11, the sensor apparatus of claim 1, wherein the notification includes information representative of a classified event, Donegan teaches, “the sensor devices can output a notification of the detected event and, based on the tone or inflection in the peoples' voices, instructions or guidance to help the people safely and calmly address the detected security event (e.g., egressing the premises along one or more predetermined escape plans).” See ¶ 0012; and wherein the network is configured to aggregate the notification with one or more additional notifications received from the sensor apparatus, Donegan teaches, “[d]ynamic evacuation guidance can be provided that is based on real-time situational information about people and compromised location(s) within the premises.” See ¶ 0025. Donegan does not explicitly state, aggregate the notification with one or more additional notifications received from the sensor apparatus. Examiner takes Official notice that it is well known in the prior art to aggregate the notification with one or more additional notifications received from the sensor apparatus in order to provide dynamic notification or guidance to the end-user. Consider claim 12, the sensor apparatus of claim 1, wherein the notification includes information representative of a classified event, Donegan teaches, “the sensor devices can output a notification of the detected event and, based on the tone or inflection in the peoples' voices, instructions or guidance to help the people safely and calmly address the detected security event (e.g., egressing the premises along one or more predetermined escape plans).” See ¶ 0012; and wherein the information representative of a classified event is adapted indicate a relative importance of the classified event, Donegan teaches, “the sensor devices 108A-N or the centralized hub 102 can determine a preferred form of outputting notifications, guidance, or other information to building occupants. Audio output can be preferred in emergency situations where, for example, there is a fire and occupants are unable to see visual guidance prompts through smoke and flames. Visual output can be preferred in some security event situations where, for example, there is a break-in and audio signals could attract a thief to a location of the building occupant(s), thereby putting the building occupant(s) at more risk.” See ¶ 0078. “the sensor device 108C may only transmit the notification to the emergency response personnel based on determining that a detected severity level of the fire 518 exceeds some threshold reporting out level.” See ¶ 0208. Consider claim 13, the sensor apparatus of claim 1, wherein the notification includes information representative of a classified event, Donegan teaches, “the sensor devices can output a notification of the detected event and, based on the tone or inflection in the peoples' voices, instructions or guidance to help the people safely and calmly address the detected security event (e.g., egressing the premises along one or more predetermined escape plans).” See ¶ 0012; and wherein the information representative of a classified event is adapted to trigger a specific action based on the classified event, Donegan teaches, “FIG. 5B, the sensor device 108C can respond to the occupant 502A by outputting a message that states, “A fire started in the kitchen. Evacuate through the door!” (516). The sensor device 108B in the room 504A can output audio guidance to the occupant 502B that states, “A fire started in the kitchen. 
Leave the room immediately and avoid the kitchen.”” See ¶ 0191. “the building may receive and present information related to the fire 1128 and recommended evacuation of the building 1100.” See ¶ 0260. Consider claim 14, the sensor apparatus of claim 1, wherein the network is a local area network, Donegan teaches, “Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.” See ¶ 0278, wherein the network is configured to consolidate a plurality of notifications to provide consolidated notifications, Donegan teaches, “[d]ynamic evacuation guidance can be provided that is based on real-time situational information about people and compromised location(s) within the premises.” See ¶ 0025. Donegan does not explicitly state, aggregate the notification with one or more additional notifications received from the sensor apparatus. Examiner takes Official notice that it is well known in the prior art to consolidate a plurality of notifications to provide consolidated notifications in order to provide dynamic notification or guidance to the end-user; and wherein the network is further configured to determine whether to transmit the consolidated notifications to a third party, Donegan teaches, “causing one or more of the sensor devices to collectively output egress and/or other guidance to people within the premises, communicating with appropriate emergency personnel (e.g., fire department, police, emergency medical services), and/or other communication.” Examiner takes Official notice that it is well known in the prior art to provide consolidated notifications to a third party in order to provide dynamic notification. Consider claim 15, the sensor apparatus of claim 1, wherein the sensor unit includes a plurality of sensors, See Donegan Fig. 8 A. wherein the processing circuitry is further configured to execute the operational instructions, to: turn off power to at least some of the plurality of sensors in a first mode of operation; determine whether motion is detected in one or more sensors of the plurality of sensors that are not turned off in the first mode; and in response to a determination that motion is detected, turn on power to the at least some of the plurality of sensors in a second mode of operation, Examiner takes Official notice that it is well known in the prior art to turn off power, i.e. put in sleep mode, some of the sensors to save energy; determine whether motion is detected in the vicinity, turn on power to the sensors in sleep mode. Consider claim 16, the sensor apparatus of claim 1, further comprising: a power source (229, 838, 850, 864), wherein the power source is selected from a list that includes at least one of: a battery; a solar collection device, Donegan teaches, “[t]he sensor devices 108A-N can also include the power source 229, which can provide power to the sensor devices 108A-N. The power source 229 can be any type of power supply, including but not limited to batteries, solar energy, and/or plug-in battery packs.” See ¶ 0080. 
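For claim 15, the limitation at issue describes powering down some of the sensors in a first mode of operation and restoring power when a still-active sensor detects motion. The following Python sketch is an editorial illustration of that duty-cycling behavior under assumed class and method names; it is not the applicant's or Donegan's implementation.

```python
# Editorial sketch of two power modes: non-essential sensors are powered down in a
# first mode and restored when an always-on sensor reports motion. Hypothetical names.
class SensorUnit:
    def __init__(self, name: str, always_on: bool = False):
        self.name = name
        self.always_on = always_on  # e.g., a low-power motion sensor left running
        self.powered = True

    def set_power(self, on: bool) -> None:
        # Always-on sensors ignore requests to power down.
        self.powered = on or self.always_on


class PowerManager:
    def __init__(self, sensors: list[SensorUnit]):
        self.sensors = sensors

    def enter_first_mode(self) -> None:
        """Turn off power to the non-essential sensors to conserve energy."""
        for s in self.sensors:
            s.set_power(False)

    def on_motion_detected(self) -> None:
        """Second mode: motion seen by a still-powered sensor restores power to all."""
        for s in self.sensors:
            s.set_power(True)


sensors = [SensorUnit("motion", always_on=True), SensorUnit("camera"), SensorUnit("microphone")]
pm = PowerManager(sensors)
pm.enter_first_mode()      # camera and microphone power down; motion stays on
pm.on_motion_detected()    # a motion event wakes the powered-down sensors
print([(s.name, s.powered) for s in sensors])
```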
Consider claim 17, method for execution by a sensor apparatus, the method comprises: sensing, by a sensor unit of the sensor apparatus, one or more changes to an environment, See rejection of claim 1; receiving an output signal from the sensor unit, wherein the output signal is representative of the one or more changes, wherein the output signal includes information sufficient to determine each change of the one or more changes to the environment, See rejection of claim 1; classifying, by an artificial intelligence engine of the sensor apparatus, the output signal according to previously detected events to produce a classified event output, See rejection of claim 1; determining, based on the classified event output, whether to transmit a notification to the network, See rejection of claim 1; With respect to, in response to a determination to transmit the notification to the network, transmitting the notification to the network, Donegan teaches, “[i]n 306, the computer system can determine whether any of the received signals exceed the respective expected threshold conditions beyond a threshold level. Sometimes, the computer system can combine the received signals into a collective of signals. The computer system can then determine whether the collective of signals exceeds expected threshold conditions beyond the threshold level.” See ¶ 0129, “If any of the signals exceed the respective expected threshold conditions beyond the threshold level, then the computer system can identify any other signals that were captured at a similar or same time as the signals that exceed the respective expected threshold conditions in 308.” See ¶ 0134, “Once the security event is classified, the computer system can generate appropriate output in 320.” See ¶ 0141. Consider claim 18, the method of claim 17, wherein the sensor unit includes least one of an accelerometer, a gyroscope, at temperature sensor and a magnetometer, See rejection of claim 2. Consider claim 19, the method of claim 17, wherein the sensor unit is configured to measure and report events associated with any two of acceleration, rotation, rotational polarity, vibration and temperature associated with the sensor apparatus, See rejection of claim 3. Consider claim 20, the method of claim 17, further comprising: sensing, by another sensor associated with the sensor apparatus, one or more additional changes to the environment, See rejection of claim 4. Consider claim 21, the method of claim 17, wherein the another sensor unit is configured to measure and report events associated with at least one of acoustic energy, pressure, compression, compression load, force, fluid movement, torque, chemical properties, vapor, mass flow of a gas, or humidity, See rejection of claim 5. Consider claim 22, the method of claim 17, wherein the classified output is associated with a single event, See rejection of claim 6. Consider claim 23, the method of claim 17, wherein the classified output is associated with an output signal over a period of time T, See rejection of claim 7. Consider claim 24, the method of claim 17, further comprising: comparing the classified event output to a plurality of classified events in memory of the sensor apparatus; and when the classified event output compares favorably to a classified event of the plurality of classified events, determining to transmit the notification to the network, See rejection of claim 8. Claim 25 is rejected under 35 U.S.C. 
103 as being unpatentable over Murray (US 2011/0207471 A1) and further in view of Erivantcev (US 2020/0319721 A1). Consider claim 25, a sensor module, comprising: a substrate, Donegan teaches, “the sensor controller 852 and the audio signaling component 840 can share the same housing unit/circuit board” See ¶ 0215; one or more processors (214/852), wherein the one or more processors are configured to receive an output signal from the [[inertial measurement]] sensor unit, [specification states, “IMU … can include gyroscopes and accelerometers,” See ¶ 0037], Donegan teaches, “Such sensor devices can include a collection of sensors that are configured to detect various conditions, such as… accelerometers to detect movement of the devices themselves” see ¶ 0004, “processor(s) 214 can be configured to perform one or more techniques and operations described herein. For example, sometimes, one or more of the sensor devices 108A-N can be configured to operate like the centralized hub 102. In other words, the one or more sensor devices 108A-N can request and/or receive detected signals from other sensor devices 108A-N can determine whether any of the detected signals exceed expected threshold values.” See ¶ 0076, “sensor controller 852 can include a predetermined signaling logic 854, a predetermined output logic 856, a temperature sensor 858, a user presence sensor 860, a light sensor 866, a sound sensor 868, a motion sensor 872, an image sensor 874, and a communication interface 862. The sensor controller 852 can optionally include a power source 864 (e.g., battery) in order to power the sensor controller 852 and/or the sensor device 108A. Sometimes, the sensor controller 852 may not have one or more of the sensors 858, 860, 866, 868, 872, and 874, and instead can collect sensor information from sensors or other sensor devices 108A-N positioned throughout the building, as described throughout this disclosure.” See ¶ 0217; wherein the output signal is representative of one or more changes in an environment, wherein the output signal includes information sufficient to determine each change of the one or more changes in the environment and classify, via an artificial intelligence model, the output signal according to previously detected events to produce a classified event output, See rejection of claim 1, With respect to, a memory and an inertial measurement unit, wherein the inertial measurement unit is configured to measure and report events associated with any two of acceleration, rotation, rotational polarity, vibration and temperature, in an analogous art, Erivantcev teaches, “[a] system having sensor modules and a computing device. Each sensor module has an inertial measurement unit configured to track its orientation.” See abstract. 
“the user wears several sensor devices (111, 113, 115, 117 and 119) that track the orientations of parts of the user” See ¶ 0037, “the head module (111) and the arm module (113) have micro-electromechanical system (MEMS) inertial measurement units (IMUs) (121 and 131) that measure motion parameters and determine orientations of the head (107) and the upper arm (103).” See ¶ 0062, “the arm module (113) has a microcontroller (139) to process the sensor signals from the IMU (131) of the arm module (113) and a communication module (133) to transmit the motion/orientation parameters of the arm module (113) to the computing device (141).” See ¶ 0070, “Each of the microcontrollers (129, 139) may include a memory storing instructions controlling the operations of the respective microcontroller (129 or 139) to perform primary processing of the sensor data from the IMU (121, 131) and control the operations of the communication module (123, 133),” See ¶ 0080, “Each of the IMUs (131 and 121) has a collection of sensor components that enable the determination of the movement, position and/or orientation of the respective IMU along a number of axes. Examples of the components are: a MEMS accelerometer that measures the projection of acceleration (the difference between the true acceleration of an object and the gravitational acceleration); a MEMS gyroscope that measures angular velocities” See ¶ 0064, “rotations (665, 661 and 663) can be calculated from the orientation of the hand (108) relative to the orientation of the upper arm (103) (e.g., using the orientation data measured by the IMUs of the arm module (113)” See ¶ 0239. It would have been obvious to one of ordinary skill in the art at the time of invention (effective filing date for AIA application) to modify the invention of Donegan and have sensor modules, wherein “[e]ach of the microcontrollers… include a memory storing instructions controlling the operations of the respective microcontroller to perform primary processing of the sensor data from the IMU and control the operations of the communication module” and “[e]ach of the IMUs (131 and 121) has a collection of sensor components that enable the determination of the movement, position and/or orientation of the respective IMU along a number of axes…: a MEMS accelerometer that measures the projection of acceleration (the difference between the true acceleration of an object and the gravitational acceleration); a MEMS gyroscope that measures angular velocities … rotations” as suggested by Erivantcev, See ¶ 0064, 0080, in an effort to effectively discern “which of the computed orientation measurements are accurate, correct, or preferred and which of the computed orientation measurements are inaccurate, incorrect, or not preferred.” See ¶ 0267. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to Omer S. Khan whose telephone number is (571)270-5146. The examiner can normally be reached 10:00 am to 8:00 pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Brian A. Zimmerman, can be reached on 571-272-3059. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. 
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Omer S Khan/Primary Examiner, Art Unit 2686