Patent Application 18436335 - SYSTEM METHOD AND COMPUTER-READABLE STORAGE - Rejection
Title: SYSTEM, METHOD, AND COMPUTER-READABLE STORAGE MEDIUM FOR SYSTEM-ON-CHIP VERIFICATION
Application Information
- Invention Title: SYSTEM, METHOD, AND COMPUTER-READABLE STORAGE MEDIUM FOR SYSTEM-ON-CHIP VERIFICATION
- Application Number: 18436335
- Submission Date: 2025-05-19T00:00:00.000Z
- Effective Filing Date: 2024-02-08T00:00:00.000Z
- Filing Date: 2024-02-08T00:00:00.000Z
- National Class: 714
- National Sub-Class: 738000
- Examiner Employee Number: 77270
- Art Unit: 2111
- Tech Center: 2100
Rejection Summary
- 102 Rejections: 1
- 103 Rejections: 0
Cited Patents
The following patents were cited in the rejection:
- US 10,733,088 (Sommers)
Office Action Text
DETAILED ACTION

Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This is a NON-FINAL OFFICE ACTION in response to the present Application filed 02/08/2024. Claims 1-20 are pending in the Application, of which Claims 1, 8 and 17 are independent.

Continuity/Priority Information
The present Application 18436335, filed 02/08/2024, claims foreign priority to REPUBLIC OF KOREA Application No. 10-2023-0018861, filed 02/13/2023. Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.

Information Disclosure Statement
The information disclosure statement (IDS) submitted on 02/08/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.

Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless: (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Sommers (U.S. Patent No. 10,733,088; Pub. Date: 2020-08-04).

Regarding independent Claims 1, 8 and 17, Sommers discloses methods, systems, and computer readable media for testing a network node or a related application programming interface (API) using source code, comprising: a library (storage 108, FIG. 1) configured to store a plurality of sequence codes corresponding to the plurality of IPs and a plurality of task codes for driving the SoC; and processing circuitry (TC 102, MA 104, FIG. 1) configured to generate at least one test scenario by combining one or more of the sequence codes and one or more of the task codes stored in the library.

FIG. 1 is a diagram illustrating an example test system 100 for testing a network node or a related API using source code metadata, such as for verifying a system-on-chip (SoC). Referring to FIG. 1, in some embodiments TC 102, MA 104, and/or other entities in test system 100 may include functionality for accessing data storage 108 or other memory. Data storage 108 may be any suitable entity or entities (e.g., a storage device, memory, a non-transitory computer readable medium, or a storage system) for maintaining or storing information related to testing. For example, data storage 108 may store message capture related information, e.g., time delta information, timestamp related data, and/or other information.

Sommers further discloses processing circuitry configured to generate metadata based on the at least one test scenario, the metadata having a specific format, and to generate universal verification methodology (UVM) test code based on the metadata.
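For orientation, the claimed flow (a library of per-IP sequence codes and task codes, scenario generation by combining them, metadata in a specific format, and UVM test code generated from that metadata) can be sketched roughly as below. This is a minimal, hypothetical Python sketch; the names, the JSON metadata format, and the emitted UVM skeleton are illustrative assumptions and are not taken from the application or from Sommers.

```python
# Hypothetical sketch of the claimed flow: a code library feeds scenario
# generation, the scenario is serialized to metadata, and UVM test code is
# emitted from that metadata. All names and formats are illustrative only.
import json
from dataclasses import dataclass, field


@dataclass
class CodeLibrary:
    """Stores sequence codes per IP block and task codes for driving the SoC."""
    sequence_codes: dict[str, str] = field(default_factory=dict)  # IP name -> sequence code
    task_codes: dict[str, str] = field(default_factory=dict)      # task name -> task code


def generate_scenario(lib: CodeLibrary, ips: list[str], tasks: list[str]) -> dict:
    """Combine selected sequence codes and task codes into one test scenario."""
    return {
        "sequences": {ip: lib.sequence_codes[ip] for ip in ips},
        "tasks": {t: lib.task_codes[t] for t in tasks},
    }


def scenario_to_metadata(scenario: dict) -> str:
    """Serialize the scenario into metadata with a fixed (here: JSON) format."""
    return json.dumps({"format": "soc-test-v1", "scenario": scenario}, indent=2)


def metadata_to_uvm_test(metadata: str) -> str:
    """Emit a skeleton SystemVerilog/UVM test derived from the metadata."""
    meta = json.loads(metadata)
    body = "\n".join(
        f"    // run sequence for IP: {ip}" for ip in meta["scenario"]["sequences"]
    )
    return (
        "class soc_test extends uvm_test;\n"
        "  `uvm_component_utils(soc_test)\n"
        "  task run_phase(uvm_phase phase);\n"
        f"{body}\n"
        "  endtask\n"
        "endclass\n"
    )


if __name__ == "__main__":
    lib = CodeLibrary(
        sequence_codes={"dma": "dma_seq", "uart": "uart_seq"},
        task_codes={"boot": "boot_task"},
    )
    scenario = generate_scenario(lib, ips=["dma", "uart"], tasks=["boot"])
    print(metadata_to_uvm_test(scenario_to_metadata(scenario)))
```

Running the script prints a skeleton uvm_test class whose run_phase is derived from the scenario's per-IP sequences.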
FIG. 2 is a diagram illustrating an example environment 200 for processing source code for programming SUT 106. In some embodiments, CE 202 may generate source code metadata 208 before, during, or after CE 202 compiles or processes source code 204. In some embodiments, CE 202 may include functionality for analyzing source code metadata 208 and/or related files (e.g., source code 204) to identify code portions that are usable for deriving or obtaining test metadata. For example, CE 202 may include a test metadata generator that obtains and/or derives test metadata based on certain keywords or code sections. In this example, the test metadata may be used by test system 100 or another entity to generate test plans and/or related test packets for testing SUT 106.

FIG. 3 is a diagram illustrating an example user interface 302 for importing test metadata 300 derived from source code metadata 208. After source code metadata 208 is imported into test system 100, test metadata 300 may be generated using (e.g., derived from) source code metadata 208 by test system 100 or an entity therein (e.g., MA 104).

FIG. 5 is a diagram illustrating an example process for testing a network node or a related API using source code metadata. At step 504, the source code metadata may be analyzed to generate test metadata, wherein analyzing the source code metadata to determine the test metadata includes identifying source code metadata portions that indicate elements to test and determining the test metadata based on the elements. At step 506, one or more test plans may be generated for testing the network node or an API associated with the network node. For example, a test plan may be used by test system 100 or a related entity (e.g., a packet or API request generator) to indicate test traffic that is to be sent to SUT 106.

Regarding Claims 2-4, 10-12 and 18, Sommers discloses one or more sequence codes. FIG. 2 is a diagram illustrating an example environment 200 for processing source code for programming SUT 106. Referring to FIG. 2, code engine (CE) 202 may be any suitable entity or entities for performing one or more aspects associated with processing source code or related source code files. In some embodiments, CE 202 may include functionality for compiling and/or interpreting source code into machine code, byte code, or other code (e.g., intermediate code) for implementation or execution at a target platform 206. In some embodiments, CE 202 may communicate directly or indirectly with test system 100 and/or MA 104. For example, source code metadata 208 and/or other information generated by CE 202 may be sent to MA 104 for processing via CE 202, target platform 206, a management node, or an operator. In this example, target platform 206 may represent SUT 106 and may include functionality for communicating source code metadata 208 to test system 100 or a related entity (e.g., TC 102, MA 104, etc.) via a P4Runtime API.

Regarding Claim 5, Sommers discloses metadata. User interface 302 can allow test metadata 300 derived from source code metadata 208 to be imported. For example, after MA 104 analyzes source code metadata 208, generates test metadata 300 derived from source code metadata 208, and creates a metadata file containing test metadata 300, test operator 110 can input the file name (e.g., "switch.metadata") of the metadata file in an import dialog of a template editor GUI, e.g., of test system 100 or a related entity (e.g., TC 102).
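To make the cited pipeline easier to follow, below is a minimal Python sketch of the flow Sommers describes around FIG. 5 (analyze source code metadata to find elements to test, derive test metadata, then expand it into valid- and invalid-value test plans). The field names, selection rule, and plan layout are assumptions for illustration, not the reference's actual implementation.

```python
# Hypothetical sketch of the pipeline described with respect to FIG. 5 of
# Sommers (steps 504 and 506): derive test metadata from source code metadata,
# then expand it into test plans. Field names and file layout are assumptions.
import json


def derive_test_metadata(source_code_metadata: dict) -> dict:
    """Step 504 analog: keep only portions that indicate elements to test."""
    elements = [
        entry for entry in source_code_metadata.get("tables", [])
        if entry.get("testable", True)
    ]
    return {"elements": elements}


def generate_test_plans(test_metadata: dict) -> list[dict]:
    """Step 506 analog: one valid-value and one invalid-value plan per element."""
    plans = []
    for element in test_metadata["elements"]:
        for kind in ("valid", "invalid"):
            plans.append({
                "element": element["name"],
                "values": kind,
                "description": f"Exercise {element['name']} with {kind} field values",
            })
    return plans


if __name__ == "__main__":
    # A stand-in for an imported metadata file such as the "switch.metadata"
    # example mentioned in the reference; the content here is invented.
    source_code_metadata = {"tables": [{"name": "ipv4_lpm"}, {"name": "acl"}]}
    test_metadata = derive_test_metadata(source_code_metadata)
    print(json.dumps(generate_test_plans(test_metadata), indent=2))
```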
Regarding Claims 6, 7, 9, 13-16, 19 and 20, Sommers discloses generating a UVM test code. Referring to FIG. 1, in some embodiments test system 100 or another entity (e.g., MA 104) may generate a number of test plans (e.g., protocol templates, packet templates, flow templates, test templates, etc.) or related information. For example, MA 104 or another entity may generate test plans that are based on possible combinations of protocol headers or API requests derived using a source code metadata file (e.g., a P4Info file). In some embodiments, test system 100 or another entity (e.g., MA 104) may generate one or more test plans for testing SUT 106 with valid values (e.g., parameter field values) and may also generate one or more test plans for testing SUT 106 with invalid values. In some embodiments, test system 100 or another entity (e.g., MA 104) may generate or include source file metadata and/or other information in test plans. In some embodiments, a test plan may also include a plan description and a summary of how the plan relates to a test case (e.g., a condition or scenario being tested). For example, a test plan may include useful information to distinguish a plan with invalid header field values from a test plan with valid header field values.

Prior Art References Cited
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See References Cited on PTO-892 form.

Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES C KERVEROS, whose telephone number is (571) 272-3824. The examiner can normally be reached 9-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, MARK FEATHERSTONE, can be reached at (571) 270-3750. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JAMES C KERVEROS/
Primary Examiner, Art Unit 2111
Date: May 14, 2025

Non-Final Rejection 20250513
JAMES C. KERVEROS
Primary Examiner, Art Unit 2111
James.Kerveros@USPTO.GOV