18666676. ULTRASONIC DIAGNOSTIC APPARATUS, LEARNING APPARATUS, AND IMAGE PROCESSING METHOD simplified abstract (CANON KABUSHIKI KAISHA)
Contents
- 1 ULTRASONIC DIAGNOSTIC APPARATUS, LEARNING APPARATUS, AND IMAGE PROCESSING METHOD
- 1.1 Organization Name
- 1.2 Inventor(s)
- 1.3 ULTRASONIC DIAGNOSTIC APPARATUS, LEARNING APPARATUS, AND IMAGE PROCESSING METHOD - A simplified explanation of the abstract
- 1.4 Simplified Explanation
- 1.5 Key Features and Innovation
- 1.6 Potential Applications
- 1.7 Problems Solved
- 1.8 Benefits
- 1.9 Commercial Applications
- 1.10 Questions about Ultrasonic Diagnostic Apparatus
- 1.11 Original Abstract Submitted
ULTRASONIC DIAGNOSTIC APPARATUS, LEARNING APPARATUS, AND IMAGE PROCESSING METHOD
Organization Name
CANON KABUSHIKI KAISHA
Inventor(s)
Kenichi Nagae of Kanagawa (JP)
ULTRASONIC DIAGNOSTIC APPARATUS, LEARNING APPARATUS, AND IMAGE PROCESSING METHOD - A simplified explanation of the abstract
This abstract first appeared for US patent application 18666676, titled 'ULTRASONIC DIAGNOSTIC APPARATUS, LEARNING APPARATUS, AND IMAGE PROCESSING METHOD'.
Simplified Explanation
The patent application describes an ultrasonic diagnostic apparatus that uses a machine-learned model to generate, from signals obtained with relatively few ultrasonic wave transmissions, estimated images equivalent to those obtained with a larger number of transmissions.
- An ultrasonic probe scans an observation region in an object with ultrasonic waves
- An estimated image generating unit applies a machine-learned model to the received signals
- The learning data pairs signals from fewer transmissions with signals from a larger number of transmissions
- The apparatus outputs estimated images equivalent to those obtained from the larger number of transmissions
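The training setup described above can be sketched in miniature. This is an illustrative toy, not the patent's implementation: a single-transmission acquisition is simulated as one noisy reception, a many-transmission acquisition as an average of many receptions, and a simple linear least-squares model stands in for the machine-learned "estimated image generating unit". All names (`acquire`, `first_data`, `second_data`, `third_data`) are hypothetical labels chosen to mirror the abstract's wording.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_examples = 16, 200

def acquire(truth, n_transmissions):
    """Simulate averaging n_transmissions noisy receptions of one region."""
    shots = truth + rng.normal(0.0, 0.5, size=(n_transmissions, truth.size))
    return shots.mean(axis=0)

# Learning data: "first data" (1 transmission, noisy) paired with
# "second data" (64 transmissions, cleaner) for the same regions.
truths = rng.uniform(0.0, 1.0, size=(n_examples, n_pixels))
first_data = np.stack([acquire(t, 1) for t in truths])
second_data = np.stack([acquire(t, 64) for t in truths])

# Fit a linear stand-in for the estimated image generating unit.
W, *_ = np.linalg.lstsq(first_data, second_data, rcond=None)

# Inference: "third data" from a single new transmission yields an
# estimated image approximating a 64-transmission acquisition.
new_truth = rng.uniform(0.0, 1.0, size=n_pixels)
third_data = acquire(new_truth, 1)
estimated_image = third_data @ W
```

In practice the model would be a deep network trained on real received-signal pairs rather than a linear map, but the data flow (few-transmission input, many-transmission target, few-transmission inference) is the same.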
Key Features and Innovation
- Utilizes machine learning to generate estimated images
- Improves image quality by using signals from multiple ultrasonic wave transmissions
- Enhances diagnostic capabilities of the ultrasonic diagnostic apparatus
Potential Applications
- Medical imaging for diagnostic purposes
- Non-destructive testing in industrial applications
- Veterinary medicine for animal health assessments
Problems Solved
- Enhances image quality and diagnostic accuracy
- Reduces the need for multiple ultrasonic wave transmissions
- Improves efficiency and effectiveness of ultrasonic imaging
Benefits
- Accurate and detailed imaging results
- Time-saving and cost-effective diagnostic process
- Enhanced capabilities for medical and industrial applications
Commercial Applications
Ultrasonic diagnostic equipment manufacturers can integrate this technology into their products to offer advanced imaging capabilities for medical, industrial, and veterinary applications.
Questions about Ultrasonic Diagnostic Apparatus
How does the machine-learned model improve image generation in the ultrasonic diagnostic apparatus?
The model is trained on paired signals from few and many ultrasonic wave transmissions, so at imaging time it can produce, from a signal acquired with few transmissions, an estimated image comparable to one acquired with many more.
What are the potential benefits of using the estimated image generating unit in ultrasonic diagnostic equipment?
The estimated image generating unit can improve diagnostic accuracy, reduce the need for multiple transmissions, and enhance imaging capabilities for various applications.
Original Abstract Submitted
An ultrasonic diagnostic apparatus, comprising: an ultrasonic probe which scans an observation region in an object with an ultrasonic wave; and an estimated image generating unit which, by using a model having been machine-learned using learning data including first data based on a first received signal that is obtained by first transmission/reception of an ultrasonic wave and second data based on a second received signal that is obtained by second transmission/reception that represents a larger number of transmissions/receptions than the first transmission/reception of the ultrasonic wave, generates an estimated image equivalent to image data obtained by the second transmission/reception from third data based on a third received signal that is obtained by transmission/reception equivalent to the first transmission/reception of the ultrasonic wave by the ultrasonic probe.