18125554. MULTICAST AND SIMULCAST IN NEURAL ENGINES simplified abstract (Apple Inc.)

MULTICAST AND SIMULCAST IN NEURAL ENGINES

Organization Name

Apple Inc.

Inventor(s)

Christopher L. Mills of Saratoga, CA (US)

MULTICAST AND SIMULCAST IN NEURAL ENGINES - A simplified explanation of the abstract

This abstract first appeared for US patent application 18125554, titled 'MULTICAST AND SIMULCAST IN NEURAL ENGINES'.

Simplified Explanation

The patent application describes a system-on-a-chip that couples a neural processor circuit to a central processor unit. The neural processor circuit contains multiple neural engines and a data processor circuit. The central processor unit executes a compiler that reads a neural network description and determines a data broadcast mode and an input data dimension configuration. The compiler then generates task descriptors, which are distributed to components of the neural processor circuit to configure data processing and computational operations (a code sketch of this flow follows the list below).

  • Neural processor circuit with neural engines and data processor
  • Central processor unit executing a compiler for data configuration
  • Task descriptors generated and distributed for data processing
  • Computational operations performed based on input data dimension configuration
  • Data broadcast to neural engines based on determined configuration
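
The flow summarized above can be pictured in code. The following is a minimal Python sketch, assuming hypothetical names such as TaskDescriptor, BroadcastMode, and compile_tasks; neither these identifiers nor the selection heuristic come from the patent, which does not disclose how the compiler makes its choices.

```python
from dataclasses import dataclass
from enum import Enum, auto


class BroadcastMode(Enum):
    # Hypothetical modes suggested by the title: multicast sends data to a
    # selected set of engines; simulcast sends it to all engines at once.
    MULTICAST = auto()
    SIMULCAST = auto()


@dataclass
class TaskDescriptor:
    # Hypothetical fields: the abstract only says descriptors carry the
    # broadcast mode and input data dimension configuration chosen by the compiler.
    broadcast_mode: BroadcastMode
    input_dims: tuple[int, ...]
    target_engines: tuple[int, ...]


def compile_tasks(network_description: list[dict], num_engines: int = 8) -> list[TaskDescriptor]:
    """Stand-in for the CPU-side compiler: walk a neural network description
    and emit one task descriptor per layer."""
    descriptors = []
    for layer in network_description:
        # Assumed heuristic: layers whose input is shared by every engine use
        # simulcast; otherwise the data is multicast to a subset of engines.
        mode = BroadcastMode.SIMULCAST if layer.get("shared_input") else BroadcastMode.MULTICAST
        if mode is BroadcastMode.SIMULCAST:
            engines = tuple(range(num_engines))
        else:
            engines = tuple(layer.get("engines", range(num_engines)))
        descriptors.append(TaskDescriptor(mode, tuple(layer["input_dims"]), engines))
    return descriptors


# Example: a two-layer description produces two descriptors that would then be
# distributed to the data processor circuit and the neural engines.
tasks = compile_tasks([
    {"input_dims": (1, 64, 56, 56), "shared_input": True},
    {"input_dims": (1, 128, 28, 28), "engines": [0, 1, 2, 3]},
])
```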

Key Features and Innovation

  • Integration of a neural processor circuit with a central processor unit
  • Compiler for determining the data broadcast mode and input data dimension configuration
  • Task descriptors for efficient data processing and computational operations
  • Neural engines performing operations based on the input data dimension configuration
  • Data broadcast to the neural engines based on the determined configuration mode

Potential Applications

  • Artificial intelligence systems
  • Machine learning applications
  • Neural network processing tasks
  • Data analytics and pattern recognition
  • Robotics and automation

Problems Solved

  • Efficient data processing in neural networks
  • Streamlined computational operations
  • Improved performance in AI applications
  • Enhanced data broadcasting capabilities
  • Simplified configuration for neural processing tasks

Benefits

  • Faster data processing
  • Enhanced computational efficiency
  • Improved accuracy in neural network tasks
  • Streamlined data broadcasting
  • Simplified configuration process for neural processing

Commercial Applications

Advanced Neural Processor Circuit for AI Applications

This technology can be utilized in various industries, such as:

  • Healthcare: medical imaging analysis
  • Automotive: autonomous driving systems
  • Finance: fraud detection and risk assessment
  • Manufacturing: quality control and predictive maintenance
  • Security: facial recognition and biometric identification

Prior Art

For prior art related to this technology, researchers can explore patents and publications in the fields of neural processors, central processor units, compilers for neural networks, and data processing in AI systems.

Frequently Updated Research

Researchers in the field of artificial intelligence and neural networks are constantly exploring advancements in neural processor circuits, compilers for efficient data processing, and optimization techniques for computational operations in AI applications.

Questions about Neural Processor Circuit

What are the key components of a neural processor circuit?

A neural processor circuit typically consists of neural engines and a data processor circuit for efficient data processing and computational operations.
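
As a rough picture of that composition, the sketch below models a neural processor circuit as a data processor circuit plus a set of neural engines. The class names and the placeholder computation are assumptions for illustration only; the abstract does not describe the engines' internal operations.

```python
from dataclasses import dataclass, field


@dataclass
class NeuralEngine:
    # Hypothetical engine: applies a per-engine computation to whatever data
    # the data processor hands it.
    engine_id: int

    def compute(self, data: list[float]) -> float:
        # Placeholder operation; the real engines perform operations configured
        # by the input data dimension configuration.
        return sum(data)


@dataclass
class DataProcessorCircuit:
    # Hypothetical buffer plus broadcast logic.
    buffer: list[float] = field(default_factory=list)

    def broadcast(self, engines: list[NeuralEngine]) -> dict[int, list[float]]:
        # Simplest possible broadcast: every engine sees the same buffered data.
        return {e.engine_id: list(self.buffer) for e in engines}


@dataclass
class NeuralProcessorCircuit:
    engines: list[NeuralEngine]
    data_processor: DataProcessorCircuit

    def run(self) -> dict[int, float]:
        slices = self.data_processor.broadcast(self.engines)
        return {e.engine_id: e.compute(slices[e.engine_id]) for e in self.engines}


npc = NeuralProcessorCircuit(
    engines=[NeuralEngine(i) for i in range(4)],
    data_processor=DataProcessorCircuit(buffer=[1.0, 2.0, 3.0]),
)
print(npc.run())  # each engine processes the broadcast data
```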

How does the compiler in the central processor unit contribute to the functionality of the neural processor circuit?

The compiler determines data broadcast mode and input data dimension configuration based on a neural network description, enabling efficient data processing and computational operations in the neural processor circuit.
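
A toy sketch of that decision step follows. The selection criteria (a layer's fan-out, channel-major versus row-major layout) are assumptions made up for illustration; the abstract does not state how the compiler maps a network description to these modes.

```python
from enum import Enum, auto


class BroadcastMode(Enum):
    MULTICAST = auto()
    SIMULCAST = auto()


class DimConfig(Enum):
    # Hypothetical input data dimension configurations.
    CHANNEL_MAJOR = auto()
    ROW_MAJOR = auto()


def choose_configuration(layer: dict) -> tuple[BroadcastMode, DimConfig]:
    """Assumed decision logic: a layer whose output feeds many consumers is
    simulcast; the dimension configuration follows the larger of the channel
    and spatial extents."""
    mode = BroadcastMode.SIMULCAST if layer.get("fan_out", 1) > 1 else BroadcastMode.MULTICAST
    channels, height, width = layer["input_dims"]
    dims = DimConfig.CHANNEL_MAJOR if channels >= height * width else DimConfig.ROW_MAJOR
    return mode, dims


print(choose_configuration({"input_dims": (512, 7, 7), "fan_out": 3}))
# -> simulcast, channel-major layout
```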


Original Abstract Submitted

A system-on-a-chip circuit may include a neural processor circuit coupled to a central processor unit. The neural processor circuit may include a plurality of neural engines and a data processor circuit. The central processor unit is configured to execute a compiler, which is in turn configured to determine a data broadcast mode and an input data dimension configuration mode based on a neural network description. The compiler is configured to generate one or more task descriptors, the task descriptors distributed to components of the neural processor circuit. The data processor circuit is configured to broadcast data from the buffer to the plurality of neural engines based on the determined data dimension configuration mode. The neural engines are configured to perform computational operations according to the determined input data dimension configuration mode.