18213028. Heterogeneous Compute Platform Architecture For Efficient Hosting Of Network Functions simplified abstract (GOOGLE LLC)
Organization Name

GOOGLE LLC

Inventor(s)

Santanu Dasgupta of Fremont CA (US)
Bok Knun Randolph Chung of Los Altos CA (US)
Ankur Jain of Mountain View CA (US)
Prashant Chandra of San Jose CA (US)
Bor Chan of Fremont CA (US)
Durgaprasad V. Ayyadevara of San Ramon CA (US)
Ian Kenneth Coolidge of San Diego CA (US)
Muzammil Mueen Butt of Lynnwood WA (US)
Heterogeneous Compute Platform Architecture For Efficient Hosting Of Network Functions - A simplified explanation of the abstract
This abstract first appeared for US patent application 18213028 titled 'Heterogeneous Compute Platform Architecture For Efficient Hosting Of Network Functions'.
Simplified Explanation
The present disclosure describes a converged compute platform architecture that includes two configurations: an infrastructure processing unit (IPU)-only configuration and a configuration where the IPU is coupled with a central processing unit (CPU) like an x86 processor.
- The first configuration is a standalone IPU, while the second pairs the IPU with a host CPU.
- The two configurations can communicate with each other through a PCIe switch or remote direct memory access (RDMA) techniques.
- Both configurations utilize machine learning (ML) acceleration through a single converged architecture.
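The two-configuration split above can be illustrated with a small sketch. All names here (`ComputeConfig`, `Transport`, `pick_transport`) are hypothetical illustrations, not from the patent; the sketch merely models an IPU-only node and an IPU+CPU node choosing between a PCIe-switch path and an RDMA path, with ML acceleration available to both.

```python
from dataclasses import dataclass
from enum import Enum


class Transport(Enum):
    # The two connectivity options named in the abstract.
    PCIE_SWITCH = "pcie_switch"  # configurations attached to a shared PCIe switch
    RDMA = "rdma"                # configurations reached via remote direct memory access


@dataclass
class ComputeConfig:
    name: str
    has_cpu: bool                 # True only for the IPU + x86 CPU configuration
    ml_acceleration: bool = True  # both configurations use the converged ML acceleration


def pick_transport(same_chassis: bool) -> Transport:
    """Illustrative policy: use the PCIe switch when the two configurations
    sit behind the same switch fabric, otherwise fall back to RDMA."""
    return Transport.PCIE_SWITCH if same_chassis else Transport.RDMA


ipu_only = ComputeConfig(name="ipu-only", has_cpu=False)
ipu_plus_cpu = ComputeConfig(name="ipu+cpu", has_cpu=True)

print(pick_transport(same_chassis=True))   # Transport.PCIE_SWITCH
print(pick_transport(same_chassis=False))  # Transport.RDMA
```

The `same_chassis` flag is an invented stand-in for whatever placement signal a real system would use; the patent abstract only states that connectivity "may be accomplished with a PCIe switch" or "through RDMA techniques", not how the choice is made.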
Potential Applications
This technology has potential applications in various fields, including:
- Data centers and cloud computing environments
- Artificial intelligence and machine learning applications
- High-performance computing and scientific research
Problems Solved
The converged compute platform architecture addresses the following problems:
- Efficient utilization of resources by combining the IPU and CPU in a single architecture
- Improved communication and connectivity between the two configurations
- Accelerated machine learning capabilities through a converged architecture
Benefits
The benefits of this technology include:
- Enhanced performance and efficiency in data centers and cloud computing environments
- Improved ML acceleration through a converged architecture
- Simplified communication and connectivity between the IPU and CPU configurations
Original Abstract Submitted
The present disclosure provides for a converged compute platform architecture, including a first infrastructure processing unit (IPU)-only configuration and a second configuration wherein the IPU is coupled to a central processing unit, such as an x86 processor. Connectivity between the two configurations may be accomplished with a PCIe switch, or the two configurations may communicate through remote direct memory access (RDMA) techniques. Both configurations may use ML acceleration through a single converged architecture.
CPC Classification

- G06F13/38
- G06F13/28