US Patent Application 17736557. HYBRID ALLOCATION OF DATA LINES IN A STREAMING CACHE MEMORY simplified abstract
HYBRID ALLOCATION OF DATA LINES IN A STREAMING CACHE MEMORY
Organization Name
Inventor(s)
Michael Fetterman of Lancaster MA (US)
Steven James Heinrich of Madison AL (US)
Shirish Gadre of Fremont CA (US)
HYBRID ALLOCATION OF DATA LINES IN A STREAMING CACHE MEMORY - A simplified explanation of the abstract
This abstract first appeared for US patent application 17736557, titled 'HYBRID ALLOCATION OF DATA LINES IN A STREAMING CACHE MEMORY'.
Simplified Explanation
- The patent application describes a system for managing cache memory in a computing system.
- The system includes a sectored cache memory that allows multiple cache line allocations to share sectors within a single cache line.
- Traditionally, each cache line allocation is assigned a separate cache line, and allocations often leave sectors unused, leading to low utilization of the cache memory.
- With the present techniques, multiple cache line allocations can share the same cache line, improving cache memory utilization.
- Sectors can be assigned to allocations strategically to reduce data bank conflicts when accessing cache memory.
- Reducing data bank conflicts can improve memory access performance, even when cache lines are shared among multiple allocations.
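The sector-sharing idea above can be sketched in a few lines of Python. This is an illustrative toy model, not the patented implementation: the names (`SectoredCache`, `SECTORS_PER_LINE`) and the allocation policy (prefer a partially occupied line before opening an empty one) are assumptions chosen to show how two allocations end up sharing one cache line.

```python
# Toy sketch of sector sharing in a sectored cache.
# SECTORS_PER_LINE and the allocation policy are illustrative assumptions.

SECTORS_PER_LINE = 4

class SectoredCache:
    def __init__(self, num_lines):
        # free_sectors[i] lists the unoccupied sector indices of line i
        self.free_sectors = [list(range(SECTORS_PER_LINE)) for _ in range(num_lines)]
        self.owners = [[None] * SECTORS_PER_LINE for _ in range(num_lines)]

    def allocate(self, alloc_id, sectors_needed):
        """Return (line, sector indices), preferring a partially used line
        so that multiple allocations share one cache line."""
        # First pass: reuse a line that is partially occupied but has room.
        for line, free in enumerate(self.free_sectors):
            if 0 < len(free) < SECTORS_PER_LINE and len(free) >= sectors_needed:
                return self._take(line, alloc_id, sectors_needed)
        # Fall back to an empty line.
        for line, free in enumerate(self.free_sectors):
            if len(free) >= sectors_needed:
                return self._take(line, alloc_id, sectors_needed)
        return None  # cache full

    def _take(self, line, alloc_id, n):
        taken = [self.free_sectors[line].pop(0) for _ in range(n)]
        for s in taken:
            self.owners[line][s] = alloc_id
        return line, taken

cache = SectoredCache(num_lines=2)
a = cache.allocate("A", 2)  # first allocation opens line 0
b = cache.allocate("B", 2)  # second allocation shares line 0's free sectors
```

Under the traditional scheme described above, A and B would each occupy a whole line, leaving half of each line unused; here both fit in line 0, and line 1 stays free for further allocations.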
Original Abstract Submitted
Various embodiments include a system for managing cache memory in a computing system. The system includes a sectored cache memory that provides a mechanism for sharing sectors in a cache line among multiple cache line allocations. Traditionally, different cache line allocations are assigned to different cache lines in the cache memory. Further, cache line allocations may not use all of the sectors of the cache line, leading to low utilization of the cache memory. With the present techniques, multiple cache lines share the same cache line, leading to improved cache memory utilization relative to prior techniques. Further, sectors of cache allocations can be assigned to reduce data bank conflicts when accessing cache memory. Reducing such data bank conflicts can result in improved memory access performance, even when cache lines are shared with multiple allocations.
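The abstract's second point, assigning sectors to reduce data bank conflicts, can be illustrated with a small sketch. The bank count and the sector-to-bank mapping (`sector % NUM_BANKS`) are assumptions for illustration, not details from the patent: the idea shown is simply that when a second allocation joins a shared line, it prefers free sectors whose banks are not already used by the first allocation.

```python
# Illustrative sketch of bank-conflict-aware sector assignment.
# NUM_BANKS and the sector-to-bank mapping are assumptions, not from the patent.

NUM_BANKS = 4

def bank_of(sector):
    # Assume each sector of a line maps to bank (sector mod NUM_BANKS).
    return sector % NUM_BANKS

def pick_sectors(free_sectors, n, used_banks):
    """Prefer free sectors whose banks are not already used by other
    allocations sharing this line, reducing data bank conflicts."""
    preferred = [s for s in free_sectors if bank_of(s) not in used_banks]
    others = [s for s in free_sectors if bank_of(s) in used_banks]
    return (preferred + others)[:n]

# Allocation A already holds sectors 0 and 1 (banks 0 and 1) of an
# 8-sector line; allocation B needs two sectors of the same line.
free = [2, 3, 4, 5, 6, 7]
chosen = pick_sectors(free, 2, used_banks={0, 1})
```

With this policy B lands in sectors backed by banks 2 and 3, so concurrent accesses by A and B hit disjoint banks; a naive first-free policy could instead place B's data behind the same banks as A's, serializing the accesses.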