18081442. BATCHING AWARE TECHNIQUES FOR REFRESHING MEMORY DEVICES simplified abstract (QUALCOMM Incorporated)

From WikiPatents
Revision as of 06:48, 22 June 2024 by Wikipatents (talk | contribs) (Creating a new page)

BATCHING AWARE TECHNIQUES FOR REFRESHING MEMORY DEVICES

Organization Name

QUALCOMM Incorporated

Inventor(s)

Saurabh Sethi of Mohali (IN)

Madhukar Reddy N of Hyderabad (IN)

Vasantha Kumar Bandur Puttappa of Bangalore (IN)

Amulya Srinivasan Margasahayam of Bangalore (IN)

BATCHING AWARE TECHNIQUES FOR REFRESHING MEMORY DEVICES - A simplified explanation of the abstract

This abstract first appeared for US patent application 18081442, titled 'BATCHING AWARE TECHNIQUES FOR REFRESHING MEMORY DEVICES'.

Simplified Explanation: The patent application focuses on techniques to reduce memory access latency caused by memory refreshes, particularly in DRAM. By implementing a memory refresh scheduling algorithm that considers memory access batching, such as read and write batches, the algorithm can prioritize refreshes during write batches to decrease read access latency.

  • Memory refresh scheduling algorithm to reduce memory access latency
  • Consideration of memory access batching (read and write batches)
  • Prioritization of refreshes during write batches to decrease read access latency
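The batching-aware idea above can be sketched in a few lines: postpone refreshes while a read batch is in flight (up to some deferral budget, analogous to the limited refresh postponement DRAM standards allow), then drain the postponed refreshes during a write batch, where the added latency is less visible to readers. The class and method names below are illustrative assumptions, not from the patent.

```python
class BatchAwareRefreshScheduler:
    """Sketch of a refresh scheduler that accounts for access batching:
    defer refreshes during read batches, drain them during write batches."""

    def __init__(self, max_deferred=8):
        self.max_deferred = max_deferred  # assumed cap on postponed refreshes
        self.deferred = 0                 # refreshes currently postponed

    def on_refresh_due(self, current_batch):
        """Called when a periodic refresh interval elapses.
        Returns True if the refresh is issued now, False if postponed."""
        if current_batch == "read" and self.deferred < self.max_deferred:
            self.deferred += 1            # postpone to keep read latency low
            return False
        return True                       # issue now (write batch, or budget spent)

    def on_write_batch(self):
        """Drain postponed refreshes while writes are being batched;
        returns how many extra refreshes were issued."""
        issued = self.deferred
        self.deferred = 0
        return issued
```

For example, with a budget of two, the first two refreshes falling due during a read batch are postponed, the third is forced out, and both postponed refreshes are issued once a write batch begins. The same structure could be mirrored (defer during writes) when the goal is to reduce write latency instead.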

Potential Applications:

  • Computer systems
  • Data centers
  • Mobile devices

Problems Solved:

  • High memory access latency due to memory refreshes
  • Inefficient memory refresh scheduling
  • Increased write latency

Benefits:

  • Improved memory access performance
  • Reduced latency in memory operations
  • Enhanced overall system efficiency

Commercial Applications: Optimizing memory access latency in various computing devices can lead to faster performance, improved user experience, and increased productivity in data-intensive applications.

Prior Art: Prior research may include studies on memory refresh scheduling algorithms, memory access optimization techniques, and strategies to reduce latency in DRAM operations.

Frequently Updated Research: Stay updated on the latest advancements in memory access optimization, memory refresh scheduling algorithms, and techniques to enhance DRAM performance.

Questions about Memory Access Latency Reduction:

  • How does the memory refresh scheduling algorithm impact overall system performance?
  • What are the potential challenges in implementing these techniques in real-world applications?


Original Abstract Submitted

Aspects of the present disclosure are directed to techniques and procedures for reducing memory (e.g., DRAM) access latency (e.g., read latency, write latency) due to memory refreshes. In some aspects, a memory refresh scheduling algorithm can take into account memory access batching (e.g., read batch, write batch). In some aspects, a refresh scheduling algorithm can schedule more or prioritize refreshes to occur during a write batch to reduce memory read access latency, because fewer refreshes are then scheduled during memory read access. The techniques can be adapted to reduce write latency.