18392310. METHOD AND APPARATUS TO USE DRAM AS A CACHE FOR SLOW BYTE-ADDRESSIBLE MEMORY FOR EFFICIENT CLOUD APPLICATIONS simplified abstract (Intel Corporation)
Contents
- 1 METHOD AND APPARATUS TO USE DRAM AS A CACHE FOR SLOW BYTE-ADDRESSIBLE MEMORY FOR EFFICIENT CLOUD APPLICATIONS
- 1.1 Organization Name
- 1.2 Inventor(s)
- 1.3 METHOD AND APPARATUS TO USE DRAM AS A CACHE FOR SLOW BYTE-ADDRESSIBLE MEMORY FOR EFFICIENT CLOUD APPLICATIONS - A simplified explanation of the abstract
- 1.4 Simplified Explanation
- 1.5 Potential Applications
- 1.6 Problems Solved
- 1.7 Benefits
- 1.8 Potential Commercial Applications
- 1.9 Possible Prior Art
- 1.10 Unanswered Questions
- 1.11 Original Abstract Submitted
METHOD AND APPARATUS TO USE DRAM AS A CACHE FOR SLOW BYTE-ADDRESSIBLE MEMORY FOR EFFICIENT CLOUD APPLICATIONS
Organization Name
Intel Corporation
Inventor(s)
Fengguang Wu of TENGCHONG (CN)
METHOD AND APPARATUS TO USE DRAM AS A CACHE FOR SLOW BYTE-ADDRESSIBLE MEMORY FOR EFFICIENT CLOUD APPLICATIONS - A simplified explanation of the abstract
This abstract first appeared for US patent application 18392310 titled 'METHOD AND APPARATUS TO USE DRAM AS A CACHE FOR SLOW BYTE-ADDRESSIBLE MEMORY FOR EFFICIENT CLOUD APPLICATIONS'.
Simplified Explanation
The abstract describes a method for identifying frequently accessed memory pages in virtualized systems and migrating them from slower to faster byte-addressable memory to improve access speed.
- Virtualized systems: The technology involves virtual machines running on a processor.
- Memory page migration: Pages are moved from slower memory to faster memory based on access patterns.
- Improved performance: By moving frequently accessed pages to faster memory, overall system performance is enhanced.
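The bullet points above can be sketched as a toy two-tier memory model. This is an illustrative simulation only, not the patent's implementation: the `TieredMemory` class, its capacity parameter, and the promotion policy are all hypothetical, and a real system would sample page-table accessed bits rather than count calls.

```python
from collections import Counter

class TieredMemory:
    """Toy model of a two-tier memory: a small fast tier (e.g. DRAM)
    caching pages from a larger, slower byte-addressable tier."""

    def __init__(self, fast_capacity):
        self.fast_capacity = fast_capacity
        self.fast_pages = set()          # pages currently held in the fast tier
        self.access_counts = Counter()   # per-page access counter (sampling window)

    def access(self, page):
        # Record an access; a real hypervisor would read PTE accessed bits.
        self.access_counts[page] += 1

    def migrate_hot_pages(self):
        # Promote the most frequently accessed pages into the fast tier,
        # then reset the counters to start a fresh sampling window.
        hottest = [p for p, _ in self.access_counts.most_common(self.fast_capacity)]
        self.fast_pages = set(hottest)
        self.access_counts.clear()
        return self.fast_pages

mem = TieredMemory(fast_capacity=2)
for page in [1, 1, 1, 2, 3, 3, 4]:
    mem.access(page)
print(mem.migrate_hot_pages())  # the two hottest pages, 1 and 3, are promoted
```

The sampling-window reset mirrors the idea that access patterns shift over time, so promotion decisions are based on recent behavior rather than lifetime totals.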
Potential Applications
This technology can be applied in cloud computing environments, data centers, and virtualized servers to optimize memory usage and improve system performance.
Problems Solved
1. Slow access to memory pages in virtualized systems.
2. Inefficient memory management in virtualized environments.
Benefits
1. Enhanced system performance.
2. Improved memory utilization.
3. Increased efficiency in virtualized systems.
Potential Commercial Applications
- Optimizing memory usage in cloud computing services.
- Improving performance in virtualized servers.
- Enhancing data center efficiency.
Possible Prior Art
Possible prior art includes memory management techniques for virtualized systems that optimize memory usage and access speed.
Unanswered Questions
How does this technology impact energy consumption in virtualized systems?
This article does not address the potential impact of memory page migration on energy consumption in virtualized systems.
What are the potential security implications of migrating memory pages in virtualized environments?
The article does not discuss the security implications of moving memory pages between different memory types in virtualized systems.
Original Abstract Submitted
Various embodiments are generally directed to virtualized systems. A first guest memory page may be identified based at least in part on a number of accesses to a page table entry for the first guest memory page in a page table by an application executing in a virtual machine (VM) on the processor, the first guest memory page corresponding to a first byte-addressable memory. The execution of the VM and the application on the processor may be paused. The first guest memory page may be migrated to a target memory page in a second byte-addressable memory, the target memory page comprising one of a target host memory page and a target guest memory page, the second byte-addressable memory having an access speed faster than an access speed of the first byte-addressable memory.
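The claimed sequence — identify a hot guest page by its page-table-entry access count, pause the VM, migrate the page to a faster byte-addressable memory, resume — can be outlined as follows. This is a minimal sketch under stated assumptions: the threshold value, the `VMStub` class, and all function and variable names are hypothetical, not taken from the patent.

```python
PROMOTION_THRESHOLD = 8  # illustrative access-count threshold, not from the patent

class VMStub:
    """Hypothetical stand-in for a virtual machine that can be paused/resumed."""
    def __init__(self):
        self.running = True
    def pause(self):
        self.running = False
    def resume(self):
        self.running = True

def maybe_promote(page_table, vm, slow_mem, fast_mem):
    """Sketch of the claimed flow: find guest pages whose page-table entries
    show frequent access, pause the VM, migrate those pages from the slow
    byte-addressable tier to the faster tier, then resume execution."""
    hot = [gfn for gfn, entry in page_table.items()
           if entry["accesses"] >= PROMOTION_THRESHOLD]
    if not hot:
        return []
    vm.pause()                      # execution is paused during migration
    migrated = []
    for gfn in hot:
        data = slow_mem.pop(gfn)    # remove the page from the slow tier
        fast_mem[gfn] = data        # place it in the faster tier (e.g. DRAM)
        page_table[gfn]["accesses"] = 0
        migrated.append(gfn)
    vm.resume()
    return migrated

vm = VMStub()
page_table = {0: {"accesses": 12}, 1: {"accesses": 2}}
slow_mem = {0: b"hot page", 1: b"cold page"}
fast_mem = {}
print(maybe_promote(page_table, vm, slow_mem, fast_mem))  # [0]: only the hot page moves
```

Pausing the VM before migration, as the abstract specifies, keeps the guest from touching a page while its backing data is being moved between tiers.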