Yang HJ, Fang J, Cai M et al. A prefetch-adaptive intelligent cache replacement policy based on machine learning. Journal of Computer Science and Technology, 38(2): 391−404, Mar. 2023. DOI: 10.1007/s11390-022-1573-3.

A Prefetch-Adaptive Intelligent Cache Replacement Policy Based on Machine Learning

  • Hardware prefetching and cache replacement are two techniques for improving the performance of the memory subsystem. While prefetching hides memory latency and improves performance, it also interacts with the cache replacement policy, introducing performance variability across applications. To improve the accuracy of cache-block reuse prediction in the presence of hardware prefetching, we propose the Prefetch-Adaptive Intelligent Cache Replacement Policy (PAIC). PAIC is designed with separate predictors for prefetch and demand requests, and uses machine learning to optimize reuse prediction under prefetching. By distinguishing reuse predictions for prefetch and demand requests, PAIC can better combine the performance benefits of prefetching and replacement policies. We evaluate PAIC on a set of 27 memory-intensive programs from the SPEC 2006 and SPEC 2017 benchmark suites. Under the single-core configuration, PAIC improves performance over the Least Recently Used (LRU) replacement policy by 37.22%, compared with improvements of 32.93% for the Signature-based Hit Predictor (SHiP), 34.56% for Hawkeye, and 34.43% for Glider. Under the four-core configuration, PAIC improves performance over LRU by 20.99%, versus 13.23% for SHiP, 17.89% for Hawkeye, and 15.50% for Glider.
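The core idea, as the abstract describes it, is to train separate reuse predictors for prefetch and demand requests and to use their predictions when choosing an eviction victim. A minimal illustrative sketch of that idea (not the paper's actual implementation; the class names, the saturating-counter predictor, and the PC-indexed tables here are assumptions chosen for brevity) might look like:

```python
# Hypothetical sketch of a PAIC-style cache set: two separate reuse
# predictors -- one trained on demand requests, one on prefetch
# requests -- and eviction of the block with the lowest predicted
# reuse score. This is an illustrative toy, not the paper's design.

from collections import defaultdict


class ReusePredictor:
    """Saturating-counter table keyed by the requesting PC (assumed feature)."""

    def __init__(self, max_level=3):
        self.table = defaultdict(int)
        self.max_level = max_level

    def train(self, pc, reused):
        # Positive sample on a hit, negative sample on an unused eviction.
        if reused:
            self.table[pc] = min(self.table[pc] + 1, self.max_level)
        else:
            self.table[pc] = max(self.table[pc] - 1, -self.max_level)

    def predict(self, pc):
        return self.table[pc]  # higher score = more likely to be reused


class PAICSet:
    """One set of a set-associative cache; each block remembers the PC
    that inserted it and whether it arrived via a prefetch."""

    def __init__(self, ways=4):
        self.ways = ways
        self.blocks = {}  # addr -> (pc, is_prefetch)
        self.demand_pred = ReusePredictor()
        self.prefetch_pred = ReusePredictor()

    def _predictor(self, is_prefetch):
        # The key PAIC-style distinction: prefetch and demand requests
        # are handled by separate predictors.
        return self.prefetch_pred if is_prefetch else self.demand_pred

    def access(self, addr, pc, is_prefetch=False):
        if addr in self.blocks:
            old_pc, old_pf = self.blocks[addr]
            # Hit: the block was reused -- train its predictor positively.
            self._predictor(old_pf).train(old_pc, reused=True)
            self.blocks[addr] = (pc, is_prefetch)
            return True  # hit
        if len(self.blocks) >= self.ways:
            # Evict the block with the lowest predicted reuse score.
            victim = min(
                self.blocks,
                key=lambda a: self._predictor(self.blocks[a][1])
                                  .predict(self.blocks[a][0]))
            v_pc, v_pf = self.blocks.pop(victim)
            # Evicted without reuse -- train its predictor negatively.
            self._predictor(v_pf).train(v_pc, reused=False)
        self.blocks[addr] = (pc, is_prefetch)
        return False  # miss
```

Because the two predictors are trained on disjoint request streams, an inaccurate prefetcher only degrades the scores of prefetched blocks, leaving demand-request predictions intact.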