    Incorporating Pattern Prediction Technique for Energy Efficient Filter Cache Design

    A filter cache sits above the L1 (main) cache in the memory hierarchy and is much smaller, typically on the order of 256 bytes. Prediction algorithms, popularly based on the Next Fetch Prediction Table (NFPT), help make the choice between the filter cache and the main cache. In this paper we introduce a new mechanism for predicting filter cache accesses, which relies on the hit/miss pattern of the instruction access stream over past filter cache line accesses. Whereas the NFPT makes predominantly incorrect hit predictions, the proposed Pattern Table based approach substantially reduces such mispredictions. Predominantly correct prediction yields efficient cache access and eliminates cache-miss penalties. Our extensive simulations across a wide range of benchmark applications show that the new prediction scheme is effective, improving prediction accuracy and reducing the energy consumption of the filter cache by as much as 25% compared to NFPT based approaches. Further, the technique lends itself to an elegant hardware implementation, consisting only of a shift register and a look-up table (LUT), and is hence area and energy efficient in contrast to previously published prediction techniques.
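    The shift-register-plus-LUT structure described in the abstract can be sketched in software. The following is an illustrative model, not the paper's exact design: the history length (4 bits) and the use of 2-bit saturating counters in the pattern table are assumptions made here for demonstration.

    ```python
    class PatternTablePredictor:
        """Sketch of a pattern-table hit/miss predictor for a filter cache.

        An n-bit shift register records recent hit(1)/miss(0) outcomes and
        indexes a look-up table (LUT) of 2-bit saturating counters that
        predicts whether the next access will hit in the filter cache.
        """

        def __init__(self, history_bits=4):
            self.history_bits = history_bits
            self.history = 0                       # shift register of outcomes
            # LUT of 2-bit saturating counters, initialised weakly to "hit"
            self.table = [2] * (1 << history_bits)

        def predict_hit(self):
            # Predict a filter-cache hit if the counter is in a "hit" state.
            return self.table[self.history] >= 2

        def update(self, hit):
            # Train the counter on the actual outcome, then shift the
            # outcome bit into the history register.
            ctr = self.table[self.history]
            self.table[self.history] = min(3, ctr + 1) if hit else max(0, ctr - 1)
            mask = (1 << self.history_bits) - 1
            self.history = ((self.history << 1) | int(hit)) & mask


    # A strictly alternating hit/miss stream is quickly learned by the
    # pattern table, since each 4-bit history uniquely determines the
    # next outcome.
    predictor = PatternTablePredictor()
    outcomes = [True, False] * 50
    correct = 0
    for outcome in outcomes:
        if predictor.predict_hit() == outcome:
            correct += 1
        predictor.update(outcome)
    ```

    After a short warm-up, the predictor tracks the alternating pattern exactly; the same structure maps directly to hardware as a shift register indexing a small counter array.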