Fast access to data in a Data Warehouse (DW) is a core requirement for today's Business Intelligence (BI). In the era of Big Data, caching is regarded as one of the most effective techniques for improving data-access performance. DWs are widely used by organizations to manage data and to support Decision Support Systems (DSS), and many methods have been applied to optimize the performance of fetching data from a DW; query caching is one of the methods that plays an effective role in this optimization. The proposed work is a cache-based mechanism that helps the DW in two ways: it reduces execution time by serving records directly from cache memory, and it saves cache space by eliminating infrequently used data. The goal is to keep the cache filled with the most frequently used data. To achieve this, an aging-based Least Frequently Used (LFU) algorithm is applied that considers the size and frequency of data simultaneously. The priority and expiry age of each entry in the cache are managed based on both its size and its access frequency: the algorithm assigns priorities and counts the age of data placed in cache memory, and the entry with the lowest age count and priority is evicted first from the cache block. Ultimately, the proposed cache mechanism utilizes cache memory efficiently and narrows the large performance gap between the main DW and the business user's query.
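To make the eviction idea concrete, the sketch below shows a minimal aging-based, size- and frequency-aware cache in Python. It is an illustrative assumption of how such a policy can be realized, not the paper's exact design: the class name AgingLFUCache, the priority formula (global age plus frequency divided by size), and the method names get/put are all hypothetical.

```python
class AgingLFUCache:
    """Minimal sketch: aging-based LFU eviction that weighs frequency against size."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes  # total cache space in bytes
        self.used = 0                   # bytes currently occupied
        self.age = 0                    # global age, raised whenever an entry is evicted
        self.entries = {}               # key -> (value, size, frequency, priority)

    def _priority(self, frequency, size):
        # Assumed priority: small, frequently accessed, recently aged entries rank higher.
        return self.age + frequency / max(size, 1)

    def get(self, key):
        if key not in self.entries:
            return None                 # cache miss: the caller fetches the result from the DW
        value, size, freq, _ = self.entries[key]
        freq += 1                       # count the hit and recompute the entry's priority
        self.entries[key] = (value, size, freq, self._priority(freq, size))
        return value

    def put(self, key, value, size):
        # Evict the lowest-priority entries until the new record fits.
        while self.used + size > self.capacity and self.entries:
            victim = min(self.entries, key=lambda k: self.entries[k][3])
            _, v_size, _, v_prio = self.entries.pop(victim)
            self.used -= v_size
            # Aging step: future insertions start at least as "old" as the evicted entry,
            # so long-idle but once-popular data cannot block new frequent data forever.
            self.age = max(self.age, v_prio)
        if self.used + size <= self.capacity:
            self.entries[key] = (value, size, 1, self._priority(1, size))
            self.used += size
```

In a query-cache setting, a result set could be keyed by its SQL text, stored with put(sql, rows, size_in_bytes), and looked up with get(sql) before the query is sent to the main DW.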