
G H Patel College of Engineering & Technology

(A Constituent College of CVM University)


Bakrol, Anand

COMPUTER ENGINEERING DEPARTMENT

Project Report
on

Cache Memory Simulation and Evaluation of Replacement Policies


Submitted By

Name of Student: Darsh Rajput (Enrollment Number: 12202130501011)
                 Harsh Patel (Enrollment Number: 12202130501018)

Computer Architecture and Design (202044507)


A.Y. 2024-25 ODD TERM

Abstract
The Cache Memory Simulation project implements and evaluates different cache replacement policies: FIFO (First-In-First-Out), LRU (Least Recently Used), and Random. Cache memory, a small but high-speed memory, plays a crucial role in reducing memory access latency by keeping frequently accessed data close to the CPU. This simulation evaluates how these replacement policies influence cache performance through metrics such as hit rate and miss rate. By analyzing performance across varying cache sizes and associativities, the project offers insight into which policy is most efficient under different scenarios. These findings are useful for optimizing cache performance in computer architecture, helping system designers make data storage and retrieval faster and more efficient.

Functions used
The following functions are used in this project:

1. Cache Access Function

- Simulates a cache access by checking whether the requested memory block is present in the cache (hit) or not (miss).
- On a miss, it triggers a replacement based on the selected policy (FIFO, LRU, or Random).
- It computes the block address, checks for a hit, and executes the replacement policy if required.
- Key logic:
  - Cache hit: the requested block is already present in the cache.
  - Cache miss: the replacement strategy (FIFO, LRU, or Random) chooses a block to evict.

2. Replacement Policy Functions

- FIFO Replacement: replaces the oldest entry in the cache. This is implemented by recording the insertion order and evicting the entry that was added first.
- LRU Replacement: tracks the least recently accessed block and replaces it. Each cache block carries a timestamp that is updated on every access, so the block with the oldest timestamp is the least recently used.
- Random Replacement: chooses a random cache block for replacement, without considering access history.

3. Simulation Control Function

- Runs the simulation for a set number of random memory accesses, invoking the cache access function for each access and recording the result.
- Tracks the overall hit and miss counts and computes the hit rate from the total number of accesses.

main() Function
The main function in this project initializes cache settings (size, block size, and policy choice) and
coordinates the cache memory simulation. It generates random memory accesses, calls the cache
access function for each access, and calculates performance metrics like hit rate and miss rate. At the
end of the simulation, it displays a comparative performance analysis of FIFO, LRU, and Random
policies, providing valuable insights into their effectiveness in reducing cache misses under different
conditions.

Requirements (Libraries)

- stdio.h:
  - Provides basic input and output functions such as printf and scanf.
  - Used to display simulation results and to read any user input for the cache configuration.
- stdlib.h:
  - Provides utility functions, including dynamic memory allocation (malloc and free) and random number generation (rand), which is essential for the Random replacement policy.
- time.h:
  - Typically used to seed the random number generator (srand(time(NULL))) so that each simulation run generates a different access stream.
  - The LRU bookkeeping itself uses a simple logical counter rather than wall-clock timestamps.
- math.h (optional):
  - Could supply rounding or logarithmic operations when deriving cache-size and block-size relationships, though it is not needed in this basic simulation.

CODE
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define CACHE_SIZE 4  // Number of cache blocks
#define BLOCK_SIZE 16 // Block size in bytes

#define FIFO 0
#define LRU 1
#define RANDOM 2

// Cache line structure (fully associative cache, so each replacement
// policy has a real choice of which block to evict)
typedef struct {
    int valid;     // Valid bit
    int tag;       // Tag of the cached block (here, the block address)
    int last_used; // Timestamp: set at insertion (FIFO) or on every access (LRU)
} CacheLine;

// Initialize cache
void init_cache(CacheLine cache[]) {
    for (int i = 0; i < CACHE_SIZE; i++) {
        cache[i].valid = 0;
        cache[i].last_used = -1; // No accesses yet
    }
}

// Access the cache with a given memory address under the selected policy.
// Returns 1 on a cache hit, 0 on a miss (after loading the block).
int access_cache(CacheLine cache[], int address, int policy, int *time) {
    int tag = address / BLOCK_SIZE; // The block address serves as the tag

    // Fully associative lookup: any line may hold the block
    for (int i = 0; i < CACHE_SIZE; i++) {
        if (cache[i].valid && cache[i].tag == tag) {
            if (policy == LRU) {
                cache[i].last_used = (*time)++; // Refresh recency on every hit
            }
            return 1; // Cache hit
        }
    }

    // Cache miss: choose a victim line, preferring an empty one
    int victim = -1;
    for (int i = 0; i < CACHE_SIZE; i++) {
        if (!cache[i].valid) {
            victim = i;
            break;
        }
    }
    if (victim == -1) {
        if (policy == RANDOM) {
            victim = rand() % CACHE_SIZE; // Evict a random block
        } else {
            // FIFO and LRU both evict the block with the smallest timestamp;
            // they differ only in when last_used is updated
            victim = 0;
            for (int i = 1; i < CACHE_SIZE; i++) {
                if (cache[i].last_used < cache[victim].last_used) {
                    victim = i;
                }
            }
        }
    }
    cache[victim].valid = 1;
    cache[victim].tag = tag;
    cache[victim].last_used = (*time)++; // Insertion time (FIFO) / last access (LRU)
    return 0; // Cache miss
}

// Run cache simulation
void run_simulation(CacheLine cache[], int num_accesses, int max_address, int policy) {
    int hits = 0, misses = 0;
    int time = 0;

    for (int i = 0; i < num_accesses; i++) {
        int address = rand() % max_address; // Generate a random memory address
        int result = access_cache(cache, address, policy, &time);
        if (result == 1) {
            hits++;
        } else {
            misses++;
        }
    }

    // Print results
    printf("Cache Hits: %d\n", hits);
    printf("Cache Misses: %d\n", misses);
    printf("Hit Rate: %.2f%%\n", 100.0 * hits / (hits + misses));
}

int main() {
    CacheLine cache[CACHE_SIZE]; // Cache with the defined number of lines
    init_cache(cache);           // Initialize cache

    srand((unsigned)time(NULL)); // Seed the RNG so each run differs

    int num_accesses = 1000; // Number of memory accesses
    int max_address = 1024;  // Maximum address (for simplicity)

    // Choose the replacement policy: FIFO, LRU, or RANDOM
    int policy = LRU;

    run_simulation(cache, num_accesses, max_address, policy); // Run the simulation

    return 0;
}
Output

(Simulation output screenshot: the program prints the cache hit count, miss count, and hit rate for the selected policy.)
Conclusion

This Cache Memory Simulation project analyzes the effectiveness of different cache replacement policies (FIFO, LRU, and Random) in managing cache memory efficiently. By simulating these policies and evaluating their performance metrics (cache hits, misses, and hit rate), we demonstrated that the choice of replacement policy significantly impacts overall system performance.

The simulation results generally indicate that the LRU (Least Recently Used) policy is better at retaining frequently accessed data, leading to a higher hit rate than FIFO and Random. FIFO, while simpler to implement, can cause more cache misses under certain access patterns because it does not consider usage frequency. Random replacement, being independent of access patterns, performs unpredictably and may yield lower efficiency.

In conclusion, selecting an appropriate cache replacement policy is essential for optimizing cache performance. This project underscores the importance of evaluating different policies in practical scenarios, helping system architects make data-informed decisions about cache design and management. The findings from this simulation apply to real-world computing systems, where efficient cache management can significantly improve processing speed and resource utilization.

