
What is a cache & what are the types of caching?

  • Caching is a mechanism to improve the performance of any type of application.

  • A cache is a software or hardware component aimed at storing data so that future requests for the same data can be served faster.

Types of caching

  • In-Memory Caching: cached data is stored directly in RAM. The most common implementations of this type of caching are based on key-value stores (a minimal sketch follows this list).

  • Database Caching: each database usually comes with some level of caching. Specifically, an internal cache is generally used to avoid querying the database excessively. Although each database can implement this differently, the most popular approach is based on a hash table storing key-value pairs.

  • Web Caching:
    Web Client Caching (also called Web Browser Caching): it works in a very intuitive way. The first time a browser loads a web page, it stores the page resources, such as text, images, stylesheets, scripts, and media files, so that later visits can reuse them.
    Web Server Caching: resources are stored server-side for reuse. This approach is helpful when dealing with dynamically generated content, which takes time to create: it prevents servers from becoming overloaded, reduces the work to be done, and improves page delivery speed.

  • CDN Caching: content such as web pages, stylesheets, scripts, and media files is cached in proxy servers. A CDN can be seen as a system of gateways between the user and the origin server, storing the origin's resources. When the user requests a resource, a proxy server intercepts the request and checks whether it already has a copy (a proxy-style sketch also follows this list).
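To make the in-memory key-value idea concrete, here is a minimal sketch of a dictionary-based RAM cache with a per-entry time-to-live; the class name, TTL value, and the cached user record are illustrative assumptions, not any particular library's API.

```python
import time

class InMemoryCache:
    """Minimal key-value cache held in RAM, with per-entry expiry (TTL)."""

    def __init__(self, ttl_seconds=60):
        self._store = {}          # key -> (value, expiry timestamp)
        self._ttl = ttl_seconds

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None                  # cache miss
        value, expires_at = entry
        if time.time() > expires_at:
            del self._store[key]         # stale entry: drop it
            return None
        return value                     # cache hit, served from RAM

    def set(self, key, value):
        self._store[key] = (value, time.time() + self._ttl)

# Illustrative usage: keep a computed record in memory for 30 seconds.
cache = InMemoryCache(ttl_seconds=30)
cache.set("user:42", {"id": 42, "name": "Ada"})
print(cache.get("user:42"))  # answered from RAM, no slow storage access
```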
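In the same spirit, the CDN / proxy behaviour described above ("check for a copy, otherwise go to the origin") can be sketched as follows; fetch_from_origin and the example URL are hypothetical stand-ins, not a real CDN API.

```python
cdn_cache = {}  # url -> cached response body (simplified: no expiry, no headers)

def fetch_from_origin(url):
    # Hypothetical stand-in for a real request to the origin server.
    return f"<html>content of {url}</html>"

def serve(url):
    """Proxy-style lookup: return a cached copy if one exists,
    otherwise fetch from the origin and keep a copy for next time."""
    if url in cdn_cache:
        return cdn_cache[url]          # proxy hit: the origin is never contacted
    response = fetch_from_origin(url)  # miss: forward the request to the origin
    cdn_cache[url] = response          # store the copy at the proxy
    return response

print(serve("https://example.com/style.css"))  # miss: goes to the origin
print(serve("https://example.com/style.css"))  # hit: served from the proxy cache
```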

What is a caching strategy?

A caching strategy defines the relationship between the data source and your caching system, and how your data is read and written: for example, whether the application checks the cache first and falls back to the data source on a miss, or whether writes go to the cache and the underlying store together. A minimal cache-aside sketch follows.
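One of the most common strategies is cache-aside (lazy loading), sketched below under assumed names: load_user_from_db and the "user:{id}" key format are only for illustration.

```python
cache = {}  # simple in-memory cache: key -> value

def load_user_from_db(user_id):
    # Hypothetical slow query against the underlying data source.
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    """Cache-aside: check the cache first; only query the data source
    on a miss, then populate the cache for future reads."""
    key = f"user:{user_id}"
    user = cache.get(key)
    if user is None:                   # miss: fall back to the data source
        user = load_user_from_db(user_id)
        cache[key] = user              # populate the cache for next time
    return user

print(get_user(7))  # first call hits the data source
print(get_user(7))  # second call is served from the cache
```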

What are the 3 types of cache memory?

Why is a cache faster than a database?

https://alonge.medium.com/why-cache-storage-is-80times-faster-than-disk-storage-hdds-b1e9fef5fd8d :

Caches are in-memory data stores that provide fast access to data. In-memory stores are designed for sub-millisecond latency, which makes them faster than both SSD- and HDD-backed storage, as the figures below show.

  • Reading 4KB randomly from an SSD takes 150 microseconds

  • Reading 1MB sequentially from cache memory takes 250 microseconds

  • Reading 1MB sequentially from an SSD takes 1,000 microseconds or 1 millisecond

  • Reading 1MB sequentially from disk (HDDs) takes 20,000 microseconds or 20 milliseconds.
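As a quick arithmetic check of these figures (and of the "80 times faster than HDDs" claim in the linked article's title):

```python
cache_us = 250     # 1 MB sequential read from cache memory, in microseconds
ssd_us = 1_000     # 1 MB sequential read from an SSD
hdd_us = 20_000    # 1 MB sequential read from an HDD

print(f"cache vs SSD: {ssd_us / cache_us:.0f}x faster")  # 4x
print(f"cache vs HDD: {hdd_us / cache_us:.0f}x faster")  # 80x
```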

What is the purpose of caching?

A cache's primary purpose is to increase data retrieval performance by reducing the need to access the underlying slower storage layer. Trading off capacity for speed, a cache typically stores a subset of data transiently, in contrast to databases whose data is usually complete and durable.
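To illustrate the capacity-for-speed trade-off (keeping only a bounded subset of results in fast memory), here is a small sketch using Python's built-in functools.lru_cache; the maxsize value and the slow_lookup function are illustrative assumptions.

```python
from functools import lru_cache
import time

@lru_cache(maxsize=128)            # keep at most 128 recently used results in memory
def slow_lookup(key):
    time.sleep(0.1)                # stand-in for a slow storage or database access
    return f"value-for-{key}"

slow_lookup("a")                   # first call pays the slow-storage cost and is cached
slow_lookup("a")                   # repeated call is answered from memory, near-instant
print(slow_lookup.cache_info())    # CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)
```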

How does caching work in microservices?

What is Distributed Caching?
