The Distributed Microservice Cache Manager is a microservice that provides centralized cache management for other microservices, letting them store and retrieve frequently used data. It backs the cache with technologies such as Redis, Memcached, or Hazelcast, and exposes a REST API through which other microservices access it.
- REST API for storing and retrieving data from the cache
- Support for caching technologies like Redis, Memcached, or Hazelcast
- Cache eviction policies like Least Recently Used (LRU) or Least Frequently Used (LFU)
- Support for cache synchronization across multiple instances of the service
- Containerized using Docker and deployable on Kubernetes
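The LRU eviction policy listed above can be sketched in a few lines of Node.js. This is an illustrative in-memory sketch, not the service's actual implementation, which relies on a Map's insertion-ordered iteration:

```javascript
// Minimal LRU cache sketch. A JavaScript Map iterates keys in insertion
// order, so re-inserting a key on access moves it to the "most recent" end
// and the first key is always the least recently used.
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }

  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Re-insert to mark this key as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }

  set(key, value) {
    if (this.map.has(key)) {
      this.map.delete(key);
    } else if (this.map.size >= this.capacity) {
      // Evict the least recently used entry (the Map's first key).
      this.map.delete(this.map.keys().next().value);
    }
    this.map.set(key, value);
  }
}
```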
To run the service you will need:
- Node.js and npm
- Docker
- Kubernetes (optional, for cluster deployment)
- Clone the repository:

  ```
  git clone https://github.com/Addax101/Distributed-microservice-cache-manager.git
  ```
- Install the dependencies:

  ```
  cd Distributed-microservice-cache-manager
  npm install
  ```
- Start the microservice:

  ```
  npm start
  ```

  This starts the microservice on http://localhost:3000.
- Use the REST API to store and retrieve data from the cache:

  ```
  # Store data in the cache
  POST http://localhost:3000/cache/key
  { "value": "data" }

  # Retrieve data from the cache
  GET http://localhost:3000/cache/key
  ```
- Build the Docker image (replace `your-registry` with your registry or username):

  ```
  docker build -t your-registry/distributed-microservice-cache-manager .
  ```
- Push the Docker image to a registry:

  ```
  docker push your-registry/distributed-microservice-cache-manager
  ```
- Deploy the microservice on Kubernetes:

  ```
  kubectl apply -f kubernetes/deployment.yaml
  ```
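For reference, a `kubernetes/deployment.yaml` along these lines would run the container; this is a minimal sketch, and the actual manifest in the repository may differ (image name, replica count, and labels are placeholders to adapt):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: cache-manager
spec:
  replicas: 2
  selector:
    matchLabels:
      app: cache-manager
  template:
    metadata:
      labels:
        app: cache-manager
    spec:
      containers:
        - name: cache-manager
          image: your-registry/distributed-microservice-cache-manager
          ports:
            - containerPort: 3000   # the port the service listens on
```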
Contributions are welcome! Here are some ways you can contribute:
- Report bugs and suggest improvements by opening an issue
- Implement new features by forking the repository and submitting a pull request
- Improve the documentation
This project is licensed under the MIT License. See the LICENSE file for details.