Mirror of https://github.com/long2ice/fastapi-cache.git
Synced 2026-03-24 20:47:54 +00:00 at commit c20bb73f27408b23984c90379489a485315584a5
fastapi-cache
Introduction
fastapi-cache is a tool for caching FastAPI responses and function results, with support for the redis and memcache backends.
Features
- Supports redis and memcache backends.
- Easy integration with fastapi.
- Supports HTTP caching features such as ETag and Cache-Control.
Requirements
- An asyncio environment.
- redis, if you use RedisBackend.
- memcache, if you use MemcacheBackend.
Install
> pip install fastapi-cache2[redis]
or
> pip install fastapi-cache2[memcache]
Usage
Quick Start
import aioredis
import uvicorn
from fastapi import FastAPI
from starlette.requests import Request
from starlette.responses import Response

from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from fastapi_cache.decorator import cache_response, cache

app = FastAPI()


@cache()
async def get_cache():
    return 1


@app.get("/")
@cache_response(expire=60)
async def index(request: Request, response: Response):
    return dict(hello="world")


@app.on_event("startup")
async def startup():
    redis = await aioredis.create_redis_pool("redis://localhost", encoding="utf8")
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
Use cache_response
If you want to cache a FastAPI response transparently, use cache_response as a decorator placed between the router decorator and the view function, and you must pass request as a parameter of the view function.
If you also want the ETag and Cache-Control features, you must pass the response parameter as well.
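Conceptually, the ETag handling described above works by hashing the cached response body into a validator and answering 304 Not Modified when the client sends the same value back in If-None-Match. The following is a stdlib-only sketch of that idea, not the library's actual code; the helper names make_etag and respond are invented for illustration:

```python
import hashlib
from typing import Optional


def make_etag(body: bytes) -> str:
    # Illustration only: derive a validator by hashing the response body.
    return hashlib.md5(body).hexdigest()


def respond(body: bytes, if_none_match: Optional[str]):
    # Returns (status_code, body) the way a cached endpoint might:
    # 304 with an empty body when the client's ETag still matches,
    # otherwise a full 200 response.
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, b""
    return 200, body


# First request: the client has no ETag yet, so it gets a full response.
status, payload = respond(b'{"hello": "world"}', None)

# Revalidation: the client echoes the ETag back and gets a 304.
status2, payload2 = respond(b'{"hello": "world"}', make_etag(b'{"hello": "world"}'))
```

In the real library the decorator takes care of reading If-None-Match from the request and writing the ETag and Cache-Control headers onto the response, which is why both parameters must appear in the view function's signature.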
Use cache
You can use cache as a decorator, like other caching tools, to cache the result of an ordinary function.
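Under the hood this amounts to memoizing an async function's result in a backend store. The following stdlib-only sketch shows the core idea with an in-memory dict; the real cache decorator additionally supports expiry and pluggable backends such as Redis, and the name memoize_async is invented for this example:

```python
import asyncio
import functools


def memoize_async(func):
    # Illustrative stand-in for a result-caching decorator: store the
    # awaited result keyed by the positional arguments and reuse it on
    # subsequent calls instead of re-running the coroutine.
    store = {}

    @functools.wraps(func)
    async def wrapper(*args):
        if args not in store:
            store[args] = await func(*args)
        return store[args]

    wrapper._store = store  # exposed only so this sketch can be inspected
    return wrapper


@memoize_async
async def get_value(key):
    # Pretend this is an expensive lookup (database, remote API, ...).
    return len(key)


result = asyncio.run(get_value("hello"))
again = asyncio.run(get_value("hello"))  # served from the in-memory store
```

Calling get_value("hello") twice runs the body only once; the second call is answered from the store, which is the behavior the cache decorator provides against its configured backend.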
License
This project is licensed under the Apache-2.0 License.