📖 Copy-edit README (#168)

- Update workflow shields to point to new CI/CD pipeline, and link
  all shields to somewhere appropriate.
- Use product names instead of code-markup names.
- Edit for English grammar and style.
- Expand decorator argument table to add defaults.
- Add more meaningful `Coder` and key builder examples and expand
  on what the default key builder does.
Martijn Pieters
2023-05-17 12:34:40 +01:00
committed by GitHub
parent 826e785522
commit b287f21043

README.md

@@ -1,27 +1,26 @@
# fastapi-cache

-![pypi](https://img.shields.io/pypi/v/fastapi-cache2.svg?style=flat)
+[![pypi](https://img.shields.io/pypi/v/fastapi-cache2.svg?style=flat)](https://pypi.org/p/fastapi-cache2)
-![license](https://img.shields.io/github/license/long2ice/fastapi-cache)
+[![license](https://img.shields.io/github/license/long2ice/fastapi-cache)](https://github.com/long2ice/fastapi-cache/blob/main/LICENSE)
-![workflows](https://github.com/long2ice/fastapi-cache/workflows/pypi/badge.svg)
-![workflows](https://github.com/long2ice/fastapi-cache/workflows/ci/badge.svg)
+[![CI/CD](https://github.com/long2ice/fastapi-cache/actions/workflows/ci-cd.yml/badge.svg)](https://github.com/long2ice/fastapi-cache/actions/workflows/ci-cd.yml)
## Introduction

-`fastapi-cache` is a tool to cache fastapi response and function result, with backends support `redis`, `memcache`,
-and `dynamodb`.
+`fastapi-cache` is a tool to cache FastAPI endpoint and function results, with
+backends supporting Redis, Memcached, and Amazon DynamoDB.
## Features

-- Support `redis`, `memcache`, `dynamodb`, and `in-memory` backends.
+- Supports `redis`, `memcache`, `dynamodb`, and `in-memory` backends.
-- Easily integration with `fastapi`.
+- Easy integration with [FastAPI](https://fastapi.tiangolo.com/).
-- Support http cache like `ETag` and `Cache-Control`.
+- Support for HTTP cache headers like `ETag` and `Cache-Control`, as well as conditional `If-None-Match` requests.
## Requirements

-- `asyncio` environment.
+- FastAPI
-- `redis` if use `RedisBackend`.
+- `redis` when using `RedisBackend`.
-- `memcache` if use `MemcacheBackend`.
+- `memcache` when using `MemcacheBackend`.
-- `aiobotocore` if use `DynamoBackend`.
+- `aiobotocore` when using `DynamoBackend`.

## Install
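The conditional-request support mentioned in the features above comes down to comparing the client's cached validator against the current one. A minimal, dependency-free sketch of that mechanic — the names `etag_for` and `respond` are illustrative only and not part of fastapi-cache:

```python
import hashlib
from typing import Optional, Tuple

def etag_for(body: bytes) -> str:
    # A strong validator derived from the response body.
    return '"' + hashlib.md5(body).hexdigest() + '"'

def respond(body: bytes, if_none_match: Optional[str]) -> Tuple[int, bytes, str]:
    # If the client's cached validator still matches, send 304 with no body.
    etag = etag_for(body)
    if if_none_match == etag:
        return 304, b"", etag
    return 200, body, etag

status, body, etag = respond(b"hello", None)   # first request: full 200 response
status2, body2, _ = respond(b"hello", etag)    # revalidation: 304, empty body
assert (status, status2, body2) == (200, 304, b"")
```

The real library performs this comparison inside the decorator, using the injected `Request` and `Response` objects.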
@@ -85,44 +84,46 @@ async def startup():
### Initialization

-Firstly you must call `FastAPICache.init` on startup event of `fastapi`, there are some global config you can pass in.
+First you must call `FastAPICache.init` during FastAPI startup; this is where you set global configuration.

-### Use `cache` decorator
+### Use the `@cache` decorator

-If you want cache `fastapi` response transparently, you can use `cache` as decorator between router decorator and view
-function and must pass `request` as param of view function.
+If you want to cache a FastAPI response transparently, you can use the `@cache`
+decorator between the router decorator and the view function.
-Parameter | type, description
------------- | -------------
-expire | int, states a caching time in seconds
-namespace | str, namespace to use to store certain cache items
-coder | which coder to use, e.g. JsonCoder
-key_builder | which key builder to use, default to builtin
-injected_dependency_namespace | prefix for injected dependency keywords, defaults to `__fastapi_cache`.
-cache_status_header | Name for the header on the response indicating if the request was served from cache; either `HIT` or `MISS`. Defaults to `X-FastAPI-Cache`.
+Parameter | type | default | description
+------------ | ---- | --------- | --------
+`expire` | `int` | | sets the caching time in seconds
+`namespace` | `str` | `""` | namespace to use to store certain cache items
+`coder` | `Coder` | `JsonCoder` | which coder to use, e.g. `JsonCoder`
+`key_builder` | `KeyBuilder` callable | `default_key_builder` | which key builder to use
+`injected_dependency_namespace` | `str` | `__fastapi_cache` | prefix for injected dependency keywords
+`cache_status_header` | `str` | `X-FastAPI-Cache` | name for the header on the response indicating if the request was served from cache; either `HIT` or `MISS`

-You can also use `cache` as decorator like other cache tools to cache common function result.
+You can also use the `@cache` decorator on regular functions to cache their result.
### Injected Request and Response dependencies

-The `cache` decorator adds dependencies for the `Request` and `Response` objects, so that it can
-add cache control headers to the outgoing response, and return a 304 Not Modified response when
-the incoming request has a matching If-Non-Match header. This only happens if the decorated
-endpoint doesn't already list these objects directly.
+The `cache` decorator injects dependencies for the `Request` and `Response`
+objects, so that it can add cache control headers to the outgoing response, and
+return a 304 Not Modified response when the incoming request has a matching
+`If-None-Match` header. This only happens if the decorated endpoint doesn't
+already list these dependencies.

The keyword arguments for these extra dependencies are named
`__fastapi_cache_request` and `__fastapi_cache_response` to minimize collisions.

-Use the `injected_dependency_namespace` argument to `@cache()` to change the
+Use the `injected_dependency_namespace` argument to `@cache` to change the
prefix used if those names would clash anyway.
### Supported data types

When using the (default) `JsonCoder`, the cache can store any data type that FastAPI can convert to JSON, including Pydantic models and dataclasses,
-_provided_ that your endpoint has a correct return type annotation, unless
-the return type is a standard JSON-supported type such as a dictionary or a list.
+_provided_ that your endpoint has a correct return type annotation. An
+annotation is not needed if the return type is a standard JSON-supported Python
+type such as a dictionary or a list.

-E.g. for an endpoint that returns a Pydantic model named `SomeModel`:
+E.g. for an endpoint that returns a Pydantic model named `SomeModel`, the return annotation is used to ensure that the cached result is converted back to the correct class:
```python
from .models import SomeModel, create_some_model
@@ -133,9 +134,7 @@ async def foo() -> SomeModel:
    return create_some_model()
```

-It is not sufficient to configure a response model in the route decorator; the cache needs to know what the method itself returns.
-If no return type decorator is given, the primitive JSON type is returned instead.
+It is not sufficient to configure a response model in the route decorator; the cache needs to know what the method itself returns. If no return type annotation is given, the primitive JSON type is returned instead.

For broader type support, use the `fastapi_cache.coder.PickleCoder` or implement a custom coder (see below).
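Why the return annotation matters can be pictured without FastAPI at all: a JSON round-trip only ever yields primitive types, so something must tell the cache which class to rebuild. A hedged, stdlib-only sketch (this `SomeModel` dataclass is a stand-in for the README's Pydantic model, not the real code):

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class SomeModel:  # hypothetical stand-in for the README's Pydantic model
    kind: str
    name: str

original = SomeModel(kind="cat", name="whiskers")
# Caching flattens the object to primitive JSON types...
decoded = json.loads(json.dumps(asdict(original)))
assert decoded == {"kind": "cat", "name": "whiskers"}
# ...so without a return annotation, a cache hit can only yield this dict;
# the annotation is what lets the framework rebuild the declared class.
rebuilt = SomeModel(**decoded)
assert rebuilt == original
```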
@@ -145,46 +144,76 @@ By default use `JsonCoder`, you can write custom coder to encode and decode cach
inherit `fastapi_cache.coder.Coder`.

```python
+from typing import Any
+
+import orjson
+from fastapi.encoders import jsonable_encoder
+
+from fastapi_cache import Coder
+
+
+class ORJsonCoder(Coder):
+    @classmethod
+    def encode(cls, value: Any) -> bytes:
+        return orjson.dumps(
+            value,
+            default=jsonable_encoder,
+            option=orjson.OPT_NON_STR_KEYS | orjson.OPT_SERIALIZE_NUMPY,
+        )
+
+    @classmethod
+    def decode(cls, value: bytes) -> Any:
+        return orjson.loads(value)
@app.get("/")
-@cache(expire=60, coder=JsonCoder)
+@cache(expire=60, coder=ORJsonCoder)
async def index():
    return dict(hello="world")
```
### Custom key builder

-By default use builtin key builder, if you need, you can override this and pass in `cache` or `FastAPICache.init` to
-take effect globally.
+By default the `default_key_builder` builtin key builder is used; this creates a
+cache key from the function module and name, and the positional and keyword
+arguments converted to their `repr()` representations, encoded as an MD5 hash.
+
+You can provide your own by passing a key builder in to `@cache()`, or to
+`FastAPICache.init()` to apply globally.
+For example, if you wanted to use the request method, URL and query string as a cache key instead of the function identifier you could use:

```python
-def my_key_builder(
+def request_key_builder(
    func,
    namespace: str = "",
+    *,
    request: Request = None,
    response: Response = None,
-    *args,
    **kwargs,
):
-    prefix = FastAPICache.get_prefix()
-    cache_key = f"{prefix}:{namespace}:{func.__module__}:{func.__name__}:{args}:{kwargs}"
-    return cache_key
+    return ":".join([
+        namespace,
+        request.method.lower(),
+        request.url.path,
+        repr(sorted(request.query_params.items())),
+    ])
@app.get("/")
-@cache(expire=60, coder=JsonCoder, key_builder=my_key_builder)
+@cache(expire=60, key_builder=request_key_builder)
async def index():
    return dict(hello="world")
```
+## Backend notes

### InMemoryBackend

-`InMemoryBackend` store cache data in memory and use lazy delete, which mean if you don't access it after cached, it
-will not delete automatically.
+The `InMemoryBackend` stores cache data in memory and only deletes when an
+expired key is accessed. This means that if you don't access a function after
+data has been cached, the data will not be removed automatically.

### RedisBackend

-When using the redis backend, please make sure you pass in a redis client that does [_not_ decode responses][redis-decode] (`decode_responses` **must** be `False`, which is the default). Cached data is stored as `bytes` (binary), decoding these i the redis client would break caching.
+When using the Redis backend, please make sure you pass in a redis client that does [_not_ decode responses][redis-decode] (`decode_responses` **must** be `False`, which is the default). Cached data is stored as `bytes` (binary), decoding these in the Redis client would break caching.

[redis-decode]: https://redis-py.readthedocs.io/en/latest/examples/connection_examples.html#by-default-Redis-return-binary-responses,-to-decode-them-use-decode_responses=True
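The default key-building scheme described above (module plus function name plus argument `repr()`s, MD5-hashed) can be sketched in plain Python. `sketch_key_builder` is illustrative only; the exact format produced by the real `default_key_builder` may differ:

```python
import hashlib

def sketch_key_builder(func, namespace="", *args, **kwargs):
    # Combine module, function name, and the repr() of the arguments,
    # then hash with MD5 so keys stay short and backend-safe.
    raw = f"{func.__module__}:{func.__name__}:{args!r}:{kwargs!r}"
    return f"{namespace}:{hashlib.md5(raw.encode()).hexdigest()}"

def greet(name):
    return f"hello {name}"

key_a = sketch_key_builder(greet, "demo", "world")
key_b = sketch_key_builder(greet, "demo", "world")
key_c = sketch_key_builder(greet, "demo", "mars")
assert key_a == key_b   # identical calls share one cache entry
assert key_a != key_c   # different arguments produce distinct keys
```

Hashing the `repr()` of the arguments is why two calls with equal arguments hit the same cache entry, and why unhashable or unstable-`repr` arguments can produce surprising keys.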