# Cache

> Nitro provides a caching system built on top of the storage layer, powered by [ocache](https://github.com/unjs/ocache).

## Cached handlers

To cache an event handler, wrap it with the `defineCachedHandler` function.

It works like `defineHandler` but accepts a second parameter for the [cache options](#options).

```ts [routes/cached.ts]
import { defineCachedHandler } from "nitro/cache";

export default defineCachedHandler((event) => {
  return "I am cached for an hour";
}, { maxAge: 60 * 60 });
```

In this example, the response is cached for 1 hour, and a stale value is served to the client while the cache updates in the background. If you want to wait for the fresh response instead, set `swr: false`.

See the [options](#options) section for more details about the available options.

<important>

**Request headers are dropped** when handling cached responses. Use the [`varies` option](#options) to consider specific headers when caching and serving the responses.
</important>
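To see why this matters, here is a hypothetical sketch of how headers listed in `varies` can be folded into the cache key (illustrative only, not Nitro's actual key function):

```typescript
// Hypothetical sketch: fold the headers listed in `varies` into the cache key
// so each combination of header values gets its own cache entry.
function cacheKeyFor(
  pathname: string,
  headers: Record<string, string>,
  varies: string[] = [],
): string {
  const parts = varies
    .map((name) => name.toLowerCase())
    .sort() // stable order so the key does not depend on option order
    .map((name) => `${name}=${headers[name] ?? ""}`);
  return [pathname, ...parts].join(":");
}

// Same path, different `host` headers → distinct cache entries.
const keyA = cacheKeyFor("/blog", { host: "a.example.com" }, ["host"]);
const keyB = cacheKeyFor("/blog", { host: "b.example.com" }, ["host"]);
```

Without `varies`, both requests would collapse into the same entry, and the first tenant's response would be served to every host.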

### Automatic HTTP headers

When using `defineCachedHandler`, Nitro automatically manages HTTP cache headers on cached responses:

- **`etag`** -- A weak ETag (`W/"..."`) is generated from the response body hash if not already set by the handler.
- **`last-modified`** -- Set to the current time when the response is first cached, if not already set.
- **`cache-control`** -- Automatically set based on the `swr`, `maxAge`, and `staleMaxAge` options:
  - With `swr: true`: `s-maxage=<maxAge>, stale-while-revalidate=<staleMaxAge>`
  - With `swr: false`: `max-age=<maxAge>`


### Conditional requests (304 Not Modified)

Cached handlers automatically support conditional requests. When a client sends `if-none-match` or `if-modified-since` headers matching the cached response, Nitro returns a `304 Not Modified` response without a body.
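As an illustrative sketch (not Nitro's actual code), the conditional check compares the cached response's validators against the request headers:

```typescript
// Illustrative sketch of conditional-request handling for a cached response.
interface CachedResponse {
  etag?: string;
  lastModified?: number; // epoch milliseconds
}

function isNotModified(
  req: { "if-none-match"?: string; "if-modified-since"?: string },
  cached: CachedResponse,
): boolean {
  // `if-none-match` takes precedence over `if-modified-since`, per HTTP semantics.
  if (req["if-none-match"] && cached.etag) {
    return req["if-none-match"] === cached.etag;
  }
  if (req["if-modified-since"] && cached.lastModified) {
    return Date.parse(req["if-modified-since"]) >= cached.lastModified;
  }
  return false; // no matching validators → serve the full cached body
}
```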

### Request method filtering

Only `GET` and `HEAD` requests are cached. All other HTTP methods (`POST`, `PUT`, `DELETE`, etc.) automatically bypass the cache and call the handler directly.

### Request deduplication

When multiple concurrent requests hit the same cache key while the cache is being resolved, only one invocation of the handler runs. All concurrent requests wait for and share the same result.
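The pattern can be sketched with an in-flight promise map (an illustrative sketch, not Nitro's actual implementation):

```typescript
// Illustrative request deduplication: concurrent callers for the same key
// share a single in-flight promise instead of each invoking the resolver.
const pending = new Map<string, Promise<unknown>>();

async function dedupe<T>(key: string, resolver: () => Promise<T>): Promise<T> {
  const existing = pending.get(key);
  if (existing) return existing as Promise<T>; // join the in-flight resolution
  const promise = resolver().finally(() => pending.delete(key));
  pending.set(key, promise);
  return promise;
}
```

Once the shared promise settles, the key is released so the next expiry triggers a fresh resolution.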

## Cached functions

You can also cache a function using `defineCachedFunction`. This is useful for caching the result of a function that is not itself an event handler but is used inside one, and for reusing that result across multiple handlers.

For example, you might want to cache the result of an API call for one hour:

```ts [routes/api/stars/[...repo\].ts]
import { defineCachedFunction } from "nitro/cache";
import { defineHandler, type H3Event } from "nitro";

export default defineHandler(async (event) => {
  const { repo } = event.context.params;
  const stars = await cachedGHStars(repo).catch(() => 0)

  return { repo, stars }
});

const cachedGHStars = defineCachedFunction(async (repo: string) => {
  const data = await fetch(`https://api.github.com/repos/${repo}`).then(res => res.json());

  return data.stargazers_count;
}, {
  maxAge: 60 * 60,
  name: "ghStars",
  getKey: (repo: string) => repo
});
```

The stars will be cached in development inside `.nitro/cache/functions/ghStars/<owner>/<repo>.json` with `value` being the number of stars.

```json
{"expires":1677851092249,"value":43991,"mtime":1677847492540,"integrity":"ZUHcsxCWEH"}
```

<important>

Because the cached data is serialized to JSON, it is important that the cached function does not return anything that cannot be serialized, such as Symbols, Maps, Sets...
</important>
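A plain JSON round trip shows the loss; only the JSON-safe fields survive:

```typescript
// A JSON round trip silently drops non-serializable values.
const original = {
  count: 42,
  tags: new Set(["a", "b"]),
  meta: new Map([["k", "v"]]),
  id: Symbol("id"),
};

const roundTripped = JSON.parse(JSON.stringify(original));
// `count` survives, but the Set and Map become empty objects,
// and the Symbol-valued property is dropped entirely.
```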

<callout>

If you are using edge workers to host your application, you should follow the instructions below.
In edge workers, the instance is destroyed after each request. Nitro automatically uses `event.waitUntil` to keep the instance alive while the cache updates in the background after the response is sent to the client.

To ensure that your cached functions work as expected in edge workers, **you should always pass the `event` as the first argument to the function using `defineCachedFunction`.**

```ts [routes/api/stars/[...repo\].ts] {6,11,18}
import { defineCachedFunction } from "nitro/cache";
import { defineHandler, type H3Event } from "nitro";

export default defineHandler(async (event) => {
  const { repo } = event.context.params;
  const stars = await cachedGHStars(event, repo).catch(() => 0)

  return { repo, stars }
});

const cachedGHStars = defineCachedFunction(async (event: H3Event, repo: string) => {
  const data = await fetch(`https://api.github.com/repos/${repo}`).then(res => res.json());

  return data.stargazers_count;
}, {
  maxAge: 60 * 60,
  name: "ghStars",
  getKey: (event: H3Event, repo: string) => repo
});
```

This way, the function will be able to keep the instance alive while the cache is being updated without slowing down the response to the client.
</callout>


## Using route rules

This feature enables you to add caching rules for routes matching a glob pattern directly in the main configuration file. This is especially useful for applying a global cache strategy to a part of your application.

Cache all the blog routes for 1 hour with `stale-while-revalidate` behavior:

```ts [nitro.config.ts]
import { defineConfig } from "nitro";

export default defineConfig({
  routeRules: {
    "/blog/**": { cache: { maxAge: 60 * 60 } },
  },
});
```

To use a [custom cache storage](#cache-storage) mount point, set the `base` option.

```ts [nitro.config.ts]
import { defineConfig } from "nitro";

export default defineConfig({
  storage: {
    redis: {
      driver: "redis",
      url: "redis://localhost:6379",
    },
  },
  routeRules: {
    "/blog/**": { cache: { maxAge: 60 * 60, base: "redis" } },
  },
});
```

### Route rules shortcuts

You can use the `swr` shortcut for enabling `stale-while-revalidate` caching on route rules. When set to `true`, SWR is enabled with the default `maxAge`. When set to a number, it is used as the `maxAge` value in seconds.

```ts [nitro.config.ts]
import { defineConfig } from "nitro";

export default defineConfig({
  routeRules: {
    "/blog/**": { swr: true },
    "/api/**": { swr: 3600 },
  },
});
```
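Conceptually, the shortcut maps onto cache options roughly like this hypothetical helper (Nitro's internal normalization may differ):

```typescript
// Hypothetical mapping of the `swr` shortcut onto cache options.
type CacheOptions = { swr: boolean; maxAge?: number };

function normalizeSwr(swr: boolean | number): CacheOptions | undefined {
  if (swr === false) return undefined; // no caching rule
  if (swr === true) return { swr: true }; // SWR with the default maxAge
  return { swr: true, maxAge: swr }; // number → maxAge in seconds
}
```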

To explicitly disable caching on a route, set `cache: false`:

```ts [nitro.config.ts]
import { defineConfig } from "nitro";

export default defineConfig({
  routeRules: {
    "/api/realtime/**": { cache: false },
  },
});
```

<note>

When using route rules, cached handlers use the group `'nitro/route-rules'` instead of the default `'nitro/handlers'`.
</note>

## Cache storage

Nitro stores the data in the `cache` storage mount point.

- In production, it will use the [memory driver](https://unstorage.unjs.io/drivers/memory) by default.
- In development, it will use the [filesystem driver](https://unstorage.unjs.io/drivers/fs), writing to a temporary dir (`.nitro/cache`).

To overwrite the production storage, set the `cache` mount point using the `storage` option:

```ts [nitro.config.ts]
import { defineConfig } from "nitro";

export default defineConfig({
  storage: {
    cache: {
      driver: 'redis',
      /* redis connector options */
    }
  }
})
```

In development, you can also overwrite the cache mount point using the `devStorage` option:

```ts [nitro.config.ts]
import { defineConfig } from "nitro";

export default defineConfig({
  storage: {
    cache: {
      // production cache storage
    },
  },
  devStorage: {
    cache: {
      // development cache storage
    }
  }
})
```

## Options

The `defineCachedHandler` and `defineCachedFunction` functions accept the following options:

### Shared options

These options are available for both `defineCachedHandler` and `defineCachedFunction`:

<field-group>

<field>

Name of the storage mountpoint to use for caching. :br
Defaults to `cache`.
</field>

<field>

Guessed from function name if not provided, and falls back to `'_'` otherwise.
</field>

<field>

Defaults to `'nitro/handlers'` for handlers and `'nitro/functions'` for functions.
</field>

<field>

A function that accepts the same arguments as the original function and returns a cache key (`String`). :br
If not provided, a built-in hash function will be used to generate a key based on the function arguments. For cached handlers, the key is derived from the request URL path and search params.
</field>

<field>

A value that invalidates the cache when changed. :br
By default, it is computed from **function code**, used in development to invalidate the cache when the function code changes.
</field>

<field>

Maximum age that cache is valid, in seconds. :br
Defaults to `1` (second).
</field>

<field>

Maximum age that a stale cache is valid, in seconds. If set to `-1`, a stale value will still be sent to the client while the cache updates in the background. :br
Defaults to `0` (disabled).
</field>

<field>

Enable `stale-while-revalidate` behavior to serve a stale cached response while asynchronously revalidating it. :br
When enabled, stale cached values are returned immediately while revalidation happens in the background. When disabled, the caller waits for the fresh value before responding (the stale entry is cleared). :br
Defaults to `true`.
</field>

<field>

A function that returns a `boolean` to invalidate the current cache and create a new one.
</field>

<field>

A function that returns a `boolean` to bypass the current cache without invalidating the existing entry.
</field>

<field>

A custom error handler called when the cached function throws. :br
By default, errors are logged to the console and captured by the Nitro error handler.
</field>
</field-group>

### Handler-only options

These options are only available for `defineCachedHandler`:

<field-group>

<field>

When `true`, skip full response caching and only handle conditional request headers (`if-none-match`, `if-modified-since`) for `304 Not Modified` responses. The handler is called on every request but benefits from conditional caching.
</field>

<field>

An array of request header names to vary the cache key on. Headers listed here are preserved on the request during cache resolution and included in the cache key, making the cache unique per combination of header values. :br :br
Headers **not** listed in `varies` are stripped from the request before calling the handler to ensure consistent cache hits. :br :br
For multi-tenant environments, you may want to pass `['host', 'x-forwarded-host']` to ensure these headers are not discarded and that the cache is unique per tenant.
</field>
</field-group>

### Function-only options

These options are only available for `defineCachedFunction`:

<field-group>

<field>

Transform the cache entry before returning. The return value replaces the cached value.
</field>

<field>

Validate a cache entry. Return `false` to treat the entry as invalid and trigger re-resolution.
</field>
</field-group>

## SWR behavior

The `stale-while-revalidate` (SWR) pattern is enabled by default (`swr: true`). The table below shows how it interacts with other options:

| `swr` | `maxAge` | Behavior |
| --- | --- | --- |
| `true` (default) | `1` (default) | Cache for 1 second, serve stale while revalidating |
| `true` | `3600` | Cache for 1 hour, serve stale while revalidating |
| `false` | `3600` | Cache for 1 hour, wait for fresh value when expired |
| `true` | `3600` with `staleMaxAge: 600` | Cache for 1 hour, serve stale for up to 10 minutes while revalidating |

When `swr` is enabled and a cached value exists but has expired:

1. The stale cached value is returned immediately to the client.
2. The function/handler is called in the background to refresh the cache.
3. On edge workers, `event.waitUntil` is used to keep the background refresh alive.

When `swr` is disabled and a cached value has expired:

1. The stale entry is cleared.
2. The client waits for the function/handler to resolve with a fresh value.
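Both flows can be condensed into a small decision sketch (illustrative only, not Nitro's source):

```typescript
// Illustrative SWR decision: what to do when a cache entry is looked up.
type Entry = { value: unknown; mtime: number };
type Decision = "serve-fresh" | "serve-stale-and-revalidate" | "wait-for-fresh";

function decide(
  entry: Entry | undefined,
  now: number,
  opts: { maxAge: number; swr: boolean },
): Decision {
  if (!entry) return "wait-for-fresh"; // nothing cached yet
  const ageSeconds = (now - entry.mtime) / 1000;
  if (ageSeconds <= opts.maxAge) return "serve-fresh";
  // Expired: SWR serves the stale value and refreshes in the background;
  // otherwise the caller blocks on a fresh resolution.
  return opts.swr ? "serve-stale-and-revalidate" : "wait-for-fresh";
}
```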

## Cache keys and invalidation

When using the `defineCachedFunction` or `defineCachedHandler` functions, the cache key is generated using the following pattern:

```ts
`${options.base}:${options.group}:${options.name}:${options.getKey(...args)}.json`
```
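Spelled out as a small helper (assuming the default `base: "cache"`):

```typescript
// Build a cache key from the documented pattern.
function cacheKey(opts: {
  base: string;
  group: string;
  name: string;
  key: string;
}): string {
  return `${opts.base}:${opts.group}:${opts.name}:${opts.key}.json`;
}

const key = cacheKey({
  base: "cache",
  group: "nitro/functions",
  name: "getAccessToken",
  key: "default",
});
// → "cache:nitro/functions:getAccessToken:default.json"
```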

For example, the following function:

```ts
import { defineCachedFunction } from "nitro/cache";

const getAccessToken = defineCachedFunction(() => {
  return String(Date.now())
}, {
  maxAge: 10,
  name: "getAccessToken",
  getKey: () => "default"
});
```

Will generate the following cache key:

```
cache:nitro/functions:getAccessToken:default.json
```

You can invalidate the cached function entry with:

```ts
import { useStorage } from "nitro/storage";

await useStorage('cache').removeItem('nitro/functions:getAccessToken:default.json')
```

<note>

For cached handlers, the cache key includes a hash of the URL path and, when using the [`varies`](#handler-only-options) option, hashes of the specified header values appended to the key.
</note>

<note>

Responses with HTTP status codes `>= 400` or with an undefined body are not cached. This prevents caching error responses.
</note>

<read-more>

Read more about the Nitro storage.
</read-more>
