OutputCache in ASP.NET Core: Server-Controlled Caching That Actually Works

TL;DR: OutputCache gives you server-controlled caching independent of HTTP headers, with named policies, tag-based invalidation, and vary strategies for production ASP.NET Core applications.

ResponseCaching middleware honors HTTP cache headers. OutputCache middleware ignores them. This distinction determines which one you should use.

Common questions this answers

  • What is the difference between OutputCache and ResponseCaching?
  • When should I use OutputCache instead of ResponseCaching?
  • How do I configure named caching policies?
  • How do I invalidate cached responses programmatically?
  • How do I prevent unbounded cache key growth?

Definition (what this means in practice)

OutputCache is server-controlled response caching middleware introduced in .NET 7. Unlike ResponseCaching (which respects HTTP cache headers from clients), OutputCache lets the server decide what to cache regardless of client headers.

In practice, this means your caching strategy is predictable and controlled entirely by your configuration, not by what headers clients send.

Terms used

  • OutputCache: server-side middleware that caches HTTP responses based on server configuration.
  • ResponseCaching: middleware that respects RFC 9111 HTTP caching semantics (client can bypass with Cache-Control headers).
  • Named policy: a reusable caching configuration identified by name.
  • Vary key: the components that make a cache entry unique (query parameters, headers, custom values).
  • Cache tag: a label for grouping cache entries for bulk invalidation.

Reader contract

This article is for:

  • Engineers configuring response caching in ASP.NET Core applications.
  • Teams migrating from ResponseCaching to OutputCache.

You will leave with:

  • A decision framework for OutputCache vs ResponseCaching.
  • Named policy patterns for common scenarios.
  • Strategies to keep cache keys bounded.
  • Tag-based invalidation patterns.

This is not for:

  • In-memory caching with IMemoryCache.
  • Distributed caching fundamentals.
  • CDN or edge caching strategies.

Quick start (10 minutes)

If you need server-controlled caching that ignores client headers:

Verified on: ASP.NET Core (.NET 10).

// Program.cs
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddOutputCache(options =>
{
    options.AddPolicy("Default1Hour", b => b.Expire(TimeSpan.FromHours(1)));
});

var app = builder.Build();
app.UseOutputCache(); // After UseRouting (and UseCors, if used), before UseAuthorization

Apply to endpoints:

// Controller
[OutputCache(PolicyName = "Default1Hour")]
public IActionResult Index() => View();

// Minimal API
app.MapGet("/api/data", () => GetData()).CacheOutput("Default1Hour");

OutputCache vs ResponseCaching

This is the key decision. Choose based on who controls caching.

Criterion                 | ResponseCaching                       | OutputCache
Who controls caching      | Client and server (via HTTP headers)  | Server only
Client bypass             | Yes (Cache-Control: no-cache)         | No
HTTP RFC compliance       | Yes (RFC 9111)                        | No (ignores request cache headers)
Server-side storage       | Yes (in-memory, when headers allow)   | Yes (in-memory by default; Redis optional)
Programmatic invalidation | No                                    | Yes (via tags)
Available since           | ASP.NET Core 1.x                      | .NET 7
Best for                  | Public APIs behind CDNs and proxies   | UI apps, internal APIs

When to use ResponseCaching

  • Public APIs where clients and CDNs should participate in caching.
  • Scenarios where RFC-compliant cache behavior is required.
  • When you want proxies between client and server to cache responses.
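
If ResponseCaching is the right fit, the baseline setup looks roughly like this. This is a minimal sketch, not a full configuration; the duration and vary keys are illustrative.

// Program.cs
builder.Services.AddResponseCaching();

var app = builder.Build();
app.UseResponseCaching();

// Controller: [ResponseCache] emits Cache-Control headers that the middleware
// (and any client, proxy, or CDN) honors.
[ResponseCache(Duration = 60, Location = ResponseCacheLocation.Any, VaryByQueryKeys = new[] { "page" })]
public IActionResult Index() => View();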

When to use OutputCache

  • Razor Pages and MVC applications (UI apps).
  • Internal APIs where you control all clients.
  • When you need predictable caching regardless of client behavior.
  • When you need programmatic cache invalidation.

Named policy patterns

Define policies once, apply them consistently across endpoints.

Basic policies

builder.Services.AddOutputCache(options =>
{
    // Static content: cache for 12 hours
    options.AddPolicy("Static12Hours", b => b.Expire(TimeSpan.FromHours(12)));

    // Detail pages: cache for 6 hours
    options.AddPolicy("Detail6Hours", b => b.Expire(TimeSpan.FromHours(6)));

    // API responses: cache for 1 minute
    options.AddPolicy("Api1Minute", b => b.Expire(TimeSpan.FromMinutes(1)));
});

Policy with query variation

When the same endpoint returns different content based on query parameters:

options.AddPolicy("VaryByPage", b => b
    .Expire(TimeSpan.FromHours(12))
    .SetVaryByQuery("page"));

This creates separate cache entries for /articles?page=1 and /articles?page=2.
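
Applying the policy (a sketch; the route and handler names are placeholders):

// Minimal API
app.MapGet("/articles", GetArticles).CacheOutput("VaryByPage");

// Controller
[OutputCache(PolicyName = "VaryByPage")]
public IActionResult Index(int page = 1) => View();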

Policy with header variation

When content varies by request header:

options.AddPolicy("VaryByLanguage", b => b
    .Expire(TimeSpan.FromHours(6))
    .SetVaryByHeader("Accept-Language"));

Disable caching for specific endpoints

// Controller attribute
[OutputCache(NoStore = true)]
public IActionResult Contact() => View();

// Or omit the attribute entirely
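
For minimal APIs, the equivalent opt-out is NoCache() on the policy builder (a sketch; the /contact route is a placeholder):

// Minimal API
app.MapGet("/contact", () => Results.Ok()).CacheOutput(b => b.NoCache());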

Vary strategies: keeping cache keys bounded

Cache key explosion is a real problem. If you vary by unbounded inputs, you create unbounded cache entries.

Bounded variation

Good: vary by a small set of known values.

// Page number is bounded (users rarely go past page 10)
options.AddPolicy("VaryByPage", b => b
    .Expire(TimeSpan.FromHours(12))
    .SetVaryByQuery("page"));

Unbounded variation (dangerous)

Bad: vary by user-specific values in public endpoints.

// DON'T: This creates a cache entry per user
options.AddPolicy("BadPolicy", b => b
    .SetVaryByQuery("userId")); // Unbounded!

Custom vary with validation

When you need to vary by a value but want to control the key space:

options.AddPolicy("VaryByCulture", b => b
    .Expire(TimeSpan.FromHours(6))
    .VaryByValue(httpContext =>
    {
        var culture = httpContext.Request.Query["culture"].ToString();
        // Normalize to known cultures only
        var normalized = culture switch
        {
            "en" or "en-US" => "en",
            "es" or "es-ES" => "es",
            "fr" or "fr-FR" => "fr",
            _ => "en" // Default
        };
        return new KeyValuePair<string, string>("culture", normalized);
    }));

For user-specific caching (like personalized content), vary by cookie but ensure the cookie value is controlled:

options.AddPolicy("VaryByVisitor", b => b
    .Expire(TimeSpan.FromSeconds(10))
    .VaryByValue(httpContext =>
    {
        if (!httpContext.Request.Cookies.TryGetValue("visitor_id", out var visitorId)
            || !Guid.TryParse(visitorId, out var parsed))
        {
            return new KeyValuePair<string, string>("visitor", string.Empty);
        }
        return new KeyValuePair<string, string>("visitor", parsed.ToString("D"));
    }));

Cache invalidation with tags

Tags let you invalidate groups of cache entries without knowing their exact keys.

Tagging endpoints

// Tag by content type
app.MapGet("/articles", GetArticles)
    .CacheOutput(b => b.Tag("articles").Expire(TimeSpan.FromHours(1)));

app.MapGet("/articles/{slug}", GetArticle)
    .CacheOutput(b => b.Tag("articles").Expire(TimeSpan.FromHours(1)));

// Tag by feature area
builder.Services.AddOutputCache(options =>
{
    options.AddPolicy("BlogContent", b => b
        .Expire(TimeSpan.FromHours(6))
        .Tag("blog"));
});

Programmatic invalidation

// Inject IOutputCacheStore
public class ArticleController(IOutputCacheStore cacheStore) : Controller
{
    [HttpPost]
    public async Task<IActionResult> Update(ArticleModel model)
    {
        // Save changes...

        // Invalidate all cached articles
        await cacheStore.EvictByTagAsync("articles", default);

        return RedirectToAction("Index");
    }
}
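
For minimal APIs, IOutputCacheStore can be resolved directly in the handler signature. A sketch, with a placeholder route and update logic:

app.MapPost("/articles/{slug}", async (string slug, IOutputCacheStore cacheStore, CancellationToken ct) =>
{
    // Save changes...

    // Invalidate all cached articles
    await cacheStore.EvictByTagAsync("articles", ct);
    return Results.NoContent();
});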

Controller attribute tagging

[OutputCache(PolicyName = "Detail6Hours", Tags = ["articles"])]
public async Task<IActionResult> Read(string slug)
{
    // ...
}

Memory management

OutputCache stores responses in memory by default. Configure limits to prevent memory exhaustion.

Size limits

builder.Services.AddOutputCache(options =>
{
    // Maximum total cache size (default: 100 MB)
    options.SizeLimit = 100 * 1024 * 1024;

    // Maximum size per cached response (default: 64 MB)
    options.MaximumBodySize = 64 * 1024 * 1024;

    // Default expiration when policy doesn't specify (default: 60 seconds)
    options.DefaultExpirationTimeSpan = TimeSpan.FromMinutes(5);
});

Eviction behavior

When SizeLimit is reached:

  • New responses are not cached until existing entries expire or are evicted.
  • Entries are evicted based on expiration time.
  • Consider shorter expiration times for high-traffic endpoints.
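
For high-traffic endpoints, a short-lived policy keeps entries turning over instead of pinning the size limit. A sketch; the policy name and route are illustrative:

// Short expiration for a hot endpoint
options.AddPolicy("Hot30Seconds", b => b.Expire(TimeSpan.FromSeconds(30)));

app.MapGet("/api/trending", GetTrending).CacheOutput("Hot30Seconds");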

Distributed cache with Redis

For multi-server deployments, use Redis to share cache across instances.

Configuration

Install the package:

dotnet add package Microsoft.AspNetCore.OutputCaching.StackExchangeRedis

Configure the backend:

builder.Services.AddStackExchangeRedisOutputCache(options =>
{
    options.Configuration = builder.Configuration.GetConnectionString("Redis");
    options.InstanceName = "MyApp:";
});

builder.Services.AddOutputCache(options =>
{
    options.AddPolicy("Default1Hour", b => b.Expire(TimeSpan.FromHours(1)));
});

When to use distributed cache

  • Multiple server instances serving the same content.
  • Cache needs to survive process restarts.
  • Cache invalidation must propagate across servers.

When in-memory is sufficient

  • Single server deployment.
  • Short expiration times (under a few minutes).
  • Content that can be regenerated quickly on cache miss.

Middleware ordering

OutputCache placement in the middleware pipeline matters.

var app = builder.Build();

app.UseRouting();           // Must come before OutputCache
app.UseOutputCache();       // After routing, before auth
app.UseAuthorization();
app.MapControllers();

If you use CORS:

app.UseCors();              // Must come before OutputCache
app.UseOutputCache();

Copy/paste artifact: production OutputCache configuration

// Program.cs - OutputCache configuration for production
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddOutputCache(options =>
{
    // Memory limits
    options.SizeLimit = 100 * 1024 * 1024; // 100 MB
    options.MaximumBodySize = 10 * 1024 * 1024; // 10 MB per response
    options.DefaultExpirationTimeSpan = TimeSpan.FromMinutes(5);

    // Named policies
    options.AddPolicy("Static12Hours", b => b.Expire(TimeSpan.FromHours(12)));
    options.AddPolicy("Detail6Hours", b => b.Expire(TimeSpan.FromHours(6)));
    options.AddPolicy("List1Hour", b => b
        .Expire(TimeSpan.FromHours(1))
        .SetVaryByQuery("page", "sort"));
    options.AddPolicy("Api1Minute", b => b
        .Expire(TimeSpan.FromMinutes(1))
        .Tag("api"));
});

var app = builder.Build();

// Middleware order
app.UseRouting();
app.UseOutputCache();
app.UseResponseCompression();
app.UseAuthorization();
app.MapControllers();

app.Run();

Common failure modes

  1. Caching authenticated content: OutputCache serves cached responses to all users by default. Do not cache user-specific content without proper vary configuration.

  2. Unbounded vary keys: Varying by unbounded inputs (user IDs, timestamps) creates cache key explosion.

  3. Wrong middleware order: OutputCache before UseRouting causes routing data to be unavailable.

  4. Caching Set-Cookie responses: The default policy does not store responses that set cookies, but a custom or relaxed policy can, and cached cookies would then be served to other users. Use NoStore for such endpoints.

  5. Missing tag invalidation: Updating content without invalidating related cache tags serves stale data.

  6. No memory limits: Default 100 MB may be too large or too small for your deployment.

Checklist

  • OutputCache vs ResponseCaching decision documented.
  • Named policies defined for each caching scenario.
  • Vary keys are bounded (no unbounded user inputs).
  • Tags configured for content that needs invalidation.
  • Memory limits appropriate for deployment size.
  • Middleware order correct (after UseRouting, before UseAuthorization).
  • Authenticated/personalized endpoints excluded or properly varied.
  • Endpoints that set cookies use NoStore.

FAQ

Can I use both OutputCache and ResponseCaching?

Technically yes, but it is not recommended. Choose one based on your caching control requirements. OutputCache for server control, ResponseCaching for HTTP RFC compliance.

Does OutputCache work with authenticated requests?

By default, OutputCache does not cache authenticated requests (when the Authorization header is present or the user is authenticated). You can override this with custom policies, but be careful about serving one user's content to another.
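
A sketch of what such a policy can look like, assuming you also vary by user so one user's response is never served to another. The class name and registration are illustrative; verify how the policy combines with the built-in default policy on your target .NET version.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.OutputCaching;
using Microsoft.Extensions.Primitives;

// Sketch: allow caching for authenticated GET/HEAD requests,
// varied by the Authorization header.
public sealed class AuthenticatedCachePolicy : IOutputCachePolicy
{
    public ValueTask CacheRequestAsync(OutputCacheContext context, CancellationToken cancellation)
    {
        var method = context.HttpContext.Request.Method;
        var cacheable = HttpMethods.IsGet(method) || HttpMethods.IsHead(method);

        context.EnableOutputCaching = true;
        context.AllowCacheLookup = cacheable;
        context.AllowCacheStorage = cacheable;
        context.AllowLocking = true;
        context.CacheVaryByRules.HeaderNames = "Authorization";
        return ValueTask.CompletedTask;
    }

    public ValueTask ServeFromCacheAsync(OutputCacheContext context, CancellationToken cancellation)
        => ValueTask.CompletedTask;

    public ValueTask ServeResponseAsync(OutputCacheContext context, CancellationToken cancellation)
    {
        // Only store successful responses that do not set cookies.
        var response = context.HttpContext.Response;
        if (response.StatusCode != StatusCodes.Status200OK ||
            !StringValues.IsNullOrEmpty(response.Headers.SetCookie))
        {
            context.AllowCacheStorage = false;
        }
        return ValueTask.CompletedTask;
    }
}

// Registration (illustrative):
// builder.Services.AddOutputCache(o => o.AddPolicy("AuthenticatedCache", new AuthenticatedCachePolicy()));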

How do I cache different content for different users?

Use VaryByValue with a user identifier. However, this creates a cache entry per user, which may defeat the purpose of caching. Consider whether per-user caching is actually beneficial for your scenario.

What happens when the cache is full?

New responses are not cached until existing entries expire. The cache does not evict unexpired entries to make room for new ones. Use appropriate expiration times for your traffic patterns.

Can I use OutputCache with Minimal APIs?

Yes. Use the CacheOutput() extension method or apply the [OutputCache] attribute to the route handler.
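
For example (a sketch; the routes are placeholders):

// Extension method
app.MapGet("/time", () => DateTime.UtcNow.ToString("O")).CacheOutput("Api1Minute");

// Attribute on the route handler
app.MapGet("/time-attr", [OutputCache(Duration = 60)] () => DateTime.UtcNow.ToString("O"));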

How do I debug cache behavior?

Check response headers. Cached responses include an Age header indicating how long the response has been cached. Use browser developer tools or curl to inspect headers.

What to do next

If you are using ResponseCaching for UI applications, evaluate migrating to OutputCache for server-controlled behavior. Define named policies for your common caching scenarios and implement tag-based invalidation for content that changes.

For more on ASP.NET Core performance, read EF Core Performance Mistakes That Ship to Production.

If you want help optimizing caching strategy for your application, reach out via Contact.

Author notes

Decisions:

  • Recommend OutputCache over ResponseCaching for UI apps. Rationale: server-controlled caching is more predictable than RFC-compliant caching that clients can bypass.
  • Emphasize bounded vary keys. Rationale: unbounded vary parameters cause cache key explosion and memory issues.
  • Recommend short expiration with tags over long expiration. Rationale: easier to invalidate stale content; cache misses are acceptable if regeneration is fast.

Observations:

  • Teams often start with ResponseCaching, then switch to OutputCache when they realize clients can bypass caching.
  • Cache key explosion from unbounded query parameters is a common production incident.
  • Missing middleware ordering causes subtle bugs where caching appears to work but varies incorrectly.