
It would be far better to have user agents handle caching headers correctly instead of creating another configuration option (which will likely suffer from the same implementation problems).

cache-control: private with either sliding or concrete expiration time already handles this.
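As a sketch (the directive values below are illustrative, not from the thread), the two variants would look like this: the first gives a sliding 24-hour lifetime measured from each response, the second a fixed expiry date.

```http
Cache-Control: private, max-age=86400
```

```http
Cache-Control: private
Expires: Thu, 31 Dec 2026 23:59:59 GMT
```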



cache-control: private doesn't imply that a resource will never change, so on page refresh browsers still have to check whether the resource has been updated. immutable would avoid that cascade of 304 responses.


That's the entire point of the expiration time. Use a 2 year range and it's effectively immutable. No content will stay on device forever anyway and headers can easily be set to a smaller time-frame or must-revalidate if the content owner wants it.
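Concretely, the two options described here might look like the following (values are my own illustration): an effectively-immutable two-year lifetime (2 × 31,536,000 seconds), versus a short lifetime with forced revalidation for content the owner may change.

```http
Cache-Control: private, max-age=63072000
```

```http
Cache-Control: private, max-age=60, must-revalidate
```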

Browsers mistakenly keep checking for new copies within the expiration time, when they shouldn't. Fixing poor implementations with more standards never works well.


The problem is that servers are allowed to update their resources at any time, without waiting for any particular expiration time. So when a user instructs their browser to refresh the page, usually expecting the most up-to-date version, the browser has to choose between serving the still-valid (but possibly outdated) cached version and actually checking whether the resource has been updated.

Immutable makes it clear that the server won't update the resource in place and will handle updates by generating a new one so the browser can happily avoid checking those resources on page refresh.
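For example (the header is real RFC 8246 syntax; the versioned filename is a hypothetical illustration), a server following this pattern would serve a content-addressed URL such as /assets/app.3f9c2e.js with:

```http
Cache-Control: public, max-age=31536000, immutable
```

Updates then ship under a new hashed filename rather than mutating the old URL.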


I don't see a problem: browsers should honor the expiration time and choose the cached copy if it's still valid.

It's up to the server to use proper headers. Why say a file is OK to cache for years if it actually isn't? If the same URL will change content, then use shorter cache times, require active revalidation and/or ETag checks, or just use the typical cache-busting query-string parameters.
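The revalidation option mentioned here could be expressed like this (the ETag value is made up for illustration): a short freshness window plus an ETag the browser can present in an If-None-Match conditional request, getting a cheap 304 when nothing changed.

```http
Cache-Control: public, max-age=300, must-revalidate
ETag: "a1b2c3"
```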

This "immutable" flag is unnecessary.


You disagree with the choices the specs have made.

From the cache control rfc:

   When a response is "fresh" in the cache, it can be used to satisfy
   subsequent requests without contacting the origin server, thereby
   improving efficiency.

From the immutable rfc:

   Clients SHOULD NOT issue a conditional request during the response's
   freshness lifetime (e.g., upon a reload) unless explicitly overridden
   by the user (e.g., a force reload).


What exactly do I disagree with? The specs are fine; it's the implementations (the browsers) that are broken, which is what I've been saying throughout this thread.

If the implementation is faulty, what is another spec going to solve? Again, there is no need for an "immutable" flag because existing cache headers already express everything that's necessary.


Browsers are free to make a request "just in case" with cache control. With immutable, they are strongly discouraged from doing so. My point is, browsers aren't broken according to the spec if they make those just in case requests.


That is 100% on the browser to optimize itself. Otherwise we're diluting the point of the existing cache headers by adding a flag to say "we're actually really sure about this expiration time".


And many people are almost certainly going to find that they actually need to either recall an old immutable thing, or mutate it.

Also, I will certainly want to clear out my browser's cache on a regular basis. I do not want it keeping immutable things just because they shouldn't ever change.


You can't 'recall' something you already sent out to browsers, and if you need to mutate then it's easy to make a new URL.

This header won't make browsers cache data any differently. It skips a step when the cache is being read from.
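The step being skipped can be sketched as a small decision function (this is my own illustration of the behavior described in the thread and the immutable RFC, not actual browser source):

```python
def should_revalidate(fresh: bool, immutable: bool, reload: bool) -> bool:
    """Decide whether a cache read triggers a conditional request.

    fresh:     the cached response is within its freshness lifetime
    immutable: the response carried Cache-Control: immutable
    reload:    the user pressed reload (not a force reload)
    """
    if not fresh:
        # Stale entries must always be revalidated.
        return True
    if reload and not immutable:
        # On reload, browsers typically revalidate fresh-but-mutable entries
        # "just in case" -- this is the 304 cascade being discussed.
        return True
    # Fresh, and either a normal navigation or an immutable entry on reload:
    # serve straight from cache with no network round trip.
    return False


# Normal navigation with a fresh entry: no request either way.
print(should_revalidate(fresh=True, immutable=False, reload=False))  # False
# Reload: a mutable entry is rechecked, an immutable one is not.
print(should_revalidate(fresh=True, immutable=False, reload=True))   # True
print(should_revalidate(fresh=True, immutable=True, reload=True))    # False
```

Note that immutable only changes the reload row of this table; storage and eviction are unaffected.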


But in the current world, you can serve new content on the conditional check that caches currently do.

That said, I am ultimately for this. I think. There is plenty of data showing that this is a low hanging fruit to hit.


The conditional check that they do sometimes. Now half your users see the new version and half see the old version. Not much of a recall.


Still more of a recall than will be possible in the new world. And you can always detect the old code and prompt users to refresh. (Typically happens on a restart.)

Again, though, I am ultimately for this. I just remain skeptical of any panacea.


You sound as if people who consciously set an immutable header, or set cache expiration time header to 5 years, do not know what they are doing.

Should we optimize the web for clueless server operators?


With the numbers of folks that fix local development by clearing caches... Yes, I am comfortable claiming that. :)


Doesn't that indicate though that the cached resource can't be shared?


Then use cache-control: public with a sliding or concrete expiration time. We can even set different expirations for middle proxies and end user clients.
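The split lifetimes mentioned here use the s-maxage directive, which shared caches (proxies, CDNs) honor in place of max-age; for example (illustrative values), five minutes for end-user clients but a full day for intermediaries:

```http
Cache-Control: public, max-age=300, s-maxage=86400
```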



