feat(cache): use vary headers to compare cached response with request headers #3
Conversation
I implemented a …
…insertTime'. Also exported the CacheStore.
I tried to make the …
What about range queries? Are those implicit vary?
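One possible reading of "implicit vary" for range queries (an assumption on my part, not something the PR states): a ranged request asks for a byte slice, so a cached full 200 response cannot be reused verbatim. The simplest policy would be to bypass the cache for such requests, as in this minimal sketch (the function name is hypothetical):

```ts
// A minimal guard: a request carrying a Range header bypasses the
// cache, since a stored 200 response is not byte-equivalent to the
// 206 partial the client is asking for. The function name is mine.
function servableFromCache(reqHeaders: Record<string, string | undefined>): boolean {
  return reqHeaders['range'] === undefined
}
```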
I'd like to have max entry size and max entry TTL options.
Also, more tests wouldn't hurt.
What should happen when the cache gets full? Should new entries simply be ignored, or should an old entry be evicted when a new one arrives? Also, what do you mean by 'max entry TTL options'? The maximum TTL a single entry can have?
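A hedged sketch of how such options could look on a store; the option names (`maxEntrySize`, `maxEntryTtl`, `maxEntries`), the entry shape, and the evict-oldest policy are illustrative assumptions, not the PR's actual API:

```ts
// Hypothetical store options matching the discussion above; the names
// are illustrative, not the PR's actual API.
interface CacheStoreOptions {
  maxEntrySize?: number // bytes; larger responses are never stored
  maxEntryTtl?: number  // ms; caps how long any single entry may live
  maxEntries?: number   // total capacity of the store
}

interface Sized {
  size: number
  expiresAt: number // epoch ms
}

class SimpleCacheStore<V extends Sized> {
  #entries = new Map<string, V>()

  constructor(private opts: CacheStoreOptions = {}) {}

  set(key: string, value: V): void {
    const { maxEntrySize, maxEntryTtl, maxEntries } = this.opts
    // Oversized entries are silently skipped rather than evicting others.
    if (maxEntrySize !== undefined && value.size > maxEntrySize) return
    // Clamp the entry's lifetime to the store-wide TTL cap.
    if (maxEntryTtl !== undefined) {
      value.expiresAt = Math.min(value.expiresAt, Date.now() + maxEntryTtl)
    }
    // When full, evict the oldest entry; Map preserves insertion order.
    if (maxEntries !== undefined && this.#entries.size >= maxEntries) {
      const oldest = this.#entries.keys().next().value
      if (oldest !== undefined) this.#entries.delete(oldest)
    }
    this.#entries.set(key, value)
  }

  get(key: string): V | undefined {
    const entry = this.#entries.get(key)
    if (entry && entry.expiresAt <= Date.now()) {
      this.#entries.delete(key) // lazily drop expired entries
      return undefined
    }
    return entry
  }
}
```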
No, technically I will implement the case where the server responds with a …
…ug where data structures weren't correctly parsed back from JSON
… size. Added BJSON
Continued in nodejs/undici#3562
In this first commit, the cache handler supports multiple cached entries under the same cache key. It then tries to find which of those entries, if any, matches the request. A sketch of that matching step follows below.
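As an illustration of the matching step just described (the entry shape and function name are hypothetical, not the PR's actual code), each cached entry could remember the request-header values named by the response's `Vary` header, and lookup then scans the entries stored under the key:

```ts
// Hypothetical entry shape: alongside the body, each entry stores the
// request-header values that were in effect when it was cached, keyed
// by the (lowercased) names from the response's Vary header.
interface CachedEntry {
  varyHeaders: Record<string, string | undefined>
  body: Uint8Array
}

// Scan all entries under one cache key (e.g. method + URL) and return
// the first whose remembered vary values match the incoming request.
// Assumes request header names are already lowercased.
function findMatchingEntry(
  entries: CachedEntry[],
  reqHeaders: Record<string, string | undefined>
): CachedEntry | undefined {
  return entries.find((entry) =>
    Object.entries(entry.varyHeaders).every(
      ([name, storedValue]) => reqHeaders[name] === storedValue
    )
  )
}
```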
What I can't get my head around is how we are supposed to write to the cache. We need both the response and the request at the same time: the vary headers and the data from the response (the vary headers so that we can compare, the data so that we can cache the response), and the request headers so that we have something to compare against.
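One way to have both sides at once, sketched under the assumption that a handler instance is created per request (all names here are illustrative, not undici's real interfaces): the handler closes over the request headers, snapshots the `Vary`-named values when the response headers arrive, and only writes the entry on completion:

```ts
// All names here are illustrative, not undici's real interfaces. The
// handler is constructed per request, so it can close over the request
// headers and defer the store write until the response arrives.
class CachingHandler {
  #reqHeaders: Record<string, string | undefined>
  #varyValues: Record<string, string | undefined> = {}

  constructor(
    reqHeaders: Record<string, string | undefined>,
    readonly store: Map<string, { varyHeaders: object; body: Uint8Array }>
  ) {
    this.#reqHeaders = reqHeaders
  }

  onHeaders(resHeaders: Record<string, string | undefined>): void {
    // Snapshot the request-header values named by Vary; these become
    // the entry's match criteria when it is read back later.
    const vary = resHeaders['vary'] ?? ''
    for (const raw of vary.split(',')) {
      const name = raw.trim().toLowerCase()
      if (name) this.#varyValues[name] = this.#reqHeaders[name]
    }
  }

  onComplete(key: string, body: Uint8Array): void {
    // Request and response data are both available here, so this is
    // the point where the entry can actually be written.
    this.store.set(key, { varyHeaders: this.#varyValues, body })
  }
}
```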
My interpretation of the current functionality is that after we fetch an existing entry from the cache, the `onComplete` method caches that object again. What is the purpose of caching the same entry again?