Throttle config updates #66
Comments
There has not been any activity on this issue in the last 14 days. It will automatically be closed after 7 more days. Remove the stale label or comment to keep it open.
Sorry for the late reaction! That's a great suggestion and would certainly be useful in highly dynamic environments. 👍 I cannot make any promises as to when (or even if) any of us might get to this, but in the meantime I'd be happy to review any PR coming my way concerning this feature.
Hey, that's good to hear! I'm not really a Go expert, but I'll see if I can take a stab at it.
I think the best place to start would probably be kube-httpcache/pkg/controller/watch.go, lines 14 to 57 in 8b81c12. For the actual rate limiting, a structure like the debounced forwarder sketched below could work.
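As a rough illustration of that idea, here is a minimal debounce sketch in Go. The `string` payload, the `debounce` helper, and the simulated reload are assumptions for the example, not code from the repository: each new update restarts the grace period, and a pending update is discarded whenever a newer one arrives.

```go
package main

import (
	"fmt"
	"time"
)

// debounce forwards only the most recent value received on in, and only
// after grace has elapsed without a newer one arriving. Updates that are
// superseded within the grace period are discarded.
func debounce[T any](in <-chan T, out chan<- T, grace time.Duration) {
	var (
		pending T
		have    bool
		fire    <-chan time.Time // nil while no update is pending
	)
	for {
		select {
		case v, ok := <-in:
			if !ok {
				if have {
					out <- pending // flush the last pending update
				}
				close(out)
				return
			}
			pending, have = v, true
			// Each new update restarts the grace period. Timers from
			// superseded updates fire into abandoned channels and are
			// garbage-collected; good enough for a sketch.
			fire = time.After(grace)
		case <-fire:
			out <- pending
			have = false
			fire = nil
		}
	}
}

func main() {
	updates := make(chan string)
	apply := make(chan string)
	go debounce(updates, apply, 500*time.Millisecond)

	// Simulate a burst of endpoint changes caused by aggressive autoscaling.
	go func() {
		for _, backends := range []string{"pod-1", "pod-1,pod-2", "pod-1,pod-2,pod-3"} {
			updates <- backends
			time.Sleep(100 * time.Millisecond)
		}
		close(updates)
	}()

	// Only the final state of the burst triggers a (simulated) reload.
	for cfg := range apply {
		fmt.Println("reloading Varnish config with backends:", cfg)
	}
}
```

Something like this wired between the endpoint watcher and the code that regenerates the VCL would collapse a burst of pod churn into a single config update.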
Is your feature request related to a problem? Please describe.
Aggressive autoscaling can trigger config updates many times in quick succession, rendering Varnish unable to respond.
Describe the solution you'd like
Have a configurable grace period before issuing a config update, and discard a pending update if another change is registered in the meantime.
Describe alternatives you've considered
Less aggressive autoscaling, or just living with it, but neither seems that hot.
Additional context
We have pretty aggressive autoscaling since we have to go from 0 to 10,000 requests per second within 2 minutes. Because the autoscaling is so aggressive, new Varnish instances keep coming up, and each one triggers a config update. Each config update adds a bit of latency, and if a lot of Varnish instances come online in a short period, latency can shoot up to ~1 s, while Varnish itself can serve a request in ~3 ms.