Is your feature request related to a problem? Please describe.

With the default k8s client we sometimes get timeout errors like this:

E0314 08:24:19.610919 1 reflector.go:147] k8s.io/[email protected]/tools/cache/reflector.go:229: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://172.20.0.1:443/api/v1/pods?fieldSelector=spec.nodeName%3Dip-10-0-17-162.eu-north-1.compute.internal&resourceVersion=4141611754": dial tcp 172.20.0.1:443: i/o timeout

Describe the solution you'd like

It would be nice to be able to configure the k8s client with a larger timeout.
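For reference, the underlying knob already exists in client-go: rest.Config has a Timeout field that bounds each request the client makes. A minimal sketch, assuming an in-cluster setup like the collector's, of setting it before building the clientset (the 30s value is only an illustration, not a suggested default):

```go
package main

import (
	"time"

	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	// In-cluster config, as used when the collector runs inside the cluster.
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}

	// rest.Config.Timeout bounds each request made by the client; zero means
	// no client-side timeout. The feature request is essentially about
	// surfacing this kind of knob in the resolver configuration.
	cfg.Timeout = 30 * time.Second // illustrative value only

	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// clientset would then back the informer/reflector that produced the
	// list/watch call shown in the log above.
	_ = clientset
}
```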
Pinging code owners for exporter/loadbalancing: @jpkrohling. See Adding Labels via Comments if you do not have permissions to add labels yourself.
Makes sense to me, especially since other resolvers are able to have their own timeout specified.
I investigated; it looks pretty straightforward to implement this.
Even though I don't think I would increase the timeout in this case, I agree that users should be able to set a timeout.
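A minimal sketch of how such an option could be surfaced, assuming a hypothetical timeout field on the k8s resolver's settings struct (names, tags, and the wiring here are illustrative, not the actual change from the PR referenced below):

```go
package k8sresolver // hypothetical package name, for illustration only

import (
	"time"

	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

// k8sResolverSettings mirrors the shape of a resolver config section; the
// Timeout field is the assumed new knob.
type k8sResolverSettings struct {
	Service string        `mapstructure:"service"`
	Ports   []int32       `mapstructure:"ports"`
	Timeout time.Duration `mapstructure:"timeout"`
}

// newInClusterClient builds the Kubernetes client, applying the configured
// timeout when one is set; zero keeps client-go's default behaviour of no
// client-side request timeout.
func newInClusterClient(set k8sResolverSettings) (kubernetes.Interface, error) {
	restCfg, err := rest.InClusterConfig()
	if err != nil {
		return nil, err
	}
	if set.Timeout > 0 {
		restCfg.Timeout = set.Timeout
	}
	return kubernetes.NewForConfig(restCfg)
}
```

With a struct like this, a timeout entry under the resolver's k8s block would map onto the Timeout field through the collector's usual mapstructure-based config unmarshalling.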
[loadbalancingexporter] Support the timeout period of k8s resolver list watch can be configured (#31904), commit e360afe by JaredTan95
Link to tracking Issue: close #31757
Signed-off-by: Jared Tan <[email protected]>