
OOM Killed ingress-appgw-deployment #1611

Open
taulantracaj opened this issue May 5, 2024 · 1 comment

taulantracaj commented May 5, 2024

Describe the bug
We are experiencing an OOMKilled ingress-appgw-deployment pod every time we restart the Azure Kubernetes cluster.

To Reproduce
Steps to reproduce the behavior:
Stop the Application Gateway
Stop the Kubernetes cluster
Start both in parallel (a rough sketch of the commands is below)

The system node pool uses 2 nodes of:
Standard F8s v2 (8 vCPUs, 16 GiB memory)
Image: AKSUbuntu-2204gen2containerd-202404.16.0
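A rough sketch of the restart sequence with the Azure CLI (resource group and resource names are placeholders; backgrounding the start commands is how we approximate "in parallel"):

    # stop both resources
    az network application-gateway stop --resource-group <rg> --name <appgw-name>
    az aks stop --resource-group <rg> --name <aks-name>

    # start both roughly in parallel
    az network application-gateway start --resource-group <rg> --name <appgw-name> &
    az aks start --resource-group <rg> --name <aks-name> &
    wait
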
Ingress Controller details

  • Output of kubectl describe pod <ingress controller>. The pod name can be obtained by running helm list (see the sketch after this list).
  • Normal Pulled 2m22s (x6 over 7m24s) kubelet Container image "mcr.microsoft.com/azure-application-gateway/kubernetes-ingress:1.7.4" already present on machine
  • Output of kubectl logs <ingress controller>:
    I0505 09:39:43.639902 1 context.go:171] k8s context run started
    I0505 09:39:43.639931 1 context.go:238] Waiting for initial cache sync
    I0505 09:39:43.640042 1 reflector.go:219] Starting reflector *v1.Secret (30s) from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640062 1 reflector.go:255] Listing and watching *v1.Secret from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640089 1 reflector.go:219] Starting reflector *v1.Pod (30s) from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640103 1 reflector.go:255] Listing and watching *v1.Pod from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640044 1 reflector.go:219] Starting reflector *v1.Service (30s) from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640144 1 reflector.go:219] Starting reflector *v1beta1.AzureApplicationGatewayRewrite (30s) from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640163 1 reflector.go:255] Listing and watching *v1beta1.AzureApplicationGatewayRewrite from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640180 1 reflector.go:255] Listing and watching *v1.Service from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640244 1 reflector.go:219] Starting reflector *v1.Ingress (30s) from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640264 1 reflector.go:255] Listing and watching *v1.Ingress from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640322 1 reflector.go:219] Starting reflector *v1.IngressClass (30s) from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640336 1 reflector.go:255] Listing and watching *v1.IngressClass from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640044 1 reflector.go:219] Starting reflector *v1.Endpoints (30s) from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
    I0505 09:39:43.640351 1 reflector.go:255] Listing and watching *v1.Endpoints from pkg/mod/k8s.io/client-go@v0.20.0-beta.1/tools/cache/reflector.go:167
  • Any Azure support tickets associated with this issue.
    2405050050000002
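
For reference, a minimal sketch of how the OOMKilled state can be confirmed on the controller pod (the kube-system namespace and the app=ingress-appgw label are assumptions for the AKS add-on):

    # find the controller pod (label selector is an assumption)
    kubectl get pods -n kube-system -l app=ingress-appgw
    # Last State in the output shows Reason: OOMKilled after a restart
    kubectl describe pod <ingress-appgw-pod> -n kube-system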

taulantracaj commented May 10, 2024

In addition, when I temporarily change the memory limit to 1Gi (from 600Mi) the pod runs fine, but the deployment is rolled back to its initial state.
In the logs of the pod created with the 1Gi limit, I see the following error before it terminates:

httpserver.go:59] Failed to start API serverhttp: Server closed
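
For completeness, this is roughly how I apply the temporary limit (the deployment name is from the issue title; the kube-system namespace is an assumption for the AKS add-on, which reconciles the change back):

    # temporarily raise the memory limit (the add-on later reverts this)
    kubectl set resources deployment ingress-appgw-deployment -n kube-system --limits=memory=1Gi
    # watch the replacement pod come up with the new limit
    kubectl get pods -n kube-system -w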
