
Increase default memory for agent #7789

Merged: 3 commits into elastic:main on May 10, 2024

Conversation

kvalliyurnatt (Contributor)

In our e2e tests we noticed test failures related to Agent and Fleet. While debugging, we found the Agent pods were using about 366Mi of memory against a limit of 350Mi, which caused the processes to be OOM-killed. This increases the default memory allocation for Agent while we investigate the increase in memory usage with the latest release.
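
For context, a minimal sketch of how such an operator-level default could be expressed with the Kubernetes API types is shown below; the package name, variable name, and the 400Mi figure are illustrative assumptions, not the exact code or value changed in this PR:

```go
// Hypothetical sketch of a default resource bump for the Agent container.
// The previous 350Mi limit sat below the ~366Mi observed in e2e tests, so both
// the memory request and limit are raised here (400Mi is an assumed value).
package agent

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
)

// DefaultAgentResources is a hypothetical default applied when the user does
// not specify resources for the Agent container.
var DefaultAgentResources = corev1.ResourceRequirements{
	Requests: corev1.ResourceList{
		corev1.ResourceMemory: resource.MustParse("400Mi"),
	},
	Limits: corev1.ResourceList{
		corev1.ResourceMemory: resource.MustParse("400Mi"),
	},
}
```

Regardless of the built-in default, users who need a different allocation can still override the Agent container's resources through the podTemplate in the Agent spec, which takes precedence over the operator's defaults.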

@kvalliyurnatt added the >bug (Something isn't working) and v2.13.0 labels on May 9, 2024

@naemono (Contributor) left a comment


seems a reasonable increase considering what we're seeing.

@kvalliyurnatt enabled auto-merge (squash) May 10, 2024 13:09
@kvalliyurnatt merged commit f0759a7 into elastic:main May 10, 2024
3 checks passed
kvalliyurnatt added a commit to kvalliyurnatt/cloud-on-k8s that referenced this pull request May 10, 2024
* increase default memory for agent

(cherry picked from commit f0759a7)
@kvalliyurnatt (Contributor, Author)

💚 All backports created successfully

Status: success · Branch: 2.13

Questions? Please refer to the Backport tool documentation.

kvalliyurnatt added a commit that referenced this pull request May 10, 2024
* increase default memory for agent

(cherry picked from commit f0759a7)
Labels
>bug (Something isn't working) · v2.13.0
2 participants