
Loki scaling #347

Merged · ITViking merged 1 commit into main from loki-scaling on Jun 12, 2024
Conversation

hypesystem
Contributor

What does this PR do?

Allows Loki backends to scale horizontally (see the config sketch after this description).

Should this be tested by the reviewer and how?

Does this look like a reasonable approach? Read the commit message.

What are the relevant tickets?

DDFDRIFT-96: Identify and scale nodes now that we have a higher load on the system
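
A minimal sketch of what the horizontal scaling could look like, assuming the grafana/loki Helm chart in its simple scalable deployment mode; the key paths and replica counts below are illustrative assumptions, not the contents of this PR's commit:

```yaml
# Hypothetical Helm values override for the grafana/loki chart
# (simple scalable mode). Key names and counts are assumptions;
# the repository's actual values file and chart may differ.
backend:
  # Run several backend replicas instead of a single instance so
  # bursty log traffic is spread across pods.
  replicas: 3
read:
  replicas: 3
write:
  replicas: 3
```

Raising replica counts spreads load across pods without reserving capacity up front, which matches the bursty usage pattern described below.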

hypesystem requested review from achton and ITViking on May 29, 2024 11:11
hypesystem changed the base branch from main to fix-loki-retention on May 29, 2024 11:11
Base automatically changed from fix-loki-retention to main May 29, 2024 11:40
@ITViking
Contributor

I think this sounds like a good solution to the problem. It would be a shame if Loki went down, and we already know that builds tend to consume quite a lot of resources, so the risk is real.

The Loki backends have no specified resource allocation, which means their Quality of Service level is [BestEffort](https://kubernetes.io/docs/concepts/workloads/pods/pod-qos/#besteffort), and they use whatever resources are available on a spare node. We could consider increasing the minimum request, but because the use of Loki is so bursty, I think it may make sense to just scale by adding more instances when we add more load?
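
For contrast, a sketch of the resource-request alternative mentioned above, again assuming the grafana/loki Helm chart; setting requests would move the backend pods from the BestEffort QoS class to Burstable, at the cost of reserving capacity that Loki's bursty usage may leave idle. The figures are placeholders, not values from this repository:

```yaml
# Hypothetical values override: give the backend pods explicit
# requests (and limits) instead of relying on BestEffort scheduling.
# Figures are illustrative placeholders only.
backend:
  resources:
    requests:
      cpu: 100m
      memory: 256Mi
    limits:
      memory: 512Mi
```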
ITViking merged commit 99dc0e5 into main on Jun 12, 2024
2 checks passed
ITViking deleted the loki-scaling branch on June 12, 2024 13:29