Add CLI arg to llama-run to adjust the number of threads used #12370


Merged: 1 commit merged into master on Mar 14, 2025

Conversation

@ericcurtin (Collaborator) commented Mar 13, 2025

We default to 4 threads; sometimes we want to adjust this manually.

@ericcurtin ericcurtin marked this pull request as draft March 13, 2025 16:24
@ericcurtin ericcurtin force-pushed the llama-run-n-threads branch from 6984c5e to c71410e Compare March 13, 2025 22:36
@ericcurtin ericcurtin marked this pull request as ready for review March 13, 2025 22:37
We default to 4 threads; sometimes we want to adjust this manually.

Signed-off-by: Eric Curtin <[email protected]>
@ericcurtin ericcurtin force-pushed the llama-run-n-threads branch from c71410e to 638e177 Compare March 14, 2025 15:54
@ericcurtin ericcurtin merged commit 9f2250b into master Mar 14, 2025
46 of 47 checks passed
@ericcurtin ericcurtin deleted the llama-run-n-threads branch March 14, 2025 16:41
jpohhhh pushed a commit to Telosnex/llama.cpp that referenced this pull request Mar 14, 2025
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Mar 19, 2025