I use LM-studio a local AI with openAI endpoints: #1089

Open
ludiusvox opened this issue Feb 9, 2025 · 0 comments
Labels
enhancement New feature or request

Comments

@ludiusvox

Feature summary

Support a local LM Studio server (e.g. running Llama 3.2) as the LLM backend.

Feature description

I use LM Studio, a local AI runtime that exposes OpenAI-compatible endpoints. It lets me run an LLM server on my own machine, so this tool could apply for jobs by pointing its OpenAI client at the local machine's endpoint instead of OpenAI's hosted API.

This would cost nothing to run, so I wouldn't need Gemini or ChatGPT Pro, which would save me money. Since this project is open source, I will also look into implementing it myself.
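A minimal sketch of what this could look like, using only the Python standard library. It assumes LM Studio's local server is reachable at `http://localhost:1234/v1` (commonly its default; check the Local Server tab in LM Studio) and that the loaded model is named `llama-3.2` — both are assumptions, not confirmed by this issue.

```python
import json
from urllib import request

# Assumed LM Studio default; verify the actual host/port in the app.
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(messages, model="llama-3.2", base_url=BASE_URL):
    """Build an OpenAI-style chat-completions request aimed at a local server.

    `model` and `base_url` are placeholders for whatever the local
    LM Studio instance actually serves.
    """
    payload = {"model": model, "messages": messages}
    return request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def chat(messages, **kwargs):
    """Send the request and return the assistant's reply text."""
    req = build_chat_request(messages, **kwargs)
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the request shape matches the OpenAI Chat Completions API, swapping between a hosted endpoint and a local one reduces to changing `base_url` (and, for hosted use, adding an `Authorization` header).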

Motivation

No response

Alternatives considered

No response

Additional context

No response

@ludiusvox ludiusvox added the enhancement New feature or request label Feb 9, 2025