Support Ollama #76
Comments
I am trying to port Cradle to llama.cpp with Mistral NeMo 13B. So far the main issue is that the LLM provider inside Cradle only supports OpenAI and Claude. llama.cpp's server API is mostly compatible with OpenAI's, but Ollama's is far from compatible, so if you go with Ollama it will definitely cost you more time, since you would have to write a new LLM client.
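Since llama.cpp's server speaks the OpenAI chat-completions protocol, the reuse path mentioned above can be sketched with only the standard library. This is a minimal illustration, not Cradle code; the port (8080, llama-server's usual default) and model name are placeholder assumptions:

```python
import json
from urllib import request

# Assumed local llama.cpp server endpoint; adjust to your own setup.
BASE_URL = "http://localhost:8080/v1"

def build_request(model: str, prompt: str) -> dict:
    # OpenAI-style chat-completions payload, which llama.cpp's server
    # accepts on its /v1/chat/completions route.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    # Requires a running local server; not executed here.
    req = request.Request(
        BASE_URL + "/chat/completions",
        data=json.dumps(build_request(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the payload shape is the same one the existing OpenAI client already emits, only the base URL would need to change for this route.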
LM Studio uses the same API structure as OpenAI, so is it possible to run via that? I have tried `set OPENAI_API_BASE_URL="http://localhost:1234/v1"` and I have also changed the openai_config.json file to add a base URL. Is there anything in the code to change the base URL for OpenAI?
You can modify line 108 of cradle/provider/llm/openai.py. OpenAI's client provides a base_url parameter.
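A less invasive alternative to hard-coding the URL is to read it from the environment with a fallback to the official endpoint. A sketch, reusing the `OPENAI_API_BASE_URL` variable name tried above; `resolve_base_url` is a hypothetical helper, not existing Cradle code:

```python
import os

DEFAULT_BASE_URL = "https://api.openai.com/v1"

def resolve_base_url(env=os.environ) -> str:
    # Fall back to the official OpenAI endpoint when no override is set.
    return env.get("OPENAI_API_BASE_URL", DEFAULT_BASE_URL)

# In cradle/provider/llm/openai.py the client construction would then
# become something like:
#   self.client = OpenAI(api_key=api_key, base_url=resolve_base_url())
```

This way the same build works against the real API, LM Studio, or any other OpenAI-compatible server without editing source.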
I have changed line 108 to `self.client = OpenAI(api_key="lm-studio", base_url="http://localhost:1234/v1")` and now it redirects to LM Studio. Here is the output from runner.py:
Thanks for the amazing work! Of course, for a brand-new beginner, it takes some time to set up the entire environment.
As the README already says, the current best model to use is GPT-4o, which most of us understand. On the other hand, the OpenAI and Claude models have been coded as predefined configs/formats. I wonder: is there any plan to support Ollama for offline LLM experiments? Since Ollama can be called via an API as well, that should be possible and would be a standard way to do it.
Again, I am brand new to both LLMs and agents, so please correct me if I am wrong in any way. I hope Ollama can be supported soon!
Once again, thanks for the great work!
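For reference, a native Ollama client would indeed be close to the OpenAI one in shape: by default Ollama listens on port 11434 and accepts an OpenAI-like messages list on its `/api/chat` route. A minimal standard-library sketch, not part of Cradle; the model name is a placeholder:

```python
import json
from urllib import request

# Ollama's default local endpoint.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model: str, prompt: str) -> dict:
    # stream=False requests a single JSON response instead of a
    # streamed sequence of chunks.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(model: str, prompt: str) -> str:
    # Requires a running local Ollama server; not executed here.
    req = request.Request(
        OLLAMA_URL,
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

The main difference from the existing OpenAI provider would be the response shape (`message.content` rather than `choices[0].message.content`), so a new client class, as suggested earlier in the thread, is the likely route.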