I have Ollama running on another machine and can't get this to work. I've read the docs, which are clear as mud, and tried various syntaxes in my config.yaml, but the model selector shows nothing. Trying to add a model just keeps offering third-party services or setups local to this machine.

How do you add an Ollama instance on your network?
You'll need to configure Ollama to accept network connections first - check out the official docs here: https://docs.ollama.com/faq (see "How do I configure Ollama server?"). Once that's set up, here's the models entry for your config.yaml:

```yaml
models:
  - name: Llama 3.1
    provider: ollama
    model: llama3.1:latest
    apiBase: http://192.168.1.100:11434 # Replace with your Ollama machine's IP
```
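In case it helps, that FAQ entry boils down to setting the `OLLAMA_HOST` environment variable on the machine that runs Ollama, so the server binds to 0.0.0.0 instead of the default 127.0.0.1. A minimal sketch for a Linux install managed by systemd (macOS and Windows set the variable via `launchctl setenv` or system environment variables instead; see the FAQ for those):

```bash
# On the machine running Ollama (Linux, systemd install).
# Open an override file for the Ollama service:
sudo systemctl edit ollama.service

# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Apply the change and restart so Ollama listens on all interfaces:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```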
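Before touching config.yaml, it's also worth verifying the remote instance is actually reachable from the machine where your editor runs. This assumes the same 192.168.1.100 address as the example config; `/api/tags` is Ollama's endpoint for listing pulled models:

```bash
# Should return JSON listing the models pulled on the remote machine
curl http://192.168.1.100:11434/api/tags
```

If that curl fails, the problem is networking (a firewall blocking port 11434, or the `OLLAMA_HOST` change not applied), not your config.yaml syntax.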