Set up a connection to Ollama

Ollama is an open-source tool and framework that lets users run, manage, and experiment with large language models (LLMs) locally on their own machines.
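For context, a locally running Ollama server exposes an HTTP API on port 11434 by default. The following is a minimal Python sketch, not part of the connector itself, that sends a single prompt to the local /api/generate endpoint; the model name llama3 is an assumption and should be replaced with any model you have already pulled:

```python
import json
import urllib.request

# Ollama listens on http://localhost:11434 by default; adjust if your server differs.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = json.dumps({
    "model": "llama3",  # hypothetical model name; use any model you have pulled
    "prompt": "Summarize what Ollama does in one sentence.",
    "stream": False,    # return a single JSON object instead of a stream
}).encode("utf-8")

request = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())

print(result["response"])  # the model's generated text
```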

Set up a connection

After you start creating the connection, configure it in the Create connection panel and complete all of the required settings (marked with an asterisk *):

Name your connection *: Enter a clear and distinguishable name. Throughout integrator.io imports and exports, you will have the option to choose this connection, and a unique name will make it easy to find later in the list of connections in your account.

Mode *: Choose your Ollama account mode (either Cloud or On-premise).

Agent * (On-premise only): Choose your Ollama account agent. For more information, see Install the agent.

Instance URL *: Enter the instance URL of your Ollama server. For example, if your local Ollama API endpoint is http://localhost:11434/api, then http://localhost:11434 is your instance URL. You can check that the server responds at this URL with a quick request, as in the sketch after this list.
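Before saving the connection, you can confirm that the instance URL is reachable. This is a minimal sketch, assuming a default local install at http://localhost:11434; it calls Ollama's /api/tags endpoint, which lists the models available on the server:

```python
import json
import urllib.request

# Instance URL as entered in the connection settings (default local install assumed).
INSTANCE_URL = "http://localhost:11434"

# /api/tags lists the models that the Ollama server has pulled locally.
with urllib.request.urlopen(f"{INSTANCE_URL}/api/tags", timeout=10) as response:
    models = json.loads(response.read()).get("models", [])

if models:
    print("Ollama is reachable. Available models:")
    for model in models:
        print(f"  - {model['name']}")
else:
    print("Ollama is reachable, but no models have been pulled yet.")
```

If the request fails, verify that the Ollama server is running and that the port matches the instance URL you entered.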

Tip

This connector documentation describes only the settings shown in the Simple view. For the corresponding HTTP settings, see Custom universal connector documentation.

Additional references