🔗 Cannoli
Create and run LLM scripts on the Obsidian Canvas.
by blindmansion
Changelog

Ollama

Cannoli now has support for running local LLMs with Ollama!

To switch to local LLMs, change the "AI provider" top-level setting to Ollama, and make sure the Ollama URL reflects your setup (the default usually works).
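If you're not sure what to enter for the URL, a quick sanity check (assuming a stock local install on the default port) is to hit the server directly:

```sh
# The Ollama server answers on port 11434 by default.
curl http://localhost:11434
# Expected reply: "Ollama is running"
```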

You'll also need to set the OLLAMA_ORIGINS environment variable to "*" so that requests from Obsidian desktop can reach the Ollama server. Reference this document to configure the variable on each operating system; on macOS, for example, run launchctl setenv OLLAMA_ORIGINS "*" in your terminal and then restart Ollama.
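As a rough sketch of what that looks like per platform (the macOS command is the one above; the Linux and Windows variants follow the Ollama docs and may need adjusting for your setup):

```sh
# macOS:
launchctl setenv OLLAMA_ORIGINS "*"
# ...then quit and reopen the Ollama app.

# Linux with systemd:
sudo systemctl edit ollama.service
#   add under [Service]:  Environment="OLLAMA_ORIGINS=*"
sudo systemctl daemon-reload && sudo systemctl restart ollama

# Windows (one option; you can also set it in the environment variables UI):
setx OLLAMA_ORIGINS "*"
# ...then quit Ollama from the system tray and start it again.
```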

You can change the default model in the settings, and define the model per node in cannolis themselves using config arrows as usual. Note that Ollama has to load a model each time the model changes, so using several models in one cannoli will take longer.
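For context, a per-node model override ultimately just changes the model field sent with that node's request. Here is a minimal TypeScript sketch against Ollama's /api/chat endpoint (the URL and model names are placeholders, and this is not Cannoli's internal code):

```typescript
// Every chat request names its model, so Ollama loads that model on first use.
async function ollamaChat(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model, // e.g. "llama3" on one node, "mistral" on another
      messages: [{ role: "user", content: prompt }],
      stream: false,
    }),
  });
  const data = await res.json();
  return data.message.content;
}

// Two different models in one cannoli means two model loads the first time each runs:
// await ollamaChat("llama3", "Summarize this note.");
// await ollamaChat("mistral", "List three follow-up questions.");
```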

Function calling is not implemented yet, so Choice arrows and Field arrows currently don't work with Ollama.

All OpenAI chat models available

All OpenAI chat models are now available in Cannoli, as long as you provide the correct model name string in the settings or in a config arrow. Not all models have accurate price numbers yet, but you no longer have to wait on us to update the list to use new OpenAI models.
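To illustrate why only the name string matters, here is a rough TypeScript sketch of a chat completion call where the model string is passed straight through to the API (again not Cannoli's internal code; the model name is just an example):

```typescript
// Whatever model name string you supply is forwarded as-is to OpenAI.
async function openaiChat(model: string, prompt: string, apiKey: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model, // e.g. "gpt-4o-mini" — any chat model name OpenAI currently lists
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```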