Ability to use local AI - ollama, LMStudio, etc.
complete
Serban
It would be great to be able to use other AI frameworks (ollama, LMStudio, etc) and models (could be self-hosted) in order to accomplish AI tasks.
This could build on the current AI plugin, which could offer the possibility to select/configure other API endpoints (maybe starting with OpenAI-API-compatible endpoints).
Focus here is more the privacy of the data being sent to the LLM rather than complex functionality or web searches.
Many thanks!
Serban
Eduard Metzger
Available now, see: https://help.noteplan.co/article/268-how-to-use-other-ai-models-claude-etc
Serban
Eduard, many thanks for getting to this!
Short answer: it is not about the financial side or economizing, it is about not pushing sensitive data to external LLMs.
If one has a vault of sensitive notes, one would have an issue with the note content being sent to OpenAI for processing.
Simple summarization, translation, suggesting links between notes, potential follow-ups, etc. can be done with low-powered LLMs without sending the note contents to any provider. And since many local LLMs expose an OpenAI-compatible API, it might ease the development burden.
Thanks!
Edit: PS - the models can also be run with Ollama (which has its own API), LM Studio, or Jan, for example, so there is no need to integrate the engine into NotePlan itself.
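To illustrate the point about OpenAI-compatible APIs: the same chat-completions request shape can target OpenAI or a self-hosted server by swapping only the base URL and model name. This is a minimal sketch, assuming Ollama's OpenAI-compatible endpoint at `http://localhost:11434/v1`; the model names here are examples, not NotePlan settings.

```python
import json

def build_chat_request(base_url, model, note_text):
    """Return the endpoint URL and JSON body for a note-summarization call.

    Works unchanged for OpenAI and for OpenAI-compatible local servers
    (Ollama, LM Studio, Jan), since they share the /chat/completions shape.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "Summarize the user's note briefly."},
            {"role": "user", "content": note_text},
        ],
    }
    return url, json.dumps(payload)

# Same request, two different providers -- only the base URL and model differ:
cloud_url, cloud_body = build_chat_request(
    "https://api.openai.com/v1", "gpt-4o-mini", "My private note"
)
local_url, local_body = build_chat_request(
    "http://localhost:11434/v1", "llama3.2", "My private note"
)
```

With the local URL, the note text never leaves the machine, which is the privacy property being asked for here.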
Eduard Metzger
Hi Serban, thanks for sharing this! Are you using AI tools heavily already so that local LLM integration would pay off?