Connecting Custom LLM APIs

Connecting Reor to an OpenAI-like API


With Reor, you can connect to any API that adheres to the OpenAI interface. This means apps like LM Studio, Oobabooga & Ollama are all supported.
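
To see what "adheres to the OpenAI interface" means in practice, here is a minimal sketch of the request such a server accepts. The base URL, port and model name below are placeholders for illustration only; substitute whatever your local server actually uses.

```typescript
// Minimal sketch of an OpenAI-style chat completion request.
// baseUrl is the same value you would paste into Reor's "API URL" field;
// "local-model" is a placeholder model name.
const baseUrl = "http://127.0.0.1:1337/v1";

async function chatOnce(): Promise<void> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // placeholder; use the name your server reports
      messages: [{ role: "user", content: "Hello!" }],
    }),
  });
  const data = await res.json();
  console.log(data.choices?.[0]?.message?.content);
}

chatOnce().catch(console.error);
```

Any server that answers this request in the OpenAI format should work with the steps below.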

Follow these steps:

  1. Open Settings -> "Remote LLM Setup"
  2. Under "API URL", enter the URL of the API you want to connect to. This must be everything that comes before /chat/completions, so for most apps you'd enter something like http://127.0.0.1:1337/v1.
  3. Enter a model name. This is the name of the model you want to use in your app of choice, for example davinci or gpt-3.5-turbo. For some local servers the exact value doesn't matter; Reor just needs a name to refer to the model. If you're unsure which names your server exposes, see the sketch after these steps.
  4. Enter an API key if required. This is only needed if the API you're connecting to requires one.
  5. Set the context length of the model. This restricts the number of tokens sent to your API. 2048 is fine for most cases.
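
If you're unsure what to enter for the API URL or the model name, a quick way to check is to ask the server which models it exposes. This is a rough sketch, assuming your server implements the standard /models endpoint (LM Studio, Ollama and Oobabooga's OpenAI extension generally do); the URL and API key are placeholders.

```typescript
// Sanity check: list the models an OpenAI-compatible server exposes.
// The IDs it returns are candidate values for the model name in step 3.
const baseUrl = "http://127.0.0.1:1337/v1";
const apiKey = ""; // leave empty if your server doesn't need one

async function listModels(): Promise<void> {
  const headers: Record<string, string> = {};
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;

  const res = await fetch(`${baseUrl}/models`, { headers });
  const data = await res.json();
  for (const model of data.data ?? []) {
    console.log(model.id);
  }
}

listModels().catch(console.error);
```

If this request succeeds, the same base URL and one of the returned IDs should work in Reor's Remote LLM Setup.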