Module custom_llm_model

Structs§

CustomLlmModel

Enums§

MetadataSendModeTrue
This determines whether metadata is sent in requests to the custom provider.

- off will not send any metadata. Payload will look like { messages }.
- variable will send assistant.metadata as a variable on the payload. Payload will look like { messages, metadata }.
- destructured will send assistant.metadata fields directly on the payload. Payload will look like { messages, ...metadata }.

Further, variable and destructured will also send the call, phoneNumber, and customer objects in the payload. Default is variable.
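The three payload shapes above can be sketched as follows. This is an illustrative, dependency-free sketch: the payload builder function and the example message/metadata contents are assumptions for demonstration, not part of the crate's API.

```rust
// Hypothetical sketch of the three metadata send modes. The `payload`
// function and its example contents are illustrative only.
fn payload(mode: &str) -> String {
    // Example request body fragments (assumed contents, not crate API).
    let messages = r#""messages":[{"role":"user","content":"hi"}]"#;
    let metadata = r#""key":"value""#; // stands in for assistant.metadata fields

    match mode {
        // off: metadata is omitted entirely -> { messages }
        "off" => format!("{{{messages}}}"),
        // variable: metadata nested under a "metadata" key -> { messages, metadata }
        "variable" => format!("{{{messages},\"metadata\":{{{metadata}}}}}"),
        // destructured: metadata fields spread onto the top level -> { messages, ...metadata }
        "destructured" => format!("{{{messages},{metadata}}}"),
        other => panic!("unknown mode: {other}"),
    }
}

fn main() {
    for mode in ["off", "variable", "destructured"] {
        println!("{mode}: {}", payload(mode));
    }
}
```

In the variable and destructured modes, the real request would additionally carry the call, phoneNumber, and customer objects, which are omitted here for brevity.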
ProviderTrue
This is the provider that will be used for the model. Any service that is compatible with the OpenAI API, including your own server, can be used.