Can I use my own LLM or does it have to be one of the out-of-the-box options you provide?

We’ve got an in-house LLM we want to plug in here.

Hey GGLighthouse,

Yes! The LLM node accepts custom endpoints:

const llmNode = runtime.nodes.llm({
  provider: 'custom',
  endpoint: 'https://your-llm-api.com/v1/chat',
  model: 'your-model-name'
});

Just match the OpenAI-compatible request/response format and you’re good 🙂
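In case it helps, here's a rough sketch of what "OpenAI-compatible" means in practice: your endpoint accepts a POST body with `model` and a `messages` array, and returns a `choices` array whose entries carry an assistant `message`. The field values below are illustrative placeholders, not anything your API has to echo back:

const request = {
  model: 'your-model-name',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ]
};

// Your endpoint should respond with roughly this shape:
const response = {
  id: 'chatcmpl-123',            // any unique id
  object: 'chat.completion',
  choices: [
    {
      index: 0,
      message: { role: 'assistant', content: 'Hi there!' },
      finish_reason: 'stop'
    }
  ]
};

// The node reads the reply from choices[0].message.content
console.log(response.choices[0].message.content);

If your in-house model already sits behind an OpenAI-style chat completions route, you likely won't need any translation layer at all.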