Input

Type: LLMChatRequest

The data type that RemoteLLMChatNode accepts as input.
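
To illustrate the input side, here is a minimal sketch that builds a request by hand. The messages field and its role/content shape are assumptions for illustration, not confirmed API; see the LLMChatRequest reference for the actual type.

// A minimal sketch, assuming LLMChatRequest wraps an array of chat messages.
// The messages field and its role/content shape are assumptions.
const request = new LLMChatRequest({
  messages: [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'Summarize the latest build log.' }
  ]
});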

Output

Type: LLMChatResponse | Content

The data type that RemoteLLMChatNode outputs. See Content for more details.

Examples

// Using LLM provider configuration
const llmNode = new RemoteLLMChatNode({
  id: 'my-llm-node',
  provider: 'openai',
  modelName: 'gpt-4o-mini',
  stream: true
});

// Using existing LLM component
const llmNodeWithComponent = new RemoteLLMChatNode({
  id: 'my-llm-node',
  llmComponent: existingLLMComponent
});

// Using default settings
const defaultLlmNode = new RemoteLLMChatNode();
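
A node does nothing on its own until it is wired into a graph and executed. The sketch below shows the general shape of that wiring; GraphBuilder, its method names, and graph.start are assumptions modeled on common builder-style APIs, not confirmed signatures.

// Hypothetical wiring sketch: GraphBuilder, addNode/setStartNode/setEndNode,
// and graph.start are assumptions, not confirmed API.
const graph = new GraphBuilder({ id: 'chat-graph' })
  .addNode(llmNode)
  .setStartNode(llmNode)
  .setEndNode(llmNode)
  .build();

// With stream: true, output presumably arrives as incremental Content
// chunks rather than a single LLMChatResponse.
const result = await graph.start(request);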

Constructors

constructor

new RemoteLLMChatNode(props?: RemoteLLMChatNodeProps | RemoteLLMChatNodeWithLLMComponentProps): RemoteLLMChatNode
Creates a new RemoteLLMChatNode instance.

Parameters

props
RemoteLLMChatNodeProps | RemoteLLMChatNodeWithLLMComponentProps
Optional configuration for the chat node.

Returns

RemoteLLMChatNode

Interfaces

RemoteLLMChatNodeProps

Configuration for RemoteLLMChatNode using LLM provider settings.

Properties

textGenerationConfig?: object
Text generation configuration parameters

stream?: boolean
Whether to stream responses

provider?: string
LLM provider (e.g., 'openai', 'anthropic', 'inworld')

modelName?: string
Model name specific to the provider (e.g., 'gpt-4', 'claude-3-5-sonnet-20241022')

responseFormat?: "text" | "json" | "json_schema"
Format of the model's response

messageTemplates?: Camelize<MessageTemplate>[]
Message templates used to build the chat prompt
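
As a sketch of these properties used together: the field names inside textGenerationConfig (temperature, maxNewTokens) are assumptions, since the shape of that object is not documented here.

// Sketch: request JSON output with custom generation settings.
// temperature and maxNewTokens are assumed field names.
const jsonLlmNode = new RemoteLLMChatNode({
  id: 'json-llm-node',
  provider: 'openai',
  modelName: 'gpt-4o-mini',
  responseFormat: 'json',
  stream: false,
  textGenerationConfig: {
    temperature: 0.2,
    maxNewTokens: 512
  }
});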

RemoteLLMChatNodeWithLLMComponentProps

Configuration for RemoteLLMChatNode using an existing LLM component.

Properties

llmComponent: RemoteLLMComponent | AbstractComponent
The existing LLM component to use

textGenerationConfig?: object
Text generation configuration parameters

stream?: boolean
Whether to stream responses

responseFormat: "text" | "json" | "json_schema"
Format of the model's response

messageTemplates?: Camelize<MessageTemplate>[]
Message templates used to build the chat prompt
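
For completeness, a sketch of the component-based configuration; the RemoteLLMComponent constructor arguments are assumptions for illustration. Sharing one component across several nodes presumably avoids duplicating provider configuration.

// Hypothetical: build one component and share it across two nodes.
// The RemoteLLMComponent constructor arguments are assumptions.
const sharedComponent = new RemoteLLMComponent({
  provider: 'anthropic',
  modelName: 'claude-3-5-sonnet-20241022'
});

const draftNode = new RemoteLLMChatNode({
  id: 'draft-node',
  llmComponent: sharedComponent
});
const reviewNode = new RemoteLLMChatNode({
  id: 'review-node',
  llmComponent: sharedComponent,
  stream: true
});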