Input
Type: LLMChatRequest
The data type that LLMChatNode accepts as input.
Output
Type: LLMChatResponse | Content
The data type that LLMChatNode outputs. See Content for more details.
Examples
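A minimal construction sketch. The import path is an assumption and may differ between SDK versions; the configuration fields come from RemoteLLMChatNodeProps, documented below.

```typescript
// Assumed import path; adjust to match your SDK version.
import { RemoteLLMChatNode } from '@inworld/runtime/graph';

// Create a chat node configured directly with provider settings.
const chatNode = new RemoteLLMChatNode({
  provider: 'openai',     // LLM provider
  modelName: 'gpt-4',     // provider-specific model name
  stream: true,           // stream the response as it is generated
  responseFormat: 'text', // plain-text output
});
```

The node then accepts an LLMChatRequest as input and produces an LLMChatResponse | Content as output.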
Constructors
constructor
Parameters
props?: RemoteLLMChatNodeProps | RemoteLLMChatNodeWithLLMComponentProps
Optional configuration for the chat node.
Returns
RemoteLLMChatNode
Interfaces
RemoteLLMChatNodeProps
Configuration for RemoteLLMChatNode using LLM provider settings.
Properties
textGenerationConfig?: object
Text generation configuration parameters
stream?: boolean
Whether to stream responses
provider?: string
LLM provider (e.g., 'openai', 'anthropic', 'inworld')
modelName?: string
Model name specific to the provider (e.g., 'gpt-4', 'claude-3-5-sonnet-20241022')
responseFormat?: "text" | "json" | "json_schema"
Format of the response returned by the model
messageTemplates?: Camelize<MessageTemplate>[]
Message templates used to construct the chat prompt
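A hedged sketch of these props. The keys inside textGenerationConfig shown here are assumptions, not confirmed by this reference:

```typescript
import { RemoteLLMChatNode } from '@inworld/runtime/graph'; // assumed path

// textGenerationConfig keys below (temperature, maxTokens) are
// hypothetical; consult the SDK's typings for the exact shape.
const node = new RemoteLLMChatNode({
  provider: 'anthropic',
  modelName: 'claude-3-5-sonnet-20241022',
  stream: false,
  responseFormat: 'json', // also accepts 'text' or 'json_schema'
  textGenerationConfig: {
    temperature: 0.7,
    maxTokens: 512,
  },
});
```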
RemoteLLMChatNodeWithLLMComponentProps
Configuration for RemoteLLMChatNode using an existing LLM component.
Properties
llmComponent: RemoteLLMComponent | AbstractComponent
The existing LLM component to use
textGenerationConfig?: object
Text generation configuration parameters
stream?: boolean
Whether to stream responses
responseFormat: "text" | "json" | "json_schema"
Format of the response returned by the model
messageTemplates?: Camelize<MessageTemplate>[]
Message templates used to construct the chat prompt
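A sketch of the component-based variant, useful when several nodes should share one LLM component. How the RemoteLLMComponent instance is obtained is application-specific; getSharedLLMComponent below is a hypothetical helper:

```typescript
import { RemoteLLMChatNode, RemoteLLMComponent } from '@inworld/runtime/graph'; // assumed path

// Hypothetical helper standing in for however your application creates
// or retrieves its shared RemoteLLMComponent instance.
declare function getSharedLLMComponent(): RemoteLLMComponent;

const chatNode = new RemoteLLMChatNode({
  llmComponent: getSharedLLMComponent(), // route requests through the shared component
  stream: true,
  responseFormat: 'text',
});
```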