Base Node

Node inspector for an LLM node.

Parameters

Model Name
string
default:"openai/gpt-5-mini"
required
The model identifier to use. It must start with the provider name (for example, openai/ or anthropic/) unless a custom base URL is used.
Temperature
float | null
default:"null"
Controls the randomness of the model's output; higher values produce more varied responses.
Base URL
string | null
default:"null"
Custom base URL for the model provider.
API Version
string | null
default:"null"
Provider-specific API version to send with the request (for example, Azure OpenAI).
Reasoning Effort
string | null
default:"null"
Sets the model’s reasoning effort when supported. Valid values: none, minimal, low, medium, high, xhigh, default.
Extra Arguments
json | null
default:"null"
Extra keyword arguments passed directly to the underlying model provider.
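As an illustration, Extra Arguments can be used to forward provider-specific sampling settings alongside the parameters above. The keys shown here (top_p, max_tokens) are common provider options used as examples, not a list of supported fields; consult your provider's API documentation for valid keys.

```json
{
  "top_p": 0.9,
  "max_tokens": 1024
}
```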

Credentials

LLM Credential
LLM Credential
required
Select an LLM Credential for the provider used by Model Name (for example, OpenAI, Anthropic, Gemini, or xAI).

Inputs

This node has no inputs.

Outputs

LLM Config
json
Resolved model settings used by the Agent node.
Field: Description
model_name: Model identifier used by the provider.
temperature: Temperature value applied to the request.
base_url: Custom base URL, if supplied.
api_key: API key sourced from the selected credential.
api_version: API version passed to the provider, if supplied.
reasoning_effort: Reasoning effort level sent to the provider, if supported.
organization: Organization or account identifier from the credential, if set.
extra_args: Arbitrary provider-specific arguments, if provided.
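A sketch of what the resolved LLM Config output might look like, assuming the default model, a temperature override, and no custom base URL or extra arguments. The specific values are hypothetical; the keys match the fields listed above, and unset optional fields are shown as null.

```json
{
  "model_name": "openai/gpt-5-mini",
  "temperature": 0.7,
  "base_url": null,
  "api_key": "sk-...",
  "api_version": null,
  "reasoning_effort": "medium",
  "organization": null,
  "extra_args": null
}
```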