Force LLM responses into valid JSON format. Supports json_object (free-form JSON) and json_schema (schema-validated JSON).

Basic JSON Mode

const agent = new Agent({
  name: 'Data Extractor',
  instructions: 'Extract structured data from text.',
  model: openai('gpt-4o'),
  modelConfig: {
    responseFormat: { type: 'json_object' }
  }
});

Schema-Validated JSON

Force the response to match a specific schema:
const agent = new Agent({
  name: 'Profile Extractor',
  instructions: 'Extract person profile from text.',
  model: openai('gpt-4o'),
  modelConfig: {
    responseFormat: {
      type: 'json_schema',
      json_schema: {
        type: 'object',
        properties: {
          name: { type: 'string' },
          age: { type: 'integer' },
          email: { type: 'string' },
        },
        required: ['name', 'age', 'email'],
        additionalProperties: false,
      }
    }
  }
});

const result = await agent.process({ message: 'John Smith, 34, john@example.com' });
const profile = JSON.parse(result.message);
// { name: 'John Smith', age: 34, email: 'john@example.com' }

Provider Support

| Provider  | json_object   | json_schema          | How                                |
|-----------|---------------|----------------------|------------------------------------|
| OpenAI    | Native        | Native               | response_format API parameter      |
| Gemini    | Native        | Native               | responseMimeType + responseSchema  |
| Anthropic | Not supported | Native (Claude 4.5+) | output_config.format               |
| Bedrock   | Not supported | Native (Claude 4.5+) | output_config in payload           |
| Groq      | Native        | Not supported        | response_format (OpenAI-compatible)|
| xAI       | Native        | Native               | response_format (OpenAI-compatible)|
When json_object is not natively supported (Anthropic, Bedrock), add JSON instructions to your system prompt for best results.

With LLM Standalone

const extractor = LLM.openai('gpt-4o', {
  responseFormat: { type: 'json_object' }
});

const result = await extractor.generate('List 3 colors with hex codes', {
  system: 'Respond with valid JSON only.'
});
const data = JSON.parse(result.text);

Next Steps

Reasoning

Enable chain-of-thought thinking

Server-Side Tools

Provider-native web search and code execution