# Infer Result

The `inferResult` function uses an LLM to produce a structured object based on a given schema and prompt.
## API Reference

### `inferResult(params)`

Generates a structured output using an LLM.
#### Parameters

- `params`: An object with the following fields (a TypeScript sketch of this shape follows the list):
    - `llm` (`LanguageModel`, required): The configured language model to use.
    - `prompt` (`string`, required): The task description for the LLM.
    - `result` (`s.Schema`, required): The schema of the object to return as the result.
    - `systemPrompt` (`string`, optional): The system prompt; defaults to an AgentScript-specific prompt if not set.
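For reference, the shape of `params` can be sketched as a TypeScript interface. This is illustrative only: `InferResultParams` is a hypothetical name and the `LanguageModel` alias is a local placeholder for the library's model type; the field names, types, and requiredness come from the list above.

```typescript
import * as s from 'agentscript-ai/schema';

// Placeholder for the library's LanguageModel type (exact export not shown
// in this reference).
type LanguageModel = unknown;

// Hypothetical interface name; mirrors the documented parameter list.
interface InferResultParams {
    llm: LanguageModel;    // the configured language model to use
    prompt: string;        // the task description for the LLM
    result: s.Schema;      // the schema of the expected result
    systemPrompt?: string; // optional; defaults to an AgentScript-specific prompt
}
```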
#### Returns

A promise that resolves to an object of the corresponding type, inferred by the LLM.
#### Example

```typescript
import { AnthropicModel } from 'agentscript-ai/anthropic';
import { inferResult } from 'agentscript-ai/core';
import * as s from 'agentscript-ai/schema';

// Configure the language model.
const llm = AnthropicModel({
    model: 'claude-3-5-sonnet-latest',
    apiKey: process.env.ANTHROPIC_API_KEY,
});

// Infer an object matching the schema from the prompt.
const user = await inferResult({
    llm,
    result: s.object({
        name: s.string(),
        email: s.string(),
    }),
    prompt: 'Get a user with name John and email john@example.com',
});

console.log(user); // Outputs an object with name and email fields.
```
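The optional `systemPrompt` parameter can be used to replace the default AgentScript-specific system prompt. A minimal sketch, continuing from the example above (same `llm`, schema, and imports); the prompt text itself is illustrative:

```typescript
// Override the default system prompt for this call only.
// The systemPrompt string below is an example, not a value required by the API.
const user2 = await inferResult({
    llm,
    result: s.object({
        name: s.string(),
        email: s.string(),
    }),
    prompt: 'Get a user with name John and email john@example.com',
    systemPrompt: 'You are a data extraction assistant. Return only fields defined in the schema.',
});
```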