
Function Calling

Function calling is the structured tool-use capability built into modern LLMs: instead of free-form text, the model emits a JSON object specifying a function name and arguments, enabling reliable invocation of external code. OpenAI introduced function calling in June 2023, followed by Anthropic's Claude (tool use), Google Gemini (function calling), Mistral, and most open-source LLMs through specialized fine-tuning.

The function schema (function name, parameter types, descriptions) is provided to the model alongside the system prompt, typically via a dedicated API parameter, and the model decides when to call which function based on the conversation. This enables agentic workflows such as search, calculation, database queries, API integration, and code execution, all expressed through a uniform interface. The schema is typically JSON Schema, which integrates function calling tightly with type validators in Python (Pydantic), TypeScript (Zod), and similar libraries. AI governance teams document the available functions and their scopes as part of agentic AI compliance lineage. Function calling has effectively replaced earlier brittle text-parsing tool-use patterns and is the foundation of essentially every modern agent framework.
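The loop described above can be sketched in a few lines of Python. This is a minimal illustration, not any provider's actual SDK: the tool schema, the `get_weather` function, and the dispatch helper are all hypothetical names chosen for the example. The schema dict mirrors the JSON Schema shape most providers accept, and the model's "output" is simulated as a JSON string of the form the text describes (function name plus arguments).

```python
import json

# Hypothetical tool schema in JSON Schema form, as typically supplied to the model.
TOOLS = [
    {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    }
]

# Local implementations keyed by function name; the model only names the
# function to call -- it never executes code itself.
def get_weather(city: str, unit: str = "celsius") -> str:
    return f"22 degrees {unit} in {city}"  # stubbed result for the sketch

REGISTRY = {"get_weather": get_weather}

def dispatch(model_output: str) -> str:
    """Parse the model's JSON function call and invoke the matching function."""
    call = json.loads(model_output)          # structured output, not free text
    fn = REGISTRY[call["name"]]              # route by function name
    return fn(**call["arguments"])           # arguments map onto parameters

# Simulated model response: a name-plus-arguments object instead of prose.
result = dispatch('{"name": "get_weather", "arguments": {"city": "Oslo"}}')
```

In a real agent loop, the dispatch result would be appended to the conversation as a tool message and sent back to the model, which then either calls another function or produces a final answer; validators like Pydantic or Zod would replace the bare `json.loads` to enforce the schema before invocation.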

Function-calling agents through Centralpoint: Centralpoint orchestrates function-calling agents across OpenAI, Anthropic, Gemini, Llama, and other providers in a model-agnostic stack with structured tool inventories. Tokens are metered per skill and function, prompts stay local, and tool-using chatbots deploy through one line of JavaScript on any portal.


Related Keywords:
Function Calling