Function Calling
Function Calling lets a language model invoke structured tools or APIs with typed arguments, returning the function's output back into the conversation. The pattern was popularized by OpenAI's June 2023 release and is now supported across major providers (Anthropic's tool use, Google's Gemini function calling, Mistral's function calling, and open models via OpenAI-compatible APIs). Developers define a JSON schema for each available function (name, description, parameters); the model decides when to call which function with which arguments; the application executes the call; and the result is fed back to the model. Examples include retrieving a weather forecast for a given city, looking up a customer record by ID, or executing a database query. Function calling is the technical backbone of modern AI agents and copilots. AI governance, AI compliance, and AI risk management programs treat function-calling permissions as sensitive privileges to be reviewed, scoped, and audited as part of responsible AI deployment.
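The loop described above can be sketched in a few lines of Python. This is a minimal, self-contained illustration: the model's decision is stubbed out with a placeholder function, and the `get_weather` tool and its schema are hypothetical, not a real provider API.

```python
import json

# Hypothetical tool: look up a weather forecast for a city (illustrative only).
def get_weather(city: str) -> dict:
    # A real application would call a weather API; here we return canned data.
    return {"city": city, "forecast": "sunny", "high_c": 24}

# The JSON schema the model sees for each available function
# (name, description, parameters).
TOOLS = [{
    "name": "get_weather",
    "description": "Retrieve the weather forecast for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string", "description": "City name"}},
        "required": ["city"],
    },
}]

# Stub standing in for the model: in practice the provider's API returns a
# structured tool call (function name + JSON-encoded arguments) when the
# model decides a function is needed.
def fake_model_turn(user_message: str) -> dict:
    return {"name": "get_weather", "arguments": json.dumps({"city": "Oslo"})}

def run_turn(user_message: str) -> dict:
    call = fake_model_turn(user_message)              # 1. model picks a function
    fn = {"get_weather": get_weather}[call["name"]]   # 2. app resolves the name
    args = json.loads(call["arguments"])              # 3. parse typed arguments
    result = fn(**args)                               # 4. execute the call
    return result                                     # 5. result returns to the model

print(run_turn("What's the weather in Oslo?"))
```

In a real deployment, steps 1 and 5 are API round-trips to the model provider, and step 2 is where governance controls apply: the dispatch table defines exactly which functions the model is permitted to invoke.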
Centralpoint Audits Every Function the AI Calls: Oxcyon's Centralpoint AI Governance Platform records function-calling activity across ChatGPT, Gemini, Llama, and embedded models. Centralpoint meters consumption, keeps prompts and skills on-premise, and embeds function-aware chatbots into your portals via a single line of JavaScript.
Related Keywords:
Function Calling