Function calling is the ability of a large language model (LLM) to recognize when it should call an external function and to generate the structured parameters needed for that call.
What is function calling?
Function calling is a capability of large language models (LLMs) that allows them to decide when to call an external function or tool and to generate the necessary arguments in a structured format (typically JSON). It is the fundamental building block that transforms a chatbot into an AI agent capable of taking action.
How does it work?
- Declaration: You describe available functions to the model (name, description, parameters)
- Detection: The LLM analyzes the user's request and decides whether a function call is needed
- Generation: The model produces a structured call with the right parameters
- Execution: Your code executes the function and returns the result to the model
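The four steps above can be sketched in a few lines of Python. The model's reply is hard-coded here for illustration; in production it would come from an LLM API, and the exact schema and wire format vary by provider (the shape below follows a common JSON-schema convention):

```python
import json

# Step 1 (declaration): describe the available function to the model.
tools = [{
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

# The real implementation; only your code executes it, never the model.
def get_weather(city: str) -> str:
    # Stubbed: a real tool would query a weather API here.
    return f"18°C and sunny in {city}"

registry = {"get_weather": get_weather}

# Steps 2-3 (detection + generation): the LLM would return this structured
# call after reading the user's request; we hard-code a plausible one.
model_output = '{"name": "get_weather", "arguments": {"city": "Paris"}}'

# Step 4 (execution): parse the call, run the function, capture the result.
call = json.loads(model_output)
result = registry[call["name"]](**call["arguments"])
print(result)  # the result is sent back to the model to phrase the final answer
```

Note that the model never runs code itself: it only emits the structured call, and your program decides whether and how to execute it.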
Concrete examples
"What's the weather in Paris?" → The LLM calls get_weather(city="Paris") rather than making up an answer. "Launch an audit for Acme Corp" → The LLM calls launch_audit(client_name="Acme Corp").
Function calling and AILabsAudit
Through the MCP protocol, AILabsAudit exposes its features as "tools" that LLMs can call: ChatGPT, Claude, or Gemini can launch audits, check results, and generate reports directly from the conversation.
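Conceptually, an MCP-style tool server answers two kinds of requests: "list the available tools" and "call a tool by name". The sketch below illustrates that idea in plain Python; the tool names, handlers, and return shapes are invented for illustration and do not reflect AILabsAudit's actual API or the MCP wire protocol:

```python
# Hypothetical tool table: name -> description + handler (stubbed bodies).
TOOLS = {
    "launch_audit": {
        "description": "Start an audit for a client",
        "handler": lambda client_name: {"status": "started", "client": client_name},
    },
    "get_results": {
        "description": "Fetch the results of the latest audit",
        "handler": lambda client_name: {"client": client_name, "findings": []},
    },
}

def list_tools() -> list[dict]:
    # What the LLM sees when it connects: tool names and descriptions.
    return [{"name": n, "description": t["description"]} for n, t in TOOLS.items()]

def call_tool(name: str, arguments: dict) -> dict:
    # What runs when the LLM decides to use one of the tools.
    return TOOLS[name]["handler"](**arguments)
```

Because the declarations are discovered at connection time, any MCP-capable client can drive the same tools without custom integration code per model.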