Activation Function

An Activation Function introduces non-linearity into a neural network, allowing it to learn complex patterns rather than just linear relationships. Without activation functions, even a hundred-layer network would collapse mathematically into a single linear transformation. Common activation functions include ReLU (the default in most modern networks), sigmoid (for binary outputs), tanh (similar to sigmoid but centered at zero), softmax (for multi-class classification), and newer choices like GELU and SwiGLU used inside large language models. The choice of activation function affects training speed, model accuracy, and the kinds of patterns the network can represent. Although low-level, this AI term shows up in model documentation reviewed during AI governance and AI compliance audits — particularly when models are ported between frameworks (PyTorch vs TensorFlow) or quantized for deployment. Understanding activation functions supports responsible AI engineering and is part of the technical literacy expected of AI risk management teams.
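To make the idea concrete, here is a minimal NumPy sketch (illustrative only; the weight matrices and function names are hypothetical, not from any particular framework). It defines a few common activations and shows why stacking linear layers without one collapses into a single linear transformation:

```python
import numpy as np

# Toy implementations of common activation functions (illustrative only).
def relu(x):
    # Zeroes out negative inputs; the default in most modern networks.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes values into (0, 1); often used for binary outputs.
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Turns a score vector into probabilities for multi-class classification.
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

# Hypothetical weights for a tiny two-layer network.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(2, 3))
x = rng.normal(size=4)

# Without an activation, two linear layers collapse into one linear map:
# W2 @ (W1 @ x) equals (W2 @ W1) @ x, so extra depth adds no expressive power.
print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))  # True

# Inserting a non-linearity between the layers breaks that collapse.
y = W2 @ relu(W1 @ x)
print(y)
```

The same functions exist as built-ins in PyTorch and TensorFlow; porting a model between frameworks means confirming that each layer's activation maps to an equivalent implementation.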

Centralpoint Activates Enterprise-Wide AI Governance: Oxcyon's Centralpoint AI Governance Platform isn't tied to any one architecture or activation function. It is model-agnostic across ChatGPT, Gemini, Llama, and embedded options, meters all LLM use, and keeps every prompt and skill on-premise. Add chatbots to any site or portal with one line of JavaScript.


Related Keywords:
Activation Function