Dropout
Dropout is a regularization technique that randomly disables (or "drops out") a fraction of neurons during training to reduce overfitting. Introduced by Srivastava, Hinton, and colleagues in 2014, dropout forces the network to learn redundant, robust representations that do not depend on any single neuron. Typical dropout rates range from 0.1 to 0.5 depending on the layer type. During inference all neurons are active; in the now-standard "inverted dropout" formulation, activations are scaled up during training by 1/(1 - rate), so no rescaling is needed at inference time. Variants include spatial dropout for convolutional layers, recurrent dropout for RNNs, and DropPath (stochastic depth) used in modern vision transformers. Dropout is a standard tool in PyTorch (nn.Dropout) and TensorFlow/Keras, and it remains widely used alongside other regularization methods such as weight decay and data augmentation. Though technical, the term also appears in model cards reviewed during AI governance and compliance audits, where it serves as evidence of disciplined engineering and risk management.
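The inverted-dropout scheme described above can be sketched in a few lines. This is an illustrative NumPy implementation, not the internals of any particular framework; the function name and signature are chosen here for clarity:

```python
import numpy as np

def dropout(x, rate=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, each unit is zeroed with probability `rate`,
    and the survivors are scaled by 1 / (1 - rate) so the expected
    activation is unchanged. At inference (`training=False`), the
    input passes through untouched -- no rescaling is needed.
    """
    if not training or rate == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # True = unit is kept
    return x * mask / (1.0 - rate)
```

Framework layers such as PyTorch's nn.Dropout implement this same inverted scheme, which is why calling model.eval() (disabling dropout) is essential before inference.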