Technology

Model Monitoring

Continuous tracking of AI model performance, behavior, and output quality in production environments.

Full Definition

Model Monitoring is the practice of continuously tracking the performance, behavior, and output quality of AI models deployed in production. Key monitoring dimensions include performance metrics (latency, throughput, error rates), output quality (relevance, accuracy, coherence), behavioral consistency (comparing current output distributions against a baseline), cost tracking (token usage, API costs), and safety metrics (hallucination rates, bias indicators, policy violation frequency).

Model monitoring is distinct from pre-deployment evaluation — it focuses on how models behave under real-world conditions rather than on static benchmarks. Effective monitoring detects issues such as model degradation, distribution shift, and behavioral drift before they impact end users. Governance platforms extend basic monitoring with compliance-aware alerting, risk scoring, and automated incident investigation.
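The dimensions above can be sketched in code. The following is a minimal, hypothetical illustration (not tied to any specific monitoring product): a `ModelMonitor` class that records per-request latency and errors, and scores behavioral drift by comparing the recent output distribution to a baseline using KL divergence. The class name, method names, and thresholds are all assumptions for illustration.

```python
import math
from collections import Counter, deque


class ModelMonitor:
    """Hypothetical sketch of production model monitoring: tracks
    latency and error rate, and compares recent output distributions
    to a baseline to surface behavioral drift."""

    def __init__(self, baseline_outputs, window=1000):
        # Normalize the baseline output counts into a probability distribution.
        self.baseline = self._normalize(Counter(baseline_outputs))
        self.latencies = deque(maxlen=window)   # rolling latency window
        self.recent_outputs = deque(maxlen=window)
        self.errors = 0
        self.requests = 0

    @staticmethod
    def _normalize(counts):
        total = sum(counts.values())
        return {k: v / total for k, v in counts.items()}

    def record(self, output, latency_ms, error=False):
        """Log one production request: its output label, latency, and error flag."""
        self.requests += 1
        self.errors += int(error)
        self.latencies.append(latency_ms)
        self.recent_outputs.append(output)

    def error_rate(self):
        return self.errors / self.requests if self.requests else 0.0

    def p95_latency(self):
        """Approximate 95th-percentile latency over the rolling window."""
        xs = sorted(self.latencies)
        return xs[int(0.95 * (len(xs) - 1))] if xs else 0.0

    def drift_score(self):
        """KL divergence of the recent output distribution from the baseline;
        larger values suggest the model's behavior has shifted."""
        current = self._normalize(Counter(self.recent_outputs))
        eps = 1e-9  # floor for labels unseen in the baseline
        return sum(p * math.log(p / self.baseline.get(k, eps))
                   for k, p in current.items())


# Usage: baseline is a balanced label mix; production traffic skews to "a".
monitor = ModelMonitor(baseline_outputs=["a"] * 50 + ["b"] * 50)
for i in range(10):
    monitor.record("a", latency_ms=100 + i)
print(monitor.error_rate())   # no errors recorded
print(monitor.p95_latency())
print(monitor.drift_score())  # positive: all-"a" traffic vs 50/50 baseline
```

In practice a monitor like this would feed an alerting system: an operator sets thresholds on error rate, tail latency, and drift score, and breaches trigger the compliance-aware alerting or incident investigation mentioned above.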