Out-of-Distribution Detector for Neural Networks

Description

ODIN (Out-of-Distribution Detector for Neural Networks) identifies when a neural network encounters inputs significantly different from its training distribution. It enhances detection by applying temperature scaling to soften the model's output distribution and adding small, carefully calibrated perturbations to the input that push in-distribution samples towards higher confidence predictions. By measuring the maximum softmax probability after these adjustments, ODIN can effectively distinguish between in-distribution and out-of-distribution inputs, flagging potentially unreliable predictions before they cause downstream errors.
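The two adjustments can be sketched concretely. The following is a minimal illustration, not the reference implementation: it assumes a simple linear softmax classifier so the input gradient can be written in closed form, whereas in practice the gradient would come from backpropagation through the full network.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def odin_score(x, W, b, temperature=1000.0, epsilon=0.001):
    """ODIN score for a toy linear softmax classifier f(x) = W @ x + b.

    1. Temperature-scale the logits to soften the output distribution.
    2. Perturb x in the direction that increases the temperature-scaled
       log-probability of the predicted class (sign of the input gradient).
    3. Return the maximum softmax probability of the perturbed input;
       scores below a chosen threshold flag the input as out-of-distribution.
    """
    z = (W @ x + b) / temperature
    p = softmax(z)
    y_hat = int(np.argmax(p))
    # For a linear model, the gradient of log p[y_hat] w.r.t. x is
    # d/dx [z_yhat - logsumexp(z)] = (W[y_hat] - p @ W) / temperature.
    grad = (W[y_hat] - p @ W) / temperature
    x_pert = x + epsilon * np.sign(grad)
    z_pert = (W @ x_pert + b) / temperature
    return float(softmax(z_pert).max())
```

Because the perturbation follows the gradient of the predicted class's log-probability, it boosts confidence more for in-distribution inputs than for out-of-distribution ones, widening the gap the threshold exploits:

```python
W = np.array([[4.0, 0.0], [0.0, 4.0]])  # two classes, two input features
b = np.zeros(2)
s_in = odin_score(np.array([3.0, 0.0]), W, b)   # far from the decision boundary
s_ood = odin_score(np.array([0.1, 0.1]), W, b)  # near the boundary: lower score
```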

Example Use Cases

Reliability

Detecting anomalous medical images in diagnostic systems, where ODIN flags X-rays or scans containing rare pathologies or imaging artefacts not present in training data, preventing misdiagnosis and prompting specialist review.

Safety

Protecting autonomous vehicle perception systems by identifying novel road scenarios (e.g., unusual weather conditions, rare obstacle types) that fall outside the training distribution, triggering fallback safety mechanisms.

Explainability

Monitoring production ML systems for data drift by detecting when incoming customer behaviour patterns deviate significantly from training data, helping explain why model performance may degrade over time.

Limitations

  • Requires careful tuning of temperature scaling and perturbation magnitude parameters, which may need adjustment for different types of out-of-distribution data.
  • Performance degrades when out-of-distribution samples are very similar to training data, making near-distribution detection challenging.
  • Vulnerable to adversarial examples specifically crafted to evade detection by mimicking in-distribution characteristics.
  • Computational overhead from input preprocessing and perturbation generation can impact real-time inference applications.
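The first limitation, parameter tuning, is typically handled by a validation sweep. A hedged sketch, assuming a hypothetical `score_fn(x, T, eps)` wrapper around the monitored model and access to a small proxy set of out-of-distribution examples (as the original ODIN evaluation assumes):

```python
import numpy as np
from itertools import product

def tune_odin(score_fn, x_val_in, x_val_ood,
              temperatures=(1.0, 10.0, 100.0, 1000.0),
              epsilons=(0.0, 0.0005, 0.001, 0.002)):
    """Grid-search temperature and perturbation magnitude, keeping the pair
    that maximises the gap between mean ODIN scores on in-distribution
    validation inputs and on a proxy out-of-distribution set.

    `score_fn` is a hypothetical callable, not part of any ODIN release:
    it should return the ODIN score of one input under the given settings.
    """
    best, best_gap = None, -np.inf
    for T, eps in product(temperatures, epsilons):
        gap = (np.mean([score_fn(x, T, eps) for x in x_val_in])
               - np.mean([score_fn(x, T, eps) for x in x_val_ood]))
        if gap > best_gap:
            best, best_gap = (T, eps), gap
    return best, best_gap
```

Note that the best settings are specific to the proxy OOD set used, which is exactly why the tuned parameters "may need adjustment for different types of out-of-distribution data"; Generalized ODIN (listed under Resources) removes this dependence on OOD validation data.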

Resources

  • Enhancing The Reliability of Out-of-distribution Image Detection in Neural Networks (Research Paper). Shiyu Liang, Yixuan Li, and R. Srikant. Jun 8, 2017.
  • facebookresearch/odin (Software Package).
  • Generalized ODIN: Detecting Out-of-distribution Image without Learning from Out-of-distribution Data (Research Paper). Yen-Chang Hsu et al. Feb 26, 2020.
  • Detection of out-of-distribution samples using binary neuron activation patterns (Research Paper). Krystian Chachuła et al. Mar 24, 2023.
  • Out-of-Distribution Detection with ODIN - A Tutorial (Tutorial).

Tags

Applicable Models:
Data Requirements:
Data Type:
Evidence Type:
Expertise Needed:
Explanatory Scope:
Lifecycle Stage:
Technique Type: