Stochastic Embeddings: A Probabilistic and Geometric Analysis of Out-of-Distribution Behavior
Published at the Forty-first Conference on Uncertainty in Artificial Intelligence
Deep neural networks perform well in many applications but often fail when exposed to out-of-distribution (OoD) inputs. We identify a geometric phenomenon in the embedding space: in-distribution (ID) data exhibit higher variance than OoD data under stochastic perturbations. Drawing on high-dimensional geometry and statistics, we explain this behavior and demonstrate how it can be used to improve OoD detection. Unlike traditional post-hoc methods, our approach integrates uncertainty-aware tools, such as Bayesian approximations, directly into the detection process. We further show that projecting embeddings onto the unit hypersphere enhances the separation of ID and OoD samples. Our mathematically grounded method achieves competitive performance while remaining simple.
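As an illustrative sketch (not the paper's actual algorithm), the core idea can be turned into a detector: collect several stochastic embeddings per input (e.g. via MC dropout), project them onto the unit hypersphere, and use their variance as the score. Since the abstract states that ID data show higher variance than OoD data, a low variance flags a sample as OoD. The function name and the toy data below are assumptions for illustration only.

```python
import numpy as np

def variance_ood_score(embeddings: np.ndarray) -> np.ndarray:
    """Variance-based OoD score from stochastic embeddings.

    embeddings: shape (T, N, D) -- T stochastic forward passes
    (e.g. MC dropout) for N samples with D-dimensional embeddings.
    Per the abstract, ID samples tend to show HIGHER variance under
    stochastic perturbations, so a LOW score would flag OoD here.
    """
    # Project each embedding onto the unit hypersphere; the abstract
    # reports this sharpens the ID/OoD separation.
    norms = np.linalg.norm(embeddings, axis=-1, keepdims=True)
    unit = embeddings / np.clip(norms, 1e-12, None)
    # Total variance across the T stochastic passes, summed over dimensions.
    return unit.var(axis=0).sum(axis=-1)

# Toy illustration (synthetic data, not from the paper): one cloud whose
# stochastic embeddings are widely spread in direction, one that is tightly
# concentrated. The spread cloud plays the role of ID data.
rng = np.random.default_rng(0)
spread_emb = rng.normal(0.0, 1.0, size=(20, 5, 16))        # ID-like: high angular spread
concentrated_emb = rng.normal(3.0, 0.1, size=(20, 5, 16))  # OoD-like: low angular spread
print(variance_ood_score(spread_emb).mean(),
      variance_ood_score(concentrated_emb).mean())
```

On the unit hypersphere the score reduces to angular dispersion, which is invariant to embedding norm; this matches the abstract's claim that restricting attention to the hypersphere improves separation.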