TY - JOUR
T1 - Out-of-Distribution Detection with Memory-Augmented Variational Autoencoder
AU - Ataeiasad, Faezeh
AU - Elizondo, David
AU - Calderón Ramírez, Saúl
AU - Greenfield, Sarah
AU - Deka, Lipika
N1 - Publisher Copyright:
© 2024 by the authors.
PY - 2024/10
Y1 - 2024/10
N2 - This paper proposes a novel method capable of both detecting out-of-distribution (OOD) data and generating in-distribution data samples. To achieve this, a variational autoencoder (VAE) is augmented with a memory module, providing the capacity both to identify OOD data and to synthesise new in-distribution samples. The VAE is trained on normal data, and the memory stores prototypical patterns of the normal data distribution. At test time, the input is encoded by the VAE encoder; this encoding serves as a query to retrieve related memory items, which are integrated with the input encoding and passed to the decoder for reconstruction. Normal samples reconstruct well and yield low reconstruction errors, whereas OOD inputs produce high reconstruction errors because their encodings are replaced by retrieved normal patterns. Prior work has paired memory modules with plain autoencoders for OOD detection; this method instead builds on a VAE architecture, which additionally enables generation. Experiments on the CIFAR-10 and MNIST datasets show that the memory-augmented VAE consistently outperforms the baseline, particularly when OOD data resembles normal patterns, an improvement attributable to the richer latent-space representation provided by the VAE. Overall, the memory-equipped VAE framework is effective both at identifying OOD data and at generating novel in-distribution examples.
AB - This paper proposes a novel method capable of both detecting out-of-distribution (OOD) data and generating in-distribution data samples. To achieve this, a variational autoencoder (VAE) is augmented with a memory module, providing the capacity both to identify OOD data and to synthesise new in-distribution samples. The VAE is trained on normal data, and the memory stores prototypical patterns of the normal data distribution. At test time, the input is encoded by the VAE encoder; this encoding serves as a query to retrieve related memory items, which are integrated with the input encoding and passed to the decoder for reconstruction. Normal samples reconstruct well and yield low reconstruction errors, whereas OOD inputs produce high reconstruction errors because their encodings are replaced by retrieved normal patterns. Prior work has paired memory modules with plain autoencoders for OOD detection; this method instead builds on a VAE architecture, which additionally enables generation. Experiments on the CIFAR-10 and MNIST datasets show that the memory-augmented VAE consistently outperforms the baseline, particularly when OOD data resembles normal patterns, an improvement attributable to the richer latent-space representation provided by the VAE. Overall, the memory-equipped VAE framework is effective both at identifying OOD data and at generating novel in-distribution examples.
KW - deep learning
KW - integrated external memory
KW - machine learning
KW - memory augmentation
KW - out-of-distribution detection
KW - PyTorch
KW - variational autoencoder
UR - http://www.scopus.com/inward/record.url?scp=85206577387&partnerID=8YFLogxK
U2 - 10.3390/math12193153
DO - 10.3390/math12193153
M3 - Article
AN - SCOPUS:85206577387
SN - 2227-7390
VL - 12
JO - Mathematics
JF - Mathematics
IS - 19
M1 - 3153
ER -