Abstract
Highly complex deep learning models are increasingly integrated into modern cyber-physical systems (CPS), many of which have strict safety requirements. One problem this raises is that deep learning models lack interpretability, operating as black boxes. The reliability of deep learning also depends heavily on how well the training data represents the data encountered at runtime, especially when the input space is high-dimensional, as with natural images. In response, we propose a robust out-of-distribution (OOD) detection framework. Our approach detects unusual movements from driving video in real time by combining classical optical flow computation with representation learning via a variational autoencoder (VAE). We also design a method to locate the OOD factors within images. Evaluation on a driving simulation data set shows that our approach is statistically more robust than related works.
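The detection scheme described in the abstract (score each frame against a model of the training distribution, then flag frames whose score is abnormal) can be illustrated with a minimal calibrate-and-threshold sketch. This is an illustrative sketch only, not the paper's implementation: the `reconstruction_errors` proxy (squared distance to the training mean) stands in for the VAE reconstruction error, and the optical-flow feature vectors are simulated with random data.

```python
import numpy as np

def reconstruction_errors(frames, mean_frame):
    # Proxy for VAE reconstruction error: mean squared distance of each
    # (optical-flow) feature vector to the training mean.
    return ((frames - mean_frame) ** 2).mean(axis=1)

def fit_detector(train_frames, quantile=0.95):
    # Calibrate on in-distribution data: a reference model (here, the mean
    # frame) plus a score threshold taken from a high quantile of the
    # in-distribution errors.
    mean_frame = train_frames.mean(axis=0)
    errors = reconstruction_errors(train_frames, mean_frame)
    return mean_frame, float(np.quantile(errors, quantile))

def is_ood(frame, mean_frame, threshold):
    # Flag a frame as out-of-distribution when its score exceeds the threshold.
    return float(((frame - mean_frame) ** 2).mean()) > threshold

# Toy usage: in-distribution features cluster near 0; a far-away frame is OOD.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(200, 16))
mean_frame, thr = fit_detector(train)
print(is_ood(np.full(16, 8.0), mean_frame, thr))  # True: far outside training data
```

In a real pipeline, the quantile-based threshold would be calibrated on held-out in-distribution driving frames, and the score would come from the trained VAE rather than this mean-model proxy.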
| Original language | English |
| --- | --- |
| Title of host publication | ICCPS 2021 - Proceedings of the 2021 ACM/IEEE 12th International Conference on Cyber-Physical Systems (with CPS-IoT Week 2021) |
| Publisher | Association for Computing Machinery, Inc |
| Pages | 225-226 |
| Number of pages | 2 |
| ISBN (Electronic) | 9781450383530 |
| DOIs | |
| Publication status | Published - May 19 2021 |
| Externally published | Yes |
| Event | 12th ACM/IEEE International Conference on Cyber-Physical Systems, ICCPS 2021, part of CPS-IoT Week 2021 - Virtual, Online, United States; Duration: May 19 2021 → May 21 2021 |
Publication series
| Name | ICCPS 2021 - Proceedings of the 2021 ACM/IEEE 12th International Conference on Cyber-Physical Systems (with CPS-IoT Week 2021) |
| --- | --- |
Conference
| Conference | 12th ACM/IEEE International Conference on Cyber-Physical Systems, ICCPS 2021, part of CPS-IoT Week 2021 |
| --- | --- |
| Country/Territory | United States |
| City | Virtual, Online |
| Period | 5/19/21 → 5/21/21 |
Bibliographical note
Publisher Copyright: © 2021 ACM.
ASJC Scopus Subject Areas
- Computer Networks and Communications