Zitao Chen, Niranjhana Narayanan, Bo Fang, Guanpeng Li, Karthik Pattabiraman, and Nathan DeBardeleben, IEEE International Symposium on Software Reliability Engineering (ISSRE), 2020. (Acceptance Rate: 26%) [ PDF | Talk ] (Code)
Abstract: As machine learning (ML) has seen increasing adoption in safety-critical domains (e.g., autonomous vehicles), the reliability of ML systems has also grown in importance. While prior studies have proposed techniques to enable efficient error-resilience (e.g., selective instruction duplication), a fundamental requirement for realizing these techniques is a detailed understanding of the application’s resilience. In this work, we present TensorFI, a high-level fault injection (FI) framework for TensorFlow-based applications. TensorFI is able to inject both hardware and software faults in general TensorFlow programs. TensorFI is a configurable FI tool that is flexible, easy to use, and portable. It can be integrated into existing TensorFlow programs to assess their resilience for different fault types (e.g., faults in particular operators). We use TensorFI to evaluate the resilience of 12 ML programs, including DNNs used in the autonomous vehicle domain. The results give us insights into why some of the models are more resilient. We also present two case studies to demonstrate the usefulness of the tool. TensorFI is publicly available at https://github.com/DependableSystemsLab/TensorFI.
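The abstract describes injecting hardware faults into the outputs of TensorFlow operators and observing how the corruption propagates to the program's result. As an illustrative sketch only (this is not TensorFI's actual API), the core mechanism of a single-bit-flip fault injection can be modeled in plain NumPy: reinterpret a float32 value's bit pattern, flip one bit, and write the corrupted value back into an operator's output tensor.

```python
import numpy as np

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit of a float32 value, mimicking a transient hardware fault."""
    bits = np.array([value], dtype=np.float32).view(np.uint32)
    bits[0] ^= np.uint32(1 << bit)          # corrupt exactly one bit
    return float(bits.view(np.float32)[0])

def inject(tensor: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Corrupt one randomly chosen element of an operator's output tensor."""
    faulty = tensor.astype(np.float32).copy()
    idx = int(rng.integers(faulty.size))    # random element
    bit = int(rng.integers(32))             # random bit position
    faulty.flat[idx] = flip_bit(float(faulty.flat[idx]), bit)
    return faulty
```

In a fault-injection campaign of this style, the corrupted tensor replaces the fault-free one and the run's final output (e.g., the predicted class) is compared against a golden run; a mismatch is counted as a silent data corruption. The function names and structure here are hypothetical, chosen only to illustrate the bit-flip fault model the abstract refers to.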
- Characterizing and Improving Resilience of Accelerators to Memory Errors in Autonomous Robots
- EdgeEngine: A Thermal-Aware Optimization Framework for Edge Inference
- Evaluating the Effect of Common Annotation Faults on Object Detection Techniques
- Resilience Assessment of Large Language Models under Transient Hardware Faults
- Mixed Precision Support in HPC Applications: What About Reliability?
- Towards Reliability Assessment of Systolic Arrays against Stuck-at Faults
- Overconfidence is a Dangerous Thing: Mitigating Membership Inference Attacks by Enforcing Less Confident Prediction
- Structural Coding: A Low-Cost Scheme to Protect CNNs from Large-Granularity Memory Faults
- A Low-cost Strategic Monitoring Approach for Scalable and Interpretable Error Detection in Deep Neural Networks
- SwarmFuzz: Discovering GPS Spoofing Attacks in Drone Swarms