BBPoly: Scalable and Modular Robustness Analysis of Deep Neural Networks

05/10/2021, 10:30am

Speaker

Yuyi Zhong

Abstract

As neural networks are trained to be deeper and larger, scalable neural network analyzers are urgently needed. The main technical insight of our method is to analyze neural networks modularly, by segmenting a network into blocks and conducting the analysis block by block. In particular, we propose a network block summarization technique that captures the behavior within a network block as a block summary, and we leverage these summaries to speed up the analysis. We instantiate our method on top of the CPU version of the state-of-the-art analyzer DeepPoly and name our system Bounded-Block Poly (BBPoly). We evaluate BBPoly extensively under various experimental settings. The results indicate that our method achieves precision comparable to DeepPoly while running faster and requiring fewer computational resources. For example, BBPoly can analyze very large neural networks such as SkipNet or ResNet, which contain up to one million neurons, in around one hour per input image, whereas DeepPoly may need up to 40 hours to analyze a single image.
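To give a rough feel for the block-wise idea described above, here is a minimal, hypothetical sketch in Python. It is not the BBPoly implementation: it uses plain interval arithmetic rather than DeepPoly's polyhedral abstraction, and all function names and the toy network are illustrative assumptions. The point it illustrates is that each block is analyzed in isolation and only its output bounds (the "summary") are passed to the next block, instead of back-substituting through the entire network.

```python
import numpy as np

def affine_bounds(W, b, lo, hi):
    """Interval bounds for x -> W @ x + b given input bounds [lo, hi]."""
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    new_lo = W_pos @ lo + W_neg @ hi + b
    new_hi = W_pos @ hi + W_neg @ lo + b
    return new_lo, new_hi

def relu_bounds(lo, hi):
    """Interval bounds after an element-wise ReLU."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def analyze_block(layers, lo, hi):
    """Propagate bounds through one block; the resulting output bounds
    serve as the block summary that is handed to the next block."""
    for W, b in layers:
        lo, hi = affine_bounds(W, b, lo, hi)
        lo, hi = relu_bounds(lo, hi)
    return lo, hi

def analyze_network(blocks, input_lo, input_hi):
    """Analyze the network block by block; the analysis never looks past
    a block boundary, which caps memory and running time per block."""
    lo, hi = input_lo, input_hi
    for block in blocks:
        lo, hi = analyze_block(block, lo, hi)
    return lo, hi

# Toy usage: two blocks, each with one small affine + ReLU layer.
rng = np.random.default_rng(0)
blocks = [[(rng.standard_normal((3, 2)), np.zeros(3))],
          [(rng.standard_normal((2, 3)), np.zeros(2))]]
out_lo, out_hi = analyze_network(blocks,
                                 np.array([-0.1, -0.1]),
                                 np.array([0.1, 0.1]))
print(out_lo, out_hi)
```

The trade-off this sketch hints at is the one the abstract reports: summarizing at block boundaries loses some precision compared to analyzing the whole network at once, but keeps the per-block cost bounded, which is what makes networks with up to a million neurons tractable.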

Speaker Bio

Yuyi Zhong is pursuing a Ph.D. in the School of Computing at the National University of Singapore (NUS), under the supervision of Professor Siau-Cheng Khoo. Her research interests lie in applying formal methods from program analysis to neural network verification. The presented work, inspired by the idea of building function summaries in program analysis, builds network block summaries to assist the verification process. This work is part of a research project on building a configurable neural network verification platform that enables quick and systematic construction of, and experimentation with, network analyses.