Paper Title

QEBVerif: Quantization Error Bound Verification of Neural Networks

Paper Authors

Yedi Zhang, Fu Song, Jun Sun

Paper Abstract

To alleviate the practical constraints for deploying deep neural networks (DNNs) on edge devices, quantization is widely regarded as one promising technique. It reduces the resource requirements for computational power and storage space by quantizing the weights and/or activation tensors of a DNN into lower bit-width fixed-point numbers, resulting in quantized neural networks (QNNs). While it has been empirically shown to introduce minor accuracy loss, critical verified properties of a DNN might become invalid once quantized. Existing verification methods focus on either individual neural networks (DNNs or QNNs) or quantization error bound for partial quantization. In this work, we propose a quantization error bound verification method, named QEBVerif, where both weights and activation tensors are quantized. QEBVerif consists of two parts, i.e., a differential reachability analysis (DRA) and a mixed-integer linear programming (MILP) based verification method. DRA performs difference analysis between the DNN and its quantized counterpart layer-by-layer to compute a tight quantization error interval efficiently. If DRA fails to prove the error bound, then we encode the verification problem into an equivalent MILP problem which can be solved by off-the-shelf solvers. Thus, QEBVerif is sound, complete, and reasonably efficient. We implement QEBVerif and conduct extensive experiments, showing its effectiveness and efficiency.
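To make the setting concrete, the sketch below illustrates the kind of uniform fixed-point quantization the abstract refers to and the per-weight error it induces. This is a generic illustration under assumed parameters (4-bit width, 2 fractional bits), not the paper's exact quantization scheme or error-bound algorithm:

```python
import numpy as np

def quantize(x, bits=4, frac_bits=2):
    """Uniform fixed-point quantization (illustrative only).

    Rounds each value to the nearest multiple of 2**-frac_bits and
    clamps it to the range representable with the given bit-width.
    """
    scale = 2.0 ** frac_bits
    qmin = -(2 ** (bits - 1)) / scale          # most negative code
    qmax = (2 ** (bits - 1) - 1) / scale       # most positive code
    return np.clip(np.round(x * scale) / scale, qmin, qmax)

# Quantize a small weight vector and inspect the element-wise error.
w = np.array([0.37, -1.62, 0.05, 2.9])
wq = quantize(w)
err = np.abs(w - wq)   # rounding error is small; clipping error can be large
```

Note that the last weight (2.9) falls outside the 4-bit representable range and is clipped to 1.75, producing a much larger error than plain rounding; bounding how such per-layer errors accumulate through the network is exactly the quantization error bound problem that QEBVerif targets.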
