Simone Milanesi, Ambrogio Maria Bernardelli
PhD students at the CompOpt Lab (University of Pavia)
Binarized Neural Networks (BNNs) are receiving increasing attention due to their lightweight architecture and ability to run on low-power devices.
The Mixed-Integer Linear Programming (MILP) approach achieves the state of the art for training classification BNNs when limited data are available.
We propose the BeMi ensemble, a structured architecture of BNNs in which a single BNN is trained for each pair of classes and the final output is predicted through a One-vs-One (OVO)-inspired majority-voting scheme. Each BNN is trained with a MILP model that optimizes a lexicographic multi-objective function encoding the principles of robustness and simplicity. This research area is a prime example of TAILOR’s WP4 on Integrating Paradigms, and the proposed framework can also be exploited with other types of neural networks (NNs), such as Integer NNs, an area of expertise of our host, Dr. Yorke-Smith, a member of the TAILOR network.
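As an illustration of the OVO-inspired voting step, the sketch below shows how the pairwise predictions could be combined by majority vote. The function names and the dummy classifiers are hypothetical, for illustration only; the actual BeMi pairwise predictors are MILP-trained BNNs.

```python
from itertools import combinations
from collections import Counter

def ovo_predict(x, classes, pairwise_classifiers):
    """One-vs-One majority voting (illustrative sketch).

    pairwise_classifiers maps each class pair (i, j), i < j, to a
    callable that returns the winning class (i or j) for input x.
    The class collecting the most pairwise votes is the prediction.
    """
    votes = Counter()
    for pair in combinations(classes, 2):
        winner = pairwise_classifiers[pair](x)
        votes[winner] += 1
    # Class with the most votes wins (ties broken arbitrarily here).
    return votes.most_common(1)[0][0]

# Toy example with 3 classes: each dummy pairwise classifier simply
# votes for the smaller class label, so class 0 wins 2 of 3 votes.
classes = [0, 1, 2]
clfs = {(i, j): (lambda x, i=i: i) for i, j in combinations(classes, 2)}
print(ovo_predict(None, classes, clfs))  # prints 0
```

With K classes this scheme requires K(K-1)/2 binary classifiers, which is exactly the number of BNNs trained in the BeMi ensemble.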