A Tropical Approach to Neural Networks with Piecewise Linear Activations

22 May 2018 · Vasileios Charisopoulos, Petros Maragos

We present a new, unifying approach that follows recent developments on the complexity of neural networks with piecewise linear activations. We treat neural network layers with piecewise linear activations as tropical polynomials, i.e., polynomials over the so-called $(\max, +)$ or tropical algebra, with possibly real-valued exponents. Motivated by the discussion in (arXiv:1402.1869), we use this approach to refine their upper bound on the number of linear regions of a layer with ReLU or leaky ReLU activations to $\min\left\{ 2^m, \sum_{j=0}^n \binom{m}{j} \right\}$, where $n$ and $m$ are the numbers of inputs and outputs, respectively. We also recover their upper bounds on maxout layers. Our work follows a novel path, exclusively under the lens of tropical geometry, and is independent of the improvements reported in (arXiv:1611.01491, arXiv:1711.02114). Finally, we present a geometric approach for efficiently counting linear regions via random sampling, in order to avoid the computational overhead of exact counting approaches.

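As a rough illustration of the two quantitative ideas in the abstract, the sketch below (not taken from the paper; the function names `upper_bound_linear_regions` and `sample_count_linear_regions` and all parameter choices are hypothetical) computes the refined bound $\min\{2^m, \sum_{j=0}^n \binom{m}{j}\}$ for a single ReLU layer and gives a naive sampling-based estimate of the number of linear regions by counting distinct activation patterns over uniformly sampled inputs.

```python
# Minimal sketch, not the authors' implementation.
# Assumes a single fully connected ReLU layer x -> relu(W x + b)
# with n inputs and m outputs.

import numpy as np
from math import comb


def upper_bound_linear_regions(n_inputs: int, m_outputs: int) -> int:
    """Refined upper bound min(2^m, sum_{j=0}^{n} C(m, j)) on the number of
    linear regions of one ReLU / leaky-ReLU layer."""
    return min(2 ** m_outputs,
               sum(comb(m_outputs, j) for j in range(n_inputs + 1)))


def sample_count_linear_regions(W, b, num_samples=100_000, radius=10.0, seed=0):
    """Estimate the number of linear regions by counting distinct ReLU
    activation patterns hit by uniform samples in a box of the given radius.
    This is a lower bound on the true count (regions outside the box or
    missed by sampling are not seen)."""
    rng = np.random.default_rng(seed)
    n = W.shape[1]
    x = rng.uniform(-radius, radius, size=(num_samples, n))
    patterns = (x @ W.T + b > 0)          # boolean activation pattern per sample
    return len({p.tobytes() for p in patterns})


if __name__ == "__main__":
    n, m = 2, 5                           # hypothetical layer sizes
    rng = np.random.default_rng(42)
    W, b = rng.standard_normal((m, n)), rng.standard_normal(m)
    # For n=2, m=5: min(2^5, C(5,0)+C(5,1)+C(5,2)) = min(32, 16) = 16
    print("upper bound:", upper_bound_linear_regions(n, m))
    print("sampled estimate:", sample_count_linear_regions(W, b))
```

The sampling estimate only discovers regions that the random inputs actually land in, so it trades exactness for speed, in the spirit of the abstract's remark about avoiding exact counting.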