Search Results for author: Aleksandr Beknazaryan

Found 6 papers, 0 papers with code

Shallow neural network representation of polynomials

no code implementations • 17 Aug 2022 • Aleksandr Beknazaryan

We show that $d$-variate polynomials of degree $R$ can be represented on $[0, 1]^d$ as shallow neural networks of width $2(R+d)^d$.

regression
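The width bound in the abstract is explicit and easy to evaluate. A minimal helper (the function name is mine, not from the paper):

```python
def shallow_width_bound(R: int, d: int) -> int:
    """Width 2*(R + d)**d from the abstract: the stated width at which a
    shallow network can represent any d-variate polynomial of degree R
    on [0, 1]^d."""
    return 2 * (R + d) ** d

print(shallow_width_bound(3, 2))  # degree-3 polynomial in 2 variables -> 50
```

Note the bound grows exponentially in the dimension d, which is why it is stated for shallow (single-hidden-layer) networks specifically.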

Nonparametric regression with modified ReLU networks

no code implementations • 17 Jul 2022 • Aleksandr Beknazaryan, Hailin Sang

We consider regression estimation with modified ReLU neural networks in which network weight matrices are first modified by a function $\alpha$ before being multiplied by input vectors.

regression
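The forward pass described above can be sketched as follows. The choice $\alpha = \tanh$ is purely illustrative: the abstract only says the weight matrices are modified by some function $\alpha$ before being multiplied by the input.

```python
import numpy as np

def alpha(W):
    # Hypothetical modification function; the paper's actual alpha
    # is not specified in this snippet.
    return np.tanh(W)

def modified_relu_layer(W, b, x):
    # One layer of a "modified ReLU" network: the weight matrix is
    # passed through alpha before multiplying the input vector.
    return np.maximum(alpha(W) @ x + b, 0.0)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
b = np.zeros(4)
x = np.ones(3)
out = modified_relu_layer(W, b, x)
print(out.shape)  # (4,)
```

Because tanh maps into (-1, 1), this particular alpha bounds the entries of the effective weight matrix, which is the flavor of control such modifications are meant to provide.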

Expressive power of binary and ternary neural networks

no code implementations • 27 Jun 2022 • Aleksandr Beknazaryan

We show that deep sparse ReLU networks with ternary weights and deep ReLU networks with binary weights can approximate $\beta$-Hölder functions on $[0, 1]^d$.
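For intuition only: ternary weights means every network weight lies in $\{-1, 0, 1\}$. The paper constructs such networks directly; the thresholding projection below is a hypothetical illustration of the weight set, not the paper's method.

```python
import numpy as np

def ternarize(W, threshold=0.5):
    # Map arbitrary real weights onto {-1, 0, 1}: zero out small
    # entries, keep only the sign of the rest.
    return np.sign(W) * (np.abs(W) > threshold)

W = np.array([[0.9, -0.2], [-0.7, 0.6]])
T = ternarize(W)
print(T)
```

Binary weights restrict the set further (e.g. to two values such as $\{-1, 1\}$ or $\{0, 1\}$); the expressivity results above say these coarse weight sets still suffice for Hölder-smooth approximation.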

Neural networks with superexpressive activations and integer weights

no code implementations • 20 May 2021 • Aleksandr Beknazaryan

An example of an activation function $\sigma$ is given such that networks with activations $\{\sigma, \lfloor\cdot\rfloor\}$, integer weights and a fixed architecture depending on $d$ approximate continuous functions on $[0, 1]^d$.

regression

Analytic function approximation by path norm regularized deep networks

no code implementations • 5 Apr 2021 • Aleksandr Beknazaryan

We show that neural networks with absolute value activation function and with the path norm, the depth, the width and the network weights having logarithmic dependence on $1/\varepsilon$ can $\varepsilon$-approximate functions that are analytic on certain regions of $\mathbb{C}^d$.
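The path norm mentioned above has a standard closed form for a layered network: the sum over all input-to-output paths of the products of absolute weights along each path, which equals $\mathbf{1}^\top |W_L| \cdots |W_1| \mathbf{1}$ with entrywise absolute values. A sketch (the helper name is mine):

```python
import numpy as np

def path_norm(weights):
    # weights: list [W1, ..., WL] of layer matrices, input side first.
    # Propagate the all-ones vector through the entrywise absolute
    # values of the weight matrices, then sum the output.
    v = np.ones(weights[0].shape[1])
    for W in weights:
        v = np.abs(W) @ v
    return float(v.sum())

W1 = np.array([[1.0, -1.0, 0.0], [0.0, 2.0, 1.0]])
W2 = np.array([[1.0, -1.0]])
print(path_norm([W1, W2]))  # 5.0
```

Regularizing this quantity penalizes the total "mass" flowing along every path, which is the kind of control the approximation result above trades against depth, width, and weight size.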

Function approximation by deep neural networks with parameters $\{0,\pm \frac{1}{2}, \pm 1, 2\}$

no code implementations • 15 Mar 2021 • Aleksandr Beknazaryan

In this paper it is shown that $C_\beta$-smooth functions can be approximated by deep neural networks with ReLU activation function and with parameters $\{0,\pm \frac{1}{2}, \pm 1, 2\}$.

regression
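To make the parameter set concrete: every weight in the construction is one of the six values $\{0, \pm\frac{1}{2}, \pm 1, 2\}$. The nearest-value rounding below only illustrates that set; the paper builds its networks with these parameters directly rather than by quantizing a trained network.

```python
import numpy as np

# The six admissible parameter values from the paper's title.
PARAMS = np.array([0.0, 0.5, -0.5, 1.0, -1.0, 2.0])

def project(W):
    # Illustrative only: round each entry to the nearest admissible value.
    idx = np.argmin(np.abs(W[..., None] - PARAMS), axis=-1)
    return PARAMS[idx]

q = project(np.array([0.8, 0.2, 1.6, -0.8]))
print(q)  # [ 1.  0.  2. -1.]
```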
