Approximating probability distributions by ReLU networks

Manuj Mukherjee, Aslan Tchamkerten, Mansoor Yousefi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

How many neurons are needed to approximate a target probability distribution using a neural network with a given input distribution and approximation error? This paper examines this question for the case where the input distribution is uniform and the target distribution belongs to the class of histogram distributions. We obtain a new upper bound on the number of required neurons, which strictly improves on previously known upper bounds. The key ingredient in this improvement is an efficient construction of neural networks representing piecewise linear functions. We also obtain a lower bound on the minimum number of neurons needed to approximate histogram distributions.
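The abstract's key ingredient is that ReLU networks can represent piecewise linear functions, and pushing a uniform input through the (piecewise linear) inverse CDF of a histogram distribution yields samples from that distribution. The following is a minimal illustrative sketch of this idea, not the paper's construction: it builds a one-hidden-layer ReLU net computing the inverse CDF of a histogram distribution with equal-width bins on [0, 1]. All function and variable names here are hypothetical.

```python
import numpy as np

def relu(x):
    # Elementwise ReLU activation.
    return np.maximum(x, 0.0)

def inverse_cdf_net(p):
    """Hypothetical helper: build weights (w, b) so that
    f(x) = sum_i w_i * relu(x - b_i) equals the inverse CDF of a
    histogram distribution with bin probabilities p on len(p)
    equal-width bins of [0, 1]."""
    p = np.asarray(p, dtype=float)
    K = len(p)
    q = np.concatenate(([0.0], np.cumsum(p)))  # CDF breakpoints
    s = (1.0 / K) / p                          # slope of F^{-1} on each segment
    w = np.concatenate(([s[0]], np.diff(s)))   # slope change at each kink
    b = q[:-1]                                 # ReLU kink locations
    return w, b

def f(x, w, b):
    # One hidden layer of len(w) ReLU neurons, linear output.
    x = np.asarray(x, dtype=float)[:, None]
    return relu(x - b) @ w

# Example: two equal-width bins on [0, 1] with probabilities 0.25 and 0.75.
w, b = inverse_cdf_net([0.25, 0.75])
# Applying f to Uniform[0, 1] samples produces histogram-distributed samples:
samples = f(np.random.default_rng(0).uniform(size=1000), w, b)
```

In this sketch the network needs one neuron per histogram bin; the paper's contribution is a tighter neuron count for a prescribed approximation error, which this toy construction does not capture.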

Original language: English
Title of host publication: 2020 IEEE Information Theory Workshop, ITW 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728159621
DOIs
State: Published - 11 Apr 2021
Event: 2020 IEEE Information Theory Workshop, ITW 2020 - Virtual, Riva del Garda, Italy
Duration: 11 Apr 2021 - 15 Apr 2021

Publication series

Name: 2020 IEEE Information Theory Workshop, ITW 2020

Conference

Conference: 2020 IEEE Information Theory Workshop, ITW 2020
Country/Territory: Italy
City: Virtual, Riva del Garda
Period: 11/04/21 - 15/04/21

Bibliographical note

Publisher Copyright:
© 2021 IEEE.

