ICLR 2020: Compression based bound for non-compressed network

Description

ICLR 2020: Compression based bound for non-compressed network: unified generalization error analysis of large compressible deep neural network
1) The slides present a new compression-based bound for analyzing the generalization error of large deep neural networks, even when the networks are not explicitly compressed.
2) They show that if a trained network's weight matrices and covariance matrices exhibit low-rank structure, the network has small intrinsic dimensionality and can be efficiently compressed (see the sketch below).
3) This yields a tighter generalization bound than existing approaches, providing insight into why overparameterized networks generalize well despite having more parameters than training examples.
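
The low-rank compressibility idea in point 2 can be made concrete with a minimal sketch, assuming NumPy. The `effective_rank` heuristic (a 99% spectral-energy cutoff) and the truncated-SVD compression here are illustrative stand-ins, not the paper's actual definition of intrinsic dimensionality or its compression scheme: when a layer's singular values decay quickly, a small number of components reconstructs the layer almost exactly.

```python
import numpy as np

def effective_rank(W, energy=0.99):
    """Smallest number of singular values capturing `energy` of the
    squared spectral mass -- a simple proxy (illustrative only) for
    the low intrinsic dimensionality discussed in the slides."""
    s = np.linalg.svd(W, compute_uv=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(cum, energy) + 1)

def truncated_svd_compress(W, rank):
    """Best rank-`rank` approximation of W via truncated SVD."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]

# Synthetic "trained" layer whose spectrum decays fast, mimicking the
# near-low-rank weights the compressibility analysis relies on.
rng = np.random.default_rng(0)
m, n, true_rank = 512, 512, 20
W = rng.normal(size=(m, true_rank)) @ rng.normal(size=(true_rank, n))
W += 0.01 * rng.normal(size=(m, n))  # small full-rank noise

r = effective_rank(W)
W_hat = truncated_svd_compress(W, r)
rel_err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)
print(f"effective rank: {r} of {min(m, n)}, relative error: {rel_err:.4f}")
```

In this toy setting the effective rank comes out near the planted rank of 20, far below the ambient dimension of 512, and the rank-r compression loses almost nothing: the kind of behavior under which a bound stated in terms of the compressed network's size can be much tighter than a parameter-count bound for the full network.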
