Architecture independent generalization bounds for overparametrized deep ReLU networks, by Thomas Chen and 3 other authors
Abstract: We prove that overparametrized neural networks are able to generalize with a test error that is independent of the level of overparametrization, and independent of the Vapnik-Chervonenkis (VC) dimension. We prove explicit bounds that depend only on the metric geometry of the test and training sets, on the regularity properties of the activation function, and on the operator norms of the weights and the norms of the biases. For overparametrized deep ReLU networks with a training sample size bounded by the input space dimension, we explicitly construct zero loss minimizers without the use of gradient descent, and prove that the generalization error is independent of the network architecture.
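The paper's explicit zero-loss construction is not reproduced in the abstract; the following toy sketch (an illustration, not the authors' method) shows only why the regime described there is special: when the number of training samples n is at most the input dimension d, an exact interpolant of the labels already exists in closed form, with no gradient descent, via a minimum-norm linear solve.

```python
import numpy as np

# Hypothetical toy setup: n training samples in dimension d with n <= d,
# the regime the abstract describes. Data and labels are random.
rng = np.random.default_rng(0)
n, d = 5, 10
X = rng.standard_normal((n, d))   # training inputs, one sample per row
y = rng.standard_normal(n)        # training labels

# Minimum-norm solution of X w = y via the pseudoinverse: a closed-form
# zero-training-loss fit, obtained without any iterative optimization.
w = np.linalg.pinv(X) @ y

train_loss = np.mean((X @ w - y) ** 2)
print(f"training loss: {train_loss:.2e}")
```

Generically, a random n-by-d matrix with n <= d has full row rank, so the linear system is exactly solvable and the training loss vanishes up to floating-point error; the paper's contribution is to carry such explicit interpolation into deep ReLU networks and bound the resulting generalization error.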
Submission history
From: Thomas Chen
[v1] Tue, 8 Apr 2025 05:37:38 UTC (13 KB)
[v2] Wed, 9 Apr 2025 17:29:05 UTC (14 KB)
[v3] Thu, 22 May 2025 15:45:56 UTC (14 KB)