Browsing: Yannic Kilcher
Pre-training a CNN backbone for visual transfer learning has recently seen a big push in the direction of incorporating more…
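The basic transfer recipe this blurb builds on is easy to sketch. A minimal example, assuming torchvision's pretrained ResNet-50; the 10-class head is an arbitrary placeholder, not anything from the video:

```python
# Hypothetical transfer-learning sketch: reuse a pre-trained CNN backbone,
# freeze its features, and train only a fresh classification head.
import torch.nn as nn
import torchvision.models as models

backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
for p in backbone.parameters():
    p.requires_grad = False                           # freeze pre-trained features
backbone.fc = nn.Linear(backbone.fc.in_features, 10)  # new head, trainable by default
```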
Determining the stability properties of differential systems is a challenging task that involves very advanced symbolic and numeric mathematical manipulations…
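For the special case of a linear system, the classical stability check fits in a few lines; this is a generic textbook sketch, not the method covered in the video:

```python
# Linear stability: dx/dt = A x is asymptotically stable iff every
# eigenvalue of A has a negative real part.
import numpy as np

A = np.array([[-1.0, 2.0],
              [ 0.0, -3.0]])     # example system matrix, assumed for illustration
eigvals = np.linalg.eigvals(A)
print(np.all(eigvals.real < 0))  # True: eigenvalues are -1 and -3
```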
The Lottery Ticket Hypothesis has shown that it’s theoretically possible to prune a neural network at the beginning of training…
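As a rough illustration of what pruning means here, below is a generic magnitude-pruning mask in PyTorch; the function name and keep fraction are invented for this sketch and are not the paper's procedure:

```python
# Keep the largest-magnitude weights and zero out the rest via a binary mask.
import torch

def magnitude_mask(weight: torch.Tensor, keep_frac: float) -> torch.Tensor:
    k = max(1, int(keep_frac * weight.numel()))  # number of weights to keep
    kth_largest = weight.abs().flatten().kthvalue(weight.numel() - k + 1).values
    return (weight.abs() >= kth_largest).float()

w = torch.randn(4, 4)
pruned = w * magnitude_mask(w, keep_frac=0.25)   # ~25% of weights survive
```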
Even though LSTMs and GRUs solve the vanishing and exploding gradient problems, they have trouble learning to remember things over…
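For reference, the gating mechanism the blurb alludes to looks like this in a single GRU step (standard textbook formulation; the names and shapes are illustrative, not from the video):

```python
# One GRU step: gates interpolate between the old state and a candidate,
# which is what tames vanishing/exploding gradients.
import torch

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    z = torch.sigmoid(x @ Wz + h @ Uz)          # update gate
    r = torch.sigmoid(x @ Wr + h @ Ur)          # reset gate
    h_cand = torch.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
    return (1 - z) * h + z * h_cand             # gated memory update
```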
Image-to-Image translation usually requires corresponding samples or at least domain labels of the dataset. This paper removes that restriction and…
Self-supervised representation learning relies on negative samples to keep the encoder from collapsing to trivial solutions. However, this paper shows…
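For context, the negative-sample objective the first sentence refers to is typically an InfoNCE-style loss over in-batch negatives; a generic sketch, with the temperature value assumed:

```python
# InfoNCE over a batch: diagonal pairs are positives, all other pairs
# serve as negatives that prevent a collapsed (constant) encoder.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature  # [N, N] cosine similarities
    targets = torch.arange(z1.size(0))  # positive pairs on the diagonal
    return F.cross_entropy(logits, targets)
```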
BERT and GPT-2/3 have shown the enormous power of generative pre-training for classification tasks. However, for images,…
In this part, we go much deeper into the relationship between intelligence, generality, skill, experience, and prior knowledge and…
This paper proposes SimCLRv2 and shows that semi-supervised learning benefits a lot from self-supervised pre-training. And stunningly, that effect gets…
Implicit neural representations are created when a neural network is used to represent a signal as a function. SIRENs are…
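A minimal sketch of what a SIREN layer looks like; the sine frequency w0 = 30 follows the paper's default, while the paper's careful initialization scheme is omitted here:

```python
# A SIREN maps coordinates to signal values through sine activations.
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    def __init__(self, in_features, out_features, w0=30.0):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.w0 = w0

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

# e.g. map 2-D pixel coordinates to RGB: the network itself is the image
net = nn.Sequential(SineLayer(2, 64), SineLayer(64, 64), nn.Linear(64, 3))
```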