Collections
Collections including paper arxiv:2206.14502

- Fixup Initialization: Residual Learning Without Normalization
  Paper • 1901.09321 • Published • 1
- RegMixup: Mixup as a Regularizer Can Surprisingly Improve Accuracy and Out Distribution Robustness
  Paper • 2206.14502 • Published • 1
- MixUp as Locally Linear Out-Of-Manifold Regularization
  Paper • 1809.02499 • Published • 1

- All you need is a good init
  Paper • 1511.06422 • Published • 1
- Align Your Steps: Optimizing Sampling Schedules in Diffusion Models
  Paper • 2404.14507 • Published • 21
- Efficient Transformer Encoders for Mask2Former-style models
  Paper • 2404.15244 • Published • 1
- Deep Residual Learning for Image Recognition
  Paper • 1512.03385 • Published • 6

- All you need is a good init
  Paper • 1511.06422 • Published • 1
- Align Your Steps: Optimizing Sampling Schedules in Diffusion Models
  Paper • 2404.14507 • Published • 21
- Deep Residual Learning for Image Recognition
  Paper • 1512.03385 • Published • 6
- MoDE: CLIP Data Experts via Clustering
  Paper • 2404.16030 • Published • 12

- Wide Residual Networks
  Paper • 1605.07146 • Published • 2
- Characterizing signal propagation to close the performance gap in unnormalized ResNets
  Paper • 2101.08692 • Published • 2
- Pareto-Optimal Quantized ResNet Is Mostly 4-bit
  Paper • 2105.03536 • Published • 2
- When Vision Transformers Outperform ResNets without Pre-training or Strong Data Augmentations
  Paper • 2106.01548 • Published • 2