Elad Hoffer
PhD, Research @ Habana Labs
Verified email at habana.ai
Title · Cited by · Year
Deep metric learning using triplet network
E Hoffer, N Ailon
Similarity-Based Pattern Recognition: Third International Workshop, SIMBAD …, 2015
Cited by 2351 · 2015
Train longer, generalize better: closing the generalization gap in large batch training of neural networks
E Hoffer, I Hubara, D Soudry
Advances in neural information processing systems 30, 2017
Cited by 916 · 2017
The implicit bias of gradient descent on separable data
D Soudry, E Hoffer, MS Nacson, S Gunasekar, N Srebro
The Journal of Machine Learning Research 19 (1), 2822-2878, 2018
Cited by 876 · 2018
Scalable methods for 8-bit training of neural networks
R Banner, I Hubara, E Hoffer, D Soudry
Advances in neural information processing systems 31, 2018
Cited by 348 · 2018
Augment your batch: Improving generalization through instance repetition
E Hoffer, T Ben-Nun, I Hubara, N Giladi, T Hoefler, D Soudry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 278* · 2020
Norm matters: efficient and accurate normalization schemes in deep networks
E Hoffer, R Banner, I Golan, D Soudry
Advances in Neural Information Processing Systems 31, 2018
Cited by 178 · 2018
Bayesian gradient descent: Online variational Bayes learning with increased robustness to catastrophic forgetting and weight pruning
C Zeno, I Golan, E Hoffer, D Soudry
arXiv preprint arXiv:1803.10123, 2018
Cited by 114* · 2018
The knowledge within: Methods for data-free model compression
M Haroush, I Hubara, E Hoffer, D Soudry
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern …, 2020
Cited by 103 · 2020
Fix your classifier: the marginal value of training the last weight layer
E Hoffer, I Hubara, D Soudry
arXiv preprint arXiv:1801.04540, 2018
Cited by 102 · 2018
Exponentially vanishing sub-optimal local minima in multilayer neural networks
D Soudry, E Hoffer
arXiv preprint arXiv:1702.05777, 2017
Cited by 100 · 2017
ACIQ: analytical clipping for integer quantization of neural networks
R Banner, Y Nahshan, E Hoffer, D Soudry
Cited by 72 · 2018
Neural gradients are lognormally distributed: understanding sparse and quantized training
B Chmiel, L Ben-Uri, M Shkolnik, E Hoffer, R Banner, D Soudry
arXiv, 2020
Cited by 50* · 2020
Task-agnostic continual learning using online variational bayes with fixed-point updates
C Zeno, I Golan, E Hoffer, D Soudry
Neural Computation 33 (11), 3139-3177, 2021
Cited by 44* · 2021
Semi-supervised deep learning by metric embedding
E Hoffer, N Ailon
arXiv preprint arXiv:1611.01449, 2016
Cited by 39 · 2016
Deep unsupervised learning through spatial contrasting
E Hoffer, I Hubara, N Ailon
arXiv preprint arXiv:1610.00243, 2016
Cited by 32 · 2016
Mix & match: training convnets with mixed image sizes for improved accuracy, speed and scale resiliency
E Hoffer, B Weinstein, I Hubara, T Ben-Nun, T Hoefler, D Soudry
arXiv preprint arXiv:1908.08986, 2019
Cited by 22 · 2019
Logarithmic unbiased quantization: Practical 4-bit training in deep learning
B Chmiel, R Banner, E Hoffer, HB Yaacov, D Soudry
Cited by 19* · 2021
At Stability's Edge: How to Adjust Hyperparameters to Preserve Minima Selection in Asynchronous Training of Neural Networks?
N Giladi, MS Nacson, E Hoffer, D Soudry
arXiv preprint arXiv:1909.12340, 2019
Cited by 19 · 2019
Quantized back-propagation: Training binarized neural networks with quantized gradients
I Hubara, E Hoffer, D Soudry
Cited by 6 · 2018
Infer2Train: leveraging inference for better training of deep networks
E Hoffer, B Weinstein, I Hubara, S Gofman, D Soudry
NeurIPS 2018 Workshop on Systems for ML, 2018
Cited by 3 · 2018