Lei Huang received his BSc and PhD degrees under the supervision of Prof. Wei Li, in 2010 and 2018 respectively, at the School of Computer Science and Engineering, Beihang University, China. From 2015 to 2016, he visited the Vision and Learning Lab at the University of Michigan, Ann Arbor, as a joint PhD student supervised by Prof. Jia Deng. From 2018 to 2020, he was a research scientist at the Inception Institute of Artificial Intelligence (IIAI), UAE. His current research focuses mainly on normalization techniques (methods, theory, and applications) for training DNNs. He also has broad interests in deep learning theory (representation and optimization) and computer vision tasks. He serves as a reviewer for top conferences and journals such as CVPR, ICML, ICCV, ECCV, NeurIPS, AAAI, JMLR, TPAMI, IJCV, and TNNLS.
NEWS Dec 2020: Two papers are accepted by AAAI 2021.
NEWS Sep 2020: Our survey on normalization, "Normalization Techniques in Training DNNs: Methodology, Analysis and Application", is available on arXiv and GitHub.
NEWS July 2020: One paper is accepted by ACM Multimedia 2020.
NEWS July 2020: Two papers (one Oral) are accepted by ECCV 2020.
NEWS June 2020: One paper is accepted by ICML 2020.
NEWS Feb 2020: Two papers (Orals) are accepted by CVPR 2020.
NEWS Jan 2020: One paper is accepted by ECAI 2020.
NEWS May 2019: I will co-organize The 1st Statistical Deep Learning in Computer Vision workshop in conjunction with ICCV 2019.
NEWS March 2019: Two papers are accepted by CVPR 2019.
NEWS September 2018: I co-organized a tutorial named "Normalization Methods for Training Deep Neural Networks: Theory and Practice" at ECCV 2018. The slides can be found here.
NEWS Feb 2018: Our paper is accepted by CVPR 2018.
NEWS November 2017: One paper is accepted by AAAI 2018 (Oral).
NEWS Sep 2017: I will attend the Doctoral Symposium at ICIP 2017 on 19 September at the China National Convention Center (CNCC), Beijing. I will present my research on 'Normalization Techniques in Training Deep Neural Networks'.
NEWS Jul 2017: One paper accepted by ICCV 2017.
I am addicted to understanding and debugging the training of DNNs. I believe one avenue is delving into the basic modules of DNNs, e.g., the normalization layer and the linear layer.
My contributions to the community mainly concern designing and understanding these basic modules and their composition, and how they shape the training dynamics of DNNs (* indicates corresponding authors; # indicates equal contributions).
November 1st, 2014. Graph-based Active Semi-Supervised Learning: A New Perspective for Relieving Multi-class Annotation Labor. ICT International Exchange Workshop 2014, Laboratory of Advanced Research B, University of Tsukuba, Japan [Slides]
ONI: This project is the implementation of the paper "Controllable Orthogonalization in Training DNNs" (arXiv:2004.00917).
IterNorm: This project is the implementation of the paper "Iterative Normalization: Beyond Standardization towards Efficient Whitening" (arXiv:1904.03441).
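The key idea in IterNorm is to approximate the whitening matrix Σ^{-1/2} with a few Newton iterations instead of an eigendecomposition. A rough NumPy sketch of that computation, applied to a single batch (the function name, `T`, and `eps` are illustrative; the paper's implementation works as a network layer with backpropagation):

```python
import numpy as np

def iter_norm(x, T=10, eps=1e-5):
    """Approximately whiten x of shape (d, m) = (features, examples)
    using Newton's iteration, in the spirit of IterNorm."""
    d, m = x.shape
    xc = x - x.mean(axis=1, keepdims=True)         # center each feature
    sigma = xc @ xc.T / m + eps * np.eye(d)        # batch covariance
    sigma_n = sigma / np.trace(sigma)              # trace-normalize so the
                                                   # iteration converges
    P = np.eye(d)
    for _ in range(T):                             # Newton's iteration toward
        P = 0.5 * (3.0 * P - P @ P @ P @ sigma_n)  # sigma_n^{-1/2}
    whiten = P / np.sqrt(np.trace(sigma))          # rescale to sigma^{-1/2}
    return whiten @ xc                             # whitened activations
```

Trace-normalizing the covariance keeps its eigenvalues in (0, 1], which is what guarantees the Newton iteration converges.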
DBN: This project is the Torch implementation of the paper: Decorrelated Batch Normalization (arXiv:1804.08450).
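Decorrelated Batch Normalization whitens activations rather than merely standardizing each dimension. A minimal NumPy sketch of the core ZCA-whitening step on one batch (the function name and `eps` are illustrative, not taken from the repository):

```python
import numpy as np

def zca_whiten(x, eps=1e-5):
    """ZCA-whiten a batch x of shape (d, m) = (features, examples)."""
    d, m = x.shape
    xc = x - x.mean(axis=1, keepdims=True)          # center each feature
    sigma = xc @ xc.T / m + eps * np.eye(d)         # batch covariance
    vals, vecs = np.linalg.eigh(sigma)              # eigendecomposition
    whiten = vecs @ np.diag(vals ** -0.5) @ vecs.T  # sigma^{-1/2} (ZCA form)
    return whiten @ xc                              # decorrelated output
```

Using sigma^{-1/2} (ZCA) rather than an arbitrary whitening matrix keeps the output as close as possible to the input in the least-squares sense, which is why it is the usual choice for whitening activations.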
OWN: This project is the Torch implementation of the paper: an orthogonal weight normalization method for solving orthogonality constraints over the Stiefel manifold in deep neural networks (arXiv:1709.06079).
CWN: This project is the Torch implementation of our ICCV 2017 paper: Centered Weight Normalization in Accelerating Training of Deep Neural Networks.
NormProjection: This project is the Torch implementation of the paper: Projection Based Weight Normalization for Deep Neural Networks (arXiv:1710.02338).
Ladder_deepSSL_NP: A reimplementation of Ladder Networks with projection based weight normalization. We achieved test errors of 2.52%, 1.06%, and 0.91% on the permutation-invariant MNIST dataset with 20, 50, and 100 labeled samples respectively, which are state-of-the-art results.