Lei Huang received his BSc and PhD degrees in 2010 and 2018, respectively, from the School of Computer Science and Engineering, Beihang University, China, under the supervision of Prof. Wei Li. From 2015 to 2016, he visited the Vision and Learning Lab, University of Michigan, Ann Arbor, as a joint PhD student supervised by Prof. Jia Deng. From 2018 to 2020, he was a research scientist at the Inception Institute of Artificial Intelligence (IIAI), UAE. His current research mainly focuses on normalization techniques (methods, theories, and applications) in training DNNs. He also has broad interests in deep learning theory (representation & optimization) and computer vision tasks. He serves as a reviewer for top conferences and journals such as CVPR, ICML, ICCV, ECCV, NeurIPS, AAAI, JMLR, TPAMI, IJCV, and TNNLS.
NEWS Dec 2024: The Bench-CoE project is open-sourced and the paper is available.
NEWS July 2024: One paper is accepted by ECCV.
NEWS June 2024: One paper is accepted by IEEE TPAMI.
NEWS May 2024: The TinyLLaVA Factory project is open-sourced and the paper is available.
NEWS May 2024: One paper is accepted by IJCV.
NEWS May 2024: One paper is accepted by ICML.
NEWS Feb 2024: One paper is accepted by CVPR.
NEWS Feb 2024: One paper is accepted by IEEE TPAMI.
NEWS Dec 2020: Two papers are accepted by AAAI 2021.
NEWS Sep 2020: Our survey on normalization, "Normalization Techniques in Training DNNs: Methodology, Analysis and Application", is available on arXiv and GitHub.
NEWS July 2020: One paper is accepted by ACM Multimedia 2020.
NEWS July 2020: Two papers (one Oral) are accepted by ECCV 2020.
NEWS June 2020: One paper is accepted by ICML 2020.
NEWS June 2020: Our paper "Controllable Orthogonalization in Training DNNs" is nominated as a Best Paper Candidate at CVPR 2020.
NEWS Feb 2020: Two papers (Orals) are accepted by CVPR 2020.
NEWS Jan 2020: One paper is accepted by ECAI 2020.
NEWS May 2019: I will co-organize The 1st Statistical Deep Learning in Computer Vision workshop in conjunction with ICCV 2019.
NEWS March 2019: Two papers are accepted by CVPR 2019.
NEWS September 2018: I co-organized a tutorial named "Normalization Methods for Training Deep Neural Networks: Theory and Practice" at ECCV 2018. The slides can be found here.
NEWS Feb 2018: Our paper is accepted by CVPR 2018.
NEWS November 2017: One paper is accepted by AAAI 2018 (oral).
NEWS Sep 2017: I will attend the Doctoral Symposium at ICIP 2017 on 19 September at the China National Convention Center (CNCC), Beijing, where I will present my research on "Normalization Techniques in Training Deep Neural Networks".
NEWS Jul 2017: One paper is accepted by ICCV 2017.
Publications
I am addicted to understanding and debugging the training of DNNs. I believe one avenue is delving into the basic modules of DNNs, e.g., the normalization layer and the linear layer.
My contributions to the community mainly lie in designing and understanding these basic modules, and how their combination shapes the training dynamics of DNNs (* indicates corresponding authors; # indicates equal contributions).
September 8th, 2018. Normalization Methods for Training Deep Neural Networks: Theory and Practice, ECCV 2018 Tutorial. Munich, Germany. [Slides]
August 17th, 2017. Normalization Techniques in Deep Learning. Multimedia Signal and Intelligent Information Processing Laboratory, Tsinghua University, Beijing. [Slides]
November 2016 – January 2017. Deep Learning Seminar for graduate students at the State Key Laboratory of Software Development Environment, Beihang University, Beijing.
[slides1-Introduction]
[slides2-MLP]
[slides3-CNN]
[slides4-RNN]
November 1st, 2014. Graph-based Active Semi-Supervised Learning: A New Perspective for Relieving Multi-class Annotation Labor. ICT International Exchange Workshop 2014, Laboratory of Advanced Research B, University of Tsukuba, Japan. [Slides]
Source Code
ONI: This project is the implementation of the paper "Controllable Orthogonalization in Training DNNs" (arXiv:2004.00917).
StochasticityBW: This project is the implementation of the paper "An Investigation into the Stochasticity of Batch Whitening" (arXiv:2003.12327).
IterNorm: This project is the implementation of the paper "Iterative Normalization: Beyond Standardization towards Efficient Whitening" (arXiv:1904.03441); an illustrative sketch of this kind of whitening is given after this list.
DBN: This project is the Torch implementation of the paper "Decorrelated Batch Normalization" (arXiv:1804.08450).
OWN: This project is the Torch implementation of the paper on orthogonal weight normalization, which solves orthogonality constraints over the Stiefel manifold in deep neural networks (arXiv:1709.06079).
CWN: This project is the Torch implementation of our ICCV 2017 paper "Centered Weight Normalization in Accelerating Training of Deep Neural Networks".
NormProjection: This project is the Torch implementation of the paper "Projection Based Weight Normalization for Deep Neural Networks" (arXiv:1710.02338).
Ladder_deepSSL_NP: A reimplementation of Ladder Networks with projection based weight normalization. We achieved test errors of 2.52%, 1.06%, and 0.91% on the permutation-invariant MNIST dataset with 20, 50, and 100 labeled samples, respectively, which were state-of-the-art results.
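Several of the projects above implement whitening-based normalization. As a rough illustration only, here is a minimal PyTorch-style sketch of whitening activations via Newton's iteration, in the spirit of IterNorm; the function name, hyperparameters (T, eps), and the (d, m) data layout are my own assumptions for this sketch and do not reflect the repositories' actual APIs.

import torch

def iterative_whitening(x, T=5, eps=1e-5):
    # Illustrative sketch (not the repository's API): whiten activations via
    # Newton's iteration. x has shape (d, m): d features, m samples.
    d, m = x.shape
    xc = x - x.mean(dim=1, keepdim=True)                          # center each feature
    sigma = xc @ xc.t() / m + eps * torch.eye(d, dtype=x.dtype)   # regularized covariance
    tr = torch.diagonal(sigma).sum()
    sigma_n = sigma / tr                                          # trace-normalize so the iteration converges
    p = torch.eye(d, dtype=x.dtype)
    for _ in range(T):                                            # Newton's iteration toward sigma_n^{-1/2}
        p = 1.5 * p - 0.5 * p @ p @ p @ sigma_n
    return (p / tr.sqrt()) @ xc                                   # approximately decorrelated activations

# Example usage (assumed shapes): whiten 64 features over a mini-batch of 256 samples.
# x = torch.randn(64, 256); x_white = iterative_whitening(x)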
Miscellaneous
I also have a great interest in hip-hop dance (particularly breaking), singing, and guitar. You can find some videos of my performances on my Meipai homepage and on YouTube.