Research
Weakly-Supervised Learning
Uncertainty Quantification in Classification and Regression
Deep Learning Phenomena
Publications († Equal Contribution)
- Revisiting Consistency Regularization for Deep Partial Label Learning.
  D.-D. Wu†, D.-B. Wang†, M.-L. Zhang.
  In: Proceedings of the 39th International Conference on Machine Learning (ICML), 2022.
- Adaptive Graph Guided Disambiguation for Partial Label Learning.
  D.-B. Wang, M.-L. Zhang, L. Li.
  IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2022.
- Rethinking Calibration of Deep Neural Networks: Don't Be Afraid of Overconfidence.
  D.-B. Wang, L. Feng, M.-L. Zhang.
  In: Advances in Neural Information Processing Systems (NeurIPS), 2021.
- Learning from Complementary Labels via Partial-Output Consistency Regularization.
  D.-B. Wang, L. Feng, M.-L. Zhang.
  In: Proceedings of the 30th International Joint Conference on Artificial Intelligence (IJCAI), 2021.
- Learning from Noisy Labels with Complementary Loss Functions. [Paper] [Code]
  D.-B. Wang, Y. Wen, L. Pan, M.-L. Zhang.
  In: Proceedings of the 35th AAAI Conference on Artificial Intelligence (AAAI), 2021.
- Multi-View Multi-Label Learning with View-Specific Information Extraction.
  X. Wu, Q.-G. Chen, Y. Hu, D.-B. Wang, X. Chang, X. Wang, M.-L. Zhang.
  In: Proceedings of the 28th International Joint Conference on Artificial Intelligence (IJCAI), 2019.
- Adaptive Graph Guided Disambiguation for Partial Label Learning. [Paper] [Code]
  D.-B. Wang, L. Li, M.-L. Zhang.
  In: Proceedings of the 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD), 2019.
Honors
National Scholarship for New PhD Students 2019
Outstanding Graduate Award of Southwest University 2019
National Scholarship 2018
First Class Academic Scholarship of Southwest University 2016, 2017, 2018
Academic Services
PC Member for ICML (2022), IJCAI (2022), AAAI (2022, 2021), ECML/PKDD (2022), ACML (2021), ICMLA (2021), IAAI (2022, 2021).
Invited Journal Reviewer for IEEE TPAMI, ACM TIST, ACM TKDD, IEEE TMM, JCST, Neurocomputing.