
EECS Colloquium: Moving beyond scale-driven learning — Hangfeng He, UPenn

ZOOM

About the event

Abstract:

Machine learning, especially deep learning, has been recognized as a monumentally successful approach to many data-intensive applications across a broad range of domains. Despite this success, recent progress relies mainly on scaling up existing learning methods in terms of model size or training data, consuming enormous amounts of time and energy. My research therefore focuses on moving beyond scale-driven learning to avoid large-scale training data and overly complicated models. My prior work has been driven by two problems: alleviating the supervision bottleneck and interpreting the behaviors of deep neural networks. The former reduces the demand for task-specific data, and the latter helps in designing simple and efficient models. In the future, I aspire to expand my research from the learning realm to other areas, including comprehending the mechanism of reasoning and analyzing the structure of data.

Three relevant papers:

  • Foreseeing the Benefits of Incidental Supervision (EMNLP 2021)
  • Exploring Deep Neural Networks via Layer-Peeled Model: Minority Collapse in Imbalanced Training (PNAS 2021)
  • Weighted Training for Cross-Task Learning (ICLR 2022)

Bio:
Hangfeng He is a fifth-year Ph.D. student in the Department of Computer and Information Science at the University of Pennsylvania, working with Dan Roth and Weijie Su. His research interests include machine learning and natural language processing, with a focus on moving beyond scale-driven learning. Specifically, he works on incidental supervision for natural language understanding, interpretability of deep neural networks, reasoning in natural language, and structured data modeling. He has published at top-tier venues in both machine learning and NLP.
