
Graph Representation Learning: Recent Advances and Open Challenges

Friday, December 4, 2020, 11:00 to 12:00
Zoom (online)

Virtual Informal Systems Seminar (VISS) Centre for Intelligent Machines (CIM) and Groupe d'Etudes et de Recherche en Analyse des Decisions (GERAD)

Speaker: William Hamilton, Assistant Professor, School of Computer Science, McGill University



Meeting ID: 910 7928 6959    Passcode: VISS


Abstract:

Graph-structured data is ubiquitous throughout the natural and social sciences, from telecommunication networks to quantum chemistry. Building relational inductive biases into deep learning architectures is crucial if we want systems that can learn, reason, and generalize from this kind of data. Recent years have seen a surge in research on graph representation learning, most prominently in the development of graph neural networks (GNNs). Advances in GNNs have led to state-of-the-art results in numerous domains, including chemical synthesis, 3D-vision, recommender systems, question answering, and social network analysis. In the first part of this talk I will provide an overview and summary of recent progress in this fast-growing area, highlighting foundational methods and theoretical motivations. In the second part of this talk I will discuss fundamental limitations of the current GNN paradigm. Finally, I will conclude the talk by discussing recent progress my group has made in advancing graph representation learning beyond the GNN paradigm.
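For readers unfamiliar with the GNN paradigm mentioned above, the following is a minimal sketch (not taken from the talk) of the message-passing computation at the core of most graph neural networks: each node aggregates its neighbours' feature vectors and combines them with its own. The function name, weight matrices, and toy graph are illustrative assumptions.

import numpy as np

def gnn_layer(A, H, W_self, W_neigh):
    """One message-passing GNN layer on an undirected graph.

    A        : (n, n) adjacency matrix (0/1 entries).
    H        : (n, d_in) node feature matrix.
    W_self   : (d_in, d_out) weights applied to a node's own features.
    W_neigh  : (d_in, d_out) weights applied to aggregated neighbour features.
    Returns  : (n, d_out) updated node features.
    """
    deg = A.sum(axis=1, keepdims=True)            # node degrees
    neigh_mean = (A @ H) / np.maximum(deg, 1.0)   # mean over each node's neighbours
    return np.maximum(0.0, H @ W_self + neigh_mean @ W_neigh)  # ReLU nonlinearity

# Tiny usage example: one layer applied to a 3-node path graph with random features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = rng.normal(size=(3, 4))
H1 = gnn_layer(A, H, rng.normal(size=(4, 8)), rng.normal(size=(4, 8)))
print(H1.shape)  # (3, 8)

Stacking several such layers lets information propagate over multi-hop neighbourhoods, which is the "relational inductive bias" the abstract refers to.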

Bio:
William (Will) Hamilton is an Assistant Professor in the School of Computer Science at McGill University, a Canada CIFAR AI Chair, and a member of the Mila AI Institute of Quebec. Will completed his PhD in Computer Science at Stanford University in 2018. He received the 2018 Arthur Samuel Thesis Award for the best Computer Science PhD thesis at Stanford University, the 2014 CAIAC MSc Thesis Award for the best AI-themed MSc thesis in Canada, as well as an honorable mention for the 2013 ACM Undergraduate Researcher of the Year. His interests lie at the intersection of machine learning, network science, and natural language processing, with a current emphasis on the fast-growing subject of graph representation learning.