SCALE@NTU Invited Talk: How Do Large Language Models Capture the Ever-changing World Knowledge? A Review of Recent Advances

08 Dec 2023, 10:30 AM - 11:30 AM | Public

This research seminar is organized by Singtel Cognitive and Artificial Intelligence Lab for Enterprises (SCALE@NTU). Please find below the registration information:

To attend the seminar physically (limited seats available), click here

To attend the seminar on Teams, click here

Abstract: Large language models (LLMs) have shown impressive performance in solving various tasks. However, because the world is constantly changing and new information is generated every day, a trained LLM can quickly become outdated, and re-training it is a time-consuming and resource-intensive process. Consequently, keeping LLMs up to date is a pressing concern. In this talk, I will review recent advances in aligning LLMs with ever-changing world knowledge without re-training from scratch, categorizing existing research and providing in-depth comparisons and discussion. I will also discuss open challenges and highlight future directions to facilitate research in this field.

Speaker: Ling Chen is a Professor in the School of Computer Science at the University of Technology Sydney (UTS). She leads the Data Science and Knowledge Discovery Laboratory (DSKD Lab) within the Australian Artificial Intelligence Institute (AAII) at UTS. Ling's recent research interests include anomaly detection, data representation learning, and dialogue and interactive systems. Her research has gained recognition from government agencies, through competitive grants such as ARC DP/LP/LIEF, and from industry partners, with contracted research support from organizations including Facebook Research and TPG Telecom. Ling serves as an Editorial Board member for journals including the IEEE Journal of Social Computing, the Elsevier Journal of Data and Knowledge Engineering, and Computer Standards and Interfaces.