Published on 04 Mar 2021

Making an impact on climate change: Prof Lin Weisi

Combining human and artificial intelligence, Prof Lin Weisi develops multimodal environmental monitoring technology.

From the lithosphere to the atmosphere, our warming climate affects our planet on multiple levels. It has dire consequences for coastal communities, whose homes and livelihoods are threatened by rising sea levels and intensifying rainfall, both of which can cause catastrophic flooding.

Tapping into artificial intelligence and the capabilities of human sensory systems like vision, NTU's Prof Lin Weisi develops technologies for multimedia applications such as multimodal environmental monitoring of air quality and sea-level rise.

“I always find it exciting to build technologies that are innovative, practical and beneficial to society,” says Prof Lin, Associate Chair (Research) at NTU’s School of Computer Science and Engineering.

By leveraging the latest in perception science, brain studies, psychology and sociology, Prof Lin’s research group develops intelligent image and video processing by modelling human visual system characteristics.

“The human visual system has unique mechanisms of visual perception to detect details and changes in the visual field, which can be translated into algorithms and system design for engineering solutions like picture acquisition, transmission, storage, enhancement and presentation,” Prof Lin adds.

Over the last 15 years, Prof Lin has developed a suite of core technologies, which include human visual attention modelling, visual quality of experience assessment, and estimation of just-noticeable-difference—the amount an image or video must be changed in order for a difference to be noticeable.
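To make the just-noticeable-difference idea concrete, the sketch below computes a per-pixel visibility threshold using one classical luminance-adaptation model from the image-processing literature (Chou and Li, 1995). It is a simplified, illustrative example rather than a representation of the specific models developed by Prof Lin's group; the function names and the assumption of 8-bit greyscale input are choices made only for this sketch.

```python
import numpy as np

def luminance_jnd(gray: np.ndarray) -> np.ndarray:
    """Per-pixel just-noticeable-difference (JND) threshold for an 8-bit
    greyscale image, based on a classical luminance-adaptation curve:
    pixel changes below the returned value are assumed to be invisible
    to a human viewer. Illustrative sketch only."""
    bg = gray.astype(np.float64)  # background luminance in [0, 255]
    return np.where(
        bg <= 127,
        17.0 * (1.0 - np.sqrt(bg / 127.0)) + 3.0,  # dark regions tolerate larger changes
        3.0 / 128.0 * (bg - 127.0) + 3.0,          # bright regions: threshold grows slowly
    )

def change_is_noticeable(original: np.ndarray, modified: np.ndarray) -> bool:
    """True if any pixel differs from the original by more than its JND."""
    diff = np.abs(modified.astype(np.float64) - original.astype(np.float64))
    return bool(np.any(diff > luminance_jnd(original)))
```

In applications such as compression or enhancement, a JND map of this kind indicates which distortions will go unnoticed, so processing effort and bits need not be spent on changes the eye cannot see.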

His team recently developed an advanced visual collaborative intelligence framework that combines human and artificial intelligence to take on various visual tasks, including accurate visual feature extraction; data privacy preservation; and energy-efficient data computing, storage and transmission.

A more ambitious aim of Prof Lin’s is to tap into sensory signals beyond vision. He wants to include the four other major human senses for multimedia applications such as passenger tracking and services in airports; e-commerce apps that incorporate audio-visual, touch, temperature, smell and taste cues; and multimodal environmental monitoring of air quality, industrial smoke, sea-level changes along coastlines and floods.

A Fellow of the Institution of Engineering and Technology and the Institute of Electrical and Electronics Engineers, Prof Lin has been named on Clarivate Analytics’ list of Highly Cited Researchers for the past two years, ranking among the top global scientists in the fields of engineering and cross-field (computer science and engineering) respectively. Prof Lin has co-authored 12 patents and developed more than ten major systems and modules that have been deployed with industry partners such as Panasonic, Japanese telecommunications company NTT DoCoMo, and Singapore’s ST Engineering and Singtel.

Prof Lin also has keen entrepreneurial instincts. In 2020, his team was selected by Lean LaunchPad Singapore, a governmental innovation and commercialisation platform, to commercialise the visual collaborative intelligence technology recently developed by his lab, with potential applications such as next-generation video surveillance, robotics and autonomous vehicles.

This article first appeared in NTU's research & innovation magazine Pushing Frontiers (issue #18, February 2021).