Research

 

Current Projects    
  Automatic Optical Inspection
 
This project is part of the research funded by the Delta-NTU Corporate Lab for Cyber-Physical Systems. We investigate the theory and practice of building a generic, deep-learning-based framework for Automatic Optical Inspection (AOI) of electronic components that works across different types of components and defects in the production line. Both surface-level and solder-joint defects are considered. The AOI system will be deployed in production lines to detect defects, highlight their locations and report their classes at reasonable processing speed. We aim to build a highly configurable and flexible manufacturing system able to support optimized scheduling of various items and parts in a production line.
     
  Function-based Shape Modeling
 
We use mathematical definitions and procedural representations to define the geometry, visual appearance and physical properties of virtual objects. We work with FReps, algebraic surfaces, implicit surfaces, CSG solids and volumetric objects, as well as parametric curves, surfaces and solids. We have developed FVRML/FX3D, a function-based extension of the Virtual Reality Modeling Language (VRML) and Extensible 3D (X3D) that allows mathematical formulae and function scripts to be typed directly in the code of these languages to define practically any type of geometry, appearance and physical properties, which can then be explored using haptic devices. We are also developing various interactive modelling tools for different applications, including medical simulations, free-form shape modelling, and computer science education.
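As an illustration only (a hedged sketch, not the group's FVRML/FX3D implementation), set-theoretic operations on function-defined shapes can be expressed with min/max, assuming the common inside-positive convention for the defining functions:

```python
# Sketch: CSG-style set operations on implicitly defined (FRep-style)
# shapes, assuming f > 0 inside, f = 0 on the border, f < 0 outside.
def sphere(cx, cy, cz, r):
    """Defining function of a sphere with center (cx, cy, cz), radius r."""
    return lambda x, y, z: r * r - ((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)

def union(f, g):
    """A point is inside the union if it is inside either shape."""
    return lambda x, y, z: max(f(x, y, z), g(x, y, z))

def intersection(f, g):
    """A point is inside the intersection if it is inside both shapes."""
    return lambda x, y, z: min(f(x, y, z), g(x, y, z))

a = sphere(0.0, 0.0, 0.0, 1.0)
b = sphere(1.5, 0.0, 0.0, 1.0)
blob = union(a, b)
print(blob(0.75, 0.0, 0.0) > 0)  # True: the point between the centers is inside
```

Smoother blends replace min/max with R-functions, which keep the same sign behaviour while being differentiable.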
     
  Tangible Images and Haptic Video Communication
 
We add a haptic interaction modality to visual rendering and to common video communication across the Internet. Haptic forces are either retrieved from images or video, or efficiently exchanged as asynchronous data streams across the Internet. We research how to do this in the most efficient way and without modifying existing video communication applications. We devise methods supporting haptic interaction with common desktop haptic devices, video cameras, depth-sensing controllers and wearable devices. We are working on applying the proposed methods to medical simulations and electronic shopping applications.
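One simple way to retrieve a force from an image (a hypothetical sketch for illustration, not the project's actual method) is to treat pixel intensity as a height field and push the haptic cursor down its gradient:

```python
def force_from_image(img, x, y, k=1.0):
    """Return a 2D force (fx, fy) at interior pixel (x, y), using central
    finite differences of the intensity; k is a stiffness gain.
    img is a 2D list of grayscale intensities."""
    gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
    gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
    # Push against the gradient: the cursor slides off bright ridges.
    return (-k * gx, -k * gy)

# Intensity increases to the right, so the force points left.
img = [[0, 1, 2],
       [0, 1, 2],
       [0, 1, 2]]
fx, fy = force_from_image(img, 1, 1)
```

In practice the force would be smoothed and updated at haptic rates (around 1 kHz), far faster than the video frame rate.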
     

  New Human-Computer Interaction
 
We add a new modality to common interactive shape modelling and virtual prototyping, allowing visual, audio, haptic and free-hand interactions to be combined in virtual environments and simulations. Hand motion is captured in different ways, ranging from various desktop haptic devices to optical motion-capture cameras, so that it becomes possible to start modelling with one type of device and then continue with another type of hand-tracking device. We design and implement a set of robust and efficient interactive hand gestures suitable for various engineering designs (virtual prototyping) and crafts (freeform shape modelling). We simulate existing hand-made assembling and modelling processes so that the required motor skills can be both trained and applied in the simulations. We also work on new ways of evaluating the quality of the interaction experience.
     
  Cyberworlds
 
Created intentionally or spontaneously, cyberworlds are information spaces and communities that immensely augment the way we interact, participate in business and receive information throughout the world. Cyberworlds seriously impact our lives and the evolution of the world economy by taking such forms as social networking services, 3D shared virtual communities and massively multiplayer online role-playing games. I coordinate the annual International Conferences on Cyberworlds.
 
Past Projects
    Interactive Free-form Shape Modeling
 
Interactive modification of the function model with concurrent visualization of the respective part of it provides both interactivity and any required level of detail, leading to photo-realistic appearance of the resulting shapes. This project continues as the Function-based Shape Modeling project.
     
    Function Representation (FRep)
 
This method represents geometric shapes with the inequality f(x,y,z) ≥ 0, where the value of the real function f is positive for points inside the shape, equal to zero on its border, and negative outside the shape. This project continues as the Function-based Shape Modeling project.
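For example (a minimal sketch of the definition above, not the project's code), the FRep defining function of a sphere of radius r centered at the origin:

```python
def sphere_frep(x, y, z, r=1.0):
    """f(x, y, z) = r^2 - x^2 - y^2 - z^2: positive inside the sphere,
    zero on the border, negative outside."""
    return r * r - (x * x + y * y + z * z)

print(sphere_frep(0.0, 0.0, 0.0))  # 1.0: the center is inside
print(sphere_frep(1.0, 0.0, 0.0))  # 0.0: on the border
print(sphere_frep(2.0, 0.0, 0.0))  # -3.0: outside
```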
     
    Interactive Segmentation of MRI Brain Data
 
Novel visualization algorithms developed specifically for segmentation purposes have been proposed along with a method for 3D interactive haptic correction of brain segmentation errors introduced by the fully automatic segmentation algorithms.
     
    Scientific Visualization
 
3D visualization of the electroencephalograms showing how the electrical signal changes through time. Its size and appearance visually reflect the brain activity. The software developed is an interactive program, which visualizes one or several signals by modeling the respective time-dependent 3D surfaces around the 3D human head.
     
    Virtual Orthopedic Surgery Training
 
While currently available VR surgery systems usually require expensive hardware and software, we developed a desktop VR orthopedic surgery training system that runs on commonly available personal computers. This project continues as the Tangible Images and Haptic Video Communication project.
     
    Parallel Computing for Visualization and Shape Modeling
 
Parallel computing is used for computer animation rendering and for interactive shape modeling based on implicit functions. We have also devised and implemented a lossless 3D compression algorithm that enables transferring gigabytes of scene representation files in RenderMan and mental images formats across the Internet.
     

    Virtual Campus
 
Virtual Campus is a shared collaborative VRML model of the real campus of Nanyang Technological University in Singapore. It can be accessed from any Internet-connected personal computer running MS Windows and Internet Explorer or Mozilla Firefox. This project continues as the Cyberworlds project.

 

Current PhD Students:

  • Ms. Dai Wenting, "Automated Smart Manufacturing Control Based on Image Processing and Machine Learning",
    DELTA-NTU Corp Lab project, co-supervisor Marius Erdt.

  • Mr. Mohammad Chegini, "Visual Interactive Data Analysis Supported by Novel Interaction Modalities",
    joint PhD project, co-supervisor Tobias Schreck.
  • Mr. Johannes Edelsbrunner, "Domain Specific Procedural Modeling of 3D Shapes" (pending degree confirmation),
    joint PhD project, co-supervisor Sven Havemann.
  • Ms. Xingzi Zhang, "Image-inspired Haptic Interaction" (pending degree confirmation),
    joint PhD project, co-supervisor Michael Goesele.

  • Mr. Jian Cui, "Mid-Air Hand Interaction with Optical Tracking for 3D Modelling" (thesis preparation),
    joint PhD project, co-supervisors Arjan Kuijper and Dieter Fellner.

  • Mr. Bin Weng, "Thin Deformable Tissues Modeling for Simulation of Arthroscopic Knee Surgery",
    NTU PhD project funded by MOE Tier-2 grant (thesis examination).

Researchers:

  • Mr. Guo Song (Research Associate, Fraunhofer Singapore)

Collaborators:

  • Dr. Olga Sourina (Principal Scientist, Fraunhofer Singapore)
  • Dr. Marius Erdt (Deputy Director, Fraunhofer Singapore)

Past Researchers and Students:

Past Visiting Researchers and Professors:

  • Prof. Stanislav Klimenko (visiting Professor, Moscow Institute of Physics and Technology, Russia)

  • Prof. Pang Ming-Yong (visiting Professor, Nanjing Normal University, China)

  • Prof. Hasan Ugail (visiting Professor, University of Bradford, UK)

Current Funding:

  • Delta-NTU Joint Corp Lab, project "Virtual Factory 4.0 Process Monitor Platform (SMA-RP4)",
    Research Area: Manufacturing (SMA), 01.07.2016 - 30.06.2019

Completed Funded Projects:

  • Ministry of Education of Singapore, Research Grant MOE T1 RG17/15 “Haptic Interaction with Images and
    Videos”, S$ 100,000, 1 year, 2015-2016
  • Ministry of Education of Singapore, Research Grant MOE2011-T2-1-006 “Collaborative Haptic Modeling for
    Orthopaedic Surgery Training in Cyberspace”, S$ 630,874, 2011-2015
  • Singapore National Research Foundation Interactive Digital Media R&D Program,
    Research Grant NRF2008IDM-IDM004-002 “Visual and Haptic Rendering in Co-Space”, S$ 1,248,000, 2008-2012
  • UK Engineering and Physical Sciences Research Council (EPSRC) travel funding EP/G067732,
    “Function Based Geometry Modelling Within Visual Cyberworlds”, GBP 19,710, 2 years, 2009-2011

  • SBIC Innovative Grant RP C-012/2006, “Improving Measurement Accuracy of Magnetic Resonance
    Brain Images to Support Change Detection in Large Cohort Studies”, S$ 605,700, 4 years, 2007-2011

  • NTU RG27/06. “Investigation, Modeling and Quantification of Brain Response to External Stimuli”,
    S$ 86,692, 3 years, 2007–2010

  • Singapore MOE Teaching Excellence Grant, “Cyber-learning with Cyber-instructors”, S$ 114,100.00,
    2 years, 2007-2009

  • Singapore AE@SG R&D Alliance Project, IDA, Work Package 2, “Grid-enabled Graphics and Animation
    Applications”, S$ 470,000, 3.25 years, 2004–2007

  • NTU RG01/04. “Interactive function based shape modeling”, S$ 35,521, 3 years, 2004-2006

  • NTU RG35/96. “Real time dynamic simulation—Virtual Campus”, S$ 73,700.00, 3 years, 1997-2000