Pang Wee Ching

I build robots in the little red dot...


Hello world, I am Wee Ching. I spend most of my time building and programming robots at Nanyang Technological University (NTU), Singapore.

Currently, I am a full-time research associate at the Singapore Centre for 3D Printing (SC3DP), a national research centre for expanding 3D printing capabilities in Singapore.

I am also a PhD student at NTU, School of MAE, advised by A/P Gerald Seet.

My PhD research focuses mainly on developing an anthropomorphic robot for telepresence applications. This includes robot design and software development for mobile robot navigation modules. I also develop dual-arm and head-gesturing tasks for humanoid-human interaction, as well as GUIs to control the robot.

To contact me...

You can contact me at weeching{at}ntu{dot}edu{dot}sg. Thanks for dropping by my personal website.


EDGAR (Expressions Display and Gesturing Avatar Robot) is a humanoid avatar robot for telepresence. It has a total of 28 degrees of freedom for mimicking the remote user's head and torso movements.

EDGAR is capable of moving each finger independently, enabling a large number of complex hand gestures. The robot's head is a rear-projection screen that can display the remote user's facial features and expressions.


MAVEN, which stands for "Mobile Avatar for Virtual Engagement by NTU", is an intelligent mobile telepresence system that permits the user to project his or her presence into a remote environment. This would enable the user to attend an overseas conference or meeting without leaving the comfort of his or her office.

Part of the work in this project involves the development of mobile holonomic robotic platforms. These platforms navigate within the remote environment autonomously and safely. On these mobile platforms, a display of the user is present. The user's presence is displayed via the real-time projection of video imagery on a translucent screen.

Alternatively, presence can be displayed with a physical humanoid avatar robot. This avatar robot mimics the remote user's physical actions such as head movements and arm gestures. It also displays real-time video imagery of the user's facial features and expressions.

Chapter

  • 1. Wong Choon Yue, Gerald Seet Gim Lee, Sim Siang Kok and Pang Wee Ching. (2012). A Hierarchically Structured Collective of Coordinating Mobile Robots Supervised by a Single Human. In Mobile Ad Hoc Robots and Wireless Robotic Systems: Design and Implementation.

Journal Paper

  • 1. Pang Wee Ching, Gerald Seet Gim Lee, Michael Lau Wai Shing and Aryo Wiman Nur Ibrahim. (2013). From Ground to Air: Extension of RoboSim to Model UAVs and Behaviors for Urban Operations. Journal of Unmanned System Technology, 1(1), 6.
  • 2. Pang Wee Ching, Gerald Seet Gim Lee and Yao Xiling. (2014). A study on high-level autonomous navigational behaviors for telepresence applications. Presence: Teleoperators and Virtual Environments, 23(2), 155-171.

Conference Paper

  • 1. Pang Wee Ching, Gerald Seet Gim Lee and Yao Xiling. (2013). A Multimodal Person-following System for Telepresence Applications. Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology (VRST 2013) (pp. 157-164). New York, NY, USA: ACM. [PDF]
  • 2. Pang Wee Ching, Gerald Seet Gim Lee, Aryo Wiman Nur Ibrahim and Michael Lau Wai Shing. (2012, October). Evaluation of Intelligent Mini UAV Design Parameters for Urban Operations within 3D Robotic Simulator. Paper presented at the 8th International Conference on Intelligent Unmanned Systems (ICIUS 2012), Singapore.
  • 3. Pang Wee Ching, Burhan and Gerald Seet. (2012). Design Considerations of a Robotic Head for Telepresence Applications. The 5th International Conference on Intelligent Robotics and Applications (ICIRA 2012), Lecture Notes in Computer Science (pp. 131-140). Montreal, Canada: Springer Berlin Heidelberg. [PDF]
  • 4. Gerald Seet Gim Lee, Pang Wee Ching, Burhan, Chen I-Ming, Viatcheslav V Iastrebov, William Gu Yuan Long and Wong Choon Yue. (2012). A Design for a Mobile Robotic Avatar - Modular Framework. 3DTV-Conference 2012: The True Vision - Capture, Transmission and Display of 3D Video (pp. 1-4). Zurich, Switzerland.
  • 5. Gerald Seet Gim Lee, Pang Wee Ching and Burhan. (2012, May). Towards the Realization of MAVEN - Mobile Robotic Avatar. Paper presented at the 25th International Conference on Computer Animation and Social Agents, Singapore.
  • 6. Wong Choon Yue, Gerald Seet Gim Lee, Sim Siang Kok and Pang Wee Ching. (2010). Single-human multiple-robot systems for urban search and rescue: Justifications, design and testing. 11th International Conference on Control Automation Robotics & Vision (ICARCV) (pp. 579-584).
  • 7. Aryo Wiman Nur Ibrahim, Pang Wee Ching, Gerald Seet Gim Lee, Michael Lau Wai Shing and Witold Czajewski. (2010). Moving Objects Detection and Tracking Framework for UAV-based Surveillance. 2010 Fourth Pacific-Rim Symposium on Image and Video Technology (PSIVT) (pp. 456-461).
  • 8. Pang Wee Ching, Gerald Seet Gim Lee, Burhan, Viatcheslav V Iastrebov and Michael Lau Wai Shing. (2010). Individually-Adjustable Stereoscopic TVS for the Remote Observation of Underwater Pipeline. 3rd International Conference on Underwater System Technology: Theory and Applications (USYS'10) (pp. 1-6). Cyberjaya, Malaysia.
  • 9. Wong Choon Yue, Gerald Seet Gim Lee, Sim Siang Kok and Pang Wee Ching. (2010). A framework for area coverage and the visual search for victims in USAR with a mobile robot. 2010 IEEE Conference on Sustainable Utilization and Development in Engineering and Technology (STUDENT) (pp. 112-118).

In the Media

Thanks to all the journalists and event organizers who have given my team and me the chance to demonstrate our work and provided media exposure. Thank you very much!


Recounting the Amazon Robotics Challenge Experience

Tuesday, August 1, 2017

I really cannot contain this joy anymore!!! Exploding with gladness because... we won big at the Amazon Robotics Challenge 2017!!!

We scored well, in fact very well, taking first, second and third prizes: first in the picking challenge, second in the stowing challenge and third in the final stow-and-pick challenge. In terms of total score, ours was the only team that scored above 600 points.

Two months ago, I joined Team Nanyang as a part-time software developer to participate in the Amazon Robotics Challenge 2017. They had much of the system up but lacked a robust object recognition solution. Hence, I developed a deep learning system that trains a CNN to recognize items during the picking and stowing tasks.

As I mentioned in my last post on the ARC, I wanted to learn and implement some of the new algorithms I had read about. I am so glad that I did. Through this opportunity, I learnt and tested the active segmentation technique, YOLO, some packing strategies and a little point-cloud-based object recognition.

With this knowledge, I developed a picking strategy based on the order list, as well as an accumulative object recognition technique (I call it recognition memory) to help remember items that had been stowed previously but were occluded by later stowed items. Finally, I used TensorFlow to implement a simple CNN classifier for object recognition using 2D images. However, it can take a long time to retrain a CNN classifier, so we had to tweak the training process a little to ensure that training on novel items could be completed within 15 minutes during the competition. It was quite a close shave that we managed to retrain on 16 new items within the time given.
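The recognition-memory idea can be sketched as a simple accumulative store. This is only a hypothetical illustration of the concept, not the team's actual code; the class, method names and item labels below are my own:

```python
from collections import defaultdict

class RecognitionMemory:
    """Accumulative record of items recognized in a bin, so an item seen
    in an earlier frame still counts even if later stowed items occlude it.
    (Illustrative sketch only; names are hypothetical.)"""

    def __init__(self):
        # item label -> highest confidence ever observed for that item
        self.seen = defaultdict(float)

    def update(self, detections):
        """Merge the current frame's detections (label -> confidence)."""
        for label, conf in detections.items():
            self.seen[label] = max(self.seen[label], conf)

    def contents(self, threshold=0.5):
        """Items believed to be in the bin, even if currently occluded."""
        return {label for label, conf in self.seen.items() if conf >= threshold}

memory = RecognitionMemory()
memory.update({"duct_tape": 0.9})    # first item stowed, fully visible
memory.update({"toothbrush": 0.8})   # duct tape now occluded, not detected
print(sorted(memory.contents()))     # ['duct_tape', 'toothbrush']
```

Taking the maximum confidence per label means a single confident sighting is never forgotten, which is exactly what you want when later stows hide earlier items from the camera.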

All in all, I LOVE the experience I gained from participating in the Amazon Robotics Challenge 2017, even though I didn't get the chance to go to Japan. (I couldn't go because I had the NDP pre-show at the same time.) Despite being in Singapore, I could feel the adrenaline through the updates I received from my teammates. I even got to re-program the system while I was in Singapore. Crazy, I know... but it was so exciting.


Edgar is going to NDP!!!

Tuesday, July 11, 2017

Finally, the news is out!!

Finally, I can tell everyone that we are going to be participating in the National Day Parade 2017. This year's parade theme is #OneNationTogether, and it will be showcasing some of the achievements that Singaporeans have accomplished as one nation together!
Edgar, being a Made-In-Singapore robot, has been head-hunted to host the parade. This is the first time a robot is going to host such a big event, together with Narain, Julie, Joakim and Nurul (see image, from left to right).

We are so honored and thankful that the NDP committee found Edgar. As Singaporeans ourselves, we are eager to demonstrate Edgar and inspire our nation to achieve more in the arts and sciences. Besides adding some futuristic elements to the parade, we hope that this will encourage more smart technologies (including robotics) to be developed in Singapore.

And rest assured -- robots and smart technologies WILL NOT take jobs away from us humans.
In fact, these technologies will create more job openings: mechanical, electrical and computer engineers, robot designers, electronic designers, content writers, software managers, AI programmers, app developers, artists, animators and so much more... However, this means that Singaporeans have to get creative, get hands-on and start creating things... Well, if we don't do it, other people will.

Alright, enough of my rambling... Happy National Day, Singapore!!! May this little red dot continue to be blessed with love, peace, joy and progress.



I am officially joining the Amazon Robotics Challenge 2017...

Thursday, June 15, 2017

After much persuasion and consideration, I have decided to join Team Nanyang to participate in the Amazon Robotics Challenge 2017.

This was a difficult decision because I have many work commitments at hand, and there is still so much to do for the competition. And I can only work on the ARC on a part-time basis. Geez... now I am worried about whether I can manage my time properly.

But why did I join the team?

Well, I was approached by Reeve and Yuan Yik on different occasions. They persuaded me to join the team so many times that it became harder and harder to refuse. You see, they are two of the many capable people in RRC. Reeve took part in the DARPA Robotics Challenge previously, and Yuan Yik won the MIT Hackathon this year. It's an honor to be recognized by them. Hendra has also recently joined the team; he is like my best friend at work, and he is very competent in both hardware and software.
Thus, my very first reason to join the challenge is PEOPLE... I feel that we have a very high chance of winning this competition because of this A-Team.

The second reason is that I have not participated in a competition for a long time... I personally feel that when you compete in a challenge like this, you work and learn more efficiently because you are under pressure to perform. This is not the same as working on a project, where you can try different methods and justify your results in a presentation. In a competition, you have to perform. So, I really want to PROVE myself through this competition.

Of course, I also want to learn from the team, as well as to implement some of the new algorithms I have read about. I always like to PROGRESS, but most of the time, I procrastinate. Hence, I hope that by joining, I will be spurred to get better.

Lastly, for the PRIZE. Not so much the money, but the honor, if we can win an international robotics competition.



ICRA 2017

Saturday, June 3, 2017

Super excited and glad to be at the ICRA 2017!!!!
This conference is one of the best robotics conferences in the world. And this year, it is held in Singapore and organized by RRC.
Therefore, some of us have the privilege of being at the conference without publishing any papers...
Some of the RRC researchers get to attend the conference as helpers.

We, however, are fortunate enough to have a booth at the exhibition hall to showcase EDGAR. We really have to thank the conference committee for that opportunity.

Here's a selfie group photo taken with the Tiago robot from PAL robotics and EDGAR, at our booth.

Besides being an exhibitor, I also helped out as an official photographer and as a technician helping delegates with our "door gift". The ICRA door gift this year is an electronic tablet... yes, each delegate receives an Android tablet loaded with the conference proceedings!!! Furthermore, the conference is held at Marina Bay Sands --- that means sumptuous fine food and excellent service. This has to be the most luxurious ICRA conference yet.



Confirmed!

Wednesday, May 17, 2017

Finally, I passed my qualifying examination...
More work to be done before I meet these gentlemen again.



NTU MAE Alumni Homecoming Dinner

Saturday, September 3, 2016

It has been an exciting day at school today because we were invited to the MAE homecoming dinner. Thanks to EDGAR again.
It was heartening to see different batches of graduates coming back to celebrate the 35th anniversary of MAE. I heard there were more than 200 attendees, including graduates from the very first batch of 1985. Amazing!
And for the first time, the dinner was held at the new North Spine Skydeck garden in NTU. It was such a unique experience!



A*STAR One North Fest 2016

Saturday, August 6, 2016

It is a great privilege to be invited by the Agency for Science, Technology and Research (A*STAR) to their inaugural One-North Fest.
The organisers made a promotional video to highlight the Day 1 event, and EDGAR was featured in it. Awesome!
Thank you very much!



ROS-Industrial Asia Pacific Workshop

Friday, July 15, 2016

It is such an honor to present our work at the ROS-I workshop!!
This event is co-organized by RRC and ARTC to bring the ROS-I workshop to Asia for the first time.

And guess who the guests of honor are? Brian Gerkey, of course, as well as Morgan Quigley and Dave Coleman.
It is so exciting for us to demonstrate EDGAR before these gurus in robotics and computer science.

At the final tea break, I had to summon all of my courage to approach Brian and Morgan.
In my mind, I was thinking that I have to talk to them. If I don't talk to them, I will never have a chance to talk to them again.. urg.. I will regret if I don't talk to them. What shall I talk to them about? Tell them that I have been using ROS since Diamondback.. talk to them about EDGAR.. ask them about ROS v2 or talk about Player and Stage.. Arg...  Just talk to them about anything. Just go ahead and talk to them!!!

And I'm glad I did.

I can't remember what we talked about at all because, not long into our conversation, Morgan said that he would like to give me a gift.

Then he fished their autographed book out of his bag and handed it to me. He asked me if I would like to have it.

Of course, I would like to have it!!

It was such a great surprise, and I totally lost my cool. I got so crazily elated that I kept saying "omg" and thanking them.
I got Reeve to take a picture of us, and this drew some attention - thereafter, everybody wanted to take pictures with them..
One stranger even borrowed my book and posed for a photo with them!!!
What the **** is this?

Anyway, at that time I was too happy to respond to anything, just grinning away.

What a lovely day!



Invitation to the NTU State of the University Address 2016

Tuesday, March 8, 2016

It is such an honor to be invited to attend the NTU State of the University Address 2016 as special guests.
Thanks to EDGAR, I was able to sit in the best seats in the Nanyang Auditorium during the Address.
This is a picture of EDGAR on the big screen during President Andersson's speech.
However... sigh... I feel that the video was not tastefully made. EDGAR looks stupid in the video... and so do I.



Video shoot by National Geographic Channel

Tuesday, March 1, 2016

This photo illustrates our vision of using EDGAR as a telepresence humanoid --- the ability to transmit physical interaction for telecommunication.

Here, you see Max McMurdo using EDGAR to "skype" with Mischa Pollack. Through the robot, Max is able to give his friend a physical hug and even play charades over the Internet.
These two TV hosts have been having a lot of fun stress-testing the robot!

Check out NatGeo TV for more information!