Pang Wee Ching

I build robots in the little red dot...


Hello world, I am Wee Ching. I spend most of my time building and programming robots at Nanyang Technological University (NTU), Singapore.

Currently, I am a full-time research fellow at RRIS, working on projects related to smart wheelchairs, perception and speech recognition.

I have also just graduated from my PhD programme at NTU's School of MAE. My PhD research focused mainly on developing an anthropomorphic robot for telepresence applications. This included robot design and software development for mobile robot navigation modules. I also developed dual-arm and head-gesturing tasks for humanoid-human interaction, as well as GUIs to control the robot.

To contact me...

You can contact me at weeching{at}ntu{dot}edu{dot}sg. Thanks for dropping by my personal website.


EDGAR (Expressions Display and Gesturing Avatar Robot) is a humanoid avatar robot for telepresence. It has a total of 28 degrees of freedom for mimicking the remote user's head and torso movements.

EDGAR can move each finger independently, allowing it to produce a large number of complex hand gestures. The robot's head is a rear-projection screen that displays the remote user's facial features and expressions.


MAVEN, which stands for "Mobile Avatar for Virtual Engagement by NTU", is an intelligent mobile telepresence system that permits the user to project his or her presence into a remote environment. This enables the user to attend an overseas conference or meeting without leaving the comfort of their office.

Part of the work in this project involves the development of mobile holonomic robotic platforms. These platforms navigate within the remote environment autonomously and safely. On these mobile platforms, a display of the user is present. The user's presence is displayed via the real-time projection of video imagery on a translucent screen.

Alternatively, presence can be displayed with a physical humanoid avatar robot. This avatar robot mimics the remote user's physical actions such as head movements and arm gestures. It also displays real-time video imagery of the user's facial features and expressions.

Journal Papers

  1. Pang, Wee Ching, Wong Choon Yue, and Gerald Seet. (2018). Exploring the use of robots for museum settings and for learning heritage languages and cultures at the Chinese Heritage Centre. Presence: Teleoperators and Virtual Environments, vol. 26, no. 4, pp. 420-435.
  2. Pang Wee Ching, Gerald Seet Gim Lee and Yao Xiling. (2014). A study on high-level autonomous navigational behaviors for telepresence applications. Presence: Teleoperators and Virtual Environments, vol. 23, no. 2, pp. 155-171.
  3. Pang Wee Ching, Gerald Seet Gim Lee, Michael Lau Wai Shing and Aryo Wiman Nur Ibrahim. (2013). From Ground to Air: Extension of RoboSim to Model UAVs and Behaviors for Urban Operations. Journal of Unmanned System Technology, 1(1), 6.

Chapters or Letters

  1. Dong Huixu, Guangbin Sun, Pang Wee-Ching, Ehsan Asadi, Dilip K. Prasad, and I-Ming Chen. (2018). Fast ellipse detection via gradient information for robotic manipulation of cylindrical objects. IEEE Robotics and Automation Letters (RA-L) vol. 3, no. 4 pp. 2754-2761.
  2. Wong Choon Yue, Gerald Seet Gim Lee, Sim Siang Kok, and Pang Wee Ching (2012). A Hierarchically Structured Collective of Coordinating Mobile Robots Supervised by a Single Human. In Mobile Ad Hoc Robots and Wireless Robotic Systems: Design and Implementation.

Conference Papers

  1. Albert Causo, Zheng-Hao Chong, Ramamoorthy Luxman, Yuan Yik Kok, Zhao Yi, Pang Wee-Ching, Ren Meixuan et al. (2018). A robust robot design for item picking. In IEEE International Conference on Robotics and Automation (ICRA) (pp. 7421-7426). IEEE.
  2. Chong Zheng-Hao, Ramamoorthy Luxman, Pang Wee-Ching, Zhao Yi, Ren Meixuan, Hendra Suratno Tju, Albert Causo, and I-Ming Chen. An innovative robotics stowing strategy for inventory replenishment in automated storage and retrieval system. In 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), pp. 305-310. IEEE, 2018.
  3. Pang Wee Ching, Wong Choon Yue and Gerald Seet Gim Lee. (2016). Design and development of EDGAR—A telepresence humanoid for robot-mediated communication and social applications. In Proceedings of the IEEE International Conference on Control and Robotics Engineering, Singapore, 2–4 April 2016; pp. 1–4
  4. Pang Wee Ching, Gerald Seet Gim Lee and Yao Xiling. (2013). A Multimodal Person-following System for Telepresence Applications. In Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology (VRST 2013) (pp. 157-164). New York, NY, USA: ACM. [PDF]
  5. Pang Wee Ching, Gerald Seet Gim Lee, Aryo Wiman Nur Ibrahim and Michael Lau Wai Shing. (2012, October). Evaluation of Intelligent Mini UAV Design Parameters for Urban Operations within 3D Robotic Simulator. Paper presented at The 8th International Conference on Intelligent Unmanned Systems 2012 (ICIUS 2012), Singapore.
  6. Pang Wee Ching, Burhan and Gerald Seet. (2012). Design Considerations of a Robotic Head for Telepresence Applications. In Proceedings of the 5th International Conference on Intelligent Robotics and Applications (ICIRA 2012), Lecture Notes in Computer Science (pp. 131-140). Montreal, Canada: Springer Berlin Heidelberg. [PDF]
  7. Gerald Seet Gim Lee, Pang Wee Ching, Burhan, Chen I-Ming, Viatcheslav V Iastrebov, William Gu Yuan Long and Wong Choon Yue. (2012). A Design for a Mobile Robotic Avatar - Modular Framework. In 3DTV-Conference 2012: The True Vision - Capture, Transmission and Display of 3D Video (pp. 1-4). Zurich, Switzerland.
  8. Gerald Seet Gim Lee, Pang Wee Ching and Burhan. (2012, May). Towards the Realization of MAVEN - Mobile Robotic Avatar. Paper presented at The 25th International Conference on Computer Animation and Social Agents, Singapore.
  9. Wong Choon Yue, Gerald Seet Gim Lee, Sim Siang Kok and Pang Wee Ching. (2010). Single-human multiple-robot systems for urban search and rescue: Justifications, design and testing. In 11th International Conference on Control Automation Robotics & Vision (ICARCV) (pp. 579-584).
  10. Aryo Wiman Nur Ibrahim, Pang Wee Ching, Gerald Seet Gim Lee, Michael Lau Wai Shing and Witold Czajewski. (2010). Moving Objects Detection and Tracking Framework for UAV-based Surveillance. In 2010 Fourth Pacific-Rim Symposium on Image and Video Technology (PSIVT) (pp. 456-461).
  11. Pang Wee Ching, Gerald Seet Gim Lee, Burhan, Viatcheslav V Iastrebov and Michael Lau Wai Shing. (2010). Individually-Adjustable Stereoscopic TVS for the Remote Observation of Underwater Pipeline. In 3rd International Conference on Underwater System Technology: Theory and Applications 2010 (USYS'10) (pp. 1-6). Cyberjaya, Malaysia.
  12. Wong Choon Yue, Gerald Seet Gim Lee, Sim Siang Kok and Pang Wee Ching. (2010). A framework for area coverage and the visual search for victims in USAR with a mobile robot. In 2010 IEEE Conference on Sustainable Utilization and Development in Engineering and Technology (STUDENT) (pp. 112-118).

In the Media

Thanks to all the journalists and event organizers who have given my team and me the chance to demonstrate our work and provided media exposure. Thank you very much!


We were on Reuters news!

Wednesday, September 11, 2019

Well... This is one major update of my life! Besides an interview with Reuters, I have a little one in my arms.



Working with Nas Daily!

Tuesday, July 9, 2019

This is so amazing!!! Awesome overload!!! We had this incredible opportunity to work with Nas Daily's Nuseir and Agon today. It is really a blessing to work alongside talented people like them; they were just so passionate and so full of energy during their video shoot.



Highlights of 2018!

Saturday, December 15, 2018

My apologies for not blogging this year. Many things have happened at work and at home, so please pardon me.
Nevertheless, I would like to thank the university for making a video to highlight some of our work on the EDGAR robot this year.



Ted talk at TedxNTU!

Saturday, October 7, 2017

This is so awesome!!!! We have been invited to give a Ted talk at TedxNTU! We are so thankful for this opportunity where we can learn, from these talented Ted organizers, on how to prepare for a talk. The event was well executed, the crowd was exuberant and the speakers were eloquent. There were also a number of side performances as well as a livestream post-talk interview.



The National Day Parade 2017

Wednesday, August 9, 2017

I found this very beautiful photo of Edgar at the National Day Parade 2017. Thank you so much, Singapore, for having us at the national party. It's really a great honor. We got to know so many people and learnt so much about deploying a robot outdoors.
Let's continue to progress as a nation and may we have more opportunities to make Edgar even better!
Once again, Happy National Day!



Recounting the Amazon Robotics Challenge Experience

Tuesday, August 1, 2017

I really cannot contain this joy anymore!!! Exploding with gladness because... We won big at the Amazon Robotics Challenge 2017!!!

We scored well, in fact very well, securing a first, a second and a third prize in total. We got second place in the stowing challenge, first prize for the picking challenge and third for the final stow-and-pick challenge. In terms of total score, ours was the only team that scored above 600 points.

Two months ago, I joined Team Nanyang as a part-time software developer to participate in the Amazon Robotics Challenge 2017. They had much of the system up, but lacked a robust object recognition solution. Hence, I developed a deep learning system for the team, using a CNN to train on and recognize items during all picking and stowing tasks.

As I mentioned in my last post on the ARC, I wanted to learn and implement some of the new algorithms that I had read about. I am so glad that I did. Through this opportunity, I learnt and tested an active segmentation technique, YOLO, some packing strategies and a little bit of point-cloud-based object recognition.

With this knowledge, I developed a picking strategy based on the order list, as well as an accumulative object recognition technique (I call it recognition memory) to help remember items that were stowed previously but are occluded by the currently stowed items. Finally, I used TensorFlow to implement a simple CNN classifier for object recognition using 2D images. However, it can take a long time to retrain a CNN classifier, so we had to tweak the training process a little to ensure that training on novel items could be completed within 15 minutes during the competition. It was really quite a close shave that we managed to retrain on 16 new items within the time given.
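The recognition-memory idea can be sketched roughly like this (a minimal illustration only, not the actual competition code; the class and method names here are hypothetical): each bin remembers every item stowed into it, so an item hidden by later stows still appears as a candidate when the bin is searched.

```python
# Minimal sketch of an "accumulative recognition memory" for bins.
# All names are illustrative; the real system also fused CNN detections.

class RecognitionMemory:
    def __init__(self):
        self._bins = {}  # bin_id -> list of item labels, in stow order

    def record_stow(self, bin_id, item):
        """Remember that `item` was stowed into `bin_id`."""
        self._bins.setdefault(bin_id, []).append(item)

    def record_pick(self, bin_id, item):
        """Forget an item once it has been picked out of the bin."""
        if item in self._bins.get(bin_id, []):
            self._bins[bin_id].remove(item)

    def candidates(self, bin_id, visible_items):
        """All items believed to be in the bin: currently visible
        detections plus remembered items that are now occluded."""
        remembered = self._bins.get(bin_id, [])
        occluded = [it for it in remembered if it not in visible_items]
        return list(visible_items) + occluded

memory = RecognitionMemory()
memory.record_stow("bin_A", "toothbrush")
memory.record_stow("bin_A", "notebook")  # may now occlude the toothbrush
# Only the notebook is visible, but the toothbrush is still a candidate:
print(memory.candidates("bin_A", ["notebook"]))  # ['notebook', 'toothbrush']
```

The point of keeping this memory per bin is that the recognizer never has to "see through" occlusions; it only needs to reconcile what it detects now with what was stowed before.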

All in all, I LOVE the experience I gained from participating in the Amazon Robotics Challenge 2017, even though I didn't get the chance to go to Japan. (I couldn't go because I had the NDP Pre-Show at the same time.) Despite being in Singapore, I could feel the adrenaline through the updates I received from my teammates. I also got to re-program the system while I was in Singapore. Crazy, I know... but it was so exciting.



Edgar is going to NDP!!!

Tuesday, July 11, 2017

Finally, the news is out!!

Finally, I can tell everyone that we are going to be participating in the National Day Parade 2017. This year's parade theme is #OneNationTogether, and it will be showcasing some of the achievements that Singaporeans have accomplished as one nation together!
Edgar, being a Made-In-Singapore robot, has been head-hunted to host the parade. This is the first time a robot is going to host such a big event, together with Narain, Julie, Joakim and Nurul (see image, from left to right).

We are so honoured and thankful that the NDP committee found Edgar. As Singaporeans ourselves, we are eager to demonstrate Edgar and inspire our nation to achieve more in arts and science. Besides adding some futuristic elements to the parade, we hope that this would encourage more smart technologies (including robotics) to be developed in Singapore.

And rest assured -- robots and smart technologies WILL NOT take jobs away from us humans.
In fact, these technologies will create more job openings for mechanical, electrical and computer engineers, robot designers, electronic designers, content writers, software managers, AI programmers, app developers, artists, animators and so much more... However, this means that Singaporeans have to get creative, get hands-on and start creating things... Well, if we don't do it, then others will.

Alright, enough of my rambling... Happy National Day, Singapore!!! May this little red dot continue to be blessed with love, peace, joy and progress.



I am officially joining the Amazon Robotics Challenge 2017...

Thursday, June 15, 2017

After much persuasion and consideration, I have decided to join Team Nanyang to participate in the Amazon Robotics Challenge 2017.

This was a difficult decision because I have many work commitments at hand and there is still so much to do for the competition. Furthermore, I can only work on the ARC on a part-time basis. Geez... now I am worried whether I can manage my time properly.

Nevertheless, I hope that I can use this opportunity to learn as much as I can from the team, and to implement some of the new algorithms that I have read about. It is really exciting to be able to get involved and compete in such an international robotic competition!!!



ICRA 2017

Saturday, June 3, 2017

Super excited and glad to be at the ICRA 2017!!!!

The ICRA conference is one of the best robotics conferences in the world. And this year, it is held in Singapore! As the director of RRC (Prof Chen I-Ming) is a member of the programme committee, RRC researchers could volunteer to help out at the conference. This gave us the privilege of being at the conference, interacting with top-notch roboticists and listening to ground-breaking research presentations.

We are also fortunate enough to have a booth at the exhibition hall to showcase EDGAR. We really have to thank the conference committee for that opportunity.
Here's a selfie group photo taken with the Tiago robot from PAL robotics and EDGAR, at our booth.

Besides being an exhibitor, I also helped out as an official photographer as well as a technician, providing help with using our "door gift". This year, the ICRA door gift is an electronic tablet... yes, each delegate will receive an Android tablet that contains the conference proceedings!!! Furthermore, the conference is held at Marina Bay Sands --- that means sumptuous fine food and excellent service. This has to be the most luxurious ICRA conference yet.



Confirmed!

Wednesday, May 17, 2017

Finally, I passed my qualifying examination...
More work to be done before I meet these gentlemen again.



NTU MAE Alumni Homecoming Dinner

Saturday, September 3, 2016

It has been an exciting day in school today because we were invited to the MAE homecoming dinner. Thanks to EDGAR again.
It is heartening to see different batches of graduates coming back to celebrate the 35th anniversary of MAE. I heard that there were more than 200 attendees, including graduates from the very first batch of 1985. Amazing!
And for the first time, the dinner was held at the new North Spine Skydeck garden in NTU. It's such a unique experience!



A*STAR One North Fest 2016

Saturday, August 6, 2016

It is a great privilege to be invited by the Agency for Science, Technology and Research (A*STAR) to their inaugural One-North Fest.
The organisers made a promotional video to highlight the Day 1 event, and EDGAR has been featured in it. Awesome!
Thank you very much!



ROS-Industrial Asia Pacific Workshop

Friday, July 15, 2016


It is so delightful to present our work at the ROS-I workshop!! This event was co-organized by RRC and ARTC to bring the ROS-I workshop to Asia for the first time.

And guess who are the guests of honor? Brian Gerkey of course, as well as Morgan Quigley and Dave Coleman.
It is so exciting for us to demonstrate EDGAR before these gurus in robotics and computer sciences.

At the final tea break, I had to summon all of my courage to approach Brian and Morgan.
In my mind, I was thinking that I have to talk to them. If I don't talk to them, I will never have a chance to talk to them again.. urg.. I will regret if I don't talk to them. What shall I talk to them about? Tell them that I have been using ROS since Diamondback.. talk to them about EDGAR.. ask them about ROS v2 or talk about Player and Stage.. Arg...  Just talk to them about anything. Just go ahead and talk to them!!!

And I'm glad I did.

I can't remember what we were talking about at all because not long into our conversation, Morgan said that he would like to give me a gift.

Then he fished their autographed book out of his bag and handed it to me. He asked me if I would like to have it.

Of course, I would love to have it!!

It was such a great surprise that I totally lost my cool. I got so crazily elated that I kept saying "omg" and thanking them.
I got Reeve to take a picture of us, and this drew some attention - thereafter, everybody wanted to take pictures with them..
One stranger even borrowed my book and posed for a photo with them!!!
What the **** is this?

Anyway, at that time I was too happy to respond to anything, just grinning away.

What a lovely day!



Video shoot by National Geographic Channel

Tuesday, March 1, 2016

This photo illustrates our vision of using EDGAR as a telepresence humanoid --- the ability to transmit physical interaction for telecommunication.

Here, you see Max McMurdo using EDGAR to "skype" with Mischa Pollack. Through the robot, Max is able to give his friend a physical hug and even play charades over the Internet.
These two TV hosts have been having a lot of fun stress-testing the robot!

Check out NatGeo TV for more information!