Ph.D. Student
The University of Waterloo, 2009, B.A.Sc., Mechatronics Engineering
The University of British Columbia, 2012, M.A.Sc., Mechanical Engineering
The University of British Columbia, Ph.D., Mechanical Engineering

I am currently a Ph.D. student in Mechanical Engineering at The University of British Columbia (UBC) in Vancouver, British Columbia, Canada.

I received my B.A.Sc. from The University of Waterloo in Mechatronics Engineering with the Biomechanics Option, and my M.A.Sc. from UBC in Mechanical Engineering. Since September 2009, I have been a research assistant at the Collaborative Advanced Robotics and Intelligent Systems (CARIS) Laboratory under the direction of Elizabeth Croft.

My research interests generally fall within the areas of human-machine interaction and biomedical engineering. My prior work has involved haptic displays, affective computing and human-computer interaction (HCI). I am currently investigating novel methods of performing human-robot co-operative manipulation of objects.

Research Description

Ph.D. Thesis: Human-Robot Shared Object Manipulation

Current industrial robots lack the abilities (dexterity, complex sensing and cognitive processing) that skilled workers possess and that are needed to perform many manufacturing tasks such as product assembly, inspection and packaging. For example, in the automotive industry, robots are used to perform tasks that are entirely repeatable and require little or no human intervention, such as painting, welding and pick-and-place operations. Such robots work in confined spaces isolated from human workers, as improper interactions could result in severe injury or death. Having already optimized production efficiency under these conditions, industries are now directing efforts toward achieving similar improvements in worker efficiency through the development of safe robotic assistants that can co-operate with workers.

The research project proposed in this document seeks to exploit this emerging paradigm shift in manufacturing systems. It is in this context that I propose to develop robot controllers and intuitive interaction strategies to facilitate co-operation between intelligent robotic assistants and non-expert human workers. This work is expected to focus on developing motion control models for interactions between participants that involve safe contact, sharing and hand-off of common payloads. These control systems will allow intelligent robots to co-operate with non-expert workers safely, intuitively and effectively within a shared workspace.

To attain this goal, I intend to draw on elements of safe, collaborative human-robot interaction (HRI) explored through previously conducted research [2, 3] to develop a preliminary motion control framework. Much of the hardware, communication algorithms and generalized interaction strategies necessary for designing this HRI already exist. However, a wide range of technological advancements necessary to support specific task-driven HRI, such as real-time gesture recognition, interaction role negotiation and robust safety systems [1], must still be developed. Studies investigating typical human-human collaborative interaction methods will therefore be used to supplement this work. Specific focus will be given to examining how humans use non-verbal communication to negotiate leading and following roles. Several basic gestures and behaviours will be studied, including co-operative lifting, hand-offs and trajectory control of objects. I aim to leverage these findings by developing a library of motion control strategies for mobile manipulator-type robots that are safe and ergonomic and that make efficient use of the worker's and the robotic assistant's skills and abilities.
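
To make the role-negotiation idea concrete, the sketch below blends the robot's planned motion with the human's input according to the sensed interaction force. It is a minimal illustration only; the thresholds, blending law and function names are hypothetical placeholders, not the controller under development.

```python
# Illustrative sketch only: continuous leader/follower role arbitration
# driven by the human's measured interaction force. All thresholds and
# the linear blending law are hypothetical placeholders.

def role_weight(human_force, f_low=2.0, f_high=15.0):
    """Map human interaction force (N) to a role weight alpha in [0, 1].

    alpha = 0: robot leads (human is passive);
    alpha = 1: robot yields fully and follows the human.
    """
    if human_force <= f_low:
        return 0.0
    if human_force >= f_high:
        return 1.0
    # Linear blend between the two force thresholds.
    return (human_force - f_low) / (f_high - f_low)

def commanded_velocity(v_plan, v_human, human_force):
    """Blend the robot's planned velocity with the human-driven velocity."""
    alpha = role_weight(human_force)
    return [(1 - alpha) * vp + alpha * vh for vp, vh in zip(v_plan, v_human)]

if __name__ == "__main__":
    # Robot plans 0.1 m/s in x; the human pushes the shared object at 8 N.
    print(commanded_velocity([0.1, 0.0], [0.25, 0.05], human_force=8.0))
```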

The control models constructed from these methods will be applied in the context of a specific use case representative of a typical production operation. The use case will consist of non-value-added activities within an automotive manufacturing process whose component tasks are deemed complex and diverse. Motion control strategies will be evaluated and refined on a robot platform through human participant studies involving component tasks typical of those seen in the use case. These control strategies will be assessed both subjectively, as they relate to the user (e.g., intuitiveness, perceived robot intelligence, ease of use), and objectively, through performance measures (e.g., time trials).

The significance of this research lies in the advancement of HRI and the development and deployment of a new class of industrial robots intended to work alongside human counterparts beyond the laboratory. Novel forms of admittance control will be developed with the explicit intention of driving HRI, co-operation and shared object handling. This work is expected to produce useful data and methods contributing to the development and application of safe, collaborative HRI and human-in-the-loop control systems. Although this research is directed towards applications in manufacturing, the knowledge acquired will be extendable to HRI in other domains including rehabilitation, homecare and early child development.
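
For context, the classical admittance law that such controllers build on renders the robot as a virtual mass-damper driven by the sensed human force. A minimal one-dimensional sketch, with illustrative (untuned) parameters, follows; it is a generic textbook form, not the novel controllers this work will develop.

```python
# Minimal 1-D admittance control sketch: the end-effector behaves as a
# virtual mass-damper, M*a + D*v = f_human, integrated at the control rate.
# Parameter values are illustrative, not tuned for any real platform.

def admittance_step(v, f_human, M=5.0, D=20.0, dt=0.001):
    """Advance the commanded velocity v (m/s) by one control step of dt (s)
    given the measured human force f_human (N)."""
    a = (f_human - D * v) / M      # virtual dynamics: M*a = f - D*v
    return v + a * dt

if __name__ == "__main__":
    v = 0.0
    for _ in range(1000):          # simulate 1 s of a constant 10 N push
        v = admittance_step(v, f_human=10.0)
    # Velocity approaches the steady state f/D = 10/20 = 0.5 m/s.
    print(f"velocity after 1 s: {v:.3f} m/s")
```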

[1] Breazeal, C. et al. “Effects of Nonverbal Communication on Efficiency and Robustness in Human-Robot Teamwork”, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 383-388, 2005.
[2] Fischer, K., Muller, J.P., Pischel, M., “Unifying Control in a Layered Agent Architecture,” Int. Joint Conf. on AI (IJCAI’95), Agent Theory, Architecture and Language Workshop, 1995.
[3] Moon, A., Panton, B., Van der Loos, H.F.M., Croft, E.A., "Safe and Ethical Human-Robot Interaction Using Hesitation Gestures," IEEE Conf. on Robotics and Automation, 2 pages, May 2010.

M.A.Sc. Thesis: The Haptic Affect Loop

Today, the vast majority of user interfaces in consumer electronic products are built around explicit channels of interaction with human users. Examples include keyboards, gamepads and various visual displays. Under normal circumstances, these interfaces provide a clear and controlled interaction between user and device [1]. However, problems with this paradigm arise when these interfaces divert a significant amount of the user's attention from more important tasks. For example, consider a person trying to adjust the car stereo while driving: the driver must divert some attention away from the primary task of driving to operate the device. Explicit interaction with peripheral devices can thus impair the user's ability to perform primary tasks effectively.

The goal of my research was to design and implement a fundamentally different approach to device interaction. Rather than relying on explicit modes of communication between user and device, I used implicit channels to decrease the device's demand on the user's attention. It is well known that human affective (emotional) states can be characterized by psycho-physiological signals measurable with off-the-shelf biometric sensors [2]. We proposed to measure the user's affective response and incorporate these signals into the device's control loop, so that the device could recognize and respond to user affect. We also put forward the notion that haptic stimuli delivered through a tactile display could serve as an immediate yet unobtrusive channel for the device to communicate to the user that it has responded to their affective state, thereby closing the feedback loop [3]. Essentially, this research defined a model for a unique user-device interface driven by implicit, low-attention communication. We theorized that this new paradigm would be minimally invasive, letting the user attend to primary tasks without being distracted by the peripheral device. We termed this process, in which affect recognition leads to changes in device behaviour that are then signalled back to the user through haptic stimuli, the Haptic-Affect Loop (HALO).
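
The structure of the loop can be sketched as follows. All sensor, classifier and actuator interfaces here are hypothetical stand-ins written for illustration; they are not the HALO implementation itself.

```python
# Schematic of the Haptic-Affect Loop (HALO): sense biometrics, infer affect,
# adapt device behaviour, and acknowledge the adaptation via a haptic cue.
# Every interface below is a hypothetical stand-in for real hardware APIs.
import random
import time

def read_biometrics():
    # Stand-in for a sensor driver (e.g., skin conductance, heart rate).
    return {"gsr": random.uniform(0.0, 1.0), "heart_rate": random.uniform(60, 100)}

def classify_affect(sample):
    # Toy classifier: high arousal if GSR and heart rate are both elevated.
    return "aroused" if sample["gsr"] > 0.7 and sample["heart_rate"] > 85 else "calm"

def adapt_device(state):
    # Stand-in for an implicit device response (e.g., lowering media volume).
    return f"device adapted for {state} user"

def haptic_ack():
    # Stand-in for a tactile display pulse confirming the adaptation.
    print("haptic pulse: gentle tap")

if __name__ == "__main__":
    for _ in range(5):                      # a few iterations of the loop
        state = classify_affect(read_biometrics())
        if state == "aroused":
            print(adapt_device(state))
            haptic_ack()                    # close the loop back to the user
        time.sleep(0.1)
```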

My focus within the HALO concept was on the design and analysis of the overall control loop. This required me to measure, model and optimize latency, flow and habituation between the user's affective state and HALO's haptic display. A related problem I needed to address was dimensionality: what aspects of a user's biometrics should be used to characterize affective response? For example, what combination of skin conductance, heart rate, muscle activity, etc., best indicates that a user is happy or depressed? As an extension of this problem, how can the environmental context surrounding a user be established to calibrate affect recognition (for example, jogging in the park versus working in the office)? Similarly, I needed to specify the dimensionality of the haptic channel that notifies the user of the device's response while maintaining the goal of not distracting the user: where (e.g., the back of the neck, a fingertip) and with what stimulus (e.g., soft tapping vs. an aggressive buzzer) should the haptic feedback be delivered?
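
As a toy illustration of this feature-selection question, the sketch below reduces raw biometric streams to a handful of candidate features. The particular features, units and window sizes are hypothetical examples, not the set used in the thesis.

```python
# Illustrative feature extraction for the dimensionality question: reduce
# raw biometric streams to a small candidate feature vector. The feature
# choices and window length are hypothetical examples only.
from statistics import mean, stdev

def gsr_features(gsr_window):
    """Summarize a window of skin-conductance samples (microsiemens)."""
    return {"gsr_mean": mean(gsr_window), "gsr_std": stdev(gsr_window)}

def hr_features(rr_intervals):
    """Summarize heart activity from R-R intervals (seconds)."""
    hr = [60.0 / rr for rr in rr_intervals]   # instantaneous heart rate (bpm)
    return {"hr_mean": mean(hr), "hr_std": stdev(hr)}

if __name__ == "__main__":
    features = {}
    features.update(gsr_features([0.41, 0.44, 0.52, 0.61, 0.58]))
    features.update(hr_features([0.85, 0.82, 0.80, 0.78]))
    print(features)   # candidate input vector for an affect classifier
```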

To validate the HALO concept, it was implemented in two use cases, both showcasing HALO's value in information network environments where attention is highly fragmented: the navigation of streaming media on a computer or portable device, and background communication in distributed meetings. The results of the research included new lightweight affect-sensing technologies, tactile displays and interaction techniques. This work complements and applies research in the areas of communications, haptics and biometric sensing.
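
As a rough illustration of the media-navigation use case, the sketch below places an audio bookmark whenever skin conductance rises sharply, as might happen when a listener is interrupted. The threshold, signal and function name are synthetic examples, not the deployed system.

```python
# Illustrative sketch of the media-navigation use case: mark a bookmark
# when skin conductance jumps abruptly (a possible sign of distraction).
# The jump threshold and the GSR trace are synthetic.

def bookmark_times(gsr_samples, timestamps, jump_threshold=0.15):
    """Return playback times at which a GSR jump suggests a distraction."""
    marks = []
    for prev, curr, t in zip(gsr_samples, gsr_samples[1:], timestamps[1:]):
        if curr - prev > jump_threshold:   # abrupt rise in conductance
            marks.append(t)
    return marks

if __name__ == "__main__":
    gsr = [0.40, 0.41, 0.62, 0.60, 0.59, 0.80]   # synthetic GSR trace
    t = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]           # seconds into the audio
    print(bookmark_times(gsr, t))                # -> [2.0, 5.0]
```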

For more information on this project, please refer to my Master's thesis, which can be found in the UBC cIRcle archives.

[1] C. D. Wickens and J. G. Hollands, Engineering Psychology and Human Performance, 3rd ed. Prentice Hall, 1999.
[2] M. Pantic and L. J. M. Rothkrantz, "Toward an affect-sensitive multimodal human-computer interaction," Proceedings of the IEEE, vol. 91, no. 9, pp. 1370-1390, Sep. 2003.
[3] S. Brewster and L. M. Brown, “Tactons: structured tactile messages for non-visual information display,” Proceedings of the fifth conference on Australasian user interface, vol. 28, pp. 15-23, 2004.

Publications

Pan, M.K.X.J., McGrenere, J., Croft, E.A., MacLean, K.E. (2013). Exploring the Role of Haptic Feedback in an Implicit HCI-Based Bookmarking Application. Submitted to IEEE Trans. Haptics, May 2013.

MacLean, K.E., Yohanan, S., Sefidgar, Y., Pan, M.K.X.J., Croft, E.A., McGrenere, J. (2012). Emotional Communication and Implicit Control through Touch. Proc. IEEE Haptics Symposium (HAPTICS '12), Vancouver, Canada, March 2012.

Pan, M.K.X.J., Chang, J.-S., Himmetoglu, G. H., Moon, A., Hazelton, T. W., MacLean, K. E., and Croft, E. A. (2011). Now, Where Was I? Physiologically Triggered Bookmarks for Audio Books. In Proc. of ACM Conference on Human Factors in Computing Systems (CHI ’11), Vancouver, Canada, May 2011.

Pan, M.K.X.J., Chang, J.-S., Himmetoglu, G.H., Moon, A., Hazelton, T.W., MacLean, K.E., Croft, E.A. (2011). Galvanic Skin Response-Derived Bookmarking of an Audio Stream. In CHI '11 Extended Abstracts on Human Factors in Computing Systems (CHI EA '11), ACM, New York, NY, USA, pp. 1135-1140.

Karuei, I., Hazelton, T.W., MacLean, K.E., Baumann, M., and Pan, M.K.X.J. (2010). Presenting a Biometrically-Driven Haptic Interaction Loop. In ACM Conf. on Human Factors in Computing Systems (CHI '10), Workshop on Whole Body Interaction, Atlanta, GA, 4 pages, April 2010.

Pan, M.K.X.J., Baumann, M.A., Hazelton, T.W., MacLean, K.E., Croft, E.A. (2010). Expressive Wearable Haptic Devices. In Proceedings of IEEE Haptics Symposium (HAPTICS '10), Waltham, MA, USA, IEEE Press, March 2010.

Other Publications

Pan, M.K.X.J., Pardasani, U., Peters, T., and Patel, R. (2009). Robotic Laparoscopy: Design of an Image-Guided Tool for Teleoperated Minimally Invasive Surgery. Presented at the 2nd Annual Mechatronics Engineering Symposium 2009, Waterloo, Canada, April 2009.

Pan, M.K.X.J., Pardasani, U., Yip, M. (2009). Design of a 3D Image-Guided Tele-Operated Laparoscopic Instrument. Technical report written for Robarts Imaging and Canadian Surgical Technologies and Advanced Robotics at the London Health Sciences Centre.

Pan, M.K.X.J. (2008). MotionStation3D Development and Deployment. Technical report written for the University of Western Ontario.

Contact Details

Institute for Computing, Information and Cognitive Systems (ICICS), Room X015
2366 Main Mall
Vancouver, British Columbia V6T 1Z4
Canada
Tel: 604.822.3147
