Ph.D.: Human-Robot Negotiation: Using Communication to Resolve Human-Robot Conflicts

People share spaces and objects with each other every day. When conflicts regarding access to these shared resources occur, people communicate with each other to negotiate a solution. But what should a robot do when such conflicts occur between a human and a robotic assistant? Answers to this question depend on the context of the situation. In order for robots to be successfully deployed in homes and workplaces, it is important for robots to be equipped with the ability to make socially and morally acceptable decisions about the conflict at hand. However, robots today are not very good at making such decisions. The objective of my research is to investigate an interactive paradigm of human-robot conflict resolution that does not involve complicated, artificial moral decision making. I am currently working on a robotic system that can communicatively negotiate about resource conflicts with its human partner using nonverbal gestures.

M.A.Sc.: Non-Verbal Gesture and Communication in Physical Human-Robot Interaction

Studies suggest that people feel more positively toward robots that work with people rather than those that replace them. This means that in order to create robots that can collaborate and share tasks with humans, human-human interaction dynamics must be understood – key components of which could be replicated in human-robot interaction.

My master’s research project focused on how a simple non-verbal gesture (like the jerky, hesitant motion of your hand when you and another person reach for the same last piece of chocolate at the same time) can be superimposed on the functional reaching motions of a robot, so that the robot can express its uncertainty to human users. This project led to the development of a characteristic motion profile, called the Acceleration-based Hesitation Profile (AHP), that a robotic manipulator can use to generate humanlike hesitation motions in response to resource conflicts (e.g., reaching for the same object at the same time).
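The AHP itself is defined in the thesis work; purely as an illustration of the idea (the profile shape and constants below are invented, not the published AHP), a hesitation can be sketched as a damped acceleration oscillation superimposed on a reach:

```python
import numpy as np

def hesitation_accel(t, a_max=1.0, t_pause=0.4):
    """Illustrative hesitation acceleration (NOT the published AHP):
    a sharp deceleration with a damped oscillation that settles to
    zero, giving the reach a brief, jerky pause."""
    return -a_max * np.exp(-3.0 * t / t_pause) * np.cos(2.0 * np.pi * t / t_pause)

# superimpose on a constant-velocity reach and integrate to velocity:
# the hand slows, wavers, and then resumes
t = np.linspace(0.0, 1.0, 201)
v = 1.0 + np.cumsum(hesitation_accel(t)) * (t[1] - t[0])
```

The key property being mimicked is that the motion decelerates and wavers rather than stopping abruptly, which is what reads as "hesitation" to an observer.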

Take a look at how the designed hesitations look in contrast to abrupt collision avoidance responses.
Designed hesitation responses (AHP):

Abrupt stopping responses:

2017: Negotiating with robots: meshing plans and resolving conflicts in human-robot collaboration,
2012: What Should a Robot Do?: Design and Implementation of Human-like Hesitation Gestures as a Response Mechanism for Human-robot Resource Conflicts,
Two-handed motion planning
FEATHERS (Functional Engagement in Assisted Therapy through Exercise Robotics)

Supervisors: Dr. Machiel Van der Loos and Dr. Elizabeth Croft.

I am currently involved in the FEATHERS project. My research focuses on investigating the effect of integrating vibrotactile feedback in a rehabilitation therapy system that corrects users’ movements.
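As a toy sketch of what corrective vibrotactile feedback can look like (the deadband and scaling here are invented, not the FEATHERS system's values), the cue intensity can grow with the user's off-path error:

```python
def vibration_intensity(error_mm, deadband_mm=10.0, max_error_mm=50.0):
    """Map a user's off-path error to a motor duty cycle in [0, 1].
    Thresholds are invented for illustration: no cue inside the
    deadband, then intensity grows linearly up to saturation."""
    if error_mm <= deadband_mm:
        return 0.0
    return min((error_mm - deadband_mm) / (max_error_mm - deadband_mm), 1.0)
```

A deadband avoids buzzing the user for small, clinically irrelevant deviations, while saturation keeps the cue comfortable for large errors.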

2015: MASc on An evaluation of the use of vibrotactile cues in bilateral upper-limb motion training with healthy adults and hemiparetic individuals,

Standing balance is controlled by several inputs, including vision, vestibular sense, and ankle proprioception. Research studies in this field actively engage and manipulate these input mechanisms to examine their effects on the balance output, mainly muscle actuation in the lower limbs. While significant progress has been made, it is often difficult to isolate a single input and test its effect on the output. The unique Robot for Interactive Sensor Engagement and Rehabilitation (RISER) has been developed in the UBC CARIS laboratory for controlling each sense independently to further our understanding of human balance control and to present new possibilities for the control of bipedal robots. We intend to use this system and the strategies developed to help safely rehabilitate people who have lost the ability to balance.

Researchers in our lab examine the human balance systems involved in maintaining anterior-posterior standing balance using a unique approach: subjects stand on a six-axis force plate mounted on a six-axis Stewart platform. The subjects are secured to the platform, so they cannot move independently of it. The forces that the subject applies to the force plate are fed back to the platform controller, creating a simulation of standing balance in which the subject has no risk of falling. Immersive 3D stereo display goggles provide visual balance cues, and galvanic vestibular stimulation (GVS) can be employed to produce vestibular input. Additionally, a two-axis ‘ankle-tilt’ system has been mounted on top of the platform to control ankle angle in the sagittal plane. This decouples ankle proprioception from vestibular input, as the ankles can be moved independently of the head.
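A minimal sketch of this force-feedback idea, assuming a point-mass inverted-pendulum body and invented parameters (this is not the RISER controller): the ankle torque the subject applies drives the simulated body, whose lean angle would then be rendered by the platform:

```python
import math

def simulate_balance(torque_fn, m=70.0, h=1.0, dt=0.001, steps=2000):
    """Sketch of robot-simulated standing balance: the subject's ankle
    torque (a stand-in function here) acts on an inverted-pendulum
    model of the body. Parameters are illustrative only."""
    g, I = 9.81, m * h * h          # point-mass pendulum inertia about the ankle
    theta, omega = 0.02, 0.0        # small initial forward lean (rad)
    for _ in range(steps):
        # gravity topples the pendulum; the subject's torque resists it
        alpha = (m * g * h * math.sin(theta) - torque_fn(theta, omega)) / I
        omega += alpha * dt
        theta += omega * dt
    return theta

# a stiff, damped 'ankle' keeps the simulated body upright over 2 s
upright = simulate_balance(lambda th, om: 1500.0 * th + 300.0 * om)
```

Because the fall happens only in simulation, the subject experiences realistic balance dynamics with no risk of actually falling.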

2014: MASc on The roles of ankle motion and the vestibular system in maintaining standing balance,

In the ongoing effort to make robots more humanlike, studying how people move and perform actions is a necessity. However, dynamic data collection is tricky when it comes to humans, due to a distinct lack of built-in software and USB ports. Happily, things have just gotten a whole lot easier for us in the CARIS Lab with the installation of the new Open Stage motion capture system in room X209.

Open Stage uses colour differentiation to generate a voxel (3D pixel) cloud of your subject, from which a wire-frame skeleton is derived. Translation and rotation data are captured for 21 joints on the skeleton and then stored in MATLAB as matrices. And the best part is that Open Stage is markerless, so just step into the capture area, strike a pose, and let the magic begin!
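Since only the joint count is stated here, the array shape below is an assumption about the export format; but once the translations are in matrix form, per-joint speeds fall out of a simple finite difference:

```python
import numpy as np

# Assumed layout: (frames, 21 joints, xyz) translations in metres.
# Real captures would be loaded from the exported matrices; random
# data stands in here.
frames, joints = 100, 21
rng = np.random.default_rng(0)
positions = rng.normal(size=(frames, joints, 3))

fps = 60.0  # assumed capture rate
velocities = np.diff(positions, axis=0) * fps      # finite differences, m/s
speeds = np.linalg.norm(velocities, axis=2)        # per-joint speed, (frames-1, 21)
```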

2014: MASc on The perception and control of weight distribution during sit-to-stand in hemiparetic individuals,
2013: Visiting Scholar (PhD Student) from The Chinese University of Hong Kong
Collaborative Human-focused, Assistive Robotics for Manufacturing (CHARM)

CHARM is a large multi-disciplinary, multi-institutional project in collaboration with General Motors of Canada (GM), which aims to advance safe human-robot interaction (HRI) in vehicle manufacturing industries. We investigate (1) robotic technology development: communication, control, and perception; and (2) system design methodology: interaction design, information coordination (situational awareness), and integration.

In CHARM, I initiate and conduct collaborative research on human-robot interaction design with other members of the team, manage documentation and reporting, and coordinate the project between UBC and the rest of the CHARM team.

2013: Postdoctoral Fellow
2013: MEng
CHARM (Collaborative Human-Focused Assistive Robotics for Manufacturing),
Optimal adaptive control in human standing balance

Supervisors: Drs. Elizabeth Croft, Mike Van der Loos, and Jean-Sébastien Blouin

Researchers have suggested different objectives for our neural balance controller: minimizing sway of our center of pressure, center of mass or head, or minimizing motor effort. Optimal control is an attractive architecture for modelling balance because it can achieve a weighted combination of these control objectives and includes mechanisms for controller adaptation. However, we have yet to observe that balance is optimal. For my research, I am testing whether human balance control is optimally adaptive using manipulated balance dynamics simulated by a robotic balance platform.
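As a generic illustration of this optimal-control framing (the model, weights, and gains below are invented, not the experimental setup), a discrete-time LQR can balance a linearized inverted pendulum by trading a sway penalty against an effort penalty:

```python
import numpy as np

# Linearized lean dynamics: theta'' = (g/h) * theta + u (torque per unit inertia)
g, h, dt = 9.81, 1.0, 0.01
A = np.array([[1.0, dt], [g / h * dt, 1.0]])    # Euler-discretized state matrix
B = np.array([[0.0], [dt]])                     # control input matrix
Q = np.diag([10.0, 1.0])                        # penalize lean angle and rate (sway)
R = np.array([[0.1]])                           # penalize motor effort

# Value iteration on the discrete Riccati equation
P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# Closed loop x <- (A - BK) x from a 0.05 rad initial lean, 10 s
x = np.array([[0.05], [0.0]])
for _ in range(1000):
    x = (A - B @ K) @ x
```

Changing the weights in Q and R shifts the controller between minimizing sway and minimizing effort, which is exactly the weighted combination of objectives described above.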

Adaptation of inter-limb control during robot-simulated human standing balance,
Human-Robot Handover

Handing over objects is a basic routine in any cooperative scenario. We humans perform many handovers in our everyday lives, and even though we rarely think about each handover, we generally execute them efficiently and with ease. However, object handover is still a challenging task for many current robot systems. When handing over an object to a person, it is very important for the robot to time the release of the object carefully. Letting go too soon could result in dropping the object, and letting go too late may result in the receiver pulling very hard on the object.

The goal of my research is to teach robots how to hand over objects to humans safely, efficiently, and intuitively, through understanding the haptic interaction in human-to-human handovers. By enabling robots to perform handovers well, we will be able to allow more natural human-robot interaction.
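One way to make the release timing concrete, sketched here with invented gains rather than the controller developed in this work: let the giver's grip force track the load it still carries, so release happens exactly when the receiver has taken the object's weight:

```python
def grip_force(receiver_load_n, weight_n, k=1.5, hold_n=0.5):
    """Sketch of a load-sharing release rule (gains invented): the
    giver's grip scales with the weight it still supports and drops
    to zero once the receiver carries the full load, avoiding both
    an early drop and a late tug-of-war."""
    remaining = max(weight_n - receiver_load_n, 0.0)
    return k * remaining + (hold_n if remaining > 0.0 else 0.0)
```

The small holding term keeps a safety margin against slip while any weight remains, which mirrors the observation that humans modulate grip force with sensed load during a transfer.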

2012: MASc on A human-inspired controller for robot-human object handovers,
2012: MASc on A new platform for studying human balance control,
2011: MASc on Biomechanical Analysis of Assisted Sit to Stand,
2011: MEng – Strategies for HRI in Non-Structured Environment
2011: MASc on Affecting affect effectively: investigating a haptic-affect platform for guiding physiological responses,
Regaining a lost target for visual servoing

Robots are quickly becoming incorporated into our daily lives. The development of prototype robotic assistants such as the Willow Garage PR2 and the NASA-GM Robonaut, and the rise of commercial robotic products such as the Roomba, have demonstrated both interest and applications for robots that can function successfully in human environments. Important to the successful adoption of robots in human workspaces is the ability of the robot to work in semi-structured and even unstructured environments, which are far different from current robotic workcells. Enabling this move is ongoing research in vision-guided robot control, or visual servoing, which allows robots to operate within the “vision-based” world that humans work in. Almost all examples of robot assistants to date incorporate one or more vision systems, typically a camera, mounted on the robot.

One common problem associated with using a camera as the feedback sensor is losing sight of the object that the camera was viewing. In surveillance, a suspect may run out of the camera’s field of view. In a rescue mission, an obstacle may occlude the victim from the camera. In such situations, the robot needs to acquire new data to locate the lost object. The new data could come from other sensor platforms, if available; alternatively, the robot could acquire new data by searching for the target based on the past data it has collected.

Irrespective of the visual task at hand prior to the target being lost, we want robots to find the lost target efficiently and then robustly locate it within a safe region of the acquired image. Search efficiency requires a high-speed search through an optimized trajectory, while robustness requires a cautious transition between the completed search and the restarted visual task once target visualization is regained. This will equip robots with an algorithm to handle lost-target scenarios and switch back to their visual tasks autonomously.
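The switching logic can be sketched as a small supervisor (the frame limit is invented, and a real system would also command the search trajectory itself):

```python
from enum import Enum

class Mode(Enum):
    TRACK = 1
    SEARCH = 2

def update_mode(mode, target_visible, frames_lost, lost_limit=5):
    """Toy track/search supervisor (limit invented): tolerate brief
    occlusions, start a search once the target has been missing for
    lost_limit frames, and resume tracking on re-detection."""
    if target_visible:
        return Mode.TRACK, 0
    frames_lost += 1
    if frames_lost >= lost_limit:
        return Mode.SEARCH, frames_lost
    return mode, frames_lost
```

The tolerance window prevents the robot from abandoning its visual task for every momentary occlusion, while re-detection cleanly resets the supervisor back to tracking.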

Where did it go?: regaining a lost target for robot visual servoing,
2009: MSc on Path Planning for Improved Target Visibility,
2009: MASc on Grasp Planning for Vision Guided Bin-Picking,
2006: PhD on Safety for Human-robot Interaction,
2005: MASc on Vision Assisted System for Industrial Robot Training,
2005: PhD on Haptic Rendering of Rigid Body Motion,
2005: PhD on Cooperative Robotic Sculpting,
2004: MASc on Equilibrium Point Control of a Programmable Mechanical Compliant Manipulator,
2004: MASc on Investigation of an EMG referenced control channel for grasp force supplementation,
2002: MASc on Integration of multirate heteroceptive sensor data in robotic system servo-loops,
2002: MSc on Tracking the joints of articulated objects without an a priori shape model,
2001: MASc on An investigation into the reduction of stick-slip friction in hydraulic actuators,
2001: MASc on Multisensor fusion within an Encapsulated Logical Devices Architecture,
2001: MASc on On-line smooth trajectory planning for manipulators,
SleepSmart: Wireless Sensing Technology for Sleep Disorder Diagnostics in Pediatric Populations

My goal is to create a bedsheet that has flexible, wireless sensing technology to measure physiological signals of whoever is on it. Measurements for heart rate, pulse oximetry, respiratory rate, and body position will help with the diagnosis of sleep conditions for children with neurodevelopmental disorders.

1999: MASc on Trex: taxonomy-based robot-control expert-system,
1998: MASc on Identification of salmon can-filling defects using machine vision,
1998: MASc on Analysis of the industrial automation of a food processing quality assurance workcell,
1998: PhD on ELSA: an intelligent multisensor integration architecture for industrial grading tasks,


2015: Undergraduate Research Assistant (USRA)
HRI Poseable Hand Gesture Study
2014: Undergraduate Research Assistant (Work-study)
Combined Mobility Base-Orthosis (COMBO)

Supervisors: Drs. Mike Van der Loos and Jaimie Borisoff

My project involves the development of a virtual prototyping tool using OpenSim for the evaluation of new assistive devices for those with mobility impairments. In particular, this tool will be used to study the power requirements of a new mobility device which aims to merge the benefits of a manual wheelchair with those of a walking exoskeleton.

2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (Volunteer)
2014: Undergraduate Research Assistant (DAAD-RISE)
2014: Undergraduate Research Assistant (Mitacs Globalink)
2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (Volunteer)
2014: Undergraduate Research Assistant (DAAD-RISE)
2014: Undergraduate Research Assistant (USRA)
2013: Undergraduate Research Assistant (USRA)
2013: Undergraduate Research Assistant (Mitacs Globalink)
2013: Undergraduate Research Assistant (Volunteer)
2013: Undergraduate Research Assistant (Mitacs Globalink)
Lab Assistant
2013: Undergraduate Research Assistant (Volunteer)
2013: Undergraduate Research Assistant (Volunteer)
2013: Undergraduate Research Assistant (USRA)
2013: Undergraduate Research Assistant (DAAD-RISE)


Faculty of Applied Science
5000 - 2332 Main Mall,
Vancouver, BC, V6T 1Z4, Canada
Tel: 604.822.6413
Department of Mechanical Engineering, UBC,
Vancouver, BC, Canada
Tel: 604.822.3147
Fax: 604.822.2403
