Alumni

Researchers

Ph.D. Thesis: Human-Robot Shared Object Manipulation

Current industrial robots lack the abilities (dexterity, complex sensing and cognitive processes) that skilled workers need to perform many manufacturing tasks such as product assembly, inspection and packaging. For example, in the automotive industry, robots are used to perform tasks that are highly repeatable and require little or no human intervention, such as painting, welding and pick-and-place operations. Such robots work in confined spaces isolated from human workers, as improper interactions could result in severe injury or death. Having optimized production efficiency under these conditions, however, industries are now directing efforts toward achieving similar improvements in worker efficiency through the development of safe robotic assistants that are able to co-operate with workers.

The research project proposed in this document seeks to exploit this emerging paradigm shift in manufacturing systems. It is in this context that I propose to develop robot controllers and intuitive interaction strategies to facilitate cooperation between intelligent robotic assistants and non-expert human workers. It is expected that this work will focus on developing motion control models for interactions between different participants that involve safe contact, sharing and hand-off of common payloads. These control systems will allow intelligent robots to co-operate with non-expert workers safely, intuitively and effectively within a shared workspace.

To attain this goal, I intend to draw on elements of safe, collaborative human-robot interaction (HRI) explored through previously conducted research [2, 3] to develop a preliminary motion control framework. Much of the hardware, communication algorithms and generalized interaction strategies necessary for designing this HRI already exist. However, a wide range of technological advancements necessary to support specific task-driven HRI, such as real-time gesture recognition, interaction role negotiation and robust safety systems [1], must still be developed. Thus, studies investigating typical human-human collaborative interaction methods will be used to supplement this work. Specific focus will be given to examining how humans use non-verbal communication to negotiate leading and following roles. Several basic gestures and behaviors will be studied, including co-operative lifting, hand-offs and trajectory control of objects. I aim to leverage these findings by developing a library of motion control strategies for mobile manipulator-type robots which are safe, ergonomic, and allow for the efficient use of the worker's and robotic assistant's skills and abilities.

The control models constructed from these methods will be applied in the context of a specific use case representative of a typical production operation. The use case will consist of non-value-added activities within an automotive manufacturing process having component tasks deemed to be complex and diverse. Motion control strategies will be evaluated and refined on a robot platform through human participant studies involving component tasks typical of those seen in the use case. These control strategies will be assessed both subjectively as they relate to the user (e.g., intuitiveness, perceived robot intelligence, ease of use) and objectively through performance measures (e.g., time trials).

The significance of this research lies in the advancement of HRI and the development and deployment of a new class of industrial robots intended to work alongside human counterparts beyond the laboratory. Novel forms of admittance control will be developed with the explicit intention of driving HRI, cooperation and shared object handling. This work is expected to produce useful data and methods contributing to the development and application of safe, collaborative HRI and human-in-the-loop control systems. Although this research is directed towards applications in manufacturing, the knowledge acquired will be extendable to HRI in other domains including rehabilitation, homecare and early child development.

[1] Breazeal, C. et al. “Effects of Nonverbal Communication on Efficiency and Robustness in Human-Robot Teamwork”, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 383-388, 2005.
[2] Fischer, K., Muller, J.P., Pischel, M., “Unifying Control in a Layered Agent Architecture,” Int. Joint Conf. on AI (IJCAI’95), Agent Theory, Architecture and Language Workshop, 1995.
[3] Moon, A., Panton, B., Van der Loos, H.F.M., Croft, E.A., “Safe and Ethical Human-Robot Interaction Using Hesitation Gestures,” IEEE Int. Conf. on Robotics and Automation, pp. 2, May 2010.

M.A.Sc. Thesis: The Haptic Affect Loop

Today, the vast majority of user interfaces in consumer electronic products are based on explicit channels of interaction with human users. Examples include keyboards, gamepads, and various visual displays. Under normal circumstances, these interfaces provide a clear and controlled interaction between user and device [1]. However, problems with this paradigm arise when these interfaces divert a significant amount of the user's attention from more important tasks. For example, consider a person trying to adjust the car stereo while driving: the driver must divert some attention away from the primary task of driving to operate the device. Explicit interaction with peripheral devices can thus impair the user's ability to perform primary tasks effectively.

The goal of my research was to design and implement a fundamentally different approach to device interaction. Rather than relying on explicit modes of communication between a user and device, I used implicit channels to decrease the device's demand on the user's attention. It is well known that human affective (emotional) states can be characterized by psycho-physiological signals that can be measured with off-the-shelf biometric sensors [2]. We proposed to measure user affective response and incorporate these signals into the device's control loop so that the device could recognize and respond to user affect. We also put forward the notion that haptic stimuli delivered through a tactile display can serve as an immediate yet unobtrusive channel for a device to communicate to the user that it has responded to their affective state, thereby closing the feedback loop [3]. Essentially, this research defined a model for a unique user-device interface driven by implicit, low-attention communication. We theorized that this new paradigm would be minimally invasive and would not distract the user from their primary task. We termed this process, in which affect recognition leads to changes in device behaviour that are then signalled back to the user through haptic stimuli, the Haptic-Affect Loop (HALO).

My focus within the HALO concept was on the design and analysis of the overall control loop. This required me to measure, model and optimize latency, flow and habituation between the user's affective state and HALO's haptic display. A related problem I needed to address was dimensionality: which aspects of a user's biometrics should be used to characterize affective response? For example, what combination of skin conductance, heart rate, muscle twitch, etc. best indicates that a user is happy or depressed? As an extension of this problem, how can the environmental context surrounding a user be established to calibrate affect recognition (for example, jogging in the park versus working in the office)? Similarly, I needed to specify the dimensionality of the haptic channel that notifies the user of the device's response while maintaining the goal of not distracting the user: where (e.g., back of the neck, fingertip) and with what stimulus (e.g., soft tapping versus an aggressive buzzer) should the haptic feedback be delivered?
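As an illustration of this dimensionality question, the sketch below combines two biometric channels into a single arousal score. The features, sensor ranges, weights and thresholds are hypothetical placeholders chosen for illustration, not the classifier actually used in HALO:

```python
# Illustrative sketch: fusing two biometric channels into one arousal
# estimate. All ranges, weights and thresholds here are hypothetical.

def normalize(value, low, high):
    """Clamp a raw sensor reading into [0, 1] over an expected range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def arousal_estimate(skin_conductance_us, heart_rate_bpm,
                     w_sc=0.6, w_hr=0.4):
    """Weighted combination of skin conductance (microsiemens) and
    heart rate (beats/min); returns a score in [0, 1]."""
    sc = normalize(skin_conductance_us, 1.0, 20.0)   # assumed EDA range
    hr = normalize(heart_rate_bpm, 50.0, 120.0)      # resting to elevated
    return w_sc * sc + w_hr * hr

def classify(score, calm_below=0.3, aroused_above=0.7):
    """Map the fused score onto coarse affect labels."""
    if score < calm_below:
        return "calm"
    if score > aroused_above:
        return "aroused"
    return "neutral"
```

Adding channels (muscle activity, respiration) or context-dependent ranges changes only the fusion step, which is why choosing the right feature set, rather than the fusion arithmetic, is the hard part of the problem.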

To validate the HALO concept, it was implemented in two use-cases – both showcasing HALO’s value in information network environments where attention is highly fragmented: the navigation of streaming media on a computer or portable device and background communication in distributed meetings. The results of the research included new lightweight affect sensing technologies, tactile displays and interaction techniques. This work complements and applies research in the areas of communications, haptics, and biometric sensing.

For more information on this project, please refer to my Master's thesis, which can be found in the UBC cIRcle archives.

[1] C. D. Wickens and J. G. Hollands, Engineering Psychology and Human Performance, 3rd ed. Prentice Hall, 1999.
[2] M. Pantic and L. J. M. Rothkrantz, “Toward an affect-sensitive multimodal human-computer interaction,” Proceedings of the IEEE, vol. 91, no. 9, pp. 1370-1390, Sep. 2003.
[3] S. Brewster and L. M. Brown, “Tactons: structured tactile messages for non-visual information display,” Proceedings of the fifth conference on Australasian user interface, vol. 28, pp. 15-23, 2004.

2018: Human-Robot Shared Object Manipulation (Ph.D. Thesis), https://dx.doi.org/10.14288/1.0364493
2012: The Haptic Affect Loop (M.A.Sc. Thesis), https://dx.doi.org/10.14288/1.0073274
Ph.D.: Human-Robot Negotiation: Using Communication to Resolve Human-Robot Conflicts

People share spaces and objects with each other every day. When conflicts regarding access to these shared resources occur, people communicate with each other to negotiate a solution. But what should a robot do when such conflicts occur between a human and a robotic assistant? Answers to this question depend on the context of the situation. In order for robots to be successfully deployed in homes and workplaces, it is important that they be equipped with the ability to make socially and morally acceptable decisions about the conflict at hand. However, robots today are not very good at making such decisions. The objective of my research is to investigate an interactive paradigm of human-robot conflict resolution that does not involve complicated, artificial moral decision making. I am currently working on a robotic system that can communicatively negotiate resource conflicts with its human partner using nonverbal gestures.

M.A.Sc.: Non-Verbal Gesture and Communication in Physical Human-Robot Interaction

Studies suggest that people feel more positively toward robots that work with people rather than those that replace them. This means that in order to create robots that can collaborate and share tasks with humans, human-human interaction dynamics must be understood – key components of which could be replicated in human-robot interaction.

My master’s research project focused on how a simple non-verbal gesture (like the jerky, hesitant motion of your hand when you and another person reach for the last piece of chocolate at the same time) can be superimposed on the functional reaching motions of a robot, so that the robot can express its uncertainty to human users. This project led to the development of a characteristic motion profile, called the Acceleration-based Hesitation Profile (AHP), that a robotic manipulator can use to generate humanlike hesitation motions in response to resource conflicts (e.g., reaching for the same object at the same time).
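To illustrate the idea of superimposing a hesitation on a reach, the sketch below integrates a piecewise launch-brake-retract acceleration pattern into a motion that ends at rest short of the target. The phase durations and magnitudes are illustrative only and do not reproduce the published AHP parameterization:

```python
# Sketch of a hesitation-like reach: launch toward the target, brake
# sharply, retract briefly, and settle at rest short of the target.
# Phase timings and magnitudes are illustrative, not the actual AHP.

def hesitation_profile(t):
    """Piecewise acceleration (m/s^2) over a ~1 s gesture."""
    if t < 0.3:          # launch phase: accelerate toward the target
        return 2.0
    elif t < 0.6:        # abrupt braking, stronger than the launch
        return -3.0
    elif t < 0.8:        # brief retraction decays back to rest
        return 1.5
    return 0.0           # hold at rest

def simulate(dt=0.001, duration=1.0):
    """Integrate the profile; return final (position, velocity)."""
    x, v, t = 0.0, 0.0, 0.0
    while t < duration:
        v += hesitation_profile(t) * dt
        x += v * dt
        t += dt
    return x, v
```

The hand ends at rest roughly 0.1 m into the reach, i.e., short of the contested object, which is the kinematic signature of a hesitation rather than a completed grasp or an emergency stop.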

Take a look at how the designed hesitations compare to abrupt collision avoidance responses.
Designed hesitation responses (AHP):

Abrupt stopping responses:

2017: Negotiating with robots: meshing plans and resolving conflicts in human-robot collaboration, open.library.ubc.ca/cIRcle/collections/ubctheses/24/items/1.0103462
2012: What Should a Robot Do?: Design and Implementation of Human-like Hesitation Gestures as a Response Mechanism for Human-robot Resource Conflicts, open.library.ubc.ca/cIRcle/collections/ubctheses/24/items/1.0348225
Upper-body Motion Coordination after Stroke: Insights from Motor Synergies
2017: Upper-body motion coordination after stroke : insights from kinematic and muscle synergies (Ph.D. Thesis), https://dx.doi.org/10.14288/1.0356397
2013: The use of physiological signals and motor performance metrics in task difficulty adaptation (M.A.Sc. Thesis), https://dx.doi.org/10.14288/1.0071962
Two-handed motion planning
FEATHERS (Functional Engagement in Assisted Therapy through Exercise Robotics)

Supervisors: Dr. Machiel Van der Loos and Dr. Elizabeth Croft.

I am currently involved in the FEATHERS project. My research focuses on investigating the effect of integrating vibrotactile feedback in a rehabilitation therapy system that corrects users’ movements.

2015: MASc on An evaluation of the use of vibrotactile cues in bilateral upper-limb motion training with healthy adults and hemiparetic individuals, http://hdl.handle.net/2429/54565

Standing balance is controlled by several inputs, including vision, vestibular sense, and ankle proprioception. Research studies in this field actively engage and manipulate these input mechanisms to examine their effects on the balance output, mainly muscle actuation in the lower limbs. While significant progress has been made, it is often difficult to isolate a single input and test its results on the output. The unique Robot for Interactive Sensor Engagement and Rehabilitation (RISER) has been developed in the UBC CARIS laboratory for controlling each sense independently to further our understanding of human balance control and to present new possibilities for the control of bipedal robots. We intend to use this system and the strategies developed to help safely rehabilitate people who have lost the ability to balance.

Researchers in our lab examine the human balance systems involved in maintaining anterior-posterior standing balance using a unique approach: subjects stand on a six-axis force plate mounted on a six-axis Stewart platform. The subjects are secured to the platform, so they cannot move independently of it. The forces that the subject applies to the force plate are fed back to the platform controller, creating a simulation of standing balance in which the subject has no risk of falling. Immersive 3D stereo display goggles provide visual balance cues, and galvanic vestibular stimulation (GVS) can be employed to produce vestibular input. Additionally, a two-axis ‘ankle-tilt’ system has been mounted on top of the platform to control ankle angle in the sagittal plane. This decouples ankle proprioception from vestibular input, as the ankles can be moved independently of the head.
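The feedback loop described above can be sketched as a simple simulation: the measured ankle torque drives an inverted-pendulum model of the body, and the resulting angle is what the platform would render back to the subject. The body parameters and the PD stand-in for the subject's balance response are illustrative assumptions, not the actual platform implementation:

```python
import math

# Sketch of the robot-simulated balance loop: force-plate torque drives
# an inverted-pendulum body model; the platform renders the resulting
# angle. Body parameters and the PD "subject" are illustrative.

M, H, G = 70.0, 1.0, 9.81      # body mass (kg), CoM height (m), gravity
I = M * H * H                  # point-mass inertia about the ankles

def subject_torque(theta, omega, kp=1200.0, kd=300.0):
    """Stand-in for the subject's balance response (PD on ankle angle)."""
    return kp * theta + kd * omega

def simulate(theta0=0.02, dt=0.001, duration=5.0):
    """Integrate the loop from a small forward lean (rad)."""
    theta, omega = theta0, 0.0
    max_angle = 0.0
    for _ in range(int(duration / dt)):
        torque = subject_torque(theta, omega)           # force-plate reading
        alpha = (M * G * H * math.sin(theta) - torque) / I
        omega += alpha * dt                             # body model update
        theta += omega * dt                             # platform renders theta
        max_angle = max(max_angle, abs(theta))
    return theta, max_angle
```

Because gravity is simulated rather than real, the same loop can render manipulated dynamics (e.g., scaled gravity or inertia) while the subject remains physically safe, which is the point of the platform.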

2014: MASc on The roles of ankle motion and the vestibular system in maintaining standing balance, circle.ubc.ca/handle/2429/50866

In the ongoing effort to make robots more humanesque, studying how people move and perform actions is a necessity. However, dynamic data collection is tricky when it comes to humans, due to a distinct lack of built-in software and USB ports. Happily, things have just gotten a whole lot easier for us in the CARIS Lab with the installation of the new Open Stage motion capture system in room X209.

Open Stage uses colour differentiation to generate a voxel (3D pixel) cloud of your subject, from which a wire-frame skeleton is derived. Translation and rotation data are captured for 21 joints on the skeleton and then stored in MATLAB as matrices. And the best part is that Open Stage is markerless, so just step into the capture area, strike a pose, and let the magic begin!
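As a sketch of what post-processing such capture data might look like, the code below takes frames of 21 joint positions and computes per-joint speeds by finite differences. The data layout (a list of (x, y, z) tuples per frame) is an assumption for illustration, not the Open Stage export format:

```python
# Sketch of post-processing markerless capture output: frames of 21
# joint positions (x, y, z), from which per-joint speeds are computed
# by finite differences. The frame layout is an assumed stand-in.

N_JOINTS = 21

def joint_speeds(frames, dt):
    """frames: list of frames, each a list of 21 (x, y, z) tuples.
    Returns per-frame, per-joint speed (m/s) via finite differences."""
    speeds = []
    for prev, curr in zip(frames, frames[1:]):
        frame_speeds = []
        for (x0, y0, z0), (x1, y1, z1) in zip(prev, curr):
            d = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
            frame_speeds.append(d / dt)   # distance moved over one frame
        speeds.append(frame_speeds)
    return speeds
```

The same differencing pattern extends to joint rotations or accelerations, which is what makes matrix-per-joint storage convenient for movement analysis.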

2014: MASc on The perception and control of weight distribution during sit-to-stand in hemiparetic individuals, circle.ubc.ca/handle/2429/51646
2013: Visiting Scholar (PhD Student) from The Chinese University of Hong Kong
Collaborative Human-focused, Assistive Robotics for Manufacturing (CHARM)

CHARM is a large multi-disciplinary, multi-institutional project in collaboration with General Motors of Canada (GM), which aims to advance safe human-robot interaction (HRI) in vehicle manufacturing industries. We investigate (1) robotic technology development: communication, control, and perception; and (2) system design methodology: interaction design, information coordination (situational awareness), and integration.

In CHARM, I initiate and conduct collaborative research on human-robot interaction design with other members of the team, manage documentation and reporting, and coordinate the project between UBC and the rest of the CHARM team.

2013: Postdoctoral Fellow
2013: MEng
CHARM (Collaborative Human-Focused Assistive Robotics for Manufacturing), http://charm.sites.olt.ubc.ca/
Optimal adaptive control in human standing balance

Supervisors: Drs. Elizabeth Croft, Mike Van der Loos, and Jean-Sébastien Blouin

Researchers have suggested different objectives for our neural balance controller: minimizing sway of our center of pressure, center of mass or head, or minimizing motor effort. Optimal control is an attractive architecture for modelling balance because it can achieve a weighted combination of these control objectives and includes mechanisms for controller adaptation. However, it has yet to be shown that human balance control is in fact optimal. For my research, I am testing whether human balance control is optimally adaptive, using manipulated balance dynamics simulated by a robotic balance platform.

Adaptation of inter-limb control during robot-simulated human standing balance, https://open.library.ubc.ca/cIRcle/collections/ubctheses/24/items/1.0300020
Human-Robot Handover

Handing over objects is a basic routine in any cooperative scenario. We humans perform many handovers in our everyday lives, and even though we rarely think about them, we generally execute them efficiently and with ease. However, object handover is still a challenging task for many current robot systems. When handing over an object to a person, it is very important for the robot to time the release of the object carefully: letting go too soon could result in dropping the object, and letting go too late may result in the receiver pulling very hard on the object.
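One simple way to formalize this release-timing problem is a load-sharing rule: the robot grips in proportion to the load it still carries and lets go once the receiver has taken most of the weight. The sketch below uses an illustrative gain and threshold, and is not the controller developed in this work:

```python
# Sketch of a load-sharing release rule for robot-to-human handover.
# The grip gain and release threshold are illustrative values.

OBJECT_WEIGHT = 10.0          # object weight (N), assumed known

def grip_force(robot_load, k_grip=1.5, f_min=1.0):
    """Grip in proportion to the load the robot still supports."""
    return max(f_min, k_grip * robot_load)

def handover(receiver_loads, release_fraction=0.1):
    """Step through sensed receiver load samples; return the step at
    which the gripper opens and the final grip force."""
    grip = grip_force(OBJECT_WEIGHT)
    for step, receiver_load in enumerate(receiver_loads):
        robot_load = max(0.0, OBJECT_WEIGHT - receiver_load)
        if robot_load <= release_fraction * OBJECT_WEIGHT:
            return step, 0.0            # receiver has the load: let go
        grip = grip_force(robot_load)   # keep holding, scaled to the load
    return None, grip                   # receiver never took the load
```

Releasing on the *transfer of load* rather than on a timer is what avoids both failure modes in the paragraph above: an early drop (load never transferred) and a tug-of-war (robot still gripping a load it no longer carries).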

The goal of my research is to teach robots how to hand over objects to humans safely, efficiently, and intuitively, through understanding the haptic interaction in human-to-human handovers. By enabling robots to perform handovers well, we will be able to allow more natural human-robot interaction.

2012: MASc on A human-inspired controller for robot-human object handovers, circle.ubc.ca/handle/2429/43489
2012: MASc on A new platform for studying human balance control, http://circle.ubc.ca/handle/2429/42139
2011: MASc on Biomechanical Analysis of Assisted Sit to Stand, http://circle.ubc.ca/handle/2429/33814
2011: MEng – Strategies for HRI in Non-Structured Environment
2011: MASc on Affecting affect effectively : investigating a haptic-affect platform for guiding physiological responses, http://circle.ubc.ca/handle/2429/34466
Regaining a lost target for visual servoing

Robots are quickly becoming incorporated into our daily lives. Development of prototype robotic assistants such as the Willow Garage PR2 and the NASA-GM Robonaut, and the rise of commercial robotic products such as the Roomba, have demonstrated both interest in and applications for robots that can function successfully in human environments. Important to the successful adoption of robots in human workspaces is the robot's ability to work in semi-structured and even unstructured environments, which are far different from current robotic workcells. Enabling this move is ongoing research in vision-guided robot control, or visual servoing, which allows robots to operate within the "vision-based" world that humans work in. Almost all robot assistants to date incorporate one or more vision systems, typically a camera mounted on the robot.

One common problem associated with using a camera as the feedback sensor is losing sight of the object that the camera was viewing. In surveillance, a suspect may run out of the camera's field of view. In a rescue mission, an obstacle may occlude the victim from the camera. In such situations, the robot needs to acquire new data to locate the lost object. The new data could be obtained from other sensor platforms if available; alternatively, the robot could search for the target based on the data it has already collected.

Irrespective of the visual task at hand prior to the target being lost, we want robots to find the lost target efficiently and then robustly locate it within a safe region of the acquired image. Search efficiency requires a high-speed search through an optimized trajectory, while robustness requires a cautious transition from the completed search back to the restarted visual task once the target is regained. This work will equip robots with an algorithm to handle lost-target scenarios and switch back to their visual tasks autonomously.
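As a concrete (if simplistic) example of searching for a lost target, the sketch below sweeps camera view directions along an expanding spiral centred on the last known bearing and stops once the target falls within the field of view. This generic pattern is a stand-in for the optimized search trajectory developed in this work:

```python
import math

# Sketch of an expanding spiral search for a lost visual target in
# pan-tilt space. The step size, sampling rate and field of view are
# illustrative values, not the optimized trajectory of the thesis.

def spiral_directions(center_pan, center_tilt, step=0.1, turns=5):
    """Yield (pan, tilt) samples on an Archimedean spiral (radians),
    starting at the last known bearing and growing outward."""
    t = 0.0
    while t < turns * 2 * math.pi:
        r = step * t / (2 * math.pi)        # radius grows per turn
        yield (center_pan + r * math.cos(t),
               center_tilt + r * math.sin(t))
        t += 0.2

def search(target, last_seen, fov=0.05):
    """Return the number of views needed to reacquire the target,
    or None if the spiral is exhausted without finding it."""
    for views, (pan, tilt) in enumerate(spiral_directions(*last_seen)):
        if math.hypot(target[0] - pan, target[1] - tilt) < fov:
            return views
    return None
```

Spiral-like patterns visit directions near the last known bearing first, so a target that has drifted only slightly is reacquired almost immediately while more distant ones take proportionally longer.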

Where did it go? : regaining a lost target for robot visual servoing, https://open.library.ubc.ca/cIRcle/collections/ubctheses/24/items/1.0223895
2009: MSc on Path Planning for Improved Target Visibility, http://circle.ubc.ca/handle/2429/4481
2009: MASc on Grasp Planning for Vision Guided Bin-Picking, http://circle.ubc.ca/handle/2429/12607
2006: PhD on Safety for Human-robot Interaction, http://circle.ubc.ca/handle/2429/18378
2005: MASc on Vision Assisted System for Industrial Robot Training, http://circle.ubc.ca/handle/2429/16716
2005: PhD on Haptic Rendering of Rigid Body Motion, http://circle.ubc.ca/handle/2429/17282
2005: PhD on Cooperative Robotic Sculpting
2004: MASc on Equilibrium Point Control of a Programmable Mechanical Compliant Manipulator, http://circle.ubc.ca/handle/2429/15543
2004: MASc on Investigation of an EMG referenced control channel for grasp force supplementation, http://circle.ubc.ca/handle/2429/15469
2001: MASc on An investigation into the reduction of stick-slip friction in hydraulic actuators, http://circle.ubc.ca/handle/2429/11786
2002: MASc on Integration of multirate heteroceptive sensor data in robotic system servo-loops, http://circle.ubc.ca/handle/2429/12153
2002: MSc in Tracking the joints of articulated objects without an a priori shape model, http://circle.ubc.ca/handle/2429/12131
2001: MASc on Multisensor fusion within an Encapsulated Logical Devices Architecture, http://circle.ubc.ca/handle/2429/12078
2001: MASc on On-line smooth trajectory planning for manipulators, http://circle.ubc.ca/handle/2429/11855
SleepSmart: Wireless Sensing Technology for Sleep Disorder Diagnostics in Pediatric Populations

My goal is to create a bedsheet that has flexible, wireless sensing technology to measure physiological signals of whoever is on it. Measurements for heart rate, pulse oximetry, respiratory rate, and body position will help with the diagnosis of sleep conditions for children with neurodevelopmental disorders.

1999: MASc on Trex: taxonomy-based robot-control expert-system, http://circle.ubc.ca/handle/2429/9720
1998: MASc on Identification of salmon can-filling defects using machine vision, http://circle.ubc.ca/handle/2429/7749
1998: MASc on Analysis of the industrial automation of a food processing quality assurance workcell, http://circle.ubc.ca/handle/2429/7858
1998: PhD on ELSA: an intelligent multisensor integration architecture for industrial grading tasks, http://circle.ubc.ca/handle/2429/9015

Support

2015: Undergraduate Research Assistant (USRA)
HRI Poseable Hand Gesture Study
2014: Undergraduate Research Assistant (Work-study)
Combined Mobility Base-Orthosis (COMBO)

Supervisors: Drs. Mike Van der Loos and Jaimie Borisoff

My project involves the development of a virtual prototyping tool using OpenSim for the evaluation of new assistive devices for those with mobility impairments. In particular, this tool will be used to study the power requirements of a new mobility device which aims to merge the benefits of a manual wheelchair with those of a walking exoskeleton.

2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (Volunteer)
2014: Undergraduate Research Assistant (DAAD-RISE)
2014: Undergraduate Research Assistant (Mitacs Globalink)
2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (USRA)
2014: Undergraduate Research Assistant (Volunteer)
2014: Undergraduate Research Assistant (DAAD-RISE)
2014: Undergraduate Research Assistant (USRA)
2013: Undergraduate Research Assistant (USRA)
2013: Undergraduate Research Assistant (Mitacs Globalink)
2013: Undergraduate Research Assistant (Volunteer)
2013: Undergraduate Research Assistant (Mitacs Globalink)
Lab Assistant
2013: Undergraduate Research Assistant (Volunteer)
2013: Undergraduate Research Assistant (Volunteer)
2013: Undergraduate Research Assistant (USRA)
2013: Undergraduate Research Assistant (DAAD-RISE)


Faculty of Applied Science
5000 - 2332 Main Mall,
Vancouver, BC, V6T 1Z4, Canada
Tel: 604.822.6413
CARIS Lab
Department of Mechanical Engineering, UBC,
Vancouver, BC, Canada
Tel: 604.822.3147
Fax: 604.822.2403
See contact page for addresses.
