
Human-robot Interaction

Successful integration of robotics into the human sphere requires significant research on methods for direct human-robot interaction (HRI), unhindered by handheld interfaces, and grounded in the physical world in which people work and play.

The overall goal of this research is to contribute knowledge, methods, and algorithms for natural, transparent HRI that enable humans and robots to interact effectively and cooperatively in unstructured, shared spaces. Pairs working together use motions, gestures, forces, and other cues to manage cooperative tasks, particularly where noise or other audio barriers preclude verbal communication. Other channels, such as physiological sensing, can provide cues about readiness and satisfaction. These cues signal transition-related information essential to the flow of collaboration, such as turn taking/giving, role changes (e.g., leader/follower, instructor/trainee), and state changes (e.g., ready/waiting/busy, unsure/confident).
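As a loose illustration of how such cues might be mapped onto interaction states, consider the following Python sketch. The cue choices, thresholds, and function names are hypothetical, not the lab's implementation.

    from enum import Enum, auto

    class PartnerState(Enum):
        READY = auto()
        WAITING = auto()
        BUSY = auto()

    def classify_partner_state(gaze_on_task: bool, hands_moving: bool,
                               force_on_object: float) -> PartnerState:
        """Classify the human partner's state from simple motion/force cues.

        Cue choices and thresholds are illustrative only.
        """
        if force_on_object > 1.0:                 # actively manipulating the object
            return PartnerState.BUSY
        if gaze_on_task and not hands_moving:     # attending but idle: likely ready
            return PartnerState.READY
        return PartnerState.WAITING

A handover controller, for example, could wait for READY before releasing an object to the human partner.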

Check out this video demonstration of a robot-human handover controller developed in our lab.


A Multimodal System for Robot Trajectory Programming and Execution

Existing industrial robot programming interfaces, e.g., teach pendants and computer consoles, are often unintuitive, resulting in a slow and tedious teaching process. Kinesthetic teaching offers an alternative for small robots, where physical interaction can be safe; for large industrial robots, however, physical interaction is not an option. Emerging augmented reality (AR) technology offers the potential for faster, safer, and more intuitive robot programming, as it allows rich visual information to be presented in situ. However, too much information can overload the user's visual perception, and AR alone may not provide adequate feedback about the robot's state.

This project takes an AR-based approach to robot programming, with the goal of enabling safe and intuitive human-robot interaction in collaborative manufacturing. Using a mixed reality head-mounted display (Microsoft HoloLens) and a pair of surface electromyography (EMG) and gesture-sensing armbands (MYO Armband), we designed a multimodal AR user interface that eases the robot programming task by providing several interactive functions:

1) Trajectory specification.
2) Virtual previews of robot motion.
3) Visualization of robot parameters.
4) Online reprogramming during simulation and execution.
5) Gesture- and EMG-based control of robot trajectory execution.
6) Online virtual barrier creation and visualization.

The resulting system merges AR, EMG sensing, gesture control, speech control, and tactile feedback for trajectory programming and online control. We validated the interface by comparing it with kinesthetic teaching and other standard robot control methods, and found promising results for our system.
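To illustrate how one of these functions might work, here is a minimal sketch of EMG-gated trajectory execution (function 5 above). The sensor and robot calls shown (read_emg, robot.move_to) are placeholders, not the actual MYO or HoloLens SDKs.

    import time

    EMG_STOP_THRESHOLD = 0.6   # normalized muscle activation; illustrative value

    def execute_trajectory(robot, waypoints, read_emg):
        """Step through AR-specified waypoints, pausing while the user clenches."""
        for wp in waypoints:
            # Hold position while muscle activation signals "stop".
            while read_emg() > EMG_STOP_THRESHOLD:
                time.sleep(0.05)
            robot.move_to(wp)   # placeholder robot API: advance to next waypoint

This kind of gating lets the operator halt or resume execution hands-free, without reaching for a pendant or console.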

This work was developed for a project in collaboration with the German Aerospace Centre (DLR). Our goal is to solve complex problems in manufacturing processes that are labor-intensive, require online expert knowledge, and have been too difficult to automate completely; carbon fiber reinforced polymer manufacturing is a prime example.

We’re currently working on integrating further methods to allow more intuitive and natural operation of robots!

Principal Investigators

Dr. Mike Van der Loos, Associate Professor, Department of Mechanical Engineering, UBC
Dr. Elizabeth Croft, Professor, Department of Mechanical Engineering, UBC

Researchers

Wesley Chan
Camilo Perez
Maram Sakr

Collaborators

German Aerospace Centre (DLR)


Sidewalk Delivery Robot Navigation: A Pedestrian-Based Approach

Sidewalks are unique in that this pedestrian-shared space has characteristics of both roads and indoor spaces. Like vehicles on roads, pedestrians often move in flows in opposing directions; on the other hand, pedestrians also form crowds and move far less predictably than vehicles. Classical algorithms are insufficient for navigating safely around pedestrians while remaining on the sidewalk. Our approach takes advantage of natural human motion to let a robot navigate sidewalks in a safe and socially compliant manner. We developed a group surfing method that imitates the pedestrian group whose motion best brings the robot closer to its goal; for pedestrian-sparse environments, we use a sidewalk edge detection and following method. Underlying both navigation methods is a human-aware collision avoidance scheme. We demonstrate components of the navigation stack in simulation and discuss an integrated simulation and a real-world experiment.
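As a rough illustration of the group-surfing idea, the sketch below selects, among tracked pedestrian groups, the one whose average motion best advances the robot toward its goal. The data structures and the alignment score are illustrative assumptions, not the lab's code.

    import numpy as np

    def select_group(groups, robot_pos, goal_pos):
        """Pick the pedestrian group whose mean velocity best aligns with the goal.

        groups: list of (centroid, mean_velocity) pairs, each a 2D numpy array.
        Returns the chosen group's centroid, or None if no group is moving.
        """
        direction = goal_pos - robot_pos
        direction = direction / np.linalg.norm(direction)
        best_centroid, best_score = None, -np.inf
        for centroid, velocity in groups:
            speed = np.linalg.norm(velocity)
            if speed < 1e-3:
                continue                                      # skip standing groups
            score = float(np.dot(velocity / speed, direction))  # goal alignment
            if score > best_score:
                best_centroid, best_score = centroid, score
        return best_centroid  # None => fall back to sidewalk-edge following

When no suitable group exists, a robot following this scheme would switch to the sidewalk edge detection and following method described above.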

In addition, we’re investigating the effect of intent communication on the social acceptability of mobile robots in pedestrian-rich environments. Pedestrians naturally engage in joint collision avoidance with each other in public spaces, relying on a variety of subtle body language cues. Pedestrians, however, do not yet understand very well how robots move. We’re developing and testing a variety of cues that mobile robots can use to help pedestrians quickly build enough understanding and trust to feel both comfortable and safe when sharing a space with a robot.

We have two studies planned for this project. The first, a laboratory experiment, will determine the most communicative of a variety of cues. The second, a field study, will validate the cue chosen in the first study in a pedestrian space on campus.

Principal Investigators

Dr. Mike Van der Loos, Associate Professor, Department of Mechanical Engineering, UBC
Dr. Elizabeth Croft, Professor, Department of Mechanical Engineering, UBC

Researchers

Nick Hetherington
Wesley Chan
Camilo Perez

