My research focuses on developing communication mechanisms for human-robot manipulation interaction. I enjoy working in detail-oriented, multidisciplinary teams that translate research from HRI and computer vision into mission-critical systems. Thanks to my multidisciplinary background, I can turn a novel idea into a functional prototype.
In recent years, robots have started to migrate from industrial settings to unstructured human environments; examples include home robotics, search-and-rescue robotics, assistive robotics, and service robotics. However, this migration has been slow, with only a few successes. One key reason is that current robots cannot interact well with humans in dynamic environments. Finding natural communication mechanisms that allow humans to interact and collaborate with robots effortlessly is a fundamental research direction for integrating robots into our daily lives. In this thesis, we study pointing gestures for cooperative human-robot manipulation tasks in unstructured environments. By interacting with a human, the robot can solve tasks that are too complex for current artificial intelligence agents and autonomous control systems. Inspired by human-human manipulation interaction, in particular how humans use pointing and gestures to simplify communication during collaborative manipulation tasks, we developed three novel non-verbal, pointing-based interfaces for human-robot collaboration.
1) Spatial pointing interface: In this interface, human and robot are collocated, and communication occurs through gestures. We studied human pointing gestures in the context of human manipulation and, using computer vision, quantified the accuracy and precision of human pointing in household scenarios. Furthermore, we designed a robot and vision system that can see, interpret, and act using a gesture-based language.
2) Assistive vision-based interface: We designed an intuitive 2D image-based interface that enables persons with upper-body disabilities to manipulate everyday household objects through an assistive robotic arm (human and robot are collocated, sharing the same environment). The proposed interface reduces operation complexity by providing different levels of autonomy to the end user.
3) Vision-force interface for path specification in tele-manipulation: This is a remote visual interface that allows a user to specify, in an on-line fashion, a path constraint for a remote robot. Using the proposed interface, the operator can guide and control a 7-DOF remote robot arm along the desired path using only 2 DOF.
We validated each of the proposed interfaces through user studies. The proposed interfaces explore the important direction of letting robots and humans work together, and they highlight the importance of a good communication channel/interface during the interaction. Our research involved integrating several knowledge areas; in particular, we studied and developed algorithms for visual control, object detection, object grasping, object manipulation, and human-robot interaction.
Qin, Xuebin, Shida He, Camilo Perez Quintero, Abhineet Singh, Masood Dehghan, and Martin Jagersand. “Real-Time Salient Closed Boundary Tracking via Line Segments Perceptual Grouping.” (IROS) 2017.
Valipour, Sepehr, Camilo Perez, and Martin Jagersand. “Incremental Learning for Robot Perception through HRI.” (IROS) 2017.
Quintero, Camilo Perez, Masood Dehghan, Oscar Ramirez, Marcelo H. Ang, and Martin Jagersand. “Flexible virtual fixture interface for path specification in tele-manipulation.” In Robotics and Automation (ICRA), 2017 IEEE International Conference on, pp. 5363-5368. IEEE, 2017.
Siam, Mennatullah, Abhineet Singh, Camilo Perez, and Martin Jagersand. “4-DoF Tracking for Robot Fine Manipulation Tasks.” In Computer and Robot Vision (CRV), 2017 Canadian Conference on. IEEE, 2017.
Hu, Huan, Camilo Perez, Han-Xu Sun, and Martin Jagersand. “Performance of Predictive Display Teleoperation under Different Delays with Different Degree of Freedoms.” In Information System and Artificial Intelligence (ISAI), 2016 International Conference on, pp. 380-384. IEEE, 2016.
Gridseth, Mona, Oscar Ramirez, Camilo Perez Quintero, and Martin Jagersand. “Vita: Visual task specification interface for manipulation with uncalibrated visual servoing.” In Robotics and Automation (ICRA), 2016 IEEE International Conference on, pp. 3434-3440. IEEE, 2016.
Camilo Perez Quintero, Masood Dehghan, Oscar Ramirez, Marcelo H. Ang, and Martin Jagersand. Vision-force interface for path specification in tele-manipulation. In Human-Robot Interfaces for Enhanced Physical Interactions Workshop, ICRA, 2016.
Quintero, Camilo Perez, Romeo Tatsambon, Mona Gridseth, and Martin Jägersand. “Visual pointing gestures for bi-directional human robot interaction in a pick-and-place task.” In Robot and Human Interactive Communication (RO-MAN), 2015 24th IEEE International Symposium on, pp. 349-354. IEEE, 2015.
Quintero, Camilo Perez, Oscar Ramirez, and Martin Jägersand. “Vibi: Assistive vision-based interface for robot manipulation.” In Robotics and Automation (ICRA), 2015 IEEE International Conference on, pp. 4458-4463. IEEE, 2015.
Hu, Huan, Camilo Perez Quintero, Hanxu Sun, and Martin Jagersand. “On-line reconstruction based predictive display in unknown environment.” In Robotics and Automation (ICRA), 2015 IEEE International Conference on, pp. 4446-4451. IEEE, 2015.
Roy, Ankush, Xi Zhang, Nina Wolleb, Camilo Perez Quintero, and Martin Jägersand. “Tracking benchmark and evaluation for manipulation tasks.” In Robotics and Automation (ICRA), 2015 IEEE International Conference on, pp. 2448-2453. IEEE, 2015.
Quintero, Camilo Perez, Romeo Tatsambon Fomena, Azad Shademan, Oscar Ramirez, and Martin Jagersand. “Interactive Teleoperation Interface for Semi-autonomous Control of Robot Arms.” In Computer and Robot Vision (CRV), 2014 Canadian Conference on, pp. 357-363. IEEE, 2014.
Quintero, Camilo Perez, Oscar Ramirez, Mona Gridseth, and Martin Jägersand. “Small object manipulation in 3d perception robotic systems using visual servoing.” Workshop, IROS, 2014.
Gridseth, M., C. Perez Quintero, R. Fomena, O. Ramirez, and M. Jagersand. “Bringing visual servoing into real world applications.” In Human Robot Collaboration Workshop, Robotics Science and Systems RSS, vol. 13. 2013.
Fomena, Romeo Tatsambon, Camilo Perez Quintero, Mona Gridseth, and Martin Jagersand. “Towards practical visual servoing in robotics.” In Computer and Robot Vision (CRV), 2013 International Conference on, pp. 303-310. IEEE, 2013.
Mona Gridseth, Camilo Perez Quintero, Romeo Tatsambon Fomena, and Martin Jagersand. Visual interface for task specification. In Graphics Interface, GI (best poster award), 2013.
Dick, Travis, Camilo Perez Quintero, Martin Jägersand, and Azad Shademan. “Realtime Registration-Based Tracking via Approximate Nearest Neighbour Search.” In Robotics: Science and Systems. 2013.
Quintero, Camilo Perez, Romeo Tatsambon Fomena, Azad Shademan, Nina Wolleb, Travis Dick, and Martin Jagersand. “Sepo: Selecting by pointing as an intuitive human-robot command interface.” In Robotics and Automation (ICRA), 2013 IEEE International Conference on, pp. 1166-1171. IEEE, 2013.
Camilo Perez Quintero and Martin Jagersand. Robot making pizza. 3rd place, IEEE Robotics and Automation Society (RAS) SAC Video Contest, May 2013.
Quintero, Camilo A. Perez, and Pablo A. Figueroa. “Poster: Vibration as a wayfinding aid.” In 3D User Interfaces, 2009. 3DUI 2009. IEEE Symposium on, pp. 135-136. IEEE, 2009.
Perez, Camilo A., and Pablo Figueroa. “VRPN and Qwerk: fast MR device prototyping and testing.” In Proceedings of the 2008 ACM symposium on Virtual reality software and technology, pp. 261-262. ACM, 2008.
Perez, Camilo A., and Pablo A. Figueroa. “Building New Mixed Reality Devices.” In International Symposium on Visual Computing, pp. 1106-1114. Springer, Berlin, Heidelberg, 2008.
Camilo Perez and Pablo Figueroa. Integrating low budget wireless devices to virtual environments. In Proceedings of the IX Symposium on Virtual and Augmented Reality, pp. 312-314. Brazilian Computing Society, 2007.
Camilo Perez Quintero. Ph.D. in Computer Science, University of Alberta, Canada, 2017. Thesis: "Pointing Gestures for Cooperative Human-Robot Manipulation Tasks in Unstructured Environments."
Camilo Perez Quintero. M.S. in Systems Engineering, Universidad de los Andes, Colombia, 2009. Thesis: "Experiences on Hardware, Software and Interaction Design for Virtual Reality Devices."
Camilo Perez Quintero. Electronic Engineering, Universidad de los Andes, Colombia, 2005. Thesis: "Design, Construction, and Automation of a Bridge Crane with an Embedded Platform Which Supports Multiprogramming."
Camilo Perez Quintero. Mechanical Engineering, Universidad de los Andes, Colombia, 2004. Thesis: "Design and Manufacturing of a Valve for Intraocular Pressure Control."