Computer Science

Faculty of Engineering, LTH


PhD project


PhD project "Get a Grip" - Accurate Robot-Human Handovers

You can find the official call for the position here!

This project aims to improve human–robot collaboration in industrial and medical applications. In particular, we aim to study how to achieve responsive collaboration through a combination of multimodal interaction and situation awareness in robot–human object handover situations.

We assume application scenarios in which human interaction protocols already exist, allowing for data collection and a direct comparison of robot performance with human performance.

The project is funded through the Wallenberg AI, Autonomous Systems and Software Program (WASP) and will be conducted based on an existing collaboration with the Children’s Heart Centre at Skåne University Hospital in Lund. It is also directly connected to the WASP Research Arena WARA-Robotics.

The project focuses on handover scenarios in which the human reaches out or asks for an object without looking, expecting the object to be placed correctly in their hand so that they can use it without re-gripping it. The objects are also expected to be domain-specific, so the desired trajectory and placement have to be learned specifically for each object and task. The overall challenge lies in integrating many different aspects of these tasks into one system. On a more fine-grained level, we see the following three points as open scientific challenges for which this project aims to find improved, sustainable solutions. These points can be interpreted as answering the questions of where (in which context), what/when, and how a collaborative task is to be performed.

  • Scene understanding / situation awareness (Where?)

  • User intention / plan recognition and high-level reasoning (What / When?)

  • Teaching and execution of interactive / collaborative tasks (How?)

Each of these challenges can be seen as a possible focus area for the project, with techniques from computer vision, body tracking, gesture recognition, plan/activity/intention recognition, machine learning, robot control, and motion planning serving as the tools to address them.

The project has two intended use cases, robot-assisted assembly and robot-assisted surgery, which are broken down into scenarios below. To understand the workflows of these scenarios and the actual handovers occurring in them, data will need to be collected from observed human–human or, where they exist, human–robot collaborations; these data thus form a basis for research on how to solve the questions underlying the challenges above.

The project will be part of a bundle of related projects in the research group focusing on AI methods for and in HRI, with significant overlap and connections within the wider Division for Robotics and Semantic Systems that this group is part of. These projects consider, e.g., the acquisition of robotic skills through learning, based on interactively acquired models [1,2,3], and robot skill learning in a wider sense [4,5]. In earlier work, we studied how to provide means for intuitive instruction of industrial robots for assembly tasks [6,7,8,9]. A recently started project will focus on AI methods for situation awareness in shared autonomy, in particular on providing VR-based support to enhance a user’s situation awareness in situations requiring a handover of control in, e.g., autonomous vehicles. Through these projects, the division is well connected to both WARA-Robotics (and the Robotics Area Cluster in WASP) and to WARA-PS [10].

Interested? Apply here!

[1] A. Dürr. Robot skill learning based on interactively acquired knowledge-based models. PhD project.
[2] A. Dürr, L. Neric, V. Krueger, and E.A. Topp. Realeasy: Real-time capable simulation to reality domain adaptation. In International Conference on Automation Science and Engineering (CASE), 2021.
[3] Q. Yang, A. Dürr, E.A. Topp, J. Stork, and T. Stoyanov. Variable impedance skill learning for contact-rich manipulation. IEEE Robotics and Automation Letters, 7(3):8391–8398, 2022.
[4] M. Mayr, F. Ahmad, K. Chatzilygeroudis, L. Nardi, and V. Krueger. Learning of parameters in behavior trees for movement skills. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021.
[5] M. Mayr, C. Hvarfner, K. Chatzilygeroudis, L. Nardi, and V. Krueger. Learning skill-based industrial robot tasks with user priors. In International Conference on Automation Science and Engineering (CASE), 2022.
[6] M. Stenmark, M. Haage, and E.A. Topp. Simplified Programming of Re-usable Skills on a Safe Industrial Robot – Prototype and Evaluation. In Proceedings of the IEEE/ACM Conference on Human-Robot Interaction (HRI), 2017.
[7] M. Stenmark, M. Haage, E.A. Topp, and J. Malec. Supporting Semantic Capture during Kinesthetic Teaching of Collaborative Industrial Robots. International Journal on Semantic Computing, 12(1):167–186, 2018.
[8] M. Stenmark and J. Malec. Describing constraint-based assembly tasks in unstructured natural language. IFAC Proceedings Volumes, 47(3):3056–3061, 2014. 19th IFAC World Congress.
[9] M. Stenmark and P. Nugues. Natural language programming of industrial robots. In IEEE ISR 2013, pages 1–5, 2013.
[10] O. Andersson, P. Doherty, M. Lager, J.-O. Lindh, L. Persson, E.A. Topp, J. Tordenlid, and B. Wahberg. WARA-PS: a research arena for public safety demonstrations and autonomous collaborative rescue robotics experimentation. Autonomous Intelligent Systems, 1(9), 2021.

Currently I do not have any open call for PhD student positions.
The most recent call closed on September 6, 2023, and all applications that reached us through the recruitment portal will be considered.