Computer Science

Faculty of Engineering, LTH


PostDoc project

Providing better overview to operators of (semi-)autonomous systems


The Context

We can offer a two-year (full-time) position as postdoctoral researcher with me, Assoc. Prof. Elin A. Topp, in the group for AI and Human-Robot Interaction of the division of Robotics and Semantic Systems (RSS) within the Department of Computer Science of Lund University's Faculty of Engineering. The position is funded through the WASP and Aalto University Collaborative Postdoctoral Fellow Project (see https://wasp-sweden.org): AISA - AI powered situational awareness in shared autonomy. In this project, AI methods will be combined with suitable user interface techniques, including virtual or augmented reality, to support situational awareness in shared autonomy, mainly targeting autonomous vehicles. A more detailed description, also of the working environment including accessible research arenas and platforms, is available below.

The Project AISA - AI powered situational awareness in shared autonomy




The project aims to improve situational awareness for operators of autonomous systems, to help resolve complex situations in which the system has to yield control back to the operator. This will be done with the help of AI-based approaches and VR / AR interfaces.

The project consists of several work packages distributed between Lund University (LU) and Aalto University (AU) and will focus on a use case considering autonomous cars encountering an unexpected situation. The work will be performed primarily in simulation, but the project will also have access to two demonstrator and data collection platforms: Alex, Aalto University's autonomous car platform, and the WASP research arena WARA-PS.

The main focus for the project work at LU will be on synthesizing contextual views:

Based on previous results and observations (Lager 2019, 2020), aggregated and interpreted situation explanations provided by AU will be integrated in suitable user interfaces, which can for example be implemented in virtual or augmented reality (VR / AR). We will also consider techniques to assess the operator’s attention and resulting capability to handle complex information. By that, the explanations can be further adapted and hypotheses for possible actions can be reduced or curated to a level that is manageable by the operator at the given point in time.

In direct collaboration with AU, we will bring together the results from the work described above and aim at resolving the unexpected by closing the loop: we will develop a user interface that allows an operator to examine the predicted effects of changes made to the system's parameters, in order to choose the best course of action.

Apart from the research platforms mentioned above, you will have access to insights and outcomes from a recent WASP-funded industrial PhD project [1] that was also closely tied to the WASP Research Arena Public Safety (WARA-PS) [2]. In that project, we worked on the integration of several information sources in a virtual reality (VR) based user interface to provide situation awareness for an operator supervising a remote (semi-)autonomous surface vessel. We also have a recently started, ELLIIT-funded PhD project on mixed-initiative interaction and human intention understanding, which can give you the opportunity to collaborate with and support the respective PhD student, Ayesha Jena, in her research.

[1] Lager, M. and Topp, E. A. (2019). Remote Supervision of an Autonomous Surface Vehicle using Virtual Reality. IFAC-PapersOnLine, Vol. 52(8), pp. 387-392, Elsevier.

[2] Andersson, O., Doherty, P., Lager, M., Lindh, J.-O., Persson, L., Topp, E. A., Tordenlid, J., Wahlberg, B. (2021). WARA-PS: a research arena for public safety demonstrations and autonomous collaborative rescue robotics experimentation. Autonomous Intelligent Systems, Vol. 1(1), pp. 1-31, Springer Singapore.

Apply here: 

The Candidate - YOU!

Apart from the formal requirements specified in the official call, you should match the following criteria. You have a solid background in computer science or a related field (including, e.g., cognitive science) with good (documented) programming skills. Experience in at least one of the following areas is beneficial: robotics, autonomous systems, human-robot/computer interaction, or human-machine interfaces (including visualisation techniques and VR / AR). Besides technical and mathematical skills, you are expected to be curious and ambitious, with a strong motivation to conduct research, but also willing to implement for and experiment with actual robotic systems. You should be used to a well-structured work style and have the ability to work both individually and in teams. Very good oral and written communication skills in English are required. A brief research proposal showing your own interests and background in relation to the project should be included in your application material as part of the personal / cover letter.

The Group

As the PI for the project at Lund University, I represent the group's interest in mixed-initiative interaction with both mobile (service) and industrial robot systems, and in the underlying approaches for situation understanding, intention recognition, and general interaction monitoring. In addition, the research division for Robotics and Semantic Systems has a well-established track record in knowledge representation and management as well as integrated end-to-end software systems for (mainly, but not exclusively) industrial robotic applications. We have been involved in a series of respective EU-funded projects (SIARAS, ROSETTA, PRACE, SMErobotics, SARAFun) with core contributions by Prof. Jacek Malec, who is the head of the division. Recently, several PhD projects have started to investigate various approaches to machine learning within and for robotics, many of them supervised by Prof. Volker Krueger, who joined the group in 2018, Asst. Prof. Luigi Nardi, who joined us in 2019, and myself.

A second, very strong and well-established line of research is pursued by Prof. Pierre Nugues in the area of natural language processing (NLP), specifically semantic parsing, role labelling, and entity recognition. Recent efforts to integrate NLP into work on intuitive programming of industrial robots have led to an end-to-end support chain, from high-level (spoken) commands formulated in natural language, through levels of task and skill representation and abstraction, down to executable robot code for industrial robot controllers, both native and external (see Maj Stenmark's respective publications).

The division collaborates closely with the robotics researchers within the Department of Automatic Control, enabling this type of research on all system levels, from investigations of high-level programming tools to sensor-based, force-controlled manipulation approaches. We are also well connected to AI Lund, the AI- and ML-related network at Lund University.

In addition to the opportunities for collaboration through the WASP-funded project, the group has strong connections to the Graduate School and various research projects within the Wallenberg AI, Autonomous Systems and Software Program (WASP). You will thus have the opportunity to benefit from the different topics, perspectives, and core competencies offered within RSS and the RobotLab, and beyond.

Postdoctoral researcher - The employment

The position is fully funded for two years, and you should mainly carry out research. However, up to 20% of a postdoctoral researcher's working time at Lund University can be devoted to teaching, which includes the supervision of MSc projects relevant to your research.

Apply here: