I am a Ph.D. student in the Software Engineering Research Group (SERG) at the Department of Computer Science, Lund University, Sweden. I am funded by the Industrial Excellence Centre for Embedded Applications Software Engineering (EASE), working within Theme D - Aligning Requirements and Verification (Project D.2 Large-scale Test to Requirements Linking).
My main research contributions concern using information retrieval techniques to semi-automatically create traces between software artifacts. Visit my TraceRepo, a repository of publications covering related research through 2011. In September 2012, I presented my (cumulative) licentiate thesis on the topic: Advancing Trace Recovery Evaluation – Applied Information Retrieval in a Software Engineering Context. One of the included papers has been credited by Dolado et al. in Software Quality Journal (link) as the first work to use equivalence hypothesis testing in software engineering.
I joined SERG in January 2010. Prior to that, I spent three years at ABB in Malmö, first as a thesis student and then as a development engineer. I was part of a team responsible for editor and compiler development in the 800xA automation system. My experiences include:
I am currently on study leave from ABB, working full-time on my Ph.D. studies.
My research interests relate to the information overload involved in large-scale software development. I have published more than 15 papers on the topic. More specifically, my interests include:
My favorite tools of the trade are:
I am always looking for students interested in master thesis projects related to my interests; take a look at the project proposals for inspiration.
Changes, Evolution and Bugs – Recommendation Systems in Issue Management, M. Borg, and P. Runeson, In Recommendation Systems for Software Engineering, edited by M. Robillard, W. Maalej, R. Walker, and T. Zimmermann. Springer. To appear 2014.
Supporting Regression Test Scoping with Visual Analytics, E. Engström, M. Mäntylä, P. Runeson, and M. Borg, To appear in Proceedings of the 7th International Conference on Software Testing, Verification and Validation, Cleveland, Ohio, USA, March 31 - April 4, 2014.
A literature review on open source software in safety-critical systems.
A paper on requirements engineering and testing.
- Running a pilot evaluation of a tool supporting impact analyses, using a combination of IR-based trace recovery techniques and network analysis (a longitudinal study in an industrial context). Recall@5 ~= 33%, recall@10 ~= 40%, and positive qualitative feedback.
- Tuning a classifier for automated developer assignment of bug reports (5 datasets containing 10,000+ bug reports from two private companies). Prediction accuracy varies between 30% and 70%. Positive response in one company, where the solution is currently being deployed.
- Analyzing answers from a survey of impact analysis in safety-critical domains. 100+ answers from different domains.
- Wrapping up a study on prediction of defect resolution times based on textual descriptions in bug reports. Replication of previous work on KNN clustering, using much larger datasets from OSS and a private company. The results are discouraging; there appears to be little predictive value in the approach.
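For readers unfamiliar with the recall@k figures quoted above: recall@k measures the fraction of the true trace links that appear among the top k candidates suggested by the tool. A minimal sketch of the computation (the artifact names below are made up for illustration, not taken from the actual study):

```python
def recall_at_k(ranked_candidates, true_links, k):
    """Fraction of the true trace links found in the top-k ranked candidates."""
    if not true_links:
        return 0.0
    hits = sum(1 for item in ranked_candidates[:k] if item in true_links)
    return hits / len(true_links)

# Hypothetical example: 2 of 3 true links appear among the top 5 candidates.
ranked = ["req-7", "req-2", "req-9", "req-1", "req-4", "req-3"]
links = {"req-2", "req-4", "req-3"}
print(recall_at_k(ranked, links, 5))  # 2/3, i.e. about 0.67
```

Recall@5 ~= 33% thus means that, on average, a third of the relevant artifacts were retrieved among the tool's five highest-ranked suggestions.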
I have been involved in the following undergraduate and graduate courses at our department:
Supervised master theses:
Page Manager: Markus Borg
Last updated: 2014-03-10