Conducting industry-relevant software engineering research: Experience and guidelines

 

Schedule and location

Wednesday October 17th - Thursday October 18th.

University of Oulu, Linnanmaa campus (visiting address: Pentti Kaiteran katu 1). Please find a map of the campus here

Registration 

Registration is open September 5th - October 10th.

Speaker

Associate Professor of Software Engineering Vahid Garousi,  Information Technology Group, Wageningen University, the Netherlands

Organizer

Professor Markku Oivo, University of Oulu, Finland.

Background

Industry-academia collaborations (IAC) and industry-relevant research in software engineering (SE) are necessary. Collaboration between the software industry and researchers supports improvement and innovation in industry and helps ensure the industrial relevance of academic research in our field. However, many researchers and practitioners believe that the level of joint IAC in SE is relatively low, compared to the amount of activity in each of the two communities.

According to the Merriam-Webster Dictionary, something is relevant if it has “significant and demonstrable bearing on the matter at hand”. A paper published in the management science discipline [1] defined relevant research papers as “those whose research questions address problems found (or potentially found) in practice”.

This seminar aims to provide guidelines for conducting industry-relevant SE research. The seminar material is based on the speaker's experience of conducting 30+ IAC projects over the past 15+ years in three countries (Canada, Turkey and the Netherlands).

We will review the state of the community with respect to discussions on the relevance and utility of SE research, the types (modes) of SE research, and the issue of basic versus applied research in SE. We will then review several industry-driven SE research projects and discuss the challenges, patterns and anti-patterns of IAC. The seminar will conclude by discussing guidelines for conducting industry-driven SE research.

To ensure the best learning experience, the seminar will engage students in the class through assignments related to the topic. Students will report their work on the assignments in class.

Overview

The topics covered in the seminar include:

  • Reviewing the state of the community: discussions on relevance and utility of software engineering (SE) research
  • Types (modes) of SE research
  • Basic versus applied SE research
  • Reviewing several industry-driven SE research projects
  • Challenges, patterns and anti-patterns of industry-academia collaborations
  • How to conduct industry-driven research

 

Detailed Program

Day 1:

9.00-10.30

  • Introduction to the course
  • Reviewing the state of the community: discussions on relevance and utility of software engineering (SE) research
  • Types (modes) of SE research
  • Basic versus applied SE research

10.30-10.45

Break

10.45-12.00

  • Reviewing several industry-driven SE research projects

12.00-13.00

Lunch

13.00-15.00

  • Assessing research rigor and industrial relevance of several SE papers (co-authored by the speaker)
  • Reviewing papers from SE and CS, on research rigor and relevance

15.00-15.15

Break

15.15-16.15

  • Reviewing papers from other fields, on research rigor and relevance

16.15-17.00

In-class exercise: In groups of two, students will work on the assignment in class and report on their work

  • Topic: Taking a set of SE papers (preferably the students' own papers, if available; otherwise, papers from ICSE and other top conferences) and assessing their research rigor and relevance

 

 

Day 2:

9.00-10.30

  • Challenges, patterns and anti-patterns of industry-academia collaborations

10.30-10.45

Break

10.45-12.00

How to conduct industry-driven research

  • Choosing the “right” topics

12.00-13.00

Lunch

13.00-15.00

How to conduct industry-driven research

  • Execution phase of projects
  • Closing phase of the projects and ensuring impact (both academic and industrial)

15.00-15.15

Break

15.15-17.00

In-class exercise: In groups of two, students will work on the assignment in class and report on their work

  • Topic: Role playing (industry versus academia). Each group should choose an SE topic and then prepare a plan (document) for conducting industry-driven research on that topic

 

 

Assignment to be done by students in advance

The speaker has collected a pool of 45 papers about research relevance in SE. Students will receive a link to the materials after registration.

Each student's pre-course assignment is to review a randomly selected subset of at least five (5) of those papers and, based on the ideas in those papers, answer the following two questions:

  • Q1: What are the root causes of low research relevance in (some of) SE papers?
  • Q2: What ideas have been suggested for improving relevance?

Each student should also augment his or her answers to the above two questions with his or her own ideas and experience. Be sure to complete this pre-course assignment before the class, since you will need to report on it in class.

Course-work to be returned after the course

The course-work will build on the last in-class exercise (planning a typical industry-driven research project on an SE topic). It should be done in groups of two students. Each group should choose an SE topic and then use the guidelines and suggestions discussed in the course, including the ideas suggested for improving relevance, to document a detailed plan for conducting an industry-driven research project on that topic.

The report should then explain, narratively, how such a project would be executed and finalized (the “happy” scenario), and what would or should be done if certain challenges arise during the project. Think of it as a “simulation” of an industry-academia collaboration (IAC) project, in which you introduce challenges just as they would occur in a real project. Include both technical (SE concepts) and project-management aspects in your report.

To allow objective assessment of each student's contribution, the course-work report should clearly state the roles and efforts of each team member.

The deadline for the course-work will be agreed upon at the end of the course.

Assessment (marking)

Each student will be assessed according to his or her performance in the following activities: (1) the pre-course assignment; (2) involvement in the class discussions; and (3) the course-work report.

Readings

NOTE: There is no need to review all of the following papers in detail. However, a general review of a few papers in each group below, in advance, will better prepare students for the course.

 

Methodological papers about IAC in SE:

[1-9]

Technical papers from IAC research projects by the speaker:

[10-15]

Papers about research rigor and relevance, in SE:

[16-23]

Papers about research rigor and relevance, in CS:

[24-26]

Papers about research rigor and relevance, from other fields:

[27-34]

 

[1]        C. Wohlin, A. Aurum, L. Angelis, L. Phillips, Y. Dittrich, T. Gorschek, et al., "The success factors powering industry-academia collaboration," IEEE Software, vol. 29, pp. 67-73, 2012.

[2]        A. T. Misirli, H. Erdogmus, N. Juristo, and O. Dieste, "Topic selection in industry experiments," presented at the Proceedings of the International Workshop on Conducting Empirical Studies in Industry, Hyderabad, India, 2014.

[3]        A. Sandberg, L. Pareto, and T. Arts, "Agile collaborative research: Action principles for industry-academia collaboration," IEEE Software, vol. 28, pp. 74-83, 2011.

[4]        V. Garousi, K. Petersen, and B. Özkan, "Challenges and best practices in industry-academia collaborations in software engineering: a systematic literature review," Information and Software Technology, vol. 79, pp. 106–127, 2016.

[5]        V. Garousi and K. Herkiloğlu, "Selecting the right topics for industry-academia collaborations in software testing: an experience report," in IEEE International Conference on Software Testing, Verification, and Validation, 2016, pp. 213-222.

[6]        V. Garousi, M. M. Eskandar, and K. Herkiloğlu, "Industry-academia collaborations in software testing: experience and success stories from Canada and Turkey," Software Quality Journal, pp. 1-53, 2016.

[7]        V. Garousi, D. Pfahl, K. Petersen, M. Felderer, M. Mäntylä, M. Oivo, et al., "IndAcSE: Meta-analysis of Industry-Academia Collaborations in Software Engineering," https://www.researchgate.net/project/IndAcSE-Meta-analysis-of-Industry-Academia-Collaborations-in-Software-Engineering, Last accessed: May 2017.

[8]        V. Garousi, M. Felderer, M. Kuhrmann, and K. Herkiloğlu, "What industry wants from academia in software testing? Hearing practitioners’ opinions," in International Conference on Evaluation and Assessment in Software Engineering, Karlskrona, Sweden, 2017, pp. 65-69.

[9]        V. Garousi, M. Felderer, J. M. Fernandes, D. Pfahl, and M. V. Mantyla, "Industry-academia collaborations in software engineering: An empirical analysis of challenges, patterns and anti-patterns in research projects," in Proceedings of International Conference on Evaluation and Assessment in Software Engineering, Karlskrona, Sweden, 2017, pp. 224-229.

[10]      C. Pinheiro, V. Garousi, F. Maurer, and J. Sillito, "Introducing Automated Environment Configuration Testing in an Industrial Setting," in Proceedings of the International Conference on Software Engineering and Knowledge Engineering, Workshop on Software Test Automation, Practice, and Standardization, 2010, pp. 186-191.

[11]      S. A. Jolly, V. Garousi, and M. M. Eskandar, "Automated Unit Testing of a SCADA Control Software: An Industrial Case Study based on Action Research," in IEEE International Conference on Software Testing, Verification and Validation (ICST), 2012, pp. 400-409.

[12]      V. Garousi, S. Taşlı, O. Sertel, M. Tokgöz, K. Herkiloğlu, H. F. E. Arkın, et al., "Experience in automated testing of simulation software in the aviation industry," IEEE Software, In Press, 2018.

[13]      V. Garousi, E. G. Ergezer, and K. Herkiloğlu, "Usage, usefulness and quality of defect reports: an industrial case study in the defence domain," in International Conference on Evaluation and Assessment in Software Engineering (EASE), 2016, pp. 277-282.

[14]      M. E. Coşkun, M. M. Ceylan, K. Yiğitözu, and V. Garousi, "A tool for automated inspection of software design documents and its empirical evaluation in an aviation industry setting," in Proceedings of the International Workshop on Testing: Academic and Industrial Conference - Practice and Research Techniques (TAIC PART), 2016, pp. 118-128.

[15]      V. Garousi and E. Yıldırım, "Introducing automated GUI testing and observing its benefits: an industrial case study in the context of law-practice management software," in Proceedings of IEEE Workshop on NEXt level of Test Automation (NEXTA), 2018, pp. 138-145.

[16]      M. Ivarsson and T. Gorschek, "A method for evaluating rigor and industrial relevance of technology evaluations," Empirical Software Engineering, vol. 16, pp. 365-395, 2011.

[17]      D. Lo, N. Nagappan, and T. Zimmermann, "How practitioners perceive the relevance of software engineering research," presented at the Proceedings of Joint Meeting on Foundations of Software Engineering, 2015.

[18]      X. Franch, D. M. Fernández, M. Oriol, A. Vogelsang, R. Heldal, E. Knauss, et al., "How do Practitioners Perceive the Relevance of Requirements Engineering Research? An Ongoing Study," in IEEE International Requirements Engineering Conference (RE), 2017, pp. 382-387.

[19]      J. C. Carver, O. Dieste, N. A. Kraft, D. Lo, and T. Zimmermann, "How Practitioners Perceive the Relevance of ESEM Research," presented at the Proceedings of the ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, 2016.

[20]      A. Tan and A. Tang, "On the Worthiness of Software Engineering Research," Technical report, http://shidler.hawaii.edu/sites/shidler.hawaii.edu/files/users/kazman/se_research_worthiness.pdf, Last accessed: Aug. 12, 2016.

[21]      S. Beecham, P. O. Leary, S. Baker, I. Richardson, and J. Noll, "Making Software Engineering Research Relevant," Computer, vol. 47, pp. 80-83, 2014.

[22]      L. C. Briand, "Useful software engineering research - leading a double-agent life," in IEEE International Conference on Software Maintenance (ICSM), 2011, pp. 2-2.

[23]      P. Ralph, "The two paradigms of software development research," Science of Computer Programming, vol. 156, pp. 68-89, 2018.

[24]      M. Pechenizkiy, S. Puuronen, and A. Tsymbal, "Towards more relevance-oriented data mining research," Intelligent Data Analysis, vol. 12, pp. 237-249, 2008.

[25]      D. A. Norman, "The research-practice gap: the need for translational developers," interactions, vol. 17, pp. 9-12, 2010.

[26]      K. Vizecky and O. El-Gayar, "Increasing research relevance in DSS: Looking forward by reflecting on 40 years of progress," in Proceedings of the Annual Hawaii International Conference on System Sciences, 2011.

[27]      R. M. Mason, "Not Either/Or: Research in Pasteur's Quadrant," Communications of the Association for Information Systems, vol. 6, 2001.

[28]      N. Kock, P. Gray, R. Hoving, H. Klein, M. D. Myers, and J. Rockart, "IS research relevance revisited: Subtle accomplishment, unfulfilled promise, or serial hypocrisy?," Communications of the Association for Information Systems, vol. 8, p. 23, 2002.

[29]      S. C. Srivastava and T. S. Teo, "Information systems research relevance," in Encyclopedia of Information Science and Technology, Second Edition, ed: IGI Global, 2009, pp. 2004-2009.

[30]      K. C. Desouza, O. A. El Sawy, R. D. Galliers, C. Loebbecke, and R. T. Watson, "Information systems research that really matters: Beyond the is rigor versus relevance debate," in International Conference on Information Systems, 2005, pp. 957-959.

[31]      K. C. Desouza, O. A. El Sawy, R. D. Galliers, C. Loebbecke, and R. T. Watson, "Beyond rigor and relevance towards responsibility and reverberation: Information systems research that really matters," Communications of the Association for Information Systems, vol. 17, p. 16, 2006.

[32]      M. Rosemann and I. Vessey, "Toward improving the relevance of information systems research to practice: the role of applicability checks," MIS Quarterly, vol. 32, pp. 1-22, 2008.

[33]      B. B. Flynn, "Having it all: Rigor versus relevance in supply chain management research," Journal of Supply Chain Management, vol. 44, pp. 63-67, 2008.

[34]      M. Rosemann and J. Recker, "Rigour versus relevance revisited: Evidence from IS conference reviewing practice," in Proceedings of Australasian Conference on Information Systems, 2009, pp. 257-266.

 

Credit points

Doctoral students participating in the seminar can obtain 2 credit points. This requires participation on both days and completion of the lab assignments.

Registration fee

This seminar is free of charge for staff of Inforte.fi member organizations and their PhD students. For others, the participation fee is 400 €. The participation fee includes access to the event and the event materials. Lunch and dinner are not included.