ESEM 2018 Preview


Download Conference Flyer

It is a pleasure to announce that in 2018, I will be serving together with Audris Mockus as Program Committee co-Chair of the 12th International Symposium on Empirical Software Engineering and Measurement (ESEM 2018), to be held from October 11 to 12 in Oulu, Finland. ESEM is the flagship conference of the empirical software engineering community, and as such, it is an honour to be part of the OC.

As in previous years, ESEM will again be part of the Empirical Software Engineering International Week and will be co-located with the

  • Annual meeting of the International Software Engineering Research Network (ISERN) – Oct 8 to 9
  • International Doctoral Symposium on Empirical Software Engineering (IDoESE) – Oct 10
  • International Advanced School on Empirical Software Engineering (IASESE) – Oct 10
  • International Conference on Predictive Models and Data Analytics in Software Engineering (PROMISE) – Oct 10

In response to ongoing debates on the future of ESEM, and in light of the positive experiences we recently gained with changes at related conferences such as PROFES, we have decided to take the opportunity to introduce some changes to the ESEM conference setup as well.

Those changes include:

  • We replaced the short paper track with new tracks, and
  • we introduced open science practices.

Here, I will very briefly address the motivation for these changes; details can be found on the conference website and in our related call for papers.

New Tracks – Short Papers are dead, long live Short Papers!

Short papers are meant to be a means to discuss new ideas and ongoing work. They form, I believe, a core pillar of conferences, where researchers from all over the world gather not only to discuss their completed work, but also their visions, ideas, and ongoing work in order to strengthen existing collaborations and build new ones. This is exactly what short paper tracks should be for: a forum to discuss visionary ideas and ongoing work.

Instead, conferences have overall become the primary place to get completed work published (which, I personally believe, should be the primary purpose of journals), with little to no space given to networking and the discussion of ideas. Even worse, instead of using short paper tracks to foster exactly what conferences are for, I get the impression that these tracks have rather lost their purpose. Understandably, short papers are very difficult to review when trying to maintain high scientific standards while still giving authors the chance to present incomplete work. As a consequence, they too often tend to be treated like full research papers compressed into six or eight pages.

In other words: there is often an imbalance between the purpose of short papers and the way these papers are handled by the community, and I get the impression that we are losing sight of the value of short papers. For this very reason, we decided to remove the short paper track and, instead, make the purpose of the previous short paper track explicit via two new tracks:

  • Emerging Results: Papers in this track should promote current work in progress in research and practice. Papers on emerging results should communicate initial research results and need not include a complete evaluation. The primary purpose of such papers is the communication of new ideas in order to get early feedback from the empirical software engineering community.
  • Vision Papers: The emphasis of this track will be on long-term challenges and opportunities in empirical software engineering research that are outside the current mainstream topics of the field. Vision papers should propose imaginative ways to extend the applicability of techniques in empirical software engineering and/or challenge existing explicit or implicit assumptions in the field. The goal of vision papers is to describe how empirical software engineering research and practice will (probably) look at least ten years from now.

With Emerging Results, we make explicit the purpose of the previous short paper track: presenting incomplete research, open questions, and potential for collaboration, all to be discussed within the community. With Vision Papers, we want to create a forum for visions and proposals, no matter how radical and no matter how useful they eventually turn out to be.

We sincerely hope that these small changes help bring us back on track towards using conferences to (also) discuss rather than (only) present.

Open Science Practices

First things first: I am strongly in favour of openness as a key to fostering progress through transparency, reproducibility, and replicability. While open access and open data are two fundamental pillars of open science, it is open data that forms the core of excellence in evidence-based research. As ESEM is the leading conference for evidence-based research methodologies in software engineering, it should be the ESEM community that sets the standards for how we conduct this kind of research.

Motivated by the experiences we made when introducing open science policies at the PROFES conference, we decided to take the next step and apply those policies to ESEM as well.

That is, we have committed ourselves to fostering open science as a key to increasing the accessibility, reproducibility, and replicability of our research outcomes. The guiding principle is that all research output should be accessible to the public and that empirical studies should be reproducible. This means that all authors are asked to disclose their (anonymised and curated) empirical data, as well as all software code used to analyse it, to the public.

However, we are very much aware that such changes cannot, and should not, be made in a radical way. This is especially true as data disclosure is difficult and comes with a plethora of challenges for authors. Curating data for public disclosure is an elaborate task for which we have little experience and guidance. Further, in contrast to other evidence-based (sub-)disciplines, empirical software engineering research often relies on highly sensitive industrial data for which we have signed non-disclosure agreements with companies.

For these very reasons, we decided not to make this step mandatory yet, but we will take first steps towards supporting a smooth transition into a new way of working and thinking.

Instead of blindly insisting that authors employ open data practices, we have established, as for PROFES, an open science chair who will support authors in preparing and disclosing their data. This role is taken by Daniel Graziotin from the University of Stuttgart. The willingness of authors to disclose their data will not affect the reviews or the acceptance decisions.

In the end, it is still an experiment, and we still need to learn (a) whether and how well the community receives these changes, and (b) what results they yield. Our hope remains that with these changes, we send a clear signal to the software engineering community and open a new era in which openness becomes, for us too, the standard in evidence-based research.

Further Points and Feedback from the Town Hall Meeting

At ESEM 2017 in Toronto, we conducted a town hall meeting to elaborate on the future of ESEM and to find ways to best reflect the views and needs of the community. The major points discussed included, but are not limited to, the following:

  • We should (re-)focus the call and the programme more on empirical research methodologies and empirical data, as this is the core competence of ESEM.
  • In contrast to many other conferences, ESEM is positively perceived as a community-driven conference, even though roughly 50% of attendees are newcomers; the community networking (within but also beyond ISERN) is seen as a major strength.
  • We should take care to further foster exchange and discussion between researchers on current and new topics; this includes more time for discussion after the presentations as well as more focus on critical and innovative topics in the field of empirical research methods.
  • We should further foster the inclusion of industry participants.

We will critically consider all points discussed and believe that, with the above changes, we have already taken good steps in the right direction. In any case, we are considering further changes and will keep you updated.

One question that might arise is why we are not changing to double-blind reviewing at ESEM, given that so many conferences are currently changing their reviewing format in that direction. Let me briefly address this point in the following.

Why not Double Blind Reviewing?

In short, regardless of the discipline and the conference, peer review processes are often perceived as not working as well as they could. It is human nature that makes peer review prone to errors emerging from, inter alia, potential hidden biases. It is thus not surprising that recent studies have indicated that

  • double-blind reviewing decreases specific forms of bias (see e.g. the work by Budden et al.), while
  • the research community still perceives the costs of employing double-blind reviewing as rather high and acknowledges its difficulties; some even disagree with double-blind reviewing entirely, expressing valid concerns.

Still, only a minority objects to the transition to this reviewing format (see e.g. the work by Bacchelli and Beller). Moreover, a study we performed ourselves in the ICSE community, in which we surveyed both reviewers and authors, shows that the majority of researchers in the community favours a change towards either double-blind or zero-blind reviewing as a means of improvement (see here). One reasonable explanation might also be the perception of a certain closeness of that particular community. In fact, I personally favour zero-blind reviewing (regardless of the conference), although this format comes with additional challenges for the reviewers. Further, we are very much aware that many conferences are moving towards double-blind reviewing despite the fact that it is often not easy for authors to conceal their identities while preserving the relevant background information (as for ESEM, the community is rather small, which makes it even more difficult to anonymise submissions).

That all being said, we discussed the possibility of changing the reviewing format for ESEM and decided not to do so, at least for the 2018 edition. Reasons include those mentioned above, but also the following:

  • ESEM has become rather inclusive already; in the 2017 edition, roughly 50% of all authors were first-timers at the conference.
  • The work of the PC is perceived by authors as excellent, as reflected in the feedback from the participants of the town hall meeting as well as in the feedback surveys conducted.
  • ESEM has, without prejudice towards the behaviour of our reviewers, already set up checks and balances to make sure that reviews are of high quality; for instance, reviewers remain anonymous to each other until the end of the process, and we have set up meta-reviewers as an additional safeguard.

In any case, we remain open to further changes and will continue to use town hall meetings, feedback and recommendations from the community, as well as insights gathered from ongoing changes, to improve the conference in the long run.

Further details will follow soon.