Workshop Agenda

14:00 - 14:05  Welcome & Introduction
14:05 - 14:45  Predictability of Performance in Public Clouds - Some Empirical Data and Lessons Learned for Software Performance Testing
Philipp Leitner (University of Zurich, Switzerland) Slides
14:45 - 15:00  In-Test Adaptation of Workload in Enterprise Application Performance Testing
Maciej Kaczmarski, Philip Perry, John Murphy and Omar Portillo-Dominguez (University College Dublin, Ireland) Slides
15:00 - 15:15  Reproducible Load Tests for Android Systems with Trace-based Benchmarks
Alexander Lochmann, Fabian Bruckner and Olaf Spinczyk (TU Dortmund, Germany) Slides
15:15 - 15:30  Large Scale Performance Modelling for Big Data
Rekha Singhal (TCS Innovation Labs, India) Slides
15:30 - 16:00  Coffee Break
16:00 - 16:15  Systematic Load Testing of IoT Systems
Felix Loesch and Julia Leibinger (Robert Bosch GmbH, Germany)
16:15 - 17:25  Discussion
17:25 - 17:30  Closing of LTB 2017

Call for papers [PDF]

Software systems (e.g., smartphone apps, desktop applications, e-commerce systems, IoT infrastructures, big data systems, and enterprise systems) have strict requirements on software performance. Failure to meet these requirements causes customer dissatisfaction and negative news coverage. In addition to conventional functional testing, the performance of these systems must be verified through load testing or benchmarking to ensure quality of service. Load testing examines the behavior of a system by simulating hundreds or thousands of users performing tasks at the same time. Benchmarking evaluates a system's performance and makes it possible to optimize system configurations or to compare the system against similar systems in the domain.
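As a minimal illustration of the load-testing idea described above (not part of the original call), the following Python sketch simulates a number of concurrent "users", each issuing several requests against a hypothetical `handle_request` stand-in, and summarizes the observed latencies:

```python
# Minimal load-test sketch: simulate N concurrent "users", each issuing
# several requests, then report median and 95th-percentile latency.
# `handle_request` is a hypothetical stand-in for the system under test.
import random
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request() -> float:
    """One request to the (simulated) system under test; returns latency in seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # simulated service time
    return time.perf_counter() - start

def simulate_user(requests_per_user: int) -> list:
    """One user session: a sequence of requests, collecting latencies."""
    return [handle_request() for _ in range(requests_per_user)]

def load_test(users: int = 50, requests_per_user: int = 10) -> dict:
    """Run all user sessions concurrently and aggregate the latencies."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(simulate_user, [requests_per_user] * users))
    latencies = sorted(l for session in results for l in session)
    return {
        "requests": len(latencies),
        "median_s": latencies[len(latencies) // 2],
        "p95_s": latencies[int(len(latencies) * 0.95)],
    }

if __name__ == "__main__":
    print(load_test())
```

A real load test would replace `handle_request` with actual calls to the system under test (e.g., HTTP requests) and would typically be driven by a dedicated tool rather than hand-rolled threads; the sketch only shows the core pattern of concurrent simulated users plus latency aggregation.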

Load testing and benchmarking software systems are difficult tasks that require a deep understanding of the system under test and of customer behavior. Practitioners face many challenges, such as tooling (choosing and implementing the testing tools), environments (software and hardware setup), and time (limited time to design, run, and analyze tests). This one-day workshop brings together software testing researchers, practitioners, and tool developers to discuss the challenges and opportunities of conducting research on load testing and benchmarking software systems.

We solicit two tracks of submissions: research papers (maximum 4 pages) and a presentation track for industry or experience talks (maximum 700-word extended abstract). Technical papers should follow the standard ACM SIG proceedings format and must be submitted electronically via EasyChair. Short abstracts for the presentation track must be submitted as "abstract only" submissions via EasyChair. Accepted technical papers will be published in the ICPE 2017 proceedings; materials from the presentation track will not be published in the proceedings, but will be made available on the workshop website. Submitted papers can be research papers, position papers, case studies, or experience reports addressing issues including but not limited to the following:

Important Dates

Research papers: Jan. 17, 2017 (extended from Jan. 10, 2017)
Presentation track: Mar. 20, 2017
Paper notification: Feb. 1, 2017
Presentation notification: Mar. 24, 2017
Camera ready: Feb. 17, 2017
Workshop date: Apr. 23, 2017

Organizers
Johannes Kroß fortiss GmbH, Germany
Cor-Paul Bezemer Queen's University, Canada
Zhen Ming (Jack) Jiang York University, Canada

Program Committee

Adams, Bram Polytechnique Montreal, Canada
Bezemer, Cor-Paul Queen's University, Canada
Brunnert, Andreas RETIT GmbH, Germany
Csallner, Christoph University of Texas at Arlington, USA
Eichelberger, Holger University of Hildesheim, Germany
Franks, Greg Carleton University, Canada
Garousi, Vahid Hacettepe University, Turkey
Ghaith, Shadi IBM, Ireland
Hasselbring, Wilhelm Kiel University, Germany
Heinrich, Robert Karlsruher Institute of Technology, Germany
van Hoorn, André University of Stuttgart, Germany
Horrox, Robert EMC Isilon, USA
Jamshidi, Pooyan Imperial College London, United Kingdom
Krishnamurthy, Diwakar University of Calgary, Canada
Podelko, Alexander Oracle, USA
Shang, Weiyi Concordia University, Canada
Sunyé, Gerson University of Nantes, France

Steering Committee

Ahmed E. Hassan Queen’s University, Canada
Marin Litoiu York University, Canada
Zhen Ming (Jack) Jiang York University, Canada

Past LT Workshops