Thesis detail
Asynchronous Duet Benchmarking
Thesis title in Czech: Asynchronní Duet Benchmarking
Thesis title in English: Asynchronous Duet Benchmarking
Keywords (Czech): benchmarking|cloud|duet|kontejnery
Keywords (English): benchmarking|cloud|duet|containers
Academic year of announcement: 2021/2022
Thesis type: master's thesis
Thesis language: English
Department: Department of Distributed and Dependable Systems (32-KDSS)
Supervisor: Mgr. Vojtěch Horký, Ph.D.
Author: Mgr. Tomáš Drozdík - assigned and confirmed by the Study Department
Date of registration: 27.05.2022
Date of assignment: 31.05.2022
Confirmed by Study Department on: 06.06.2022
Date and time of defence: 08.02.2023 09:00
Date of electronic submission: 05.01.2023
Date of print submission: 09.01.2023
Date of defence: 08.02.2023
Opponent: Michele Tucci, Ph.D.
 
 
 
Guidelines
Detecting performance regressions in software development usually requires repeated measurements and dedicated machines. The cloud offers a tempting alternative: abundant hardware under a pay-for-what-you-use model. However, the cloud environment is inherently heterogeneous and jobs are influenced by the load of other users, which increases the instability (variation) of the obtained measurements.

Duet benchmarking is an approach where benchmarks are run in parallel on the same machine: both measured software versions are thus susceptible to the same fluctuations of the cloud machine and it is possible to filter out variations caused by the cloud environment. However, this approach requires significant support from the benchmark harness to ensure concurrent execution.
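To illustrate why the parallel arrangement helps: if each harness records the start, end, and duration of every iteration, the samples of the two versions can be paired by wall-clock overlap and compared as ratios, so that slowdowns affecting the whole machine cancel out. The following Python sketch shows only this pairing idea; it is not the thesis toolchain, and the (start, end, duration) sample format is an illustrative assumption.

    import math

    def pair_overlapping(samples_a, samples_b):
        """Pair iterations of version A and B whose execution intervals overlap in wall-clock time.

        Each sample is a (start, end, duration) tuple; returns per-pair duration ratios A/B.
        """
        ratios = []
        j = 0
        for a_start, a_end, a_dur in samples_a:
            # Skip B iterations that finished before this A iteration started.
            while j < len(samples_b) and samples_b[j][1] <= a_start:
                j += 1
            if j < len(samples_b) and samples_b[j][0] < a_end:
                # Both iterations ran at the same time, so shared slowdowns cancel in the ratio.
                ratios.append(a_dur / samples_b[j][2])
        return ratios

    if __name__ == "__main__":
        a = [(0.0, 1.0, 1.0), (1.0, 2.1, 1.1), (2.1, 3.0, 0.9)]
        b = [(0.1, 1.2, 1.1), (1.2, 2.2, 1.0), (2.2, 3.1, 0.9)]
        ratios = pair_overlapping(a, b)
        print("mean log-ratio:", sum(math.log(r) for r in ratios) / len(ratios))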

This thesis aims to remove these limitations to enable broader use of this method.

The thesis would include a prototype for running duet benchmarks without any need for synchronization between the two instances. The evaluation of the new approach and the toolchain prototype would be based on established benchmark suites running under different conditions: from dedicated servers to public clouds. The evaluation would compare the new (asynchronous) method with the original duet (synchronized) method as well as with sequential execution (i.e., the current practice).
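As a rough illustration of what "asynchronous" means here, the two instances could simply be started as independent containers and left to run to completion, with the pairing done afterwards from recorded timestamps. The sketch below assumes Docker and uses hypothetical image names and result paths; the actual orchestration is what the prototype itself would define.

    import subprocess

    def run_async_duet(image_a, image_b, out_a, out_b):
        """Start both benchmark containers at roughly the same time; no further coordination."""
        procs = [
            subprocess.Popen(["docker", "run", "--rm", "-v", f"{out}:/results", image])
            for image, out in ((image_a, out_a), (image_b, out_b))
        ]
        return [p.wait() for p in procs]  # exit codes of both runs

    # Hypothetical usage with two images built from the compared software versions:
    # run_async_duet("bench:baseline", "bench:candidate", "/tmp/duet/a", "/tmp/duet/b")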

The thesis would also provide a prototype of an automated evaluation tool that would be able to detect whether the two tested instances differ in performance.
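Such a detector could, for instance, bootstrap a confidence interval for the mean per-pair log-ratio and report a difference when the interval excludes zero. The following Python sketch shows this idea only; the confidence level, resample count, and function name are illustrative assumptions, not requirements of the assignment.

    import math
    import random

    def versions_differ(ratios, resamples=10_000, alpha=0.05):
        """Bootstrap a confidence interval for the mean per-pair log-ratio of durations."""
        logs = [math.log(r) for r in ratios]
        means = []
        for _ in range(resamples):
            resample = [random.choice(logs) for _ in logs]
            means.append(sum(resample) / len(resample))
        means.sort()
        low = means[int(resamples * alpha / 2)]
        high = means[int(resamples * (1 - alpha / 2)) - 1]
        return low > 0.0 or high < 0.0  # True if the interval excludes "no difference"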
Recommended literature
[1] Lubomír Bulej et al. “Duet benchmarking: improving measurement accuracy in the cloud”. In: Proceedings of the ACM/SPEC International Conference on Performance Engineering. 2020, pp. 100–107.
[2] Christoph Laaber, Joel Scheuner, and Philipp Leitner. “Software microbenchmarking in the cloud. How bad is it really?” In: Empirical Software Engineering 24.4 (2019), pp. 2469–2508.
[3] A. Abedi and T. Brecht. “Conducting repeatable experiments in highly variable cloud computing environments”. In: Proceedings of the 8th ACM/SPEC International Conference on Performance Engineering. 2017, pp. 287–292.
 
Univerzita Karlova | Informační systém UK