Benchmarking & Metrics
Another important area of our research related to the described vision is the development of standard metrics, tools, and benchmarks for quantitative system evaluation and analysis, focusing on quality-of-service (QoS) and efficiency aspects. In line with this direction, we co-founded a new group within SPEC (Standard Performance Evaluation Corporation), the SPEC Research Group (SPEC RG), whose mission is to serve as a platform for collaborative research in quantitative system evaluation and analysis and to foster interaction between industry and academia in the field. The group's scope includes computer benchmarking, performance evaluation, and experimental system analysis, covering both classical performance metrics such as response time, throughput, scalability, and efficiency, and other non-functional system properties subsumed under the term dependability, e.g., availability, reliability, and security.
The SPEC RG currently has over 40 member organizations from academia and industry. Its research efforts span the design of metrics for system evaluation as well as the development of methodologies, techniques, and tools for measurement, load testing, profiling, workload characterization, and dependability and efficiency evaluation of computing systems. As part of these efforts, our research group is actively involved in the development of: i) metrics and benchmarks for intrusion detection and prevention systems in virtualized environments, ii) metrics and benchmarks for quantifying the elasticity of IaaS/PaaS environments, and iii) metrics and benchmarks for quantifying performance isolation in multi-tenant SaaS environments; a simple illustration of the latter is sketched below.
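To make the notion of a performance isolation metric concrete, the following minimal sketch shows one simple way such a quantity could be computed: compare an abiding tenant's response time under normal conditions with its response time while another tenant exceeds its agreed load. The function name, the degradation-ratio formulation, and the numbers are hypothetical illustrations, not the specific metrics developed within SPEC RG.

```python
# Illustrative sketch only: a simple, hypothetical measure of performance
# isolation in a multi-tenant system. Not the SPEC RG metric itself.

def isolation_degradation(rt_reference: float, rt_disrupted: float) -> float:
    """Relative response-time degradation experienced by an abiding tenant
    when another tenant exceeds its agreed load (0.0 = perfect isolation)."""
    return (rt_disrupted - rt_reference) / rt_reference

# Example: median response time of an abiding tenant measured in two runs,
# once with all tenants within their agreed load and once with one tenant
# deliberately overloading the shared system (values are made up).
rt_reference = 120.0   # ms, all tenants within quota
rt_disrupted = 180.0   # ms, one tenant exceeds its quota

degradation = isolation_degradation(rt_reference, rt_disrupted)
print(f"Observed degradation: {degradation:.0%}")  # 50% -> weak isolation
```

A value near zero would indicate that the disruptive tenant has little effect on others (strong isolation), whereas large values indicate that load from one tenant leaks into the performance observed by the rest.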