Introducing new indices VSIavg100, VSIavg250, VSIavg500, VSIavg1000 and VSIavg5000
With the emergence of cloud-based and hyper-converged infrastructures (HCI), the concept of a maximum number of users is slowly becoming obsolete. In tests of these auto-scaling infrastructures the VSImax score is often not reached. The VSIbase index remains valid as a description of the baseline (single-user) performance of a system, but a series of new indicators is needed to better understand, communicate, and compare the performance of systems under larger numbers of users.
To address this changing need, Login VSI is introducing a number of new indices: VSIavg100, VSIavg250, VSIavg500, VSIavg1000 and VSIavg5000. These fixed-number indices give a standardized performance measurement (in milliseconds) for a system that is simultaneously used by 100, 250, 500, 1000, or 5000 active users.
Because they are based on the highly standardized test methods and workloads in Login VSI, these new indices make performance comparisons between different VDI infrastructure solutions, on-premises versus off-premises, and different cloud solutions (e.g. Azure versus AWS) objective and easy to understand.
The new indices introduce a number of fixed data points on the already existing (blue) VSImax average line: VSIavg100 pinpoints the average performance at exactly 100 active sessions, VSIavg1000 shows the average performance at 1000 active sessions, and so on.
VSIavg100, VSIavg250 and VSIavg500 are intended for comparing small and medium-sized environments; the remaining indices are only relevant for large environments.
Guidelines for clean tests
For these tests, Login VSI Benchmarking mode must be used, and the (most common) Login VSI Knowledge Worker workload must be included, even when other workloads (such as the multimedia workload) are also tested. For a clean test, 1% stuck sessions is the maximum, 0.5% stuck sessions is better, and 0% stuck sessions is of course best.
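The stuck-session thresholds above can be expressed as a simple classification. This is an illustrative sketch, not part of the Login VSI product; the function name and labels are our own:

```python
def classify_stuck_sessions(total_sessions: int, stuck_sessions: int) -> str:
    """Hypothetical helper: rate a test run against the clean-test
    guideline thresholds (1% maximum, 0.5% better, 0% best)."""
    ratio = stuck_sessions / total_sessions
    if ratio == 0:
        return "best"       # 0% stuck sessions
    if ratio <= 0.005:
        return "better"     # at most 0.5% stuck sessions
    if ratio <= 0.01:
        return "acceptable" # at most 1% stuck sessions
    return "not clean"      # above the 1% maximum

# Example: 4 stuck sessions out of 1000 is 0.4%.
print(classify_stuck_sessions(1000, 4))  # → better
```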
VSIavg100 is calculated as the average duration of all active sessions at the exact moment all 100 sessions have started, including all sessions that fall within a defined timeframe before that moment (the blue line in our graphs, which matches the VSImax v4.1 average when VSImax is not reached).
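The calculation above can be sketched as follows. This is a minimal illustration of the idea, not the actual Login VSI implementation; the sample data structure and the `window` parameter are assumptions:

```python
from statistics import mean

def vsi_avg(samples, n_sessions, window):
    """Illustrative sketch of a VSIavgN-style average.

    `samples` is a time-ordered list of (active_session_count,
    response_time_ms) measurements (an assumed format, not the real
    Login VSI data model). The result is the mean response time over
    the `window` most recent measurements up to and including the
    moment the n-th session has become active.
    """
    # Find the first measurement at which all n sessions are active.
    idx = next(i for i, (count, _) in enumerate(samples)
               if count >= n_sessions)
    # Average the response times in the timeframe leading up to that moment.
    recent = samples[max(0, idx - window + 1): idx + 1]
    return mean(rt for _, rt in recent)

# Example with made-up data: the average at 3 active sessions,
# over a window of the last 2 measurements.
data = [(1, 800), (2, 900), (3, 1000), (4, 1200)]
print(vsi_avg(data, 3, 2))  # → 950.0
```

In a real run, `n_sessions` would be one of the fixed points (100, 250, 500, 1000, 5000) and the window would correspond to the defined timeframe used for the blue average line.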
Today these results can already be found in the standard report that can be generated in Login VSI after each test: look for the table named ‘VSI response time overview’. Currently these scores are displayed in 50-user increments. Use the VSI response time scores at 100, 250, 500, 1000 and 5000 active sessions (which values are available depends on the total size of the test executed) in your reports. In future product updates, all the VSIavg statistics reached in a test will be integrated into the first summary table.
Over time, Login VSI plans to complement these statistics with the variance/deviation around the calculated averages (represented by the yellow line). This will add another important dimension of system quality to the toolbox of VDI and cloud infrastructure managers, adding ‘stability’ to the current dimensions of ‘performance’ and ‘scalability’.
More information about clean testing can be found in our benchmarking standards and testing guidelines.