    8 Secrets and Tips with the Login VSI Analyzer

    I’d like to review some Login VSI Analyzer features and graphs that I frequently find helpful and commonly recommend to our users. They can be used for troubleshooting, for validation, and for presenting results in documentation.

    My colleague, Jasper Geelen, has written a similarly helpful blog, Essential VDI Performance Graphs in the Login VSI Analyzer, which covers different graphs and features.

    Furthermore, our most well-known mojo is our industry-standard VSImax system, which has recently been awarded a patent, as it’s unique in the way it measures user experience.

    “The VSImax system is for simulating user load and evaluating performance and/or capacity of SBC or VDI datacenter infrastructure. The method runs workloads on a virtual desktop infrastructure for an increasing number of virtual users. When the average response time of the infrastructure to perform operations exceeds a predetermined threshold, the number of simulated users at that time may be considered the capacity of the infrastructure.”

    For more information, refer to Calculating Maximum Virtual Desktop Capacity – VSImax Explained.
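As a rough illustration of the idea only (not the actual patented algorithm, which applies baseline corrections, sample windows, and other refinements), capacity can be read as the last session count at which the average response time was still under the threshold. All numbers below are hypothetical:

```python
# Simplified illustration of the VSImax idea: as simulated sessions are
# added, average response time rises; capacity is the last session count
# at which the average was still under the threshold. The real VSImax
# algorithm is more sophisticated than this sketch.
def simple_vsimax(samples, threshold_ms):
    """samples: (session_count, avg_response_ms) pairs, ordered by
    increasing session count."""
    prev_sessions = 0
    for sessions, avg_ms in samples:
        if avg_ms > threshold_ms:
            return prev_sessions  # last count still under the threshold
        prev_sessions = sessions
    return None  # threshold never exceeded: VSImax was not reached

# Hypothetical ramp-up data: response time climbs as load increases.
ramp = [(10, 800), (20, 900), (40, 1200), (60, 1800), (80, 2600)]
print(simple_vsimax(ramp, threshold_ms=2000))  # prints 60
```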

    On to my recommendations…

    Custom timers

    Custom timers can be incorporated into the virtual user workload. The timing data they collect is presented in its own graph(s) in the Analyzer. A custom timer can wrap around several functions within a workload, if desired, and one workload can contain several custom timers. It can be worthwhile to validate that the time to perform some custom functions holds up under load, for example, how long it takes to look up a patient in medical database software.

    Refer to the documentation on custom timers.
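Custom timers are defined with Login VSI’s own workload scripting commands (see the documentation above). As a language-neutral sketch of the concept, here is what wrapping a hypothetical “patient lookup” step in a timer could look like in Python; the function and timer names are illustrative, not part of the Login VSI API:

```python
import time

timings = {}  # timer name -> list of elapsed times in milliseconds

def custom_timer(name, func, *args, **kwargs):
    """Run func, measure how long it takes, and record the elapsed time
    under the given timer name (mirroring the idea of a custom timer
    wrapping a workload step)."""
    start = time.perf_counter()
    result = func(*args, **kwargs)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    timings.setdefault(name, []).append(elapsed_ms)
    return result

# Hypothetical workload step: looking up a patient in a medical database.
def lookup_patient(patient_id):
    time.sleep(0.01)  # stand-in for the real application action
    return {"id": patient_id}

custom_timer("patient_lookup", lookup_patient, 12345)
print(timings["patient_lookup"])  # one elapsed-time sample, in milliseconds
```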

    Side-by-side graph comparisons

    When multiple results’ data-sets are loaded in the Login VSI Analyzer, they can be laid out either horizontally or vertically by using a menu option. This allows for easier comparison of data.

    Once the data sets are loaded into the program, open the Window menu and either choose Tile horizontal or Tile vertical, depending on what looks best for comparing a specific graph/tab. The open data-sets will snap to the appropriate positions automatically.

    [Screenshot: tiled data sets in the Login VSI Analyzer]

    Saving pictures of graphs easily

    Once a graph in the Login VSI Analyzer has been arranged and scaled as desired it can be easily saved as a screenshot. This can be helpful when building Login VSI test documentation; screenshots can be included in a .docx, for example.

    [Screenshot: saving a picture of a graph]

    Exporting Data

    After a test result’s data-set has been loaded into the Login VSI Analyzer, the data can be exported. This is useful because the data can then be loaded into an external program, such as Excel. Specific formulas can be applied to the data as seen fit, or the data can even be presented in a third-party data-visualization dashboard.

    [Screenshot: exporting data]
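As a small sketch of that idea, assuming the data was exported as CSV with a hypothetical avg_response_ms column, a custom formula can be applied with nothing more than Python’s standard library:

```python
import csv
import io

# Hypothetical exported data set: per-session average response times.
# (A real export would be read from a file rather than an in-memory string.)
exported = io.StringIO(
    "session,avg_response_ms\n"
    "1,812\n"
    "2,845\n"
    "3,1020\n"
)

rows = list(csv.DictReader(exported))
values = [float(row["avg_response_ms"]) for row in rows]

# Example of applying a custom formula to the exported data.
mean_ms = sum(values) / len(values)
print(f"mean response time: {mean_ms:.1f} ms")  # prints 892.3 ms
```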

    Making graphs show all test data

    [Screenshot: Login VSI 4.x Analyzer graph displaying all data over the entire duration of a test]

    After a test’s results have been loaded, most graphs can present all data collected over the whole scope of the test, as opposed to the default setting, which shows only data gathered during the ramp-up phase. Vertical lines on the graph will then denote when the ramp-up phase ends, when steady state ends, and when the session ramp-down occurs. This can be helpful when you’re interested in what’s happening while all sessions are testing simultaneously, which can help validate soak tests or identify load-test issues. It can also be interesting to see whether there’s a negative user experience during the ramp-down phase (a logoff storm), which can be performance intensive as well.

    [Screenshot: graph showing all test data]

    “Troubleshooting” graphs

    The following three graphs are useful for troubleshooting.

    Stuck Sessions Over Time (Troubleshooting)

    This graph shows which sessions got stuck during the session ramp-up phase. A session is flagged as “stuck” if it was gathering data but then stopped, even though the test was still in progress. This could be due to the target OS freezing, a program crash, an unexpected modal dialog, etc.

    Please see this article for more information on which issues this graph can help point out.

    [Screenshot: Stuck Sessions Over Time graph]
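The detection idea behind this graph can be sketched as follows, with purely illustrative session names, timestamps, and threshold: a session counts as stuck when it went silent well before the test ended.

```python
# Illustrative timestamps (seconds into the test). A session is flagged
# as stuck if it stopped reporting data well before the test ended, even
# though the test was still in progress.
TEST_END = 3600
SILENCE_THRESHOLD = 300  # treat 5+ minutes of silence as "stuck"

last_sample = {  # session name -> timestamp of its last reported sample
    "session-1": 3590,
    "session-2": 1200,  # went silent mid-test
    "session-3": 3575,
}

stuck = [s for s, t in last_sample.items() if TEST_END - t > SILENCE_THRESHOLD]
print(stuck)  # prints ['session-2']
```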

    Active Sessions Over Time (Troubleshooting)

    This graph shows active sessions incrementing during the test’s ramp-up phase. A session counts as active once it reports to the Login VSI server that it’s starting testing. Typically, the scaling will be linear, such as here:

    [Screenshot: Active Sessions Over Time graph with linear scaling]

    The scaling can also have gaps in it, such as here:

    [Screenshot: Active Sessions Over Time graph with gaps]

    Some examples of issues the gaps can point out are the connection server being overloaded, or DRS shifting VMs around mid-test, rendering them temporarily unavailable.

    Please see this article for more information on which issues this graph can help point out.
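The pattern the gaps reveal can be sketched like this, assuming a fixed, purely illustrative launch interval and start times:

```python
# Illustrative session start times (seconds). With a fixed launch
# interval, active sessions should increment at an even pace; a much
# larger spacing between consecutive starts indicates a gap.
LAUNCH_INTERVAL = 30  # hypothetical seconds between session launches

start_times = [0, 30, 60, 90, 300, 330, 360]  # note the jump after 90

gaps = [(a, b) for a, b in zip(start_times, start_times[1:])
        if b - a > 2 * LAUNCH_INTERVAL]
print(gaps)  # prints [(90, 300)]
```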

    VSImax Per Computer (Troubleshooting)

    This graph helps validate whether the user-experience performance metrics gathered from different VM or SBC hosts are consistent. In most cases performance consistency is expected, unless the SBC or VM hosts in a test are deliberately configured differently. For example, if power settings are accidentally configured differently across SBC or VM hosts, the gathered performance metrics may not match up.

    The following graph shows the kind of result that, if unexpected, is worth investigating. In this example, that meant finding out why the hosts’ baseline user-experience metrics reported such different response times:

    [Screenshot: VSImax Per Computer graph with inconsistent baselines]

    It was determined the hypervisor power settings were inconsistent. After ensuring all hypervisors were set to high-performance a new test was performed.

    We can see in the following graph that the hosts’ performance is more comparable:

    [Screenshot: VSImax Per Computer graph after aligning power settings]

    Please see this article for more information on which issues this graph can help point out.
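A quick way to think about this consistency check, with hypothetical per-host baseline numbers, is to flag any host whose baseline deviates far from the group’s median:

```python
import statistics

# Hypothetical per-host baseline response times in milliseconds. A host
# whose baseline deviates far from the group's median is worth
# investigating (e.g. mismatched hypervisor power settings).
baselines = {"host-a": 820, "host-b": 835, "host-c": 1450, "host-d": 810}

median = statistics.median(baselines.values())
TOLERANCE = 0.25  # flag hosts more than 25% off the median

outliers = [host for host, ms in baselines.items()
            if abs(ms - median) / median > TOLERANCE]
print(outliers)  # prints ['host-c']
```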

    To conclude, thinking creatively while digging into results can surface findings that would otherwise go unnoticed. Diving right in, practicing reviewing results, and knowing what to look for all help.

    The Login VSI Services team offers Analyzer training and assistance with reviewing results. Please reach out to support@loginvsi.com, or your Account Manager, to schedule.

    For further reading on the Analyzer, please refer to our blogs, documentation, and the Analysis Knowledge Base category.

    To access all Login VSI Knowledge Base Articles visit this page.

    About the author
    Joshua Kennedy

    Joshua Kennedy joined the Login VSI Services Engineering team in 2016 and is fueled by passion for peer-mentoring and delivering exceptional solutions that surpass expectations. He’s been in the tech and engineering world since ’05. In his off times you’ll find him hiking, traveling, cooking, and motorcycling.