Happy users, great performance: riding SimpliVity's hyper-converged architecture into the Horizon
This week we are excited to post a new "Validated By Login VSI" hyper-converged architecture with SimpliVity. I've always been a big fan of "Proofs of Claim," and this is a good one. At Login VSI we are fortunate enough to live and breathe centralized desktops and published applications every day, many of which are in, or destined for, production datacenters. Leveraging our expertise, some of our customers ask for their reference architectures to be "Validated By" Login VSI. This helps ensure unbiased test results and a solid implementation of best practices for availability and performance.
Hyper-convergence carries the promise of simplicity, performance, and scalability. It means there are no silos of independent IT components, no need to manage discrete devices, and no need for specialized training in component-level technology such as storage area networks (SANs).
SimpliVity's latest work leveraged its Data Virtualization Platform, scaling up to four nodes and including a failed-host test during one of the runs. I was really impressed to see this architecture at work, demonstrating low latency throughout deployment tests, login storms, and perturbation testing. If you are familiar with our VSImax metric, you know it is how we index the boundary between a good user experience and a bad one from a performance perspective. SimpliVity stayed below VSImax during all tests, proof that this architecture delivers on the promise of performance, scalability, and resiliency.
Login VSI runs a set of operations selected to be representative of real-world user applications and reports the latencies of those operations. In these tests we used the tool to simulate a real-world scenario and took the resulting application latency as the metric for end-user experience. Login VSI was heavily involved with SimpliVity in the "Validated By" tests, and we noticed a few differentiating test results:
When we don't hit VSImax, it is a good sign that the production experience will be good. In these cases we also explore the VSI Baseline, a strong indicator of how well the solution is architected and how well the desktop images are optimized. Here we observed notably low baseline scores. For example, during the 1,000 linked-clone desktops test with the Knowledge Worker profile, SimpliVity showed a VSI Baseline of 660. To understand how Login VSI thinks about baseline scores, please see the table below. Good, right!? This forms the basis for a very good user experience.
| VSI Baseline | Rating |
| --- | --- |
| 0 – 799 | Very Good |
| 800 – 1299 | Good |
| 1300 – 1999 | Reasonable |
| 2000 – 9999 | Poor |
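To make the scoring idea concrete, here is a minimal sketch of how latency samples might map onto a baseline score and a rating. This is an illustration only, not Login VSI's actual (more sophisticated) calculation: the averaging of the lowest response times and the function names are assumptions for the example, while the rating thresholds come from the table above.

```python
def vsi_baseline(latencies_ms, sample_count=15):
    """Illustrative baseline: average the lowest response times (ms).
    Not the real Login VSI algorithm; a simplified stand-in."""
    best = sorted(latencies_ms)[:sample_count]
    return sum(best) / len(best)

def rating(baseline_ms):
    """Map a baseline score onto the rating bands from the table."""
    if baseline_ms < 800:
        return "Very Good"
    if baseline_ms < 1300:
        return "Good"
    if baseline_ms < 2000:
        return "Reasonable"
    return "Poor"
```

Under this toy mapping, the observed baseline of 660 falls in the "Very Good" band, consistent with the result reported above.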
Straight and predictable VSI performance (the blue lines in the charts): SimpliVity's strength is a linear growth and performance model.
Even in the failed-host scenarios, VSImax was not reached, although it came very close.
It was exciting to work so closely with our valued partner during these Validated By Login VSI tests. I have had the opportunity to work closely with many enterprise VDI customers, and this represents a big win for IT staff whose goal is to find success quickly with their VDI initiatives. Unfortunately, more often than not these teams are asked to do more with less, and Proofs of Claim like this one can significantly cut the research and validation time for prospective buyers.
For more information about this paper, which has been Validated By Login VSI, please see the following link: