
Method for storage analysis and testing based on workload characterization data

IP.com Disclosure Number: IPCOM000206443D
Original Publication Date: 2011-Apr-26
Included in the Prior Art Database: 2011-Apr-26
Document File: 4 page(s) / 238K

Publishing Venue

Microsoft

Related People

Kushagra Vaid: INVENTOR

Abstract

Performance testing for server selection involves an extensive evaluation comparing platforms from multiple vendors to determine which is best suited for the deployment from an overall TCO (total cost of ownership) standpoint. The evaluation exercises test patterns that represent production workloads. In many instances, we find that existing tools and benchmarks for exercising storage performance patterns are not representative of datacenter workloads, and that significant information is lost between the application characteristics and the test-pattern generators. The result is inaccurate data when comparing the systems under evaluation. The same issue arises in capacity planning, where workload characteristics must be understood and mapped to server characteristics to determine how many servers need to be purchased. Benchmarks that do not capture workload details cannot be used reliably for such capacity-planning exercises, which typically results in overbuying to hedge against incorrect assumptions. We show how to address this issue by describing techniques to characterize storage workloads and to use mathematical models that recreate the workload characteristics via newly created synthetic testing tools.
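The capacity-planning point above reduces to simple arithmetic once workload demand and per-server capability are measured. A minimal sketch follows; the IOPS figures, TCO figures, and headroom factor are hypothetical illustrations, not measurements from this disclosure:

```python
import math

def servers_needed(workload_iops, per_server_iops, headroom=0.2):
    """Servers required to serve `workload_iops`, keeping `headroom`
    (a fraction of each server's capacity) in reserve against
    estimation error in the workload characterization."""
    usable = per_server_iops * (1.0 - headroom)
    return math.ceil(workload_iops / usable)

# Compare two hypothetical candidate platforms on count and total cost.
demand = 120_000                                   # measured peak IOPS
platforms = {"A": (8_000, 9_500),                  # (IOPS/server, $ TCO/server)
             "B": (10_000, 12_000)}
for name, (iops, tco) in platforms.items():
    n = servers_needed(demand, iops)
    print(name, n, n * tco)
```

Note how the better characterization pays off directly: with inaccurate workload data, the headroom factor must grow to hedge against the error, and the server count (and TCO) grows with it.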

This text was extracted from a Microsoft Word document.
At least one non-text object (such as an image or picture) has been suppressed.
This is the abbreviated version, containing approximately 44% of the total text.

Document Author (alias)

kvaid

Defensive Publication Title 

Method for storage analysis and testing based on workload characterization data

Name(s) of All Contributors

Kushagra Vaid (kvaid)

Sriram Sankar (srsankar)


Description:

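The method the abstract describes, characterizing a production storage workload and then recreating it with a synthetic generator driven by the measured statistics, can be sketched as follows. The trace schema, the metric set, and the generator are illustrative assumptions for this sketch, not the disclosure's actual tools:

```python
import random

def characterize(trace):
    """Summarize an I/O trace as a small statistical profile.

    Each entry is (offset_bytes, size_bytes, is_read). This schema is an
    illustrative assumption, not the trace format of the actual tools.
    """
    reads = sum(1 for _, _, is_read in trace if is_read)
    sizes = [size for _, size, _ in trace]
    # Count an access as sequential when it begins where the previous ended.
    seq = sum(1 for prev, cur in zip(trace, trace[1:])
              if cur[0] == prev[0] + prev[1])
    return {
        "read_ratio": reads / len(trace),
        "mean_size": sum(sizes) / len(sizes),
        "seq_ratio": seq / (len(trace) - 1) if len(trace) > 1 else 0.0,
    }

def synthesize(profile, n_ops, span=1 << 30, seed=0):
    """Generate a synthetic trace whose statistics approximate `profile`."""
    rng = random.Random(seed)
    size = max(1, int(profile["mean_size"]))
    trace = []
    for _ in range(n_ops):
        if trace and rng.random() < profile["seq_ratio"]:
            offset = trace[-1][0] + trace[-1][1]   # sequential follow-on
        else:
            offset = rng.randrange(0, span, size)  # random, aligned jump
        trace.append((offset, size, rng.random() < profile["read_ratio"]))
    return trace
```

A production-grade tool would capture richer statistics (block-size histograms, queue depth, inter-arrival times) rather than single means, but the loop keeps the same shape: measure the workload, fit a model, and regenerate a test pattern that preserves the measured characteristics instead of a generic benchmark mix.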

So...