ASHRAE Book Server Efficiency 2015
$42.79
ASHRAE Server Efficiency–Metrics for Computer Servers and Storage
Published By | Publication Date | Number of Pages |
---|---|---|
ASHRAE | 2015 | 222 |
Performance measures designed without regard to the energy used to produce work are a thing of the past. Now that the energy consumed to produce results is recognized as important, both the numerator and the denominator in “performance per watt” are seen as critically important attributes of an IT platform. This book consolidates information on current server and storage subsystem energy benchmarks. Each chapter describes a metric and its target market and includes examples of data generated by the subject benchmark or tool, along with guidance on interpreting those data. The book provides the information needed to select the best measure of performance and power for a variety of server applications.

This book is the twelfth in the ASHRAE Datacom Series, authored by ASHRAE Technical Committee 9.9, Mission Critical Facilities, Data Centers, Technology Spaces and Electronic Equipment. The series provides comprehensive treatment of datacom cooling and related subjects.

Keywords: energy efficiency, servers, metrics, performance, data centers, datacom, benchmarks
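For readers new to the “performance per watt” framing, the short sketch below shows one common way such a score is aggregated, modeled on the overall ssj_ops/watt metric of SPECpower_ssj2008 (covered in Chapter 4): total throughput summed across the graduated load levels, divided by total average power across those same levels plus active idle. This is an illustrative sketch only; all load levels and wattages here are invented, and a compliant run measures ten target loads (100% down to 10%) plus active idle.

```python
# Minimal sketch (not the official benchmark tool) of an aggregate
# "performance per watt" score in the style of SPECpower_ssj2008's
# overall ssj_ops/watt. All numbers are invented for illustration;
# a real run measures ten target load levels plus active idle.

load_levels = [            # (ssj_ops, average watts) per target load
    (3_060_000, 280.0),    # 100% target load
    (2_140_000, 215.0),    #  70%
    (1_225_000, 160.0),    #  40%
    (306_000, 110.0),      #  10%
]
active_idle_watts = 65.0   # 0% load: consumes power, produces no ops

total_ops = sum(ops for ops, _ in load_levels)
total_watts = sum(watts for _, watts in load_levels) + active_idle_watts

print(f"overall ops/watt = {total_ops / total_watts:,.0f}")
```

Both the numerator and the denominator matter here: a server that idles at high power is penalized even if its peak throughput is excellent, which is exactly the point the book makes about treating energy as a first-class attribute of performance.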
PDF Catalog
PDF Pages | PDF Title |
---|---|
6 | Contributors |
8 | About the Authors |
12 | Contents |
16 | Foreword |
18 | Preface DISCLAIMERS |
20 | Acknowledgments CONTRIBUTING ORGANIZATIONS |
22 | 1—Introduction 1 INTRODUCTION 1.1 THE INDUSTRY’S NEED—FILLING THE VOID 1.1.1 Why Performance per Watt? 1.1.2 Why Multiple Measures? |
23 | 1.1.3 Understanding Measures of Performance and Power in Computing Environments |
24 | 1.1.4 Saving Energy in the Data Center |
25 | 1.2 OVERVIEW OF THIS BOOK AND ITS ORGANIZATION |
26 | 1.2.1 Intended Audience 1.2.2 Navigation Through This Book |
32 | 1.3 SUMMARY OF BENCHMARKS AND TOOLS AND THEIR METRICS |
36 | 2—SPEC Power and Performance Benchmark Methodology 2.1 INTRODUCTION TO SPEC AND THE METHODOLOGY DOCUMENT 2.2 DOCUMENT OVERVIEW: SPEC POWER AND PERFORMANCE BENCHMARK METHODOLOGY |
37 | 2.2.1 Measurement at Different Load Levels 2.2.2 Wide Variety of Computer Configurations |
38 | 2.2.3 Combining Performance Measures with Power Measures 2.2.4 Establishing Metrics and Rules for Fair Use of Information |
40 | 3—SPEC PTDaemon |
42 | 4—SPEC Benchmarks |
43 | 4.1 SPECPOWER_SSJ2008 4.1.1 Document Excerpts: SPECpower_ssj2008 Design Documents |
45 | 4.1.2 SPECpower_ssj2008 Metric |
46 | 4.1.3 Driving Power Efficiency—SPECpower_ssj2008 Historical Trends |
47 | 4.1.4 Sample Result for SPECpower_ssj2008 4.2 SPECVIRT_SC2010 (RETIRED) 4.2.1 Document Excerpts: SPECvirt_sc2010 Design Overview |
51 | 4.2.2 Sample Result of SPECvirt_sc2010 4.3 SPECVIRT_SC2013 |
52 | 4.3.1 Document Excerpts: SPECvirt_sc2013 Design Overview |
55 | 4.3.2 Sample Result of SPECvirt_sc2013 4.4 SPECOMP2012 |
56 | 4.4.1 Document Excerpts: SPEComp2012 Run Rules/Documentation |
59 | 4.4.2 Sample Result of SPEComp2012 4.5 SPECWEB2009 (RETIRED) |
60 | 4.5.1 Document Excerpts: SPECweb2009 Design Document |
63 | 4.5.2 Sample Result of SPECweb2009 |
64 | 5—TPC-Energy 5.1 INTRODUCTION TO THE TPC AND TPC-ENERGY |
65 | 5.2 TPC BENCHMARKS 5.2.1 TPC-C 5.2.2 TPC-DS |
66 | 5.2.3 TPC-E 5.2.4 TPC-H 5.2.5 TPC-VMS |
67 | 5.3 OVERVIEW OF THE ENERGY MEASURING SYSTEM 5.4 TPC-ENERGY STAGES 5.4.1 TPC-Energy Configuration |
68 | 5.4.2 Reported Energy Configuration (REC) 5.4.3 Power Measurable Units |
69 | 5.4.4 Benchmark Configurations |
72 | 5.4.5 Power Analyzer Configuration |
74 | 5.4.6 Energy Measuring System (EMS) 5.5 TPC-ENERGY EXECUTION AND VALIDATION |
75 | 5.5.1 Run Validation |
76 | 5.5.2 Report Generator (Rgen) |
77 | 5.5.3 Power Analyzer Calculation 5.6 TPC-ENERGY TUNING |
78 | 5.7 SAMPLE RESULTS FOR TPC-ENERGY |
80 | 6—VMware VMmark 6.1 INTRODUCTION TO VMMARK 6.2 VMMARK 2.5: MEASURING POWER USAGE OF VIRTUALIZED DATA CENTERS |
81 | 6.3 EXCERPTS FROM VMMARK BENCHMARKING GUIDE |
82 | 6.3.1 VMmark Benchmark Workloads |
84 | 6.4 SAMPLE RESULT OF VMMARK POWER PERFORMANCE |
86 | 7—SAP Power Benchmarks 7.1 OVERVIEW AND MOTIVATION |
87 | 7.2 SAP POWER BENCHMARKS |
88 | 7.3 TYPES OF SAP POWER BENCHMARKS |
89 | 7.3.1 Server Power Benchmark |
90 | 7.3.2 System Power Benchmark |
92 | 7.3.3 Interpreting Results |
93 | 7.3.4 Comparison of Energy-Efficient Technologies |
95 | 7.4 CONCLUSION 7.4.1 Sample SAP Power Benchmark Result and Additional Materials |
96 | 8—Storage Energy Benchmarks |
97 | 8.1 SNIA EMERALD PROGRAM 8.1.1 Introduction to the SNIA Emerald Program 8.1.2 Current Scope of the SNIA Emerald Program 8.1.3 Taxonomy |
100 | 8.1.4 Capacity Optimization Methods |
101 | 8.1.5 Execution Overview of the Benchmark 8.1.6 Results, Disclosure, and Example of an Emerald Run |
102 | 8.1.7 Interpreting the Results of the Emerald Benchmark |
104 | 8.2 SPC ENERGY BENCHMARKS 8.2.1 Introduction to SPC: Relationship between Servers and Storage |
105 | 8.2.2 Defining Storage for SPC-1 and SPC-2 |
106 | 8.2.3 SPC Energy Metric Summary 8.2.4 Highlights from SPC-1/E |
107 | 8.2.5 Highlights of SPC-2/E |
108 | 8.3 SOME OBSERVATIONS AND CONCLUSIONS |
112 | 9—Server Efficiency Rating Tool (SERT) 9.1 STATUS 9.2 INTRODUCTION |
113 | 9.3 GENERAL DESIGN 9.3.1 Goals 9.3.2 Chauffeur—Benchmark Framework |
114 | 9.3.3 Workload 9.3.4 Target Load Levels |
115 | 9.4 SERT OVERVIEW |
116 | 9.4.1 SERT User Interface |
117 | 9.4.2 SERT Scoring System |
120 | 10—Worldwide Regulatory and Standards Organizations 10.1 EPA SERVER EFFICIENCY MEASURES |
121 | 10.1.1 Power Supply Efficiency 10.1.2 System Capacity 10.1.3 Idle Power 10.1.4 System Utilization |
122 | 10.1.5 Power Supply Efficiency 10.1.6 Power Management Enablement 10.1.7 Minimum Idle Power Requirements |
124 | 10.1.8 Real-Time Reporting of Power Use and Inlet Temperature 10.1.9 SERT Testing |
126 | 10.1.10 Standard Information Reporting 10.1.11 Advantages |
127 | 10.1.12 Disadvantages 10.1.13 Implications of ENERGY STAR Regulations |
128 | 10.2 UPCOMING SERVER EFFICIENCY MEASURES |
129 | 10.3 CHINA 10.4 EUROPEAN UNION 10.5 INDIA 10.6 JAPAN |
130 | 10.7 KOREA 10.8 IEEE 1680.4: STANDARD FOR ENVIRONMENTAL ASSESSMENT OF SERVERS |
132 | Appendix A—SPEC Power and Performance Benchmark Methodology A.1 KEY ELEMENTS OF A POWER AND PERFORMANCE BENCHMARK |
133 | A.1.1 Wide Variety of Computing Environments A.1.2 One Benchmark Satisfies Only One Business Model |
134 | A.2 DEFINING POWER COMPONENTS WITHIN PERFORMANCE BENCHMARKS A.2.1 Types of Performance Benchmarks A.2.2 Active-Idle |
135 | A.2.3 New Benchmark Development Efforts A.2.4 Existing Benchmark that Can Run Only at 100% |
136 | A.2.5 Existing Benchmark Capable of Graduated Throughput Levels |
137 | A.2.6 Defining the Intermediate Measurement Intervals |
142 | A.2.7 Hybrid Benchmarks A.3 SYSTEM UNDER TEST (SUT) A.3.1 Servers versus Personal Systems |
143 | A.3.2 Discrete Server (Tower or Rack-mounted) A.3.3 Disaggregated Server |
144 | A.3.4 Blade Server (Blade Enclosure-mounted) A.3.5 Measurement of Subsystems within a SUT |
146 | A.3.6 Additional Configuration Considerations A.4 POWER MEASUREMENT A.4.1 A Note on Direct Current-Powered Data Centers A.4.2 Power Analyzer Requirements |
147 | A.4.3 Process for Accepting Power Analyzers for Benchmark Measurements |
148 | A.4.4 Environmental Considerations |
151 | A.5 PERFORMANCE/POWER METRICS A.5.1 Power Measurements at Distinct Benchmark Measurement Intervals A.5.2 Computing a Benchmark Performance-per-Power Value |
155 | A.6 REPORTING |
156 | A.6.1 Environment and Pre-measurement Reporting A.6.2 Performance Reporting |
157 | A.6.3 Power Reporting |
159 | A.7 AUTOMATION AND VALIDATION A.7.1 Integration of Commands to Collect Power and Thermal Data A.7.2 Controlling Power and Performance Benchmarks |
161 | A.7.3 SPECpower Control and Collect System |
162 | A.8 FAIR USE CONSIDERATIONS |
164 | Appendix B—Sample Results and Explanation B.1 SPECpower_ssj2008—SAMPLE RESULT AND EXPLANATION B.1.1 The Power/Performance Report (Figures B.1–B.13) |
168 | B.1.2 The Power/Temperature Details Report B.1.3 Additional Performance Reports |
169 | B.2 SPECVIRT_SC2010—SAMPLE RESULT AND EXPLANATION |
173 | B.3 SPECVIRT_SC2013—SAMPLE RESULT AND EXPLANATION |
177 | B.4 SPECOMP2012—SAMPLE RESULT AND EXPLANATION |
179 | B.5 SPECWEB2009—SAMPLE RESULT AND EXPLANATION |
184 | B.6 TPC-ENERGY PLUS TPC-E—SAMPLE PUBLICATION B.7 TPC-ENERGY PLUS TPC-C—SAMPLE PUBLICATION |
186 | B.8 VMMARK—SAMPLE RESULT AND EXPLANATION |
191 | B.9 SAP POWER BENCHMARKS—IMPLEMENTATION AND SAMPLE RESULTS B.9.1 Measurement Methodology |
195 | B.9.2 Environmental Conditions and Power Measurement |
196 | B.9.3 Power-Related Data to Be Submitted |
197 | B.9.4 Temperature-Related Data to Be Submitted |
198 | B.9.5 Benchmark Run Rules |
200 | B.9.6 Formal Aspects |
205 | B.9.7 Benchmark Report Sheet |
206 | B.9.8 A Closer Look at the Published Power Benchmark Results |
208 | B.10 SERT—SAMPLE RESULT AND EXPLANATIONS B.10.1 The Standard Report |
211 | B.10.2 The Detail Report |
214 | Glossary of Terms |
218 | References |