Low price
CHF39.90
Not yet published. This item will be available on 23.01.2025
Flap text
An expert guide to productively and profitably putting your organization's data to use. Providing both underlying theory and practical solutions, Analytics the Right Way is a thorough exploration of how to create tangible business value with data. Written by Tim Wilson, a seasoned industry professional with more than 20 years of proven experience, and Dr. Joe Sutherland, a renowned professor and researcher who served in the White House during the Obama administration, this book shows readers how to address common data and analytics frustrations and anxieties, including a lack of actionable insights, ineffective recommendations, difficulties scaling, and unclear ROI. Written in accessible language, with helpful illustrations throughout to elucidate key concepts, this book explores topics including:
Contents
Table of Contents
Acknowledgments xiii
About the Authors xvii

CHAPTER 1 Is This Book Right for You? 1
  The Digital Age = The Data Age 3
  What You Will Learn in This Book 6
  Will This Book Deliver Value? 7

CHAPTER 2 How We Got Here 9
  Misconceptions About Data Hurt Our Ability to Draw Insights 11
  Misconception 1: With Enough Data, Uncertainty Can Be Eliminated 12
  Having More Data Doesn't Mean You Have the Right Data 13
  Even with an Immense Amount of Data, You Cannot Eliminate Uncertainty 16
  Data Can Cost More Than the Benefit You Get from It 18
  It Is Impossible to Collect and Use "All" of the Data 18
  Misconception 2: Data Must Be Comprehensive to Be Useful 19
  "Small Data" Can Be Just As Effective As, If Not More Effective Than, "Big Data" 20
  Misconception 3: Data Are Inherently Objective and Unbiased 21
  In Private, Data Always Bend to the User's Will 23
  Even When You Don't Want the Data to Be Biased, They Are 24
  Misconception 4: Democratizing Access to Data Makes an Organization Data-Driven 26
  Conclusion 28

CHAPTER 3 Making Decisions with Data: Causality and Uncertainty 29
  Life and Business in a Nutshell: Making Decisions Under Uncertainty 30
  What's in a Good Decision? 32
  Minimizing Regret in Decisions 33
  The Potential Outcomes Framework 34
  What's a Counterfactual? 34
  Uncertainty and Causality 36
  Potential Outcomes in Summary 42
  So, What Now? 43

CHAPTER 4 A Structured Approach to Using Data 45

CHAPTER 5 Making Decisions Through Performance Measurement 53
  A Simple Idea That Trips Up Organizations 54
  "What Are Your KPIs?" Is a Terrible Question 58
  Two Magic Questions 60
  A KPI Without a Target Is Just a Metric 68
  Setting Targets with the Backs of Some Napkins 72
  Setting Targets by Bracketing the Possibilities 74
  Setting Targets by Just Picking a Number 78
  Dashboards as a Performance Measurement Tool 80
  Summary 82

CHAPTER 6 Making Decisions Through Hypothesis Validation 85
  Without Hypotheses, We See a Drought of Actionable Insights 88
  Breaking the Lamentable Cycle and Creating Actionable Insight 89
  Articulating and Validating Hypotheses: A Framework 91
  Articulating Hypotheses That Can Be Validated 92
  The Idea: We believe [some idea] 95
  The Theory: ...because [some evidence or rationale]... 96
  The Action: If we are right, we will... 98
  Exercise: Formulate a Hypothesis 101
  Capturing Hypotheses in a Hypothesis Library 101
  Just Write It Down: Ideating a Hypothesis vs. Inventorying a Hypothesis 104
  An Abundance of Hypotheses 105
  Hypothesis Prioritization 106
  Alignment to Business Goals 107
  The Ongoing Process of Hypothesis Validation 108
  Tracking Hypotheses Through Their Life Cycle 109
  Summary 110

CHAPTER 7 Hypothesis Validation with New Evidence 113
  Hypotheses Already Have Validating Information in Them 115
  100% Certainty Is Never Achievable 116
  Methodologies for Validating Hypotheses 118
  Anecdotal Evidence 119
  Strengths of Anecdotal Evidence 120
  Weaknesses of Anecdotal Evidence 121
  Descriptive Evidence 122
  Strengths of Descriptive Evidence 123
  Weaknesses of Descriptive Evidence 124
  Scientific Evidence 128
  Strengths of Scientific Evidence 129
  Weaknesses of Scientific Evidence 135
  Matching the Method to the Costs and Importance of the Hypothesis 137
  Summary 139

CHAPTER 8 Descriptive Evidence: Pitfalls and Solutions 141
  Historical Data Analysis Gone Wrong 142
  Descriptive Analyses Done Right 146
  Unit of Analysis 146
  Independent and Dependent Variables 149
  Omitted Variables Bias 151
  Time Is Uniquely Complicating 153
  Describing Data vs. Making Inferences 154
  Quantifying Uncertainty 156
  Summary 163

CHAPTER 9 Pitfalls and Solutions for Scientific Evidence 165
  Making Statistical Inferences 166
  Detecting and Solving Problems with Selection Bias 168
  Define the Population 168
  Compare the Population to the Sample 168
  Determine What Differences Are Unexpectedly Different 169
  Random and Nonrandom Selection Bias 169
  The Scientist's Mind: It's the Thought That Counts! 170
  Making Causal Inferences 171
  Detecting and Solving Problems with Confounding Bias 172
  Create a List of Things That Could Affect the Concept We're Analyzing 173
  Draw Causal Arrows 173
  Look for Confounding "Triangles" Between the Circles and the Box 174
  Solving for Confounding in the Past and the Future 175
  Controlled Experimentation 176
  The Gold Standard of Causation: Controlled Experimentation 177
  The Fundamental Requirements for a Controlled Experiment 179
  Some Cautionary Notes About Controlled Experimentation 184
  Summary 185

CHAPTER 10 Operational Enablement Using Data 187
  The Balancing Act: Value and Efficiency 189
  The Factory: How to Think About Data for Operational Enablement 191
  Trade Secrets: The Original Business Logic 192
  How Hypothesis Validation Develops Trade Secrets and Business Logic 193
  Operational Enablement and Data in Defined Processes 194
  Output Complexity and Automation Costs 196
  Machine Learning and AI 199
  Machine Learning: Discovering Mechanisms Without Manual Intervention 199
  Simple Machine-learned Rulesets 200
  Complex Machine-learned Rulesets 202
  AI: Executing Mechanisms Autonomously 203
  Judgment: Deciding to Act on a Prediction 204
  Degrees of Delegation: In-the-loop, On-the-loop, and Out-of-the-loop 204
  Why Machine Learning Is Important for Operational Enablement 209

CHAPTER 11 Bringing It All Together 211
  The Interconnected Nature of the Framework 212
  Performance Measurement Triggering Hypothesis Validation 212
  Level 1: Manager Knowledge 213
  Level 2: Peer Knowledge 214
  Level 3: Not Readily Apparent 215
  Hypothesis Validation Triggering Performance Measurement 216
  Did the Corrective Action Work? 216
  "Performance Measurement" as a Validation Technique 216
  Operational Enablement Resulting from Hypothesis Validation 220
  Operational Enablement Needs Performance Measurement 222
  A Call Center Example 223
  Enabling Good Ideas to Thrive: Effective Communication 225
  Alright, Alright: You Do Need Technology 226
  What Technology Does Well 227
  What Technology Doesn't Do Well 228
  Final Thoughts on Decision-making 230

Index 233