1,088 reads

Model Evaluation With Proper Scoring Rules: A No-Math Introduction

by Nikola O. | 3 min read | October 27th, 2021

Too Long; Didn't Read

Proper scoring rules offer a model-evaluation framework for probabilistic forecasts. Calibration tells us whether our predictions are statistically consistent with the observed events or values. Sharpness captures how concentrated the predictive distribution is — the uncertainty in the predictions — without considering the actual outcomes. We want our model-evaluation technique to be immune to 'hedging'. Hedging your bets means betting on both sides of an argument, or on both teams in a competition. If a metric or score can be 'hacked' this way, then you might have a problem.
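To make the hedging point concrete, here is a minimal Python sketch (the function names are my own, not from the article) using the Brier score, a classic proper scoring rule. "Proper" means that if an event truly occurs with probability p, reporting your honest belief p minimizes your expected score — so hedging toward a safe 50/50 forecast can only hurt you in expectation.

```python
def brier_score(forecast, outcome):
    # Squared difference between the predicted probability and the 0/1 outcome.
    return (forecast - outcome) ** 2

def expected_brier(reported, true_p):
    # Expected Brier score when the event truly occurs with probability true_p
    # but the forecaster reports probability `reported`.
    return true_p * brier_score(reported, 1) + (1 - true_p) * brier_score(reported, 0)

true_p = 0.8                              # the forecaster's honest belief
honest = expected_brier(0.8, true_p)      # report what you believe -> 0.16
hedged = expected_brier(0.5, true_p)      # hedge toward 50/50      -> 0.25
assert honest < hedged                    # honest reporting wins in expectation
```

Lower is better for the Brier score, and the honest report (0.16) beats the hedged one (0.25) — the 'hack' does not pay.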

About the Author

Nikola O. (@nikolao) combines ideas from data science, the humanities, and the social sciences. Enjoys thinking, science fiction, and design.

