Understanding Sensitivity and Specificity
2017-12-01 14:56
Sina Weibo:小锋子Shawn
Tencent E-mail:403568338@qq.com
http://blog.csdn.net/dgyuanshaofeng/article/details/78686117
Wikipedia gives a detailed explanation in its article Sensitivity and specificity.
Sensitivity, also known as recall, is described by Wikipedia as follows: "Sensitivity (also called the true positive rate, the recall, or probability of detection in some fields) measures the proportion of positives that are correctly identified as such (e.g. the percentage of sick people who are correctly identified as having the condition)." It is computed as in Eq. (1):
Sensitivity/TPR = TP / (TP + FN) (1)
Here TP is the number of positive samples predicted as positive, and FN is the number of positive samples predicted as negative; TPR stands for true positive rate. The larger the sensitivity, the more of the sick are correctly judged to be sick, and the fewer missed detections (FN) there are.
Specificity is described by Wikipedia as follows: "Specificity (also called the true negative rate) measures the proportion of negatives that are correctly identified as such (e.g. the percentage of healthy people who are correctly identified as not having the condition)." By analogy with Eq. (1), it is computed as in Eq. (2):
Specificity/TNR = TN / (TN + FP) (2)
Here TN is the number of negative samples predicted as negative, and FP is the number of negative samples predicted as positive; TNR stands for true negative rate. The larger the specificity, the more of the healthy are correctly judged to be healthy, and the fewer false detections (FP) there are.
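As a minimal sketch of Eqs. (1) and (2), the four counts TP, FP, TN, and FN can be tallied from binary labels and the two rates computed directly. The labels below are made-up illustrative data, not from the article:

```python
def confusion_counts(y_true, y_pred):
    """Count TP, FP, TN, FN for binary labels (1 = positive, 0 = negative)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, fp, tn, fn

# Hypothetical ground truth and predictions for 8 samples.
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0, 1]

tp, fp, tn, fn = confusion_counts(y_true, y_pred)  # (3, 1, 3, 1)
sensitivity = tp / (tp + fn)  # Eq. (1): 3 / 4 = 0.75
specificity = tn / (tn + fp)  # Eq. (2): 3 / 4 = 0.75
```

Note that sensitivity only looks at the positive row of the confusion matrix and specificity only at the negative row, which is why the two can be traded against each other by moving the decision threshold.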
A metric related to these two is precision, also called the positive predictive value (PPV), computed as in Eq. (3):
PPV = TP / (TP + FP) (3)
Here TP and FP are defined as above. Its counterpart is the negative predictive value (NPV), computed as in Eq. (4):
NPV = TN / (TN + FN) (4)
When we use machine learning algorithms, especially deep learning algorithms, for classification tasks, we usually report classification accuracy, computed as in Eq. (5):
ACC = (TP + TN) / (TP + FP + FN + TN) (5)
Finally, to trade off precision and sensitivity, one can use the F1 score, their harmonic mean, computed as in Eq. (6):
F1 = 2TP / (2TP + FP + FN) (6)