📄 Measuring AI Model Accuracy

2023-11-13 16:00

Confusion Matrix

  • TP: True Positive
  • TN: True Negative
  • FP: False Positive
  • FN: False Negative
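The four counts can be tallied directly from label pairs. A minimal sketch (the `y_true`/`y_pred` example values are illustrative, not from the original):

```python
# Count TP, TN, FP, FN for a binary classifier (1 = positive, 0 = negative).
def confusion_counts(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

# Hypothetical example labels
y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0]
print(confusion_counts(y_true, y_pred))  # (2, 2, 1, 1)
```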

Accuracy

$$accuracy = {\text{number of correctly predicted samples} \over \text{total number of samples}}$$ $$accuracy = {TP + TN \over TP + TN + FP + FN}$$

Precision

$$precision = {TP \over TP + FP}$$

Recall

$$recall = {TP \over TP + FN}$$

F1 Score

$$F1 = 2 \times {precision \times recall \over precision + recall}$$ $$F1 = {2TP \over 2TP + FP + FN}$$
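The formulas above translate directly into code. A sketch computing accuracy, precision, recall, and F1 from the four counts (the sample counts are illustrative):

```python
def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp, fp):
    return tp / (tp + fp)

def recall(tp, fn):
    return tp / (tp + fn)

def f1_score(tp, fp, fn):
    # Algebraically equivalent to 2 * precision * recall / (precision + recall)
    return 2 * tp / (2 * tp + fp + fn)

tp, tn, fp, fn = 2, 2, 1, 1
print(accuracy(tp, tn, fp, fn))   # 0.666...
print(precision(tp, fp))          # 0.666...
print(recall(tp, fn))             # 0.666...
print(f1_score(tp, fp, fn))       # 0.666...
```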

Kappa

$$K={๋ชจ๋ธ์ด True๋กœ ์˜ˆ์ธกํ•œ ๋ฐ์ดํ„ฐ ๊ฐœ์ˆ˜ - ์‹ค์ œ True์ธ ๋ฐ์ดํ„ฐ์˜ ๊ฐœ์ˆ˜ \over 1 - ์‹ค์ œ True์ธ ๋ฐ์ดํ„ฐ์˜ ๊ฐœ์ˆ˜}$$

MCC(Matthews Correlation Coefficient)

$$MCC = {TP \times TN - FP \times FN \over \sqrt{(TP+FP)(TP+FN)(TN+FP)(TN+FN)}}$$
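The MCC formula above can be sketched directly, guarding against a zero denominator (returning 0 in that case is a common convention, not part of the original note):

```python
import math

def mcc(tp, tn, fp, fn):
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

print(mcc(2, 2, 1, 1))  # 0.333...
```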

ROC Curve

๊ฐ€๋กœ์ถ•์„ FP Rate(Specificity)๊ฐ’์˜ ๋น„์œจ๋กœ ํ•˜๊ณ  ์„ธ๋กœ์ถ•์„ TP Rate(Sensitive)๋กœ ํ•˜์—ฌ ๊ทธ๋ž˜ํ”„๋กœ ๊ทธ๋ฆฐ๊ฒƒ

$$specificity = {TN \over TN+FP}$$ $$sensitivity = recall = {TP \over TP + FN}$$

AUC(Area Under Curve)

ROC ๊ณก์„ ์˜ ํ•˜๋ถ€ ๋ฉด์ .

  • Draw the ROC curve, then compute the area by integrating it (or by summing the heights of the discrete segments)
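The procedure above, sweeping a threshold over prediction scores to collect (FPR, TPR) points and then integrating with the trapezoidal rule, can be sketched as follows (the labels and scores are a hypothetical example):

```python
def roc_points(y_true, scores):
    # Sweep thresholds from high to low; at each one, classify
    # score >= threshold as positive and record (FPR, TPR).
    thresholds = sorted(set(scores), reverse=True)
    pos = sum(y_true)
    neg = len(y_true) - pos
    points = [(0.0, 0.0)]
    for th in thresholds:
        tp = sum(1 for t, s in zip(y_true, scores) if s >= th and t == 1)
        fp = sum(1 for t, s in zip(y_true, scores) if s >= th and t == 0)
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    # Trapezoidal integration of TPR over FPR.
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2
    return area

y_true = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
print(auc(roc_points(y_true, scores)))  # 0.75
```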