---
title: FBeta_Score
datasets:
-
tags:
- evaluate
- metric
description: "Calculate FBeta_Score"
sdk: gradio
sdk_version: 3.0.2
app_file: app.py
pinned: false
---
# Metric Card for FBeta_Score
## Metric Description
Compute the F-beta score.

The F-beta score is the weighted harmonic mean of precision and recall, reaching its optimal value at 1 and its worst value at 0. The `beta` parameter determines the weight of recall in the combined score: `beta < 1` lends more weight to precision, while `beta > 1` favors recall (`beta -> 0` considers only precision, `beta -> +inf` only recall).
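As an illustration of the formula, the sketch below computes F-beta directly from precision and recall in plain Python. The `fbeta` helper and the example precision/recall values are illustrative only and are not part of this module:

```python
# Illustration of the F-beta formula: the weighted harmonic mean of
# precision and recall, where beta controls the weight given to recall.
def fbeta(precision: float, recall: float, beta: float) -> float:
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Example values (illustrative only): beta < 1 weights precision more heavily.
print(fbeta(precision=0.8, recall=0.5, beta=0.5))  # ~0.714
print(fbeta(precision=0.8, recall=0.5, beta=2.0))  # ~0.541
```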
## How to Use
```python
import evaluate

f_beta = evaluate.load("leslyarun/f_beta")
results = f_beta.compute(references=[0, 1], predictions=[0, 1], beta=0.5)
print(results)
# {'f_beta_score': 1.0}
```
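The citation below points to scikit-learn, so the result can presumably be cross-checked against `sklearn.metrics.fbeta_score` directly. This is a hedged sketch, assuming binary labels as in the example above, and is not part of this module:

```python
from sklearn.metrics import fbeta_score

references = [0, 1]
predictions = [0, 1]
# Binary F-beta with beta=0.5, matching the example above.
print(fbeta_score(references, predictions, beta=0.5))  # 1.0
```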
## Citation
```bibtex
@article{scikit-learn,
  title={Scikit-learn: Machine Learning in {P}ython},
  author={Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V.
          and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P.
          and Weiss, R. and Dubourg, V. and Vanderplas, J. and Passos, A. and
          Cournapeau, D. and Brucher, M. and Perrot, M. and Duchesnay, E.},
  journal={Journal of Machine Learning Research},
  volume={12},
  pages={2825--2830},
  year={2011}
}
```
## Further References
https://scikit-learn.org/stable/modules/generated/sklearn.metrics.fbeta_score.html#sklearn.metrics.fbeta_score