---
title: Accuracy
emoji: πŸ“ˆ
colorFrom: blue
colorTo: red
sdk: gradio
sdk_version: 4.44.0
app_file: app.py
pinned: false
license: apache-2.0
tags:
  - evaluate
  - metric
short_description: Accuracy (ACC)
description: >-
  The Accuracy (ACC) metric is used to measure the proportion of correctly
  predicted sequences compared to the total number of sequences. This metric can
  handle both integer and string inputs by converting them to strings for
  comparison. The ACC ranges from 0 to 1, where 1 indicates perfect accuracy
  (all predictions are correct) and 0 indicates complete failure (no predictions
  are correct). It is particularly useful in tasks such as OCR, digit
  recognition, sequence prediction, and any task where exact matches are
  required. The accuracy can be calculated using the formula: ACC = (Number of
  Correct Predictions) / (Total Number of Predictions), where a prediction is
  considered correct if it exactly matches the ground truth sequence after
  converting both to strings.
---

# Metric Card for ACC

## Metric Description

The Accuracy (ACC) metric measures the proportion of correctly predicted sequences out of the total number of sequences. Both integer and string inputs are supported: each prediction and reference is converted to a string before comparison. ACC ranges from 0 to 1, where 1 indicates perfect accuracy (every prediction is correct) and 0 indicates complete failure (no prediction is correct). It is particularly useful for tasks such as OCR, digit recognition, sequence prediction, and any other task where exact matches are required. Accuracy is computed with the formula:

ACC = (Number of Correct Predictions) / (Total Number of Predictions)

where a prediction is considered correct if it exactly matches the ground truth sequence after converting both to strings.
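For reference, here is a minimal sketch of the exact-match computation described above. It is an illustration only, not the Space's actual implementation; the function name `compute_accuracy` is chosen just for this example.

```python
def compute_accuracy(predictions, references):
    """Exact-match accuracy: every input is cast to a string before comparison."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must have the same length")
    if not references:
        return 0.0
    correct = sum(str(p) == str(r) for p, r in zip(predictions, references))
    return correct / len(references)


# Example: 3 of the 4 predictions match their references exactly -> ACC = 0.75
print(compute_accuracy([123, "456", "78a", "90"], ["123", "456", "789", "90"]))
```

Casting both sides to strings is what allows the metric to accept mixed integer and string inputs, as noted in the description.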