Definitions

Artificial intelligence

Accuracy

pertains to an AI system's ability to make correct judgements, for example to correctly classify information into the proper categories, or its ability to make correct predictions, recommendations, or decisions based on data or models. An explicit and well-formed development and evaluation process can help mitigate and correct unintended risks arising from inaccurate predictions. When occasional inaccurate predictions cannot be avoided, it is important that the system can indicate how likely these errors are. A high level of accuracy is especially crucial in situations where the AI system directly affects human lives.[1]
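
A minimal sketch of the classification sense of this definition: accuracy as the fraction of correct predictions, with each prediction's probability reported so that error-prone outputs can be flagged. The model outputs, labels, and the 0.6 confidence threshold below are hypothetical illustrations, not taken from the cited guideline.

 # Illustrative sketch (hypothetical data, not from the cited source).
 def classification_accuracy(predictions, labels):
     """Fraction of predictions that match the true labels."""
     correct = sum(1 for p, y in zip(predictions, labels) if p == y)
     return correct / len(labels)
 
 # Hypothetical model outputs: (predicted class, predicted probability).
 outputs = [("spam", 0.97), ("ham", 0.55), ("spam", 0.88), ("ham", 0.91)]
 labels = ["spam", "spam", "spam", "ham"]
 
 preds = [cls for cls, _ in outputs]
 print(f"accuracy = {classification_accuracy(preds, labels):.2f}")  # 0.75
 
 # Low-confidence predictions mark where inaccurate output is most likely.
 for (cls, prob), y in zip(outputs, labels):
     if prob < 0.6:  # hypothetical confidence threshold
         print(f"low confidence: predicted {cls!r} with p={prob:.2f} (true: {y!r})")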

Data

Accuracy is

the degree of closeness of a measured or calculated quantity to its actual (true) value. When used in connection with databases, it refers to the correctness of the data the database contains.
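
For the measurement sense, a short illustrative example; the reading below is hypothetical, compared against standard gravity as the accepted true value.

 # Illustrative sketch: closeness of a measured value to its true value,
 # expressed as absolute and relative error.
 true_value = 9.80665  # standard gravity in m/s^2 (accepted true value)
 measured = 9.81       # a hypothetical measurement
 
 absolute_error = abs(measured - true_value)
 relative_error = absolute_error / true_value
 
 print(f"absolute error: {absolute_error:.5f} m/s^2")  # 0.00335
 print(f"relative error: {relative_error:.4%}")        # ~0.0342%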

General

Accuracy is

  • "conformity to fact, or the degree to which the recorded value represents the 'correct' value."[2]
  • "[t]he degree to which a measured value conforms to true or accepted values."[3]
  • "(1) [a] qualitative assessment of correctness or freedom from error. (2) A quantitative measure of the magnitude of error. Contrast with precision. (3) The measure of an instrument's capability to approach a true or absolute value. It is a function of precision and bias."[5]

Overview

"Accuracy is a measure of correctness. It is distinguished from precision, which measures exactness."[6]

References

See also
