Data Quality as Meta?


When presenting data, try to include some sense of its quality or accuracy, even if it's just a flag such as "I derived this," "I got this from a very accurate source," or "this is a space-filler."

I wanted to highlight something I found quite interesting in Axeda Corporation's Gateway and Connector technologies: quality of metrics. Axeda uses an enumeration of simple qualities (Good, Bad, or Unknown), which could theoretically be used when choosing which of two conflicting data values to show.
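Axeda's actual API isn't reproduced here, but a minimal sketch of the idea is easy to imagine: attach a quality tag to each reading, and when two sources disagree, prefer the one with the better quality. The `Quality`, `Reading`, and `prefer` names below are hypothetical, not Axeda's.

```python
from dataclasses import dataclass
from enum import IntEnum

# Hypothetical enumeration modeled on Axeda's Good/Bad/Unknown;
# a higher value means a more trustworthy reading.
class Quality(IntEnum):
    BAD = 0
    UNKNOWN = 1
    GOOD = 2

@dataclass
class Reading:
    value: float
    quality: Quality

def prefer(a: Reading, b: Reading) -> Reading:
    """When two sources conflict, show the reading with better quality."""
    return a if a.quality >= b.quality else b

sensor = Reading(72.4, Quality.GOOD)      # from a calibrated source
derived = Reading(68.0, Quality.UNKNOWN)  # interpolated space-filler
print(prefer(sensor, derived).value)      # -> 72.4
```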

The simple act of collecting and summarizing metrics is not necessarily made easier when precision metadata is tracked, but it can help the end-user make better decisions based on the data: if you see an aberrant data point, do you know it's seriously out-of-norm and needs to be acted upon, or is it perhaps based on a ratio with a questionable denominator, and should be taken with a bit of skepticism?
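The "questionable denominator" case suggests one way quality metadata could propagate: a derived metric is only as trustworthy as its shakiest input. This is a sketch under that assumption, not any particular product's behavior; all names are hypothetical.

```python
from dataclasses import dataclass
from enum import IntEnum

class Quality(IntEnum):
    BAD = 0
    UNKNOWN = 1
    GOOD = 2

@dataclass
class Reading:
    value: float
    quality: Quality

def ratio(numerator: Reading, denominator: Reading) -> Reading:
    """Derive a ratio, inheriting the worst quality of the inputs."""
    if denominator.value == 0:
        # Division by zero yields no usable value at all.
        return Reading(float("nan"), Quality.BAD)
    return Reading(numerator.value / denominator.value,
                   min(numerator.quality, denominator.quality))

errors = Reading(3.0, Quality.GOOD)
requests = Reading(10.0, Quality.UNKNOWN)  # questionable denominator
rate = ratio(errors, requests)
print(rate.value, rate.quality.name)       # -> 0.3 UNKNOWN
```

An aberrant error rate tagged Unknown invites skepticism about the denominator before anyone gets paged; the same value tagged Good is actionable.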

Consider precision, or at least define why it’s out-of-scope for your work.
