
Finally, validated data sets to “crunch”!

June 9, 2009

I am excited by the announcement of WolframAlpha. Wolfram claims that this new database of data sets will be validated, and therefore more valuable to the user. My hope is that the validation helps me make a "measurement", and not just a "reading". For WolframAlpha to be truly validated, I think that four elements are necessary (a rough sketch of what this might look like follows the list):

  1. the source of the data must be known and published;
  2. the actual data must be verified to be correct and accurate;
  3. other similar datasets must be identified as well, with a clear account of why they are similar and how they differ; knowing the complete and explicit schema will be essential; and
  4. some data may be missing from the dataset; for example, certain US Census datasets may omit responses from rural areas and towns smaller than a certain threshold population. In other words, there may be "context" to consider, and this is the real challenge for Wolfram: to explain the context of the dataset completely enough.
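To make these four elements concrete, here is a minimal sketch, in Python, of the kind of metadata record that could travel with each dataset. Everything in it (the DatasetRecord class, the coverage_notes field, and so on) is my own hypothetical illustration of the four criteria above, not anything Wolfram has published:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class DatasetRecord:
        """Hypothetical metadata a 'validated' dataset entry might carry."""
        name: str
        source: str                                        # 1. known, published origin of the data
        verified: bool = False                             # 2. values checked for correctness and accuracy
        schema: dict = field(default_factory=dict)         # 3. complete, explicit schema
        related: List[str] = field(default_factory=list)   # 3. similar datasets, with notes on how they differ
        coverage_notes: Optional[str] = None                # 4. known gaps or "context" (e.g. omitted small towns)

        def is_validated(self) -> bool:
            # Only count a record as "validated" when all four elements are present.
            return (
                bool(self.source)
                and self.verified
                and bool(self.schema)
                and self.coverage_notes is not None
            )

    # Example (values are illustrative only):
    census = DatasetRecord(
        name="US Census population estimates",
        source="https://www.census.gov/",
        verified=True,
        schema={"state": "str", "population": "int"},
        related=["American Community Survey"],
        coverage_notes="Responses from towns below a threshold population may be omitted.",
    )
    print(census.is_validated())  # True

The point of the sketch is simply that "validation" is not a yes/no stamp; it is a bundle of source, verification, schema, and context information that the user must be able to inspect.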

If Wolfram is "religious" about this kind of validation, its WolframAlpha data services will be tremendously valuable to researchers and analysts, who can be more confident that they are making "measurements"!
