Steve Lohr in the NY Times: The Age of Big Data.
To summarize: to exploit the data flood, data needs to be linked to intelligence through techniques such as natural language processing, pattern recognition (feature recognition, I would add), and machine learning.
In his blog post Launching a Democratization of Data Science, Stephen Wolfram explains the capability of Wolfram|Alpha Pro to take in data and analyze it automatically with a whole battery of methods.
In the examples you see data in tabular form. This is the typical input format not only for statistical analysis but also for all kinds of supervised and unsupervised machine learning methods, as implemented in mlf - extracting all kinds of interpretable and computational models, maps, and visualizations.
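To make the tabular setting concrete, here is a minimal sketch (my own illustration, not from the post, and using scikit-learn rather than mlf or Wolfram|Alpha): the same table of rows (samples) and columns (features) is fed once to a supervised method that uses labels, and once to an unsupervised method that ignores them.

```python
# Sketch: one data table, two kinds of learning (assumes scikit-learn is installed).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

# A 150 x 4 table: each row is a flower, each column a measurement.
X, y = load_iris(return_X_y=True)

# Supervised: fit an interpretable model (a shallow decision tree) to labeled rows.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print("training accuracy:", round(tree.score(X, y), 2))

# Unsupervised: cluster the same rows without ever seeing the labels.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", sorted(np.bincount(kmeans.labels_)))
```

The point is that the tabular form is the common interface: swapping the method changes what is extracted (a decision tree you can read, cluster assignments you can map), not how the data is prepared.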