Big data: the end of the scientific method?

Recently, I came across an interesting article discussing how our current capability to produce, store, and extract useful information from breathtaking amounts of data affects conventional simulation-based science. Big Data, machine learning, and artificial intelligence are flourishing largely because of the explosive growth in data production, acquisition, and navigation capabilities, and because smart search algorithms can read patterns off complex data sets across disciplines. This poses an epistemological question: are we entering a new scientific era in which the power of data renders the scientific method obsolete? The article assesses the expectations placed on data-driven research and suggests that the boldest claims of Big Data need revision and toning down in light of the following points:

  • Complex systems are strongly correlated, hence they do not (in general) obey Gaussian statistics, which is a fundamental assumption of much data-driven science (see the sketch after this list).
  • No data set is big enough for systems with strong sensitivity to data inaccuracies.
  • In a finite-capacity world, too much data is just as bad as no data.
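
To make the first two points concrete, here is a minimal, illustrative sketch in NumPy (my own toy example, not taken from the paper). It shows, first, that sample means of heavy-tailed data converge far more slowly and erratically than Gaussian intuition suggests, and second, that in a chaotic map a tiny inaccuracy in the initial datum grows exponentially, so gathering more data does not buy back predictability. The specific distributions and parameters are chosen only for illustration.

```python
# Illustrative sketch (not from the paper): two toy experiments in NumPy.
# (1) Sample means of heavy-tailed (Pareto, alpha = 1.5) data wander far more
#     than Gaussian intuition suggests, even as the sample size grows.
# (2) In a chaotic map, a tiny error in the initial datum is amplified
#     exponentially, so no volume of data removes the sensitivity problem.
import numpy as np

rng = np.random.default_rng(0)

# --- (1) Heavy tails: slow, erratic convergence of the sample mean ---
# Both distributions have true mean 3, but the Pareto has infinite variance.
for n in (10**3, 10**5, 10**7):
    gauss = rng.normal(loc=3.0, scale=1.0, size=n)
    pareto = rng.pareto(1.5, size=n) + 1.0
    print(f"n={n:>9,}  Gaussian mean={gauss.mean():.4f}  "
          f"Pareto mean={pareto.mean():.4f}")

# --- (2) Sensitivity to data inaccuracies: the logistic map at r = 4 ---
x, y = 0.4, 0.4 + 1e-12          # same state, "measured" with a 1e-12 error
for step in range(1, 51):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
```

In the first experiment the Gaussian mean settles quickly while the Pareto mean keeps drifting; in the second, the two trajectories are indistinguishable at first but differ by order one after roughly fifty steps, no matter how precisely the initial datum was recorded.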

In the end, the article argues that instead of rendering theory, modelling, and simulation obsolete, Big Data should, and ultimately will, be used to complement and optimize them, helping to overcome their current barriers: non-linearity, non-locality, and hyper-dimensional spaces.

‘Data’, when put in context, leads to ‘information’; analyzing that ‘information’ yields ‘knowledge’; the ‘knowledge’ gained can be deepened by hypothesizing a model for its underlying cause, leading to ‘wisdom’, which in turn is used to refine the model by repeating the cycle. (Big data: the end of the scientific method?, Phil. Trans. R. Soc. A, Vol. 377, Issue 2142, DOI: 10.1098/rsta.2018.0145)