Scientists currently find themselves with an explosion of available data. Big Data alters the epistemic landscape of science, as researchers increasingly rely on automated forms of reasoning to cope with the deluge of data. Some commentators characterize this state of affairs as a novel "data-driven" paradigm, in which we extract knowledge from the patterns contained within large ensembles of data. On this view, data-intensive science is a new Baconian empiricism, in which Big Data speak for themselves: science is undergoing a theory-agnostic transformation, reflected in the adoption of analytic tools that do not depend on prior understanding of the phenomena underlying the data. This shift towards model-free statistical exploration and data mining suggests a parallel to exploratory experimentation, which aims to expand extant conceptual frameworks or to generate new concepts through the variation of parameters and the observation of regularities. I am interested in understanding the characteristic aims of data-driven science, with an eye towards developing normative standards against which particular methods can be evaluated.