p.(x) = Big Data Determinism (2020) by Daniel Sanderson - #Googleplanksip

Thinking in shades of grey is Bayesian. This inversion of causality is somewhat self-correcting by the shadow it casts. Outside the counterfactual, certain thought (and felt) experience contains a mass of intentionality. Critical by all means of regression, the arrow of causality is essential to the bias and the condition that dictates the human condition and our shared advantage as a species. Sure, the penumbra perspective is objective, but only the antumbra is worthy of exhaustive rigour and continued intellectual focus. As a metaphor, this band of shadow play is sui generis to Aristotelian form.
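The Bayesian "shades of grey" can be made concrete with a few lines of arithmetic. The sketch below is purely illustrative (the prior, the likelihoods, and the number of updates are my own assumptions, not figures from the text): a belief starts at mid-grey and is nudged, never flipped, by each piece of evidence.

```python
# A minimal sketch of Bayesian updating: belief held in shades of grey,
# revised gradually by evidence rather than flipped between black and white.
# All numbers are illustrative assumptions.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) from P(H), P(E | H), and P(E | not H)."""
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

belief = 0.5                      # start agnostic: a mid-grey prior
for _ in range(3):                # three pieces of supporting evidence
    belief = bayes_update(belief, 0.8, 0.3)

print(round(belief, 3))           # the posterior climbs toward, but never reaches, certainty
```

Note that even strong evidence only darkens or lightens the grey; certainty (0 or 1) is an asymptote, which is precisely the self-correcting quality the paragraph above gestures at.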

Machine learning is the teleology at hand, and my approach will be poetic yet swift. Examine the language and intentionality of what I present here, for the counterfactual would be a refusal of the gift contained within. Speaking [thinking] of counterfactual alternatives to predicted outcomes adds an intentionality akin to the theory of mind that I label social inevitability. A "thrownness" (a Heideggerian term) whereby we receive the feedback of society and relationships, and this flow of information defines our position in this metaphysical space.

According to Judea Pearl and Dana Mackenzie, contemporary statistical culture invokes the why question only as a desperate attempt to reconcile associations with mathematical analysis.[^1] Rather than continuing down this modal rabbit hole, the more efficient praxis of epistemological propagation should couple counterfactuals (i.e., alternative outcomes) with the reality we observe: "a new paradigm has evolved according to which it is okay to base your claims on assumptions as long as you make your assumptions transparent so that you and others can judge how plausible they are and how sensitive your claims are to their violation."
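Pearl's coupling of counterfactuals with observed reality can be sketched with a toy structural causal model. Everything here is a hedged illustration (the structural equation, the observed values, and the variable names are my own assumptions, not an example from Pearl's book); the three commented steps follow his general recipe of abduction, action, and prediction.

```python
# A toy structural causal model (SCM): the observed outcome is
# Y = 2*X + U, where U is an unobserved background factor.
# Counterfactual question: having observed X=1 and Y=5, what WOULD Y
# have been had X been 0? All numbers are illustrative assumptions.

def outcome(x, u):
    return 2 * x + u            # the structural equation for Y

# Step 1 (abduction): infer the background factor U from the observed facts.
x_obs, y_obs = 1, 5
u = y_obs - 2 * x_obs           # here, u = 3

# Step 2 (action): intervene, setting X to its counterfactual value.
x_cf = 0

# Step 3 (prediction): recompute Y under the same background U.
y_cf = outcome(x_cf, u)
print(y_cf)                     # the counterfactual outcome
```

The key move is that the background factor inferred from the world we did observe is carried into the world we did not, which is exactly the coupling of counterfactual and observed reality described above.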

The day after the data dump is an interesting day indeed. Mapping meaning across massive data sets should yield predictability; anything else would be unethical and, quite frankly, a waste of time. Too much time spent on the fringes of the Bayesian obfuscates goal-directed activity at both the micro (algorithmic) and macro (ontological) levels. The logic and the arrow of causality are as important as ever, especially in predicting human behaviour, not to mention the quest for Artificial General Intelligence. What gene causes lung cancer? What kinds of solar systems are likely to fall within the Goldilocks zone? What factors are causing ecological collapse? How do we effectively combat global burning (a polemical term)?

Orthodox Big Data apologists preach the gospel of an unrevealed truth within the data. As the great Renaissance sculptor Michelangelo said, "Every block of stone [Big Data] has a statue [unrevealed truth] inside it and it is the task of the sculptor [data scientist] to discover it." Hardly a pastiche, my approach is not satirical but a warning to the public and the thought leaders in our community to explore the counterfactuals of our nature. If likening this interplay between counterfactuals and correlative data points is too metaphysical for your taste, then consider how imaginary numbers extend the real numbers: counterfactual claims of alternative outcomes extend observed data in much the same way. Beyond the binary dimensionality of the abscissa and the ordinate, the causal diagrams of Judea Pearl are perhaps as paradigmatic as he claims. History will be the judge.

Interpretation of the data and the identification of confounders are the keys to understanding meaning, the significance of the information, and the potential of alternative applications. For instance, the p.(x) philosophy may appear to be a supportive claim for the Big Data bandwagon, yet the field of Big Data, still in its infancy, will mature, science will prevail (otherwise it is a pseudoscience), and causality will be redeemed as the foundation of our knowledge claims.[^2] Why must we always ask why?
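What "identifying confounders" buys you can be shown with simulated data (the probabilities below are invented for illustration; no real dataset is implied). A lurking variable Z drives both X and Y, so X and Y appear correlated until we stratify on Z, at which point the association vanishes.

```python
# A sketch of confounding: Z influences both X and Y, while X never
# influences Y. The unadjusted data still show an X-Y association,
# which disappears once we condition on the confounder Z.
# All probabilities are illustrative assumptions.

import random

random.seed(0)
rows = []
for _ in range(10_000):
    z = random.random() < 0.5                     # the confounder
    x = random.random() < (0.8 if z else 0.2)     # x depends on z
    y = random.random() < (0.8 if z else 0.2)     # y depends on z, NOT on x
    rows.append((z, x, y))

def p_y_given_x(data, x_val):
    hits = [y for (_, x, y) in data if x == x_val]
    return sum(hits) / len(hits)

# Unadjusted: X and Y look associated (both inherit Z's influence).
print(p_y_given_x(rows, True) - p_y_given_x(rows, False))

# Adjusted: within each stratum of Z, the association (roughly) vanishes.
for z_val in (True, False):
    stratum = [(z, x, y) for (z, x, y) in rows if z == z_val]
    print(p_y_given_x(stratum, True) - p_y_given_x(stratum, False))
```

This is the simplest possible "interpretation of the data": the same table supports a causal story or refutes it, depending entirely on whether the confounder is brought into the analysis.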

Described as the final stage in analyzing causal inference problems, this "blueprint of reality"[^1] folds causal models into the equations, anchoring the triad of science, knowledge, and data into a cohesive information structure. I agree that this is our best attempt at mapping the framework of knowledge from which our meaning should be mapped. In sociology, there are four different types of norms: folkways, mores, taboos, and laws. Mores are the moral norms a culture enforces. Controlled assumptions are paramount to maintaining objectivity and falsifiability. Causal inference and its accompanying diagrams are very useful for producing repeatable conceptual interpretations of estimates. Why must we follow these norms? Ah, the need for philosophy persists, and so should her delightfully ordered aesthetic.

[^1]: The Book of Why: The New Science of Cause and Effect (2018), Chapter 10, pp. 391–392 (electronic version).

[^2]: p.(x) = Big Data Determinism: In "sinew" 8-ing a Möbius ex nihilo (2020) is, in part, a tongue-in-cheek title meant to warn against the empty assumptions pulled from nothing (ex nihilo).


Support Your Friendly Neighbourhood Atelier Today!