How the “Continentals” Internalized Their Oppressors.

1. Scientism and Noumenal Realism in Analytic Philosophy.

It cannot be rationally denied that “Analytic philosophy” (henceforth without the shudder quotes) has always been predominantly and even aggressively scientistic, whether by way of formal science (logic and mathematics) or by way of natural science (primarily physics, but also chemistry and biology).

The Analytic tradition began with Frege’s and Russell’s Logicist formal scientistic programs for reducing either arithmetic or all of mathematics to logic, and both were Platonic noumenal realists.

Logical Empiricism gave up on Logicism after Gödel’s incompleteness theorems, then went over to natural scientism.

Quine and Sellars gave up on Logical Empiricism, but remained scientistic, via their scientific naturalism.

Then Kripke and Lewis fused formal scientism (via modal logic) with natural scientism (via Kripke’s scientific essentialism and Lewis’s criterion of naturalness for basic properties), and both were noumenal realists (via Kripke’s essentialism and Lewis’s possible-worlds realism).

Kripke and Lewis begat Analytic metaphysics, which also fuses modal-logic-driven formal scientism, Lewisian naturalness-driven natural scientism, “carving nature at the joints,” and noumenal scientific realism.

Correspondingly, Ted Sider’s Writing the Book of the World is the Gideon’s Bible of Analytic metaphysics.

And so on, and so on, till five minutes ago.

2. Going Back to SoKali.

Until 1996, “Continental philosophy” (henceforth also without the shudders) was consistently anti-scientistic and anti-noumenal-realistic.

All the major Continentals from Husserl and Heidegger through Sartre and Merleau-Ponty, et al., up to and including the Post-Structuralists (Bataille, Deleuze, Foucault, Guattari, et al.) and the Deconstructionists (Derrida, De Man, et al.) were deeply skeptical about the idea that “science is the measure of all things,” and anti-realists of some brand or another.

Indeed, al was probably the most important Continental up to 1996.

But then:

The Sokal affair, also called the Sokal hoax, was a publishing hoax perpetrated by Alan Sokal, a physics professor at New York University and University College London. In 1996, Sokal submitted an article to Social Text, an academic journal of postmodern cultural studies. The submission was an experiment to test the journal’s intellectual rigor and, specifically, to investigate whether “a leading North American journal of cultural studies – whose editorial collective includes such luminaries as Fredric Jameson and Andrew Ross – [would] publish an article liberally salted with nonsense if (a) it sounded good and (b) it flattered the editors’ ideological preconceptions”.

The article, “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity”, was published in the Social Text spring/summer 1996 “Science Wars” issue. It proposed that quantum gravity is a social and linguistic construct. At that time, the journal did not practice academic peer review and it did not submit the article for outside expert review by a physicist. On the day of its publication in May 1996, Sokal revealed in Lingua Franca that the article was a hoax, identifying it as “a pastiche of left-wing cant, fawning references, grandiose quotations, and outright nonsense … structured around the silliest quotations [by postmodernist academics] he could find about mathematics and physics.”

The hoax sparked a debate about the scholarly merit of humanistic commentary about the physical sciences; the influence of postmodern philosophy on social disciplines in general; academic ethics, including whether Sokal was wrong to deceive the editors and readers of Social Text; and whether Social Text had exercised appropriate intellectual rigor. (Wikipedia, boldfacing added)

Professional academic Continental philosophers were, thereby, shamed and scandalized by Sokal.

And I remember the professional academic sadistic glee with which Analytic philosophers recounted The Sokal Hoax to each other.

Or in other words, by means of The Sokal Hoax, the Continentals were professionally hyper-disciplined by their Analytic oppressors.

And ever since 1996, in the Grand Tradition of Hegel’s master-slave dialectic in the Phenomenology of Spirit, the Continentals have internalized their oppressors, by, pathetically, going over to scientism and/or noumenal realism.

Indeed to spin a phrase around The Notorious B.I.G.’s famous rap anthem, “Going Back to Cali,” ever since 1996 the Continentals have been going back to SoKali.

In more detail now, here’s how their SoKali regression has played out.


3. Badiou To You Too.

Alain Badiou, born 17 January 1937, is a French philosopher, formerly chair of Philosophy at the École Normale Supérieure (ENS) and founder of the faculty of Philosophy of the Université de Paris VIII with Gilles Deleuze, Michel Foucault and Jean-François Lyotard. Badiou has written about the concepts of being, truth, event and the subject in a way that, he claims, is neither postmodern nor simply a repetition of modernity.

For Badiou the problem which the Greek tradition of philosophy has faced and never satisfactorily dealt with is that while beings themselves are plural, and thought in terms of multiplicity, being itself is thought to be singular; that is, it is thought in terms of the one. He proposes as the solution to this impasse the following declaration: that the one is not.

This is why Badiou accords set theory (the axioms of which he refers to as the Ideas of the multiple) such stature, and refers to mathematics as the very place of ontology: Only set theory allows one to conceive a ‘pure doctrine of the multiple’. Set theory does not operate in terms of definite individual elements in groupings but only functions insofar as what belongs to a set is of the same relation as that set (that is, another set too). What individuates a set, therefore, is not an existential positive proposition, but other multiples whose properties (i.e., structural relations) validate its presentation. The structure of being thus secures the regime of the count-as-one.

So if one is to think of a set – for instance, the set of people, or humanity – as counting as one, the multiple elements which belong to that set are secured as one consistent concept (humanity), but only in terms of what does not belong to that set. What is crucial for Badiou is that the structural form of the count-as-one, which makes multiplicities thinkable, implies (somehow or other) that the proper name of being does not belong to an element as such (an original ‘one’), but rather the void set (written Ø), the set to which nothing (not even the void set itself) belongs.

It may help to understand the concept ‘count-as-one’ if it is associated with the concept of ‘terming’: a multiple is not one, but it is referred to with ‘multiple’: one word. To count a set as one is to mention that set.
How the being of terms such as ‘multiple’ does not contradict the non-being of the one can be understood by considering the multiple nature of terminology: for there to be a term without there also being a system of terminology, within which the difference between terms gives context and meaning to any one term, is impossible. ‘Terminology’ implies precisely difference between terms (thus multiplicity) as the condition for meaning. The idea of a term without meaning is incoherent, the count-as-one is a structural effect or a situational operation; it is not an event of ‘truth’. Multiples which are ‘composed’ or ‘consistent’ are count-effects. ‘Inconsistent multiplicity’ [meaning?] is [somehow or other] ‘the presentation of presentation.’

Badiou’s use of set theory in this manner is not just illustrative or heuristic. Badiou uses the axioms of Zermelo–Fraenkel set theory to identify the relationship of being to history, Nature, the State, and God. Most significantly this use means that (as with set theory) there is a strict prohibition on self-belonging; a set cannot contain or belong to itself. This results from the axiom of foundation – or the axiom of regularity – which enacts such a prohibition (cf. p. 190 in Being and Event). (This axiom states that every non-empty set A contains an element y that is disjoint from A.) Badiou’s philosophy draws two major implications from this prohibition. Firstly, it secures the inexistence of the ‘one’: there cannot be a grand overarching set, and thus it is fallacious to conceive of a grand cosmos, a whole Nature, or a Being of God. Badiou is therefore – against Georg Cantor, from whom he draws heavily – staunchly atheist. However, secondly, this prohibition prompts him to introduce the event. Because, according to Badiou, the axiom of foundation ‘founds’ all sets in the void, it ties all being to the historico-social situation of the multiplicities of de-centred sets – thereby effacing the positivity of subjective action, or an entirely ‘new’ occurrence. And whilst this is acceptable ontologically, it is unacceptable, Badiou holds, philosophically. Set theory mathematics has consequently ‘pragmatically abandoned’ an area which philosophy cannot. And so, Badiou argues, there is therefore only one possibility remaining: that ontology can say nothing about the event.
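To make that prohibition concrete, here is the textbook derivation of why the axiom of foundation rules out self-membership. This is standard ZF material, not Badiou’s own exposition:

```latex
% Axiom of Foundation (Regularity):
\forall A \,\bigl( A \neq \varnothing \;\rightarrow\; \exists y \in A \;\; y \cap A = \varnothing \bigr)

% Suppose, for contradiction, that some set x satisfies x \in x.
% Apply Foundation to the singleton A = \{x\}: its only element is x,
% so Foundation requires x \cap \{x\} = \varnothing.
% But x \in x and x \in \{x\}, hence x \in x \cap \{x\} \neq \varnothing.
% Contradiction; therefore no set belongs to itself.
```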

Several critics have questioned Badiou’s use of mathematics. Physicists Alan Sokal and Jean Bricmont write that Badiou proposes, with seemingly “utter seriousness,” a blending of psychoanalysis, politics and set theory that they contend is preposterous. Similarly, philosopher Roger Scruton has questioned Badiou’s grasp of the foundation of mathematics, writing in 2012:

There is no evidence that I can find in Being and Event that the author really understands what he is talking about when he invokes (as he constantly does) Georg Cantor’s theory of transfinite cardinals, the axioms of set theory, Gödel’s incompleteness proof or Paul Cohen’s proof of the independence of the continuum hypothesis. When these things appear in Badiou’s texts it is always allusively, with fragments of symbolism detached from the context that endows them with sense, and often with free variables and bound variables colliding randomly. No proof is clearly stated or examined, and the jargon of set theory is waved like a magician’s wand, to give authority to bursts of all but unintelligible metaphysics.

An example of a critique from a mathematician’s point of view is the essay ‘Badiou’s Number: A Critique of Mathematics as Ontology’ by Ricardo L. Nirenberg and David Nirenberg, which takes issue in particular with Badiou’s matheme of the Event in Being and Event, which has already been alluded to in respect of the ‘axiom of foundation’ above. Nirenberg and Nirenberg write:

Rather than being defined in terms of objects previously defined, e_x is here defined in terms of itself; you must already have it in order to define it. Set theorists call this a not-well-founded set. This kind of set never appears in mathematics—not least because it produces an unmathematical mise-en-abîme: if we replace e_x inside the bracket by its expression as a bracket, we can go on doing this forever—and so can hardly be called “a matheme.” (Wikipedia, boldface added)
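For readers who want the regress spelled out: Badiou’s matheme of the event is usually transcribed as below (the event of a site X is defined as the set of the site’s elements together with the event itself; the transcription is ours, not the Nirenbergs’), and substituting the definition into itself never bottoms out:

```latex
% Badiou's matheme of the event (as standardly transcribed):
e_x = \{\, x \in X,\; e_x \,\}

% Replacing e_x inside the braces by its own definition, forever:
e_x = \{\, x \in X,\; \{\, x \in X,\; \{\, x \in X,\; \dots \}\,\}\,\}

% The regress never terminates: e_x is a non-well-founded set,
% which is precisely what the axiom of foundation excludes from ZF.
```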


4. Speculative WtF?

Speculative realism is a movement in contemporary philosophy which defines itself loosely in its stance of metaphysical realism against the dominant forms of post-Kantian philosophy (or what it terms correlationism). Speculative realism takes its name from a conference held at Goldsmiths College, University of London in April 2007. The conference was moderated by Alberto Toscano of Goldsmiths College, and featured presentations by Ray Brassier of American University of Beirut (then at Middlesex University), Iain Hamilton Grant of the University of the West of England, Graham Harman of the American University in Cairo, and Quentin Meillassoux of the École Normale Supérieure in Paris. Credit for the name “speculative realism” is generally ascribed to Brassier, though Meillassoux had already used the term “speculative materialism” to describe his own position. (Wikipedia)


5. Markus Schmarkus.

Markus Gabriel (born April 6, 1980) is a German philosopher and author at the University of Bonn. In addition to his more specialized work, he has also written popular books about philosophical issues.

In 2013, Gabriel wrote Transcendental Ontology: Essays in German Idealism. In the Notre Dame Philosophical Reviews Sebastian Gardner wrote that the work is “Gabriel’s most comprehensive presentation to date, in English, of his reading of German Idealism” and notes that “due to its compression of a wealth of ideas into such a short space, the book demands quite a lot from its readers.”

In an interview, Gabriel complained that “most contemporary metaphysicians are [sloppy] when it comes to characterizing their subject matter,” using words like “the world” and “reality” “often…interchangeably and without further clarifications. In my view, those totality words do not refer to anything which is capable of having the property of existence.” He goes on to explain:

I try to revive the tradition of metaontology and metametaphysics that departs from Kant. As has been noticed, Heidegger introduced the term metaontology and he also clearly states that Kant’s philosophy is a “metaphysics about metaphysics.” I call metametaphysical nihilism the view that there is no such thing as the world such that questions regarding its ultimate nature, essence, structure, composition, categorical outlines etc. are devoid of the intended conceptual content. The idea that there is a big thing comprising absolutely everything is an illusion, albeit neither a natural one nor an inevitable feature of reason as such. Of course, there is an influential Neo-Carnapian strand in the contemporary debate which comes to similar conclusions. I agree with a lot of what is going on in this area of research and I try to combine it with the metaontological/metametaphysical tradition of Kantian and Post-Kantian philosophy. (Wikipedia)


6. Transhumanism, Trans-Scientism, or Trans-Scientology?

Transhumanism (abbreviated as H+ or h+) is an international and intellectual movement that aims to transform the human condition by developing and creating widely available sophisticated technologies to greatly enhance human intellectual, physical, and psychological capacities. Transhumanist thinkers study the potential benefits and dangers of emerging technologies that could overcome fundamental human limitations, as well as the ethics of using such technologies. The most common thesis is that human beings may eventually be able to transform themselves into different beings with abilities so greatly expanded from the natural condition as to merit the label of posthuman beings.

The contemporary meaning of the term transhumanism was foreshadowed by one of the first professors of futurology, FM-2030, who taught “new concepts of the human” at The New School in the 1960s, when he began to identify people who adopt technologies, lifestyles and worldviews “transitional” to posthumanity as “transhuman.”

This hypothesis would lay the intellectual groundwork for the British philosopher Max More to begin articulating the principles of transhumanism as a futurist philosophy in 1990 and organizing in California an intelligentsia that has since grown into the worldwide transhumanist movement.

The year 1990 is seen as a “fundamental shift” in human existence by the transhuman community, as the first gene therapy trial, the first designer babies, as well as the mind-augmenting World Wide Web all emerged in that year. In many ways, one could argue the conditions that will eventually lead to the Singularity were set in place by these events in 1990. (Wikipedia)


7. Neuro WtF?

At Stanford University in 2012, a young literature scholar named Natalie Phillips oversaw a big project: a new way of studying the nineteenth-century novelist Jane Austen. No surprise there—Austen, a superstar of English literature and the inspiration for an endless array of Hollywood and BBC productions based on her work, has been the subject of thousands of scholarly papers.

But the Stanford study was different. Phillips used a functional magnetic resonance imaging (fMRI) machine to track the blood flow of readers’ brains when they read Mansfield Park. The subjects—mostly graduate students—were asked to skim an excerpt and then read it closely. The results were part of a study on reading and distraction.

The “neuro novel” story was quickly picked up by the mainstream media, from NPR to The New York Times. But the Austen project wasn’t merely a clever one-off—the brainchild, so to speak, of one imaginatively interdisciplinary scholar. And it wasn’t just the result of ambitious academics crossing brain science with “the marriage plot” in unholy matrimony simply to grab headlines. The Stanford study reflects a real trend in the humanities. At Yale University, Lisa Zunshine, now a literature scholar at the University of Kentucky, was part of a research team that studied modernist authors using fMRI, also in order to better understand reading. Rather than a cramped office or library carrel, the researchers got to use the Haskins Laboratory in New Haven, with funding by the Teagle Foundation, to carry out their project, in which twelve participants were given texts with higher and lower levels of complexity and had their brains monitored.

Duke and Vanderbilt universities now have neuroscience centers with specialties in humanities hybrids, from “neurolaw” onward: Duke has a Neurohumanities Research Group and even a neurohumanities abroad program. The money is serious as well. Semir Zeki, a neuroaesthetics specialist—that is, neuroscience applied to the study of visual art—was the recipient of a £1 million grant in the United Kingdom. And there are conferences aplenty: in 2012, you could have attended the aptly named Neuro-Humanities Entanglement Conference at Georgia Tech.

Neurohumanities has been positioned as a savior of today’s liberal arts. The Times is able to ask “Can ‘Neuro Lit Crit’ Save the Humanities?” because of the assumption that literary study has descended into cultural irrelevance. Neurohumanities, then, is an attempt to provide the supposedly loosey-goosey art and lit crowds with the metal spines of hard science.

The forces driving this phenomenon are many. Sure, it’s the result of scientific advancement. It’s also part of an interdisciplinary push into what is broadly termed the digital humanities, and it can be seen as offering an end run around intensifying funding challenges in the humanities. As Columbia University historian Alan Brinkley wrote in 2009, the historic gulf between funding for science and engineering on the one hand and the humanities on the other is “neither new nor surprising. What is troubling is that the humanities, in fact, are falling farther and farther behind other areas of scholarship.”

Neurohumanities offers a way to tap the popular enthusiasm for science and, in part, gin up more funding for humanities. It may also be a bid to give more authority to disciplines that are more qualitative and thus are construed, in today’s scientized and digitalized world, as less desirable or powerful. Deena Skolnick Weisberg, a Temple University postdoctoral fellow in psychology, wrote a 2008 paper titled “The Seductive Allure of Neuroscience Explanations,” in which she argued that the language of neuroscience affected nonexperts’ judgment, impressing them so much that they became convinced that illogical explanations actually made sense. Similarly, combining neuroscience with, say, the study of art nowadays can seem to offer an instant sheen of credibility.

But neurohumanities is also the result of something else. Neuroscience appears to be filling a vacuum where a single dominant mode of thought and criticism once existed. That plinth has been held in the American academy by critical theory, neo-Marxism and psychoanalysis. Alva Noë, a University of California, Berkeley, philosopher who might be called a “neuro doubter,” sees neurohumanities as a reaction to the previous postmodern moment. “The pre-eminence of neuroscience” has legitimated an “anti-theory stance” within the humanities, says Noë, the author of Out of Our Heads.

Noë argues that neurohumanities is the ultimate response to—and rejection of—critical theory, a mixture of literary theory, linguistics and anthropology that dominated the American humanities through the 1990s. Critical theory’s current decline was somewhat inevitable, as all intellectual movements erode over time. This was exemplified by the so-called Sokal affair in 1996, in which a physics professor named Alan Sokal submitted a hoax theoretical paper on science to Social Text, only to unmask himself and lambaste the theorists who accepted and published his piece as not understanding the science. Another clear public repudiation was the harsh Times obituary in 2004 of the philosopher Jacques Derrida, who was dubbed an “abstruse theorist”—in the obit’s headline, no less. But as critical theory’s power—along with that of Marxism and Freudianism—fades within the humanities, neurohumanities and literary Darwinism are stepping up, ready to explain how we live, love art and read a novel (or rather, how the cortex absorbs text). And while much was gained as “the brain” replaced “individual psychology” or social class readings, much has also been lost.

Critical theory offered us the fantasy that we have no control, making a fetish of haze and ambiguity and exhibiting what Noë terms “an allergy to anything essentialist.” In neurohumanities, by contrast, we do have mastery and concrete, empirical ends, which has proved more appealing, even as (or perhaps because) it is highly reductive. At least since George H.W. Bush declared the 1990s the decade of the brain, the media have been flooded with simplistic empirical answers to many of life’s questions. Neuroscience is now the favored method for explaining almost every element of human behavior. President Obama recently proposed an initiative called Brain Research Through Advancing Innovative Neurotechnologies, or BRAIN, to be modeled on the Human Genome Project. The aim is to create the first full model of brain circuitry and function. Scientists are hoping that BRAIN will be as successful (and as well funded) as the Human Genome Project turned out to be. (The Nation)


8. My Critical Case Against Recent Continentals.

I rest it here and now.

–But we also shouldn’t forget WHO the oppressors were,

i.e., the Analytics,

and WHY they oppressed the Continentals,

i.e., in order to enforce the hegemony of the military-industrial-university complex:

Home Sweet Soames: “Philosophy’s True Home”

“Analytic” vs. “Continental” Philosophy: WtF? Why Does It Still Matter So Much?

Philosophy Without Borders is creating Philosophy | Patreon