By Bruce Fleming


The Montréal Review, September 2016


"Urs Fischer: Misunderstandings in the Quest for the Universal"
Installation view
Artworks © Urs Fischer, Photo by Rob McKeever,

Gagosian Gallery


Historians tell us that the University of Paris in the early 13th century was defined by its universal acceptance of a philosophical dogma derived from Plato and mandated by the Church, namely a belief in the realism of universals: what (say) makes all red things red is Redness, and has independent existence, ultimately in the mind of God. By the 15th century, however, the works of Aristotle with his rejection of Plato had gained the upper hand, and nominalism, the dogma that universals had no independent existence, gradually became the norm. One dogma displaced another. But of course both dogmas were only ever accepted by intellectuals: those outside were merely going about living their lives.

Today we face a similar situation, with those inside the universities and intellectual centers united in their comparably universal acceptance of a comparably universal dogma, while most outside are either oblivious to this phenomenon or eye it with distaste. This dogma, like that of Medieval Paris, insists on the realism of entities that to others seem more evanescent. 

The divergence between those who hold this dogma and those who don’t is not merely an academic dispute: after all, the citizenry of Western democracies are vastly more vocal and more powerful than were the peasants of Medieval France. In fact adherence (or not) to this dogma is behind the most visible clashes of North American (and to some degree Western European) society in the late twentieth and early twenty-first centuries: clashes between the educated and the uneducated, liberals and conservatives, coastal and inland, red and blue states in the US, coastal and interior provinces in Canada, pro-EU vs. pro-Brexit in Britain, pro-immigration vs. anti-immigration in France, and, everywhere, upper and lower classes. The Medieval conflict of realism vs. nominalism may seem merely quaint, unrelated to our free-thinking world. Yet this remembrance of things past reminds us moderns that dogmas may underlie a whole group in a certain age, be taken for granted by all within it. Until, that is, they aren’t.

And of course dogmas aren’t really shared by everyone in a time period, only by those whom we focus on when we write the history: the ones who articulated them. Peasants in France probably didn’t hold this view, or for that matter any view.

Nowadays it’s articulate, educated, better-off liberals in North America and Europe who accept this modern dogma—people who read (or write) essays like this; it’s a dogma of the intellectual and usually economically privileged classes. Our dogma is related to our politics, which is dedicated to letting individuals speak, hearing their voices, getting them their individual rights. We’re used to regarding with horror and not infrequently disdain those outside our group who espouse political views we find objectionable or even abhorrent, those who to us seem to be denying individual rights (while to them they seem merely to be espousing an absolute good, not trampling individuals).

The first step toward abandoning a dogma is articulating it. The dogma of the intellectual upper classes today is a bedrock belief in what I call “linguistic realism,” and by extension the realism of all representation: the belief that the world is made up of words and images, so that words (or images) are the very stuff of reality. If I say I am a woman, I am a woman, whatever others think. If I say I feel myself to be oppressed, I am. If I say that I was the victim of what we call sexual assault, I am—even if a court later decides there was no assault and hence no victim: in the meantime I get a Victim Advocate. And as for “words will never hurt me”? So last century. Now we focus on “hate speech.”

The indeterminacy of meaning, “facts” written with scare quotes, the subjectivity of the observer, constant references in the humanities and social sciences to Heisenberg’s uncertainty principle (applying what holds for very small particles to large human beings, and indeed to practically everything), Thomas Kuhn’s famous “paradigm shifts” (which suggested that truth was relative to a specific way of seeing things), and then the Sokal hoax, in which a physicist pretended to question the objectivity of science, was welcomed as a confrère, and then announced he was joking: what a time it was for American academia, this time of several decades before the Great Recession of 2008 and beyond! Everything was a fiction, just some more pernicious than others. (Some, according to Richard Rorty, were actually positive, like patriotism. At least sometimes.) All inquiry, all thought, all academic enterprises—even science! only there, of course, the scientists demurred, but what did they know?—were subjective, or at least we could never achieve an objective world even if it turned out to exist. Philosophy took, in the words of Rorty, and influenced by the later Wittgenstein, a “linguistic turn”: we made structures of X and Y, simulacra of objectivity but really nothing but constructions of words.

The basis of linguistic realism is the rejection of words as a means, and the acceptance of them, and other forms of representation, as an end in themselves. It was the Romantics, those grandfathers and -mothers of the modern world, who first rejected the notion of the representational window as a means out of the room of the self. M. H. Abrams’s succinct summary of the contrast between the classical view and the Romantic offers two images of how words and art in general work: the classical view was that literature and the arts copied or mirrored the world (what Erich Auerbach, writing in World War II Istanbul, analyzed in Mimesis); the Romantic, that they were like a lamp projecting outwards the light of the artist’s genius. Later thinkers accepted the early Romantics’ premises: Nietzsche articulated the notion that knowledge was subjective, the expression of particular groups; this dovetailed with Marx’s more hard-headed insistence that tenets, whether intellectual or artistic, were the result of the economic givens of their time, as expressed by specific groups or classes within the economic landscape. And Freud articulated the subjective nature of our relationship with the world, all determined by our particular experiences and the structure of our own selves. Dreams, jokes, slips of the tongue: this subjective landscape was the world.

Thus what I am calling the linguistic realism of our day is a legitimate grand- or great-grandchild of Romanticism, expressed in new terms. Its immediate progenitor was Modernism, itself a logical heir to Romanticism, which in turn produced post-Modernism. For the Modernists, painting was first and foremost a square with shapes in it, a flat surface; music an arrangement of tones; literature a grab bag of techniques of narration; dance movements between people on the stage rather than a means of conveying a story with development and denouement.

And meanwhile, through academia and the arts establishment, the liberal world of the US and to a great degree the entire West, the notion that our mechanisms of representation are as close as we can get to the world has become dogma. For the intelligentsia of the modern American world, we are surrounded by the membrane of our sign systems, which we cannot pierce. Talking about the world becomes the most profound thing we can do, not interacting with it or with other people.

It’s the ultimate justification of intellectuals, probably because the vast changes in the real world in the twentieth century rendered them of only vestigial interest. It’s their form of revenge: if they can keep you talking, they’ve won, because you’re still caught in the web of words, enmeshed in the play of signifiers. Derridean Deconstructionism, that intellectual dernier cri on college campuses in the third quarter of the last century and for long after its last echo should have faded, waited until the speaker was finished and then showed him or her what it was that s/he had not said, what s/he had presupposed, what it was s/he had avoided saying. Anything seen as solid (human presence, say) turned out (surprise!) to be logically dependent on its opposite: anyone asserting the usual terms with plenitude was shown to actually be proving the primacy of absence. And those asserting absence? They were the deconstructionists, and so not presupposing presence. They alone could not be trumped. It was all a game of responses, rather than assertions—which of course was the content of its philosophy (that assertions presupposed and indeed were secondary to responses). What fun!

This waiting until the other person implicated him- or herself by writing (Derrida held that writing was primary over speaking in terms of substance—just what we’d expect a writer to say) something determinate (the better to point out the indeterminacy of these words, any words—it was a magic trick performed countless times for what seemed always admiring academic audiences) palled even for those hired to do it professionally. People wearied even of texts, as they were called. Finally the consensus became that we had to leave the library and go outside. Well, of course, not really outside, but at least to other libraries, other caches of words. Perhaps another section of the same library, leaving fiction for non-fiction (which of course was also fiction, but of a different sort—a much more malign sort, because it didn’t admit to being fiction like the other, just made up). And so the academic/intellectual version of “outside” came to be discussed through the works of Michel Foucault.

The main thread through the works of Foucault, whose influence in the humanities and social sciences was close to absolute for several decades, not just in America but outside it as well, and is still quite strong, was that it was words that established power relations in the world, invariably labeling the weaker group as subservient to the stronger. Foucault focused on things like taxonomies and illness (“The Order of Things,” whose French title means “Words and Things”; “Madness and Civilization”; “The Birth of the Clinic”). The paradox he was articulating was that the motivation of those who coined words meant to identify groups so as to cure them may well have been benign—to help (for example) the mad rather than leave them to fend for themselves—but that the effect of this ostensibly neutral application of knowledge was invariably malign: naming allowed control. Before mental illness was established as a state, something that defined the individual as a type of person, there were according to Foucault merely harmless village idiots, understood as different but unlabeled, and not shut up in institutions designed to “help” them. Thus Foucault insisted, echoing Nietzsche and more faintly Marx, that there is no such thing as objective knowledge: we’re always up to something.

The most sensational form of Foucault’s Nietzschean insistence that there is no such thing as neutral knowledge, and that furthermore what poses as scholarship is actually a thinly-disguised act of appropriation and dominance, was that of his “History of Sexuality”: there was no homosexuality before the term was invented, which according to Foucault (more influential as a thinker than a historian, and much criticized in some circles for sloppy chronology) happened a bit more than a century ago. No homosexuality? asks the incredulous onlooker. Surely the ancients… Petronius… Plato… But that’s not what Foucault meant. He draws a distinction between actions, which do not essentialize, and essentializing terms—terms that make the action something that defines the whole person. Sure, people did X and Y before the late nineteenth century; however, they were not, according to Foucault, seen as a particular type of person engaging in a type of action.

Foucault’s disciple Edward Said transferred this way of thinking to the consideration of political units as well in his astonishingly influential book “Orientalism”: not only, he claimed, had the West “invented” the Orient about two centuries ago, but the texts of the West were what had created a relation of oppressor to oppressed. This oppression, he thought, had nothing to do with greater firepower (of course it did: since I teach in a military institution, this for me is the most egregious of Said’s errors) but was effected only with words, in texts. Not only was the pen mightier than the sword, it had won by default. What a congenial idea for people who teach in universities and work in words! They were the most powerful after all! Of course some of them had been bad, very bad—but that would all be corrected now. Words had enslaved; words—now in the mouths of the formerly oppressed—would free. All of it words.

A book in words written by a professor about words in books would naturally conclude that alterations in the world were effected by words in books. Similarly a butcher might think that different ways of slaughtering animals determined what people ate and so how they lived. Only the wordsmiths were doing the talking; if butchers controlled the supply of beef they’d get to determine its nature too. This is the dogma of linguistic realism, as dramatized in the Escher-esque fables of Borges and articulated by the literary theoreticians of the Yale school: we’re all caught in a hall of mirrors of references between texts, a footnote to a text that may yet have to be written. There’s no escaping the library, we heard. Of course there actually is, as anyone who has ever walked out the gates of a college campus can attest: I used to feel a great weight flying off my shoulders as I left the campus of Haverford College to go catch the Paoli Local to Philadelphia, or up to Merion to see the Barnes Foundation collection, now moved to Center City Philadelphia. If you never leave, you never feel it. It’s that simple. Or perhaps: if you don’t want to leave, you don’t feel it as a weight at all.

This view that the world is a series of footnotes to footnotes, us caught in the web of other texts, and so eternally within the referential hall of mirrors of signifiers, is the linguistic realism I am describing here. However it’s the dogma of only a specific group of people, those who use words professionally and who deal with art in the same way—which is to say the American (and to some degree Western) intelligentsia and by extension, the educated elite. Thus it is a dogma with a political valence: it’s the dogma of today’s liberals.

 It’s not held by those who work with their hands, by the dispossessed, by many people in “red” states of the US, or by most conservatives. It is not an exaggeration to say that this conflict over the dogma of linguistic realism—liberals espouse it, conservatives reject it—is the source of the deep national fissure between left and right currently playing out in the US. The liberal dogma of linguistic and semiotic realism in turn leads to the insistence that individuals when speaking create their individual worlds. Each of us has to articulate to be heard, and to make his or her world come to be. Making your voice heard and self-definition are, for liberals, the highest goods. Talking is its own end.

Meanwhile conservatives are disgusted by the value liberals place on “political correctness”—saying the right thing, with no consideration of what is thought or done. For liberals, it seems to conservatives, it’s all about saying, not doing. And conservatives see themselves as do-ers, not talkers. Increasingly, conservatives as a group overlap with the group of white males, held by some liberals to be responsible for most of the ills of the world: non-white non-males have to be linguistically protected. And it’s white males who cling to the now-discredited notion of a “master narrative” of their own values. (More articulate conservatives reject the Balkanization of the conversation that this “defend your own postage stamp of turf” view of culture implies.)

Those who accept the dogma of linguistic realism hold that because words are reality, any attempt by one person to impose a word on others is illegitimate. Racial minorities have the freedom to change their preferred monikers as often as they like and woe betide us if we fail to keep up; in the US, we no longer speak of “slaves” but of “enslaved people” (much less essentializing) or of “sex” but instead “gender”—the latter is up to us. Indeed what used to be a “sex change operation” is now “gender reassignment surgery”—as easy as changing an outfit of clothes. “Hate speech” is the worst ill on college campuses; terminology, it seems liberals are saying, is its own end. What gender do you self-identify with? is the question we are supposed to ask strangers. Race is exploded as a category; at most we ask what race(s) an individual self-identifies with. The self is paramount, and in control of its own linguistic destiny.

Conservatives roll their eyes; liberals assume this means they want to continue to control. Liberals see conservative resistance to speaking a certain way of a group as saying the group isn’t worthy of respect at all.  But this is to a degree a self-fulfilling assertion. The more the linguistic right of groups to be referred to in specific ways is asserted by liberals—something that seems irrelevant to conservatives, for whom words are not the world, but merely things in it—the more the conservatives in fact fail to respect those making these demands. It’s a vicious circle.  And it leads to polarization between right and left, intellectual and physical, male and female, young and old.

Economic conditions that favor service occupations over heavy industry are of course wreaking real-world havoc on the power not only of males, who used to fill the industrial jobs, but also of conservatives, who are over-represented within their ranks. This in turn has led in part to the “tea party” obscurantism of the US and to the 2016 presidential campaign, most of whose backers are older white males who, according to the liberals, feel their power slipping from them as their world disappears—the US soon majority non-white, as advocates for illegal immigration chortle, while conservatives seethe: the one side sees lawbreakers, the other side sees the Face of the New America.

But political positions are staked out as the result of dogma, of buried beliefs. And the dogma behind this one is linguistic realism. No amassing of facts to explain the differences between groups creates by itself the clash we are currently experiencing. The clash is created by differences that go far beyond mere divergences of political views—recently pro or con gay marriage or abortion, now focused on transgender rights.

The root of the clash, if not the facts of the divergence, is the acceptance of this dogma of linguistic realism, or the rejection of that position. Part of the fury of the (frankly less linguistically adept) conservative groups that have lashed out at the very notion of consensus and shared governance is surely due to the fact that liberals are so smug, not so much about the specific tenets they espouse, but the rightness of their way of seeing things. Conservatives are attached to their beliefs too, but right now they see themselves fighting a losing battle, which infuriates them.

In North American intellectual life in the last half-century, the chattering and artistic classes have separated themselves from the “doing” classes, liberal from conservative, blue from red—as well as the military (a doing group) from civilians, a group it is meant to defend but tends to despise, because civilians don’t do quite as much as the military thinks it does. The dogma of linguistic realism determines first of all the content and cast of those institutions which deal in words: universities, specifically the wordiest departments, those in the humanities and social sciences.

Since Saussure and Foucault, Western academics in these fields have insisted with practically one voice—mistaking a dogma about words for a description of how words work at all—that words have meaning insofar as they relate to other words, and not by referring to an external world. (The external world is left to the scientists, and perhaps the business school.) Naming something is thus the ultimate act of power—a position unsurprising from people whose stock in trade is words. Those who think words are just part of the real world rather than their own world, means to an end rather than ends in themselves, are simply naïve—academics today announce that they are here to “problematize” everything, which is understood by them as “showing the problems that are there but others are too dim to see” rather than, as outsiders might well assume, “create ‘problems’ with.”

Anybody who continues to talk or write is caught in the net: words beget more words and so become part of the chain, apparently proving the point of those such as the Derridean literary theorists of the end of the twentieth century. So the solution is actually not to talk, but rather to do other things: dig a ditch, sew a garment, turn screws in a factory, work out in the gym. However this is to cut the Gordian knot, and intellectuals wedded to words seem unwilling or unable to do so. Meanwhile non-intellectuals were never part of this knot, and shrug their shoulders. Many books were written by conservatives about how subversive the “tenured radicals” in universities were; what their authors failed to realize is that calling them “radical” was the highest compliment they could pay: in the minds of the linguistic realists they were mad, bad, and dangerous to know rather than musty and stagnant.

But the events of the last decade have given the lie to the notion that words construct the world. Reality, it turns out, has the last laugh after all.

The problem with the self-righteous indignation of acolytes of Said and Foucault about how the West oppressed the non-West with words is that we have to have decided beforehand which is the powerful and which the weak, because comparable words from the side pre-determined to be weak constitute not a reprehensible act of oppression but instead a laudable act of self-determination: some version of the “clever” pop-culture reference “The Empire Strikes Back” (get it?) was the theme of countless papers about literature from former colonial regions read at Modern Language Association conventions in various US and Canadian cities over the last twenty years. Thus it turns out it’s not the words themselves that are an act of oppression, but their use by one party or another—so the whole thesis that words are themselves the agents of control crashes to the ground. And how can we tell now who is weak and who strong when the US has been defeated in Iraq, when Indian and Chinese money is buying countless international concerns and is the source of Vancouver’s “cool,” when many American Indians (Canadian: First Nations) are rich from casinos, and when North America fears the rise of China?

Literature, according to linguistic realism, is like an opaque image on the wall, rather than a window: it determines a world, rather than giving us a view of it. Literature, all literature, is therefore an exercise in domination: the goal of countless wild-haired Assistant Professors was to challenge, decenter, and problematize this dominion. But what the Great Recession taught us was that nobody cares: college costs so much in the US that nobody signs up for non-lucrative degrees, such as those in literature. Libraries are shrinking and don’t buy books, books themselves are going digital, professorships are disappearing, and colleges are hiring adjuncts instead. So much for saving the world through decentering hegemonic literature. The world, dismissed for so long, has struck back.

It turns out too that truth does end up mattering—that the “truth” so beloved of the linguistic realists has to lose its scare quotes. According to linguistic realism, words are the world, rather than within the world, so they can’t be compared to anything to check their truth. This means, among other things, that the anthropologist cannot give his or her view of another people; instead that people must be encouraged to express themselves. (This is possible only if they speak a language conveniently comprehensible to the anthropologist, so indigenous people in richer colonial countries are ideal.) One result of this deferring to others to tell things the way they want them told is the Smithsonian’s National Museum of the American Indian in Washington, where individual tribes (peoples) tell their versions of events in little mini-exhibits, with no institutional attempt to establish the oppressive and now rejected “master narrative” of objectivity. If a tribe says its people have been on their land since the beginnings of time, science will not speak of the Bering Strait and migration patterns.

Look around: what was once down is up, and it seems the “Third World” isn’t so Third after all. Qataris own Harrods in London; the Indian Taj hotel chain runs the Pierre Hotel in New York; and China owns a big chunk of US Treasury bonds, which allow the US to continue funding its universities, among other things. (Full tuition-paying Chinese students are the hottest commodity at US institutions nowadays too.) Europe isn’t doing so well; America is stuck in second gear; Canada was doing well with oil sands, but now the price of oil has tanked. Since the end of the Cold War and the Arab Spring many things have happened to shake up the notion that we Westerners control it all—through our words. Have North American academics been paying attention?

Many have not. The things that made the dogma that words are the world a plausible one to hold—namely that those speaking were in the power position and so in that sense created the world—are fading, or have faded. And the dogma begins to look silly as a result: suddenly all those non-self-referential people in the middle of Canada or the US who didn’t buy into the notion that the world is made of words don’t look so stupid. And now Britain has voted against being part of the EU.

Yet our culture continues to be permeated by this dogma of linguistic realism, the primacy of signs. The great discovery of Modernism a century ago was that writing was words, painting a flat surface with color on it, music an arrangement of tones, and sculpture an arrangement of shapes in space. Art nowadays takes for granted as well that art is its own world, indeed the world. The assumption of artists is that we have to go, and do go, to the museum to perceive anything at all. The goal of art is to put new things in museums—new, that is, to museums. That these are not new to the world outside isn’t considered, since the world outside isn’t considered either. Thus all the “installations” of rooms or chairs, the films of people on the street or rain pelting into water, the rotting animals, the gigantic simulacra of machine parts in wood. When the world outside is considered, it too is seen in terms of the museum—namely as the not-museum. (For people who don’t take for granted that the world is expressed in art, it’s merely the world.) The museum is brought outside: the wrapped bridges or buildings, the ridges of dirt in the desert, the drive-in movie theater of a video loop projected on the side of a building, the entire house from the 1940s lovingly re-created. And we are supposed to applaud.

People who do not take for granted that all expression is in the form of artworks, the art form of linguistic realism, cannot applaud. To them it seems strange. They are people who have already seen rotting animals by the side of the road, the wrapped construction fronts of India (the wrapping eye-catching blue against bamboo scaffolding), the empty rooms with scattered chairs of an abandoned chain hotel’s conference room in Ottawa or Atlanta. Why should they pay attention to this one that an “artist” has signed and for which s/he is demanding our attention in the astonishing buildings of Frank Gehry or at the sprawling National Gallery of Canada in Ottawa? These people, those who do not accept the dogma of linguistic realism, do not need to go to museums or to see an artist transfer them inside. But they’re not allowed to point out that ridges in the sand are just ridges in the sand even if somebody signed them, and a pile of rocks in a museum is a pile of rocks even if it is in a museum. Artists demand we wait for them to show us the world, but we don’t have to, any more than we have to remain within the hall of mirrors of words or self-referential texts or words oppressing non-Western peoples. (These may well be oppressed, but it wasn’t the words that oppressed them; the words merely expressed the facts.)

But this is held by artists and their defenders to be dirty pool. Hey! You have to wait until we show it to you! You can’t just roll your eyes! If you do, you’re a Philistine! Yet we don’t need artists to show us the world: the presupposition that we do is a sign of the realism of representation that is the dogma of our day. In fact we need not pass through art; we can cut the Gordian knot: we can do without art, and merely perceive the world.

The dogma of linguistic realism is a class- and education-defined dogma, the dogma of those Charles Murray, in his much-discussed book “Coming Apart,” calls the residents of Belmont—graduates of US elite colleges who live with each other in “super zips” (locally to both Murray and me, in Chevy Chase, MD) and who marry each other. They are the top of the pyramid. The bottom (Murray considers whites only) don’t go to college, rarely marry at all, and spend their lives in brushes with the law. The top of the pyramid hears constantly that recognizing the transmission medium as central shows how sophisticated one is: one reads James Joyce in college and goes to MoMA in New York to admire. (Margaret Trudeau, the mother of the current Canadian Prime Minister and the wife of his father, Prime Minister in the 1970s and early 1980s, was known for her fondness for New York: and why not? You get on the Adirondack Northway at the border and keep going through Albany. If you had a dollar—CDN or US, now about the same—for every license plate north of Albany announcing that it remembered (Quebec’s “Je me souviens”), you’d be rich.) Thus the world that accepts linguistic realism is a world to which the average Joe or Jane, less sophisticated, feels an outsider. The average Joe or Jane does not assume s/he has to go to an art gallery to see a rotting animal, or sit in a college classroom to learn what critics have thought of other critics of novel X. Rotting animals are outside, and novels, should you want to read one, are in (fast disappearing) bookstores.

Those who fail to hold this dogma do not hold an opposing dogma, say linguistic nominalism. For any dogma is an intellectual position, a fact of words. Those who don’t hold this self-referential dogma about words aren’t intellectuals, so they don’t have a position about words, which for them are means, not ends. This isn’t a clash of one school against another; it’s a clash of school vs. no school. It’s just that the content of the school view is so unchallenged within it. So those who hold the dogma of representational realism tend to hold in contempt those who reject this dogma without replacing it by another. Conservatives, again, see themselves as do-ers, not talkers. Words for conservatives, and everything else, exist in the world; they don’t constitute it. Thus the conservative alternative to the dogma of representational realism is no dogma at all, but merely action. Liberals don’t understand how this can be intellectually valid, since it seems to, and indeed does, cut the Gordian knot rather than being bound by it.

The people who do not share this dogma vote with their feet: they leave the museum. Students have already voted with their feet on the Victimology 101 courses that now pass for humanities electives in major universities: nowadays the number of majors in the humanities is at historic lows. Students major in business instead, or go to the community or two-year college. The art market still posts record-breaking sales, say for Damien Hirst’s bejeweled skull, but that’s the 1% with its dogma that the sign system is its own end. The rest shake their heads.

Liberals aren’t wrong that so long as we continue to talk, stay within the hall of mirrors of representation, we can never get out. But this is so not because of the nature of words, or of signs, but because we refuse to leave. Conservatives leave, or never enter. Thus conservatives refuse to play by liberal rules. This drives liberals crazy, and also tends to make conservatives feel proud of themselves: they are Peck’s Bad Boy and James Dean rolled into one.

Conservative ethics are expressed in terms of actions: what should we (all of us, no distinctions of groups) do? This is why conservatives love lists of rules, like the Ten Commandments, and want to see them in public places even if we don’t really understand them all—the prohibition on graven images of God doesn’t speak to Christians except for the brief period of Byzantine iconoclasm in the 8th and 9th centuries, and we don’t know what “Thou Shalt Not Kill” means—animals? The enemy? Adolf Hitler? Liberal ethics, by contrast, are expressed in terms of actors: what should people (thus, I) do? The liberal world-view, and that of the educated elite that seems so foreign to Sarah Palin, Rick Santorum, the residents of Alberta, or the disgusted visitor to a big-city contemporary art gallery, sees the individual as the basic unit that can under certain circumstances be amalgamated into groups—but this has to be effected, and can always be rescinded. Still, since the group is composed of amalgamated individuals, it too has real existence. Conservative thought, by contrast, rejects the doctrine of group realism: everybody is supposed to do the same thing, follow the same rule. There is no such thing as the group, because the individual doesn’t have to be amalgamated into units greater than one: this happens naturally as individuals all do the same thing, like fish in a school. Both liberals and conservatives see themselves as the last great individualists. Both are right, and both are wrong.

Conservatives see liberals as gassy whiners. If the question is “May I steal this loaf of bread?”, the liberal wants to know about my situation: is my family hungry? Does the person who owns the bread deserve or need it? Much discussion follows. The conservative appeals to the rule, “Thou shalt not steal,” and there is no more talk. Liberals seem wishy-washy to conservatives; conservatives seem deaf to nuances (read: stupid) to liberals.

The currency of liberals is talk, elevated to the level of reality; the currency of conservatives is action. It’s not chance that liberals have adopted the dogma of linguistic realism: liberals talk professionally. This dogma works best for people who are comfortable, well educated, at home in their world and with a feeling of control, beyond the most fundamental levels of Maslow’s famous “hierarchy of needs.”

Linguistic realism is smug and self-serving: we talk, so talk is what makes the world. We make art, so art is necessary. We are sophisticated because we see that our medium of words, our stock in trade, is opaque, not transparent: if you think you can have such a thing as objective knowledge, which means, if you think you can do without us, you’re merely unsophisticated and probably stupid.

This is the dogma of the talkers, as perhaps the dogma of the bakers is that the world rests on a loaf of bread. Some powerful groups in society do not accept these premises—but they tend to be overlooked, since they’re not the ones talking, they’re the ones doing. Science for one doesn’t accept this dogma; for science, the world exists, and we can discover its secrets if we work hard enough. The military, for which I’ve worked for three decades as a professor at the US Naval Academy, doesn’t either: words aren’t power; force is power. (A similar view is prevalent at the RMC/CMR in Kingston.) But that means that the military nowadays has become conservative (immediately after World War II it barely had a political cast at all in the US), and science merely shrugs its shoulders and goes its way.

One side of the clash follows a dogma and the other side does not: that’s the nature of our poisonously fractured society. The polarization of our world into talkers and actors, at least in their self-identity (talkers do act and actors do talk), is at the basis of our conflicts today. Because it’s the liberal elite who hold the dogma, it’s up to them to give it up. Conservatives of course have a slew of single-issue political positions—usually associated with taxes, abortion, and sexuality. And they hold to them tenaciously. But liberals are wrong to see themselves as the reasonable ones merely because they are ready to talk when conservatives won’t. Liberals just want to apply liberal ethics—words are their reality.

In the early 20th century it was a genuine revelation that the way we communicate could itself be a subject of study. That interest has now dissipated. We were the ones who chose to spend a century focused on the glass in the windows rather than changing the focus of the lens and realizing that we can look through them to a world outside. We need not return to the classical notion that the glass is a mirror either: it’s clear glass, and we don’t have to consider it at all. Museums don’t have to be our defining unit of interesting objects; we can consider people we don’t know without the words themselves defining what we say about them; artworks don’t have to be arranged along a continuum from realistic to abstract; literature doesn’t have to be about the use of words, so that this is what we notice above all else. It is possible to approach a book without evoking the ghosts of all the people who have read and commented on it before; we do not trail the past behind every object we pick up—if, that is, we choose not to. We can relate it to ourselves, and we are on our own journey with our own scale of values. The book may be new to us, and interesting, or new and not interesting, or old but good, or old and boring. Each of us perceives the world as a combination of new and old: others can’t tell us how we are going to see it.

For that’s ultimately what the twentieth century’s dogma of linguistic realism was about: it said, everybody uses words and signs, so if I tell you how words and signs work, I have the ultimate control, I’ve made the ultimate explanation of the Universe. The Middle Ages talked about the absolute steps to salvation and levels of angels: that was the ultimate explanation. The Age of Enlightenment turned to the paraphernalia of the lived world: did objects exist outside of us? Were the rules of science real? And our age turned inwards. The linguistic turn, as Rorty had it, was the last gasp of Romanticism, the last turn of the screw: something people did, elevated into the Most Fundamental Structure. It’s sad, really: from angels and redemption, to things and the natural world, to how people communicate. It’s an inexorable process of diminution.

Consciousness of words is possible while realizing that actions are the ultimate goal. That’s what I’d like to see for both liberals and conservatives. But this bridging of the gap will require action by the wordsmiths, the liberals—not by the conservatives, who have already left the discussion. They’re out doing things, sometimes with disastrous consequences. Liberals have to lose the blinkers of their dogma. If the 15th century could do it, so can we.


Bruce Fleming is the author of over a dozen books and many articles, listed at www.brucefleming.net. His degrees are from Haverford College, the University of Chicago, and Vanderbilt University. He taught for two years at the University of Freiburg, Germany, and for two years at the National University of Rwanda. Since 1987 he has been an English professor at the US Naval Academy, Annapolis.


Copyright © The Montreal Review. All rights reserved. ISSN 1920-2911