Ghosts of '64
Asked in an interview towards the end of his life if "aesthetics might eventually become just a branch of cognitive science," the philosopher Arthur Danto answered, in part,
I don't believe that. Works of art need to be interpreted. There's something to what Derrida would say, that there is an infinite number of interpretations. It may be very difficult for anybody to think out more than one or two. But if somebody is intelligent and motivated enough, they probably could do it.
And further on,
When Marat was assassinated..., they said, 'Take up thy brush, David, take up thy brush!' They would never have said, 'Pick up your Bunsen burner,' or 'Pick up a test tube,'... They said, 'Take up thy brush,' because they felt that David would say something of very deep meaning. You wouldn't get that out of science at that time. Meaning in terms of human life: that comes from art, I think. So, I don't find the idea of aesthetics as a branch of cognitive science compelling at all. Why should it be?
Indeed, the "idea" is not compelling or even sane. Neither aesthetics nor any other branch of philosophy per se can simply be collapsed into any other discipline. Healthy boundaries form around disciplines for a reason, the same reason that they form around individuals and around communities. People create what they need. Cognitive science is a gift to philosophers, actually. It could have saved them much time and trouble over the centuries. Still, no science can supersede quite everything that philosophers contribute to knowledge and to practice. Finally, if nothing else, someone has to keep a wary eye on the lab-bound cognitivists and their armadas of "volunteer" subjects. This point has recently been remembered thanks to the "replication crisis" of the 2010s and 2020s, but it will be forgotten soon enough.
That is one way a philosopher might respond to this question, one way they might be conciliatory and realistic without participating in their own premature supersession. Unfortunately, with all due respect to the venerable Nation critic, Danto's answer is very troublesome when it is considered in relation to the specific question that was asked. It is a philosophically naive answer issuing from a philosopher of considerable intellect and learnedness, a philosopher who in fact began his career, as he also relates here, precisely in the philosophy of science. What is going on here?
These remarks imply that there is something about the social practice of art which must remain a Black Box not merely to formal "science" but to collective rationality writ large; and meanwhile, whichever human being is on the receiving end of the "interpretation" (if there is indeed anyone out there) is somehow receiving knowledge, or at least enlightenment; failing that it is edification, and as a last resort it is harmless entertainment.
This section elaborates the contrary view: it is certain individual cognitions which must remain (and in any case will remain) irrational, properly speaking, in order for art to be practiced socially in a just and meliorative manner. Conversely, it is precisely the social practice of art which must be hardened in the crucible of rationality in order for any just individualism to be possible. Art is made and experienced irrationally but must be transacted rationally; which is really to say, ethically. Whether artist, astronomer or actuary, one learns through practice how to do right by one's fellows and crystallizes this knowledge somewhere on the "outside" of oneself, where others may find it close to hand and subject it to their own practical evaluations.
This refers to "transaction," not to form or content. An art transaction can be made un-ethical by the form-and-content of an artwork if the outcome is both foreseeable and sufficiently adverse; but form-and-content cannot make such a transaction ethical, no matter how hard one tries. Work does not become more virtuous the more virtue is packed into its aesthetic surface. Virtuous content may be transacted unethically. These six words are a serviceable encapsulation of the entire recent history of art and entertainment. Here, then, is yet another pragmatically underdetermined contradiction. It is far easier for an artist (or anybody else) to know how to ruin everybody's day than how to make their day; all the same if both kinds of knowledge are very incomplete.
The bottom line, then, is that works of art do not need to be interpreted; in fact they need not to be interpreted but rather experienced, as Susan Sontag says in her first famous essay. Interpretation also, as Sontag pungently argues, may well be a distress flare shot into the sky by a critic who finds that "for some reason a text has become unacceptable; yet it cannot be discarded." "The interpreter," then, "without actually erasing or rewriting the text, is altering it." Of course this can never be facilely generalized or assumed, and it has undoubtedly become less the case over time, in purely statistical terms, with the advent of innumerable genres and subgenres of criticism. It is clear enough that criticism per se has not, for a long time, been bounded by the mere need to dissimulate in the face of changing circumstances and unchanging "texts." The mere statistical norm, however, is not quite what is at issue here. What is at issue is the worst case.
What is being done, if not full-stop "alteration" of a text, when a critic of Danto's erudition issues a bulletin for which there was no life-and-death need? What if the bulletin does suggest a change to the meaning of a text, perhaps gently, or worse, perhaps simply passes down word that the meaning has changed? What is to be done about this, or because of it? If the answer is, "Nothing at all," then the matter could be closed and critics would themselves have become artists, and criticism an art form, in truth rather than just hyperbole. This, however, does not accord with everyday experience at all, it does not accord with anything critics say when prompted to elaborate upon their métier, nor does it accord with very many of their actions, in any way at all. If critics were not trying to will something very particular into being, if they were not trying to act upon the world in a very particular way, they would no longer be critics, indeed. Thou dost protest too much.
The most plausible hypothesis is found in Becker's remark that "a self-reflexive animal can only get the full meaning of its acts by observing them after they have happened;" in other words, it is that critics are committing an act for the purpose of observing it. Any "meaning in terms of human life" then emerges in terms of their own lives. No meaning of any artwork enters into it except as a cipher.
To say this is not to say that criticism is "performative" in the contemporary sense, or not exactly. In one important respect it is the reverse. It is not (or not only) a performance to be observed by others. The critic is the one doing the observing, actually: after they have observed an artwork, a blank page, a few cups of coffee, and for the luckiest ones, a meager per diem, then they may observe, finally, the full meaning of these ("their own") acts. One can only envy them this privilege, and one can only pity the "audience" for criticism, the "levers" without which the critic's "self cannot come into being" (Becker).
A great many critics have accused artists of playing or building or writing only for themselves. Perhaps it takes one to know one.
Artists do quite often proceed solipsistically and in disregard of their fellows, it is true. What is wrong with that? A few things, but probably nothing too terrible. The big problems arise when people say and act as if they are doing the reverse of what they are actually doing, when they concede that their arguments are not strictly rational, because none can be, and when they assert, therefore, that those arguments cannot be met rationally. This is some kind of problem with artists, really with all human beings. It is the whole problem with criticism.
There is much yet to be said about formal, professional criticism and critics, but be it proposed also, in brackets as it were, that "criticism" here can stand as metonym for a wide swath of pragmatist and colloquial discourse about intentionality, and "critic" for the human beings so implicated and occupied. Every artist, then, faces critics and criticism, faces the above-named contradictions; not so often in some public forum like a newspaper or a Twitter feed (not unless the artist is very well-known), but certainly in everyday life. The contours there are easy to see, though, whereas in formal criticism all tracks are covered very carefully. That is the only difference.
In Sontag's allegory of ancient interpretation, the audience, so-called for present purposes, does not have much to do but await the word from on high; and then, once things settle down, there is presumably no more word, or perhaps not until "the power and credibility of myth" has once more "been broken by the 'realistic' view of the world." That is not quite how things remained, however, once "classical antiquity" was in the rear-view mirror. Instead,
the contemporary zeal for the project of interpretation is often prompted not by piety toward the troublesome text (which may conceal an aggression), but by an open aggressiveness, an overt contempt for appearances.
No reader of old Arthur Danto's books could reasonably accuse him of "aggressiveness," not in manner or intent anyway; and if it was charged, in court, that he himself evinced "an overt contempt for appearances," nothing more nor less, the litigation of that charge would have to proceed in a very peculiar way: it would go on for a very long time, certainly; it would absolutely captivate two or three dozen people scattered around the world without captivating anyone in the courtroom very much at all; it would involve establishing that Danto's elaboration of the problem of indiscernible objects was itself a concern with and for "appearances," namely with those of the identical variety; it may involve demonstrating that the man has always shown some minimal concern, at least, for his own "appearance." All of this would undoubtedly end in his full exoneration and return to public life, unscathed, having had his dirty laundry aired and his clean laundry introduced as evidence. But this, still, would not show that criticism, his or anyone else's, really needed to be written. It would not show that the world would be any different had none of it been written, nor would it show why any of it was written. To what urgent questions, then, is criticism the answer?
A standing army of Danto's colleagues have spilled a whole century's worth of ink attempting to enumerate those urgent questions, the ones which purportedly arise at the beginning of the critic's process and closely guide everything that happens from there. It is obvious enough by now, though, that it is the answers which come first, answers in search of questions, always searching and never failing to find some mark. A rigorous argument may be presented, but the premises from which it has been followed out are not rigorous.
The very same year that Sontag published "Against Interpretation," Danto was present for the premiere of Warhol's Brillo Boxes. This was his peripeteia; this was the detonator stuck in a piñata with the fuse already lit. Most importantly, it is the origin story of much philosophical insight into and speculation about the role of aesthetics and appearances, both in general and in the endeavor which is (still) called "art." The sheer erudition of this work of Danto's is beyond reproach. His statements here, however, are not beyond reproach, nor, for all their erudition, are myriad details of his print works.
"Interpretation," says Sontag, famously, "is the revenge of the intellect upon art." By 2024, ever-advancing specialist knowledge and global interconnectedness make for a gathering storm of revenging intellect. The seeking of revenge on this scale, however, is precisely what must not happen if there is to be any art left in the world at all, or any justice, or perhaps any people also. It has already begun, of course, and it proceeds apace. Therefore, it is necessary to leverage both philosophy and cognitive science, and much else besides, in order to show just what interpretation and criticism really come down to and why it is imperative that they, eventually, come down to nothing at all.
Social Ceremonials
Danto's and Maes' comments here exalting interpretation actually bring into higher relief a fundamental problem of transactional criticism: to agree that "there is an infinite number of interpretations" is more or less to grant that even "very deep meaning in terms of human life" is ultimately ephemeral.
Danto speaks of an interpretation "fall[ing] into place." He speaks of an art historian wrestling with the meaning of "a series of enigmatic frescoes in Ferrara." "He really cracked it! But it wasn't cognitive psychology that did it, it was the zodiac, finally, that gave him the clue."
Per Maes' preliminary remarks,
Given [Danto's] conception of art it seems obvious what the task of the art critic should be, namely, to find out what a work is about and then explain how the stylistic choices of the artist embody the meaning of that work.
The first problem, then, with any effort to explain how "meaning" is embodied in an artwork is that this meaning must first be nailed down; but if there are infinite interpretations limited only by the "intelligence" and "motivation" of the interpreter, this suggests either that meaning too can be infinitely elaborated or that something inscrutable happens as the critic completes the first part of her "task" and turns to the second part. If ultimate "explanations" of how meaning is "embodied" by "the stylistic choices of the artist" are literally "infinite," then what can be the use of any one of these explanations, or even a few dozen very "intelligent" ones?
To say that there are infinite interpretations is not to say that anything and everything can qualify as an interpretation. Only certain statements qualify, to be sure. This at least draws some boundaries around the enterprise, but these boundaries remain far broader than the conceit to "crack" the code of an artwork based on the available "clues" would suggest. Linguistic compositionality and "human meaning" alike are scalable, perhaps functionally unbounded; and so it is never too difficult to come up with something that "falls into place." This only becomes difficult when methodological rigor is called upon, before the fact, to close off almost everything that may be said; this all the same if, technically, there is also an unbounded field of statements which are not interpretations. Needless to say, though, laboratory psychology itself, gallingly, has only very recently understood the need for "preregistration," the need to "separate" the "exploratory" phase from the "confirmatory" phase of research. Criticism is not psychology, nor is it any other kind of science; and yet the above bipartite method suggests "researcher degrees of freedom" so great as to preclude the ascription of any definite "task" to a critic so occupied. It also suggests, as does recent science, a bias toward positive results. Where are all the interpretations which do not quite "fall into place"? Are there any "stylistic choices" which cannot be called into service to "embody" any and all possible meanings?
Danto speaks also of seeing and interpreting the Sistine ceiling: "The first thing that hit me was that the birth of Eve is in the middle." He notes that his eventual interpretation would never have been possible to arrive at "if you [that is: he] didn't live in a culture saturated by feminism. ... So you never know where a new interpretation is likely to come from." All the same, "I don't think there was any whiff of feminism when Michelangelo was doing that."
This suggests that the meaning on which interpretation bases itself not only can change but that it will change. And indeed it does, as is suggested, to work with the given example, by the ever-changing fortunes of "feminism" and the more constant but not-quite-entirely-constant fortunes of the Sistine. The example also suggests, though, that such change is macrohistorical: it is too slow to be observed as it is happening, and it does not permit of clean distinctions between the time before and the time after the change. If so then the audience for criticism will be waiting a long time between installments. They do not wait very long at all, though, and that is the tell.
According to Maes, Danto's
emphasis on the contextual and historical nature of art . . . is an aspect of art that, in Danto's opinion, receives its most illuminating expression in Heinrich Wölfflin's claim that 'not everything is possible at every time.'
And indeed it is not. Beyond any doubt. What of it? What was possible but was not realized? What about all of that stuff? Or, is history predestined and all that remains for criticism is simply to observe of any given event that it must be just so?
What Danto does not offer, the cryptic remark about Derrida notwithstanding, is a straightforward example of something that is possible at all times: two people approach an artwork in the same time and place and, for any of a number of reasons, ascertain in it wildly different meanings. This rejoinder works best when the object is an art-object, because art reception, conventionally, is the paradigmatic subjective act; but it does not need to be an artwork qua artwork for the point to hold. If sports teams never wear the same jerseys for more than a few televised games a year, then people in the back of the bar will have trouble figuring out who is playing whom; they will know that there are only two teams, but not who is who; and they will not know the "meaning" of the final score, though to know the score is to have perfect information about it. To hazard a touchier example, people of androgynous appearance, gait and grooming are easily "misgendered" by peers; how and how easily being a matter of bitter dispute, of course.
Even on the more granular levels of perception and discourse-about-perception, there are complications. As William Ittelson puts it, in a paper that every Conceptual Artist would do well to read closely,
The wonder of art is that it "evokes a corresponding resonance in the mind of the maker and the recipient." But this is rarely achieved. Between the original intent and the final markings are many possible "alternative histories," only one of which was followed and cannot in principle be retrieved from the marking alone. We can, of course, ask the creator of the marking, if available, what the marking is intended to mean. This may work in a few simple cases, but, in the general case, it runs into a curious paradox. This seemingly innocuous question asks the originator of the marking to produce a second marking that will elucidate the first. Will this marking be more easily understood than the original? Presumably not, or it would have been created in the first place. So we enter an endless regress. ... We can, as an alternative approach, ignore the original intent and assess the marking entirely within the nexus of contemporary social structure and practice. But this produces multiple answers, no one of which can be demonstrated to be uniquely correct. In actually dealing with markings, we inevitably combine both approaches. There is, in principle, no way of determining a single, stable answer to the correct perception of a marking.
Ittelson continues,
Nevertheless, in dealing with markings, we do not experience anarchy; instead, we typically perceive a definite meaning that we are confident is correct. We can do this because the form or structure of the marking does provide powerful constraints on the meaning. These are the same constraints that guide the hand of the maker of the marking,... These constraints rarely lead us astray. ... Markings that have the form Have gone to the store are rarely intended to mean Am in the kitchen cooking dinner. ... Markings that have the form of a cat are rarely intended to represent dogs. When we couple these constraints with equally powerful constraints imposed by current social usage, very little room may be left for multiple interpretations.
This process works; we generally deal successfully with the vast numbers of markings we daily encounter. But this is a pragmatic, not principled, solution to the problem of the correctness of the perception of markings, and it can be wrong.
What, if anything, then, simply must be "principled" when it comes to the symbolic and aesthetic aspects of art? Is there anything? Is it all the same things which simply must be so for artist, audience, and critic alike? Ittelson means "principled" only in the most abstract sense, does he not? He could not possibly mean what a GP or a CEO means by "principled," could he? He would not dare to imply that one or the other of the people who disagree as to the meaning of an artwork simply must be wrong? That one or the other must have something wrong with them?
No reductionist answer to this problem fails to be highly misleading. If "the nexus of contemporary social structure and practice" happens to be derelict in some such small matter, then that matter, tiny as it may be, is fully intractable all the same; and then, what have the interpreters taught, if they have taught anything at all, but that it is the artists who are society's "principled" ones, sniffing out these microderelictions like bloodhounds?
When Ittelson refers to "alternative histories," he is not making history here, nor is he unmaking it; simply put, he is onto the problem with elevating Wölfflin's aphorism to a truism, onto it from another angle of attack; and he attacks, so to speak, in proper academic fashion, with a torpedo and a shrug.
Metacriticism's habitual elision of this half of its own hypothetical becomes comprehensible once it is seen just what havoc this ephemerality of meaning wreaks upon the doctrine that "artworks need to be interpreted" as that doctrine is rendered here by Danto and Maes. No human being has to look very hard, or at all, for "meaning" as they walk the proverbial earth. Purpose? Yes. Meaning? No. If a human being walks into a monochrome canvas or an atonal string quartet, there is all too much meaning there, evidently, for those who find these works most abominable, same as for the True Believers.
Neither type of audience has to look very hard for what something means to them.

This personal meaning, though, is "inside" the person and therefore is "unrevealable" in the general run of things, and critics know that the effort to reveal it anyway does not make for good transactable material. So, they have to look very hard for something that is transactable, and in this they also are terribly constrained, now on the "outside," by all manner of social norms and revealed preferences among the audience. So, if they do not like the first thing, or the first hundred things that pop into their head, then they just have to keep trying to "crack" the problem. But it is their own problem; the audience would better be left out of it entirely.
If a critic (or anyone else) is unsure of the meaning of an artwork, even to themselves, yet they feel, for whatever reason, driven by conscience to settle the matter, they cannot really do this very well anyway simply by pinballing interpretations off of their social cohort. They can discover the "pragmatic" ways of their community this way, certainly, and that is actually far more important than settling into discrete personal meanings. But this pinball's short journey strictly "within the nexus of contemporary social structure and practice" not only cannot establish "the original intent" (another hornet's nest), it also cannot establish what the artwork really means to that person; and if they say that it does, then they have discovered a truly adept form of pragmatism which will serve them well and probably keep them happy. Unfortunately, happiness be damned, a statement to this effect is unfit to enter into "principled" discourse. It should not even enter into everyday life, actually, because it cannot be trusted even there.
To this it will be objected, first, that people are but products of their social surroundings. Even if this is granted in such a reductionist form, it fails as an objection. People are not mere empty vessels to be filled with whatever comes floating down the river, any and all of it. Even the most thoroughgoingly "social" people do not just blindly incorporate any and all incoming stimuli; even they do not incorporate it one-to-one without any difficulty or distortion; all appearances to the contrary, it must be said, as politely as possible. Social superstars are people, too. People filter. Most of all, people distort almost everything that they let in, and though there are some near-universal mechanisms of distortion, there is nothing like universality of outcome on the level of, say, the "interpretation" of artworks. If this were not the case, any and all information gathered about the world would be perfect information, despite being incomplete; there would be theoretical but no empirical science, and "the nexus of contemporary social structure and practice" would never be railroaded into pinch-hitting for either while it is truly neither.
Next, it will be objected that it really is quite unimportant whether any given audience member can say, for and to themselves, what the work in front of them "really" means.
Bingo.
What is so often said of critics and academics, that they write about something because they cannot "do" it, is actually very wrong in principle, even if it is factually true once in a while. It would be better, actually, if critics became as artists, not so that they would stop writing but so that they would stop thinking it is the discerning of "meaning" or of "stylistic choices" which is their "art form" in a peculiar metaphorical (perhaps hyperbolic) sense. It is past time to put to bed the idea that criticism is not really an art because it is too rational or "objective" to be art. Criticism is wholly irrational and illogical, actually, not necessarily because it is really some kind of covert art, but because its pretensions, its methods, and its outcomes are not coherent with each other.
What on earth would artists be doing if they too were not making their work simply in order to observe its effects, after the fact? Is that not the point? Artists had very well better observe any effects of their work that they are able to observe. That is part of any practice. It is the only way that art, too, does not end in war and pestilence. This is not the same, however, as making art (or criticism or history) in order to observe its effects.
This is what Pure Art is: Pure Art is not made in order to observe its effects. They may be observed, and must be observed, sometime, in order to reconcile the practice with community norms; but the moment of creation need not be colored with the same kinds of intent as are those social practices by which consciousnesses and selves are formed. The pure artist has already done some of that, and is always doing more of it, but their art is not the vehicle for this part of life. They have enough of that already; otherwise, even this species of absolutist, still a human being, would have to go looking for some social action, and fast, lest their consciousness and their self were to disintegrate from lack of social stimulus.
Pure art differs from quotidian social practice in this way. That is not the same thing as saying that it is not a social activity or institution in any sense at all. Rather, it is made, just as its doppelganger expression says it is, "for its own sake." What does that mean? It means that the artist does not have any effect in mind from the outset. They know that making art changes them, somehow; making art also, technically speaking, changes the world too . . . somehow. But this kind of artist does not have a preconceived change in mind, not on either side of the transaction.
It is dangerous, actually, not to have thought at all about what the change might ultimately be, is it not? Indeed it is, and that is why the pure artist replaces the seeking of effects with seeking to minimize any effects; and, to reiterate, they are not ignorant of everything else they have ever observed. The theory does not hold that effects, inevitable as they are on both transmitter and receiver, are ignored. It is only that when new work is made, effects are not what is sought; whereas there is much, including much criticism, where effect is sought not even to observe it, but indeed also "for its own sake." That is the whole problem.
How to minimize effects? The ultimate minimization of effects would be to do nothing, to make no art at all. Yes? No. That is Pure Purity, the regression to nil.
The world needs some art, first of all. By now there is no sense in insisting that art, broadly, is not necessary, even if very little of it, in granular detail, is even pleasant or interesting. To entirely withhold or repress one's art would be just as bad as to make too much of something that is not needed.
Secondly, one cannot get out of the game other than by death, and that is (thankfully) not a palatable option for most people, at least not until they first have had their fill of life. Death is not palatable, and it need not be any more palatable than it is.
By then, artists (all people) are at the table, and they cannot get out of the game.
Something must be done.
Try to hear that phrase as matter-of-factly as possible; try not to hear it in the imperative, as though uttered by a social or existential superior. There is no Big Idea here, but rather a tiny granule of formal game theory.

As long as one is still alive, one is doing something. Try to hear this thought, right now, as value-free and open-ended, and yet remember that it is fulfilled, eventually, only by discrete, value-laden deeds.

Everyone who lives does something or other; nobody who lives does nothing. Life that is already underway and ongoing does not permit of abstention. Some course of action must be taken. For fans of abstention, this means that all other courses may be abstained from, even if life itself (without simply ending it) cannot be. Much is possible at all times, but not all is realized, nor should it be.
Wölfflin's aphorism is not wrong; it is just meaningless.
The artist Asger Jorn was, it would seem, not of this persuasion, not exactly, but he encapsulates the point beautifully (and in 1964, no less).
What one expresses through destruction is critique. Critique is a secondary reaction to something primary which already exists. What one expresses through artistic creation is joy of life. Art is primary action in relation to the unknown. The French have brought critique into the revolutionary plan, but if critique also becomes the purpose of creative art, and the creative artist thus a "specialized worker," whose work should only serve the permanent revolution's permanent consumption, then these Situationists have lost any sympathetic contact with the artists who seek to create a joy of life for its own sake, and drive them precisely into the arms of the power elite, which always controls the destructive instruments that can crush the people down, and which always make sure to have a moral excuse to make it all good and thorough.
What does Jorn mean by "primary" and "secondary"? As a matter of formal procedure it means that the artwork has come first and the critique second, but perhaps with time and space being relative this is not as important anymore. More importantly, what it means is that the critic works off of something fixed, or at least fixed in some peculiar ways that the mere apple in the eye of an artist is not. Criticism is knowing and calculated. There is much triangulation, "working out" and "falling into place," but there is no scientific rigor; there is no solution which is comparable to Mr. Young's dampening of the floor in the rat corridor. Certain Ivory Tower criticism can only be called rigorous, in execution, and yet it remains woolly in its very first conception, on which the execution is based. It is a long math problem with a rounding error in the very first calculation.
A critic must converge on something rather than diverge out into space (though that can just be art and it is not always bad). But they may converge on any among dozens or hundreds of things, and the way they decide which one(s) to converge upon is nothing noble or enlightening; rather, it is "transactional," as much formerly naive social life has become.
The need for a transactable self is not, in and of itself, a new development. Becker, following Goffman, devotes considerable space to the matter.
We can only fully appreciate the importance of face when we realize that nothing goes deeper than the exposure of the self-esteem to possible intolerable undermining in the social encounter.
... On the one hand, society has a right to engage the self, to lay a social claim on it... On the other hand, each individual has the right to keep others at a distance, and insist on his body privacy, his separateness, the simple fact that he is a person. ... There is a delicate tension to be maintained in social life, between avoiding and approaching others, a recognition and respect for the self, and a tacit claim on it.
He continues, a few pages later:
If we are properly proud, we have learned not to submerge others with what may be uncomfortable private data. We may tell our boss that we are ill, but we will not tell him the shape and color of our stool. To have learned honor is to know when to refrain from encompassing others with one's inappropriate designs. It is this overflow that we call "privatizing" the social context. ... Man must make provision for the utmost sensitivity in social intercourse. Goffman goes so far as to say that this fine social sensitivity is what we mean when we speak of "universal human nature." ...
Ceremonials for avoidance provide for a psychic as well as for a physical distance. They imply that the self is personal. On the other hand, ceremonials for engaging the self imply that, if properly approached, the self cannot refuse to be social. We may politely decline a seat someone has offered,... We refuse the gesture but we acknowledge the validity of the social claim. We cannot kick the chair over and remain silent. Now, it is obvious that neither of these processes could occur at all if there were no integral performance selves. Therefore, a fundamental obligation for social living is that the individual have a self. There must be something socially transactable.
May artists, audiences, critics, and all other people "learn honor," in due time.
Criticism in situ
Reflecting upon the Situationist movement after it had entered history, Jacqueline de Jong, interviewed by Karen Kurczynski, offers some penetrating hindsight on just what all the mutual "critique" had come down to in the end.
JdJ: The Drakabygget people were making détournements of articles of mine, such as "Gog and Magog," in their magazine. They even did it with articles by [Asger] Jorn. You could say there was a degree of faking in what they were doing. One thing that has not been mentioned is that we, the Situationists, always had an anti-copyright declaration in all the magazines. The anti-copyright statement in The Situationist Times read, "all reproduction, deformation, modification, derivation and transformation of The Situationist Times is permitted." Of course it means that everything is permitted, but you don't expect your comrades to deform your texts as they were meant to be serious.
KK: Deform or détourn?
JdJ: Deform, I think. It's a good question.
As with Sontag's account, these are essentially nonrepeatable historical events which can only mislead as prospective forecasts. They are mere cautionary tales: in other words, they are historical events whose repeatability is not known but is feared.
From a distance it is easy enough to conclude, perhaps without really wanting or trying to think so, that the Situs were really asking for it when they gave the entire universe blanket permission to "modify" their work; to "transform" its very meaning. If a piece of social rhetoric is "meant to be serious," if indeed it is part of a larger project of revolutionary change, why not be absolute at least in forbidding it to be "deformed"?
Of course
there was always some
such absolutist
rigidity
just under the Situationist surface
and periodically boiling over:
the antithesis will ride again, always.
Presumably there may be
good
strategic
reasons
to
choose rigidity or playfulness,
but it is clear enough in hindsight that
pure strategy
is difficult even for an absolutist to achieve.
Absolutists are people, too.
What is
especially
poignant here,
finally,
whether or not it really explains or prefigures anything, is that "you don't expect your comrades" to be the ones who go the
furthest
in "deforming" the (by extension
their own)
revolutionary rhetoric. Hence the "element of faking," indeed. Artists and revolutionaries alike have too often had to squander tremendous portions of their meager wherewithal policing the fakes among their comrades. That is one problem. The bigger problem, though, is that it is more than mere faking when it is your own comrades who have done it.
So it goes with the Situationists, among whom everybody seems to be asking for it, from each other, fairly obviously. That is most of the Situationist practice right there. Who is at any greater fault than any other here? Someone must be! This is hindsight calling. Hindsight is monist. People appeal to the dual nature of causality only when they need, for whatever reason, to stamp out the semantic incommensurability of "victim" and "perpetrator." Why? Probably there are as many reasons as there are people. Anyway, this is how to blame the victim without appearing to do so: blame everyone as if all were equally complicit and equally wronged; despair of sorting things out any more granularly (which is, admittedly, usually very difficult). Go dual. Be absolutely correct, intellectually correct, in doing so.
One way to go granular, to identify a true victim, is if the victim was not using the perpetrator as a lever; if the victim was not intent on observing the effects of their actions; if the victim, rather, had no intent and no intended actions to realize at all, or not vis-a-vis the perpetrator at least. It was the perp, rather, who had intent, acted on it, and more importantly, observed its effects. After the act is "completed," it must be re-completed. It must be contemplated, admired, reflected upon, perhaps promoted and written about and archived, or perhaps condemned and punished and outlawed. Heads may roll, or they may explode. Something must be done, in any case. It matters less what is done than that something is done.
Hiding in plain sight here, there is a tell. Namely, the events de Jong describes, conducted among an insular bunch of woolly-headed post-artists, these events do not move very many other people very strongly; they are not even known to anyone, really, who is not a scholar of art history or of left-wing activism. That is because these events do not matter very much, and because they are, probably, nonrepeatable. But there are other matters far more grave, so grave that there is no way for humans to live together or live at all unless the severity of an event is permitted, against all purely rational considerations, to modulate the two sides of the causative duality such that it ceases to be the same duality in practice that it is in the abstract. The worse is done before, the more people know and care about it after. Bad news travels faster than good, the faster the badder.
For as long as there have been humans, humans have lived in groups. Becker, in the first of his trilogy which is unfortunately also the least read, goes into some detail about what is known here.
Probably the most exciting development in modern anthropology is the discovery of the australopithecines,... As far back as over a million years ago the first of these animals roamed the grasslands of southern and eastern Africa, and one of their outstanding features was that they roamed well:...
We used to think that a large-brained upright primate arrived on the evolutionary scene, and that this large brain permitted him to learn to use tools, develop complex speech,... But now we see that man's large brain is a rather late development. ... As you read the fossil record, it appears that the man-ape's taste for meat was progressively satisfied by increases in hunting skill:... The hunters seem to be slowly coming of age, taking possession of the world around them more masterfully and surely. ...
But as we said, in order to be efficient hunters these man-apes had to develop new forms of social organization... Popular writers today try to convince us that what we call distinctively human is something that we really share with the baboons:... But to make such analogies is not only cheap journalism, it is all wrong. ...man developed away from the apes precisely because he had to hunt meat; and if you want to hunt meat you cannot afford yourself the luxury of baboon behavior. ...you cannot fight over the kill, or over the females back at the camp... ...you need rules about social relations. The best way to get cooperation among volatile, erotic primates is to regulate sexual relations... ...the result of this, as Marshall Sahlins has so well pointed out, is that you get your recognition from others not on what you take—like the baboons, but on what you give. Among primitives today the main reward of the one who kills the big animal is the prestige of being able to distribute it to his family and to others. Often the hunter himself gets the smallest share or the least desirable part of the animal. Unlike the baboon who gluts himself only on food, man nourishes himself mostly on self-esteem.
In other words, these "man-apes," once they had advanced and "roamed" a bit, also had to work out some quick-and-dirty pragmatism, the kind of pragmatism that is "radical empiricism's companion theory of truth." Like all later human groups, they would periodically have been confronted by certain sticky situations wherein both parties to an intercourse could not be held equally responsible in the eyes of the community without the community itself dissolving. Ascriptions of responsibility thereby begin to depend on what the intercourse has been, not merely on such causal facts as can be established. This is a pragmatic, not principled, solution . . . and it can be wrong. This is a matter of judgment; it is a judgment call, so to speak, which becomes crystallized as practice, perhaps later as built environment, as institutions, and indeed as thought too, which is where it began. And so it goes, around and around again; and if the human beings involved in this process are fortunate enough to be capable of some abstract thinking and much practical learning, if they continue making small adjustments in this way without arrogating to leap over and past their own capacity to observe the effects of their actions, then they can well and truly advance. They will progress. They are not progressing toward infinity or utopia, to be sure, but they may converge upon some waystation, and then lose their way ever so slightly, and then discover the next. They are never quite finished getting the mix dialed in, because it is no more possible to simply set it and forget it than to start from nothing each time the sun rises anew. Anyone who is thinking about giving this latter idea another shot to prove itself would do well to consult the fossil record first.
The zero-degree of community practice, starting from nothing, is blindly destructive and irrational. If it is not known how to live, not even provisionally, then almost everything will be a mistake and successes will accrue so slowly that no one can detect them. Human beings have long since passed through the very worst of this. There is absolutely no good reason to go back to it. Looking on the bright side, it seems very unlikely that human beings ever again would be truly thrust back to the very beginnings of communal practice and literally have to discover or rediscover their way forward. Anytime a wide swath of practical knowledge is forgotten, however, when it is suppressed or repressed, or when it for some inscrutable reason just disappears precisely when it is needed most, then the community and all the humans which comprise it will also disappear, and fast. If intellectual people start "interpreting," if they start trying to offer up "meaning in terms of human life" that does not actually come from life but from the ether, if they, "without actually erasing or rewriting the text," are indeed "altering it" pragmatically and semantically, then things will soon take their course. Whether the carrying out of this interpretation in fact evinced "aggression" or "revision" or "detachment," whether bystanders feel one or another of these to have been the case, post hoc, whether any of this has meaning "in terms of human life," nothing of this sort much matters by then, because by then human life has ceased; if not yet literally, then pragmatically all the same.
This is not to say anything at all about the global or cosmic necessity of any one community continuing while others disappear. That is the view from outer space, and if more and more human beings will, finally, be treated to that view in the coming years, this does not change anything about what one is and is not able to learn this way. Like the self itself, which according to Stephenson is "overly attitudinal" and formed only "in relation to social controls," national prejudice is formed merely by awareness of certain kinds of others and by awareness of real or projected "control" they might exert upon one another: by nations' mere awareness of each other, that is. Mere awareness is all the "social control" any nation needs to form a self in opposition to an other. Nations are abstractions which are beyond the observation of any one person. This is why immigrants, eventually, on the whole, are accepted by and also do some accepting of their new communities and surroundings; and meanwhile, national feuds can remain quite literally blood feuds for centuries on end, for dozens or hundreds of years after good numbers of the principals have managed to live together, grudgingly but decadently, on the other side of the globe.
Viewed from outer space there is no destiny of nations or peoples which can possibly be privileged over all the others. It is only the view from the inside, from within one's own community, from which arises the question: should I stay or should I go? Shall this community continue to exist as such? Shall I remain a part of it? Those questions may be answered independently of each other, certainly, but it is not easy to actually choose disappearance. Not very many people are ever going to choose to disappear, though they may know that they will, eventually, and they may be utterly at peace with all of that. They may appear to have chosen this, but probably they did not. On this, Becker says, poignantly,
Anthropologists have long known that when a tribe of people lose the feeling that their way of life is worth-while they may stop reproducing, or in large numbers simply lie down and die beside streams full of fish...
As the folk wisdom would have it, nothing lasts forever. Nothing ought to, either. The problem is that disappearance is not usually instantaneous. Usually it takes just long enough to wreak tremendous suffering on the creatures, before they pass out of existence, mercifully and much the worse for wear. And so it is absolutely essential to do everything possible, always, to preserve, to conserve the work that has already been done, to at least ensure that it remains in the toolbox even if it is not required at the moment, and to ensure that living practice has taken its cue from this work and not simply wiped the slate clean or smeared it beyond legibility; all of this so that disappearance and destruction, the ultimate "changes," also progress only at a comfortable and leisurely pace which gives people a chance to make their own peace with this reality, in whatever manner, and to find a little bit of happiness along the way.
All of this, for all human communities, everywhere and always, no matter who they are.
There is just one caveat:  
One global community is no community at all.
". . . everything, and nothing . . ."
If human beings notice progress in their midst, without looking too hard for it, then they are already in grave danger of being destroyed by it (and they probably have been for a long time). Change must never seem to be happening very fast, or at all, although it is always happening and must always happen, and it cannot be kept under wraps forever. The question is simply: What change? How fast? Does it work?
Human tolerance of change is very well understood. Each among the canonical academic silos has some understanding of this problem that is very good, and there is plenty of integral folk wisdom, scattershot but periodically validated by practice, which can also be consulted. There really need not be too much argument about what people can withstand and what they cannot, and so there is no argument to be had at all about what they ought to be expected to withstand. And of course where change really comes from, in some ultimate causal sense, is not too important. If humanity eradicates itself, certainly, it will not have to worry about refining its sciences of worldly causality.
Short of all that, change is good, but only if it is made gradually, and only if it is validated by practice. It does not matter where or who it comes from. If your nemesis builds the best widgets or brews the finest beverages, you steal the idea (you appropriate it, without asking permission first) and you leave the sleeping bear alone to get its beauty rest. You do not interpret or critique the work. Who has such time to donate from among their meager accounts? Who can afford to risk destruction in the face of danger in order to stop and "work out" an "interpretation" of the practical knowledge that is to save one's own hide? And if a comrade declares that you have thus become as the nemesis, or that you have wronged the nemesis by appropriating their still-living idea as if it were mere deadwood knowledge, you point out that all of you, self, nemesis and comrade, are alive, still, and you offer your comrade a drink and a pat on the back.
As Becker relates it,
We call precisely those people "strong" who can withdraw a pseudopod at will from trifling parts of their identity, or especially from important ones. Someone who can say "it is only a scratch on a Ferrari,"... They disentangle themselves easily and flexibly from the little damages and ravages to their self-extensions. Financiers who can say of a several million dollar loss: "well, it's only paper." ... Or those, like Nikos Kazantzakis' father witnessing the ruin of the grape crop in a massive rain storm, and in the midst of the general hysteria, calmly saying "Never mind, we are still alive."
So, what is "trifling," and what is "important?" If human beings do not judge as well as do and think, if they do not always do these things, at all times, somewhat, as opposed to carving out special preserves where special authorities do the work on everyone else's behalf, then their community will disintegrate very quickly; and if they have already advanced very far indeed, sometime previous, then there will be unfathomable suffering as economic niches evaporate and social ecosystems collapse. And if their "community" is the entire globe . . .
Besides the fact that responsibility for human acts can be established only in "pragmatic" and never in "principled" terms, there is the problem that the act itself may be so heinous that to allow it ever to happen again is unthinkable. And this, indeed, is most of what a community is. This is precisely why there are communities: so that people can look out for each other, can look out for the interests of those among their fellows who cannot do so very well for themselves. That is what communities do. Everything else they do is secondary, and distantly; the art too is a distant second, or third, or fourth, as even most artists understand quite well and unproblematically (the exception being of course that art too has "advanced" through practice, and so its recession does cause some discontent, perhaps even real suffering). Therefore, it is not only right but imperative for communities, provided this is done as communities, to handle an assault in a contradictory fashion to how they handle a sarcastic remark which is mistaken for a sincere one.
This is more than right, it is essential to survival itself. But it is true only of "communities." It is not true of worlds, globes, galaxies, or universes. Those things are very large and do not work quite the same way. It is not known how human beings would adapt this process to a "global community," because the very notion is a contradiction in terms. More is different.
The really pernicious part of intellectual activity, the thing that has led so many of its finest practitioners to disavow and condemn it, is that it can uncover contradictions such as that between the principled and the pragmatic semantics of "victim" and "perpetrator," and from there it has once again become possible, for some people, sometimes and someplaces, to feel secure and correct in blaming victims for their own misfortune, whether it is a death or a stubbed toe, because that is the intellectually coherent way to look at this problem. It is right and wrong at the same time, and when intellectuals confront this through the lens of the gravest issues, many of them shrink back in terror. Some of the very best cannot bear to face this type of problem, not in spite but because of their sensitivity, farsightedness and intellect. It is precisely the finest who are most likely to make disavowals because they are the ones who actually understand just how baldfacedly destructive "principled" human action can be, and by then there does not seem to be anything to do but simply to make a boisterous repentance and to then pursue some sort of soft nihilism or split life. The mediocrities may see the problem but are not as concerned about getting it just right; or they may be finests who have shrunk back; or they may be adventurists who have sprayed ammo at a problem and gotten lucky enough to graze the butt end of it. The rest of the people, the folk, the non-intellectuals, a-intellectuals, and anti-intellectuals, could land anywhere that happenstance delivers them, including upon the highest spiritual enlightenment borne of practice, or upon a vicious rigidity and unwillingness to heed all the signals that their social world is sending them; and everything in between and afield of these too, of course.
All the same, though intellectual activity can go wrong in a thousand and one ways, the world suffers greatly from the inability of so many brilliant people to face up to their own pessimistic conclusions and to allow these conclusions, gently, to poke their heads out into the world to see-and-be-seen. Optimism begins in pessimism, always. Very intelligent people must cultivate their intelligence, because uncultivated intelligence is the single most dangerous force on planet earth. It will not suffice for all to become naive empiricists, nor for any kind of empiricists to cling to their naivete like a teddy bear. Intellectuals still must apply their intellect to the task of parsing empiricist findings, and all parties must be gentle in transacting their conclusions. It takes a village . . .
The bald fact is: there is an all-too-human contradiction between mundane causality and catastrophic causality, and there is nothing to do but to accept this contradiction in the very most severe and least severe cases. And in the middle? The middle, though there is less "severity" residing there, is very, very large.
NOTES
. . . "Works of art need to be interpreted" . . .
Arthur Danto, interviewed by Hans Maes in August, 2011.
http://aesthetics-conversations.com/arthur-danto/
It will be objected that these may be casual or off-the-cuff remarks made during an interview, in Danto's own apartment, with his guard down towards the end of a long life. It will be objected that they need not or should not be parsed so closely, that they are throwaways, and that the man leaves behind a towering bibliography of formal publications.
God willing, there will be plenty of time and space allotted here to consider Danto's oeuvre, later. For now there are just two rebuttals. For one thing, the interviewer takes these remarks quite seriously: he has gone to press with Oxford and has hard-coded this online excerpt as graphics rather than text; and for another thing, it may be better to catch a philosopher with his guard down if the aim is to find out what he really thinks.
. . . "People create what they need" . . .
Albert Murray. Quoted in Eugene Holley, "What Albert Murray Taught Us About Jazz."
https://www.npr.org/sections/ablogsupreme/2013/08/24/214831904/what-albert-murray-taught-us-about-jazz
We invented the blues; Europeans invented psychoanalysis. You invent what you need.
For Rank, in the English at least, this is called "dynamic need of equalization." (Art and Artist, p. xlviii)
. . . "if the outcome is foreseeable and if it is negative enough" . . .
An aside: think about the constraints upon form and content, and upon the social and transactive contours, that would make the affective "outcome" of an artwork the least bit "foreseeable." Not much can be foreseen when one's audience is an entire school or city, let alone the entire globe. (And, see next note.)
. . . "the worst case" . . .
To reduce Nassim Nicholas Taleb's most famous thesis to its barest bones: the probability of an event occurring is one thing, its severity is something else. An event can be extremely unlikely, and yet if its consequences are severe enough it is worth taking precautions against it anyway.
This is the root of Taleb's great contempt for economic "forecasters." It is, by the same token, at the root of this study's great contempt for artists who "forecast" the effects of their work on an audience.
. . . "for some reason a text has become unacceptable" . . .
Susan Sontag, "Against Interpretation," sec. 3 (p. 6).
In Against Interpretation.
. . . "the contemporary zeal" . . .
Sontag, ibid.
. . . "Interpretation is the revenge of the intellect upon art" . . .
Sontag, "Against Interpretation," sec. 4 (p. 7).
. . . "methodological rigor ... before the fact ... everything that may be said" . . .
Center for Open Science, "What is Preregistration?"
https://www.cos.io/initiatives/prereg
As this section is being written, this is the top Google search hit for "preregistration." In sum,
Preregistration separates hypothesis-generating (exploratory) from hypothesis-testing (confirmatory) research. Both are important. But the same data cannot be used to generate and test a hypothesis, which can happen unintentionally and reduce the credibility of your results.
. . . "The wonder of art ... "alternative histories" ... endless regress ... a pragmatic, not principled, solution" . . .
William H. Ittelson, "Visual perception of markings" (pp. 184-185).
. . . "a corresponding resonance" . . .
For purposes of style, Ittelson's citation is omitted in the main body above. Here he cites Rudolf Arnheim, "From Pleasure to Contemplation" (p. 197).
. . . "inside ... unrevealable" . . .
Ernest Becker, The Birth and Death of Meaning: An Interdisciplinary Perspective on the Problem of Man (pp. 28-29).
(See complete discussion at 0-2, "The Essentiality of Essentialism.")
. . . "people distort almost everything that they let in" . . .
There is no value judgment in saying "distort." This is just a literal (if also tremendously reductionist) description of how certain aspects of human cognition are currently understood to work. It may well be that science is not done rolling back old essentialist prejudices, that it may just be getting started on that. Science may not be finished revealing just how adaptable human beings can be. This does not mean that the knee-jerk constructivist battle cry of "social, political and cultural context" can explain much of anything. On this, see Rank, who figured all of this out without the aid of contemporary cognitive science.
. . . "minimize any effects" . . .
The resonance here with so-called "entertainment" will immediately be noticed. One theory about the difference between art and entertainment (if such a theory of "difference" can be accepted, provisionally, as a value-free assessment and not as a blanket intellectual aggression against popular culture) is that art changes people while entertainment leaves them the same as they were before.
The reality, of course, is that change is never reduced to zero in this way. The whole point here, regardless, is that one need not seek change in order, eventually, to find it. Similarly, the best way not to find change is to seek it.
Much recent art and entertainment seeks change explicitly and wilfully, with obvious consequences.
. . . "Critique is a secondary reaction ... Art is primary action" . . .
Karen Kurczynski, The Art and Politics of Asger Jorn (p. 183).
This excerpt comes from "a 1964 lecture, after the artists and the SI had split." The source is an "unpublished manuscript" in the Jorn Museum archive, translated by Kurczynski.
Kurczynski continues:
Jorn perceived Debord's SI [Situationist International] as going too far, so that its claims to destroy art effectively relinquished art entirely to those in power. This self-marginalization would allow art institutions to take control of the group's historicization by default and perpetuate the very apolitical conception of art that the Situationists wanted to overturn.
Jorn's points stand well enough on their own, but to be fair, he somewhat talks past Debord when he threatens a loss of "sympathetic contact" with artists. Debord put forth a powerful and nuanced theory justifying this total break with art, centered on not giving artists the generous benefit of the doubt that they had come to enjoy in mid-twentieth century bourgeois European societies. It is fair to rebut Jorn on this basis, but only to a point. The difference is not really reconcilable, as events indeed indicate.
A stronger strategy for Jorn here would be to eschew any bold forecasting and to attack Debord where he lives, i.e. to take his theory apart in more detail. (Jorn is of course perfectly able to do so, but this is not a simple task nor a concise one.) One does not have to proceed very far with this before plenty of irresolvable problems arise for Debord, whose obscurantism and utter contempt for the "existentialists" in his own midst (with whatever justification) lead him into a life-denying absolute teleology; this most especially vis-a-vis the anticipated thrust from fragmentary to unitary life, which reduces the complexity of the situation beyond the point where such reduction is tenable. Human beings are nature's supreme acclimators; they will find the angst even in utopia, and then it is not utopia any more.
Jorn stands to Debord somewhat as Engels to Marx, or Rank to Freud, and skewing acrimonious rather than comradely, but the very most important points are held in common nonetheless. Hence it is hard to shake the impression that Debord simply had it out for art whereas Jorn had more personally invested in it, though of course that is irrelevant here and cannot be proven anyway.
. . . "something fixed ... the mere apple in the eye" . . .
There is a critical tendency, as specious as it is erudite, to deploy a combination of cognitivist reductionism and classical sociology to argue that the "apple in the artist's eye" could only be exactly what it becomes; hence it also is "fixed" before the fact as rigidly as any concrete artwork is after the fact. There is no rebuttal to this line of argument, at least not here on earth, because it is irrational in every detail. Causality is dual. Human beings are not able to view or experience causality from only one side, although they are very capable of speaking this way all the same. Enough has already been said about this.
. . . "the importance of face" . . .
Becker, Birth (pp. 88-89).
. . . "If we are properly proud" . . .
Becker, Birth (p. 91).
. . . "you don't expect your comrades to deform your texts" . . .
"A Maximum of Openness: Jacqueline de Jong in conversation with Karen Kurczynski."
In Expect Anything Fear Nothing, ed. Mikkel Bolt Rasmussen and Jakob Jakobsen (p. 195).
. . . "the discovery of the australopithecines" . . .
Becker, Birth (pp. 1-3).
. . . "radical empiricism's companion theory of truth" . . .
Mark Reybrouck, "Musical Sense-Making and the Concept of Affordance: An Ecosemiotic and Experiential Approach" (pp. 397-398).
In an original epistemology which he [William James] has coined as radical empiricism, he states that the significance of concepts consists always in their relation to perceptual particulars. What matters is the fullness of reality which we become aware of only in the perceptual flux. Conceptual knowledge is needed only in order to manage information in a more ‘economical’ way. As such, it is related to principles of cognitive economy:
"It is possible ... to join the rationalists in allowing conceptual knowledge to be self-sufficing, while at the same time one joins the empiricists in maintaining that the full value of such knowledge is got only by combining it with perceptual reality again."
...
Pragmatism, which can be considered as radical empiricism's companion theory of truth, attempts to deal with the problem of meaning. ...
Elsewhere (i.e. everywhere), Reybrouck's account grates harshly against myriad conclusions reached by the present study, but his terminology, both the borrowed and the original, is very helpful and is good for populists to have close at hand.
. . . "consult the fossil record first" . . .
It is often pointed out that the human bones themselves show much gruesome evidence of violent death, the more so the earlier one looks, and the less so as time (and progress) marches on.
Lewis Mumford writes eloquently about the "Neolithic Village" and holds that this was, very long ago, already the closest human beings have yet come to getting the mix dialed in.
e.g., Mumford, The Myth of the Machine, Vol. 1: Technics and Human Development (pp. 158-159).
The extraordinary durability of neolithic village culture, as compared with the more daring transformations of later urban civilizations, bears witness to its having done justice to natural conditions and human capabilities better than more dynamic but less balanced cultures.
. . . "Anytime a wide swath of practical knowledge is forgotten" . . .
See Mumford, again, on the transition from pre-modern "polytechnics" to industrial "monotechnics."
. . . "overly attitudinal ... social controls" . . .
William Stephenson, The Play Theory of Mass Communication.
(See 0-2, notes.)
. . . "beside streams full of fish" . . .
Becker, Birth (p. 76).
. . . "Human tolerance of change" . . .
Bruce E. Wexler, Brain and Culture.
Wexler's book is a concise, readable and sensitive overview of cognitivist thinking on this topic.
Becker, Birth (p. 34).
The immediately preceding passage reads, in part,
an individual's house in a posh neighborhood can be more a part of his self-image than his own arm—his life-pulse can be inseparable from it.
How else can you understand an event like the one that happened a few years back in Paris, when a stranger sat behind the wheel of a parked new Jaguar for a few minutes, evidently only to "get the feel" of it: the owner came out and shot him dead. He must have felt that the stranger had defiled his own inner self. It is not that he was "crazy" so much as he was brittle and over-identified with his material object, the material extension of his self. I once read the case of a man who could have nothing more to do with a wife who had been raped. Anatomically the wife's damage was slight or non-existent; but in terms of the extensions of his self, it was the husband who felt defiled. ... And what about the financiers in 1929 who threw themselves from tall buildings because something had happened to the numbers in their bank-accounts? It is just as William James had said: they were the numbers, and their value had gone down to zero so they were already dead. Generally, the more anxious and insecure we are, the more we invest in these symbolic extensions of ourselves. In the United States today, ridden by social change and crisis, "desecrating the flag" has become a major offense. It is not that the flag has risen in value, but that the selves are more anxious about their own. ... You get a good feeling for what the self "looks like" in its extensions if you imagine the person to be a cylinder with a hollow inside, in which is lodged his self. Out of this cylinder the self overflows and extends into the surroundings, as a kind of huge amoeba, pushing its pseudopods to a wife, a car, a flag, a crushed flower in a secret book. The picture you get is of a huge invisible amoeba spread out over the landscape, with boundaries very far from its own center or home base. Tear and burn the flag, find and destroy the flower in the book, and the amoeba screams with soul-searing pain.
Usually we extend these pseudopods not only to things we hold dear, but also to silly things; our selves are cluttered up with things we don't need, artificial things, debilitating ones. For example, if you extend a pseudopod to your house, as most people do, you might also extend it to the inventory of an interior decorating program. And so you get vitally upset by a piece of wallpaper that bulges, a shelf that does not join, a light fixture that "isn't right." Often you see the grotesque spectacle of a marvelous human organism breaking into violent argument, or even crying, over a panel that doesn't match. Interior decorators confide that many people have somatic symptoms or actual nervous breakdowns when they are redecorating. And I have seen a grown and silver-templed Italian crying in the street in his mother's arms over a small dent in the bumper of his Ferrari.
(pp. 33-34).