Sunday, June 30, 2013

At Noon

The thick-walled room’s cave-darkness,
cool in summer, soothes
by saying, This is the truth, not the taut  
cicada-strummed daylight.
Rest here, out of the flame—the thick air’s
stirred by the fan’s four
slow-moving spoons; under the house the stone
has its feet in deep water.
Outside, even the sun god, dressed in this life
as a lizard, abruptly rises
on stiff legs and descends blasé toward the shadows.

— Reginald Gibbons

At Every Turn, an Angel with a Flaming Sword

Emily Eakin:

He abandoned his faith as a teenager and eventually completed a Ph.D. with Roland Barthes, by which point he'd concluded that Christian concepts of guilt and redemption were inescapable. "Baudelaire said that civilization is the abolition of the original sin," he told me. "In fact, it's not true; we haven't abolished original sin but rather spread it all over." In Bruckner's scenario, Marxism transposed the idea of Christ onto the working class; paradise would come after the revolution. Then, with third-worldism, colonized peoples became the embodiment of virtue. Now, he says, it's Mother Earth: "She is suffering, the metaphor of all victims." Or, as he writes in The Fanaticism of the Apocalypse, in reference to the ubiquitous phrase "carbon footprint": "What is it, after all, if not the gaseous equivalent of Original Sin, of the stain that we inflict on our Mother Gaia by the simple fact of being present and breathing?"

Mechanism Is Not Meaning

It is only natural that advances in knowledge about the brain make people think more mechanistically about themselves. Still, mechanism is not meaning. The brain creates the mind through the actions of neurons and circuits, yes, but it cannot reveal its nuanced contents. Despite our romance with the brain, we make sense of ourselves and the world by thinking about desires, intentions, and actions. No matter how intricately scientists understand the brain, they won’t be able to answer why we sabotage ourselves—the question that, in some form or another, has launched a zillion therapy hours. It won’t compel us to adopt a new moral code or revamp our system of criminal justice.

Learning more about the brain will help unravel more about the knotty relationship between mind and brain; it will make deep inroads into treatments for diseases such as Alzheimer’s. But no matter how dazzling the fruits of inquiry or how ingeniously they are obtained, brain-based explanations of our longings, exploits, and foibles are sure to break our hearts.

Saturday, June 29, 2013

Dysrationalia

Keith Stanovich:

To think rationally means taking the appropriate action given one’s goals and beliefs — what we call instrumental rationality — and holding beliefs that are in synch with available evidence, or epistemic rationality.

Again, the competing definitions of what it means to be rational make this difficult. Instrumental rationality takes the goals and beliefs as givens, and simply seeks the most efficient way of connecting them. A murderer or drug addict could exhibit rationality within those parameters. Epistemic rationality involves more of a value judgment on those goals and beliefs, but as anyone who has ever attempted to argue a friend or relative out of some crazy opinion knows, even available evidence can be undermined by mistrust of the source, or the suspicion/hope that later evidence will contradict what's now being asserted as fact. Ah, well, I was already pessimistic about human nature anyway.

This seemed kind of interesting:

I coined the term dysrationalia — an analogue of the word dyslexia — in the early 1990s in order to draw attention to what is missing in IQ tests. I define dysrationalia as the inability to think and behave rationally despite having adequate intelligence. Many people display the systematic inability to think or behave rationally despite the fact that they have more than adequate IQs.

Literature Cops Pulled Me Over and Beat Me with Their Book Clubs

Norman Geras:

One method of defining human nature is - more or less empirically - as constituted by those characteristics which human beings share in common. On this basis, reading good fiction can't be one of the constituents of human nature, given how many members of the human species haven't read Proust, Tolstoy, F. Scott Fitzgerald or whoever else it might be.

A conception of human nature can, on the other hand, be an openly normative one, containing an ideal of how to live well. As part of such a conception one might commend reading as making us human or more human. But it is important to note, then, that this is precisely an ideal, held by some people and not by others. It doesn't oblige anybody. For my own part, I think the locution is regrettable that suggests that non-readers aren't as human as readers.

Along those same lines, John Gray quipped that it was strange to see those who call themselves "humanists" spending so much time denigrating what is possibly the defining human activity, religious worship. Do you really love "humanity" just as it is, or do you only love the idea of it, what you imagine to be the potential of it? There's a busybody continuum ranging from mildly annoying to truly dangerous, and while proselytizing book lovers are obviously harmless, it's interesting to see how minor preferences are so quickly and easily shaded with moralistic colors. However, vicarious experience is merely an increased, intensified version of what's already there in the reader's head, good and bad.

As a general rule, I have no great affection for my fellow hairless apes, individually or collectively, but rather than seek to impose my particular vision of the good life on as many of them as possible in the hope of making the world more to my liking, I'd rather spend my time creating that world in miniature and let the rest go about their folly.

Friday, June 28, 2013

The Usefulness of What Is Not

Caspar Melville:

For Dennett there is no hard problem of consciousness at all, and the fact we think so is a consequence of misdirection by philosophers like David Chalmers and illusions created within our own brains – we may think that we are something more than chemicals, and that our mind is not confined inside our skulls, or that we have a soul that transcends the plane of mere matter, but we’d be wrong. “There’s nothing immaterial,” he says; “consciousness isn’t immaterial any more than centres of gravity are immaterial.” Does this include my sense of self?

“Oh, yes, in exactly the same way as the centre of gravity is. The centre of gravity is a nice concept, because it’s physics, it’s not sociology or psychology. It’s a term well regarded in the physical sciences. And yet it’s not an atom, not a particle. It’s a mathematical point, an abstraction. A very, very useful abstraction. And the self is another very, very useful abstraction. It’s not made of anything, any more than the centre of gravity is made of something. But its features are not arbitrary and they are entirely fixed, in the end, by the chemistry of the atoms that compose the phenomenon.”

I think it would be funny to start referring to Dennett as a Taoist sage, though I suspect he might not appreciate being associated with mysticism even in jest.
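
Dennett's centre-of-gravity comparison is easy to make concrete, for whatever it's worth: a centre of gravity is nothing but a mass-weighted average of positions, a number you calculate rather than a particle you could point to. A minimal sketch of that arithmetic (the masses and coordinates below are hypothetical, my own illustration rather than anything from the interview):

def centre_of_gravity(masses, positions):
    # A mass-weighted average of positions: a perfectly definite
    # point, but not any particular particle in the system.
    total = sum(masses)
    return tuple(
        sum(m * p[i] for m, p in zip(masses, positions)) / total
        for i in range(len(positions[0]))
    )

# Three hypothetical point masses on a plane.
print(centre_of_gravity([2.0, 1.0, 1.0],
                        [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]))
# -> (1.0, 1.0): entirely fixed by the physics, made of nothing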

Summer Days in the Mountains

Too lazy even to move a feather fan,
stripped naked in the deep green forest,

even my headband left on a stone wall somewhere,
I let the pine winds ruffle my hair.

— Li Po

Evening After Rain

Sudden winds brought rain this afternoon
to save my thirsty garden.

Now sunset steams the grass
and the river softly glistens.

Who'll organize my scattered books?
Tonight I'll fill and fill my glass.

I know they love to talk about me.
But no one faults me for my reclusive life.

— Tu Fu

Infallibilism and Nihilism are Twins

David Deutsch:

Fallibilism, correctly understood, implies the possibility, not the impossibility, of knowledge, because the very concept of error, if taken seriously, implies that truth exists and can be found. The inherent limitation on human reason, that it can never find solid foundations for ideas, does not constitute any sort of limit on the creation of objective knowledge nor, therefore, on progress. The absence of foundation, whether infallible or probable, is no loss to anyone except tyrants and charlatans, because what the rest of us want from ideas is their content, not their provenance: If your disease has been cured by medical science, and you then become aware that science never proves anything but only disproves theories (and then only tentatively), you do not respond “oh dear, I’ll just have to die, then.”

The theory of knowledge is a tightrope that is the only path from A to B, with a long, hard drop for anyone who steps off on one side into “knowledge is impossible, progress is an illusion” or on the other side into “I must be right, or at least probably right.” Indeed, infallibilism and nihilism are twins. Both fail to understand that mistakes are not only inevitable, they are correctable (fallibly). Which is why they both abhor institutions of substantive criticism and error correction, and denigrate rational thought as useless or fraudulent. They both justify the same tyrannies. They both justify each other.

As a mere pseudo-philosopher, I'm pleased to see recognized thinkers arriving at the same conclusions that I have reached through my plucky, can-think spirit.

Tuesday, June 25, 2013

Bros Before Prose (Slight Return)

As you've no doubt noticed, the Internet has struggling and/or failed writers standing around forlornly at the end of every URL off-ramp. It's an unforgiving, stingy market for them. Thus I can't realistically fault some poor fucker with an utterly useless English degree and an unread manuscript for grasping at any straw that promises a slight career advantage, even if it might seem unsporting, say, to invoke racist or sexist discrimination in order to shoulder one's way closer to the head of the line. Whatever idealistic bullshit you may occasionally read about the moral benefits of fiction, writers, like anyone else, will do what they have to in order to get ahead.

It is tedious, however, to see the faux-civil-rights issue of gender imbalances in prominent literary outlets becoming an increasingly popular spectator sport, complete with all the snark and posturing typical of online drama. But whatever, fine; let's just say you somehow manage to browbeat all the prestigious journals into presenting an equally divided pie chart of male and female authors and reviewers in every issue. Then let's say that a new study reveals that over the course of a year, male authors received positive reviews 58% of the time, while female authors received positive reviews only 39% of the time. Will that be accepted as just the way it goes, or will the same insinuations and accusations of sexism be leveled again in an attempt to work the refs? As you can guess, I'm inclined toward cynicism.

The Overheated Rantings of Some Fool Who Can't Read

Maria Bustillos:

Kickstarter is a private entity, and can police its use however it likes. No one has a "right" to Kickstarter. Out in the real world, however, we know that the rights to publish and speak are ten thousand times more important than the overheated rantings of some fool who can't read. As PUA guides go, Hoinsky's is very, very far from being the most aggressive or objectifying. It is an entirely harmless book—as all books are. In fact I found Hoinsky's Reddit guide to be full of interesting and valuable observations. There isn't a misogynist syllable in the whole thing. But even if it were as bad as the worst excesses of Frank T.J. Mackey, it is wrong and dangerous to suggest that shutting someone up is the best answer to any problem, ever.

Yeah, I think I heard something about that. Wasn't there some fatuous, conclusion-jumping blowhard manufacturing drama about it recently? Ah, right, that's the one. Well, surely, once he had time to calm down and look into it more carefully, he acknowledged letting confirmation bias and ideology get the better of him, yes? Oh.

Skepticism, honey, I know the divorce couldn't have been pleasant, but you did the best you could. There's no reason you should have had to put up with his repeated infidelity, and we're all just glad you got away before he dragged you down into the gutter with him.

Every Single One of Us, the Devil Inside

John Gray:

Evangelical Christians will look at you with blank disbelief if you suggest that Christian teachings played any part in the Inquisition, the early modern witch craze or later forms of persecution. “How could a religion of love,” they splutter, “possibly be responsible for such hateful crimes?” Similarly, today’s Enlightenment evangelists respond to the fact that some of the worst modern crimes have been committed by militant secular regimes with incredulity: “How could a philosophy of reason and humanity possibly be involved in anything so irrational and inhuman?”

These responses illustrate one of the central tenets of fundamentalism: the pristine creed is innocent of all evil. Any fact that runs counter to this conviction is screened out by what Karl Popper – one of the more interesting 20th-century Enlightenment thinkers, who along with Freud is absent from Pagden’s account – called a strategy of immunisation. Just as any Christian who participated in hate crimes can’t really be a Christian, anyone who took part in bloodthirsty political experiments such as Jacobinism and communism can’t really belong in the Enlightenment.

To departisanize a favorite saying of a partisan hack blogger, ideologies can never fail, they can only be failed.

Rather Like the Seasons

Fourth, the power of the progress idea stems in part from the fact that it derives from a fundamental Christian doctrine—the idea of providence, of redemption. Gray notes in The Silence of Animals that no other civilization conceived any such phenomenon as the end of time, a concept given to the world by Jesus and St. Paul. Classical thinking, as well as the thinking of the ancient Egyptians and later of Hinduism, Buddhism, Daoism, Shintoism and early Judaism, saw humanity as reflecting the rest of the natural world—essentially unchanging but subject to cycles of improvement and deterioration, rather like the seasons.

“By creating the expectation of a radical alteration in human affairs,” writes Gray, “Christianity . . . founded the modern world.” But the modern world retained a powerful philosophical outlook from the classical world—the Socratic faith in reason, the idea that truth will make us free; or, as Gray puts it, the “myth that human beings can use their minds to lift themselves out of the natural world.” Thus did a fundamental change emerge in what was hoped of the future. And, as the power of Christian faith ebbed, along with its idea of providence, the idea of progress, tied to the Socratic myth, emerged to fill the gap. “Many transmutations were needed before the Christian story could renew itself as the myth of progress,” Gray explains. “But from being a succession of cycles like the seasons, history came to be seen as a story of redemption and salvation, and in modern times salvation became identified with the increase of knowledge and power.”

Eve Fairbanks:

Those images became a consuming mystery for me. UFS hadn’t remained segregated after apartheid’s end—it had integrated and then resegregated later. I wanted to know why the white students raised those ancient flags, and why the black students had left Karee. I uncovered a tale of mutual exhilaration at racial integration giving way to suspicion, anger and even physical violence. It seemed to hold powerful implications well beyond South Africa, about the very nature of social change itself. In our post–civil rights struggle era, we tend to assume progress toward less prejudice and more social tolerance is inevitable—the only variable is speed.

But in Bloemfontein, social progress surged forward. Then it turned back.

Monday, June 24, 2013

Conscript, Deserter

Stephen Asma:

Steven Pinker expresses a well-worn normative suggestion when he says that the world should move away from tribal or group thinking and feeling, and embrace the "rights tradition" of individualism. He argues, in The Better Angels of Our Nature, that violence recedes as individualism rises.  The rest of the world could profit from the recognition, Pinker argues, that we are individuals, and individuals are the ones that "really count" (they actually feel the pleasure and pain). "Groups," he says, "are a kind of abstraction."

I'm going to disagree here and argue, somewhat counter-intuitively, that Pinker is the abstraction. I am the abstraction. You, gentle reader, are the abstraction.

The independent individual is a hero to WEIRD cultures (Western, Educated, Industrialized, Rich and Democratic), and it serves as the starting place for both pessimistic and romantic theories of the social contract. Whether you're a Hobbesian who thinks the selfish ego must be constrained by the community, or a Rousseauian who laments such constraint (or even a Rawlsian), you still start from a metaphysic of individualism. But what if the individual is actually an ecological, developmental, and political construct?

The argument will be familiar to all those who grew up watching the Saturday morning philosophy roundtables, where Bugs and Daffy enquired into the "true" nature of the famous rabbit/duck illusion (only to have Elmer aggressively advance his Marxist viewpoint that philosophers have hitherto only interpreted the world, but the point is to change it, and that the power to do so grows from the double-barrels of his shotgun; thus always can the rarefied life of the mind be brought low by thuggish violence). There was also the pop star Spinoza who famously encapsulated his theory of modal metaphysics in the whimsical lyric, "I am he as you are he as you are me and we are all together."

Me, I'm just a simple farmer, a person of the land, common clay, a fellow who merely knows what he likes, and what I likes is the sort of social arrangement that leaves me be, unmolested and undetained. Like air conditioning, Western individualism may be a highly particular and modern creation which may be unsustainable in the long run, but damned if I'm not going to exult in it regardless.

Friday, June 21, 2013

A Sensation Not Unlike Slapping Yourself in the Face

One must eat the other
Who runs free before him
Put them right into his mouth
While fantasizing the beauty of his movements
A sensation not unlike slapping yourself in the face...

— Jane's Addiction

Rhys Southan:

So this logical debate over veganism becomes either an eternal loop, or a stalemate in which everyone’s logical justifications slap against a stone wall and dissolve into a mist. Does this exercise seem as pointless to you as it now seems to me? The main value in it, I think, is that it affirms my earlier intuition that logic and rationality alone cannot tell us what to eat. Emotions have a big role in most of our decisions, and if someone doesn’t have the sort of emotional response to animal agriculture that compels them to give up meat, berating them with logic probably won’t do much.

I was hoping this essay would be the prelude to a closer look at the question of why we, with all our vaunted cleverness, can't find a workable way to rationalize feasting on our fellow shaved apes, but it appears our society still lacks the "stomach", ah ha ha, for such "meaty", oh ho ho, dilemmas. I mean, can't we at least scavenge our own? All that protein, buried deep underground or turned into ash? What a waste. Let me know when you're ready to get serious about this. Until then, good luck with continuing to avoid directly reckoning with the true, sanity-obliterating extent of pointless suffering woven into the very fabric of existence.

An Oppositional Ornament

Peter Orner:

My friend cut to the chase. “You’re not famous enough to be reclusive,” he said. “Actually, you’re not famous at all. Maybe you’ll get some traction after you’re dead?”

Apart from the obvious — i.e., there’s always death and the possibility of posthumous resurrection — my wise friend might also be right that a person might need a certain amount of celebrity in order to be known for having disappeared. And to my discredit, deep down, I admit this is pretty attractive. I want to retreat from the world and think and write in solitude. At the same time I wouldn’t mind a few readers knowing I’m out here being all mysterious.

Orner? Wait, didn’t he kick for the Vikings? No, no I’m talking about the writer, you know the dude that vanished…

A genuine recluse, of course, wouldn’t give a damn.

Yeah, genuine reclusiveness never looks back over its shoulder. John Lennon said life is what happens to you while you're making other plans; well, reclusiveness is what happens when you're too absorbed in your life's work to look up and take notice of the reaction it's getting from others.

Then again, Mr. Magoo might be a recluse by this definition.

The point is this! — all you can do is give yourself completely to your writing, and whatever happens, happens. Liu Xiaobo's wife had the right of it: are you truly committed enough to be invisible, or are you simply being coy and playing hard-to-get in the hope of greater reward? The online world especially, lending itself so readily to pantomime performance in lieu of substance, is full of people who want to loudly and visibly separate themselves from the group while being recognized and praised by the group for their independence. I would think such conflicted coquettishness would only detract from a serious writing practice.

Thursday, June 20, 2013

You Know the Good Old Days Weren't Always Good and Tomorrow Ain't as Bad as It Seems

When I noticed years ago in the course of my readings in philosophy and poetry that even the ancient Greeks and Chinese spent a lot of time griping about how the kids those days were a bunch of retarded delinquents and society had been going steadily downhill ever since some vaguely-defined golden age, a low-watt bulb buzzed and flickered over my head, and I said to myself I said, Self, you know what, I think all the curmudgeonly grumping you hear along these lines may just be some omnipresent thing in basic human psychology with no real correlation to anything objective about the world, and after congratulating myself on that perspicacious insight, I quickly resolved to ignore any such bloviating I might later encounter, seeing as how life is short and death is unanswered, and I have largely stuck to that resolution.

Therefore I'm glad that XKCD has gone to the trouble to spell out in more detail what I was content to dismiss with a grunt and an impatient hand-wave.

Wednesday, June 19, 2013

And They Showed Me a World Where I Could be So Dependable, Clinical, Intellectual, Cynical

Chris Mooney:

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,’” explains Stanford social psychologist Jon Krosnick. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

The whole article is very good, but I especially liked this part as it echoes what I was saying not too long ago. Also:

Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

The fact that such an obvious truism is not practiced by so many of the people who like to imagine themselves activists for sociopolitical progress is largely what convinces me that their professed ideals are little more than means to the expression of their own righteousness.

Monday, June 17, 2013

He Who Writes in Blood and Aphorisms Does Not Want to be Read, He Wants to be Learned by Heart

In the midst of another paint-by-numbers essay lamenting the baleful influence of digital media upon our reading and thinking, this part stood out to me:

As Nicholas Carr (2013) thoughtfully points out, Friedrich Nietzsche is an example worth considering. When Nietzsche started to compose on the typewriter rather than by hand due to poor eyesight, it changed the way he wrote philosophy. His prose changed from arguments to aphorisms, thoughts to puns, and rhetoric to telegram style. If we significantly change how we write, it significantly changes what we write. If we significantly change how we read, it significantly changes what we read.

He bought a typewriter in 1882. That was the year The Gay Science was published, a book many scholars consider the first expression of his "mature" philosophy. The following year saw the publication of Thus Spoke Zarathustra, which I might describe as a book-length epic poem/irreligious parable. Over the next few years, Beyond Good and Evil and The Genealogy of Morals, neither of which can be said to consist primarily of aphorisms, puns, or telegram-style terseness, completed this roughly agreed-upon phase of his career in which his most important ideas were developed to their fullest potential. Two of his most aphoristic works, Human, All Too Human and (my personal favorite) Daybreak, were published before his supposedly fateful encounter with the typewriter, the former's consciously chosen style owing more to the French writers he admired than to any crude technological determinism.

Funnily enough, a different sort of determinism — physiological — has also been suggested as responsible for his change of style. While rebutting that notion, Steven Michels stresses that however the aphorism first appeared in his style, Nietzsche quickly recognized the value it held for his philosophical project (a conscious rejection of the ponderous style of thought and argument that had traditionally dominated philosophy) and used it as a tool to that end:

Certainly illness played a role in Nietzsche’s brief bouts of creativity. But his aphorisms started much earlier than his chronic illnesses; many of his later books are less aphoristic, and his literary masterpiece, Thus Spoke Zarathustra, is anything but aphoristic. Nietzsche’s philosophising had more to do with the change in his philosophy than physiology. His aphoristic turn marked a turn in his intellectual development, to be sure, but it also allowed for a great advance in his philosophy, in presentation if not in development. His philosophy was limited by conventional forms, such as the treatise and the essay, and the aphorism allowed him to break from that. Although it was initially involuntary or subconscious, Nietzsche soon understood what this change of style allowed him to do philosophically, and he embraced it.

Oh, and about that typewriter:

Unfortunately, Nietzsche wasn't totally satisfied with his purchase and never really mastered the use of the instrument. Over the years, many people have tried to understand why Nietzsche did not make more use of it, and a number of theories have been suggested: that it was an outdated and poor model, that it could write only upper-case letters, etc. Today we can say for certain that all this is only speculation without foundation.

The writing ball was a solidly constructed instrument, made by hand and equipped with all the features one would expect of a modern typewriter.

You can now read the details about the Nietzsche writing ball in a book, "Nietzsches Schreibkugel", by Dieter Eberwein, vice-president of the International Rasmus Malling-Hansen Society, published by "Typoscript Verlag". In it, Eberwein tells the true story about Nietzsche's writing ball, based upon thorough investigation and restoration of the damaged machine.

Friedrich Nietzsche was not aware that his trouble in using the machine was caused by damage done to it during transportation to Genoa, Italy, where he lived at the time. And when he turned to a mechanic who had no typewriter repair skills, the mechanic managed to damage the writing ball even more. In his book, Dieter Eberwein presents all the typescripts Nietzsche ever wrote on his machine (about 60) and reveals the true story concerning the damage. Nietzsche also did not know how to change the direction of the colour ribbon, so he had to ask the mechanic for help each time the ribbon ran out.

Friday, June 14, 2013

Rattlers and Eagles All the Way Down

Kevin Hartnett:

After posting about John Gray's new book "The Silence of Animals" I found this talk that he gave at the Royal Society of Arts in April. There Gray concisely expressed the argument in his book. He explained that he doesn't deny the possibility of instances of progress, because such a statement is immediately, empirically false. Rather, he explained that he doesn't think that the advance of scientific knowledge has any correlation with advances in our ethical and political practices. That is, we could get smarter and smarter, know more and more about the world, and yet the same deep flaws in the human constitution would continue to cause the same ethical calamities to occur over and over again.

Perhaps counterintuitively, language often obfuscates perception rather than clarifying it. Whuddameanis, when we see chimpanzees in the wild conducting violent raids on enemy territory, there's no confusion over the meaning of it. It is exactly what it appears to be. Humans have developed more intricate, complicated methods of attaining power and wealth, as well as a wide variety of proxy simulations of violent conflict, but let me suggest that you try looking at any Internet squabble, not as a potential participant, but as if you're Jane Goodall describing events at Gombe Stream — ignore the abstract symbols, the linguistic justifications, and just observe the empirical reality, the neverending story of shit-flinging and status-seeking.

I'm not bemoaning humanity's failure to embody its professed ideals, lest you get the wrong impression. I'm saying that anger, jealousy, greed, melancholy, pride, scheming after power, etc. are fundamental aspects of human nature. Not only that, none of them are absolutely bad things. There are contexts in which all of them are perfectly rational and effective. And even if these "flaws" do frequently snowball into what we judge disasters and tragedies, there is no philosophy or algorithm that can anticipate and prevent them from ever coming into being. Some people dream that broad, sophisticated taste in the arts will maximize empathy in the audience, while forgetting that the artists themselves are often spectacularly flawed human beings despite their intuitive insight into the human condition. Latter-day positivists favor hard knowledge over touchy-feely intuition and think that an ideal, just society can be rationally deduced into existence, even though the distinction between is/ought and facts/values has been demonstrated over and over, and despite the fact that conscious awareness is incapable of contending with all the variables of existence, let alone manipulating them to its lasting advantage.

The tragedies and dilemmas we encounter, individually and collectively, are not necessarily errors which can be fixed with more knowledge. Increased knowledge will only be used in service to the same old unenlightened desires, possibly even creating new dilemmas in the process. And even when we know how little we know, we find a way to convince ourselves that we've learned our lesson and become smarter for the experience.

Atheists are, it must be admitted, particularly prone to this. Despite their pretensions to rational objectivity, a disinterested observer might be forgiven for suspecting that atheists are just another group of tribesmen who slew a rival desert god and ate his brains, believing they were ingesting his powers of omniscience. And so we find the Bad News Bears of social justice fighting (once again) over... hats. Seriously, tempers are getting hot and feelings are getting bruised as these oh-so-rational progressives desperately search for a way to justify the ancient urge to hate the everloving fuck out of someone just because. How ridiculous that anyone would ever fight to the death over the doctrine of the Trinity! How absurd that people would take skin pigmentation to be significant of anything important! How irrational that... hey, wait; that sunnamabitch over there is wearing the wrong kind of hat. And walking on our turf! I don't like the looks of him. Let's go fuck him up.

Like I was saying, I'm sure if chimps could talk to us, they'd also explain how, no, this is totally different, we have valid reasons for raiding their territory, they deserve it.

I realize this is just another group of dysfunctional morons on the Internet, of course; I'm not saying they're an actual threat to anything other than their parents' pride, no matter how "dangerous" their Napoleon-in-clown-shoes leader likes to pretend they are. The point is just to notice how, even among a subculture that prides itself on its supposedly clear, rational, scientific thinking, the same old behavioral templates are right there, ready to be filled in with new details. You know the old myth about the World Turtle? Well, when it comes to human psychology, despite all the fantasies and rationalizations, it's Rattlers and Eagles all the way down.

Wednesday, June 12, 2013

The Superstition of Those Afraid of the Water

When I think of Leon Wieseltier, I usually picture him with Brian Leiter's spitballs dotting the side of his face and neck, and though his relationship with Isaiah Berlin makes me wish I could be favorably disposed toward him, this sort of thing makes it tough:

I don’t call myself a journalist and I can’t call myself a professor, so the only professional term to which I answer happily is “intellectual.” And I think it is a very honourable term and I am happy with it. The Springsteen thing was written against a kind of journalistic complaisance. And to be wicked. I think that it is very important to give people an example of irreverence. We live in a culture of worthless praise.

It seems that in an age of blogging and online journalism everything is either solipsistic back-patting or pure vitriol.

Or it’s talk radio. And I have to say that there is not one blog, out of the eight million that must exist, that I read. The thing about blogging is that it is either someone’s first thoughts—which we know by definition are never their best thoughts—so that’s not interesting, or as time goes by they simply repeat themselves. Moreover there isn’t a lot you can say about anything consequential in 300 words. I write the back page of the magazine and I always wish it was three times as long as it is.

"I don't read any of those modern-day penny dreadfuls. Now stand back as I proceed to tell you all about them." Snort. For a fellow who's so concerned about scientism infringing on the humanities' jurisdiction, he doesn't seem to have any problem dictating the form "serious" writing must take to meet his standards. However, as I'm sure he's consolingly heard a few times in his life, length isn't everything.

(Hey, didn't you hear him say it's important to give people an example of irreverence? I mean, if he's already got all these haughty preconceptions about blogging, I might as well play to type.)

Personally, I prefer to treat prose as I do poetry: study your subject silently, choose your angle carefully, and then illuminate it with a flash of insight, a striking turn of phrase. Or, as a fellow who knew a thing or two about expressing profound thoughts in aphoristic form put it:

For I approach deep problems like cold baths: quickly into them and quickly out again. That one does not get to the depths that way, not deep enough down, is the superstition of those afraid of the water, the enemies of cold water; they speak without experience. The freezing cold makes one swift.

And to ask this incidentally: does a matter necessarily remain misunderstood and unfathomed merely because it has been touched only in flight, glanced at, in a flash? Is it absolutely imperative that one settles down on it? That one has brooded over it as over an egg? Diu noctuque incubando, as Newton said of himself? At least there are truths that are singularly shy and ticklish and cannot be caught except suddenly — that must be surprised or left alone.

There certainly are plenty of blogs that are nothing but tribal vitriol and status competition, but unlike Twitter, there are no inherent constraints on the form a blog takes. It can be anything from an online diary to a space for long-form essays, and like the essay, its adaptable, mongrel nature likely bodes well for its longevity.

Felonious Bunk

Kottke:

In a book called Three Felonies A Day, Boston civil rights lawyer Harvey Silverglate says that everyone in the US commits felonies every day, and if the government takes a dislike to you for any reason, they'll dig in and find a felony you're guilty of.

That looks very interesting, in a pants-shitting sort of way, of course. It also seems like it could be useful as a definitive response to the "If you're not doing anything wrong, what do you have to hide?" types.

Monday, June 10, 2013

Quot Libros, Quam Breve Tempus (X)

Two more additions to the stack. This is just a fun-sized edition of QLQBT, you might say.

That tour guide last year tried to tell me that William Penn was the first to create genuine freedom of worship in the colonies. The jacket copy of Barry's book says, however:

For four hundred years, Americans have wrestled with and fought over two concepts that define the nature of the nation: the proper relation between church and state and between a free individual and the state. These debates began with the extraordinary thought and struggles of Roger Williams, who had an unparalleled understanding of the conflict between a government that justified itself by "reason of state" (i.e., national security) and its perceived "will of God", and the "ancient rights and liberties" of individuals.

This is a story of power, set against Puritan America and the English Civil War. Williams's interactions with King James, Francis Bacon, Oliver Cromwell, and his mentor Edward Coke set his course, but his fundamental ideas came to fruition in America, as Williams, though a Puritan, collided with John Winthrop's vision of his "City upon a Hill."

Acclaimed historian John M. Barry explores the development of these fundamental ideas through the story of the man who was the first to link religious freedom to individual liberty, and who created in America the first government and society on earth informed by those beliefs. The story is essential to the continuing debate over how we define the role of religion and political power in modern American life.

I haven't really read much about Williams since my school days, so that should be interesting. As for John Gray, whose book I'm already half-finished with and enjoying as usual, I thought this interview with Nick Talbot was one of the better ones I've seen with him, Talbot's questions actually adding to the quality:

I was surprised to see you so often characterised as a conservative thinker – you certainly don't hold any positions that characterise, say, the “paleo-conservative” American right. (You have identified strains of utopianism in free market neo-liberalism and liberal interventionism and your embrace of James Lovelock's Gaia theory is anathema to most on the right.) Is your conservatism more in the mould of David Hume, perhaps? A sceptical caution over the human tendency to see patterns where there are none.

JG: I’m not sure it makes much sense to talk of conservatism these days. Certainly I share the view, often held by conservatives in the past, that there is such a thing as human nature, that it’s relatively constant and in some ways inherently flawed. (Thinking this way is one reason why I’m not a post-modernist.) It was this type of conservatism that the painter Francis Bacon had in mind when he said he always voted for the right because it made the best of a bad job. The poet T.E. Hulme said something very similar. But that kind of conservatism scarcely exists any more: Today conservative thinking oscillates between neo-con progressivism – a species of inverted Marxism – and paleo-conservative reaction, which amounts to not much more than a collection of ugly prejudices (racism, homophobia, misogyny). Both these versions of “conservatism” seem to me hostile to the conservation of civilised life. The genuine scepticism of David Hume is much preferable to anything that passes as conservative today.

At the same time I doubt if Hume’s rationalistic Enlightenment variety of scepticism is enough – for one thing, he had the good fortune to live before the age of militant political faiths and modern fundamentalism. Montaigne is a better guide, possibly the best, to living in a time of modern wars of faith.

...You argue that popular music's trite language of self-realisation owes much to the Romantic movement's emphasis on originality, but I see it as a logical result of the culture of individualism perpetuated by the New Right; instead of thinking how they can contribute to their community, young people have been encouraged to indulge egoistic fantasies. Is there any hope for encouraging a communitarian ethos in young people?

JG: I wonder if communitarianism means anything any more – think of Cameron’s big society. The prevailing individualism runs much deeper than anything owed to the New Right. Maybe we’re in a time akin to those in which the Buddha and Epicurus lived – in which it’s up to each individual, along with those they care about, to live as well as they can. To be sure, political and other types of collective action may be necessary to defend civilised values. But I don’t think any collective project can or should be viewed as providing meaning in life.

Hume, Montaigne, Buddha and Epicurus. Now there's a dinner party to fantasize about hosting.

Saturday, June 08, 2013

Mind-Monkey in the Middle

Galen Guengerich:

I describe myself as speaking for the majority in the middle: People between the atheists on the one hand and the fundamentalists on the other—people who value individual freedom when it comes to what we believe and how we live, yet reject the traditional views of God, scripture, the creeds, and all that. But I’m not convinced that individuals are the be-all and end-all, either, or that transcendence plays no role in life, as the atheists insist.

I think what we’re trying to do, what I’m trying to do, is speak for that group of people—not just Unitarian Universalists, but people from all points along the religious/non-religious spectrum who are trying to figure out how to balance what is rightly individual in our lives and in our culture with what is necessarily a communal undertaking. I think both ends of the spectrum err in one way or the other: the atheists by being radically individualist and spiritually isolationist; the fundamentalists by being radically collective and leaving no real room for individual belief. It’s a hard balance to find.

In the not-too-distant past, I might have made a predictable snarky remark along the lines of the classic XKCD panel. But honestly, I don't even care all that much about the ostensible subjects here of religion vs. atheism, individuality vs. collectivism. Now, when I read things like this, I'm interested in what seems a far more fundamental issue — namely, the cognitive bias behind the Goldilocks tendency to place oneself directly in the reasonable, moderate center of an issue, beset by extremists on either side. I'd be willing to bet most atheists and fundamentalists would fail to recognize themselves in this characterization. They all likely see themselves as maintaining a healthy balance between self and community, with perfectly logical reasons for drawing the lines where they do. It's only those crazy people over there who have it all wrong, obviously. Name any issue, and you'll find the same dynamics.

One of the first things we heard in philosophy class was Socrates' assertion that people are incapable of choosing what they truly believe to be wrong. Ferzample, even if you perform horrible, depraved actions with a gun to your head, it's because you feel that the greater good of preserving your life at any cost takes precedence over violating ethical standards you would otherwise uphold. Along those lines, I wonder if it's possible for most people to view themselves as societal outliers, as exceptions to the norm, without suffering either depression or delusions of grandeur as a result. Psychologically, we seem to need to envision ourselves at the median of whichever social context we find ourselves in.

All of Those So Proud Won't Improve Upon It

Annie Murphy Paul:

Recent research in cognitive science, psychology and neuroscience has demonstrated that deep reading—slow, immersive, rich in sensory detail and emotional and moral complexity—is a distinctive experience, different in kind from the mere decoding of words. Although deep reading does not, strictly speaking, require a conventional book, the built-in limits of the printed page are uniquely conducive to the deep reading experience. A book’s lack of hyperlinks, for example, frees the reader from making decisions—Should I click on this link or not?—allowing her to remain fully immersed in the narrative.

That immersion is supported by the way the brain handles language rich in detail, allusion and metaphor: by creating a mental representation that draws on the same brain regions that would be active if the scene were unfolding in real life. The emotional situations and moral dilemmas that are the stuff of literature are also vigorous exercise for the brain, propelling us inside the heads of fictional characters and even, studies suggest, increasing our real-life capacity for empathy.

Sigh. You know, having been an enthusiastic reader since childhood, I'm almost ideally suited to be favorably disposed toward this message. Wouldn't my ego just love to believe that my ability to concentrate intently on printed material made me wiser, deeper, more authentic? Luckily, I retain enough objective perspective to realize what a load of happy horseshit that would be. It's just one skill set out of many, and by no means the most important. (Ironically, it's hard to escape the impression that, for all the talk of empathy, advocates for the progressive benefits of literature seem to find it difficult to sympathetically imagine that a life lacking in sustained, focused concentration and reading for pleasure could still be fulfilling.)

Don't get me wrong — I'm perfectly content with reading being my main hobby, and I wouldn't want it any other way. But it's not like reaching a certain level of knowledge or experience or empathetic wisdom elevates you above the storm and stress of everyday life, allowing you to always feel serene or in control. Complexity is not a uniformly "positive" thing; your problems and obstacles increase in complexity as well.

And if being able to vicariously experience such vivid emotions and moral complexity makes us "better" people, wouldn't it follow, then, that the creators who envisioned such situations to begin with would be better as well? If reading great literature is such a net positive, shouldn't the writing of it indicate an exquisitely developed moral perspective? To just go ahead and put too fine a point upon it, are we to understand that great writers are rarely, if ever, shitty excuses for everyday human beings?

Friday, June 07, 2013

Cocaine Runnin' Around My Brain

America, I'm here today because I love and care about you. This is why, after the latest episode, I want to urge you to seek treatment for your addiction to moronic pop-science stories about how everything from chocolate to the Internet to make-up sex and fatty foods is "like cocaine". There are only so many facile comparisons your brain can handle, and I don't want to find you dead of an overdose.

Occam's Restaurant

David Cain:

With an increasing number of options in almost every aspect of life, we presume that our results in each of those areas should be getting better and better, because with each new possibility it becomes more likely that one of them suits us perfectly. Our expectations for perfection and total satisfaction are too high.

As freedom of choice grows, the perfect career, the perfect partner, the perfect schedule or the perfect salad dressing seem more likely to happen. Perhaps they are, but psychologically we’re less likely to be pleased with whatever we do choose, because our satisfaction with what we have shrinks as the number of things we don’t have — or could have — grows.

...The options at mealtime are a microcosm of the lifestyle options available to the ordinary, free Western citizen. We have never been freer to live how we want to live, which is wonderful and empowering but simultaneously taxing and intimidating. I want to take advantage of the freedoms provided by the incredible time we live in without getting paralyzed by too many options and endless unmade decisions.

Speaking of food, it was in his kitchen that Mark Sandman devised the minimalist aesthetic he later made famous in Morphine's music:

These were simple, common-sense ideas. And Mark liked simple. He once told me that if people really wanted to know about his musical aesthetic, they'd be better off asking him about his cooking techniques. "I've applied a lot of that to my music. For example, for years I made myself a red sauce for pasta with oregano, some thyme, some basil, black pepper, salt, some of this, some of that. I thought that's how you were supposed to make it. Then one day I didn't put anything in. I just forgot. And it was the best sauce I ever made. That moment right there taught me a lot."

I agree philosophically, but that's probably because I'm congenitally disposed toward simplicity anyway. My mental switchboard gets overloaded too quickly for me to indulge in endless customization.

Thursday, June 06, 2013

Do or Do Not. There is No "Should"

Bunny Lebowski: Uli doesn't care about anything. He's a nihilist.
The Dude: Ah, that must be exhausting.

Simon Critchley:

Sometimes I think John Gray is the great Schopenhauerian European Buddhist of our age. What he offers is a gloriously pessimistic cultural analysis, which rightly reduces to rubble the false idols of the cave of liberal humanism. Counter to the upbeat progressivist evangelical atheism of the last decade, Gray provides a powerful argument in favor of human wickedness that’s still consistent with Darwinian naturalism. It leads to passive nihilism: an extremely tempting worldview, even if I think the temptation must ultimately be refused.

The passive nihilist looks at the world with a highly cultivated detachment and finds it meaningless. Rather than trying to act in the world, which is pointless, the passive nihilist withdraws to a safe contemplative distance and cultivates his acute aesthetic sensibility by pursuing the pleasures of poetry, peregrine-watching, or perhaps botany, as was the case with the aged Rousseau (“Botany is the ideal study for the idle, unoccupied solitary,” Jean-Jacques said). Lest it be forgotten, John Stuart Mill also ended up a botanist.

In a world that is rushing to destroy itself through capitalist exploitation or military crusades — two arms of the same Homo rapiens — the passive nihilist resigns himself to a small island where the mystery of existence can be seen for what it is without distilling it into a meaning. The passive nihilist learns to see, to strip away the deadening horror of habitual, human life and inhale the void that lies behind our words.

No, no, no. Acting in the world is not "pointless". You can't help but act. Just because an action doesn't have inherent meaning, or ultimate meaning, doesn't mean it has absolutely no meaning. As I keep saying, nihilism is the flip side of universalism, an inverted attempt to keep believing in one rule that is true at all times for all people everywhere.

Put it this way: the question is not "Should I act?" or "Should I not act?" The question is, what do you mean by "should"?

Wednesday, June 05, 2013

Far From the Madding Crowd's Ignoble Strife

Astra Taylor:

Zuckerman, however, is not a knee-jerk naysayer about all things digital. The director of the MIT Center for Civic Media and cofounder of the international bloggers’ website Global Voices, he is extremely enthusiastic about the potential of using technology to connect people across cultures. He wants the Internet to be an empathy machine. The difference between him and the full-throated apostles of cyber-utopianism is that he does not believe that the online world is foreordained to fulfill this purpose, nor does he naively assume that the outcomes of cross-cultural connections will always be desirable.

Gregory Currie:

Everything depends in the end on whether we can find direct, causal evidence: we need to show that exposure to literature itself makes some sort of positive difference to the people we end up being. That will take a lot of careful and insightful psychological research (try designing an experiment to test the effects of reading “War and Peace,” for example). Meanwhile, most of us will probably soldier on with a positive view of the improving effects of literature, supported by nothing more than an airy bed of sentiment.

The archetypal image of the sage is one of serenity and an almost-otherworldly lack of concern for the mundane obsessions of everyday life. In other words, the accumulation of knowledge and experience, rather than simply being a super-sum of positive integers, tends to produce a state of being that would seem very much at odds with the sort of simplistic, progressive partisan vision of "better" or "nicer" citizens. Knowledge is a double-edged blade that can be used for the same values and desires people have always had; increasingly complex experience isn't likely to express itself in platitudes.

Saturday, June 01, 2013

I Tried to Think of Something Deep to Say, But My Well is Dippin' Dry Today

Pulling extra shifts is nice for the bigger paychecks, but I swear, I feel mentally flabby when I have to go a few days without having time to read and write. As Mark Sandman once wailed, don't they know that I've got other plans? Well, truth be told, I have been able to spend a few hours here and there on the web, but I've come away with nothing to show for my efforts but familiarity with the latest fútbol transfer market rumors, the latest evidence of irredeemable shitheelery from certain FTB bloggers, and the usual digital detritus, none of which is conducive to my practice.

You lovely people don't come here for that sort of frippery, do you? Of course not; you come here to delight in my prose stylings, and who could blame you for refusing to settle for less. But while I recover from this busy week and prepare for the upcoming one, I will offer, in lieu of my own writing this fine day, some writing about writing which I've read and appreciated: Manjula Martin considering the ways in which a dreaded "day job" keeps writing from disappearing up its own Platonic ideal, and Freddie Dee Bee taking a hammer and chisel to your whole identity as a writer to see if anything is left once he's done.