Sunday, December 30, 2012

Waldeinsamkeit

Florence Williams:

The kale analogy is pretty apt, because it turns out that even when we don’t enjoy spending time in nature, like during lousy winter conditions, we benefit from it just the same. At least that’s what Toronto’s Berman found when research subjects took walks in an arboretum on a blustery winter day. The walkers didn’t really enjoy themselves, but they still performed much better on tests measuring short-term memory and attention.

Japanese researchers understand our draw to nature, but American researchers understand our pull away from it—our distractions, inertia, and addictions. They want to help motivate us, to make our doses of nature so palatable and efficient that we hardly notice them. This is the next frontier in forest-therapy science, all aided by brain imaging.

Berman, for example, wants to figure out exactly which features (ponds, trees, biodiversity) yield the biggest bang in the brain. The idea is that once researchers know more about what makes our brains happy, that information can be fed into public-policy decisions, urban planning, and architectural design. The research has profound implications for schools, hospitals, prisons, and public housing. Imagine bigger windows, more trees in cities, and mandatory lie-on-the-grass breaks.

This approach, of course, is classically Western. Manipulate the environment; feel nature without even trying. As for me, I’m going to be looking for a more East-meets-West approach. I’ll try harder to quit checking my text messages and instead watch for rock bass jumping in the C&O Canal. Scratch and sniff some pine cones. Run my hands through the moss. Maybe even drink a little bark tea.

My memories of winter break from my school days largely revolve around wandering through the woods for miles. Some of the best hours of my life have been passed among coniferous companions. Some of the most useful thinking I've done has been while wedged in a seat among the branches. So I have no trouble accepting that there's some kind of primal attraction there. But as with meditation, I hate to see the pleasure of the experience being jostled aside to make room for considerations like improved efficiency and increased white blood cell count. Not every experience needs to be justified in terms of making you a more productive worker-drone with a greatly-extended warranty. True mental health, to me, needs to make room for precisely such "useless" activity. I don't wander in the woods or listen to music to become more accomplished in my career; rather, I perform my job in order to fulfill my societal obligations and thus be granted the leisure to do what really matters. As far as I'm concerned, every event in my life exists to make my wanderings among the trees more satisfying. That's the purpose of it all. That's why I was "put" here. To be all the woods-moseyer I can be.

Saturday, December 29, 2012

Seems Like There's No Time At All Between Lotteries Any More

Peez '06:

The comment section at Pharyngula is becoming a bit too wild west lately. I am all for vigorous, unhindered language and the expression of strong opinions, and I think dumb ideas need to be dealt with harshly, but we also need to allow opportunities for those ideas to be fully expressed. Too often, the conversations are beginning to go like this:

Stranger: I think…
Old hand: [Pulls out six-gun, shoots stranger down] I do believe I didn’t like your accent, stranger, and you were a bit cross-eyed.

I’m not at all keen on this. It makes the comments a very hostile place to new people (I like seeing new people here, don’t you?) and if it keeps up all we’re going to have left are the twitchiest, most psychopathic contributors. To encourage a little more restraint, I’m going to ask everyone to voluntarily impose a 3 comment rule on themselves. What that means is that if someone comes along and says something, no matter how outrageous, engage them in polite conversation first, give them a chance to clarify and expand on the idea, and then if it’s still utterly insane, you can cut loose.

Peez '12:

I am quite fond of most of the commentariat here, even when they’re turning their teeth on me — it is exactly what I want, a fierce legion of harsh, sarcastic, opinionated, ferocious critics who can unreservedly shred fools and assholes and who are unrestrained in their expression. I’m not going to back away from my comments section at all; you are the people I want here, and I affectionately regard you all as my local meatgrinder.

Please don’t change. And when necessary, unleash hell.

Let's be clear — there's nothing hypocritical about simply changing one's mind, if that's all this were. It's just funny to see another reminder that wisdom and morality are not the results of linear progression, whether on the macro- or micro-level. Individuals and cultures both have to keep relearning the same lessons over and over again. He's aged six years, but somehow managed along the way to forget what was previously clear to him. No doubt he's convinced himself that the uniquely monstrous nature of his enemies and the absolute urgency of his moral mission justify his volte-face. And no doubt he's confident that the power of his "rationality" will allow him to surefootedly navigate the well-trod path where other ideologues have fallen. His comment section combines the worst elements of both in-group cliquish posturing and "read the fucking manual" hostility, but his newfound missionary zeal means that he now sees the "twitchiest, most psychopathic contributors" as crucially important shock troops.

I repeat what I said just a few weeks ago about basic principles taught to anyone who ever took a Philosophy 101 course: Never allow your ego to land you in a position to be hostile to correction or questioning. The easiest way for that to happen is when you allow your self-image to get intertwined with your ideological stances — "I'm right about everything of importance, and I sure do look cool being that way." When you turn differences of opinion into zero-sum battles of the utmost moral significance, you make it impossible for anyone to back down gracefully. And when you inevitably put a foot wrong, the panic of perceiving your entire identity under attack makes you look even more foolish when you employ fallacy after fallacy to defend yourself at any cost. No intellectually honest person should ever want to encourage such group dynamics.

Saturday Shuffle

  1. Brant Bjork & the Operators -- Cheap Wine
  2. The Throbs -- Only Way Out
  3. Zero 7 -- Light Blue Movers
  4. Wolfmother -- 10,000 Feet
  5. Built to Spill -- Else
  6. Fever Ray -- Dry and Dusty
  7. Son Volt -- Slow Hearse
  8. Chad VanGaalen -- Wind Driving Dogs
  9. Skinny Puppy -- Magnifishit
  10. Pinback -- Sediment
  11. Iron & Wine -- Promise What You Will
  12. Godflesh -- Antihuman
  13. Rasputina -- Thimble Island
  14. Blur -- Jets
  15. Prong -- Prove You Wrong (Fuzzbuster Mix)
  16. Loretta Lynn & Conway Twitty -- You're the Reason Our Kids Are Ugly
  17. Populous -- Hip-Hop Cocotte
  18. Dead Can Dance -- Return of the She-King
  19. Rhea's Obsession -- Between Earth and Sky
  20. Twinemen -- Spinner

Thursday, December 27, 2012

Spear A Jewel

Phil Oliver:

"Spiritual"

I'm still not sure why some atheists have such revulsion for that word. The root just means "breath," as I live and breathe.

Philip Sheldrake:

However, in broad terms “spirituality” stands for lifestyles and practices that embody a vision of how the human spirit can achieve its full potential. In other words, spirituality embraces an aspirational approach to the meaning and conduct of life – we are driven by goals beyond purely material success or physical satisfaction. Nowadays, spirituality is not the preserve of spiritual elites, for example in monasteries, but is presumed to be native to everyone. It is individually-tailored, democratic and eclectic, and offers an alternative source of inner-directed, personal authority in response to a decline of trust in conventional social or religious leaderships.

Matthew Hedstrom:

Spirituality can mean many things, of course, and the language of spirituality is used by traditional religious adherents as well as the religiously unaffiliated. But only the “nones” have made it into a cliché: “spiritual but not religious.”

The history of American spirituality reveals that our commonplace understanding of spirituality—as the individual, experiential dimension of human encounter with the sacred—arose from the clash of American Protestantism with the forces of modern life in the nineteenth century. While religious conservatives fought to stem the tide, giving rise to fundamentalism, religious liberals adapted their faith to modernity, often by discarding orthodoxies in favor of Darwinism, psychology, and comparative religions.

The majority of today’s religious “nones”—those who claim no religion but still embrace spirituality—are engaged in the same task of renovating their faith for a new historical moment. And typically, they draw from this same liberal religious toolkit. Today’s unaffiliated, like the liberals of previous generations, typically shun dogma and creed in favor of a faith that is practical, psychologically attuned, ecumenical—even cosmopolitan—and ethically oriented.

This liberal spirituality, as it has evolved over time, has been deeply entwined with media-oriented consumerism. Of course Americans of all religious varieties have been deeply influenced by consumerism, but media and markets have particularly shaped the religious lives of those without formal institutional or community ties. The religiously unaffiliated might not attend services, but they “do” their religion in many other ways: they watch religion on TV and listen to it on the radio; find inspiration on the web; attend retreats, seminars, workshops, and classes; buy candles and statues, bumper stickers and yoga pants; take spiritually motivated trips; and, perhaps most significantly, buy and read books.

So, perhaps we can say that "spirituality" is the perennial religious impulse, formerly an organic outgrowth of traditional community and kin, now refracted through the prism of individualistic, market-dominated, globalized consumerism. I'm not saying that like it's good or bad; it is what it is. The times, they done a-changed. But whether you buy your beliefs via an independent bookstore or inherit them with your family name, complacency is always the danger to guard against, the cataract growing over your third eye, which is why, to answer Oliver's excerpt with Hedstrom's, I have such revulsion for the term: because being "spiritual" has itself become a thoughtless cliché that means everything and nothing simultaneously.

My Friend Says We're Like The Dinosaurs

Paul Kingsnorth:

Is it possible to read the words of someone like Theodore Kaczynski and be convinced by the case he makes, even as you reject what he did with the knowledge? Is it possible to look at human cultural evolution as a series of progress traps, the latest of which you are caught in like a fly on a sundew, with no means of escape? Is it possible to observe the unfolding human attack on nature with horror, be determined to do whatever you can to stop it, and at the same time know that much of it cannot be stopped, whatever you do? Is it possible to see the future as dark and darkening further; to reject false hope and desperate pseudo-optimism without collapsing into despair? It’s going to have to be, because it’s where I am right now. But where do I go next? What do I do? Between Kaczynski and Kareiva, what can I find to alight on that will still hold my weight?

...If you don’t like any of this, but you know you can’t stop it, where does it leave you? The answer is that it leaves you with an obligation to be honest about where you are in history’s great cycle, and what you have the power to do and what you don’t. If you think you can magic us out of the progress trap with new ideas or new technologies, you are wasting your time. If you think that the usual “campaigning” behavior is going to work today where it didn’t work yesterday, you will be wasting your time. If you think the machine can be reformed, tamed, or defanged, you will be wasting your time. If you draw up a great big plan for a better world based on science and rational argument, you will be wasting your time. If you try to live in the past, you will be wasting your time. If you romanticize hunting and gathering or send bombs to computer store owners, you will be wasting your time.

And so I ask myself: what, at this moment in history, would not be a waste of my time? And I arrive at five tentative answers:

You can check out his answers if you wish; I merely find it interesting that he tries to inhabit such a timeless "big picture" perspective while still using the temporal language of abstractions. He speaks of time like money, another famous symbolic abstraction, as if it's something to be invested wisely or "wasted", which makes me wonder if he hasn't quite fully accepted the logical conclusion of his convictions. Either cling to the false certainty of nihilism (another abstraction) and declare life itself to be a waste of time, or embody your principles right here and now and just live your life, come what may. Stop thinking of life as a linear progression toward a goal and just live.

Depraved New World

Ron Rosenbaum:

At last we come to politics, where I believe Lanier has been most farsighted—and which may be the deep source of his turning into a digital Le Carré figure. As far back as the turn of the century, he singled out one standout aspect of the new web culture—the acceptance, the welcoming of anonymous commenters on websites—as a danger to political discourse and the polity itself. At the time, this objection seemed a bit extreme. But he saw anonymity as a poison seed. The way it didn’t hide, but, in fact, brandished the ugliness of human nature beneath the anonymous screen-name masks. An enabling and foreshadowing of mob rule, not a growth of democracy, but an accretion of tribalism.

It’s taken a while for this prophecy to come true, a while for this mode of communication to replace and degrade political conversation, to drive out any ambiguity. Or departure from the binary. But it slowly is turning us into a nation of hate-filled trolls.

...And here’s where Lanier says something remarkable and ominous about the potential dangers of anonymity.

“This is the thing that continues to scare me. You see in history the capacity of people to congeal—like social lasers of cruelty. That capacity is constant.

“Social lasers of cruelty?” I repeat.

“I just made that up,” Lanier says. “Where everybody coheres into this cruelty beam....Look what we’re setting up here in the world today. We have economic fear combined with everybody joined together on these instant twitchy social networks which are designed to create mass action. What does it sound like to you? It sounds to me like the prequel to potential social catastrophe. I’d rather take the risk of being wrong than not be talking about that.”

It seems like two different things are being conflated here. I mean, I agree with the general sentiment in that last paragraph, as I've said many times. But in each one of those infuriating instances, it hasn't been some anonymous sociopath instigating flash mob rule; it's been a famous film director, or bloggers who post under their own names to an enormous audience. Even if most hateful trolls are anonymous, it doesn't follow that most anonymous commenters are hateful trolls. The grease on the skids to this particular hell is the fact that social media is designed to remove the onerous burdens of reflection and effort, the very things that might prevent people, during a rush of blood to the head, from retweeting vague rumors or attempting to casually wreak havoc with an opponent's job or personal life. People are the same stupid, impulsive primates they've always been; the problem is that so many of them now have the personal tools to drastically magnify, amplify and accelerate whatever petty thought or emotion flits through their mind. Sign your name to your opinion or don't, but much more importantly, stop and take some time to think before you start a snowball rolling.

Sure, you can force people to provide their full name, address, place of employment, etc. if they want to express an opinion online, but all you're going to do is create an anodyne environment where no one dares express anything remotely controversial (i.e., actually interesting) for fear of attracting the crusading attention of the Internet's Erinyes. Personally, I'm more scared of those who feel empowered to act by virtue of representing conventional, moralizing wisdom than I am of those who express subversive opinions under a cloak of anonymity.

Monday, December 24, 2012

A Sting In The Tattletale

Robin Varghese:

Erik Loomis blogs over at one of the smarter blogs around, Lawyers, Guns and Money. His position at the University of Rhode Island has come under attack, and there are a couple of statements in support. If you support him after reading the details, you may want to sign one of the statements in support. 

Oh, no, not at all. I've thrown in the towel on this issue. Now, I'm all for letting the logic of Mutually Assured Disingenuous Boycott Destruction take its inevitable course. By all means, keep on feigning outrage at impolitic comments by your political opponents if you think it might cost them their job and thus win you an irrelevant scalp.

Sunday, December 23, 2012

Santutthi Paramam Dhanam

Wayne Curtis:

What do we lose by walking less, and breaking up our walks into Halloween-candy sized missions? We lose that opportunity to tightly stitch together our world. A long walk — it takes about three hours to walk 10 miles, and without breaking a sweat — gives us time with our thoughts, and establishes the right speed to appreciate the complexity of the world around us. It gives us time to plait the warp of random observations and the woof of random thought. We create a narrative and a place. Americans drive an average of 13,400 miles each year, or about 36 miles a day. The one time people spend long periods alone with their thoughts tends to be in a car — on long drives or stuck in traffic. But it’s not the same. In a car, we’re cocooned, isolated from a complex environment that can engage us.

And as traffic historian Tom Vanderbilt has noted, our highway system today essentially mimics “a toddler’s view of the world, a landscape of outsized, brightly colored objects and flashing lights” as we speed along “smooth, wide roads marked by enormous signs.” Heading down an on-ramp to merge onto a highway, it’s as if we’re entering a day care center for adults. We push on pedals and turn a big wheel. We communicate with others by blaring a horn that plays a single note, or by employing a hand signal that involves a single finger.

“The pedestrian mind doesn’t get very far in a day, but it has the opportunity to see where it is going,” noted a writer in the Saturday Review of Books. That was written in 1928, and even then — when the gulf between walking and driving was scarcely a gully — he could see the outlines of two differing ways of thinking: The walking mind and the driving mind. (“Vehicular minds move under some other power than themselves and hence grow flabby and become crowd minds, standardized and imitative.”)

Like Robert Shaw in his Jaws death scene, the entire point of this essay tumbled, slipped and slid down the logical incline, dragging the reader toward the mindlessly gnashing teeth of the inevitable Nicholas Carr reference, while I watched, helpless with horror. And sure enough, crunch crunch munch gulp, the modern world is makin' us stoopid.

You see this a lot. I suppose "Hey, I enjoy casually strolling around and thinking 'bout stuff" doesn't have enough of a hook to hang an essay upon. Too straightforward. Gotta have some kind of peripheral angle, something topical, something to flatter, er, engage the reader. Sighing about how shallow and frivolous most people are (not the author, and certainly not the discerning readers); now, that's a perennial theme. Wistful reflections on a lost age when things were otherwise, ditto. "Strolling around thinking about stuff makes you a better, deeper, more authentic person than all the stupid schlubs in their stupid cars," ah, that's the stuff we're looking for.

There's a receptive, contemplative state of mind that I feel is most conducive to genuine intelligence. Perhaps some people are more congenitally inclined to it, but you know what another necessary ingredient is? Time. Leisure. A principled refusal to Taylorize one's entire life for the sake of accomplishment and efficiency. You don't have to live in a walkable community. You don't have to handwrite all your letters with a quill on a scroll. You don't have to spend an hour every evening cooking your food with ingredients from the farmer's market, or make your own soap from scratch, or sit on an official meditation cushion. None of those affectations will make a profound difference in your life if you don't have that.

Saturday, December 22, 2012

And I Am An Anarchrist

Don't know what I want, but I know how to get it:

But what is the alternative? Mainstream Christianity has become complicit in the very injustices Jesus confronted. And conventional anarchism leaves me cold – lacking in beauty and mystery and hope that we can ever become more than we are.

Christo-anarchism is the way of repentance, of rejecting, like Jesus, the three temptations of Satan, saying no to our religious, economic and political dominance and working toward a world where nobody has power over another.

If you choose Christo-anarchism, be prepared for a hard road. Anarchists will reject your spirituality and Christians will reject your politics.

Where nobody has power over another? What does that even mean, practically speaking? Well, it doesn't mean anything in that sense; it's just an articulation of inchoate emotion. Lots of people like to make reference to the opium of the people, but they don't often include the lines immediately preceding that famous pull-quote, which change the tone from sneering contempt to sympathy: "Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions." That's what's being expressed here.

It's easy and understandable to enact an ad hoc response to a particular injustice, but it doesn't necessarily follow that "injustice" of all types can be reduced to an essential ingredient, summarized and classified, after which the entire economic/political/social system can be rationally revamped from step 1 to prevent it from ever forming again. "It's all about eliminating power!" "It's all about the patriarchy!" "It's all about controlling the means of production!" I know that sounds banal, but it's really amazing how many ideological edifices rest on a non sequitur as their cornerstone.

Turning Every Good Thing To Rust; I Guess We'll Just Have To Adjust

Alone:

Imagine a large corporate machine mobilized to get you to buy something you don't need at a tremendously inflated cost, complete with advertising, marketing, and branding that says you're not hip if you don't have one, but when you get one you discover it's of poor quality and obsolete in ten months. That's a BA.

When we see a welfare mom we assume she can't find work, but when we see a hipster we become infuriated because we assume he doesn't want to work but could easily do so-- on account of the fact that he can speak well-- that he went to college.  But now suddenly we're all shocked: to the economy, the English grad is just as superfluous as the disenfranchised welfare mom in the hood-- the college education is just as irrelevant as the skin color.  Not irrelevant for now, not irrelevant "until the economy improves"-- irrelevant forever. The economy doesn't care about intelligence, at all, it doesn't care what you know, merely what you can produce for it.  The only thing the English grad is "qualified" for in this economy is the very things s/he is already doing: coffeehouse agitator, Trader Joe's associate, Apple customer.................................................. and spouse of a capitalist.

Of course I'm not happy about this, I like smart people, but that's the new reality.  There was a time where women went to college to get an MRS degree, and I am telling you that that time is today, there is nothing else of value in there.  Sure, some college women go on to become doctors and CEOs, and some go on to become child pornographers and Salon writers, none of those things have anything to do with what happened in college.  If you are going to college to get an education and not to meet guys, you are insane, literally insane, delusional, in reality one is never going to happen and the other is going to happen anyway, and you could have gotten both for free at a bookstore.  Worked for me.

..."I have a degree." No one assumes you're smart because of it, so what was the point?  You were tricked, your parents were tricked, your peers were tricked, your employers were not tricked at all.  "There's more to a college education than employability."  No there isn't.  I am not anti-liberal arts, I am all in on a classical education, I just don't think there's any possibility at all, zero, none, that you will get it at college, and anyway every single college course from MIT and Yale are on Youtube.  Is that any worse than paying $15k to cut the equivalent class at State?

I'm really loving this blog; thanks to Shanna for sharing the link.

Friday, December 21, 2012

For Me, They Were Steps

For me, they were steps, I have climbed up upon them – therefore I had to pass over them. But they thought I wanted to settle down on them.

— Nietzsche

Jessica Berry:

Pyrrhonian skeptics are those who adopt a certain practice with a nod to Pyrrho of Elis, a Greek figure of the mid-fourth to mid-third century BCE, who actually left no writings but with whom the practice is said to have originated. Our best knowledge of this variety of skepticism in fact comes from a second century CE physician named Sextus Empiricus. What makes the Pyrrhonists unique among other sorts of skeptics is that on all speculative, “philosophical” questions, they suspend judgment. This ephectic attitude is in fact the hallmark of Pyrrhonism. From their point of view, other skeptics turn out to be nothing more than negative dogmatists. And that’s a crucial observation, because it’s what gives the Pyrrhonist his claim to being a real skeptic – a word that originally means “inquirer.” If I’ve suspended judgment on an issue, it makes sense for me to continue investigating. If, on the other hand, I’ve made up my mind that, say, there is no God or that values don’t exist or that knowledge is impossible, then I’ve closed off that avenue of investigation and come to rest with a position I have to defend no less vigorously than my dogmatic opponents.

A Pyrrhonian skeptic, then, is essentially someone with a peculiar talent for countering any argument with an opposing argument. It’s crucial to see that the Pyrrhonist is not a stubborn sort of person, unwilling to be convinced; it just happens to be devilishly hard to convince him, such is his talent for opposing one argument to another. In the face of his keen awareness of arguments on both sides of every issue, he suspends judgment, and a state of well-being — psychological equanimity – is said to follow this suspension “like a shadow follows a body.” And this is the end at which Pyrrhonian skepticism aims: psychological well-being and health.

The aim of these thinkers (or, more precisely, these practitioners) was not to advance theories, but was instead to cure, where they could, what they called the “conceit and rashness” of dogmatic philosophers. I read Nietzsche’s perennial concern with health as expressing fundamentally the same aim.

3:AM: So is it through the lens of Pyrrho that we should understand Nietzsche’s attitude to knowledge and truth?

JB: I do. And I think the metaphor of a “lens” is particularly helpful here. I want to be clear about the nature of the relationship between the Pyrrhonists and Nietzsche, because I think that philosophers often aren’t perspicuous enough about the nature of historical “influence.” I’ve in fact deliberately avoided, or at least have highly qualified, the claim that Nietzsche is “influenced” by Pyrrho or by Sextus. You won’t find in Nietzsche’s published work any reference at all to Sextus Empiricus, and you’ll find only a couple of allusions to Pyrrho himself. And Nietzsche certainly doesn’t identify himself as a Pyrrhonist. My own view is that if we really take the time to familiarise ourselves with this variety of skepticism, if we come to appreciate its motivations and recognize the moves standardly made by Pyrrhonian skeptics, then we cannot fail to see these motivations and many of the same moves in Nietzsche’s writing, and we’ll come to see Nietzsche’s work in a new light, one in which it becomes less opaque, more coherent, and even more subtle and interesting. So the best description of my interpretation of Nietzsche would be that I read him on the model of Greek skepticism.

Very interesting interview, as is often the case on 3:AM. Her book looks interesting as well, but it's currently going for a steep fifty bucks on Amazon. Of course, I'm sure that wouldn't present any problems for one of my fabulously wealthy readers possessed by the generous spirit of the season, nudge nudge hint hint.

...adding, Dave Maier has more.

Thursday, December 20, 2012

Lowbrow Nobility

I'm an academic, I defend high culture, but I think we must also propose other, different paths. Given the disorientation of the contemporary world, what we have to do is restore people's dignity and faith in action. Not just faith in the knowledge and the enjoyment of great works. High culture helps create an individual, but so does the fact that individuals are actors who construct their own world. Teaching must not stand still or fly in the face of television and the like. Teaching has to provide the tools for individuals to become creators, not just of art or literature but of everything. High culture, humanism, must work alongside other ways, but if we take it to be the central one we will have problems. In the society of entertainment, it's more difficult for the masses to participate in this cultural ferment. For people without the necessary education, reading Ulysses today is difficult – though not impossible. We can live, and live well, in a dignified manner, without knowing the classics.

We are in agreement on our diagnosis of the society of the spectacle originating with the collapse of aesthetic hierarchies. But here we have to step back a little and observe that the society of the spectacle is not the only culprit. It began with the highest culture: the avant-garde. It's there that the real attack occurs against academic art, the "beautiful". Duchamp was not part of the society of the spectacle, yet he was the one who opened the door to the idea that we could put anything in an exhibition and that it would be called "art". The seeds of the collapse of aesthetics and high culture are within high culture itself.

In the end, the society of the spectacle hasn't changed aesthetic hierarchies much. What has it done? Modern twentieth-century society has created something unheard of in history: an "art of the masses". Take cinema for example. A film is a work aimed at everyone, regardless of their cultural baggage; you don't have to have read the classics to appreciate it. The cinema hasn't changed aesthetics; it's created something different. It has created an art of entertainment that can give us mediocre pieces of work but magnificent things too. Increasingly, average movies, works that are neither great art nor bad, produce emotions and make people think.

The entire debate between Gilles Lipovetsky and Mario Vargas Llosa is interesting, but this part in particular makes me think of something Brian Leiter was struck by in reading Nietzsche, the suggestion that true "greatness" is impossible in democratic, egalitarian circumstances where altruistic concerns tend to trump the extremely self-centered drives and passions so often found in profoundly creative people.

Wednesday, December 19, 2012

The Migrant

Hobo Autumn hoists his bindle,
hitchhikes out to another year, a warmer clime,
hoping to catch up with Spring—
then Winter arrives, demanding entrance,
banging at the door with cold fists as if he lives here,
doffing his hat to show where he keeps long nights—
when he opens his suitcase in the dank hallway,
darkness spills onto the floor,
a few icy stars roll across the rug—
he hands out freezing rain as if it were candy,
and from his frozen pockets he draws forth
a penny-whistle for the children,
upon which he blows a chill wind.
We give the old miser the extra room,
the one with the leaky window
where the draft comes in,
counting the days until he moves on.

— Jack Peachum

Monday, December 17, 2012

Privilege Midden

Tom Midlane:

Privilege-checking plays into the dangerous postmodern fallacy that we can only understand things we have direct experience of. In place of concepts like empathy and imagination, which help us recognise our shared humanity, it atomises us into a series of ever-smaller taxonomical groups: working class transsexual, disabled black woman, heteronormative male.

...Privilege becomes an inescapable feedback loop: any attempt to critique privilege-checking is met with the retort: “You’re privileged enough to have the luxury not to think about privilege.” But that’s not it. I’ve always been aware that as a child of a white, middle-class family, I have life easier than some people – but that’s precisely what drives me on to seek social justice for those less fortunate than myself. Prejudice exists. We live in a radically unjust world. But turning our personal circumstances into some sort of pissing contest achieves precisely nothing.

In theory, it can be a useful analytic framework. Prejudice can be more subtle than outright discrimination; many core assumptions can be taken for granted until someone else calls them to your attention. But in everyday practice, as evidenced all over the blogosphere, it barely amounts to more than an acceptable version of ad hominem argumentation (possibly even the genetic fallacy). Accusing someone of unchecked privilege is a way to avoid answering any of their arguments on the merits. It's a way of keeping conversations on a short leash and choke chain, ready to be quickly yanked back into line. I have yet to see an accusation that allowed for any response other than a groveling apology — attempting to argue one's innocence is dismissed as "doubling down," the equivalent of Freudian "denial". All it does is further confirm one's guilt (especially in the zealous eyes of the newly-converted; I mean, all I could think of here was a Calvinist preacher delivering a pulpit-pounding, fire-and-brimstone sermon on original sin).

But again, it's not that the analysis is completely invalid, it's just that, ironically, privilege is often most clearly demonstrated by those so quick to appoint themselves the arbiters of correct thought and behavior.

...adding, Freddie continues insightfully addressing the topic:

...The essentialism rises from the absurdity of speaking about nonwhite people as some sort of unified bloc.

I brought up the fact that, if I'm going to abandon any particular perspective on race myself and merely adopt the positions of nonwhite people, I might choose nonwhite people whose views are deplorable. I brought up Allen West in our conversation. My point about Allen West is simple: when people say "you should give up your racial arguments and simply listen to what nonwhite people say," they are suggesting that all nonwhite people have the same views. Allen West is black, and he is an Islamophobe. So when he says vile things about nonwhite Muslims, am I obliged to keep quiet, because of his greater understanding of race and racism?

Q dug deeper: "I explicitly specified the kind of people that would be valuable to link and implicitly excluded people who've internalized white supremacy to anti-black, racist ends."

Which is to say (explicitly) that no nonwhite person could arrive at opinions on race that Q finds objectionable unless that person had internalized white supremacy. This is the height of liberal essentialism, the need to look on nonwhite people not as people, with individual agency and fully developed consciousness, but as symbols of purity, which dehumanizes and infantilizes them. I will admit to not always knowing exactly what is right or wrong when we talk about race. But I am damn sure that saying that nonwhite people can only disagree with me because they've internalized white supremacy is a terribly ugly idea.

Yes, precisely. And thus you encounter the spectacle of progressives like DougJ and TBogg, who will snark about those racist Tea Partiers all day long while brazenly calling a black conservative a "lawn jockey" and "Fox's house liberal" with no hint of shame. Boy, we've already decided what's best for you; now shut your mouth and get back where you belong. You owe us. Or that of PZ Myers, who uncharitably interprets remarks by a prominent feminist as being "sexist" before proceeding to instruct her on proper feminist concerns. (Performed by a member of the out-group, this would be dismissed as what the cool kids call "mansplaining".) It's taken for granted that the diversity of racial/gender viewpoints they claim to want to solicit will be in agreement with their own; the FTBers, I'm sure, have no interest in hearing about concerns from the communities of conservative Latino Catholics, homophobic black Baptists, or women opposed to abortion unless they deign to patronize them with explanations of how they are simply the victims of white supremacy-induced false consciousness.

A more cynical person than me might say that to certain white male social justice warriors, women and racial minorities are merely exotic vessels for holding and transporting beliefs and opinions which have already received the SJW seal of approval. A more knowledgeable person might even be able to reference which branch of critical theory would describe this itself as inherently racist/sexist.

Saturday, December 15, 2012

Educated Far Beyond Their Capacity To Undertake Analytical Thought

JKubie:

We live in a highly complex universe. Most of what we call our knowledge of the universe comes from culture; that is, it is passed from one individual to another, rather than having each individual learn on his/her own. We don’t feel that way at all. Each of us feels he/she has a deep understanding, and that we are clear-eyed observers. But we are kidding ourselves. Given any area of knowledge — literature, nuclear physics, climate change, psychology — the set of true experts is extremely small. Experts are the scientists who can perform and understand experiments whose outcome creates the cultural database of knowledge (or the database of a particular belief bubble).

I’ll finish with one pertinent (and extremely annoying) example: climate change. In the course of the last decade I’ve gotten into numerous arguments about climate change. I believe in anthropogenic climate change — the gradual warming of the earth due to human activity. What’s my evidence for it? Experts I trust told me so. That’s it. On two or three occasions I’ve learned the scientific basis for these opinions. But that’s irrelevant. I would have to study for perhaps a decade to have a personal opinion that mattered. No amount of fancy talk could push me one way or the other. In such a complex area I rely totally, totally, on the opinions of people I consider experts. It is possible that my judgement of who is an expert may change, but that is the only thing that could change my opinion.

We consider ourselves experts; we believe that we think for ourselves. But for 99% of our knowledge, we rely on the authority of others.

Heh. This has actually been on my mind for a couple months, ever since I read Razib Khan saying essentially the same thing:

(T)he smart know where to go to reinforce their biases. That is, they’re far better at motivated reasoning, and become progressively more polarized and ideological.

My point is that the reality is on many topics very few of us ‘reason for ourselves.’ Rather, we trust certain people who know better. On economics smart liberals trust Paul Krugman, and smart conservatives trust Greg Mankiw. Not only are these individuals gifted with a specialized knowledge of economics in relation to the typical smart person, but they’re much smarter than average. That’s one reason I’m usually not interested in talking politics in detail with people: why not just go to the source that they’re garbling?

Practically speaking, of course, very few of us are in any position to accurately assess those sources. I know I'm not. Like Socrates, all I know is how little I know, but unlike Socrates, I can't even console myself by smirking with disingenuous irony as I say that.

Isaiah Berlin distinguished between foxes and hedgehogs in one of his most famous essays, with foxes being those who know many things, while hedgehogs know one big thing. Me, I'm an intellectual magpie, collecting the jewels of other people's thoughts and bringing them back to my nest to set them in pleasing arrangements. At best, perhaps, a slightly eloquent parrot.

And I Swear That I Don't Have A Gun

Will Oremus:

Crowd-sourced reporting via social media can be invaluable, particularly in situations like the Arab Spring, the Haiti earthquake, or Hurricane Sandy, in which critical, actionable information about what’s going on is widely distributed among the populace. But as we also saw in the immediate aftermath of the Aurora shooting, we tend to expect too much of the Internet when it comes to quickly pinpointing the identity and motives of the perpetrators of crimes. In those cases, information is usually concentrated in the hands of a few, and most of them are already busy talking to authorities. Nice as it would be to have immediate answers, jumping to the wrong conclusion only compounds the damage. The truth may be out there, but it's not always just a click away.

There's this friend of mine. Years ago, her teenage son was accused by some friends of being the mastermind of a plot to carry out a Columbine-style attack on his school. A politically ambitious prosecutor got wind of this and took the opportunity to make a criminal case out of it. As it turned out, there was no plot, there were no weapons; there was nothing but some cryptic remarks on MySpace and some adolescent rumor-mongering. Nonetheless, his name and picture ended up on the front page of the newspaper under the shrieking headlines, despite his being underage. A year or so later, the court case had come to nothing, but the subdued retractions, of course, didn't rate the front page or even the slightly-larger font.

Even the relatively-glacial pace of "old media" didn't necessarily protect against that sort of potentially-disastrous misinformation being disseminated.

Most of the shiny-object fascination with social networking is a harmless, if intensely annoying, kind of stupidity. But these increasingly common gestures in the direction of flash mob justice make me want to take those whose myopic self-absorption is far out of proportion to their actual presence and grind their stupid fucking faces into their smartphones.

Caught Within Their Own Unraveling

Meng-hu:

For Marcus (Aurelius) lived a grand contradiction: an emperor reduced by wisdom to one skeptical of duty and purpose, to its inner vitality and beneficence. He was reduced to fideism (echoed centuries later by another fideist, Montaigne), wherein duty is followed not because it is good but because ethics consists in the necessity to carry out our duty, as if the carrying out alone was ethics, even when the duty is not intrinsically good and could no longer engender faith. Indeed, duty in this sense was Rome itself for Marcus, necessary to defend and maintain even when he no longer believed in its efficacy.

Duty and purpose are dubious worldly enterprises. Yet Stoicism is the philosophy resonating with the perception of failure, attracting intelligent Romans caught within their own unraveling, both of their person and their world.

This reminds me of something from John Ralston Saul:

But it is virtually impossible to maintain healthy scepticism when power is in your hands. To do so would require living in a state of constant personal conflict between belief in your public responsibilities and self-doubt over your ability to discharge them.

And something else from Nietzsche:

Dionysian man shares this affect with Hamlet: both have seen into the very essence of things, they have understood, and are repelled by, the thought of action, since no action of theirs can change anything of the eternal essence of things, and they consider it absurd, or even shameful, to be expected to be able to generate order in a world of chaos. Understanding destroys action, and action depends on a veil of illusion.

All The Phone-y People, Where Do They All Come From?

PewGlobal:

Social networking has spread around the world with remarkable speed. In countries such as Britain, the United States, Russia, the Czech Republic and Spain, about half of all adults now use Facebook and similar websites. These sites are also popular in many lower-income nations, where, once people have access to the internet, they tend to use it for social networking.

Meanwhile, cell phones have become nearly ubiquitous throughout much of the world, and people are using them in a variety of ways, including texting and taking pictures. Smart phones are also increasingly common – roughly half in Britain, the U.S., and Japan have one. Globally, most smart phone users say they visit social networking sites on their phone, while many get job, consumer, and political information.

Technologies like these are especially popular among the young and well educated. In almost every country polled, people under age 30 and those with a college education are more likely to engage in social networking and to use a smart phone.

Ryan Tate:

If you find yourself spending an inordinate amount of time on Facebook, Twitter, or Pinterest, thank your smartphone and tablet: Americans’ social media use is on track to spike 37 percent this year, driven by a near doubling of consumption on mobile apps, according to a new Nielsen study.

The biggest social networks are seeing mobile use explode while desktop computer use remains relatively flat, according to a year-end Nielsen survey of social media consumption. Mobile app usage spiked 88 percent on Facebook this year and 134 percent on Twitter. In contrast, desktop use shrank 4 percent on Facebook and grew just 13 percent on Twitter.

You know, in case you hadn't heard.

When I rise to power, those people will be sterilized.

— Sheldon Cooper

I'm sorry, that sounds drastic. Believe me, I'd never daydream about something as terrible as rising to power.

Seriously, though. Now, I just wonder: when is all this meta-fascination going to end? By that, I mean the fixation on the shiny objects themselves, the means by which we tell each other that Princess Adelaide has the whooping cough. When will the mere presence of a social network or a smartphone app no longer be enough of a hook to hang an entire non-story upon? By how many degrees can we amplify the basic changes initially wrought by the telegraph before the mere fact of doing so ceases to be noteworthy?

Most People Are Other People

Harvey Whitehouse:

Imagine you are a five-year-old being led into a small office. A woman with a warm smile shows you an assortment of strange objects. Some of them are shiny. You feel like playing with them. That’s OK, that’s allowed. Soon the friendly lady takes the objects away and says she wants to show you a video. On the screen is another woman. She has an identical set of objects lined up neatly in a row and she’s doing odd things with them — she lifts one and taps it on another, then puts it back and takes something else, twirling it in a peculiar fashion before replacing it. This goes on for some time. Then the strange objects are pushed back towards you and the lady says: ‘It’s your turn.’ What would you do?

If you were a five-year-old, you would imitate at least some of the actions you observed in the video. No instruction would be necessary. And yet, the behaviour doesn’t appear to achieve anything. The psychologist Cristine Legare and I have been working together for several years trying to understand why young test subjects bother to copy it. Our starting point is that they treat it as a convention of some kind. That is to say, they adopt what we call ‘the ritual stance’, imitating without questioning the purpose of the actions.

In our experiment, however, the behaviour of the woman in the video is ambiguous. Children can’t be sure if it is oriented to a goal or not. A surprisingly simple shift helps them to decide: we just alter the last move in the sequence. If the woman puts the last object into a box, it looks like the whole procedure was just a ‘funny’ way of putting an object away. We call this the ‘instrumental condition’. On the other hand, if the objects all end up back where they were originally placed, the whole action sequence appears not to have any tangible purpose. We call this the ‘ritual condition’. When the start and end states are identical, children are more confident that the demonstration on the video should be interpreted as a kind of ritual. And guess what? They copy it much more faithfully, and are less inclined to try out variations on their own initiative.

My dad told me that when he was a kid, he complained to my grandfather about having to attend what he saw as pointless religious services. The answer he got was to the effect of, "Look, we do this because the way of life that goes along with it is the best one people have yet come up with." This is just the way we practice the rules, basically. On the practical level of lived philosophy, the basic answers don't change that much. Even if Thales and Anaximander don't have much to teach us about science, Socrates and Chuang-tzu are still worth engaging with. Like John Gray said:

History is not an ascending spiral of human advance, or even an inch-by-inch crawl to a better world. It is an unending cycle in which changing knowledge interacts with unchanging human needs. Freedom is recurrently won and lost in an alternation that includes long periods of anarchy and tyranny, and there is no reason to suppose that this cycle will ever end. In fact, with human power increasing as a result of growing scientific knowledge, it can only become more violent.

The core of the idea of progress is that human life becomes better with the growth of knowledge. The error is not in thinking that human life can improve. Rather, it is in imagining that improvement can ever be cumulative. Unlike science, ethics and politics are not activities in which what is learnt in one generation can be passed on to an indefinite number of future generations. Like the arts, they are practical skills and can be easily lost.

My sorta-noospheric-pantheist friend might despair at hearing that, but I think it's true. We might be able to live comfortable, productive lives by simply parroting the rules that others handed down to us, but much of what we call ethical wisdom is the ability to fully realize those precepts within the unique circumstances of our own lives, to transform them into more than trite platitudes by virtue of encountering the perennial ethical conundrums that originally gave rise to them. Jiddu Krishnamurti said that truth is a pathless land. I think what he meant was that, having attained ethical wisdom, I can't simply instruct you to read these books, watch these films, contemplate these sayings, and expect you to take the exact same lessons from them that I did. Those footprints don't necessarily lead to the exact place where I'm currently standing. That sort of wisdom isn't the end result of a diligent accumulation of facts or mathematical accuracy. It's hidden in plain sight.

Thursday, December 13, 2012

From Cold Unblinking Eyes

winter is primping
in the wings

beneath her booted feet
a troop of dillydallying leaves
caught in the wind scutter
like deserters down city streets

some lie packed in ugly
pyramidic mounds that border
neighborhood sidewalks
dying soldiers waiting
for trucks to cart them away

winter is blowing
into her hands

from cold unblinking eyes
she handcombs wisps of
silver hair and
all around her we brace ourselves
in the tremble of season change

we bury ourselves in defensive dress
for the tactless wave of her wand
the coolness with which
like a fickle lover
she dismisses fall

— Salvatore Buttaci

Wednesday, December 12, 2012

Trash And Treasure

Grace O'Connell:

Reading is, at its core, a leisure activity. We gain knowledge, joy, catharsis and empathy from reading. And yet when reading, a solitary pursuit, becomes interactive—when we talk about it with one another—it is suddenly fraught with anxiety and guilt. Everyone is in the same boat, minus the perhaps imaginary spectre of a few maddeningly well-read people. But if we’re all together in our anxiety, why do we torment ourselves and one another? Guilt is rarely a productive emotion; it is more immobilizing than motivating. 

What do you mean "we," paleface? Maybe you should just go ahead and straight-up read Fifty Shades of Grey if you want to, rather than outsource your BDSM needs to the collective opinion of your book club. I mean, one of us must be doing this reading thing all wrong, because I'm not well-read by any cultural sophisticate's standard, yet I get nothing but pure, unmitigated joy from reading and talking about what I've read. Hearing others talk about books I haven't read yet doesn't give me an inferiority complex, it only rouses my ravenous appetite:

“Oh, my greed! There is no selflessness in my soul but only an all-coveting self that would like to appropriate many individuals as so many additional pairs of eyes and hands – a self that would like to bring back the whole past, too, and that will not lose anything that it could possibly possess. Oh, my greed is a flame! Oh, that I might be reborn in a hundred beings!” – Whoever does not know this sigh from firsthand experience does not know the passion of the search for knowledge.

Here's a slightly amusing anecdote about the intersection of reading, character and status anxiety, though. I used to read a lot of comics when I was a kid. Not comic books, which I was never really into, just book-length compilations of newspaper comic strips, which my mom would get me when we went to the store. Beetle Bailey, Hagar the Horrible, Wizard of Id, stuff like that. My dad hated seeing me with those. Thought they were going to rot my brain. If he happened to be in a crabby mood, and if I happened to leave too many of them lying around throughout the house, he would occasionally gather up any he saw and throw them out in the trash. I'd go fish them out later when his attention was elsewhere, grumbling over the injustice of it all as I hid them more carefully in my room.

He also used to give me extra homework, sometimes during the summer. I guess he was trying to instill a strong work ethic in me as well as stave off the brain-rotting I was inflicting on myself. A few times, he assigned me the task of finding fifty new vocabulary words from my reading to look up and define.

You can guess where this is going, right? Yes, I took great subversive delight in finding the bulk of those words from those very comics which I'd rescued from the garbage. To this day, I remember that I learned the word "vindictive" from a Broom Hilda strip.

Heroes Just For One Day

Andrew Tripp:

It is difficult to find out that someone you have great respect for is not perfect. Many of us found this out when Richard Dawkins made his "Dear Muslima" comments, and indeed more recently when he said in a speech that teaching a child about hellfire is worse than a child being sexually abused. Fewer, unfortunately, have found this out about Dan Savage, who, while famous for the "It Gets Better" campaign and catty comments about relationships, spends a lot of time saying appalling things about trans* people, black people, and anyone who doesn't really fit his normative worldview. When this happens, we find our confidence shattered; we find particularly, as professed skeptics, that what we believed was a conclusion based on evidence has been complicated. This is a problem, and one that is not easy to fix.

Allow me to complicate your lives further, dear atheists, with Ayaan Hirsi Ali...

We want to think that Hirsi Ali is still a role model, someone to follow in our atheistic paths, a story to hold up as a warning against religious hatred and oppression. And indeed, she has faced great hardship in her life as a result of old patriarchal societies in which she had the misfortune to be raised. But this is not enough to earn our respect, or to hold someone up as a paragon of virtue. Any cursory student of history knows that many a freedom fighter has become a dictator upon gaining power.

Ayaan Hirsi Ali is not Stalin, but she is a person whose interests are not our own. Even for those of you reading this who think atheists should not be concerned with issues of social justice, I think that you still know that this woman is not your friend.

I admire a variety of people for a variety of reasons. I honestly can't remember the last time I felt crestfallen to realize that one of them wasn't "perfect". I can't remember ever being concerned over whether I could think of them as my "friends". And I really can't imagine what it must be like to agonize over the "complication" of having to consider people as inconsistent, complex, fallible individuals.

I swear, some of these people seem like emotionally stunted children to me. Maybe they need to go back and nurse at the teat of religious belief for a while until they fully wean themselves off of the need for easy answers and two-dimensional role models.

Shouting "Bigotry!" In A Crowded Internet

Michael Shermer:

Perhaps unintentionally, Benson makes a strong case that something other than misogyny may be at work here, when she asks rhetorically if I would make the same argument about race. I would, yes, because I do not believe that the fact that the secular community does not contain the precise percentage of blacks, Latinos, Asians, and Native Americans as in the general population, means that all of us in the secular community are racists, explicitly or implicitly. A variance from perfect demographic symmetry does not necessarily correspond to racist attitudes. It just means that the world is not perfectly divided up according to population demographics, and people have different interests and causes. There is nothing inherently bigoted, racist, or misogynistic in the fact that the demographics of the secular community do not reflect those of the general population (in gender, in age and socio-economic class, or in height, weight, or any number of other variables for that matter), so short of some other evidence of bigotry, racism, and misogyny, there is no need to go in search of demons to exorcise.

Oh, I feel his pain. And let me tell you, that last link to the Austinist article by Terry Sawyer, man; if I ever write anything that good, that pitch-perfect, I might just have to retire from blogging, because it would be only downhill from that point on.

Monday, December 10, 2012

But Some Animals Are More Equal Than Others

Jessica Pierce:

What's surprising, though, is that close to 80,000 people have "liked" Ms. Apple's Facebook posting of her letter, and the vast majority of fans have supported her decision. Such expressions of support are unusual. People with strong bonds to animals often feel that the larger society in which they live assigns relatively little moral value to pets and other animals. The death of a pet is often dismissed as unimportant. And unlike Ms. Apple, most of us generally are not able to miss work because our animal is ill or dying.

The singer's decision and the reaction to it represent an emerging cultural shift, one noted by the sociologist Hal Herzog in his book "Some We Love, Some We Hate, and Some We Eat." More Americans now see themselves as living in a multispecies family.

It may not be reflected in animals' legal status, but I'd argue such sentiment isn't unusual per se. It's not difficult to find people who are similarly devoted to their animals. However, it's also easy to find things like this tweet I saw the other day:

[Embedded tweet no longer available.]

Leaving aside whether there's actually anything specifically Myrrhkin about that, whatever moral force this quip may have is predicated on the supposedly outrageous, hypocritical irony of animals receiving empathy and living in comfort while humans do without, as if it's a moral failing for animals to get anything more than the crumbs and scraps after we seven billion naked apes have sated ourselves. It's a common trope; no doubt you've seen some lazy article or another bemoaning how "People spent X-million on veterinary care last year while Y-million people live below the poverty line!" It's also an arbitrary distinction. There is no inherent reason why the boundary marking off Homo sapiens should be privileged above any other moral consideration. Truthfully, I feel that animals better represent the quality we call "innocence" than humans do. I favor my dogs as loved ones, as individuals, over a generic abstraction called "humanity".

Sunday, December 09, 2012

Cognitive Dissidents

John Gray:

That is not to say Leiter’s argument is watertight. The claim that religion deserves no special exemptions from generally applicable rules may be right, but not because there is anything particularly irrational or otherwise lacking in religious belief. After all, what counts as a religious belief? Aware of the difficulty of defining religion, Leiter devotes a section of the book to the question. His discussion is more sophisticated than many on the subject, but he still draws a categorical distinction between religious and other beliefs that is difficult, if not impossible, to sustain. Among the distinctive features of religious beliefs, he maintains, is their insulation from evidence. Religious believers may cite what they consider to be evidence in support of their beliefs; they tend not to revise these beliefs in the light of new evidence, still less to seek out evidence against them. Instead, their beliefs are part of what Leiter describes as a “distinctively religious state of mind . . . that of faith”.

The trouble is that it is not only avowed believers who display this state of mind.

...Again, nothing infuriates the current crop of evangelical atheists more than the suggestion that militant unbelief has many of the attributes of religion. Yet, in asserting that the rejection of theism could produce a better world, they are denying the clear evidence of history, which shows the pursuit of uniformity in world-view to be itself a cause of conflict. Whether held by the religious or by enemies of religion, the idea that universal conversion to (or from) any belief system could vastly improve the human lot is an act of faith. Illustrating Nietzsche’s observations about the tonic properties of false beliefs, these atheists are seeking existential consolation just as much as religious believers.

If religion does not deserve a special kind of toleration, it is because there is nothing special about religion. Clinging to beliefs against evidence is a universal human tendency. The practice of toleration – and it is the practice, cobbled up over generations and applied in ethics and politics as much as religion, that is important – is based on this fact. Toleration means accepting that most of our beliefs are always going to be unwarranted and many of them absurd. At bottom, that is why – in a time when so many people are anxious to believe they are more rational than human beings have ever been – toleration is so unfashionable.

Years ago, I would have read this and bristled over the facile equation of atheism with religion. And if that were his main point, I'd probably still react that way. But the more interesting — and true — point here is the almost banal reminder that, insistence to the contrary notwithstanding, we don't actually have any meaningful idea what would happen if the whole world adopted western-style atheism. People might no longer be stupid in uniquely monotheistic ways, but I think it's a safe bet that we would just find new ways to express our bottomless reserves of stupidity. The point is not that we shouldn't care about pursuing truth or making improvements; the point is just that we can observe how the same perennial themes of human nature reassert themselves even when, especially when, we pride ourselves on our supposed accomplishments. The Greeks were on to something with all that stuff about hubris, no less so for having expressed it in mythological story-form. As certain segments of the online atheist community have made brutally clear this year, reasoning your way to the nonexistence of God is not necessarily any protection against being insanely stupid in other ways.

On that note:

How do people react when they're actually confronted with error? You get a huge range of reactions. Some people just don't have any problem saying, "I was wrong. I need to rethink this or that assumption." Generally, people don't like to rethink really basic assumptions. They prefer to say, "Well, I was wrong about how good Romney's get-out-the-vote effort was." They prefer to tinker with the margins of their belief system rather than admit, "I fundamentally misread US domestic politics, my core area of expertise."

A surprising fraction of people are reluctant to acknowledge there was anything wrong with what they were saying. One argument you sometimes hear (we heard it in the abovementioned episode, and I heard versions of it after the Cold War as well) is: "I was wrong, but I made the right mistake."

More and more, I find the kind of issues explored by authors like Daniel Kahneman, Dan Ariely, the Brafman brothers, Thaler and Sunstein, Chabris and Simons, etc., to be far more interesting and pertinent than the details of ideological differences.

Saturday, December 08, 2012

Jesus Lived His Life In A Cheap Hotel On The Edge Of Route 66, Yeah

Joerg Rieger:

Dangerously, some Christians in the movement were reminded of where and how Jesus had actually lived. Occupiers camping in the streets could relate to Jesus’ deep solidarity, not with the elites of his time, but with the multitude. Jesus had stayed among those who struggled with life: with the sick, the social outcasts, strong women of ‘dubious’ reputation and working people such as fishermen. He himself was a construction worker, and would have been in touch with the many unemployed of his time, who quite regularly experienced layoffs. Perhaps he was even unemployed himself.

Again, I say, going by what sparse evidence there actually is, Jesus would have been disrupting drum circles to preach excitedly about the end of the world on December 21st, and, assuming he didn't kill himself after the subsequent disillusionment — shall we speculate? why not? — perhaps he would have, at age 33, decided he was still young enough to turn his life around, cut his hair, gone back to school, and become a hedge fund manager.

Now I Know Nothing That My Mind Can't Create

Speaking of those who hate the constraints of style:

At this moment, I, the writer, and you, the reader, are partaking of a banquet of language that we did not create—a system superimposed on our consciousness. The raw material of our minds is rendered by the symbolic aspect of language, and there is no escaping it; unless one takes psychedelics, descends into madness, attains a heightened non-symbolic spiritual state, or disrupts the historical, psychological superstructure of language.

Breton and the other Surrealists realized that language, its traditional structure (syntax, morphology, semantics and phonology, to varying degrees) and expectations, needed to be destroyed and rebuilt. While the group’s efforts in automatic writing never produced writing as famous as T. S. Eliot’s “The Waste Land,” for instance, automatism accomplished something far more important: it struck a blow to the politics of language.

By politics of language, it should be taken to mean the inherited system of thought and communication. We are defined by the words we use, yet we had no part in the construction of the system. This word means such and such. This is how one writes a sentence, a paragraph, an essay, a poem, a novel, a letter, etc. What we think is heavily influenced by the signs and signifiers we use in the form of words (to say nothing of visual cues), and when we attempt to express a thought verbally or through the written word, we must again revert to an imposed system to do so.

In his quest for absolute certainty, Descartes noticed that our senses can frequently mislead us. Had he followed that line of thinking a bit further, he might have realized that it's only through further application of those very same senses that we discover our initial mistake and correct it. The real problem is that there's no such thing as absolute anything — certainty, freedom, perfection, whatever. No tool, however useful, can help achieve an impossible, incoherent goal.

I kind of feel the same way in this context. Anarchic gibberish is not the antipode of lies and propaganda.

Friday, December 07, 2012

I Need A Touch Of Everything; We've Lost Touch With Everything

Kevin Hartnett:

The Slow Listening movement, if it can be called a movement, shares the spirit of the make-it-by-hand Slow Food movement and also Slow Art Day, which each year invites participants to visit a museum with a group of friends and commit to looking at only five pieces of art for ten minutes each. These initiatives share the belief that the self-conscious imposition of limits is the best way to live authentically in an age of boundless choice.

In all of these areas, it’s finding the balance that’s the hardest part: too much choice feels like anarchy, while too many rules lead to tyranny.

Yes, a wise foolosopher once said that boundaries enhance creativity. The 47th chapter of the Tao Te Ching likewise says, in effect, that the sage can know the world without ever leaving the house. I like to interpret that as meaning that deep, careful attention paid to a restricted number of things is more rewarding than indulging in frenetic sensory overload.

By most standards, I'm confident that I'm a very boring person. I work a few different part-time jobs. My girlfriend and my dogs provide my regular companionship. I don't like being away from home very often, or for extended periods of time, and when I am home, I'm usually reading, writing, watching soccer games, or puttering around doing chores. Viewed from outside, there would seem to be very little to recommend it. Yet bounded within the nutshell of that short description, I can count myself a king of infinite space. If anything, I sometimes feel overwhelmed by how much I want to do and experience within such confines. Bonsai cultivation as a lifestyle philosophy, I'm telling you.

What Happens To You Here Is Forever

I'm bothered by the fact
You cannot take it back
It goes on record and multiplies at that

— Virgos Merlot

Jeffrey Rosen:

In theory, the right to be forgotten addresses an urgent problem in the digital age: it is very hard to escape your past on the Internet now that every photo, status update, and tweet lives forever in the cloud. But Europeans and Americans have diametrically opposed approaches to the problem. In Europe, the intellectual roots of the right to be forgotten can be found in French law, which recognizes le droit à l’oubli—or the “right of oblivion”—a right that allows a convicted criminal who has served his time and been rehabilitated to object to the publication of the facts of his conviction and incarceration. In America, by contrast, publication of someone’s criminal history is protected by the First Amendment, leading Wikipedia to resist the efforts by two Germans convicted of murdering a famous actor to remove their criminal history from the actor’s Wikipedia page.

European regulators believe that all citizens face the difficulty of escaping their past now that the Internet records everything and forgets nothing—a difficulty that used to be limited to convicted criminals. When Commissioner Reding announced the new right to be forgotten on January 22, she noted the particular risk to teenagers who might reveal compromising information that they would later come to regret. She then articulated the core provision of the “right to be forgotten”: “If an individual no longer wants his personal data to be processed or stored by a data controller, and if there is no legitimate reason for keeping it, the data should be removed from their system.”

Link via this roundup. I'm honestly not sure what I think about this. Even when in doubt, I usually err on the side of being a free speech extremist, but part of me also feels like there's something uniquely insidious about the inability to escape the all-seeing, all-knowing judgment of the Borg.

A Hundred Roots Silently Drinking

But when I lean over the chasm of myself—
it seems
my God is dark
and like a web: a hundred roots
silently drinking.

This is the ferment I grow out of.

— Rilke

William Deresiewicz:

Thinking well takes time: time for doubt, time for analysis and synthesis, time to let your intuition operate, time to have a second thought. The faster we write, the faster we respond (on Twitter or Facebook, in discussion threads or texts), the more superficial the level of consciousness we’re working from. We’re skimming the surface of our minds (which, like the surface of other things, is mostly foam and crud), forgoing reason, judgment, artistry, craft. That is not the place from which the most intelligent forms of communication, the ones that used to play a larger role in our lives—novels, essays, serious journalism—originate. But it is the place our public discourse, and our private discourse, too, increasingly inhabits.

The Rilkean imagery appeals to me in this context: roots, darkness, silence, fermentation. The type of thought I find most worthwhile needs time to develop, and it can't always be hurried along by conscious prodding. Those connective webs of understanding just seem to spontaneously emerge in the shadowed corners of our thought while our attention is elsewhere.

Complaints about the social web tend to take the form of curmudgeons lamenting the mangled spelling and grammar in text messages and tweets, but again, it's a mistake to consider the spectacle of people spending every waking moment typing pidgin English on their phones as a deformation of writing. What they're doing is talking, in text. The problem is not that people in the last decade have suddenly gotten stupider and become unable to learn basic writing skills; the problem is that those hours that used to be spent in solitude, where boredom or a mere lack of stress could come into play, are now filled with nonstop chattering. Those times when we used to find ourselves alone with nothing to do, when we might pick up a book out of curiosity, or when we would take a deep breath, relax, and stare out the window while letting our thoughts idly meander, or when we could simply rest in that receptive state in which the seeds of new thoughts could begin putting down roots, in which the fragments of sensory stimulation acquired throughout the day, unimportant in themselves, could decay into organic material to nourish more substantial thoughts; those times, for a variety of reasons, are becoming rarer.

The rootless, constantly-in-motion nature of activity on the social web only puts me in mind of tumbleweeds rolling across sun-baked clay.

Thursday, December 06, 2012

The Snow Man

One must have a mind of winter
To regard the frost and the boughs
Of the pine-trees crusted with snow;

And have been cold a long time
To behold the junipers shagged with ice,
The spruces rough in the distant glitter

Of the January sun; and not to think
Of any misery in the sound of the wind,
In the sound of a few leaves,

Which is the sound of the land
Full of the same wind
That is blowing in the same bare place

For the listener, who listens in the snow,
And, nothing himself, beholds
Nothing that is not there and the nothing that is.

— Wallace Stevens

Tuesday, December 04, 2012

Delirious With Fevers Of Faith

I was unaware of the work of Robinson Jeffers until I read his poem "Thebaid" courtesy of Michael Gilleland. Learned something new and interesting today, especially the part about his concept of "inhumanism" (which seems complementary to another perspective recently discussed).

Monday, December 03, 2012

And The Present Is Trivia, Which I Scribble Down As Fucking Notes

Everything we've said and done
Can be so easily forgotten
You can always change who you are

— Against Me!

Matt Haughey:

Twitter put simply is fun, fantastic, and all about the here and now. The fact that I can’t even search my own feed for past things I’ve said makes it exist almost entirely in the present tense. The people I follow are people I know, people I work with and live near, but also a good dose of random comedians, musicians, and celebrities I’ll never meet. The things everyone tweets about are mostly jokes or things that make you smile, either random things that popped into the writers’ heads or comments on current events.

There’s no memory at Twitter: everything is fleeting. Though that concept may seem daunting to some (archivists, I feel your pain), it also means the content in my feed is an endless stream of new information, either comments on what is happening right now or thoughts about the future. One of the reasons I loved the Internet when I first discovered it in the mid-1990s was that it was a clean slate, a place that welcomed all regardless of your past as you wrote your new life story; where you’d only be judged on your words and your art and your photos going forward.

Kottke has some interesting remarks as well. But as much as I'd like to be sympathetic to this perspective — as much as I am sympathetic to the transformative potential of the vast store of information on the Internet — this glorification of novelty for its own sake sounds to me like something out of the movie Memento. It also makes me think of D'Angelo Barksdale's impression of The Great Gatsby:

The past is always with us, and where we come from, what we go through, how we get through it; all this shit matters. I mean, that's what I thought he meant. Like at the end of the book, you know, boats and tides and all, it's like you can change up, right, you can say you somebody new, you can give yourself a whole new story, but what came first is who you really are and what happened before is what really happened, and it don't matter that some fool say he different cause the only thing that make you different is what you really do, what you really go through. Like, you know, like all the books in his library, now, he frontin' with all them books but if we pulled one off the shelf ain't none of the pages ever been open. He got all them books and he ain't read one of them. Gatsby, he was who he was and did what he did, and because he wasn't ready to get real with the story, that shit caught up to him. That's what I think, anyway.

The only thing that makes you different is what you really do, what you really go through. You don't have to be permanently defined by your upbringing and your high school friends, of course, but if you never allow anything from your Twitter feed to stay with you long enough to deeply engage with, if you're too compulsively seeking stimulation to ever spend time shaping context and perspective to put it in, you're conflating perpetual motion with transformation.

Jack Of All Trades, Master Of None

Costica Bradatan:

A quiet revolution may have taken place over the last three decades in our understanding of the history of Western philosophy. So quiet, in fact, that few have noticed it. Three recent books give us a sense of the significance and extent of this paradigm shift: Examined Lives: From Socrates to Nietzsche, by James Miller; How to Live: Or A Life of Montaigne in One Question and Twenty Attempts at an Answer, by Sarah Bakewell; and The Hemlock Cup: Socrates, Athens and the Search for the Good Life, by Bettany Hughes. What has this revolution brought forth? The realization that some of the most influential Western philosophers (primarily the ancient philosophers, but also Montaigne, Rousseau, Schopenhauer, Nietzsche, and others) intended their philosophy to be not just a body of doctrines, of pure intellectual content, but to be above all an “art of living.”

...While predominant among the ancient philosophers, as well as among some modern ones (Montaigne and Nietzsche, for example), the understanding of philosophy as an “art of living” is far from characterizing mainstream academic philosophy in the twentieth or twenty-first centuries. Now philosophy is primarily a “job.” When they are done with it, philosophers don’t take it home with them; they leave philosophy at the office, behind locked doors. The work they produce, outstanding as it may be, is not supposed to change their lives.

I was captivated by precisely that practical, life-transforming aspect of philosophy when I first encountered it as an adolescent. Now, I just call it "foolosophy" to differentiate it from the academic profession.

Sunday, December 02, 2012

Dependence On No One, Best Distrust And Oppose

Tim Black:

As Lukianoff explains: ‘If there’s any risk whatsoever that a person can get into trouble for their opinion, people don’t change their opinion, they just talk to people they agree with, and they don’t bother talking to people they disagree with. And talking with people you disagree with is precisely what people should be doing in higher education.’

‘People not disagreeing with professors’, Lukianoff continues, ‘people not talking to people they disagree with… this all leads to group polarisation. And as far as the research on group polarisation is concerned, if you surround yourself with people you agree with, you tend to become much more certain, and in some cases much more radical in your beliefs, whether conservative, liberal or neither. And you tend, therefore, to have a polarised understanding of where the other side is coming from. And that’s a big problem in the US today. We have these very tight echo chambers and sort of cartoon-like pictures of what the other side is like.’

...With rigorous debate discouraged throughout higher education, and people seeking out only those they already agree with, it is unsurprising that many find it difficult to explain why what they believe to be right is right. After all, they have never had to test their beliefs. And the inability to explain why we are right ‘makes us even more emotional and hostile when anything questions our certainty’, says Lukianoff - hence the shrill, overemotional inarticulacy of so much public discourse.

Noel once said that one of the things he appreciated about commenting here is that I don't get defensive when challenged. Well, since we're on the topic of the benefits of open dialogue with opponents, I will grant to Christianity that I've always appreciated the proverb about a soft answer turning away wrath. Deflate your ego enough, and you won't feel the need to escalate when someone initiates hostilities. We're not making policy or influencing anything important here, we're just passing the time; take a deep breath and relax. (Even if you prefer Machiavellian strategizing, you might consider that an even-keeled reaction could unbalance a hostile interlocutor, making them feel unsure of themselves and silly for coming on so fiercely, thus bestowing an advantage upon you.)

Of course, you may also be aware that I've written many posts about all the things I hate about the social web, especially as it concerns the quality of writing and thinking. The excerpt above is another complaint to add to the list, I feel. In a medium that encourages and rewards lightning-quick reactions, tweet-sized opinions and egotistical performance art above nuance and contemplation, where it's more important to be seen holding correct opinions that have "always already" been settled, or joining in on the reinforcement of those opinions through upvoting, liking, retweeting, etc., disagreements quickly turn into hollow displays of choosing sides and shaking fists at each other, before dispersing and reforming somewhere else over some other ostensible issue.

I was already congenitally disposed toward anti-sociability, but I'm becoming ever more resolved to avoid the sorts of tribal loyalties that compromise intellectual activity. I don't want friends, allies or fans; I want people to think with.

He Blindsided Her With Science

Edward Clint:

Watson’s talk is peppered with snark and sarcasm. Also, it should be clear by now she seems to have spent very little time researching the topic. She doesn’t treat the topic seriously. I do not merely mean that she does not take evolutionary psychology seriously— but the entire topic, including her own contentions, is more performance art than education lecture.

Watson sees evolutionary psychology as being on par with creationism (she makes this comparison at 8:28) and therefore finds it fit for ridicule. She even says mocking it “never gets old”. Even so, what about the impact evolutionary psychology might have? That seems less than amusing. For the sake of argument, let us imagine everything Watson believes is correct: Those who conduct research in the field are mostly misogynists who are dedicating many years to the pursuit of justifying harmful stereotypes and oppressing women. They’ve succeeded in compromising peer review, and the professional journals which publish them are mouthpieces of the patriarchy and scientific rigor is gone. They’ve infiltrated the top universities in the world: UCLA, Harvard, MIT, Oxford, Yale, and so on. They’ve established growing departments at said locales and have their own conferences and ever-larger presences at others. They’ve even succeeded in having much of their literature and research perspective accepted by mainstream social science.

If I believed that all of this were true, I would be horrified. The potential harm to society and to behavioral science would be almost incalculable. Were I to give a talk on it or write about it, I would dig deep. I would cite mainstream sources so that no one could dismiss me as cherry-picking. I would conduct or locate reviews of dozens or hundreds of studies, instead of citing one or two in tabloid newspapers easily dismissed as outliers, or taking the word of an author trying to sell books. I would read full published papers and foundational literature, not blurbs from the Telegraph about unpublished studies so that my understanding would become robust and accurate. I wouldn’t make an unserious, sarcastic tone my main presentational style because the stakes would be so high, the human cost so tragic.

Watson wants us to believe this great dark power is working, inhibiting social justice, hurting real people and the advancement of science, and that it is entertaining to talk about. She says (for example) that it is working to justify rape. To make rape OK. …But hey, no big deal, right? Not big enough to research properly or to stop making jokes about for two minutes. This flip attitude lacks empathy, and I find it ethically repugnant. If even close to true, her claim isn’t funny. It deserves real skeptical inquiry and serious investigation and she gave it none of this.

...So, I formally criticized a theory in evolutionary psychology that has stood for years. I did it, in part, because I love evolutionary psychology. I know that it’s a good science and that a good science gets better with robust criticism. I am excited to be able to play a tiny part in that, if I can. It was also an exercise in skepticism toward something I cared about. We need to engage in this kind of skepticism because as we try to figure out how the world works and how it got to be the way it is, commitments to ego and politics tend to get in the way. All of her skep-nomenclature trappings to the contrary, I do not think that Rebecca Watson understands this.

Oof. If this were a boxing match, the referee would have stopped it long before this point.

Snark, sarcasm, performance art, ego and politics. Coincidentally enough, those are the very things that often make reading the blogosphere such an unenlightening, unrewarding chore.