Monday, April 30, 2012

Lovin' Mens, Lovin' Womens, Lovin' All God's Creatures

Paul Oestreicher:

Jesus was a Hebrew rabbi. Unusually, he was unmarried. The idea that he had a romantic relationship with Mary Magdalene is the stuff of fiction, based on no biblical evidence. The evidence, on the other hand, that he may have been what we today call gay is very strong. But even gay rights campaigners in the church have been reluctant to suggest it. A significant exception was Hugh Montefiore, bishop of Birmingham and a convert from a prominent Jewish family. He dared to suggest that possibility and was met with disdain, as though he were simply out to shock.

After much reflection and with certainly no wish to shock, I felt I was left with no option but to suggest, for the first time in half a century of my Anglican priesthood, that Jesus may well have been homosexual. Had he been devoid of sexuality, he would not have been truly human. To believe that would be heretical.

Heterosexual, bisexual, homosexual: Jesus could have been any of these. There can be no certainty which. The homosexual option simply seems the most likely. The intimate relationship with the beloved disciple points in that direction. It would be so interpreted in any person today.

You keep using that word, "evidence"...

Sunday, April 29, 2012

Value Dwells Not In Particular Will

Russell Blackford:

Speaking very generally, ordinary people are most likely to deny the existence of free will when they see our deliberations, choices and actions being overridden or bypassed in some way or another. For the folk, or most of them, the dominant idea in attributing free will to themselves and others seems to be a denial of fatalism.

...Harris appears, then, to think that free will means acting (1) in circumstances such that I could have done otherwise (in the strong, mysterious sense), and (2) by means of a process of deliberation that is entirely conscious. Since this does not happen, he concludes, we do not have (what he calls) free will.

...Importantly, the concept of free will that Harris attacks so relentlessly bears little resemblance to either the dominant folk ideas (roughly speaking, that fatalism is false, and that we commonly act without coercion, with adequate time to think) or the technical concept used by most philosophers (we have the capacity to act in such a way that we are morally responsible for our conduct).

...Furthermore, the folk (and perhaps philosophers) are not worried only by outright coercion but also by other circumstances, such as whether there was adequate time to think. But where do we draw the line with something like that - for example, how much time is "adequate"? Again, how should we handle such things as compulsions and phobias - are they just another part of our desire-sets, or are they more analogous to external barriers to our actions?

...But even if we press such points as hard as possible, folk ideas of free will might survive. Perhaps whether we act freely becomes a matter of judgment and degree, and the question of whether we do so in various particular cases does not have an entirely compelling answer. Nonetheless, it might remain more false than true if we tell the folk, "You do not have free will."

It was hard to pick just a few paragraphs to excerpt, so you'll have to take my word for it when I say that the whole article is interesting. I recommend checking out the related stories in the sidebar as well (he and Sam Harris have a little back-and-forth going).

Saturday, April 28, 2012

Rest With a Fermata

Carl Honoré:

First, let’s underscore what the whole Slow culture quake is about. It’s not anti-speed. It’s not about doing everything in slow motion. It’s about doing things at the right speed – what musicians call the tempo giusto. Every act has the right rhythm for it, and if you find that rhythm you’re going to do it better and enjoy it more. Particularly in cities, we get infected by this virus of hurry, where our default mode is to do everything as fast as possible. We fall into the trap of trying to do more and more things in less and less time, putting quantity before quality in everything we do.

Some of biology is essentially a pause: sleep, for example. Pauses serve a purpose, breaking the flow. Like rests in music or caesuras in verse. Like the old nightly break in the news cycle and the financial markets, gone in our 7 X 24 era. Even a confirmed atheist and Sunday driver must believe that the Sabbath served a therapeutic purpose, too, in the epoch when people observed it. Now, of course, Puritanical blue laws are mostly long gone, and Federal Express boasts of delivering on Sunday "because the world works seven days a week." Haydn may have been the first great master of the rest in musical composition; he used rests for surprise, rests for tension, and even rests with fermatas. Silence indefinitely prolonged. Rest and pause. A rest with a fermata is the moral opposite of the fast-food restaurant with express lane.

Thursday, April 26, 2012

It Is Enough; Now Stay Your Hand

Marah Eakin:

As we get deeper into the Internet age, and in particular the Twitter age, it’s getting easier to become less thoughtful. One-liners fly out into the ether and then disappear forever. For some, every single thing they do is broadcast online. Mourning, on the other hand, is traditionally a deeply private practice. If a loved one or family member dies, you feel a pain that only you know. Taking to Facebook or Twitter to express that kind of hurt seems trite, almost like a slight to the deceased. If you really cared that much, would you be able to sum up your thoughts in 140 characters, or with a sad face emoticon? Does it matter what other people thought about exactly how sad you really were?

...The Internet can get a bad rap for being a wasteland of dumb content and dick shots, all of which it has in spades. What it also has, though, is potential. With its unlimited capacity and nearly unlimited number of contributors, the Internet has the ability to spin out in a variety of different directions. Sure, it can go brusque and glib, and that works sometimes. It can also spark real conversation or host excellent essay work and heartfelt remembrances of those we’ve lost. What we have to figure out as content producers—and any one of us with a Facebook or Twitter account is a content producer—is what kind of message we want to put out into the ether. It’s not easy to do, especially when dealing with something as emotional as death and mourning, but it’s that modicum of thought that can mark the difference between a tribute and a shrug.

Well, call me an optimist, but I'd like to think it shouldn't take the paralysis of grief to prompt some reflection and carefully crafted prose from my fellow netizens; couldn't it just be a general ideal to aim for? That quibble aside, yes, I concur, well said, huzzah!

I have to be fair, though, and mention that cheap public displays of mourning aren't exclusive to social media—in recent years, I've noticed a strange trend of memorializing dead loved ones in a rear-window decal, like there's a kit you can buy at AutoZone or something (if there is, I don't want to know about it, thanks). I wonder if these people proudly sport a t-shirt with iron-on photos and text while they're at it. Why not head over to CafePress for all your thanatosian needs and emblazon your dearly departed's face and lifespan dates on everything from coffee mugs to fridge magnets? Let's put the fun back in funeral.

Call of the Mild

Do what now?

The University of Southern California has been given $40,000 by the National Endowment for the Arts to develop Walden, in which “the player will inhabit an open, three-dimensional game-world, which will simulate the geography and environment of Walden Woods”. With the game drawing from the detailed notes Thoreau wrote about the area and its landscape, flora and fauna, users will be able not only to walk in the author’s footsteps but also, said the university, “discover in the beauty of a virtual landscape the ideas and writings of this unique philosopher, and cultivate through the gameplay their own thoughts and responses to the concepts discovered there”.

...The team behind the video game Walden said it “posits a new genre of play, in which reflection and insight play an important role in the player experience”. While the player travels through the virtual world of Walden, and deals with everyday life at Walden Pond, they will also be asked, the team said, to “focus on the deeper meaning behind events that transpire in the world. By attending to these events, the player is able to gain insight into the natural world, and into connections that permeate the experience of life at Walden.”

Now, I know what you're thinking—it seems like an incredibly ironic joke to use a video game to gain insight into the natural world. But this does sound like the sort of roughing it that Henry was most comfortable with:

The inestimably priggish and tiresome Henry David Thoreau thought nature was splendid, splendid indeed, so long as he could stroll to town for cakes and barley wine, but when he experienced real wilderness, on a visit to Katahdin in 1846, he was unnerved to the core. This wasn't the tame world of overgrown orchards and sun-dappled paths that passed for wilderness in suburban Concord, Massachusetts, but a forbidding, oppressive, primeval country that was "grim and wild…savage and dreary," fit only for "men nearer of kin to the rocks and wild animals than we." The experience left him, in the words of one biographer, "near hysterical."

Speak to Me in a Language I Can Hear

John McWhorter:

At this point, two forms of language coexist in societies: choppy speech and crafted prose. No ancient Roman spoke the way Virgil and Cicero wrote. Even today, only about 100 of the world’s 6,000 languages are written much, and none of the 5,900 unwritten ones are spoken in Gibbonesque paragraphs.

...Yet the brevity, improvisation and in-the-moment quality of e-mails and texts are those grand old defining qualities of spoken language. Keyboard technology, allowing us to produce and receive written communication with unprecedented speed, allows something hitherto unknown to humanity: written conversation. In this sense, they are not “writing” in the sense we are accustomed to. They are fingered speech.

A sense that e-mail and texting are “poor writing” is analogous, then, to one that the Rolling Stones produce “bad music” because they don’t use violas. Note that one cannot speak capital letters or punctuation. If we accept e-mail and texting as a new way of talking, then their casualness with matters of case and commas is not only expected but unexceptionable.

Even if we call this new usage a form of writing, we can see Anglophone societies as now having two kinds of writing. For formal contexts, there is the long-lined kind that requires schooling and practice to master. Then there is a more natural form paralleling the way we speak for informal contexts – which are, after all, most of our lives.

For curmudgeons like me, the problem isn't that texting and tweeting is an unfortunate diversion from the nuanced thinking and writing that people would be doing normally. Most people are trite and boring and I don't waste a valuable moment lamenting the possibility that it could have ever been otherwise. No, I agree fully that it makes sense to see texting as a form of speech rather than writing. But as in speech, so in print: I'm irritated by people who never shut the fuck up long enough to allow contemplation to work its magic and produce thoughts that are interesting and worth hearing. This is why I complain about the hyperactive, overstimulated environment of social media—it's an incubator for the qualities I despise in relationships, like superficiality and haste. There's little time to do anything but react reflexively, which ensures that your output is primarily going to be banal and inchoate.

Yesterday's post, for example, I worked on all day. I started on it in the morning, let it sit while I went to work, proceeded to mull it over while on the job, came home, worked on it a little more, did chores outside while mulling it over some more, and finally finished it in the evening. And I still think it turned out to be a pretty weak effort, despite all that. But I felt like there was something more to be said on the topic, something less obvious and possibly more penetrating, so I kept wrestling with it, hoping that maybe lateral thinking would kick in at some point and provide me with an insight that would make the difference between a good post and one that was just okay. I could have just blurted out the first thought that came to mind and been done in five minutes, quickly on to the next shiny object, but then I would have likely forgotten about the topic before I had a chance to explore any nuance in it.

It's not that social media and texting inevitably doom their participants to drooling idiocy, or that only drooling idiots would want to use them to begin with. But even I, an unambitious blogger, know that the further you allow yourself to get lured into that environment, the more difficult it becomes to do the sort of writing and thinking that gives us such joy. You have to seek out that open, quiet space away from the clamor and din of nonstop chattering, where ideas formerly lurking hesitant and unseen out on the margins might finally be able to approach you.

Wednesday, April 25, 2012

Symbolic Capture

Noel Murray:

I get annoyed whenever anyone slaps a label on something and then presumes that the label itself says all that needs to be said. Whenever a critic or a potential audience member sniffs about “dad rock” or “chick lit” or “one for the fanboys,” it raises my hackles. If you’d rather not engage with what a piece of art actually is—as in, what it expresses and how well it expresses it—then fine. But don’t presume some kind of superiority because of that choice. One of the biggest fallacies in the way we talk about art is this idea that somehow personal taste equates to quality: That each of us miraculously only enjoys movies and music that are the best of their respective medium, and ergo, any movies and music we don’t enjoy must be terrible. It’s a standard we generally only apply to art. (Well, and politics.) If we dislike salmon, we don’t presume salmon itself to be bad; we just understand we don’t have a taste for it, and we’re generally willing to acknowledge that if prepared properly, we might even be capable of enjoying the occasional piece of salmon. It’s not that degrees of “good” and “bad” don’t exist, but ultimately our taste in art isn’t so different from our taste in food, in that it’s personal, and—if we’re being honest with ourselves—fairly malleable.

I heard the phrase "symbolic capture" used recently in a similar context—the tendency to reduce a person, a work of art, a movement to some generic characteristic for the purpose of belittling and dismissing it straightaway without seriously engaging. Anyway, I'm not sure food preference is quite the contrast he thinks it is; there are plenty of foodie snobs who look down on people in a moralizing way for an unsophisticated palate. And getting back to the article's theme of using "white" as a derogatory term, let us not forget white bread, offensive to the cognoscenti on both the literal and symbolic level.

Moralizing insinuates itself into practically everything we do. If we're elevating personal taste into a standard for objective quality, it's only so that we can enjoy lording our superiority-by-association over those unfortunate outsiders. The point isn't to establish that a certain style of music or film is objectively better, it's to imply that I am a better person than you for appreciating it. But sometimes it's the fact that a certain style of art doesn't conform to accepted standards of quality that gives it its cachet. There is plenty of music for which the near-unlistenability is the entire point. Fans identify with it for the pleasure of haughtily differentiating themselves from anything even remotely mainstream.

It all reminds me of something Freddie deBoer wrote:

Contemporary strivers lack the tools with which people in the past have differentiated themselves from their peers: They live in a post-virtue, post-religion, post-aristocracy age. They lack the skills or inspiration to create something of genuine worth. They have been conditioned to find all but the most conventional and compromised politics worthy of contempt. They are denied even the cold comfort of identification with career, as they cope with the deadening tedium and meaninglessness of work by calling attention to it over and over again, as if acknowledging it somehow elevates them above it.

Into this vacuum comes a relief that is profoundly rational in context—the self as consumer and critic. Given the emptiness of the material conditions of their lives, the formerly manic competitors must come to invest the cultural goods they consume with great meaning. Meaning must be made somewhere; no one will countenance standing for nothing. So the poor proxy of media and cultural consumption comes to define the individual. In many ways, cultural products such as movies, music, clothes, and media are the perfect vehicle for the endless division of people into strata of knowingness, savvy, and cultural value.

These cultural products have no quantifiable value, yet their relative value is fiercely debated as if some such quantifiable understanding could be reached. They are easily mined for ancillary content, the TV recaps and record reviews and endless fulminating in comments and forums that spread like weeds. (Does anyone who watches Mad Men not blog about it?) They are bound up with celebrity, both real and petty. They can inspire and so trick us into believing that our reactions are similarly worthy of inspiration. And they are complex and varied enough that there is always more to know and more rarefied territory to reach, the better to climb the ladder one rung higher than the person the next desk over.

Tuesday, April 24, 2012

I'm a Loser, Baby, So Why Don't You Kill Me

Ahahaha. It's a good thing I didn't go to college; everything I would have been interested in studying turns out to be a financial dead end. Oh, it's the life of an autodidact for me, where the status is none, but the learning is free...

Saturday, April 21, 2012

The World Is Too Much With Us

The past is always with us, and where we come, what we go through, how we get through it; all this shit matters. I mean, that's what I thought he meant. Like at the end of the book, you know, boats and tides and all, it's like you can change up, right, you can say you somebody new, you can give yourself a whole new story, but what came first is who you really are and what happened before is what really happened, and it don't matter that some fool say he different cause the only thing that make you different is what you really do, what you really go through. Like, you know, like all the books in his library, now, he frontin' with all them books but if we pulled one off the shelf ain't none of the pages ever been open. He got all them books and he ain't read one of them. Gatsby, he was who he was and did what he did, and because he wasn't ready to get real with the story, that shit caught up to him. That's what I think, anyway.

- D'Angelo Barksdale

The Guardian:

Poole, who was voted Time's most influential person of 2008 – two years before Facebook's Mark Zuckerberg was declared the magazine's Man of the Year – believes Facebook's commercial motivations shut down the online experience: "Mark and Sheryl have gone out and said that identity is authenticity, that you are online who you are offline, and to have multiple identities is lacking in integrity. I think that's nuts." 

"We went from a web that was interest-driven, and then we transitioned into a web where the connections were in-person, real-life friendship relationships," adds Poole. "Individuals are multifaceted. Identity is prismatic, and communities like 4Chan exist as a holdover from the interest-driven web." 

Allan believes such attitudes are naive. The millions who have gone online over the past decade want a safe place where they won't experience bad behaviour, have their identities stolen or be duped by impostors, he says: "Pretend identities don't work very well now that the web has moved from a minority sport for geeks to a mainstream occupation."

I was checking in on the progress of the Mark Sandman documentary this week, and read through a few related articles in the process. One thing that struck me was that, in addition to the general umbra of mystique he cultivated around his personal life, he was particularly sensitive about keeping his age a secret. Having been born in 1952, he was already on the cusp of turning 40 at the beginning of Morphine's recording career, which, for a rock musician, is an age more commonly associated with long-since-exhausted creativity, "Vegas Elvis" irrelevance and unwitting self-parody. Marketing executives worry about whether the youth demographics are going to identify with someone old enough to be their parent, that sort of thing. Anyway, it was kind of quaint to read about a reporter for Rolling Stone trying fruitlessly to find out his real age. Really? As recently as the mid-nineties, it was possible for a minor celebrity to keep something like that a secret without a reporter simply looking up his high school yearbook or something? How long would he be able to pull that off today before half of his acquaintances from adolescence would be tweeting pictures and anecdotes about him?

My affection for anonymity stems from my impatience with the irrelevance of the quotidian details of life. I love the interest-driven web; I love being able to find discussions of ideas that I would never hear otherwise; I love the creative stimulation; I love the excellent writing. Most of all, I love the fact that participation in all this revolves around what you think, not who you are, in that small-town sense of your identity being cemented in place by the opinions of everyone else. And I resent the fact that social media has brought all that small-town idiocy and superficiality back to center stage.

Friday, April 20, 2012

The Fool on the Hill

Continuing with a theme, here's Gary Gutting:

The Sermon on the Mount, however, does not offer a clear view of what makes for a good life.  Many seem to think Jesus is saying little more than be nice to everybody.  Others see a call to a heroic life of total non-resistance or self-sacrifice.  Still others hear him as requiring little more than an enhanced version of the Ten Commandments  (e.g., avoiding not only murder but also anger, not only adultery but also lustful desires).

Almost all Christians ignore many of the things Jesus said on the Mount.  Who literally takes no thought for their lives or for tomorrow?  Who never resists evil?  Who gives to anyone who asks?  Who says “Hit me again” to an unjust attack?  There may be ways of integrating such injunctions into our morality without reducing them to banalities, but the bare text of Jesus’ sermon doesn’t tell us how to do this.

Yeah, it's true; a literal belief in the imminent end of the world does tend to render the advice you offer people a bit short on nuance and practicality. Seriously, it amazes me how much effort people put into trying to salvage something inspirational from this dreck. Even if you set aside the irrelevance of his preaching once the centrality of the apocalypse is removed, there's just nothing profound there. As J.L. Mackie said:

Richard Robinson has examined the synoptic gospels as the best evidence for Jesus's own teaching, and he finds in them five major precepts: "love God, believe in me, love man, be pure in heart, be humble." The reasons given for these precepts are "a plain matter of promises and threats": they are "that the kingdom of heaven is at hand," and that "those who obey these precepts will be rewarded in heaven, while those who disobey will have weeping and gnashing of teeth." Robinson notes that "Certain ideals that are prominent elsewhere are rather conspicuously absent from the synoptic gospels." These include beauty, truth, knowledge and reason:

As Jesus never recommends knowledge, so he never recommends the virtue that seeks and leads to knowledge, namely reason. On the contrary, he regards certain beliefs as in themselves sinful...whereas it is an essential part of the ideal of reason to hold that no belief can be morally wrong if reached in the attempt to believe truly. Jesus again and again demands faith; and by faith he means believing certain very improbable things without considering evidence or estimating probabilities, and that is contrary to reason.

Robinson adds:

Jesus says nothing on any social question except divorce, and all ascriptions of any political doctrine to him are false. He does not pronounce about war, capital punishment, gambling, justice, the administration of law, the distribution of goods, socialism, equality of income, equality of sex, equality of colour, equality of opportunity, tyranny, freedom, slavery, self-determination, or contraception. There is nothing Christian about being for any of these things, nor about being against them, if we mean by "Christian" what Jesus taught according to the synoptic gospels.

David Sirota:

These findings are important to America for two reasons. First, they tell us that, contrary to evidence in the United States, the intersection of religion and politics doesn’t have to be fraught with hypocrisy. Britain is a Christian-dominated country, and the Christian Bible is filled with liberal economic sentiment. It makes perfect sense, then, that the more devoutly loyal to that Bible one is, the more progressive one would be on economics.

...Meanwhile, the organization Faith in Public Life has highlighted new academic research showing that even in America there is growing “correlation between increased Bible reading and support for progressive views, including abolishing the death penalty, seeking economic justice, and reducing material consumption.”

Of course, many Americans who cite Christianity to justify their economic conservatism may not have actually read the Bible. In that sense, religion has become more of a superficial brand than a distinct catechism, and brands can be easily manipulated by self-serving partisans and demagogues. To know that is to read the Sermon on the Mount and then marvel at how anyone still justifies right-wing beliefs by invoking Jesus.

Yes, it's yet another one of those "Jesus would so totally have been a progressive Democrat" hack pieces that Salon specializes in. Put on your hip waders; the bullshit's flowing thick and fast, and we're going in.

"The Christian Bible is filled with liberal economic sentiment." Uh... well, I guess you could say it was somewhat forward-thinking to use two hundred Philistine foreskins as currency for buying a wife, and I guess thirty silver coins was a pretty fair price for a slave, but I suspect Sirota is actually referring to the occasional generic call for people to care for the poor, though why we're supposed to take those verses seriously while coughing politely and ignoring the far more numerous examples of primitive barbarism permeating the Bible, our theologian-for-hire doesn't say.

Anyway, I realize that economics is not actually a science, but still, I think it's just a tad unfair to those who attempt to engage in rigorous analysis of the production, distribution and consumption of goods and services to equate their discipline with moral commands whose only justification is an appeal to divine authority (which I thought was kinda anathema to classical liberal thought).

Okay, seriously. Let's be as plainspoken as we can here. If you familiarize yourself even a little bit with biblical scholarship, you will realize that you simply cannot read the Bible as a straightforward, modern narrative, ignorant of the historical context in which it was written. Jesus, as a literary character, is hopelessly incoherent by our standards. Some verses make him seem like a peacenik hippie. Some make him look like a typical authoritarian cult leader. Neither view is wrong, technically. You can selectively read the Gospels and come away with the impression that what matters most is blind faith in authority and enforcing an inflexible moral code among the community. You can also selectively read it and come away with the impression that we're just supposed to be excellent to each other, peace out. Both views require turning a blind eye to the other.

I mean, for fuck's sake, Bart Ehrman has been a bestselling author for more than a decade now, and he hammers on this theme repeatedly: after three centuries of scholarship, the historical-critical method of biblical scholarship is in unanimous agreement on the basics and has been for decades. The most fire-and-brimstone fundamentalist and the most liberal Unitarian learn the same things in seminaries and divinity schools: assuming there was a historical figure such as Jesus, he would have been just another apocalyptic Jewish prophet. You simply cannot make sense of his ramblings and seemingly schizophrenic character if you don't take into account that he seriously expected and desired the violent end of the world. He may have thought that it was a good thing to take all you have and give it to the poor—because he thought the world was about to end and you wouldn't need it anyway. He may have urged you to consider the lilies of the field in lieu of preoccupying yourself with worldly concerns—which is good advice for someone who doesn't have much of a future to worry about. He may have offered up a bunch of toothless platitudes about the meek, the peacemakers and the justice seekers—but their reward for believing was going to arrive with the apocalypse, not in some future liberal welfare state. He didn't want to create a stable, equitable society for future generations—he wanted to destroy the one he lived in.

What I am saying is that inhabiting this mindset radically reorganizes one's priorities, and that if you don't believe that the end of the world is something both imminent and desirable, perhaps you should ignore this jabbering lunatic and look for sage advice elsewhere.

Also, as Lenny Bruce and Bill Hicks have noted before, I'd like to point out the amusing fact that the death penalty is, ah, kind of an integral part of Christianity; after all, they use an instrument of execution as the very symbol of their faith. 

Bearded Weirdo

Justin Peters:

The beard’s absence from modern American politics can be partially blamed on the two scourges of the 20th century: Communists and hippies. For many years, wearing a full beard marked you as the sort of fellow who had Das Kapital stashed somewhere on his person. In the 1960s, the more-or-less concurrent rise of Fidel Castro in Cuba and student radicals at home reinforced the stereotype of beard-wearers as America-hating no-goodniks. The stigma persists to this day: No candidate wants to risk alienating elderly voters with a gratuitous resemblance to Wavy Gravy.

...And yet, though beards might not be all that common, they’re actually well received among the general population. "I do a lot of work with visual communications, facial expressions, how people read faces," says Jeff Jacobs. "Facial hair poses no distraction or causes no aversions whatsoever." Academic research bears this out: In a 1990 paper for the journal Social Behavior and Personality, J. Ann Reed and Elizabeth M. Blunk reported "consistently more positive perceptions of social/physical attractiveness, personality, competency, and composure for men with facial hair." More recently, researchers Barnaby J. Dixson and Paul L. Vasey rejected the notion that "facial hair decreases a male's perceived social status because it is associated with traits such as vagrancy." In fact, participants in their study "rated bearded men as having higher social status than clean-shaven men."

Whatever, man, whatever. I'm the type of fellow who shudders in horror at an assertion like Eric Hobsbawm's that politics could ever have anything to do with being a "key to our truths as well as our myths." I mean, I suppose it's true, though I would suggest a less-aspirational phrase like "the distillation of all base convention" instead. There are many who affect an ironic embrace of their own marginalization while still yearning to see it validated and affirmed by someone with power and influence, but I'm not one of them. The last thing I would want to see is a president who shares my taste and appearance, ye gads. 

Thursday, April 19, 2012

Oh, I'm So Very, Very, Very Ssssssssssss... FUCK YOU!


The Catholic League, a religious advocacy group, has threatened to launch a boycott against the sponsors of Daily Show host Jon Stewart over a segment that featured “vagina mangers.” 

“We are asking Stewart to apologize,” Catholic League president (and angry teakettle—TVS) Bill Donohue said in a statement. “If he does not, we will mobilize Protestants, Jews, Mormons and Muslims to join us in a boycott of his sponsors.” 

“Moreover, we will not stop with a boycott; there are other things that can be done to register our outrage,” he added. “We are prepared to spend the money it takes to make this a nationwide issue, and we are prepared to stay the course.”

I've given up hope that people of all political persuasions will finally realize what stupid jackasses they look like when they do this. Now I'm eager for more boycotts. I encourage them. I want boycotts to provoke counter-boycotts, escalating rapidly, until global capitalism finally collapses as all economic activity grinds to a halt with everyone mortally offended by everyone else and no one willing to spend a goddamned dollar until someone finally apologizes. It's sort of like Marx reinterpreted by clown college academics.

Tyrannosaurus Sux

You've heard they may have been feathered. You've heard they may have been scavengers. We all know that Adam and Eve rode them to church, of course. But I, after venturing out into the wild today, bring you photographic evidence to settle the debate that has long split the field of paleontology over whether the tyrant lizard enjoyed sucking on huge, juicy wieners.

Wednesday, April 18, 2012

Jesus Just Left

Ross Douthat:

Now, there are defenses one can make of such a world. Maybe widespread abortion is necessary for female advancement. Maybe it isn’t a big deal if Europeans aren’t having enough children to replace themselves. Maybe children don’t need a mother and a father; maybe marriage itself is an outdated institution. Maybe the human embryo is no different, morally and commercially, from a fingernail clipping or skin-cell sample.

But these aren’t arguments that serious Christians can accept. A society that does not have enough children to even reproduce itself is not a Christian society. A culture that kills 40 percent of the offspring it conceives in utero (that would be the abortion rate in New York City, where Planned Parenthood’s influence over public policy is ever-so-slightly larger than the Vatican’s) is not a Christian culture. A culture that practices a kind of de facto polygamy, with men fathering children with many different women—the state of affairs in underclass America, and increasingly in working-class America as well—is not a Christian culture.

Oh, well, good, glad we agree on this! All right, then, I guess, uh, feel free to fuck directly off to a monastery whenever it suits you and await the Parousia. Should be any day now, I'm sure.

Tuesday, April 17, 2012

I Do Not Like Many People, Love; They Bore Me, or Attack Me, or Talk Too Much When There is Nothing to Say

Robert Lane Greene:

As Facebook reaches further into every corner of our lives, it also engenders confusion, annoyance and concern. The litany of complaints is familiar. “People are going to be so busy writing about their lives that they forget to live them,” as a friend complains to me, is perhaps the most typical. This “Facebook isn’t real life” trope spans many sub-complaints. The word “friend” is being devalued by having hundreds upon hundreds of “Friends”. Users’ pages are not a genuine portrait, but a careful selection of photos and updates that amount to an illusion. People should be enjoying their vacation, not taking hundreds of pictures of it and putting them on Facebook. People should spend more time curling up with real books, not waste time bragging about what they read via GoodReads. The birthday messages that pour in because Facebook told your “Friends” it was your birthday are no substitute for real friends who actually remember. And so on.

...Bosworth is merrily impatient with these complaints. “The things people complain about in real life, it’s like they rediscovered them on Facebook. It’s like gossip never existed before, as if your history never followed you around before. I’m not saying there’s not some differences—but these aren’t Facebook problems, they’re just fundamentally human problems.” The philosophy is simple, he says: “Humans talk. Maybe we should let them talk online.”

So “talking” is neither good nor bad. But Facebook means that what people are saying will never again be far away. Long ago, everyone was in regular physical contact with most of the people they would ever know. Everyone knew everyone’s business, but “everyone” was not many people. Then urbanisation, cramming together people from far-flung places, allowed us to vanish into the crowd. Now Facebook is mashing today’s vast crowds into the small town of old, making a world that is both exhilarating and unsatisfying, with more people than ever to keep up with, and more people than ever keeping tabs on you.

Bosworth's actually right. Relationships are developed and maintained through sustained attention and effort, not through some sort of magic resulting from physical proximity. There's no reason you can't put attention and effort into communicating with someone online, just as there's no reason that an in-person relationship can't be superficial. Different circumstances mean different aspects of the relationship come to the fore, that's all. One is not necessarily more "real" than the other, if by "real", we mean vital, substantial, meaningful.

That said, I find it extremely difficult to give what I consider adequate attention to more than a few close friends. Feeling obliged to keep up with a couple dozen would mean spreading myself too thin, and meaningful communication would get reduced to hasty, reflexive comments. Without some form of contemplative withdrawal from the incessant bombardment of stimuli, there's no time and space for scattered impulses and experiences to coalesce and develop into the kind of thoughts that might actually be interesting and worth sharing.

Even I am not completely immune to romantic nostalgia, as it turns out—I still fondly remember when the web hadn't gotten quite so standardized, when you didn't feel like your once-secluded neighborhood had become stranded in between three or four different big-box stores with eight-lane highways connecting all of them. I know, I know; it was inevitable, but there was a time when the Internet seemed like an interesting alternative to everyday, provincial life, and I loved it that way.

Monday, April 16, 2012

What Fresh Hell is This?

From The Millions blog, which I thought was my friend:

One good way to spend your Sunday: reading a 7,834-word Atlantic profile of Kanye West.

How do you spell that cartoon noise where someone shakes their head from side to side in a violent blur of disbelief? Are you fucking kidding me? If ever there were a valid need for a 140-character limit, it would be in summarizing, in its vapid entirety, the nadir of anti-talent and zenith of overweening self-importance that come together in a deafening thunderclap of out-canceling, leaving us with the utter waste of oxygen conventionally known as Kanye Fucking West.

Didn't the Atlantic used to publish Mark Twain? Spengler, did you hear about this one?

Saturday, April 14, 2012

Silently Through and Out of the World

Free spirits, those who live by knowledge alone, will soon attain the supreme aim of their life and their ultimate position towards society and State, and will gladly content themselves, for instance, with a small post or an income that is just sufficient to enable them to live... He, too, knows the weekdays of restraint, of dependence and servitude. But from time to time there must dawn for him a Sunday of liberty, otherwise he could not endure life... In his mode of life and thought there is a refined heroism, which disdains to offer itself to the veneration of the masses, as its coarser brother does, and tends to go silently through and out of the world.

- Nietzsche

You know, I believe I will incorporate this beautiful passage into my ongoing project to redefine slackerhood.

In one of my jobs, I get paid to write. It only makes me a little money right now, and it's only corporate copyspeak, but I'm okay with it. That kind of writing is sort of like a math problem—how to achieve a desired result within strict limits by using purely grammatical skills. It occupies my attention for as long as it takes to do the job, and then it's easily forgotten. I have other writer friends who find that sort of work soul-crushing. They envisioned themselves writing important novels; all they can think about while churning out press releases and newsletter content is that they must have failed somewhere along the way, and they resent every minute of it.

Maybe the difference is that I've never subscribed to the idea that your work should be life-affirming, a distillation of all that makes your heart sing. Punch the clock, do what you must to earn enough to get by, and leave it all behind when you're done for the day. I like the idea of being able to use my skills in a detached, cerebral way for mercenary reasons, while reserving a private space for the kind of writing that gives me joy, and never the twain shall meet.

A friend of mine is a fan of my writing, and he wonders why I don't use my blogging more ambitiously. Why don't I try to network, be seen around the scene, strive for more readers, put out a tip jar and maybe make some money at this? Because I am congenitally incapable of being comfortable in crowds, whether in a room or in a comment section. Because I don't want to be widely read or appreciated. Because I don't want transactional relationships, obligations and expectations getting mixed up in what is a purely selfish labor of love. Because, like Auden said, we are changed by what we change, and I already know when I'm happy.

The Sun Has Already Set

Morgan Meis:

The great symphonies of the 19th century were not inspired by science, even though they were composed in a scientific age. The secular artists of the contemporary Western world seem, likewise, to find little inspiration from science. Only but rarely do these individuals create art in homage to science. Even less frequently do they create their work out of an inspiration derived from scientific method. Perhaps this is because the mood of objectivity and distance required for scientific analysis is incompatible with the expressionistic mood necessary to create art.

It would be absurd, of course, to claim that an artist must have religious faith in order to make art. But the two states of being are fully compatible. For much of human history, the religious impulse and the art-making impulse were deeply tied together. Most of the great works of art from every civilization are testimony to this basic fact. The same cannot be said of science and no amount of fine rhetoric from Richard Dawkins or anyone else will prove otherwise. It is a thing to consider, that science does not seem to go together with the kind of wonder that moves the artists. It is an incompatibility that seems to go deeper than any question of funding or who pays for the art. Is it, actually, something deeper?

...The men who carved those statues attained an extraordinary state of religious and aesthetic contemplation. It was not science that had inspired them to get there. Take that fact as you will.

While you're considering that, perhaps consider these aphorisms from Nietzsche as well:

It is not without deep pain that we acknowledge the fact that in their loftiest soarings, artists of all ages have exalted and divinely transfigured precisely those ideas which we now recognize as false; they are the glorifiers of humanity's religious and philosophical errors, and they could not have been this without belief in the absolute truth of these errors. But if the belief in such truth diminishes at all, if the rainbow colors at the farthest ends of human knowledge and imagination fade, then this kind of art can never re-flourish, for, like the Divina Commedia, Raphael's paintings, Michelangelo's frescoes, and Gothic cathedrals, they indicate not only a cosmic but also a metaphysical meaning in the work of art. Out of all this will grow a touching legend that such an art and such an artistic faith once existed.

It is true that with certain metaphysical assumptions, art has a much greater value—if it is believed, for example, that one's character is unchangeable and that the essence of the world is continually expressed in all characters and actions. Then the artist's work becomes the image of what endures eternally. In our way of thinking, however, the artist can give his image validity only for a time, because man as a whole has evolved and is changeable, and not even an individual is fixed or enduring. The same is true of another metaphysical assumption: were our visible world only appearance, as metaphysicians assume, then art would come rather close to the real world; for there would be much similarity between the world of appearance and the artist's world of dream images; the remaining difference would actually enhance the meaning of art rather than the meaning of nature, because art would portray the symmetry, the types and models of nature. But such assumptions are wrong: what place remains for art, then, after this knowledge? Above all, for thousands of years, it has taught us to see every form of life with interest and joy, and to develop our sensibility so that we finally call out, "However it may be, life is good". This teaching of art—to have joy in existence and to regard human life as a part of nature, without being moved too violently, as something that developed through laws—this teaching has taken root in us; it now comes to light again as an all-powerful need for knowledge. We could give art up, but in doing so we would not forfeit what it has taught us to do. Similarly, we have given up religion, but not the emotional intensification and exaltation it led to. As plastic art and music are the standard for the wealth of feeling really earned and won through religion, so the intense and manifold joy in life, which art implants in us, would still demand satisfaction were art to disappear. The scientific man is a further development of the artistic man.

...Soon the artist will be regarded as a wondrous relic, on whose strength and beauty the happiness of earlier times depended; honors will be shown him, such as we cannot grant to our own equals. The best in us has perhaps been inherited from the feelings of former times, feelings which today can hardly be approached on direct paths; the sun has already set, but our life's sky glows and shines with it still, although we no longer see it.

Friday, April 13, 2012

Otium Cum Dignitate

Charles Horton Cooley:

Haste and the superficiality and strain which attend upon it are widely and insidiously destructive of good work in our day. No other condition of mind or of society—not ignorance, poverty, oppression or hate—kills art as haste does. Almost any phase of life may be ennobled if there is only calm enough in which the brooding mind may do its perfect work upon it; but out of hurry nothing noble ever did or can emerge... But ours is, on the whole, a time of stress, of the habit of incomplete work; its products are unlovely and unrestful and such as the future will have no joy in.

I came across that paragraph in Doing Nothing: A History of Loafers, Loungers, Slackers and Bums in America. Great book. I'll probably follow it up with the related topic of James Gleick's Faster: The Acceleration of Just About Everything.

My conception of slackerhood isn't really about being sluggish—I find it difficult to sleep for longer than six or seven hours; I keep a tidy house; I currently juggle four part-time jobs (which I actually enjoy, honestly); I derive satisfaction from getting tasks done, and I have a voracious appetite for books and music that will never be sated before I die. I just don't want the sort of assembly-line life that's so full of activity there's no time to do things carefully and well, without impatiently glancing ahead to the next several chores on the list. Unfortunately, common standards of success and the attendant respect of others require just such an existence, so I'll gladly accept being overlooked and condescended to by movers, shakers and would-be revolutionaries as they rush past, gettin' all carpe mundum wit' it. Go get 'em, tiger. Bum-bum-bum ba bum-bum, I feel free.

Haste and superficiality... Nothing kills art as haste does... Incomplete works, unlovely products... Come to think of it, that has a lot to do with why I'm so disgusted with the disproportionate attention lavished on social media. Prepackaged empty-calorie thoughts, made to be microwaved and wolfed down as quickly as possible. A whole lotta nobodies with nothing to say, but eager to share it as quickly and widely as possible. Feh. As Li Po said, I return to my rod and my fishing line.

Never Mind the Fat Ones, Just Go for the Slow Ones

Suzanne Goldenberg:

Meat eaters in developed countries will have to eat a lot less meat, cutting consumption by 50%, to avoid the worst consequences of future climate change, new research warns.

The fertilisers used in farming are responsible for a significant share of the warming that causes climate change.

A study published in Environmental Research Letters warns that drastic changes in food production and at the dinner table are needed by 2050 in order to prevent catastrophic global warming.

Clearly, the time is right to switch to a "humanitarian" diet. Protect the environment, solve the overpopulation problem, and enjoy a meat-rich diet, all at once.

Thursday, April 12, 2012

The Meanings We Made, Our Need

Does Facebook make us lonely by stoking the embers of our narcissistic tendencies and forcing us to engage in a 24/7 cycle of self-presentation? Well, opinions differ, but the important thing is, at least it gives us an opportunity to yammer about Facebook some more.

Maybe the clichés about the cold, impersonal, alienating nature of technology are true. Or maybe, as the philosophical minstrels Judas Priest sang, it's just the human condition to never be satisfied with what we have. Maybe you're lonely because you don't know yourself well enough to find truly complementary people to form close friendships with. Maybe it's because you have unrealistic expectations about what role other people can and should play in your life. Or maybe you're like so many other people whose lives are dominated by work to such an extent that even if you had the time to cultivate a group of meaningful relationships, you wouldn't have the slightest clue how to go about it.

The Things Which are Caesar's

Hussein Ibish:

What is most disturbing is that it is almost impossible to imagine an Islamist-influenced system protecting the religious rights of skeptics, agnostics and atheists. Blasphemy, satire, independent scholarly investigation of early Islamic history, or merely a profession of fundamental skepticism about faith in general (and not simply Islam) are all likely to remain criminal offenses. Protection for apostasy and conversion are another key test of real religious freedom.

Religious freedom was not generally well protected by the old dictatorships, and all the evidence suggests that the policing of independent thinking will intensify in the new systems. This means that there is a whole class of citizens virtually guaranteed of being denied its fundamental rights, and of being persecuted by Islamist-influenced regimes: agnostics, atheists, apostates and skeptics. Unless, of course, these individuals keep their mouths shut.

Professed commitment by Islamists to pluralism and tolerance is almost always framed in terms of faith. It seems beyond the scope of their imagination that, while people may belong to various religions, any sane person would question the very notion of religious belief, and view all religious claims with rational skepticism.

Yet without genuine religious freedom and pluralism, real freedom and equal citizenship will be illusory. What Islamists, and many other Arabs, have yet to accept is that in order for freedom of religion to be genuine, it must allow the freedom to reject faith entirely and to promote non-religious perspectives.

Islamists paying lip service to ideals of tolerance and pluralism while only accepting monotheism in practice is no more incoherent than Western Christians professing belief in the stark choice offered by the one and only son of the one true God while championing the equal validity of all religious paths; less so, actually, because at least in the Islamists' case, political opportunism doesn't directly conflict with the higher law. This is why I say that I can have a certain respect for hardcore believers; at least they're capable of following the internal logic that proceeds from accepting what they think is the ultimate truth according to the ultimate authority.

Wednesday, April 11, 2012

Got to be a Joker, He Just Do What He Please

James W. Hall:

At this point in my life and career, I simply can’t understand or abide literary snobbery. How can anyone who loves books not take heart in seeing so many new readers huddled up with a novel? Whether it’s “Harry Potter,” “The Hunger Games” or “Infinite Jest”—does it really matter? These days, when reading fiction seems like an endangered activity, why should we begrudge the success of any book, especially one that stirs such passion with younger readers?

Sure, Katniss Everdeen might be a bit less nuanced and compelling than, say, Scout Finch, another adolescent caught in a hostile and alien world. And when it comes to a story about a character fighting for survival in the war-torn wilderness, it doesn’t quite match the haunting beauty of Charles Frazier’s “Cold Mountain.” But maybe when the millions reading “The Hunger Games” have finished the trilogy and are searching the shelves for their next foray into a dystopian universe, they will discover another great bestseller of the past, perhaps Cormac McCarthy’s “The Road” or Orwell’s “Nineteen Eighty-Four.” You never know – it could happen.

That’s the beauty of reading for pleasure. When you turn the final page and shut the book, that heady blend of sadness and joy you feel can quickly ripen into a hunger for more. I like to think of bestsellers as a gateway drug. Once you’ve found one you love, books will forever hold a special allure. All comers welcome. No special education required.

I inherited my love of reading from my mom, who as far back as I can remember always had thousands of books, mostly new age stuff and pop fiction. The easy access to books and the regular trips to bookstores probably had more of a lasting, formative effect on my character than the particular deficiencies of the countless hack authors I read, so yeah, I tend to be one of those who are just glad when people perform a solitary, contemplative activity like reading.

But that's about as far as I take it. Defenders of popular taste like Hall and the kitschfinder generals who take haughty exception to his argument both seem to agree on the desirability of a progressivist ideology in literature; the critics sound like martinets in their insistence that reading is a self-improvement chore, an exercise in moral instruction and empathy, an intellectual eating of vegetables, and the apologists largely agree, hesitantly raising a finger to add only that popular novels can serve as a booster seat for those whose intellectual development doesn't yet allow them to see out into the full horizon of human potential, and as such shouldn't be completely dismissed.

One of the commenters on the article made me laugh with his purse-lipped insistence that escapist books are not harmless, because they isolate readers from having an "authentic" human experience. I would humbly suggest to the gentleman that he is the one living in a fantasy world if he thinks superficiality, falsity and escapist desires are not omnipresent, authentic aspects of everyday life among the ham-and-eggers; perhaps he should put down the classic literature and go hang out in a shopping mall for a while if he really wants to peer into the heart of the human condition.

I'd just like to see someone point out that the majority of human beings are unreflective and dull, always have been and always will be; that even professors of literature can be inept messes in their personal lives; and that maybe just maybe we should rethink our habit of yoking together morals and aesthetics. Be sincerely glad for the presence of those whose taste completely differs from yours; they provide you with more opportunity to differentiate yourself.

Tuesday, April 10, 2012

It is the Cause, It is the Cause, My Soul

John Horgan:

The man with a tumor has no choice but to do what he does. I do have choices, which I make all the time. Yes, my choices are constrained, by the laws of physics, my genetic inheritance, upbringing and education, the social, cultural, political, and intellectual context of my existence. And as Harris keeps pointing out, I didn’t choose to be born into this universe, to my parents, in this nation, at this time. I don’t choose to grow old and die.

But just because my choices are limited doesn’t mean they don’t exist. Just because I don’t have absolute freedom doesn’t mean I have no freedom at all. Saying that free will doesn’t exist because it isn’t absolutely free is like saying truth doesn’t exist because we can’t achieve absolute, perfect knowledge.

Okay, that's good, right there. Good points. Now, hold up a minute; let's just look closer at what we mean by "choice" and "free" will, and then we should be— wait, what are you...? Hey, no, don't... aww, c'mon, really, don't say th...! Aww, man...

But the strange and wonderful thing about all organisms, and especially our species, is that mechanistic physical processes somehow give rise to phenomena that are not reducible to or determined by those physical processes. Human brains, in particular, generate human minds, which while subject to physical laws are influenced by non-physical factors, including ideas produced by other minds.

...We are physical creatures, but we are not just physical. We have free will because we are creatures of mind, meaning, ideas, not just matter. Harris perversely–willfully!–refuses to acknowledge this crushingly obvious and fundamental fact about us.

Sigh. It looked so promising for a moment there.

Sunday, April 08, 2012


The fundamental faith of the metaphysicians is the faith in opposite values. It has not even occurred to the most cautious among them that one might have a doubt right here at the threshold where it was most surely necessary – even if they vowed to themselves, "de omnibus dubitandum".
It might even be possible that what constitutes the value of these good and revered things is precisely that they are insidiously related, tied to, and involved with these wicked, seemingly opposite things – maybe even one with them in essence.

- Nietzsche

Jason Gots:

Another recent Big Think guest, philosopher Alain de Botton, might disagree with the metaphysics of Buddhism, but he shares this core belief – that beneath our often horrible outward behavior toward one another, there exists a set of shared human values such as kindness, compassion, and value of children – and that our biggest challenge as a species is not losing track of them.

Of course if you believe that, at their core, people are violent and competitive and cruel, then neither argument is likely to interest you much. But if you agree that hatred, anxiety, greed, and jealousy are secondary and deeply destructive aspects of our nature, then – after survival – finding some reliable method to control or eradicate them – and thereby liberating our better angels – becomes pretty much the only worthwhile human pursuit.

Metaphors, they're so interesting. The neatness and poetic symmetry of a well-constructed one can create an intellectual glamour that we mistake for truth if we're not careful.

Like here: what does it mean to say that our positive actions and values lie "beneath" our horrible outward behavior? Why would "good" and "bad" behaviors be so conveniently two-tiered? Or again, with the idea that at our "core", we are either positive or negative, with the other being a "secondary" group of characteristics. Either way, we get the implication that originally, human nature was simpler, purer, "better". Our negative tendencies are accretions that have built up over time and need to be scraped away. Depth equates to profundity.

Simplicity and purity are often found together in these metaphors, and they usually signify truth, as does the equating of "ancient" with "wisdom". Long ago, things were pure and uncomplicated and everyone was content, but somehow... of course, all these tropes are just derivatives of one of the sturdiest myths that mankind ever invented, that of the fall from grace. There's even a hint of Gnosticism in this particular case, with the idea that our better angels are trapped in this fallen state, awaiting liberation.

Like you may have learned in geometry class, when the given you start with is flawed, none of the steps in your proof are going to make up for it. I kind of feel that way about metaphorical constructs that are so obviously deficient.

As for the ridiculous idea that good and bad are so neatly distinguished and capable of being separated in order to elevate one and eradicate the other, well, there's an old story of a Taoist farmer that's pretty good as far as myths go. Good turns to bad and bad turns to good; the changes have no purpose and no end.

Thursday, April 05, 2012

The Skin of Three Centuries

And while I shall keep silent about some points, I do not want to remain silent about my morality which says to me: Live in seclusion so that you can live for yourself. Live in ignorance about what seems most important to your age. Between yourself and today lay the skin of at least three centuries. And the clamor of today, the noise of wars and revolutions should be a mere murmur for you.

- Nietzsche

Born in 1978, I’m a millennial in name only. I’m really a Luddite. I don’t get technology, and for a long time I tried to convince myself I didn’t want to get it. My view on the latest cyber advances was lack of interest and occasionally hostility. I imagined that this rejection marked me as an iconoclast or a rugged individualist. A real man listens to Led Zeppelin and doesn’t listen to Led Zeppelin on iTunes — that sort of thing. Now, thanks to that mulishness and vanity, I feel like a clamshell of a man, outdated and struggling to communicate with the rest of my cohorts’ fancy smartphones. At the age of 33, I’ve been left behind.

...From Friendster to PDAs, iPods to Facebook, I avoided dialing up or jacking in like my jean jacket and Marlboros depended on it. It was an image cultivated to look cool. But now the only image I’m left with is a deeply uncool one. I’m missing out on cultural conversations. I’m missing out on music and videos. I’m missing out on ideas that can be fired around the globe at the speed of thought. I’m missing out on social change that’s been enabled from Tahrir Square to Zuccotti Park. I’ve never even seen an Angry Bird.

...The truth is that all the beepers and cellphones and video game systems and VHS (and DVD) “movie machines” weren’t the vain consumerist crap I pretended they were. They weren’t the passing fads of the bourgeois. They were the foundation of a language that almost everyone in my generation has learned to speak and one that younger members of our cohort were simply born knowing. It’s the language of adaptability, of being so willing to learn and discover a new device that you never need directions for it. All of this stuff was about communicating. With each other. With machines big and small. With people in other countries. Come to think of it, communication was never my strong suit, either.

Dude, if it's any consolation, you're every bit as melodramatic and emo as any teenager, so maybe there's life in your old bones yet. Settle down, stop wailing and learn what the term "Luddite" really means. In fact, look it up on that there newfangled Internet machine you use to write your monthly column on.

Histrionics aside, it sure does amuse me that anyone old enough to know better would actually think that there's nothing interesting to talk about in this great big world of ours but the latest soon-to-be-obsolete gadget, or the trendy means of conversation. Really, man? You feel isolated because you don't walk around all day with your flickering attention span confined to the alphabet soup messages floating across a palm-sized screen? Be glad. Verily, I say unto you: the world doesn't actually revolve around the relatively tiny cliques of the Internet-famous and their hangers-on, hard as that may be to believe. People aren't any more witty or intelligent on Facebook and YouTube, and it's highly doubtful that you'll be looking back in twenty years, sighing happily over fond memories of workplace small talk about this or that TV show.

Simon Says

Sean O'Neal:

And yes, of course we can give Simon the benefit of the doubt that, as always, he’s only trying to protect the integrity of his show by arguing against premature evaluation, being reductive, and distracting silliness, all of which can indeed sometimes get in the way of the message. But maybe he could do all of us the same courtesy by not presuming those who may amuse themselves with a “who’s cooler?” tangent don’t also recognize that this sort of thing is not the point of the show.

There’s room for all kinds of Wire fandom, after all, and if he’s arguing for first-time viewers to ignore those tangents and stick with a show until its entire structure can be built, it’s probably not a good idea to discourage them from even bothering by telling them to straighten up and dictating the terms of what he thinks that appreciation should be. Otherwise, it’s not a TV show you should be making; it’s a civics class.

Wednesday, April 04, 2012

Jefferson's Razor

Let's return briefly to a different part of that deluded Andrew Sullivan article:

If you go to the second floor of the National Museum of American History in Washington, D.C., you’ll find a small room containing an 18th-century Bible whose pages are full of holes. They are carefully razor-cut empty spaces, so this was not an act of vandalism. It was, rather, a project begun by Thomas Jefferson when he was a mere 27 years old. Painstakingly removing those passages he thought reflected the actual teachings of Jesus of Nazareth, Jefferson literally cut and pasted them into a slimmer, different New Testament, and left behind the remnants (all on display until July 15). What did he edit out? He told us: “We must reduce our volume to the simple evangelists, select, even from them, the very words only of Jesus.” He removed what he felt were the “misconceptions” of Jesus’ followers, “expressing unintelligibly for others what they had not understood themselves.” And it wasn’t hard for him. He described the difference between the real Jesus and the evangelists’ embellishments as “diamonds” in a “dunghill,” glittering as “the most sublime and benevolent code of morals which has ever been offered to man.” Yes, he was calling vast parts of the Bible religious manure.

History is never so precise and tidy, of course, but I'm awestruck by the extraordinary symbolism of Jefferson's action. And since I've been on the topic a couple times recently, I thought I'd elaborate on why that is.

I've been reading Stephen Greenblatt's The Swerve this week. With the historical record of the events in question being somewhat sparse, a lot of the book consists of speculation and creative writing, and as such, some of the parts that have struck me the most have been the background details that Greenblatt uses to flesh out the story — brief references to things like the arrest and execution of Giordano Bruno, or passing mentions of ordinary villagers arrested or denounced to the Inquisition for nothing more than a sarcastic joke that could be construed as heretical. The stifling intellectual climate is chilling to contemplate, and yet, it would continue for another few centuries from the point described. We all know the basic facts of the history of religious persecution, but that kind of knowledge can quickly become a superficial recitation of dry facts that cast no shadow across the mind. You really have to immerse yourself in detailed examples every so often to feel the weight of it in your bones.

As I've stressed a few times, the intellectual freedom and security that allowed someone like Jefferson to confidently trim the Bible to his liking was not a natural outgrowth of Christianity; it was a product of Enlightenment ideals giving primacy to reason. Whatever the simplification here for the sake of clear narrative, whatever the criticism of rationality that could be offered, the important point is that in the two and a half centuries since, it has become accepted as the most basic common sense that you are free to attempt to persuade others of the truth of your religious views all you want, but it's unthinkable that you should try to use coercive force in the event that you fail. Christianity had almost 1500 years to establish that state of affairs, had it been desired, and yet fire and sword were still the main instruments of persuasion. At some point, if you're honest, you have to acknowledge that this was a feature, not a bug.

The separation of church and state, enshrined in the highest law of the land, was a seismic shift, one of the most important events in Western history, as far as I'm concerned. Religion had been domesticated by the state. Christian might no longer made right. They were forced to swallow any antinomian urges they had and accept the right to existence of heretics and infidels.

Safe to say, it was a wild success and a popular one as well, so much so that tolerance is widely presumed by many progressive, educated people to have been an original component of religious belief per se, all along, and Jesus is presumed to have been the earliest proponent of just such an enlightened, non-sectarian, universal humanism, despite ample evidence to the contrary. You get the impression humankind once lived in harmonious agreement on something like Aldous Huxley's perennial philosophy, but power-hungry priests and ignorant zealots led people astray into sectarian strife, which was never what any founder of a religion intended.

Now, some have said to me, as long as the end result is a kinder, gentler society where everyone gets along, what's the harm if they're fuzzy on the details of how they got there? Why kick up a fuss about it and risk giving offense? I respond: because truth matters for its own sake. Because it's not innocently mistaken to sweep aside the unseemly parts of Christianity or the significance of the religious/secular divide, it's flat-out dishonest. Because this isn't even an irresolvable argument about the finer points of metaphysics; this is a simple case of what did and didn't happen in history, simple enough to be explained in grade-school textbooks. Because good-natured geniality predicated on mistakes and falsehoods is not as stable or enduring as I would like it to be.

Theologians and garden-variety apologists for religion frequently resort to the cuttlefish defense against the criticisms of atheists; i.e., spilling a tremendous amount of ink in the hope that their attackers will get lost in the impenetrable abstractions and convoluted knots of logic. But unfortunately for them, the basic principles of Christianity have been boiled down to their simplest form for the sake of reaching across language barriers and converting heathens the world over, and it's pretty clear what it means to be a Christian: All humans carry original sin. God sacrificed his only son to offer you a way to escape this burden. Accept this offering, proclaim Jesus as your savior, and profess your belief in the reality of his death, resurrection and eventual return. If you refuse, there will be the most severe consequences imaginable.

See, there's no wiggle room there. This is the essence of Christianity, the very thing that gives it an identity, that makes it not-Islam, not-Judaism, not Hinduism, etc. Jesus's death and resurrection were historical events that only happened once. There is no other path to free yourself from the wages of sin, and only false prophets will tell you otherwise. You cannot change these core elements without doing violence to the story. There is no honest way to square such a provincial, dogmatic worldview with the broad-minded Enlightenment inheritance we're so fond of.

Thus when I hear professed Christians like Mary Elizabeth Williams talk about respecting the equal validity of other faiths like it's a self-evident truth, along with touchy-feely postmodern glurge about everyone's personal truth being different and yet still true, I think, well, you don't appreciate what a chasm Jefferson opened up in the intellectual landscape. You're trying to have it both ways without understanding what either of them really means. You can't legitimately call yourself a Christian if you don't accept the very basic rules of membership. If you find them distasteful, if you have to remove or reimagine the core aspects to make them palatable, then have the fucking integrity to admit you've outgrown your religion, and put it away along with all other childish things. Quit dressing up the old bones in new clothes and give them a decent burial.

I respect Christianity as a worthy opponent in the Nietzschean sense and so make the effort to take it seriously even if it means the impossibility of reconciliation; it stuns me to see the insouciance with which its supposed adherents treat it.

Tuesday, April 03, 2012

Pining for the Fjords? What Kind of Talk is That?

Speaking of bridges, I fully support Chris Clarke's idea to collapse the Rainbow Bridge and replace it with a metaphysically sturdier one.

I got a different card with my dog's ashes a few weeks ago, actually. It started off with a paraphrase of Tennyson's line -- "To have loved and then said farewell is better than to have never loved at all" -- which I suppose was reworded because saying "To have loved and lost" sounds too painfully final. Euphemisms; never a good sign for intellectual honesty.

It went downhill from there. Today he is as he was in his youth (hopefully not his early puppyhood when he had parvo and served briefly as bait for dogs learning to fight). Green grass, butterflies flitting among flowers, shining sun and other assorted awesome alliterations. He awaits my arrival, of course, but knowing how worried he got whenever I was gone for even several hours, I hate to think that he's going to have to possibly wait another few decades. Plus, if it's eternal summertime there, I'm going to be pissed. I fucking hate hot weather. Are we sure this isn't hell? How is this supposed to be comforting?

Anyway. It's signed, "Your pet in heaven."

I don't mind it too terribly, even when people try to offer schmaltzy condolences in person. I don't pay close attention to whatever strangers and casual acquaintances are babbling about anyway, and grief gives me a solid excuse to be even more taciturn and aloof. The sheer awkward ineptitude of such generic, prepackaged attempts at sympathy almost makes me laugh, if anything. But I do find myself gritting my teeth when my vet, who I consider to be a friend, always whispers "Now you're at peace" to them. No, he's dead. He's not anything. No more, bereft of life, an ex-dog. Peace has nothing to do with it. There's only me and the maelstrom of emotions, warlike in their intensity, that rush to fill the sudden void.

Just Water Under the Milvian Bridge

Andrew Sullivan:

What were those doctrines? Not the supernatural claims that, fused with politics and power, gave successive generations wars, inquisitions, pogroms, reformations, and counterreformations. Jesus’ doctrines were the practical commandments, the truly radical ideas that immediately leap out in the simple stories he told and which he exemplified in everything he did. Not simply love one another, but love your enemy and forgive those who harm you; give up all material wealth; love the ineffable Being behind all things, and know that this Being is actually your truest Father, in whose image you were made. Above all: give up power over others, because power, if it is to be effective, ultimately requires the threat of violence, and violence is incompatible with the total acceptance and love of all other human beings that is at the sacred heart of Jesus’ teaching. That’s why, in his final apolitical act, Jesus never defended his innocence at trial, never resisted his crucifixion, and even turned to those nailing his hands to the wood on the cross and forgave them, and loved them.

Let's set aside the numerous examples of violence both threatened and implied for those who refused to pay heed to this prophet, as well as the curious idea that one can truly accept and love all other human beings while raging and recoiling in disgust at practically everything that makes one actually human. Because unlike so many proponents of this Jesus-as-ur-hippie rhetoric, Sullivan is at least cognizant of the impractical zealotry from whence it came:

Jesus never spoke of homosexuality or abortion, and his only remarks on marriage were a condemnation of divorce (now commonplace among American Christians) and forgiveness for adultery. The family? He disowned his parents in public as a teen, and told his followers to abandon theirs if they wanted to follow him. Sex? He was a celibate who, along with his followers, anticipated an imminent End of the World where reproduction was completely irrelevant.

Sullivan laments the politics and power that stepped in and corrupted what had been such a pure, simple message of world-renunciation and resentful apocalyptic desire, childlike in its innocence. Of course, that worldly political structure is the only reason we're even still talking about what should have been just another forgotten group of mystery cultists in the first place. The Church's consolidation of power, rather than being some regrettable aberration that interfered with "true" Christianity's natural progression toward universal acceptance, was the answer to the question of how to live in the world once it had become readily apparent that the Parousia, the entire raison d'être of Christianity, the only thing that gave it meaning and coherence, was not going to happen. Why, "misinterpreting" the message is a venerable tradition, going all the way back to, well, Constantine himself! Maybe some people are just too close to it to appreciate the wonderful irony of that.

Monday, April 02, 2012

Gazing Enraptured Into the Pool

Razib Khan:

The fact that Ahmed could write something as unhinged from reality as the concluding paragraph to his Salon piece tells us more about his own educational-cultural milieu than it does about the literature and the authors of the literature in question itself. The real pre-modern period was soaked in xenophobia and racism.

...It’s a fact that a lot of fantasy fiction assumes the existence of supernatural agents, of gods. Many of the protagonists are depicted as pious and godly, as if these are good things, rather than mental delusions. As an atheist who rejects the supernatural I wonder why there are no atheist viewpoint characters, or worlds where there isn’t a reference to supernatural agents and activities?

If I did wonder these things I’d be a narcissistic fool.

Ahahaha. I saw that "Is Game of Thrones too pale?" article and rolled my eyes, just like I did over the other one about what sort of parenting lessons you can draw from the series (no, seriously). Sites like Salon, Slate and the like publish some worthwhile stuff, but ye gads, to get to it, you have to wade through so much jejune treacle about life lessons, obsession with superficial racial/sexual diversity, and whatever other garbage that appeals to the typical milquetoast multiculti progressive helicopter parents who make absolutely everything about them.