Monday, March 31, 2014

Give Me Something to Believe In

David Goldman:

Today’s American liberalism, it is often remarked, amounts to a secular religion: it has its own sacred texts and taboos, Crusades and Inquisitions. The political correctness that undergirds it, meanwhile, can be traced back to the past century’s liberal Protestantism. Conservatives, of course, routinely scoff that liberals’ ersatz religion is inferior to the genuine article.

Joseph Bottum, by contrast, examines post-Protestant secular religion with empathy, and contends that it gained force and staying power by recasting the old Mainline Protestantism in the form of catechistic worldly categories: anti-racism, anti-gender discrimination, anti-inequality, and so forth. What sustains the heirs of the now-defunct Protestant consensus, he concludes, is a sense of the sacred, but one that seeks the security of personal salvation through assuming the right stance on social and political issues. Precisely because the new secular religion permeates into the pores of everyday life, it sustains the certitude of salvation and a self-perpetuating spiritual aura. Secularism has succeeded on religious terms. That is an uncommon way of understanding the issue, and a powerful one.

M.B. Dougherty:

Bottum's thesis is that there really isn't a new American caste. This "class" that has outsize influence on America's moral and spiritual life is roughly the same class that has always had it: Mainline Protestants, only now without the doctrinal Protestantism or the churchgoing.

Of course, on one level, the startling truth about the past 50 years of American social life is the collapse of Mainline Protestantism. In 1965, more than 50 percent of Americans belonged to the country's historic Protestant congregations. Now less than 10 percent do, and that number continues to drop. But Mainline Protestantism long existed as a column of American society, able to support the American project and criticize it prophetically at the same time. It would be even more startling if the spiritual energies it captained, and the anxieties it defined, ceased to exist the moment people walked out the door.

...The post-Protestants Bottum identifies have just that, "a social gospel, without the gospel. For all of them, the sole proof of redemption is the holding of a proper sense of social ills. The only available confidence about their salvation, as something superadded to experience, is the self-esteem that comes with feeling they oppose the social evils of bigotry and power and the groupthink of the mob."

...Can we not hear in the progressive's soul-searching examination of his own "privilege," as well as his unconscious participation in structural injustice, an echo of Rauschenbusch's words? Whereas Catholics make an examination of conscience before confession, and confess their personal sins before promising to amend their life, today's progressives examine their place in the social structure of oppression, and then vow to reform society. That is what it means to have a "social gospel without the gospel" — to be motivated by religious impulses, but believe it is entirely secular.

Saying that something is "like a religion" is like comparing statesmen to Hitler — too general to be useful or illuminating, and more likely to provoke a lot of useless tangents. Assuming that Bottum is more interested in tracing the history of secularism and liberalism in America than pursuing equivalence for its own sake, this does sound like it would be an interesting book.

Friday, March 28, 2014

You Are No Good to Anybody

Conor Friedersdorf:

In the aftermath of a gun tragedy, there isn't anything wrong with proponents of gun control trying to persuade Americans to change their position in light of what happened. But after Newtown, many gun-control advocates tried to shame rather than persuade, as if the "correct" position was obvious to everyone save retrograde idiots.

On guns, that strategy has never worked. 

How often has it ever worked on anything, basic psychology being what it is? Eh, whatever. Convincing people to change their minds was never the goal anyway. The sorts of snarky social media performances he's talking about are entirely for the benefit of the performer's peer group. Proud gun owners in this case might as well be pantomime villains or pro wrestling heels; they only exist to make the hero look better. Which brings us to the main point I wanted to make: Friedersdorf, like a lot of people who spend their professional lives online, seems to be making the mistake of treating the effluvium in his Twitter feed as representative of anything other than the useless chattering class that produces it. Some people recognize that true sociopolitical activism, especially for unpopular causes, requires patience, tact and a massive amount of exhausting, thankless, anonymous work, to say nothing of time spent actually engaging with people who disagree. Other people, well, they write anguished blog posts bitching at Madonna and maybe retweet some snark about gun owners and their sexual inadequacy. The latter aren't "advocates" of anything but their own righteousness; they aren't interested in anything but the usual insular status-seeking.

Monday, March 24, 2014

The Expression of the General Will


I know some people will assume I’m speaking to some sad fringe here. But I have been amazed at how mainstream these anti-free speech efforts have become. I have been amazed not just because of the immorality of trying to ban free thought, free expression, and free assembly, or because these efforts reverse centuries of the assumed work of the left, but because of how easily this could backfire, in a world where our movements against sexism and racism and homophobia are still so fragile and contested. Ten years ago, the Republican party ran on a platform of opposition to gay marriage, and enjoyed enormous electoral success, and yet people trust the majority so deeply that they are willing to hand it the power to ban unpopular speech. My people: we are not nearly so popular or powerful as it can sometimes seem, when we engage with those we agree with online. Sometimes, the people who are arguing against free expression know that; they recount in terrible detail all the ways in which this remains a deeply unjust world. And yet when it comes to these kinds of political debates, they seem to forget, arguing always for a retrenchment back to the already convinced, and responding angrily to the notion that it is our responsibility to argue publicly and effectively for what is right. It’s a central contradiction of this movement, and something I’ll never understand.

A right-wing British guy I know explained it as well as I've ever heard — whether through a lack of imagination or memory, or just due to impatience, people who advocate such illiberal stances always envision themselves as the ones in power, making the rules. As if on cue, one of Freddie's commenters offers up a convoluted, weasel-worded attempt to justify the heckler's veto in a private institutional setting, such as a college campus. According to this special pleading, "members of that institutional community" might have a right to "occupy" that space themselves if they object to the speech being offered. As the aforementioned Tory would sardonically expect, it apparently never occurs to our hero that "members of that institutional community" who don't agree with the mob consensus even exist, let alone deserve consideration.

Empty Free Unplugged

Casey N. Cep:

That is why, I think, the Day of Unplugging is such a strange thing. Those who unplug have every intention of plugging back in. This sort of stunt presents an experiment, with its results determined beforehand; one finds exactly what one expects to find: never more, often less. It’s one of the reasons that the unplugging movement has attracted such vocal criticism from the likes of Nathan Jurgenson, Alexis Madrigal, and Evgeny Morozov. If it takes unplugging to learn how better to live plugged in, so be it. But let’s not mistake such experiments in asceticism for a sustainable way of life. For most of us, the modern world is full of gadgets and electronics, and we’d do better to reflect on how we can live there than to pretend we can live elsewhere.

It seems like a fairly innocuous and sensible article to me. And yet, Freddie's jimmies have gotten all a-rustled by it. He seems to think that Cep and people like her are overreacting out of feeling threatened by alternative behavior, in which case I can only say, blogger, critique thyself. Really, I don't get any sense whatsoever that she's acting to stamp out heresy. Not everything is an issue of insecure powerbrokers trying to maintain their sweaty grip upon the levers of control. People are just having a conversation, dude, relax.

It's funny, because, looked at from a different angle, this would seem to be the kind of thing he usually criticizes: an impotent yet self-congratulatory little niche movement, another roundabout method of social sorting and status-seeking. Don't get me wrong; I don't have any problem with the Unpluggers myself, and I'm not accusing them of failing to live up to any revolutionary ideals. I'm just agreeing with Cep's basic question: Why bring it up? Why make a "thing" out of it? Most of us cross the digital/meatspace border back and forth on a regular basis without feeling the need to Instagram the occasion. What does it mean to call attention to it?

In other words, rather than simply shut the laptop and go work in the garden for a while with no fanfare, the Unpluggers are consciously differentiating between the two experiences. As the poets might say, they're naming the experiences in order to signify something about them. For example, walking. It's something most of us have been doing since we were toddlers and continue to do when necessary, something we don't often think about. But to someone like Wayne Curtis, walking is an ideological subject in itself. It is a very specific, purposeful action, perhaps even a way of life, with its attendant behaviors, attitudes and values. But by naming something, you're also, if only by implication, calling attention to what it is not. And so, especially for a subject about which not much of interest can be said to begin with, one easy way to create an identity is to contrast the subject with its opposite. Thus walking, rather than being an unremarkable method of locomotion, also becomes a way of contrasting oneself favorably to those who don't walk for pleasure.

What, then, are the Unpluggers signifying by the conspicuous way they go about their business? Why have they decided, like dilettante Amish, to draw the line at technology developed in the last two decades or so? These are perfectly valid questions for someone like Cep to ask. What, this is part of your identity? This is an important ritual behavior for you? Well, okay, that's fine; I'm sure the marketing department is already preparing some targeted ads for your demographic to aid in your quest for an authentic, analog experience. I hope you didn't have any higher aspirations than that, though.

Sunday, March 23, 2014

I Wish I Was Like You, Easily Amused

Saw this glorious fellow during the Real Madrid/Barcelona game and just knew there would have to be a gif of him by the time the game was done. Thank you, Internet. I can't explain why he brings me such joy; he just does. (I do believe he's calling someone a puta as well.)

Saturday, March 22, 2014

Having Glimpsed Resolution

Morris Berman:

I remember, a number of years ago, having a leisurely lunch with my then girlfriend at an outdoor café in Philadelphia. We were the only customers, and it was a nice balmy afternoon. Suddenly, some bozo runs up to the café, his cell phone rings, and he yells: “This is Joe Blow! What can I do for you?” This is what I mean by a Degraded Buffoon—a man reduced to nothing but hustling. He doesn’t say, “Hi, this is Joe Blow, how are you? What’s happening in your life?” No, it’s “Let’s do business!” Nor does it bother him to be disturbing a couple having a quiet lunch six feet away from him—fuck everybody else, I’m Joe Blow! Hard to describe how stupid he looked: crew cut, hatchet face, a bundle of tension. And I thought: yes, this is America, my friends; this rude, stupid piece of trash is who we really are.

...When I left the U.S. (thank god), I think I had a total of three or four genuine friendships, after all of those decades of living there. One thing I discovered about Americans was that they have no idea of what friendship really is, and that it does take love, endurance, and discipline (effort, in short). These are alien concepts to Degraded Buffoons. “Lightweight” is precisely the right word here. Over the years, I noticed that it was not uncommon for people to disappear from my life, and the lives of others, overnight, and without so much as a word of explanation. In a few cases this even happened after a year or two of knowing someone, having had dinners together, having had (I thought) meaningful discussions. And then: poof! They’re gone, and apparently could care less. If you live in a world of noise, cell phones, and hustling, why would any one person mean anything to you? And this is the norm, in the U.S., my friends; what I’m describing—you all recognize this—is hardly aberrant.

You probably know that I have no great affection for the human race in general. Like Louis C.K., I may have even been known a time or two thousand to hate strangers for sheer amusement and recreation. Nonetheless, seeing what a relentlessly vituperative fellow Berman has become, I was reminded of a line from one of Nietzsche's letters to his friend Peter Gast where, feeling despondent and lonely after the end of his friendship with Richard Wagner, he confessed that "even now the whole of my philosophy totters after one hour's sympathetic intercourse even with total strangers! It seems to me so foolish to insist on being in the right at the expense of love..."

It's funny to entertain the notion that Nietzsche's philosophical brilliance could possibly have been the result of a sublimated need for love and camaraderie, but there you go. Likewise, many of the ways in which people present themselves to the world are nothing but brave faces and compensatory gestures. I suspect that Berman might find himself similarly chastened if he were to actually take the time to get to know people like Joe Blow rather than harshly judging them by a few superficial characteristics gathered in one moment's acquaintance. I mean, goddamn, I like to think of myself as at least a mild misanthrope, and I've had friendships where the burden seemed to be entirely on me to keep the correspondence going, but this ranting old coot makes me feel positively touchy-feely by comparison. Sometimes those friends who seem to vanish from your life without a care were actually suffering in some way and afraid to make themselves vulnerable and ask for help, and you'll never realize that if you're too busy dwelling on how much they've let you down or what they supposedly owe you. An Auden couplet provides a good rule of thumb: "If equal affection cannot be/Let the more loving one be me." Berman would apparently rather be "right" than forgiving of others' faults, and you don't have to be an aspiring bodhisattva to see what a sad thing that is.

Of course, we're talking about a guy whose latest book is Why America Failed, the third part of a trilogy lamenting/gleefully excoriating our cultural inability to exist in the Romantic/Luddite mishmash of his historical imagination. The historian Steven Ozment accurately described the mindset of those who get too attached to such ideals:

The belief that momentary feelings of unity or visions of perfection can survive permanently into everyday life this side of eternity is the ante-room of nihilism and fascism. Such beliefs give rise to ahistorical fantasies, which can never materialize beyond the notion. To the extent that they are relentlessly pursued, they progressively crush the moments of solace that precious moments of grace can in fact convey. Historically such fantasies have spawned generations of cynics, misanthropes and failed revolutionaries who, having glimpsed resolution, cannot forgive the grinding years of imperfect life that still must be lived.

Thursday, March 20, 2014

But I'll Break Before I'll Bend


Honestly, I shudder to think sometimes that I could have been one of those alarmingly numerous people who never grow out of sixth-form leftism – who decide early in life that they have all the theories they need and no further reflection is required.

Two noteworthy things that have repeatedly appeared in Left-wing paeans to Tony Benn: that he had ‘unshakeable beliefs’; and the idea, as one Tweeter put it, that “principle always outshines policy”.

So: a refusal to change one’s mind no matter what the evidence; and a belief that principles and ideals are separable from, and more important than, practicalities and consequences. Can anybody think of any modes of human thought that have led to more suffering and murder than those two?

Eighteen-year-old me would have probably been awed by such steadfast conviction. Current me is just repulsed by it. Arthur and I were just commiserating over our various experiences with fanatical ideologues and how they'd soured us on radicalism (it was amusing to try to fill in a complete outsider on what's gone on in the online atheist environment). He said:

It sounds to me as if you've given yourself a good healthy Oakeshott in the arm, with a bit of Burke and Berlin on the side, to fortify you against the arrogant theoreticians who pull abstract utopias out of their butts and try to impose them on everyone else. That's pretty much where I'm at. So our guideposts, in the absence of any divine sanction, are a hopeless mix of tradition, intuition, taste, personality quirks, common sense (whatever that is) and the survival instinct, individual and collective--with some Plato and Nietzsche thrown in to make it all seem intellectually respectable. Nothing to write home about, but look at what the alternatives have been.

Wednesday, March 19, 2014

So Play Is Worse Than Work?

Heather Havrilesky:

The vigorous exhortation to “play” now haunts every corner of our culture. Typically issued as an imperative along with words like breathe and meditate and dance and celebrate, the word play, in its catchall generic form, has a curious way of repelling the senses, conjuring as it does all manner of mandatory frivolity, most of it horribly twee and doggedly futile. Yet Johan Huizinga, the Dutch cultural theorist who tirelessly examined “the play element in culture,” asserted that the one defining feature of play is that it’s voluntary. “Play to order is no longer play,” he declared flatly. “It could at best be a forcible imitation of it.”

...A second-order definition of play, Huizinga notes, is its close correspondence to the serious adult activities of work. “Play must serve something which is not play,” he observes—which is why so many children’s pastimes openly mimic adult pursuits, from the near-universal rituals of doll nurture to games that reenact the aims and provisional alliances of war-making.

But in a consumer culture committed to prolonging adolescence at all costs, the boundaries demarcating child and adult experience have blurred to the point that it’s no longer obvious just who is imitating whom. The American state of play is terminally confused. Much of it feels grimly compulsory, and carries with it a whiff of preemptive failure to achieve the target level of revelry. 

Monday, March 17, 2014

Consistently Inconsistent In Insisting

Terry Eagleton:

Nietzsche sees that civilization is in the process of ditching divinity while still clinging to religious values, and that this egregious act of bad faith must not go uncontested. You cannot kick away the foundations and expect the building still to stand. The death of God, he argues in The Gay Science, is the most momentous event of human history, yet men and women are behaving as though it were no more than a minor readjustment. Of the various artificial respirators on which God has been kept alive, one of the most effective is morality. “It does not follow,” Feuerbach anxiously insists, “that goodness, justice and wisdom are chimeras because the existence of God is a chimera.” Perhaps not; but in Nietzsche’s view it does not follow either that we can dispense with divine authority and continue to conduct our moral business as usual. Our conceptions of truth, virtue, identity, and autonomy, our sense of history as shapely and coherent, all have deep-seated theological roots. It is idle to imagine that they could be torn from these origins and remain intact. Morality must therefore either rethink itself from the ground up, or live on in the chronic bad faith of appealing to sources it knows to be spurious. In the wake of the death of God, there are those who continue to hold that morality is about duty, conscience, and obligation, but who now find themselves bemused about the source of such beliefs. This is not a problem for Christianity—not only because it has faith in such a source, but because it does not believe that morality is primarily about duty, conscience, or obligation in the first place.

Dear gods, what a septic tank of fallacious reasoning. Look, I think Nietzsche's insistence on the "bad faith" of humanists in thinking that they can just carry on with business as usual after God's death is one of his weakest points, perhaps even betraying the love-portion of his love/hate relationship with Socrates (and I don't find it at all surprising that Christians like Eagleton are so fond of gleefully repeating it; it lends credence to their favorite false dichotomy of "either monotheism or nihilism!"). Think of it from a lowercase-c conservative perspective: this way of life we have, this cultural morality, however it may have developed, it just works. However bass-ackwardly we reasoned our way into it, we've kept with it because it seems to suit our most pressing needs, and we adjust it as needed rather than throw it away and start over from scratch. Why is it a problem if we don't have a consciously rational justification for every single bit of it? Isn't that what Nietzsche attacked Socrates for in a different context? For acting as if only conscious knowledge is meaningful or noble? As Auden said, we are changed by what we change — the reasons we fall in love with our spouses may not be the most important things about why we're still with them thirty years later, while many other positive facets of the relationship only revealed themselves with time and experience; that doesn't invalidate the original impulses or the evolution of the project. I don't see why cultures should be any different. Maybe we originally behaved this way because we thought somebody named God said to do so, but maybe we decided along the way that there were good pragmatic reasons for it too. The world evolves; it doesn't proceed like steps in a geometric proof. Some stages will look like neither fish nor fowl, and it's only ever a small percentage of neurotic intellectuals who will tie themselves in anguished knots over the proper taxonomy. Feuerbach wasn't anxious; you're just projecting, and Nietzsche was always melodramatically distraught after a breakup, whether with a Russian blonde or a deity.

Arthur and I were sharing a laugh at Eagleton's article. I sent him the above paragraph, and he responded with this:

Yes, Nietzsche is being inconsistent, as he consistently is, in insisting that we have to take the death of God to its rigorous logical conclusion and change everything within and without to accommodate the catastrophic knowledge that we're all alone out here. Elsewhere, as you know, he ridicules rationalism and logical consistency and claims life can only be justified as an aesthetic phenomenon (and in this connection Wilde is appropriately mentioned by Eagleton). On the other hand, "Poets lie too much." Wait a minute, Freddy, didn't you just say that truth and lies are only metaphors, that the point is not to discover truth but to create values? Which is it going to be, art or philosophy, creativity or knowledge? The answer depends on which book, or chapter, or page of Nietzsche you're reading. But you gotta love the man's style. He was outrageous, and that's saying something, especially for a philosopher. He is an aristocratic anarchist, like Yeats, who was a sort of disciple. I'm fine with that, as long as the aristocratic part doesn't mean knouting poachers and running over peasants in your coach and six. There I draw the line. But on what grounds? Religious? Moral? Prudential? (I don't particularly want to spend time in jail.) Is it possible that we have stared too long into the Abgrund, the abyss over which we tightrope-walk every day while juggling philosophical interpretations of what we are doing, and the abyss has stared back, and we all now carry within us the haunted emptiness of an abandoned mansion that we used to call Soul? Naah...

Marginal Practices

Razib Khan:

Does any of this matter? Why is it that humanists have to judge their intellectual forebears by the standards of a modern Oberlin seminar? Would any of us withstand critique and deconstruction a generation down the line? Instead of grappling with the ideas, it seems that in much of the humanities there is grappling with personalities who can no longer argue, and inveighing against ages long dead. I can compute Pearson’s correlation coefficient without being troubled by Karl Pearson’s socialism and white supremacism. Obviously it is too much to ask the humanities to view their intellectual production in a similar manner, but it strikes me that they have gone too far down the road of putting the dead through ghostly show trials meant to solidify conformity in the ranks. As I stated on Twitter, the problem with fashionable intellectuals is that they need to be careful not to outlive the fashions of their age.

He's talking about a Slate article that frets over whether Heidegger is fit to be taught in polite philosophical company. Meh. I've never found him all that worthwhile, and as for Slate, well, I mean, you know, when you go visit that site and scroll down through the latest godawful redesign, one of the most prominent features on the entire page is the link to their voyeuristic advice column (judging by the most-shared list, the most popular feature of the site) where people find help with such pressing existential dilemmas as what to do when your boss poops in the office shower (yes, seriously, that was the topic du jour last time I was there; no, I'm not digging up the link, go find it yourself, you disgusting coprophiliac). What I'm saying is, you kind of get a sense of the demographic they're pitching to, and thus you shouldn't be surprised that they would produce an article tackling the equally pressing issue of whether a long-dead philosopher is merely a bad man for having joined the Nazi party, or a bad, bad man for having expressed anti-Semitism in his recently-discovered private letters.

According to Peter Watson, though, Heidegger at least had some redeemable insights:

Heidegger also had the idea of "marginal practices," what he called the "saving power of insignificant things — practices such as friendship, backpacking in the wilderness, and drinking the local wine with friends." This was his idea of "radical pastoralism." All these things remain marginal, he maintained, "precisely because they resist efficiency." They remain outside — beyond — the reach of the modern attitude. This is not quite true, of course — backpacking can be harnessed to our concern with health and training, making us more efficient in that way. But Heidegger meant "marginal practices" to be refuges from modern life, and used them as metaphors for his approach.

Saturday, March 15, 2014

Television, You Got a Vision

Let me try to intuit what a traditional conservative would take away from this level of watching television. “The people have lost their moral center, and lack appreciation for the edifying arts of yore, debasing themselves to partake of the passive hedonism of our fallen age.” Perhaps I stated it pompously, but I suspect you get the picture. What about a liberal? “The people lack the disposable income to avail themselves of the outdoors and finer pleasures of life, and so must make do with the accessible joys of television.” In other words, for the conservative the passive television watching public have missed the mark of their own free will, they have sinned against what their life was meant to be. For the liberal television occupies the role in the lives of the proletariat it does because they lack the economic wherewithal to enjoy all the finer things they obviously must yearn for.

But there may be a different answer. The people watch television because they prefer television to what the cultural elites, Left and Right, would term the “higher arts.” The soul of man is not noble, and it is not made in the image of a divine being on high. It is that of a squalid savanna ape rutting in the open and greedily thrusting sweets into its mouth until waves of satiety wash over its corpulent physique.

Friday, March 14, 2014

We, Too, Are Still Pious

John Gray:

There can be little doubt that Nietzsche is the most important figure in modern atheism, but you would never know it from reading the current crop of unbelievers, who rarely cite his arguments or even mention him. Today’s atheists cultivate a broad ignorance of the history of the ideas they fervently preach, and there are many reasons why they might prefer that the 19th-century German thinker be consigned to the memory hole. With few exceptions, contemporary atheists are earnest and militant liberals. Awkwardly, Nietzsche pointed out that liberal values derive from Jewish and Christian monotheism, and rejected these values for that very reason. There is no basis – whether in logic or history – for the prevailing notion that atheism and liberalism go together. Illustrating this fact, Nietzsche can only be an embarrassment for atheists today. Worse, they can’t help dimly suspecting they embody precisely the kind of pious freethinker that Nietzsche despised and mocked: loud in their mawkish reverence for humanity, and stridently censorious of any criticism of liberal hopes.

Ahahaha, it's almost like Gray's been reading the atheist blogosphere. Obviously, there's been a certain faction of New Atheism who have devoted themselves in recent years to evangelizing for the good news of a supposedly necessary logical connection between godlessness and New Left identity politics, and to whom this passage fits like a comfortable shoe. Here's an amusing blast from the past — I didn't realize the implications at the time, but Adam Lee, whom we last saw making an utter fool out of himself by serving as the hype man for the embarrassing abortion of a media phenomenon known as Atheism+, was already making identitarian noises in this post from four years ago, in which he displayed such a complete ignorance of Nietzsche's philosophy that I was compelled to set aside my usual rule about not commenting at other people's blogs long enough to voice my objection (which went unacknowledged).

Boss, Just Make Sure I Get My 50% of 50% of 50%

Thomas Frank interviewing Adolph Reed:

And you use the word “egalitarian.” That’s sort of what’s completely missing today. All of these victories on these other fronts, largely matters of identity politics, and where is the egalitarian left?

Right, and my friend Walter Michaels has made this point very eloquently over and over again . . .  that the problem with a notion of equality or social justice that’s rooted in the perspectives of multiculturalism and diversity is that from those perspectives you can have a society that’s perfectly just if less than 1 percent of the population controls 95 percent of the stuff, so long as that one percent is half women and 12 percent black, and 12 percent Latino and whatever the appropriate numbers are gay. Now that’s a problem.

I agree with those who think the egalitarian left is moribund and unlikely to make a comeback. The identitarian left is predominant, largely, I think, because it's much easier to do politics that way. Take the example from just the other day. Will any of these recent journalism startups amount to a genuine challenge to corporate media? Only time will tell, but at least people like Greenwald and Taibbi are trying to make it happen. The whole venture may end up co-opted and toothless, but either way, having an informed opinion about its progress will, as always, require attention, consideration and effort. An identitarian like Emily Bell, though, already knows, at a mere glance, all she needs to know. It's just a bunch of white men, she sniffs. Nothing positive could possibly come of that. Likewise, there's no need to consider vexing details about Arianna Huffington's business model when all you need to know is that she's a media mogul with ladyparts, so she's a checkmark in the win column!

Much like their "radical" brethren in academia, who are content to rule their petty fiefdoms, the identitarians have a comfortable niche in popular media where they can produce a steady supply of simplistic clickbait. Personally, I think they'd be just as threatened by the possibility of genuine radical change as any conservative, and I suspect they know that, too.

Thursday, March 13, 2014

You Just Roll With It, Baby

Key and Peele:

To not make fun of something is, we believe, itself a form of bullying. When a humorist makes the conscious decision to exclude a group from derision, isn’t he or she implying that the members of that group are not capable of self-reflection? Or don’t possess the mental faculties to recognize the nuances of satire? A group that’s excluded never gets the opportunity to join in the greater human conversation.

At work today, I listened to a group of white and black people joking around. A black guy talked about his love of Popeye's fried chicken and joked that the reason there weren't any franchises nearby is because the residents think "we've already got enough black people around here!" A white guy talked about walking into a Waffle House in Atlanta and wondering why everyone was staring at him. "Oh. I'm the only white guy here. Yeah, guess I'll go ahead and leave, then." A black guy talked about being pranked at an event by a friend who set it up so that the (white) band stopped playing and all the (white) guests turned to stare at him when he walked in.

I guess it's a sign that I've spent too long on the Internet that I was struck by how friendly and easygoing everyone was, even while discussing potentially sensitive topics. I laughed to myself, thinking about how your average twitosphere SJW would have been having an absolute hyperventilating shitfit while trying to monitor and regulate the conversation to their fragile standards.

Wednesday, March 12, 2014

Prorated Equal

The funniest thing, in my opinion, about this paint-by-numbers Guardian column kvetching about the lack of gender diversity among the recent spate of journalism startups is the invocation of Arianna Huffington as some kind of revolutionary trailblazer. Well, as Emma Goldman would no doubt have apocryphally said, if your revolution isn't being led by a ditzy heiress who's built a fortune on the backs of countless unpaid writers, I don't want it. I mean, my god, I think I've seen teenage virgins on Reddit who aren't as awestruck by the mystical powers of female genitalia as feminists like Bell are.

Besides, here's a prominent woman of color to tell you and your superficial notions of diversity to go fuck yourselves.

Ain't That Buff Enough

Eliana Dockterman:

The average guy wants 15-27 more pounds of muscle and a three to four percent decrease in body fat. And a new study published in JAMA Pediatrics in January found that 18 percent of boys are very concerned about their weight and physique. Failure to attain these unrealistic body goals can lead to depression, high-risk behaviors (like drinking and drugs) and eating disorders. Though about 15 percent of boys concerned with their weight are worried about thinness, about half are concerned with gaining more muscle and an additional third are concerned with both muscle gain and thinness.

Many of these changes are thanks to media images—and the 300 movie series is leading the way in the promotion of unrealistic male body standards (buttressed by video games and clothing ads featuring scantily clad men).

Now I feel guilty for admiring my abs after a round of crunches the other day!

I've put on about ten pounds of muscle since last fall when I started lifting weights regularly again. I mix that with treadmill walking (got a new one, no thanks to you cheapskates) and running through old soccer drills outside, with some yoga and Pilates thrown in as needed. Still, if any of those angsty young men are reading this, looking for advice, I'd say the same thing as before — it doesn't really matter what exactly you do, just make sure to cultivate your stick-to-itiveness and do it consistently. Slow and steady and all that.

Sunday, March 09, 2014

Doomsday Averted

The lack of recognition was unmerited; Hong apparently captured the workings of the Malthusian trap better than Malthus. (I use the hedge word "apparently" because he never worked out the details.) The Englishman's theory made a simple prediction: more food would lead to more mouths would lead to more misery. In fact, though, the world's farmers have more than kept pace. Between 1961 and 2007 humankind's numbers doubled, roughly speaking, while global harvests of wheat, rice and maize tripled. As population has soared, in fact, the percentage of chronically malnourished has fallen — contrary to Malthus's prediction. Hunger still exists, to be sure, but the chance that any given child will be malnourished has steadily, hearteningly declined. Hong, by contrast, pointed to a related but more complex prospect. The continual need to increase yields, Hong presciently suggested, would lead to an ecological catastrophe, which could cause social dysfunction — and with it massive human suffering.

Exactly this process is what researchers mean today when they talk about the Malthusian trap. Indeed, one way to summarize today's environmental disputes is to say that almost all boil down to the question of whether humankind will continue to accumulate wealth and knowledge, as has been the case since the Industrial Revolution, or whether the environmental impacts of that accumulation — soil degradation, loss of biodiversity, consumption of groundwater supplies, climate change — will snap shut the jaws of the Malthusian trap, returning the earth to pre-industrial wretchedness.

Friday, March 07, 2014

History Never Repeats Itself but It Rhymes

The world has become much more economically interconnected since the last global war. Economic cooperation treaties and free trade agreements have intertwined the economies of countries around the world. This has meant there has been a huge rise in the volume of global trade since World War II, and especially since the 1980s.

Today consumer goods like smartphones, laptops, cars, jewelry, food, cosmetics, and medicine are produced on a global level, with supply-chains criss-crossing the planet. An example: The laptop I am typing this on is the culmination of thousands of hours of work, as well as resources and manufacturing processes across the globe. It incorporates metals like tellurium, indium, cobalt, gallium, and manganese mined in Africa. Neodymium mined in China. Plastics forged out of oil, perhaps from Saudi Arabia, or Russia, or Venezuela. Aluminum from bauxite, perhaps mined in Brazil. Iron, perhaps mined in Australia. These raw materials are turned into components — memory manufactured in Korea, semiconductors forged in Germany, glass made in the United States. And it takes gallons and gallons of oil to ship all the resources and components back and forth around the world, until they are finally assembled in China, and shipped once again around the world to the consumer.

In a global war, global trade becomes a nightmare. Shipping becomes more expensive due to higher insurance costs, and riskier because it's subject to seizures, blockades, ship sinkings. Many goods, intermediate components or resources — including energy supplies like coal and oil, components for military hardware, and so on — may become temporarily unavailable in certain areas. Sometimes — such as occurred in the Siege of Leningrad during World War II — the supply of food can be cut off. This is why countries hold strategic reserves of things like helium, pork, rare earth metals and oil, coal, and gas. These kinds of breakdowns were troublesome enough in the economic landscape of the early and mid-20th century, when the last global wars occurred. But in today's ultra-globalized and ultra-specialized economy? The level of economic adaptation required to weather a world war — even for large countries like Russia and the United States with lots of land and natural resources — would be crushing, and huge numbers of businesses and livelihoods would be wiped out.

In other words, global trade interdependency has become, to borrow a phrase from finance, too big to fail.

There must be an infectious strain of forced optimism going around lately, as we just saw a couple of people earlier this week come down with a similar overwhelming urge to tell themselves what they desperately want to hear. Ah, well. Give this poor sap his medicine, Margaret MacMillan:

Globalization—which we tend to think of as a modern phenomenon, created by the spread of international businesses and investment, the growth of the Internet, and the widespread migration of peoples—was also characteristic of that era. Made possible by many of the changes that were taking place at the time, it meant that even remote parts of the world were being linked by new means of transport, from railways to steamships, and by new means of communication, including the telephone, telegraph, and wireless. Then, as now, there was a huge expansion in global trade and investment. And then as now waves of immigrants were finding their way to foreign lands—Indians to the Caribbean and Africa, Japanese and Chinese to North America, and millions of Europeans to the New World and the Antipodes.

Taken together, all these changes were widely seen, particularly in Europe and America, as clear evidence of humanity’s progress, suggesting to many that Europeans, at least, were becoming too interconnected and too civilized to resort to war as a means of settling disputes. The growth of international law, the Hague disarmament conferences of 1899 and 1907, and the increasing use of arbitration between nations (of the 300 arbitrations between 1794 and 1914 more than half occurred after 1890) lulled Europeans into the comforting belief that they had moved beyond savagery.

The fact that there had been an extraordinary period of general peace since 1815, when the Napoleonic wars ended, further reinforced this illusion, as did the idea that the interdependence of the countries of the world was so great that they could never afford to go to war again. This was the argument made by Norman Angell, a small, frail, and intense Englishman who had knocked around the world as everything from a pig farmer to a cowboy in the American West before he found his calling as a popular journalist. National economies were bound so tightly together, he maintained in his book, The Great Illusion, that war, far from profiting anyone, would ruin everyone. Moreover, in a view widely shared by bankers and economists at the time, a large-scale war could not last very long because there would be no way of paying for it (though we now know that societies have, when they choose, huge resources they can tap for destructive purposes). A sensational best-seller after it was published in Britain in 1909 and in the United States the following year, its title—meant to make the point that it was an illusion to believe there was anything to be gained by taking up arms—took on a cruel and unintended irony only a few short years later.

What Angell and others failed to see was the downside of interdependence.

Tumbleweed, Sow the Seed

Dougald Hine:

When the internet arrived, it seemed to promise a liberation from the boredom of industrial society, a psychedelic jet-spray of information into every otherwise tedious corner of our lives. In fact, at its best, it is something else: a remarkable helper in the search for meaningful connections. But if the deep roots of boredom are in a lack of meaning, rather than a shortage of stimuli, and if there is a subtle, multilayered process by which information can give rise to meaning, then the constant flow of information to which we are becoming habituated cannot deliver on such a promise. At best, it allows us to distract ourselves with the potentially endless deferral of clicking from one link to another. Yet sooner or later we wash up downstream in some far corner of the web, wondering where the time went. The experience of being carried on these currents is quite different to the patient, unpredictable process that leads towards meaning.

The latter requires, among other things, space for reflection – allowing what we have already absorbed to settle, waiting to see what patterns emerge. Find the corners of our lives in which we can unplug, the days on which it is possible to refuse the urgency of the inbox, the activities that will not be rushed. Switch off the infinity machine, not forever, nor because there is anything bad about it, but out of recognition of our own finitude: there is only so much information any of us can bear, and we cannot go fishing in the stream if we are drowning in it.

Although it's been said many times, many ways...

Thursday, March 06, 2014

The Moralizing of the Story

Et tu, Rebecca Goldstein?

People take literature seriously, especially in moral philosophy, as thought experiments. A lot of the most developed and effective thought experiments come from novels. Also, novels contribute to making moral progress, changing people’s emotions.

Right—a recent study shows how reading literature leads to increased compassion.

Exactly. It changes our view of what’s imaginable.

The poor panda just can't take much more of this.

This made him laugh, though.

She Wasn't Born With Enough Middle Fingers

So, when the atheist patriarchy has finally been dismantled, what sort of intellectual gravitas can we expect from the progressive revolutionaries who replace them? How do they plan to fill the void left by such intellectuals as Dawkins, Harris and Hitchens? By pretending that unglamorous selfies subvert the dominant paradigm or some such thing. I swear, it would take an actual misogynist to script this sort of thing.

Wednesday, March 05, 2014

Everybody's Someone Else's Trigger

Jenny Jarvie:

Trigger warnings are presented as a gesture of empathy, but the irony is they lead only to more solipsism, an over-preoccupation with one’s own feelings—much to the detriment of society as a whole. Structuring public life around the most fragile personal sensitivities will only restrict all of our horizons. Engaging with ideas involves risk, and slapping warnings on them only undermines the principle of intellectual exploration. We cannot anticipate every potential trigger—the world, like the Internet, is too large and unwieldy. But even if we could, why would we want to? Bending the world to accommodate our personal frailties does not help us overcome them.

Michael J. Kramer:

Many at the time missed the point of Lasch’s dark, brooding analysis, which applied psychoanalytic theory to the broader cultural setting of American life, arguing not so much that Americans had grown self-involved during the so-called “Me Decade” as that the modern institutions of what he called the “therapeutic state” and of consumer capitalism had infantilized them. The term “narcissism” meant more than simply self-involvement for Lasch: it indicated a frail sense of self, weak ties to one’s community and feelings of despair. The result, Lasch suggested, was a population of clinical narcissists, oscillating between outsized fantasies of their own grandiosity—dreaming of their own celebrity—and recurring anxieties about even getting by.

I just found it amusing to read these articles in direct succession.

Tuesday, March 04, 2014

Where Do We Go, Oh, Where Do We Go Now

Michael Dirda:

If, as Nietzsche proclaimed, God is kaput, then what? In a highly readable and immensely wide-ranging work of intellectual history, Peter Watson surveys and summarizes the various answers to this question that have been proposed during the past 125 years. “The Age of Atheists” is, in effect, an account of 20th-century philosophical and moral thought, focused, as its subtitle explains, on “how we have sought to live since the death of God.”

Bear that subtitle in mind because it might otherwise be easy to mistake the character of Watson’s book. This is neither a polemic about the horrors of traditional religion nor an apologia for a rationalistic, scientific attitude to our place in the universe. You can go back to the work of Christopher Hitchens, Richard Dawkins or S.T. Joshi for sustained arguments about the virtues of atheism. What interests Watson is the proposed alternatives to religion, those systems, aesthetic beliefs and modes of life that have taken, or might take, its place. He thus ranges from Marx and Freud and Max Weber through symbolism and surrealism; describes theosophy, Bloomsbury, phenomenology, Nazi ideology and existentialism; discusses self-improvement and Samuel Beckett, as well as sex, drugs and rock-and-roll.

My ardor has dimmed somewhat when it comes to "intellectual" as a concept and identity, but Watson's writing made me aware that intellectual history was an actual field, and also introduced me to the Journal of the History of Ideas. I'm grateful on both counts.

Monday, March 03, 2014

Whistling Past the Graveyard

Rossa Minogue:

Historically, when a resource has been depleted, more often than not, societies have found an alternative rather than simply choosing to commit societal hara-kiri, as Diamond would have us believe. In Europe, when forests could no longer meet our fuel needs, we moved on to coal, and then on to oil, and someday we will move on to something else.

Alexander Richey:

I do not mean to suggest that the western world is doing fine or to minimize the significance of our current environmental and economic problems. On the contrary, these problems are substantial, if not epoch defining. All I mean to say is that, even though these problems are enormous, they do not justify the kind of pessimism expressed by Franzen, Mamet, and others. All of our problems are soluble and will indeed be solved.

Oh. Well, then. Simple and certain. That's sure a relief!