Wednesday, May 29, 2013

Look Away, Look Away

Sue Shellenbarger:

Adults make eye contact between 30% and 60% of the time in an average conversation, says the communications-analytics company Quantified Impressions. But the Austin, Texas, company says people should be making eye contact 60% to 70% of the time to create a sense of emotional connection, according to its analysis of 3,000 people speaking to individuals and groups.

...Culture can be a factor. In many Eastern and some Caribbean cultures, meeting another's eyes can be rude. Asians are more likely than Westerners to regard a person who makes eye contact as angry or unapproachable, says a 2013 study in the online scientific journal PLOS ONE.

Ah, that's it! I'll start describing myself as trans-Navajo. It's a deep-rooted cultural tradition of mine to not look people in the eye when talking, you imperialists!

Tuesday, May 28, 2013

Teenage Angst Hasn't Paid Off Well

Mandi Woodruff:

For the most part, generations in America have followed a similar storyline: kids grow up to earn more and become more financially secure than their parents and grandparents before them. But that story ended with those born between 1965 and 1981, known as Generation X.

Using charts from recent studies on generational wealth gaps by the Pew Research Center and the Urban Institute, we've put together a clearer picture of what's gone wrong for Gen X.

So I guess the question is whether we should kill off our parents and seize their assets, or beg for charity from our younger siblings. Or both.

Monday, May 27, 2013

We Got Spirit, Yes We Do, We Got Spirit, How 'Bout You?

David Ferguson:

Humanist groups and atheists across the U.S. have banded together to help a fellow atheist who survived the massive tornado that struck Moore, Oklahoma earlier this week.

...At the fund-raising website Indiegogo, atheists have set up a relief fund for Vitsun, who lost her house in the storm and most of her possessions.

“Let’s show the world that you don’t need to believe in a god to have human compassion nor does all charity fall under the banner of religion,” says the site. “Let’s get this courageous woman and her family back in their own home.”

The site has already passed its goal of $50,000 by nearly $700 at press time, with 60 days in the fund-raiser still to go.

Additionally, their site says:

The impact of getting Rebecca and her family properly housed by the atheist community will do far more good than sitting in bars or chat rooms mocking people of faith. Like religion, free-thinking will be more easily spread through compassion and decency. ...More importantly, the more money we raise the better the example we set.

Yes, how inspiring that atheists can use natural disasters as opportunities for public relations and point-scoring just as well as any other ghoul who resolutely gazes into swirling chaos in order to discern the potential for increased market penetration. If you're anything like me, you may have been concerned that atheists, being so über-rational, might lack that reflexive tribal solidarity that has served humanity so well through the ages, so this will come as a welcome relief. Send those checks to help give a big ol' middle finger to all those religious nuts and their antiquated in-group/out-group thinking!

Humans in the Raw

Cullen Murphy:

The United States of America in the twenty-first century is about as different from late-medieval Spain as a country can be. And yet a controversy during the summer of 2010 demonstrated how little effort is required to whip up popular fervor on issues of "otherness."

...Within a matter of weeks, birthright citizenship had moved from something that people took for granted to something that, according to opinion polls, nearly half of all Americans had decided they opposed. In January 2011, a group of state legislators unveiled a proposal to create what some described as a two-tiered system of birth certificates, one tier for babies born to citizens, the other for children of illegal immigrants. Shortly afterward, two U.S. senators proposed a constitutional amendment that would deny birthright citizenship outright to children born to illegal aliens, regardless of the consequences. As one commentator pointed out, "Without the concept of birthright citizenship, it is possible for someone to be born without having citizenship in any country at all."

The point is not to make a facile comparison between incomparable regimes. It is simply to note that dangerous passions — about social contamination, about religious incursion — can be found anywhere. It does not take much to arouse them.

Andrew Wright, reviewing John Gray's newest book:

In a nutshell his argument runs that there's no redemption for the human animal, in fact, we delude ourselves if we think we are anything beyond the animal. We are, he argues, violent, brutish and barbaric.  And whilst our 21st century lives in developed nations are no longer dominated by these three forces, it is deep-written in our nature to have these forces just below the surface and that fact is as immutable as the mountains. Whilst our technology and science progress and develop our basic character and nature cannot. We are not improving as a species, becoming less violent or nasty and inexorably moving on to some dreamy Star Trekkian future of benign benevolence. We are doomed to repeat the barbarism of history, civilisation only paper-thin.

...To prove his point Gray takes us to humans in the raw, to Naples in 1943 where starvation and disease took a modern city and tossed it back into the dark ages and people fought to live in the most desperate of conditions. In such terrible circumstances anything goes and the civilised life is seen to be the myth it is. Humans are animals that will do anything to remain alive, ditching values and philosophies long held simply to keep their hearts beating.

As I said it is a bleak and honest analysis and it is a powerful warning about human hubris, urging us to be ever humble and watchful for our own "nasty" instincts. Our institutions, our mythological civilised life where many of us have the privilege to live - where indeed John Gray himself gets the opportunity to de-construct it all - can and often is, washed away in an instant by the forces of dark disaster.

Hey, it's Memorial Day, isn't it? I just prefer to spend it reflecting on the ways in which other people, not just soldiers, have sacrificed so that we might live comfortably, and on the threats posed by, as one review of Murphy's book put it, “some vision of the ultimate good, some conviction about ultimate truth, some confidence in the quest for perfectibility,” whereby "as long as there are righteous people in the world, the danger of widespread persecution will persist."

Sunday, May 26, 2013

Being a Flimpest is Meaningless

Stephen Fry:

"There's a passive-aggressive victim status that religious people now have, which is that they go, 'Oh these fundamental atheists!' Well, I'm not a fundamentalist atheist — being an atheist doesn't mean anything to me, I don't want anybody else to be an atheist, I've no interest in spreading atheism any more than I have... if there was a word to describe someone who doesn't believe in the Tooth Fairy — a 'flimpest' — then I would have to say I'm a flimpest. But being a flimpest is meaningless; it just means I don't believe in the Tooth Fairy. It doesn't involve a set of values; it doesn't... my set of values comes from humanism, from the Enlightenment, from the very set of values that built this country, the set of values of Thomas Paine, and Thomas Jefferson, and people who actually thought for themselves and didn't take any text as gospel... which Marxists did, and which many religions do. I just think it's terrible to be told what's the case."

Oooh, I can think of a few fundamentalist atheists who will likely have one of their two-minute-hategasms when they hear this. Then again, Fry may very well have already been an unperson to them based on other inconveniently sensible things he's said.

Saturday, May 25, 2013

Quot Libros, Quam Breve Tempus (IX)


tsundoku (Japanese: 積ん読 kanji, つんどく hiragana) | pronunciation: tsoon-doh-koo

Who does that? Not me! After today's Champions League final, I've got almost three months of hell-spawned weather to look forward to with hardly any fútbol to watch, so you better believe I'm going to get some book-reading done. Here are the most recent additions to the stack since last time, which no, of course I haven't finished, just like I hadn't finished all the ones from the picture before that, because shut up, that's why, I don't have a problem, I am so going to get them all read sooner or later; here, quit harassing me and just look at this:

Wednesday, May 22, 2013

Glass Walls on Slaughterhouses

Ahahaha. Now, we already knew Hamilton Nolan is a goddamned moron, and we already knew that Gawker's brand of progressive woo-girl politics is as superficial as its audience, but shut my mouth and palm my face, this do beat all. Apparently, it never occurred to our progressive heroes while agitating for the overturn of DADT that they were, in fact, agitating for people to have the opportunity to be psychologically broken down in order to be rebuilt as perfect, unquestioning killing machines. Tugging uncomfortably at the collar, a strained expression, a finger held aloft: "This seems a bit, well, retrograde." You don't say.

It's always amusing to see naïve campus progressivism run face-first into a reality-based brick wall; it tickles my slapstick funny bone. But there's also an unpleasant dissonance in considering the kind of person who can blithely accept the idea of humans being shaped into lethal weapons while blanching at the thought that the process of shaping them thusly might involve some impolitic language. Like George Carlin said about rules of combat, it seems just a way to reassure ourselves that we're quite civilized even as we devote our efforts to signing up as many people as possible to this institution of mass killing. Faced with the grave significance of war, presented with a pile of corpses, what kind of procedural mindset massages away the cramps of conscience by consulting a properly ticked-off checklist? Well, let's see — proper declaration of hostilities, signed in the right place, mm-hmm; official uniform, name tag clearly displayed, yep; cause of death: regulation military weaponry, legally obtained, properly registered, good; aaaaand....nope, doesn't appear anyone was called a faggot or a raghead in the process. All righty, everything looks good here, carry on.

One of the things that nauseated me during those wasted years spent reading the political blogosphere was the presence of so many amateur spin doctors, those process-minded moral homunculi who only thought in terms of optics and marketing, divorced from considerations of value. The Democrats were the brand, and achieving maximum market share was the goal. Repealing the ban on gays in the military looks to be polling well, so let's go with it! Meta-questions about the worthiness of any such goals would only confuse them, like a Roomba stuck in the corner between the bookcase and the nightstand. Vroom, bonk. Vroom, bonk. Should gays or anyone else care about being discriminated against by such an institution? What about the implications of seeing the world's mightiest death-dealing colossus as first and foremost a vehicle for job training and career advancement? Vroom, extending civil rights, bonk. Vroom, the most important election of our time, bonk.

People who can overlook gruesome reality with the help of sanitized language are not to be trusted.

Tuesday, May 21, 2013

One Who Glimpsed It and Fled

Bill Watterson:

Creating a life that reflects your values and satisfies your soul is a rare achievement. In a culture that relentlessly promotes avarice and excess as the good life, a person happy doing his own work is usually considered an eccentric, if not a subversive. Ambition is only understood if it's to rise to the top of some imaginary ladder of success. Someone who takes an undemanding job because it affords him the time to pursue other interests and activities is considered a flake. A person who abandons a career in order to stay home and raise children is considered not to be living up to his potential -- as if a job title and salary are the sole measure of human worth.

You'll be told in a hundred ways, some subtle and some not, to keep climbing, and never be satisfied with where you are, who you are, and what you're doing. There are a million ways to sell yourself out, and I guarantee you'll hear about them.

To invent your own life's meaning is not easy, but it's still allowed, and I think you'll be happier for the trouble. Reading those turgid philosophers here in these remote stone buildings may not get you a job, but if those books have forced you to ask yourself questions about what makes life truthful, purposeful, meaningful, and redeeming, you have the Swiss Army Knife of mental tools, and it's going to come in handy all the time.

Somehow, until I saw Maria Popova's link, there had been a Watterson-commencement-speech-shaped hole in my experience of which I had been unaware. Glad that's fixed now.

Future Imperfect

Jörg Friedrich:

We don’t always strive for perfection. To the contrary: as a species, we often embrace imperfect conditions instead of attempting to better them. A desire for perpetual progress isn’t encoded in our genes. Large periods of human history were relatively static. For many generations, our forefathers lived contently without desiring radical change. We also know of contemporary tribes in inaccessible regions of the earth that appear to be quite happy with the world as it is and with their place within it. These tribes don’t necessarily aspire to “be different” or to be more like us.

In the modern West, many people are similarly sceptical of radical change even if it promises great technological, social or medical benefits. As Shakespeare already knew, it’s often easier for us to endure existing hardship than it is to aspire to an unknown future.

Why does this matter? Because attempts to develop new technologies and to propel engineering forward often fail to account for the diversity of views on progress. We are presented with shining examples of scientific possibilities that will change our everyday lives whether we desire it or not. Yet from the perspective of an objective observer, the introduction of many new technologies is unnecessary. Their development isn’t driven by some innate need for progress and survival but by our own curiosity and enthusiasm for new gadgets and by economic interests.

This is something I think about a lot. It's only been in the last few hundred years that knowledge and technological prowess have begun to exponentially increase, but we've quickly become accustomed to thinking of this as something inevitable. As is often the case, Hume is there wagging his finger at us, reminding us that we ultimately have no solid foundation upon which to rest our blithe assumption that tomorrow will be a teleological improvement upon today. The dinosaurs' 165 million years of existence ultimately meant nothing. Whatever we might hope or think will happen to a hairless ape that has only been around for a fraction of that time, we don't know. There is no precedent here. And so, what might it be like if humanity were forced to reacclimate itself to such a static, cyclical way of life?

Monday, May 20, 2013

The Audience is Listening

Adam Gurri:

The point is, if you start blogging thinking that you’re well on your way to achieving Malcolm Gladwell’s career, you are setting yourself for disappointment. It will suck the enjoyment out of writing. Every completed post will be saddled with a lot of time staring at traffic stats that refuse to go up. It’s depressing.

Instead, the perfect balance is committing to only those crafts that you can perform with satisfaction even if you have to do so in utter obscurity. Then, put your work out in public as part of the process itself—if you’re making homebrew beer or an Arduino hack, make a video or write about the process as a means to think harder about the details of it. If you’re a writer, think of putting it online as simply having the work backed up in one more place.

In this way, you open yourself up to the spectrum of possibilities, ranging from utter obscurity at one end to global fame at the other. Far more likely is something closer to the obscurity end but much more satisfying—that you will draw the attention of a relative few who share your interests.

In my mind, this is the best way to take advantage of modernity while minimizing its costs. We are an affluent enough society that we’re able to make enough money as individuals to have time to devote to doing something we love for its own sake. We are also an interconnected society where some artisans are able to rise to sudden prominence and make a living doing what they love. A satisfying life will focus on the former while keeping the door open to the latter possibility.

There's these two friends of mine. Between them, there are at least six blogs floating around the web like ghost ships, mysteriously abandoned shortly after launch with no signs of struggle. Both of them are well-educated and driven to be serious writers. Both of them struggle to write regularly and frequently seem despondent over what they do produce, as well as the lack of any foreseeable future for it in the marketplace. The identity of being a writer-with-a-capital-W seems to be a heavy burden on their shoulders; the glass always seems half-empty. And here I am, a mere scribbler, about to chalk up 1500 posts and loving every minute of it. It's almost enough to give me something like survivor's guilt.

With respect to Gurri's excellent post, I'd still prefer to keep the door firmly barred against recognition and reward. It's a delicate balance to hold; I suppose there's always a chance that making one's work public could lead to unwanted attention. But addressing an audience, even one that mostly exists in the abstract, is how I keep from disappearing into narcissistic diary-keeping, or talking in complacent shorthand and private jokes. Echolocation, however tentative or intermittent, is necessary for perspective. So thanks for reading — and thanks for being so few in number.

Sunday, May 19, 2013

An Ordinary Man Away from Home Giving Advice

Lisa Levy:

Indeed, there is something ersatz, if not quite fraudulent, about de Botton’s entire intellectual enterprise: he often seems like a grad student who shows up to seminar having done just enough of the reading to participate by jumping on other people’s comments, but who never makes an original observation of his own. He is constantly quoting and alluding to great figures — Jane Austen, John Stuart Mill, Stendhal, and Freud, among others, all get name-dropped in his self-help book, How To Think More About Sex (about which more below) — but he tends to meander and summarize after a quotation rather than using it to drive his own argument forward.

...And de Botton’s book makes an enraging little study (all the books clock in at around 200 pages) of contemporary assumptions about sex, marriage, and relationships, regarded strictly from the point of view of a bored, married, middle-aged man who maybe dabbles in philosophy and fancies himself an intellectual. It’s like being hit on by a paunchy, balding European guy at an office party who tries to seduce you with, well, quotes from Jane Austen and Stendhal, and empty proclamations about the place of sex, marriage, and relationships in contemporary society.

Oh-ho, I see what she did there. It's clever, see, because that's de Botton she's describing there! De Botton's book is like de Botton himself in person being all lame and socially striving and stuff! Burn!

To reiterate, my own opinions of his work vary significantly. But I am beginning to admire the knack he has for provoking pretentious twats into displaying their honest contempt for their social inferiors.

Saturday, May 18, 2013

When He Himself Might His Quietus Make

Sara Reardon:

Ultimately, says Nader Perroud of the University of Geneva in Switzerland, if suicidal behaviour is considered as a disease in its own right, it will become possible to conduct more focused, evidence-based research on it and medications that treat it effectively. "We might be able to find a proper treatment for suicidal behaviour."

They can have my right to self-cancellation when they pry it from my cold, dead hand.

Friday, May 17, 2013

My Mirror Disappoints Me

Hamza Shaban:

We tend to think that the offline and the online are of two different realms, with sign-in screens acting as a portal. On the one side: babble, blog posts, centrifugal bumble puppy, Tinder, disengaged tweens, the Kardashians, hyper-regressive attention spans, Facebook farce, The Matrix. On the other: books, truth, orgasmic eye contact, the Socratic Method, a hike through Canadian forests, reality, patience, conversations with Oprah.

Yeah, I don't think the Internet has done anything but dramatically amplify and magnify traits that were already there. I have to agree with Scalzi here:

The online world can be distracting and alienating, but it is often so because people are often inclined to be distracted and alienated. If you’re one of those people, it doesn’t matter where you go or what you do, you’ll still be inclined toward distraction and alienation. You could be in a monastery on the slopes of the Himalayas and get distracted by the snowflakes. No satori for you! On the other hand, dude, snowflakes.

And I still say it's a mistake to just take people at their word when they claim to be distracted against their will, despite their best efforts. Are our gadgets really so fiendishly well-designed to hijack the reward centers of our brains, swiftly and irrevocably altering them beyond our control? To borrow a concept from Nicholas Humphrey who was in turn borrowing it from David Hume, it seems far more likely that people are shirking responsibility for their own satisfaction and excusing behavior they think others will disapprove of by claiming to be powerless to stop it.

Back in the Village Again

Cole Stryker:

The rise of the social web may be perceived as a re-villaging, where the permanence of one’s digital footprint behaves as a deterrent, making it seem to some like an ideal time to reintroduce public shaming to reinforce norms. But considered through a historical lens, public shaming begins to look like a tool designed not to humanely punish the perp but rather to satisfy the crowd.

This explains its resurgence. When has the crowd ever been bigger, or more thirsty for vengeance? The faceless Internet, with its shadowy cyberbullies and infinite display of every social ill is scary. And when it slithers its tentacles in a person’s life, we become desperate for some way to fight back—to shine light into the darkness and counterattack those who would victimize behind the veil of anonymity. But doxing, even just naming publicly-available names to channel outrage (or worse) at someone who has violated your norms, is not only an ineffective way to deal, it risks causing more harm than the initial offense. Last year’s trendy rise of media-sponsored shaming is self-righteousness masquerading as social justice. In many cases the targets deserve to be exposed and more, but public shaming does not drive social progress. It might make us feel better, but let’s not delude ourselves into thinking we’ve made a positive difference.

Nothing I need to add to that.

Unknown Unknowns

There are known knowns; there are things we know we know.
We also know there are known unknowns; that is to say we know there are some things we do not know.
But there are also unknown unknowns – the ones we don't know we don't know.

The Analects of Confuseus

Samuel McNerney:

The same thing occurs when lay audiences read books about thinking errors. They understand the errors, but don’t notice the trick – that simply learning about them is not enough. Too often, readers finish popular books on decision making with the false conviction that they will decide better. They are the equivalent of Edwards’ competition – the so-called best of the best who miss the ruse.

The overlooked reason is that there are two components to each bias. The first is the phenomenon itself. Confirmation bias, for example, is your tendency to seek out confirmation information while ignoring everything else. The second is the belief that everyone else is susceptible to thinking errors, but not you. This itself is a bias – bias blind spot – a “meta bias” inherent in all biases that blinds you from your errors.

I, on the other hand, have recognized that I was born guilty of original bias, and have thus accepted λογική into my heart as my savior, so this doesn't apply to me anymore. Do you have a few minutes? If I could come in, I'd like to talk to you some more about λογική...

Madness to the Method

Jag Bhalla:

The word rational is widely misused. Scientists often apply it unnaturally, in ways that conflict with our biology. Nobel laureates Daniel Kahneman and Gary Becker, and their respective schools of thought, are on opposite sides of this breach with our nature. They revive an old struggle between prudent empiricism and blinkering “theorism” (an overreliance on idealized models).

...To minimize misuse, consider that the word rational really incorporates three types of assumptions: first, about desirable goals; second, about effective methods of attaining them; and third, about whether agents have the needed skills.

Ahaha, oh yes. One of the most tiring things about fellow atheists is the way they use those assumptions interchangeably, with the result that "rational" ends up, in practice, meaning little more than "anything I approve of and agree with". It's becoming one of those words that trigger an, uh, irrational anger in me. As Schopenhauer said about a term often used synonymously with rational:

On the other hand, the epithet reasonable has at all times been applied to the man who does not allow himself to be guided by intuitive impressions, but by thoughts and conceptions, and who therefore always sets to work logically after due reflection and forethought. Conduct of this sort is everywhere known as reasonable. Not that this by any means implies uprightness and love for one's fellows. On the contrary, it is quite possible to act in the most reasonable way, that is, according to conclusions scientifically deduced, and weighed with the nicest exactitude; and yet to follow the most selfish, unjust, and even iniquitous maxims. So that never before Kant did it occur to any one to identify just, virtuous, and noble conduct with reasonable; the two lines of behaviour have always been completely separated, and kept apart. The one depends on the kind of motivation; the other on the difference in fundamental principles. Only after Kant (because he taught that virtue has its source in Pure Reason) did the virtuous and the reasonable become one and the same thing, despite the usage of these words which all languages have adopted—a usage which is not fortuitous, but the work of universal, and therefore uniform, human judgment. "Reasonable" and "vicious" are terms that go very well together; indeed great, far-reaching crimes are only possible from their union.

Thursday, May 16, 2013

The Emperor's Clean Clothes

Jessica Grose:

Which is all to say, it’s seen as socially admirable and masculine for a man to be on diaper duty or to sous-vide a steak, but there are no closet organizing tips in the pages of Esquire, no dishwasher detergent ads in the pages of GQ. Considering the strides that have been made in getting men to share the labor in other traditionally female domestic areas, why has cleaning remained the final frontier?

At its most basic, a reason why a lot of men don’t want to clean is obvious: it’s not fun. The rewards of the other two traditionally female household tasks—childcare and cooking—are palpable. Your kid’s smile, a delicious meal. But not so with cleaning. Drew Magary, a Deadspin columnist and the author of the forthcoming parenting memoir Someone Could Get Hurt, says that men will never take the initiative and clean without being asked “because it sucks.”

...With all these obstacles to real gender parity of chores, what’s a working woman to do? Philosophy professor Alexandra Bradner suggests on the Atlantic’s website that couples sit down with a list of questions like, “Do I do half of the laundry and half of the dishes every day?” to figure out where they’re slacking off in comparison to their mate. This sounds exhausting and impractical. If I do one load of laundry, it’s easier for me to do the second rather than wait for my husband to mosey over. (Bradner also says that when men do traditionally female chores, they’re enacting “‘small instances of gender heroism,’ or ‘SIGH’s”—which, barf.)

Cooking meals and taking care of the crotchfruit are necessities; mopping and scrubbing are more like bonus options. Eating out all the time would be expensive, and society tends to frown on child neglect, but it'll take a while for the accumulation of filth and vermin to actually become hazardous, and my experience suggests that most people, regardless of gender, lean more toward being lazy slobs than neat freaks.

I had to share a room with my brother until I was twelve. My parents used to jokingly call us Felix and Oscar over our diametrically opposed personalities. Once I finally convinced my mom that he was capable of having his own room without needing supervision (she's always been an overprotective worrier), I started doing my own vacuuming, laundry, dusting, etc. and never looked back, while he turned every living space he inhabited into a landfill.

Honestly, I enjoy doing chores. It's very satisfying to make messy things clean. Clutter affects my soul like being forced to listen to the strumming of a hundred guitars, each out of tune in a unique way. I can accept that life itself is chaotic and unruly and prone to make a mockery of all our best-laid plans, but I nonetheless have a deep-rooted psychological need to create simple, streamlined tidiness out of disorder within my living space. Without the opportunity to channel it like this, who knows how that control-freakishness might otherwise express itself?

This led to an amusing culture-shock moment with my girlfriend, who comes from a family where resentfully-performed chores serve as passive-aggressive pawns to be skillfully deployed for Machiavellian advantage on a psychological chessboard. In short, it took some convincing for her to accept that things here were exactly what they appeared to be, that I washed dishes, scrubbed toilets, emptied trashcans and hauled firewood because I had long ago accepted them as basic, inevitable aspects of my routine, aspects that I didn't particularly feel strongly about one way or another. As I've said before, I consider myself mostly honest, not because of any burning devotion to abstract moral principles, but because I simply don't have the patience or love of intrigue to bother weaving tangled webs of deceit. Way too much fucking trouble. Same principle applies here. I'm a simple fellow, baby, I said. I see something that needs doing, and I do it. It'll just gnaw at me if I try to ignore it. Besides, I did all this for years when I lived by myself, I'm in the habit of it, why should that change now? But why not ask or demand that the other members of the household do it? she asked. I stared at her like she was a madwoman.

Because, one, I'm a firm believer in the adage about doing things yourself if you want them done right. And two, I get far, far more satisfaction from getting something done when and how I want it done than I would from asking someone else to do it, waiting impatiently for them to get to it, which would no doubt be too late for my liking, and then being forced to restrain the urge to kibitz their efforts or brush them aside and do it my own damned self. A perfectly-divided pie chart of chore distribution doesn't mean shit to me. I laugh at your weak, puny notions of dialogue and parity. I am the motherfucking tyrant of my domicile, the love-child of Alexander the Great and Mr. Clean, and the linoleum will run with the blood of any who dare oppose or hinder me (not for long, though, because that stuff's hard to get up once it's dried).

Gender heroism? What a laugh. Could that previous paragraph sound any more patriarchal?

Tuesday, May 14, 2013

Not Wise, but Otherwise

Theodore Dalrymple:

So Owl gave me the first intimation in my life that all are not wise who claim to be learned. And Owl was a hint also that the clever could be the most foolish of all.

But why did owls symbolise wisdom in the first place? The splendid photos in my book, succinctly titled Owls, suggested a reason: owls seem to have only two states, the serene calmness of sleep and the most intense alertness when awake. Try as we might not to anthropomorphise, owls look serious; they indulge in no foolish or redundant movement. This is nonsense, of course: owls are bird-brained. And one of the things that I learnt from this book, delightful to me because completely useless, is that the Owl of Minerva does not necessarily spread her wings at dusk: nearly forty per cent of the 133 extant species of owls are diurnal, not nocturnal. I bet you didn’t know that.

...The law of unintended consequences is one of the hardest for people to learn because it is so unflattering to our conception of ourselves as rational beings, and because (if it is a law) it suggests inherent limits to our power. We shall never fail to commit errors.

Those excerpts are indeed all from the same essay, an essay which just so happens to be about two of my favorite things: owls and unintended consequences. Naturally, I had to acknowledge it.

Once in my teenage years, after a soccer game, some teammates and I were eating dinner at a restaurant. Somehow, the conversation turned to deciding which animal we each resembled. The consensus was that I was, of course, an owl. Possibly because of my wide eyes, serious expression and quiet bookishness. Or possibly because of my ability to move silently and swivel my head 270°.

Whatever the case, I shortly thereafter underwent the ritual to adopt the owl as my spirit animal. Climbing a tree under a full moon, I hooted and prayed for a vision, while doing my best to resemble a feathered harbinger of death. Soon, my sacred quest was rewarded by the rustle of prey in the leaves below, which turned out to be my mom who had come looking for me. She did admit that my downward swoop was silent and terrifying, at least.

Since then, I have been blessed with the supernatural abilities to win any staring contest and to snatch up a swiftly running rodent with my bare hand.

The Coming of Numbers

The struggle is on, no
mistake, and I take
the side of life's history
against the coming of numbers.

— Wendell Berry

Steven Poole:

What lies behind our current rush to automate everything we can imagine? Perhaps it is an idea that has leaked out into the general culture from cognitive science and psychology over the past half-century — that our brains are imperfect computers. If so, surely replacing them with actual computers can have nothing but benefits. Yet even in fields where the algorithm’s job is a relatively pure exercise in number-crunching, things can go alarmingly wrong.

Reading all the recent hubbub about "big data", I get the impression that a lot of people look at Hume's "can't derive an ought from an is" and think, you know, maybe if we just piled up a gigantic mountain of is, we'd then be able to step right across to ought. Or maybe ought will be an emergent property of a certain density of is, the way human consciousness seems to emerge from trillions of neurons packed together so tightly. Either way, I've planned on reading Evgeny Morozov's newest book soon, but I'm beginning to think that maybe I should revisit Theodore Roszak's The Cult of Information too.

Sunday, May 12, 2013

We're Gonna Send You the Extra Minute Free

Gretchen Reynolds:

In 12 exercises deploying only body weight, a chair and a wall, it fulfills the latest mandates for high-intensity effort, which essentially combines a long run and a visit to the weight room into about seven minutes of steady discomfort — all of it based on science.

7's the key number here. Think about it. 7-Elevens. 7 dwarves. 7, man, that's the number. 7 chipmunks twirlin' on a branch, eatin' lots of sunflowers on my uncle's ranch.

I had a pretty good routine going, alternating yoga with weightlifting and treadmill walking, until I had to refrain from it pre- and post-surgery. Maybe I'll try this thing out as I work my way back to where I left off. It's science!

Saturday, May 11, 2013

Saturday Shuffle

  1. Marsheaux -- Hanging On
  2. Gil Scott-Heron -- Me and the Devil
  3. Tony Joe White -- As the Crow Flies
  4. Wrathchild America -- Time
  5. Richard Thompson -- Stony Ground
  6. Fun Lovin' Criminals -- The View Belongs to Everyone
  7. Alabama Shakes -- Hold On
  8. Metric -- Blindness
  9. The Morning After Girls -- Chasing Us Under
  10. Eddy Grant -- Electric Avenue
  11. American Head Charge -- Take What I've Taken
  12. Midfield General -- Reach Out
  13. Killing Joke -- Requiem
  14. Primal Scream -- Autobahn 66
  15. Sepultura -- Bottomed Out
  16. Philip Boa and the Voodooclub -- Rome In the Rain
  17. Ween -- The Rift
  18. Black Grape -- A Big Day In the North
  19. Chumbawamba -- Timebomb
  20. Danger Mouse and Sparklehorse feat. Suzanne Vega -- The Man Who Played God

Just Think

Susan Suleiman:

Some of the most memorable pages here restate an argument Camus had already developed at length in “The Rebel”: not all means are acceptable, even when employed for noble ends; terrorism and torture destroy the very goals they are supposed to serve. This position was criticized as “idealist” (it was the reason for the famous break with Sartre), but Camus sticks to it — admirably, in my opinion: “Although it is historically true that values such as the nation and humanity cannot survive unless one fights for them, fighting alone cannot justify them (nor can force). The fight must itself be justified, and explained, in terms of values.”

Even more eloquent, perhaps, are his remarks on the responsibility of intellectuals in times of hatred: “It is to explain the meaning of words in such a way as to sober minds and calm fanaticisms.” Great writer that he was, Camus placed hope in the calming power of language carefully used, and of reason; in the preface, he asks his readers to “set their ideological reflexes aside for a moment and just think.”

"Times of hatred" strikes me as an odd designation. Are there ever times when people aren't hating each other? Anyway, yes, The Rebel is probably my favorite of his books; I remember being puzzled after reading it and learning that intellectual opinion of the time had it that Sartre got the better of their argument with his criticism of the book. Hindsight, I presume, has rectified this and elevated Camus above the overrated, wall-eyed, Stalinist toad-man.

Funny enough, even though I loved philosophy class, the subjects who most impressed me, like Nietzsche, Camus, Dostoevsky and Kierkegaard (whose recent bicentennial has inspired a few good posts), were idiosyncratic philosophers at best, if not better described more generally as writers.


Marc Herman:

It’s not shocking that a magazine called Time would be interested in the march of human generations. But the weekly’s much-discussed cover story on the late-’80s to mid-’90s “millennials,” Generation Me Me Me glossed past (as do the inevitable retorts) the possibility that the year of one’s birth just isn’t very important. A broad study three years ago, based on perhaps the largest available data sets measuring American youth, was skeptical that “generational” cohesion—of the sort we obsess over—exists at all.

...The two psychologists’ findings seem common sense. A woman born in Boston, raised in a bilingual environment, with two siblings; who served in Iraq with the Air Force; is married with two children; subscribes to HBO; and earns $52,000 a year working in commercial property administration, is supposed to share primary character traits with a man from a fourth-generation Wyoming family, monolingual, single, no kids, earning $24,000 as a painter, so long as they were both born in 1991, and are Facebook friends?

Fair play to Time, I suppose, for achieving their aim of getting tons of people chattering about their story, even if only to complain about it. Guitar magazines can only look on enviously, wishing that their next pointless cover story on "The 106 Greatest Shredders of All Time or at Least Since the Last List and Until the Next One" could attract so much attention. Generational analysis is astrology for people who fancy themselves too educated and sophisticated to take astrology seriously. The perfect marks, in other words.

The Cold Passion for Truth Hunts in No Pack

From Bruce Hood's book The Self Illusion, this is a dynamic I've certainly seen occur online:

As soon as we blend into the crowd, we no longer feel the need to put in as much effort if it is not recognized. It is only if the group appreciates our efforts that we try harder. This need for recognition also explains why groups can become more polarized on issues that would normally generate only moderate views. In an effort to gain group approval, individuals adopt increasingly extreme positions that they feel represent the group, which in turn drags the group further toward that position. If you couple that dynamic force with "groupthink," the tendency to suspend critical thinking when we are one of many decision-makers so as to try and galvanize the gathering, then it is easy to see how we behave so differently in groups than we would as individuals.

Friday, May 10, 2013


Alissa Quart:

It’s not hard to imagine a future when neurohumanities and neuroaesthetics have become so adulated that they rise up and out of the academy. Soon enough, they may seep into writers’ colonies and artists’ studios, where “culture producers” confronting a sagging economy and a distracted audience will embrace “Neuro Art” as their new selling point. Will writers start creating characters and plots designed to trigger the “right” neuronal responses in their readers and finally sell 20,000 copies rather than 3,000? Will artists, and advertisers who use artists, employ the lessons of neuroaestheticism to sharpen their neuromarketing techniques? After all, Robert T. Knight, a professor of psychology and neuroscience at Berkeley, is already the science adviser for NeuroFocus, a neuromarketing company that follows the engagement and attention of potential shoppers. When neuroaesthetics is fully put to use in these ways, it may do as Alva Noë said: “reduce people and culture to ends, simply to be manipulated or made marketable.”

And he has a point. Today, there’s the sudden dominance of so many ways to quantify things that used to be amorphous and that we imagined were merely expressive or personal: Big Data, Facebook, ubiquitous surveillance, the growing use of pharmaceuticals to control our moods and minds. In other words, neurohumanities is not just a change in how we see paintings or read nineteenth-century novels. It’s a small part of the change in what we think it means to be human.

Perhaps I have an answer to the question I asked a few months ago, then. Maybe people will look back on the early 21st century and laugh at the way so many educated people thought that the colored lights of fMRI studies would offer truer or deeper explanations of human existence rather than simply rewording what we already know.

Tuesday, May 07, 2013

A Beard Is a Lighthouse to Lost, Beardless Souls

I'm enjoying this site very much.


Frans de Waal:

Yet you identify yourself as an atheist.

Yeah, but I really don’t care if God exists or not. If people can lead good lives by believing in God, that’s perfectly fine with me as long as they are not overly dogmatic. But some atheists have also become dogmatic.

Why are you bothered by atheists who don’t like religion and want to smash it?

Because religion is so inherently human that I don’t know what happens if you kick it out of society. Sigmund Freud wrote a whole book against religion and in the end he says that he is still not sure what would happen if we would remove it from society. It might not be good.

...Humans do terrible things to each other, sometimes in the name of God, sometimes without any religious reference. There is no proof that without religion we would be treating our enemies any better. We’re just not a particularly nice species when it comes to the out-group.

I think again of Razib's recent post, which clarified a lot of things for me. The intellectual tendency he described is present in many atheist critiques of religion — they engage with belief as a body of rational propositions which, when taken to their logical conclusions, are obviously found wanting. As Razib noted, theology (and its rational opposition) leaves a comprehensive intellectual fossil record in the form of texts, which helps present a misleading picture of ideology's importance in the actual lives of believers and non-dogmatic skeptics. But even the followers of creedal religions are often inconsistent and moderate in practice. Stated reasons are more like the makeup we apply before special events, not the face we wake up with every morning. And so, when latter-day positivists talk about reasoning religion out of existence and the objectively better world that will result, I just laugh at their naïveté. They might be surprised to find how much they have in common with a religious fanatic like U Wirathu, who shares the conviction that if only everyone would agree with him on everything of importance, then there would be no more problems. Isaiah Berlin made this mindset a recurring theme in his lectures and writing:

It was further believed that methods similar to those of Newtonian physics, which had achieved such triumphs in the realm of inanimate nature, could be applied with equal success to the fields of ethics, politics and human relationships in general, in which little progress had been made; with the corollary that once this had been effected, it would sweep away irrational and oppressive legal systems and economic policies the replacement of which by the rule of reason would rescue men from political and moral injustice and misery and set them on the path of wisdom, happiness and virtue.

Once people become convinced that there is one true answer to be found for questions of ethics, politics and relationships, it often seems perfectly reasonable then to marginalize opposition for the good of us all — peacefully if possible, violently if regrettably necessary. Religion has in many ways ceased to be an interesting issue for me; now, I mostly just stay vigilant for moralizing busybodies who have a grand plan for fixing everything and are looking for recruits and victims.

And Stop Calling Me Shirley

Speaking of nonexistent connexions once again: as a non-academic, an autodidact, and a Bear of Very Average Brain besides, I'm always on the lookout for the givens in an argument, the unfounded and unquestioned assumptions from which the rest of the assertions flow. Like we learned in geometry class, it doesn't matter how many elaborate steps follow in a proof if your given is flawed. On that note, I can appreciate the usefulness of Daniel Dennett's little heuristic:

Not always, not even most of the time, but often the word “surely” is as good as a blinking light locating a weak point in the argument. Why? Because it marks the very edge of what the author is actually sure about and hopes readers will also be sure about. (If the author were really sure all the readers would agree, it wouldn’t be worth mentioning.) Being at the edge, the author has had to make a judgment call about whether or not to attempt to demonstrate the point at issue, or provide evidence for it, and—because life is short—has decided in favor of bald assertion, with the presumably well-grounded anticipation of agreement. Just the sort of place to find an ill-examined “truism” that isn’t true!

If It Was So, It Might Be; and if It Were So, It Would Be; but as It Isn't, It Ain't. That's Logic.

Evan Selinger:

Let’s examine the most egregious Facebook ad of them all: “Dinner” (in the video above). On the surface, it portrays an intergenerational family meal where a young woman escapes from the dreariness of her older relative’s boring cat talk by surreptitiously turning away from the feast and instead feasting her eyes on Facebook Home. With a digital nod to the analog “Calgon, Take Me Away” commercials, the young woman is automatically, frictionlessly transported to a better place: full of enchanting rock music, ballerinas, and snowball fights.

But let’s break Zuckerberg’s spell and shift our focus away from Selfish Girl. Think off-camera and outside the egocentric perspective framed by the ad. Reflect instead on the people surrounding her.

Ignored Aunt will soon question why she’s bothering to put in effort with her distant younger niece. Eventually, she’ll adapt to the Facebook Home-idealized situation and stop caring. In a scene that Facebook won’t run, Selfish Girl will come to Ignored Aunt for something and be ignored herself: Selfishness is contagious, after all. Once it spreads to a future scene where everyone behaves like Selfish Girl, with their eyes glued to their own Home screens, the Facebook ads portend the death of family gatherings.

Remember what I was saying about Hume and his connexions? This is a good example. If you're already inclined to believe that most social media activity is frivolous, it's very easy to thoughtlessly nod along with the next few steps in his reasoning. But people are not integers. Silly ads are not chains of logical propositions understood as literal commands. The Great Unwashed are not blank slates incapable of resisting or criticizing suggestions without the help of a philosophy professor. And when it comes to human behavior, like does not necessarily beget like. Ignored Aunt might very well possess the wisdom of several decades' experience and be gracious enough to realize that Selfish Girl will eventually outgrow her moody adolescent self-absorption, as adolescents always have, without bearing her a grudge over it. Like most humans ever, Selfish Girl will likely become bored by her normal routine — in this case, frittering away spare moments on Facebook — and go looking for greener grass elsewhere. Perhaps even in flesh and blood relationships.

It's typical of intellectuals, in their drive for logical consistency, to trick themselves into seeing logical necessity where none exists. Overthinking things, basically. Really, the idea that millions of years of social instincts honed by evolution will be destroyed or reshaped in a couple decades by some cool new gadgets is only fit for laughing at.

Sunday, May 05, 2013

The Bowel of Minerva

Tim Kreider:

I have no pretensions to any special knowledge, let alone anything like wisdom; I am just some guy, a PERSON IN WORLD looking around and noticing things and saying what I think. If what I say doesn’t reflect your own experience, it’s possible that it isn’t about you. It’s also possible that something that’s not About You might still be of some interest or use. There is even some remote possibility that I am oversimplifying, missing something obvious, or just speaking ex rectum.

I’ve lately been rereading Montaigne, generally considered the first essayist, inspired by Sarah Bakewell’s literary biography “How to Live.” Ms. Bakewell singles out the end of one passage in which Montaigne suggests that being self-aware of your own silliness and vanity at least puts you one up on those who aren’t, then shrugs, “But I don’t know.” It’s that implicit I don’t know at the heart of Montaigne’s essays — his frankness about being a foolish, flawed and biased human being — that she thinks has endeared him to centuries of readers and exasperated more plodding, systematic philosophers.

My least favorite parts of my own writing, the ones that make me cringe to reread, are the parts where I catch myself trying to smush the unwieldy mess of real life into some neatly-shaped conclusion, the sort of thesis statement you were obliged to tack on to essays in high school or the Joycean epiphanies that are de rigueur in apprentice fiction — whenever, in other words, I try to sound like I know what I’m talking about.

A raised glass to both Kreider and my old pal Montaigne. Too many smart people, especially online, are more concerned with winning arguments than actually saying anything insightful or interesting. Punditry, both professional and amateur, has become intolerably boring for me.

Friday, May 03, 2013

Forgotten Children Conform a New Faith


American guitarist Jeff Hanneman, a co-founder of the heavy metal band Slayer, died in southern California on Thursday, the band said in a statement posted on their website. He was 49.

South of Heaven was one of the very first thrash metal records I ever got. A metalhead friend and I skipped our afternoon class one beautiful September day and spent a couple hours sitting in his old diesel rattletrap Mercedes listening to his collection of cassettes. I went immediately out and bought my own copy of that record, along with Exodus's Impact Is Imminent and Sacred Reich's The American Way (the latter two didn't hold up quite so well once the novelty faded, but I can still enjoy listening to Slayer un-ironically). A few weeks after that, as it happened, Seasons in the Abyss came out, and the slightly less-aggressive atmosphere of eerie dread pervading those two records still makes them my favorites. Less than a year later, the Clash of the Titans tour, with Alice in Chains, Anthrax, Megadeth and Slayer, was my first concert.

One of the trade-offs of increasing maturity and sophistication is the tendency to experience things from a distance, via critical perspective. Seeing how this or that event fits into a larger narrative takes away some of the spontaneous enjoyment, the direct immersion in the experience. Late-eighties thrash was a strange little subculture: influenced but not accepted by punk rock, not really related to the hairspray glitz and glamour of their mainstream lite-metal peers, and not culturally significant enough to merit wider sociological notice the way grunge did a few years later. And a lot of the music is nowhere near being timeless. What I'm getting at is, it's hard to look back nostalgically now without being fully aware of what a relatively small cultural space thrash metal occupied and feeling somewhat old and self-conscious about it. And yet, that knowledge exists in tandem with the strong perception, perfectly preserved, still just as intensely vivid after more than two decades, of a vast new world opening up in front of me as I sat there in that old car listening to those twin guitar lines, that manic, chaotic drumming, and those trademark Tom Araya shrieks. That moment, at least, is timeless.

Thanks for everything, Jeff.

Wednesday, May 01, 2013

You Do It to Yourself, You Do, and That's What Really Hurts

Paul Miller:

One year ago I left the internet. I thought it was making me unproductive. I thought it lacked meaning. I thought it was "corrupting my soul." It's a been a year now since I "surfed the web" or "checked my email" or "liked" anything with a figurative rather than literal thumbs up. I've managed to stay disconnected, just like I planned. I'm internet free. And now I'm supposed to tell you how it solved all my problems. I'm supposed to be enlightened. I'm supposed to be more "real," now. More perfect.

...My plan was to leave the internet and therefore find the "real" Paul and get in touch with the "real" world, but the real Paul and the real world are already inextricably linked to the internet. Not to say that my life wasn't different without the internet, just that it wasn't real life.

...What I do know is that I can't blame the internet, or any circumstance, for my problems. I have many of the same priorities I had before I left the internet: family, friends, work, learning. And I have no guarantee I'll stick with them when I get back on the internet — I probably won't, to be honest. But at least I'll know that it's not the internet's fault. I'll know who's responsible, and who can fix it.

Most people — and I certainly include myself here — are nothing terribly special by objective standards. They're not geniuses, saints, inventors or talented artists. They never were going to become rich or famous, or be remembered beyond their grandchildren's lifespan. And so I repeat what I said before: rather than honestly ask ourselves if we truly are the deep-thinking, edge-living, profound sunsabitches we think we should be, and if not, whether that's really such a bad thing after all and whether our lives have meaning regardless, we procrastinate some more by blaming the Internet for rewiring our brains, thus avoiding that self-reckoning. I coulda been a contender, I coulda been somebody...

It's Just a Sign of the Times, Going Forward in Reverse

Steven Poole:

1 Going forward

Top of many people's hate list is this now-venerable way of saying "from now on" or "in future". It has the rhetorical virtue of wiping clean the slate of the past (perhaps because "mistakes were made"), and implying a kind of thrustingly strategic progress, even though none is likely to be made as long as the working day is made up of funereal meetings where people say things like "going forward".

Despite my recent fatwa, I generally accept that language is plastic, ad-hoc and ever-evolving, and so refrain from getting too exercised over deviations from some supposed True Standard of writing or speech. And even when doing copywriting, where word count is king and content is an afterthought, I can have a sense of wry humor about the accepted presence of so much excess, empty verbiage. But Broca's area has its reasons of which reason knows nothing, and holy mother of fuck, the constant use of this particular phrase makes me want to punch the speaker repeatedly in the larynx until they can only ever utter a rasping squawk for the rest of their days, or fantasize about a hidden mousetrap mechanism in their keyboards being triggered and crushing their fingers.