Monday, April 29, 2013

I Had No Pulse Last Time I Checked

It feels like it's been at least several days since we've heard someone declare this or that to be officially dead, doesn't it? Surely there's a eulogy being delivered somewhere, yes?

Marc Tracy:

We will still have blogs, of course, if only because the word is flexible enough to encompass a very wide range of publishing platforms: Basically, anything that contains a scrollable stream of posts is a "blog." What we are losing is the personal blog and the themed blog. Less and less do readers have the patience for a certain writer or even certain subject matter. Instead, they use social media to efficiently pick exactly what they do and do not click on, rather than reading what a blogger or blog offers them... A necessary byproduct was that even if you were a devotee, you were not interested in about half of their posts. You didn't complain, because you didn't have an alternative. Now, in the form of your Twitter feed, you do, and so these old-style blogs have no place anymore.

Blogs? You mean they've been lingering on life support all this time?

Now, granted, I'm not the most sociable or well-connected or "with-it" type of fellow, so I just want to make sure I'm understanding this: am I really seeing what I'm seeing? Am I seriously reading a complaint about how the problem with reading blogs is the inability to customize them to your exact specifications so as to not waste a single second reading something you might not find interesting? Am I to understand that spending a few moments in suspended judgment to consider a different perspective or new information is now considered inefficient? Isn't that how your interests became your interests to begin with, because a casual glance turned into a stare, and you took a few moments to engage with them and find out what they were all about? So, in your twenties or thirties, you've already called time on all that, confident that you're only ever going to want more of the same of what you already know you like? If that weren't so naïvely ignorant, it would be insultingly stupid.

To be clear, Tracy is lamenting the supposed death of that blogging culture, not cheering it, and I don't doubt that it's an accurate description of how a lot of people think. Still, though, the rambling personal mini-essay has been around ever since Montaigne; it's not going anywhere. Don't confuse "trendiness" with "rude health".

Noisy Noise Annoys Noisy Roister

Brad McCarty:

It wasn’t so very long ago that we were all looking for more ways to be connected online. With the rise of social media sites, this task was made more simple than ever before. I remember trading in my Treo 755 for a BlackBerry Curve because it had a better Web browser, by which I could more readily access what was happening on FriendFeed. When Twitter rolled around, I was subscribed to everyone that I followed by SMS. The world was a noisy place, and I loved every minute of it.

But as is often the case, there can indeed be too much of a good thing. I quickly found myself worn down from the constant updates, random banter and Foursquare checkins. Fast forward a few years, add Instagram, Vine, cross-posts from Facebook and just about everywhere else and suddenly the dull roar had become a full-blown riot.

And so, more than 150 years after Thoreau's famous dismissal of the fetish for more information faster faster!, we find ourselves observing the latest in a spate of trend pieces averring that, even though his Klout score must have suuuucked, he might have had a point after all. That strange sound you heard was Nicholas Carr's sudden erection thumping the bottom of his desk.

Sunday, April 28, 2013

Enantiodromia

Wilfred McClay:

What would American political culture look like without its pervasive moral dramas of sin and redemption, sometimes expressed in forms lofty and noble, but at other times resembling nothing so much as the smarminess and vulgarity of soap opera? One thing can be said for certain: We are not only intensely fascinated by these episodes of political theater, but fully in the grip of them, as far more than mere onlookers. For an allegedly secular society, the United States seems to be curiously in thrall to ideas, gestures, emotional patterns, nervous tics, and deep premises that belong to the supposedly banished world of religion. These habits of heart and mind are evident everywhere we look, and they possess a compulsive and unquestioned power in contemporary American life. It is as if the disappearance of religion’s metaphysical dimension has occasioned a tightening hold of certain of its moral dimensions, particularly so far as these relate to guilt and absolution.

Consider the range of manifestations: The feeding frenzies over malfeasances by public officials, real or imagined, eventuating in obligatory rituals of public confession and abasement before the altar of Oprah Winfrey or some other secular priest or priestess invested with the power to give or withhold absolution. The obsession with our environmental sins, both as an overconsuming society and as individuals leaving carbon footprints, giving rise to such phenomena as “carbon offsets,” schemes that have been decried by skeptics as little more than “green indulgences,” transparent sops to voracious (and credulous) consciences. The almost bottomless reservoirs of racial guilt and recrimination, most recently illustrated by the embarrassingly abject apology proffered by James Wagner, the president of Emory University, for the sin of mentioning in an essay the formulation of the three-fifths rule in the U.S. Constitution as an example of political compromise, instead of condemning the rule with thundering, absolute, and final moral certainty, as so many on his faculty demanded he do, no doubt in the spirit of academic freedom. The similar and related tendency to shout down all unwelcome speech as being a form of bigotry and therefore morally unacceptable: anti-Semitic, racist, sexist, homophobic, un-American, and so on. On many college campuses, the inhibiting fear of saying the wrong thing at the wrong time in the wrong way to the wrong person has all but rendered vigorous debate impossible. Whatever else one might say of these manifestations, they do not reflect a culture in which easygoing relativism, tolerance, skepticism, and laissez-faire permissiveness reign. It is instead a culture clenched taut with every imaginable form of moral anxiety, seemingly convinced despite its own secular professions that we inhabit a universe that has an inherent and unforgiving moral structure.

Other writers have noticed this as well. Not to mention the interesting irony that the first nation consciously designed in the spirit of the Enlightenment should have retained, on the popular level, such an intense religiosity. Those theological assumptions were only stashed in a Micmac burial ground, so they return with something not quite right about them. The widespread need for mythological structure to human lives doesn't disappear just because someone points out that the myths aren't literally true.

Art Is a Lie That Makes Us Realize Truth

I've been returning to Richard Marshall's cool essay on Nietzsche and Beckett over the past week, trying to absorb all the good bits. Here are some of them:

Leiter’s Nietzsche writes only to the poet artist. Leiter considers Nietzsche rare in being someone not looking for a universal readership. His philosophy of art is like Stendhal’s in that it states that art promotes arousal. When discussing the figures who exemplify best what he is discussing it is Goethe, Beethoven and himself who Nietzsche cites. Nietzsche was addressing the artistic genius. His concern was not directly political or social or moral – although he did think that without the spectacle of the artistic genius civilizations would decline – his concern was to save the artist from our ascetic planet where morality and bourgeois conventions threatened to crush artistic wonders. Nietzsche is arguing for an exceptionalism for the likes of Beethoven and Goethe (and Nietzsche) in order that art and the artist could thrive. It is a philosophy of artistic bohemian hedonism. I argue that Beckett is a supreme exemplar.

...Remember that Nietzsche’s spectacularly illiberal elitism and anti-morality was about preserving the artistic genius from restrictions that would obliterate their ability to fulfil their role. The elitism is not about aristocratic breeding, wealth, intelligence or any of the usual suspects. He is wholly concerned with art genius. Nietzsche's examples of the overman are Goethe and Beethoven (and Nietzsche too). He thought that without the artist, we would be deprived of the spectacle of artistic genius, and deprived of such as these, we would deprive ourselves of the source in life of aesthetic pleasure.

...For Nietzsche the terrible truth is existential, moral and finally epistemic. We know little. What we do know science delivers, and it fails to sustain our illusions about our selves, such as free will. Most of our cherished beliefs are illusory. To know what others really think of oneself would make you clinically depressed. That much of what we cherish, including our moral beliefs, are lies and falsehoods, coupled with the idea that the truth is unbearable, is a core of the Nietzschean philosophy.

...Nietzsche writes: ‘The truly serious task of art …[is] to save the eye from gazing into the horrors of night and to deliver the subject by the healing balm of illusion from the spasms of the agitations of the will’. Art is a protection and remedy to the tragic insight of our existential situation.

The Enkindled Spring

This spring as it comes bursts up in bonfires green,
Wild puffing of emerald trees, and flame-filled bushes,
Thorn-blossom lifting in wreaths of smoke between
Where the wood fumes up and the watery, flickering rushes.

I am amazed at this spring, this conflagration
Of green fires lit on the soil of the earth, this blaze
Of growing, and sparks that puff in wild gyration,
Faces of people streaming across my gaze.

And I, what fountain of fire am I among
This leaping combustion of spring? My spirit is tossed
About like a shadow buffeted in the throng
Of flames, a shadow that’s gone astray, and is lost.

— D.H. Lawrence

Let Your Fingers Do the Talking

John McWhorter:

Texting has long been bemoaned as the downfall of the written word, “penmanship for illiterates,” as one critic called it. To which the proper response is LOL. Texting properly isn’t writing at all — it’s actually more akin to spoken language. And it’s a “spoken” language that is getting richer and more complex by the year.

...In the old days, we didn’t much write like talking because there was no mechanism to reproduce the speed of conversation. But texting and instant messaging do — and a revolution has begun. It involves the brute mechanics of writing, but in its economy, spontaneity and even vulgarity, texting is actually a new kind of talking. There is a virtual cult of concision and little interest in capitalization or punctuation. The argument that texting is “poor writing” is analogous, then, to one that the Rolling Stones is “bad music” because it doesn’t use violas. Texting is developing its own kind of grammar and conventions.

Currently, in the panopticon of the social web, not maintaining a presence on social networking sites is liable to earn you anything from for-your-own-good finger-wagging to accusations of psychopathy. I have a few friends who occasionally pester me to communicate with them through Facebook or LinkedIn. But the basic truth of McWhorter's observation raises a new possibility — now, I can explain my refusal to participate in everything from texting to tweeting as having taken a vow of silence, the spiritual overtones of which should hopefully inspire a respectful (blessed) silence from those otherwise incapable of shutting the hell up.

Water, the Universal Solvent

Anne Kingston:

The trend to mindfulness would seem to signal mass recognition of the need to slow down and pay attention in a turbo-driven, reactive society. Yet its migration from ashram to boardroom is not without tensions. High-profile Buddhists are taking off the gloves, albeit thoughtfully; they say mindfulness is part of a continuum—one of the seven factors of enlightenment—not a self-help technique or “a path which can lead to bigger profits,” as the Financial Times put it. And long-time practitioners worry that mindfulness repackaged as a quick fix or a commercial platform could in fact lead to mindlessness, and reinforce the very problems it’s trying to heal.

...Donald Lopez, a professor of Buddhist and Tibetan studies at the University of Michigan, calls “secular Buddhism” an oxymoron: “Buddhism has always been a religion,” he says. “To see it as a way of life is a modern conceit that disparages the lives and religious practices of Buddhists over thousands of years.” The author of The Scientific Buddha, published in 2012, says belief that “mindfulness” is an ancient Buddhist practice is a fallacy: “There’s a cachet that comes from saying some ancient sage a millennium ago in India invented these things,” he says.

There's an omnipresent tension between those who use religious teachings as a means of reinforcing an egocentric worldview and those who use them as challenges to it. Same as it ever was. I will note, though, that Alan Watts claimed Buddhism itself was a reinvention of existing traditions for the sake of particular needs:

Hinduism is not a religion, it is a culture. In this respect, it's more like Judaism than Christianity, because a person is still recognizable as a Jew even though they don't go to synagogue. Jewish people, coming from a long line of Jewish parents and ancestors who have been practicing Jews, still continue certain cultural ways of doing things, certain mannerisms and attitudes, so they are cultural Jews instead of religious Jews. Hinduism is the same sort of thing; it is a religious culture. Being a Hindu really involves living in India. Because of the difference of climate, of arts, crafts and technology, you cannot be a Hindu in the full sense in Japan or the United States.

Buddhism is Hinduism stripped for export. The Buddha was a reformer in the highest sense; someone who wants to go to the original form, or to re-form it for the needs of a certain time.

In panta rheism, such sectarian distinctions are irrelevant, of course. The headwaters are unimportant, as meaning and truth can be found at any point along the river, wherever water flows.

Saturday, April 27, 2013

Sedate Expectations

Roman Krznaric by way of Maria Popova:

There are two broad ways of thinking about these questions. The first is the ‘grin and bear it’ approach. This is the view that we should get our expectations under control and recognize that work, for the vast majority of humanity — including ourselves — is mostly drudgery and always will be. Forget the heady dream of fulfillment and remember Mark Twain’s maxim: “Work is a necessary evil to be avoided.” … The history is captured in the word itself. The Latin labor means drudgery or toil, while the French travail derives from the tripalium, an ancient Roman instrument of torture made of three sticks. … The message of the ‘grin and bear it’ school of thought is that we need to accept the inevitable and put up with whatever job we can get, as long as it meets our financial needs and leaves us enough time to pursue our ‘real life’ outside office hours. The best way to protect ourselves from all the optimistic pundits peddling fulfillment is to develop a hardy philosophy of acceptance, even resignation, and not set our hearts on finding a meaningful career.

I am more hopeful than this, and subscribe to a different approach, which is that it is possible to find work that is life-enhancing, that broadens our horizons and makes us feel more human.

Beginning guitar students are taught to always start below the desired note and tune up to pitch, never down, to prevent the string from going flat. I'm going to repurpose that bit of knowledge for metaphorical use here: if you approach work with an image of your ideal, there's almost nowhere to go but down into flat disappointment. Very few things live up to your highest hopes, but modest expectations going in can leave more room for pleasant surprises along the way. And as Abraham Maslow said, it isn't normal to know what we want. It's a rare and difficult psychological achievement. Our consciously articulated desires are often incoherent or limited; we figure out what makes us content by process of elimination, by trying a lot of different things.

Augusten Burroughs and John Gray, among others, have said that being interested is a more worthwhile goal than being happy. I tend to agree. And as someone who's had to build new careers from scratch after the loss of a family business, let me assure you that having your financial needs met with plenty of free time left over is nothing to sigh about.

The Bubble-Head Charm

Noreen Malone:

Davis wandered off to appease his Inc. editor, and Saltzman turned his thoughts to a favorite South by Southwest subject: the revolutionary potential of South by Southwest. “The reality is that these people are going to change the world,” he said, looking around him. “Technology as a growth pattern has grown exponentially in our lifetime to where we’re going to be solving some serious fucking issues and living forever. I know it’s a crazy thing to think about, but we’re solving the world’s problems.”

...But at a party that night, as I was playing the name game with a new acquaintance, the woman paused and said something like, “Wouldn’t it be nice if we had bubbles above our heads with all this information?” I couldn’t tell if she was joking.

One day, I walked into the Convention Center and saw a gray-haired man with a notable lack of devices sitting at a round table near the entrance, staring, puzzled, at all that was happening around him, as if he were a lost time-traveler. It turned out he was, sort of. The man was an archaeologist and classics professor, in residence at the Institute for Advanced Studies and in town to use the library at the University of Texas. He wasn’t a Luddite—he’d helped build an Internet archive of Mediterranean transcriptions, and he uses, he assured me, a MacBook Pro—but there was something about the crowd that struck him as off-putting. “I haven’t ever seen so many people staring at their phones like idiots,” he said, observing what I might have if I’d been looking up from my phone. “They actually believe this stuff is worthwhile.” Then he got self-conscious. “I’m too old, I guess,” he said. And I knew, suddenly, that I’d be signing up for head bubbles someday.

Having said this just yesterday, I should acknowledge that, of course, messianic fever dreams of technological salvation still have a good deal of life in them. Is this why North Korea was threatening to nuke Austin? 'Cause I gotta say...

Friday, April 26, 2013

History is Written by the Writers

Razib Khan:

But this brings me to the more fundamental issue. Theology and texts have far less power over shaping a religion’s lived experience than intellectuals would like to credit. This is a difficult issue to approach, because even believers who are vague on peculiarities of the details of theology (i.e., nearly all of them!) nevertheless espouse that theology as true. Very few Christians that I have spoken to actually understand the substance of the elements of the Athanasian Creed, though they accept it on faith. Similarly, very few Sunni Muslims could explain with any level of coherency why al-Ghazali‘s refutation of the Hellenistic tendency within early Islam shaped their own theology (if they are Sunni it by definition does!). Conversely, very few Shia could explain why their own tradition retains within its intellectual toolkit the esoteric Hellenistic philosophy which the Sunni have rejected. That’s because almost no believers actually make recourse to their own religion’s intellectual toolkit.

This is the hard part for many intellectuals, religious or irreligious, to understand. For intellectuals ideas have consequences, and they shape their lives. Their religious world view is naturally inflected by this. And most importantly they confuse their own comprehension of religious life, the profession of creeds rationally understood and mystical reflection viscerally experienced, with modal religiosity. This has important consequences, because intellectuals write, and writing is permanent. It echoes down through the ages. Therefore our understanding of the broad scope of religious history is naturally shaped by how intellectuals view religion. In a superficial sense the history of religion is the history of theology, because theology is so amenable to preservation.

Like the old joke about the drunk, we look for the keys to understanding religion in theology because the light's better there. I think this is clearly true, and it also underlies why I think it's so ridiculous to hear atheists talk about forming a world without religion, as if it would simply be a matter of convincing people to reject certain logical propositions through argument, as opposed to a complete remaking of human psychology and culture along rational lines, which, of course, worked so well for the Jacobins and their offspring.

All You Know-It-Alls with Politic Views

Anthony Gottlieb:

It was David Hume, in the 18th century, who showed how to bring scepticism back to life. The first step is to keep in mind what Hume called the "strange infirmities" of human understanding, and the "universal perplexity and confusion, which is inherent in human nature". Armed with this knowledge—for our ignorance is the one thing of which we can be certain—we should be sure to exercise the "degree of doubt, and caution, and modesty, which, in all kinds of scrutiny and decision, ought for ever to accompany a just reasoner". Apart from anything else, this would help to cure people of their "haughtiness and obstinacy".

In theory, we have all learned Hume’s lesson, because a modest scepticism is the official philosophy of the modern sciences, which avow the maxim that every result is to be probed, repeatedly, and no truth may be admitted until it has stood the test of time. But, in fact, we have not learned his lesson. Nobody has time to wait and see whether yesterday’s experiment will still stand several decades from now. Life is short and writers have deadlines. So scepticism is a philosophy that is not easy to live up to. But who would want a philosophy that was?

For most of us, if we're honest, we have to admit that much of our "knowledge" has been bought with cheap credit from potentially unreliable intellectual lenders, liable to come crashing down if subjected to rigorous scrutiny. And so I'm tempted to agree with Gottlieb's opinion that Hume's philosophy is the "best" all-around. In his Enquiry Concerning Human Understanding, Hume said something that I've always found practically useful: "And here it is constantly supposed that there is a connexion between the present fact and that which is inferred from it." Training yourself to spot those faulty or nonexistent "connexions" will keep you busy for as long as you're willing, and possibly even make you look much smarter than you really are.

Spring Evening

A spring evening—one priceless moment.
The smell of fresh flowers and the glow of the moon.
Sweet song drifts down from the balcony—beautiful.
The garden swing hangs motionless as evening drips away.

— Su Shi (1037-1101)

Watching the World Around Me Change Its Ugly Faces; Yes, This is Mankind

Will Glovinsky:

To be sure, the power of crowdsourcing has given us gifts both precious (like Wikipedia) and picayune (the cover design of Elizabeth Gilbert’s next book), but it is a legitimate achievement of the digital age, one that proves that the internet is capable of transforming the way we interact collectively. Prior to vilifying Tripathi, for example, Reddit users had been helpfully sifting for leads amid the enormous amount of footage taken at the Boston Marathon. Nevertheless, the reflexivity with which we invoke the “wisdom of crowds” seems to suggest less that we think crowds are truly wise and more that we understand — if only dimly — their undeniable potency. The fact is that the digital age has yet to really countenance the cultural anxieties produced by the new invisible crowd.

In this regard, the jitters of the internet era bear an almost comic resemblance to a central disquietude of the august, and apparently closed, epoch we call modernity. For just as the increasing institutionalization and regulation of the internet seem to be attended by the lurking possibility that everything could crumble from one cyber attack unleashed by a handful of anonymous malcontents, so too did the project of modernity grapple with the contradiction that even as its liberal institutions grew more powerful, its stability became more dependent on the whims of the crowd. The French Revolution and its aftershocks are the textbook examples here, providing the archetypal images of the crowd in all its revolutionary splendor and violence. Now clearly, the works of Anonymous or 4chan don’t quite match the grisly proceedings of the Jacobins, but one can detect definite recursive qualities between them — namely excessive zeal, indiscriminateness, feverishness. And if history insists on recursion, there’s no reason not to learn from its lessons.

Kottke:

There's little diversity and independence: Twitter and Facebook mostly show you people who are like you and things your social group is into. And social media is becoming ever more centralized: Facebook, Twitter, Tumblr, Medium, Pinterest, etc. instead of a decentralized network of independent blogs. In fact, the nature of social media is to be centralized, peer-dependent, and homogeneous because that's how people naturally group themselves together. It's a wonder the social media crowd ever gets anything right.

The dream of a new humanity dies hard. Formerly, the post-Enlightenment project to create one was understood as being political and cultural in nature. In the last couple decades, there have been some half-hearted gestures in the direction of the old socialist ideals, but the focus mostly seems to have shifted to technology. The Internet and all its gadget offspring have given us one more sugar rush of belief in transformative potential, but it turns out that a bajillion monkeys still don't produce Shakespearean-quality art even after you replace their typewriters with MacBook Pros. Perhaps we could say that our love affair with all things digital was a rebound relationship after our breakup with socialist utopianism, a torrid, escapist fantasy to avoid coming to terms with the limitations of our nature. Somewhere in the back of our minds, we knew that we weren't seriously going to reinvent the human animal simply by the use of incredible new tools, but those winking screens and sleek designs were just too tempting, and besides, it was good for our self-esteem as we slowly processed what we'd lost.

It is something fun to wonder about, in my vague, uninformed way. Are we starting to bump up against the edge of our petri dish in more ways than one? If political, cultural and technological attempts to progressively alter human nature toward some nebulous betterment all fail, what then? Will we retreat into some kind of mystical inwardness, experiencing life on a level plateau instead of a sharp incline, realizing that wherever we went, there we were?  

Thursday, April 25, 2013

What Sacred Games Shall We Have to Invent?

Thomas de Zengotita:

When Friedrich Nietzsche, who had fully absorbed the implications of Darwinian evolution, announced that “God is dead,” he was not merely addressing orthodox religion. He was telling moderns that there was no meaning or direction to be found in nature or history at all. He was telling them that Georg W. F. Hegel, Auguste Comte, Herbert Spencer, and Karl Marx were victims of the residual influence of the idea of Providence, that their visions of an unfolding historical necessity were delusions. The thing that really mattered in the long run about Charles Darwin wasn’t the impact of “we are descended from monkeys” on reactionaries; it was the impact of “we are a meaningless accident” on progressives. Think of the tone in Max Weber, Sigmund Freud, and Theodor Adorno, the stoic willingness to face up to irredeemable loss and make the best of it. Think of the ferocious absolutism of twentieth century totalitarian regimes. These represent opposing but characteristic moods, and both were responses to a condition of utter abandonment and a consequent shift of the burden of responsibility for human practices to human beings.

This paragraph isn't really representative of the larger essay; I just thought it was an elegant encapsulation of all the light and heat surrounding that famous phrase. Still, calling humanity a "meaningless accident" seems a bit like an overcorrection — an accident typically implies a regrettable deviation from purpose. If human existence, with all its myriad wonders and joys, is an accident, then what must the implied purpose be? It could only be pointlessness, nothingness, entropy. As I've said often, nihilism is a last-gasp attempt to find universal meaning by inverting the concept when all other options have failed. If meaning and value can't be grounded in something eternal, then nothing ever has any meaning at all. (It's the same logical error that says, if you can't get an absolute perfect score on a test, fuck it, might as well turn in a blank page and get a zero.) God's brittle skeleton gets propped up on the throne out of desperation.

It seems just as accurate, without being quite so bitterly spiteful, to say that humans aren't guaranteed anything. We weren't guaranteed to come into existence as a species. We're not guaranteed to keep molding our world to our satisfaction. We're not guaranteed to figure out the answers to any questions or problems we can conceive of. And we're not guaranteed to exist indefinitely. Nonetheless, we're here, and what we do matters to us. Let that be good enough.

If You Smell What He's Cooking

Michael Pollan:

To cook or not to cook thus becomes a consequential question. Though I realize that is putting the matter a bit too bluntly. Cooking means different things at different times to different people; seldom is it an all-or-nothing proposition. Yet even to cook a few more nights a week than you already do, or to devote a Sunday to making a few meals for the week, or perhaps to try every now and again to make something you only ever expected to buy — even these modest acts will constitute a kind of a vote. A vote for what, exactly? Well, in a world where so few of us are obliged to cook at all anymore, to choose to do so is to lodge a protest against specialization — against the total rationalization of life. Against the infiltration of commercial interests into every last cranny of our lives. To cook for the pleasure of it, to devote a portion of our leisure to it, is to declare our independence from the corporations seeking to organize our every waking moment into yet another occasion for consumption.

I can get down with all that. Shamefully, though, learning how to really cook for the art and joy of it is one of those things I've been aiming to do for some time without ever getting around to it. A few years ago, I was planning to take some culinary classes, but this and that came up and it never happened. One of these days...

Tuesday, April 23, 2013

Why Are You So Petrified of Silence?

Ruairí:

In a month where between Thatcher and Boston we definitely saw some of the worst of ‘social media’*, Mike Monteiro hits the proverbial nail on the head when he asks us to question our contributions to the digital debate. Twitter just seems to be flowing with bile at the moment. Conversely, I took a dip into Facebook recently following months of having a deactivated account and it just seems so dull, full of banal marketing waffle. Either way, it's not very nice to swim in.

I'd love to believe that social media users might collectively develop the self-awareness to curb the worst of their logorrheic squirts, but I, as you know, am an optimist in full possession of the facts, which is to say, a pessimist. I feel like Steve Martin, despairing over an endless parade of Chatty Cathy dolls, pulling their own strings:

[embedded video clip]

(Pressed for time, I forgot to include this clip in the original draft of this post. No, you're not going crazy. It wasn't there the last time you looked.)

Seriously, though, it's something I think about a lot. Like I said before, this blog is my practice, to borrow the Buddhist concept. Writing is a form of thinking out loud for me, a way to challenge myself to seek out interesting writers and attend to more substantial topics. I'm not big on the popular notion of personal growth, but as the original panta rheist, I do think my interior life needs to have a feeling of flow to it — not necessarily improving or progressing, but always moving, avoiding stagnation. I wouldn't ever want to feel like I was writing out of habit or obligation. Every word should count.

This all reminds me of a strange memory from when I was about five or six years old. (And it's safe to say a five-year-old Scribbler attending school in 2013 would almost certainly be diagnosed as being somewhere on the autism spectrum.) Anyway, my dad was trying to engage me in conversation, encouraging me to give more than the bare minimum of a reply. I distinctly remember thinking that I needed to save my words for when they were really needed, as if we were only given a limited supply of them to begin with. And though I couldn't have quite articulated it then, I was also aware that this was something odd. I was reflecting on my own reticence, conscious that I felt a very strong urge to be quiet as much as possible, conscious that this made me different somehow.

That urge has never really left me. There are social situations where I'll recognize that I could say something to contribute to a conversation, but I'll actually hesitate while conducting a brief internal dialogue, as I weigh up whatever factors I consider relevant and see which way the scales tip. I'd almost always rather err on the side of silence than talk when I don't have anything important to say. Wouldn't want to use up my words and be left speechless years down the road when I really need them!

When writing posts in my head, I often spend hours going over every variation on a point I want to make, feeling like I have the rough outline of a novella, only to be genuinely surprised when the final result has somehow been condensed into a couple of mid-sized paragraphs. It's like I'm just congenitally incapable of using more words than absolutely necessary. Not everything has to be deep or ultra-serious, of course, but apparently I need it all to be meaningful.

Monday, April 22, 2013

I Guess We're All Damaged in Our Own Way, Alone in Our Own Way

After reading Erik Davis's post about it earlier this year, I finally got around to watching the documentary Kumaré, which I greatly enjoyed and highly recommend. Seriously, go watch it and then come back here and tell me what you think. He fakes it so real he is beyond fake.

A couple brief impressions: rationalism really seems like pretty thin gruel to offer people when you see their honest pain up close and personal. When you get a suspicion of just how many people are deeply damaged or deluded, you tend to appreciate whatever it takes just to help them function. I do think it's true that most people looking to a guru for answers are just projecting their own authority onto him or her. But not all of them can be made to realize that in the same way, at the same time. Perhaps, in response to the question Davis asks — is such deceit necessary? — you could say that it is, if used as negation. Fighting illusion with illusion. Using one to cancel the other out. If not strictly necessary, maybe at least effective. Like a form of jujitsu, using the momentum of their own illusions against them.

Lost Man in Time; Was His Name Peter?

I know what I'll be reading this fall:

Noted metal journalist Jeff Wagner to author biography of the late leader of Type O Negative, Peter Steele. Jeff Wagner (Mean Deviation book, former Metal Maniacs editor) is currently assembling his next book, Soul On Fire – The Life and Music of Peter Steele. The book will be, in Wagner’s words, “a thorough telling of Peter’s life, from his ‘diaper days’ to his death. It will not only include analysis of the music he created in Type O Negative and Carnivore -- and the many triumphs and personal tribulations that came along with it -- but also let fans in on the details of his early days. The book will also feature numerous images from throughout his life, on stage and off. Peter’s fans miss him, and as a follower of his music since 1986, I’m proud to put together this tribute for them. While warring factions within the story have already been heard, my mission is simple: cut through the crap and tell one of the most extraordinary stories in modern music, with great respect to the subject himself.”

Culled from a variety of sources, including recollections from band mates, family, close friends and record label reps, Soul On Fire is an accurate account of Steele’s creative genius and incredibly complex personality. Steele was a visionary and a provocateur; a generous friend and a self-deprecating hedonist; a band mate and a brother. His struggles with addiction and his acceptance of the Catholic faith he grew up with and then grew out of…all of this is surveyed and detailed within Soul On Fire.

I am bouncing up and down in my seat, grinning from ear to ear, and clapping my hands like a toy monkey with cymbals. Well, maybe not literally, but I was thinking about Steele last week on the anniversary of his death. As much as I've enjoyed the blog his sisters have kept in his honor, I'd always vaguely wished someone would write a definitive biography. In some strange way, Type O always felt like a best-kept musical secret to me as a fan, always just slightly in the shadows. Never belonging truly and completely to any specific musical genre or any cultural moment, partially hidden behind cryptic humor and the symphonic majesty of their music. Usually I'm content to just enjoy the results of an artist's labors and pay no mind to the person behind the curtain, but I always have been curious to know a little more about the personal life of the man behind some of the greatest music I've ever heard.

This Is Not a Storybook of Once Upon a Time and Happily Ever After

Patricia Vieira:

The problem with most arguments in the debate about reading is that they posit literature as an instrument used to achieve a certain goal: either the good of the individual (it is good for you) or the good of society (it makes you good). Leaving aside the issue of deciding whether what makes you good is not, ultimately, good for you, a more fundamental question arises: why does literature need to be defended at all?

The anxiety to justify literature is symptomatic of our age, when all activities should have an easily identifiable objective. The difficulty with literature, as well as with music or the fine arts, is that it has no recognisable purpose or, in Immanuel Kant's elegant formulation, it embodies "purposiveness without purpose". Reading certainly has myriad effects, but it is difficult to pinpoint exactly how it influences each person and harder still to translate this impact in terms of quantifiable gains.

Literature breaks the continuum of the everyday and makes us stop and think. The linguistic experimentation that is the hallmark of the literary estranges us from the most commonplace of tools, our language, while the fictional elements of novels, plays and poems offer us a glimpse into a reality that is not our own.

Several months ago, there was some article making the rounds that claimed some sort of proof for the idea that reading fiction made you a better person. I think it was Arts & Letters Daily that pithily remarked how, if that were true, English departments should then be oases of enlightened, saintly behavior. That's probably all the refutation it needs, but I'll add a pair of pennies: it's a progressivist delusion that the "bad" parts of humanity can be surgically excised, or selectively bred out of existence. Reading widely may increase the breadth and depth of one's understanding of human experience, but the basic elements of that experience will still include all the things moralists call vices or sins. Reading widely may increase the cognitive tools at one's disposal, without touching the perennial drives which motivate the use of those tools. All the condensed human experience and wisdom in the world won't eliminate the forks in the road that we all constantly encounter and put us on a direct path to perfection.

Ran-tanning

Russell Blackford:

What I didn’t realise until recently was how far this call-out culture has become prevalent in other corners of the blogosphere and the general culture, especially in forums of people who perceive themselves as fighting for social justice. Nor did I know how much critique call-out culture was receiving around the internet. That’s something to be grateful for. Posts such as those by Dzodan and (most recently) Thompson are very valuable and deserve wide reading.

I've always figured that I was destined to settle comfortably into something like Taoist hermitry, but my movement in that direction has been significantly accelerated by repeated exposure to the highly-concentrated stupidity enabled by Web 2.0 social mores. Callout culture, like its equally retarded sibling, boycott culture, is the tailgate party of sociopolitical activism. Who's playing? Who cares! Just show up, get hammered on self-righteousness and be seen around the scene!

Human nature hasn't changed in recent years, obviously. But it seems a new generation has become attached to the fantasy that their cool toy phones and a vastly expanded public square in the form of the social web have enabled a new potential for mass activism, and as with all shiny objects, it just seems to be taken for granted that the shortcomings of the old models will somehow magically fail to make an appearance. People are just as stupid as they've always been; now, they just have tools to magnify and amplify their stupidity exponentially. Someone said or did something questionable on the Internet? Assume the absolute worst, and respond as quickly and aggressively as possible, multiplied by a million.

Other Gods Before Me

Richard Wolin:

Assmann argues that biblical monotheism, as codified by the Pentateuch, disrupted the political and cultural stability of the ancient world by introducing the concept of "religious exclusivity": that is, by claiming, as no belief system had previously, that its God was the one true God, and that, correspondingly, all other gods were false. By introducing the idea of the "one true God," Assmann suggests that monotheism upended one of the basic precepts of ancient polytheism: the principle of "divine translatability." This notion meant that, in ancient Mesopotamia, the various competing deities and idols possessed a fundamental equivalence. This equivalence provided the basis for a constructive modus vivendi among the major empires and polities that predominated in the ancient world.

Assmann readily admits that the ancient Middle East was hardly an unending expanse of peaceable kingdoms. However, he suggests that before monotheism's emergence, the rivalries and conflicts at issue were predominantly political rather than religious in nature. For this reason, they could be more readily contained. Monotheism raised the stakes of these skirmishes to fever pitch. According to Assmann, with monotheism's advent, it became next to impossible to separate narrowly political disagreements from religious disputes about "ultimate ends" (Max Weber) or "comprehensive doctrines" (John Rawls). According to the new logic of "religious exclusivity," political opponents to be conquered were turned into theological "foes" to be decimated.

By introducing the "Mosaic distinction," Assmann argues, the Old Testament established the foundations of religious intolerance, as epitomized by the theological watchwords: "No other gods!" "No god but God!" Thereafter, the pre-monotheistic deities were denigrated as "idols." As Assmann explains: Ancient Judaism "sharply distinguishes itself from the religions of its environment by demanding that its One God be worshiped to the exclusion of all others, by banning the production of images, and by making divine favor depend less on sacrificial offerings and rites than on the righteous conduct of the individual and the observance of god-given, scripturally fixed laws."

The review is negative overall, but this sure does seem like an interesting book.

Sunday, April 21, 2013

Don't Fail Me Now

David Simon wrote a post recently:

Dead children and monied politicians.

The Guardian, which I hear used to be an actual newspaper, reprinted it, but not before HuffPosting up the title a bit:

The Senate's gun control fail: dead children and monied politicians

I am in favor of the death penalty for adults who use the word "fail" as a noun.

You'd be Well-Advised Not to Plan My Funeral Before the Body Dies

Near as I can tell, the only proof for this silly claim that Richard Dawkins-style confrontational atheism is over and done, supplanted by a kinder, gentler version, is the author's say-so. But okay, let's see if there's any fire behind this smoke by checking a standard that I happen to know a li'l bit about by virtue of one of my jobs: book sales.

In a nutshell: I would consider any book with an Amazon sales ranking of 100,000 or better (a lower number means stronger sales) to be something I could sell within a few days. So how do some of the New Atheist authors fare as of this moment? How thoroughly have consumers rejected the rudeness of Dawkins, Dennett, Harris and Hitchens via the power of their purses?

The God Delusion: #713
God is Not Great: #987
Free Will: #3,650
The Portable Atheist: #3,729
The End of Faith: #5,559
The Magic of Reality: #5,579
Letter to a Christian Nation: #8,861
The Greatest Show on Earth: #14,369
The Moral Landscape: #32,114
Breaking the Spell: #35,157
Unweaving the Rainbow: #193,629

Even the "weakest" of the bunch is something I'd be glad to get hold of. These books have all been out for a while, too, so this isn't a case of a just-released book rocketing up the rankings by virtue of everyone buying it at once, or of a dramatic sales bump following the news of an author's death (man, you should have seen how much Hitchens' books were selling for a little over a year ago; death really is a good career move). The reading public can't seem to get enough of this stuff.
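(For the code-minded: here's a toy Python sketch of that back-of-envelope test, using nothing but the cutoff and the snapshot rankings quoted above. The cutoff is my own rough rule of thumb, not anything official from Amazon, and the numbers were stale the moment I wrote them down.)

    # A toy version of my "could I sell this?" test. These are Amazon
    # sales ranks: the LOWER the number, the better the book is selling.
    SELLABLE_CUTOFF = 100_000  # rough bar for "moves within a few days"

    rankings = {
        "The God Delusion": 713,
        "God is Not Great": 987,
        "Free Will": 3_650,
        "The Portable Atheist": 3_729,
        "The End of Faith": 5_559,
        "The Magic of Reality": 5_579,
        "Letter to a Christian Nation": 8_861,
        "The Greatest Show on Earth": 14_369,
        "The Moral Landscape": 32_114,
        "Breaking the Spell": 35_157,
        "Unweaving the Rainbow": 193_629,
    }

    # Keep only the titles that clear the bar.
    easy_sells = [title for title, rank in rankings.items()
                  if rank <= SELLABLE_CUTOFF]
    print(f"{len(easy_sells)} of {len(rankings)} clear the bar")  # 10 of 11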

As the old saying goes, money talks and bullshit walks, so if Theo Hobson presented me with brand-new copies of all of these books, I would snatch them out of his hands faster than he could blink and turn a tidy profit on them by the end of the week while he sputtered and harrumphed about how "nobody" is reading them anymore. But I imagine a devout believer is no stranger to telling himself what he wants to hear.

Second Time as Farce

John Gray:

Less well known are Marx’s deep differences with Darwin. If Marx viewed Trémaux’s work as “a very important improvement on Darwin,” it was because “progress, which in Darwin is purely accidental, is here necessary on the basis of the periods of development of the body of the earth.” Virtually every follower of Darwin at the time believed he had given a scientific demonstration of progress in nature; but though Darwin himself sometimes wavered on the point, that was never his fundamental view. Darwin’s theory of natural selection says nothing about any kind of betterment—as Darwin once noted, when judged from their own standpoint bees are an improvement on human beings—and it is testimony to Marx’s penetrating intelligence that, unlike the great majority of those who promoted the idea of evolution, he understood this absence of the idea of progress in Darwinism. Yet he was just as emotionally incapable as they were of accepting the contingent world that Darwin had uncovered.

As the late Leszek Kołakowski used to put it in conversation, “Marx was a German philosopher.” Marx’s interpretation of history derived not from science but from Hegel’s metaphysical account of the unfolding of spirit (Geist) in the world. Asserting the material basis of the realm of ideas, Marx famously turned Hegel’s philosophy on its head; but in the course of this reversal Hegel’s belief that history is essentially a process of rational evolution reappeared as Marx’s conception of a succession of progressive revolutionary transformations. This process might not be strictly inevitable; relapse into barbarism was a permanent possibility. But the full development of human powers was still for Marx the end point of history. What Marx and so many others wanted from the theory of evolution was an underpinning for their belief in progress toward a better world; but Darwin’s achievement was in showing how evolution operated without reference to any direction or end state. Refusing to accept Darwin’s discovery, Marx turned instead to Trémaux’s far-fetched and now deservedly forgotten theories.

The whole essay is very interesting. And it makes it even funnier to then go back and read this kind of wishful wanking. We must not reverse the idea of historical necessity, comrades! Have, uh, faith!

In Between Demotic and Altiloquent

Simon Kuper:

But texts, blogs, emails and Facebook posts are infecting other kinds of writing, and mostly for the good. They are making journalism, books and business communications more conversational.

Social media offer a pretty good model for how to write. First, the writers mostly keep it short. People on Twitter often omit “I”, “the” and “a”, which are usually wastes of space anyway. Vocabulary tends to be casual: bloggers say “but” instead of “however”. They don’t claim a false omniscience, but proclaim their subjectivity. And the writing is usually unpolished, barely edited. That’s a great strength.

...George Orwell in 1944 lamented the divide between wordy, stilted written English, and much livelier speech. “Spoken English is full of slang,” he wrote, “it is abbreviated wherever possible, and people of all social classes treat its grammar and syntax in a slovenly way.” His ideal was writing that sounded like speech. We’re getting there at last.

That article from Wired that I linked to last week said something similar, that texting is the most efficient form of communication ever invented. Well, I know this is a heretical thing to say to a culture obsessed with business and technology, but have you ever considered the possibility that efficiency isn't the fucking be-all, end-all of human activity? That not everything is automatically improved by making it faster and simpler? That dancing is more than simply a convoluted means of getting from one point on the floor to another? That music is more than just an easy way to light up the nucleus accumbens? That writing is more than a delivery system for communicating facts? Nietzsche, whose prose style I greedily wish I could better approximate, explained to an acquaintance why some things deserved to be expressed in a style other than the vernacular:

When one writes a book and thus steps into the public light, that is always a significant act deserving of a certain solemnity, so that one has to put aside everyday language. You have a good example in Catholicism, toward which, as you perhaps know, I am not exactly friendly, but this does not prevent me from recognizing the great worldly wisdom with which Rome has been conducting its business over the ages. Why does Rome still have the Mass read in Latin? To give the solemn act, veiled in mystery, a special solemnity even externally. But that must not be at the expense of clarity or intelligibility. If thoughts were thereby hidden, if the real meaning became hard to understand, that would of course be false, that would no longer be solemn, that would be foolish.

That's always resonated with me. Solemnity without sacrificing clarity or intelligibility. I agree with the perception of Nietzsche as a philosopher who wrote like a poet. His ideas were earthy enough, but they were so often presented in such unexpected, vivid images! In fact, if there's one thing that particularly annoys me about my own writing, it's that it's too often plain, straightforward and unadorned. I'm not saying I want to become Henry James or James Joyce, but I would like to aim higher than colloquial standards.

Saturday, April 20, 2013

The Horror of That Moment


"The horror of that moment," the King went on, "I shall never never forget!"
"You will, though," the Queen said, "if you don't make a memorandum of it."


Greg Lukianoff:

Speech codes and changed attitudes about freedom of speech have created all of these negative feedback loops for expression and critical thinking. As you censor unpopular opinions you end up with classroom environments where individuals can’t really speak their minds. You also end up with students mostly talking to people they already agree with. The research on this is very strong—when you talk to people you already agree with, it thwarts development of critical thinking skills, and it makes people much more confident in what they already believe. It tends to make people more adamant, and exacerbates the serious problem of groupthink.

If we’ve legislated politeness, and legitimized the idea that disagreeing with somebody could potentially hurt his or her feelings, why bother to discuss anything? We have to teach people that debate and discussion lead to better ideas—they allow us to be more creative and to develop critical thinking skills. Moreover, the idea that meaningful, meaty debates over the most serious issues can actually be fun has been badly damaged.

...Putting this weird energy around disagreement, dissent, satire, parody, devil’s advocacies, or thought experimentation makes everything so dreadfully serious. Students no longer appreciate the idea that the professor whose seemingly strange attitudes about everything from sex to religion to politics could actually be presenting an opportunity to dive into something interesting—as opposed to saying something another person’s fragile ego can’t handle.

As essential and true as this is, I always thought it was obvious almost to the point of banality. And again, the only formative experiences I had were taking debate class for two years in middle school, where we learned how to make the best argument possible independent of our personal feelings on the topic, and taking philosophy in community college, where we were taught that asking questions was a better path to truth than making bold assertions, and that there was nothing to fear about having to change our minds in light of better information.

Yet a decade of reading in the blogosphere has left me with the strong impression that, aside from literal libertarians, whom almost everyone hates, and my old pal Tauriq Moosa, whom many people seem to consider an incestuous, murderous, blasphemous rapist (it's a wonder he has time to post at all, being that busy), most people are willing to disregard these principles whenever it's convenient, which is often. One of the many things that cemented my utter disgust with the social justice brigade of online atheism was how the empty snark of "freeze peach" became a trendy meme among them.

Said it before, I'll say it again: being offended is not a big deal. It can even be a good thing if you're able to deflate your ego and relinquish your delusions of world-changing self-importance.

Trial by Smartphone

Owen Gibson:

Kick It Out, the campaign group that six months ago on Saturday was the focus of a protest by leading black footballers over the game's response to high-profile incidents of racism, is planning to launch smartphone apps next season that allow players and fans to anonymously report abuse in the stands or dressing room.

The aim is to challenge the dressing-room omertà and fan impotency that have frustrated efforts to change the culture at all levels of the game.

The new apps, one for players and one for spectators, will make it easier and quicker for players and fans to report abuse and there are hopes within Kick It Out that they will give a more accurate picture of the scale of the problem, as well as leading to more sanctions and criminal convictions.

Yes, there certainly are hopes. Mine are more modest; I simply hope that this experiment in crowdsourced snitching and rumormongering will go better than the recent trend toward vigilante sleuthing. Like the poster says, none of us is as dumb as all of us.

As for whether this is truly a serious problem demanding a severe response, I'm willing to take Kenan Malik's word on it:

This is why the current furore over racism seems so bizarre. I cannot remember the last time I faced the kind of abuse that was so common in the eighties.  Racism still exists, of course, and needs always to be confronted, but it is relatively isolated. Indeed, it is precisely because racism is so rare that it seems so shocking when we are confronted with it.

...If racism is not the issue that once it was, why the sudden interest on the part of the football authorities in combating racism? Having spent decades ignoring racism in the sport when it was a real, live issue and required a robust response, the FA is now trying to gain the moral high ground by conducting a war that has largely been won.  It would have taken guts and commitment to have stood up to racism three decades ago. Today, the FA is trying to clamber on to a moral high ground that has long since become crowded.

If the character of racism has changed over the past three decades, so too has the character of antiracism. Antiracism has all too often become less about challenging discrimination or hatred, more about moral posturing. ‘A lot of the issues that we’ve gone on about in the last season or so, it’s more about people driving the issue than the issue being a real focus’, as David James put it.

Friday, April 19, 2013

I Do Many Things. I Span the Genres.


Devindra Hardawar:

You can think of Wizpert as an IT help desk for life. The company recruits knowledgeable bloggers, which it calls “Wizperts,” across topics like exercise, health/wellness, and parenting. Advice seekers can connect to Wizperts via their blogs or the service’s website, and most importantly, they can begin a conversation within seconds...Wizpert is a win-win for experts and advice-seekers alike. Bloggers will be able to monetize by offering advice in their free time (Wizpert takes a 25 percent cut), and consumers will be able to instantly get the help they need from approved experts.

Well, I'm not sure about knowledgeable, because they actually sent me an invitation to join. I declined, of course — it wrenches my conscience enough as it is to inflict my thoughts on you all for free; I can't imagine charging anyone money for the privilege. My girlfriend said this could be the big break I need toward my dream of becoming a cult leader, but I guess I have to face up to the fact that I just don't have enough ambition for that after all.

Dream Dream, Filling Up an Idle Hour

David Cain:

A few weeks ago, as I was turning on the dishwasher before we left my place, she said something like, “Dishwashers are what’s wrong with the world.” Something about that sounded right. I asked her to explain.

“Life is composed of primarily mundane moments,” she says. “If we don’t learn to love these moments, we live a life of frustration and avoidance, always seeking ways to escape the mundane. Washing the dishes with patience and attention is a perfect opportunity to develop a love affair with simply existing. You might say it is the perfect mindfulness practice. To me, the dishwasher is the embodiment of our insatiable need, as a culture, to keep on running, running, running, trying to find something that was inside of us all along.”

We used to have to spend a lot more time and attention maintaining our basic possessions. Dishes had to be washed by hand, stoves had to be stoked, clothes had to be mended, and meals had to be prepared from scratch.

Little was automated or outsourced. All of these routine labors demanded our time, and also our presence and attention. It was normal to have to zoom in and slow down for much of our waking day. We had no choice but to respect that certain daily tasks could not be done without a willing, real-time investment of attention.

There's some truth in that. Much of life is insignificant mundanity. I certainly prefer a life of relaxed, calm focus to one of frantic busy-ness. But still, I say — and not just because last month's experience is still fresh in my mind — this strikes me as someone trying too hard. If you were inclined to be uncharitable, you could say it sounds a bit like spiritual one-upsmanship: "I appreciate housework on a deeper level than you do." It sounds like what Alan Watts described, a hyper-conscious attempt to scrutinize every moment so as not to miss...some vague transcendent something-or-other.

In fact, I say that because I used to be the exact same way. I never had a dishwasher until I moved into this house several years ago. Back in my days of renting a house in the country, I stood at the sink thousands of times with my hands soaking in soap suds and my brain soaking in ideas from books about Buddhism and voluntary simplicity, taking quiet pride in how "mindful" I was being. Watching over my own shoulder, essentially, as if something profound would be revealed in how I scrubbed hardened pasta off a plate.

Like any activity performed deliberately and attentively, it could have a grounding, calming effect, sure. But that could apply to eating a bag of potato chips or picking your nose, too; there's nothing sacred about chores, unless you're still harboring the cobwebs of a Protestant work ethic in the corners of your psyche. And it could be argued that the most profound effect washing the dishes had on me was to fulfill my slightly control-freakish need to assert myself as master of my domain and put things in clean, efficient order, thus contributing to further alienation from the natural, messy, chaotic flow of life.

Humans seek out meta-levels of reality. It's what we do. Perhaps it's the definition of what it means to be human, to treat every action, every object, as a symbol of something else, as a link in an endless chain of contingent meaning. Seeking "deeper" meaning is an expression of that. Imagining a ceaseless state of love and acceptance which can be attained by concentrating one's focus and will like a laser to burn through the veil of maya and perceive the pure truth behind is another expression of that. But there's nothing more true or authentic about washing mass-produced utensils and plates in a sink with store-bought detergent and running water, as opposed to loading them up in the dishwasher, or lugging them down to the creek in a hand-built cart to scrub them with pine needles.

It may have particular benefits. It may contribute to you being a kinder, more thoughtful person. It may help you calm down after a stressful day. But if it doesn't, you're not necessarily doing it wrong, either. I don't doubt that there are many people who feel most alive and perform at their optimal level in a frenetic environment and feel irritation and frustration when forced to move ponderously. I don't relate to them, and I certainly wouldn't want to be one of them, but I wouldn't imply that they're lacking some integral part of human nature, either. The contemplative sage is just one of many possible human permutations, by no means the ne plus ultra.

And daydreaming is just as integral a part of what it means to be human as anything else.

Wednesday, April 17, 2013

Improved Means to an Unimproved End

Jerry Adler:

They are the Nisei of cyberspace—the first generation born into a world that has never not known digital life and so never had to adjust to it as the rest of us settlers have. Like all Nisei, they understand the new world in ways their parents never will and speak its language with far more fluency. If you want to understand the past two decades, they are perhaps the perfect subjects. The drumbeat of disruption and technological advance that has defined the past 20 years is their natural rhythm.

I was born in 1949, so the first 20 years of my life spanned a similarly disruptive era. But the forces that molded my generation were political and cultural, not technological. Nothing in my use of vinyl records or radio or the telephone set me apart from people who were born in 1929 or 1909.

That's an interesting way to put it. I was actually just thinking about that earlier today — so much of what I see in the blogtwitosphere is simply yakking about the media and gadgets we use. The tools themselves. As if achingly banal things are somehow relevant or interesting simply because a social networking site was involved, or because they were facilitated by the current soon-to-be-obsolete gadget. Politically and culturally, things do seem pretty stagnant, so I guess people are eager for any sort of distraction. But it makes me think of an old interview with Eddie Vedder, where he mentions being sick of hearing about young people, and actually looking forward to getting old. I thought that seemed strange when I first read it, but I think I can relate to it now.

Monday, April 15, 2013

An Ill-Fit of Peak

William Deresiewicz:

There is a lesson here. Idiomatic mistakes, at least the ones that stick, are not produced by the hoi polloi. They happen when people try to sound educated—or to be precise, when educated people try to sound more educated than they actually are. A little learning is a dangerous thing. You hear a word like vagaries or misnomer, you think it sounds impressive, you think you know what it means, and you deploy it the next chance you get. And then somebody who has less cultural capital than you, and who looks to you as an authority, picks it up and uses it in turn.

I don't think it's so much a hierarchy of authority; it's probably more like the children's game of telephone. Those of us who don't read dictionaries for fun — not that I know any people who actually do stuff like that — learn most of our new words through the context of conversation. The speaker doesn't have to be personally authoritative; the word just has to plausibly fit in the overall sense of the surrounding sentences. If I came across a word I didn't recognize, and I didn't feel like looking it up, I'd figure out what the speaker or author was getting at, and look for any vacant space of meaning, so to speak, where I could fit the new word in.

Speaking of pedantry, though, I found this amusing: I finished reading Dennis Baron's book A Better Pencil last week. Baron, a professor of English and Linguistics at the University of Illinois at Urbana-Champaign, had this to say on page 222:

Plus, as an editor of mine once told me ruefully, even printed books are never error-free: there's always some infelicity of style, misstated fact, or typo that has escaped the eagle eye of editor and proofer...

Followed by, as if to prove the very point, on page 241:

Nor should it come as news that all technologies of the word control access, or attempt to do so: the full force of the law will come down on anyone who tries to sneak a peak at the latest Harry Potter before its release date... 

Nooo! They got the English and Linguistics professor, too! Damn it. Looks like I'm going to have to steel myself for a long, solitary guerrilla campaign on this front.

Sunday, April 14, 2013

If I Quit My Hobby, Then What Would I Do?

Meng-hu:

What is our work on this earth? Is it a career, profession, busyness pursued for gain, idleness pursued for leisure? Is anyone obliged to pursue anything more than the work of the soul in discovering itself, in harmonizing itself (to God, to Nature, to the universe, to the planet)? Whatever form of work one has, that work must always be the work of enlightenment, or, to use a more modern diction, that of consciousness. What we do to buy food and pay rent is not work but social necessity. That which we do to enrich the soul is our work. Let us pursue it diligently.

Yes, let's. This reminds me of Bruce Ellis Benson's excellent book Pious Nietzsche, from which I'll present one of many equally worthy paragraphs:

To affirm the music of life — to practice music — is to cultivate one's creative vitality (which Nietzsche often calls the "will to power"). This rather broad conception of "practicing music" may seem strange to us (since we today define "music" in a relatively narrow sense), but it would have been perfectly sensible to Nietzsche, who would have had the ancient Greek sense of mousikê in mind. Practicing music for the ancient Greeks was much more than "playing" or "listening" to music. Indeed, as we will see, it also included any art that developed oneself or cultivated one's soul.

Listening to music isn't just entertainment for me; it's a way of harmonizing myself, as mentioned above. It's a way of tuning jangled thoughts and turbulent emotions, gaining inspiration, and facilitating restoration. Same goes for writing. What I do here is more like what Buddhists call their "practice", or what Benson in his book calls askêsis, a spiritual discipline. I'm not persuading anybody or expressing anything important. The goal is merely to become incrementally better at turning vivid perception into clear expression, possibly becoming a mousikos one day.

Came Into This World as a Reject

Will Shetterly:

Its third sin is featuring a rap artist. Many elitists hate rap as much as they hate country, though they don’t like to admit it for fear of appearing racially insensitive. Those who do like rap, like Ta-Nehisi Coates, say Brad Paisley teamed with the wrong rapper, as if only certain black artists deserve to have opinions about white folks who wear the Confederate flag.

I admit up front that this is nothing more than anecdotal evidence and speculation, and that it's nothing anyone would candidly admit to in any event, but I've always suspected that much of the visceral loathing you see expressed on pop-culture snob havens like the Onion's A.V. Club or Pitchfork toward the musical genre popularly known and reviled as nü-metal was a pure example of scapegoating. What I mean is: one of the defining characteristics of that genre, the thing that made it more than a simple continuation of late-80s, early-90s heavy metal, was the overt influence of rap and hip-hop. These were the kids who grew up seeing the collaborations between Aerosmith and Run-DMC, Anthrax and Public Enemy, and the Judgment Night soundtrack, and turned them into entire signature styles, rather than just novelty songs.

For a while there, I recall a lot of people saying it outright: bands like Limp Bizkit, Crazy Town and Korn, whatever else could be fairly said about them, were objectionable in large part for being wiggers. White boys who acted black, adopting the slang, clothing and mannerisms (to wit). The high-minded interpretation that sociology students and social justice warriors would likely offer is that they were simply offended by the "cultural appropriation" of privileged honkies getting rich and famous by making black styles more palatable to the mainstream. (Of course, if Elvis, Led Zeppelin and the Rolling Stones can be respected as artists despite having done the same thing, Fred Durst can probably feel at least a little justifiably aggrieved at his lack of critical acclaim.) But it always seemed to me that your typical discerning, progressive music fan would usually acknowledge the misogyny and other, uh, regressive attitudes prominent in some rap with a squirming awkwardness, whereas they had no problem coming down like a ton of bricks on nü-metal musicians for the same things.

Again, it's just my (possibly limited) perception. But I used to read a lot of popular music press, and it wasn't difficult to find attacks on those bands for their dick-swinging macho lyrics about bitches and fags, for their unashamed, un-indie commercial striving, and for their lack of musical creativity, all expressed with a rabid hatred that I seriously doubt any of those critics would have felt comfortable directing at black artists.

And What I Wouldn't Give to Meet a Kindred

I just linked to a post by Will Shetterly, and I'm about to link to another one, but in between those, I wanted to acknowledge this one too:

This blog has gotten a few hits recently from people looking for leftist critiques of identity politics. I suggest starting here:

"The limits of anti-racism" by Adolph Reed Jr.
"Why Anti-Racism Will Fail" by Rev. Thandeka
"Race, class, and "whiteness theory"" by Sharon Smith
"The Trouble With Diversity" by Walter Benn Michaels

Years ago, I had asked a few of my academic friends for precisely such critiques. Arthur suggested Robert Hughes' Culture of Complaint, which was indeed enjoyable, but other than that, they didn't have any further suggestions, and the political blogosphere, with its tendency to encourage echo chambers and tribal signifiers, was no help. Along the way since then, I've picked up a few useful insights on the topic from conservatives and formulated some rudimentary ones through my own effort, but it's a pleasant surprise to just stumble over something like this, all wrapped up nice and pretty with a bow on top. I read Michaels's book a few months ago (found it so-so), but the rest are new to me.

I had seen perceptive comments from Shetterly on Freddie deBoer's blog, and now, looking over more of his posts, I see he's interested in many of the same topics, from groupthink to cognitive biases, that I've been fascinated by lately. Check him out; I think you might find it worthwhile. You're already reading here, aren't you? This could only be a step up.

And Every Bumper Sticker on the Back of Your Car Makes You Feel a Little More Real

Brad Warner:

If we become comfortable with not knowing, our life becomes easier, and we become freer. Krishnamurti talked about “freedom from the known.” It’s a wonderful feeling. We often become mired in that which we think we know. We become trapped by it, stuck in it. This is a problem because knowing is always to a greater or lesser extent an illusion.

There are a few things we reliably know. I know there’s a cup of water to my right. I know how to drive a car. I know where I parked it. I know how to make really good curry.

But most of the really important stuff we just don’t know. I don’t know how to end gun violence. I don’t know what to say to a girl after I’ve taken her to dinner. I don’t know where I’ll be in five years. I don’t know if my forthcoming book will sell a bazillion copies or none. I don’t know a lot of stuff. And this makes me uncomfortable.

Will Shetterly:

Now that I've accepted that we're all rationalizing animals, I'm giving up my belief systems. Whether I can live without one, I dunno, but I'll try. Because so far as I can tell, whenever large groups of people get together to do something awful, a belief system is the excuse, even when the actual reason is the hope of taking someone else's lands or goods. To do the worst things, people need to believe they're doing good.

This isn't a celebration of willful ignorance, of course, merely an acknowledgement of the need to relinquish ideological rigidity. Personally speaking, that's not to say that I don't know nuthin' 'bout nuthin' at all; it's just that my knowledge tends to be more ornamental than useful. And it's not to say that I don't have opinions, obviously; it's just that if I'm honest, I don't feel very certain about the ramifications or the implications of those opinions. A does not necessarily entail B, let alone everything else along the way to Z. I don't feel the need anymore to connect all those opinions together within an internally consistent ideological edifice. Could be that I'm finally just mature enough to cheerfully acknowledge what has always been the case: that I don't necessarily have a basic understanding of everything of importance, nor do I need to. The Chinese farmer knows the deal.

Friday, April 12, 2013

You're the Sounds I Never Heard Before, Off the Map Where the Wild Things Grow, Another World Outside My Door

You, darkness, of whom I am born—

I love you more than the flame
that limits the world
to the circle it illumines
and excludes all the rest.

But the dark embraces everything:
shapes and shadows, creatures and me,
people, nations — just as they are.

It lets me imagine
a great presence stirring beside me.

I believe in the night.

— Rilke

Melanie McGrath:

For me, insomnia’s greatest gift is the uninterrupted time and mental space it allows for reading and thinking. There’s a freedom to the night, an unconstrained permissiveness. Under cover of darkness, anything goes. Being awake in the night feels like stealing a march on time. Senses sharpen, so does the memory. The air stills and it is as though you have passed into some other, more magical dimension in which earthly rules no longer apply. There’s an exploratory feeling to the night, a special magic, as anyone who regularly stays awake through it knows. The night’s sounds, smells and sights are exclusive. The quiet lends itself to brooding, even to epiphany, at the very least to an intense focus, what Seamus Heaney calls ‘the trance’ which can be both alluring and, for creativity, highly fruitful.

And so I think and I read.

That's a beautiful little passage, and she's right. I always loved that about working at night. For me, I think it was an extension of my extreme introversion more than anything else; being awake and active at night was a chance to be free from the prying eyes and intrusive approaches of other people. But yes, assuming you're not awake due to being delirious with illness or caught in the grip of the night terrors, it does engender a certain psychological shift, a slight difference in perspective that I've always found beneficial for imagination. Much of the writing I've done here, up until a couple years ago, was actually done in my head while working at night.

When I was a kid, I used to be allowed to go to work with my dad on weekends or over summer vacation if I could be awake and ready to go on time (and I quickly learned that reading in order to avoid getting sleepy was a terrible strategy). I was always fascinated to see evidence of other people being awake at all hours, especially those who weren't working like we were — why were they still up? What was different about their lives? As in most things, I'm sure the reality was far less interesting than the fantasy, but still, I sensed something attractive and intriguing about what it would mean to consciously choose to set yourself against the traditions and habits of everyone else. Overnight travelers, drunks looking for a place to sleep, or people working the graveyard shift for lack of any better options were transformed in my childish imagination into what I would later conceptualize as philosopher-poet-hermits, gently resisting the gravitational pull of normalcy and respectability.

Even as I got older, I'd still feel a fleeting sense of kinship at the sight of a lit bedroom window out in the suburbs, or a silhouette moving through the kitchen, or even the occasional person walking through a parking lot or alongside the road; not hitchhiking, not furtively sneaking around, just purposefully heading somewhere for their own reasons on their own schedule. We shadow people flitting about on the fringe. Eyes of awareness keeping solitary vigil.

To go in the dark with a light is to know the light.
To know the dark, go dark. Go without sight,
and find that the dark, too, blooms and sings,
and is traveled by dark feet and dark wings.

—Wendell Berry


Thursday, April 11, 2013

Hermit Grabs

Man. It's been a very busy couple of days for me, and then having to post bail on top of that...

Tuesday, April 09, 2013

Sociopathology

N+1:

With the generalization of cultural sociology, however, the critical impact has vanished. Sociology has ceased to be demystifying because it has become the way everyone thinks. Discussions about the arts now have an awkward, paralyzed quality: few judgments about the independent excellences of works are offered, but everyone wants to know who sat on the jury that gave out the award. It’s become natural to imagine that networks of power are responsible for the success or failure of works of art, rather than any creative power of the artist herself.

It's a basic tenet of much Marxist-related thinking that the individual is not a coherent unit of sociopolitical analysis; only the study of class, gender and race (plus a few other distinctions that the article mentions) reveals the interests and forces at work in the body politic. And so, in what I'm starting to think of as the blogtwitosphere — the part of the Internet consisting mainly of social media peacocking and posturing, pop/youth/geek culture enthusiasm, self-congratulatory "I see what you did there" humor, malignant snarkomas, and vaguely progressive woo-girlish politics — you get a lot of former humanities students who offer up half-baked analysis of pop culture by way of irrelevant metrics. It's enough to make me glad I never went to college.

Monday, April 08, 2013

Right Here, Right Now, There is No Other Place That I Want to Be

David Berreby:

Now, I have to admit that fantasy is probably truer to the actual sweep of human history. We know many cultures view history as an endlessly repeating cycle. This view is more common than the modern secular notion that history has progress and a meaningful direction. And we also know from the archaeological record that most assemblages of human beings have been remarkably culturally static. Ancient Meso-Americans had the wheel, but used it as a toy. The ancient Greeks had a steam engine, but also only as an ornamental curiosity. For all their roads and bathhouses and architecture, the Romans never thought to invent the stirrup. In the 18th century, the industrial revolution took hold only in one tiny corner of a populous planet.

The belief that technological improvement is inevitable or constant is a myth. The fact that it happened at all may well be a lucky accident. It seems to me likely that most of the human race has lived its real, non-fantasy life in a Game of Thrones world—where knowledge was static and scarce, and change unknown. And that's the saddest thought of all. When I have to pick my genre, I'll take a dream world of starships over that any day of the week.

That's interesting, because I am firmly in the fantasy camp myself. Never been into science fiction. I prefer sending my imagination backward to pseudo-medieval times. Yet, of course, that doesn't translate in my lived life to a yearning for rural, small-town existence or anything like that. I wouldn't choose to live anywhere, anytime else (I'd just like to take some time-travelling vacations, if that's all right).

But in saying that, don't I simply mean that I wouldn't want to have to make a sudden transition to the living conditions of another time and place? That is, assuming for the sake of this thought experiment that it makes sense to imagine "me" somehow living in Jazz Age America or T'ang Dynasty China or some D&D-style medieval Europe, isn't the problem the fact that I can only do so by bringing along a bunch of luggage containing my experience and understanding as a 21st-century American? Not to put too banal a point on it, but I can never truly know what it was like to actually live back then and truly inhabit the moment. I can only imagine what it would be like to be there as my homesick self.

And so, then, does it make any sense to project our own conception of purposeless stagnation back onto pre-industrial societies, as if they lived with a conscious awareness of how tedious their lives would look in comparison to ours? Did they sit around unhappily bemoaning their lack of technological invention and supporting infrastructure, or did they just live their lives, loving and hating within their given parameters? Does the average American sit around in a library today joyfully enthusing about the wonders of science and progressive knowledge, or do we complain about our shitty cellphone reception and seek out porn and reality TV to pass the time? Human nature only changes very slowly, if at all. Perhaps the biggest difference between us and people of bygone ages is the way we've become addicted to novelty and customized individual choice.

My own attitude toward all this is one of jaded-but-indulgent acceptance; I'm one of Montaigne's spectators of life. I try to be consciously grateful for the amazing things unique to my day and age, but I also recognize that the mystics are right when they say wherever we go, there we are.

The End is Not, So Long as We Can Say “This is the End”

Carlos Lozada:

So, men may rise again, free markets may survive, power may concentrate, lawyers may thrive and sex may go forth and multiply. As Vassar’s Hsu put it to me, this type of idea “works best when it’s somewhat self-aware of its limitations.”

Except the entire genre undercuts that impulse. The more grandly you proclaim the end, and the more vast and undefined the thing that is ending, the easier it is to kill off. That’s why we see essays like Peter Thiel’s “The End of the Future,” as ambitious in scope as it is dubious in argument.

If you’re contending that something specific has ended — well, the specific is measurable, observable and debatable. Specificity implies expertise. Generality is accountable to no one.

So let’s get it over with and declare, once and for now, the end of everything.

Meta-humor like this makes me snicker. Of course, let me remind you that I already pronounced a time of death on such pretentious portentousness. Or portentous pretentiousness. No, wait, had it right the first time. Whatever.

Sunday, April 07, 2013

Unhumanize Our Views a Little

As for us:
We must uncenter our minds from ourselves;
We must unhumanize our views a little, and become confident
As the rock and ocean that we were made from.

— Robinson Jeffers

John Gray:

In any case a low-tech, relocalised economy would not deal with what Lovelock regards as the fundamental problem: the rising numbers of human beings. Climate change has not always been caused by us; there appear to have been several large shifts before the human species existed. However, if the current global warming is anthropogenic (as Lovelock still firmly believes), human numbers play a critical role in the process.

When I suggested to him that the perennially unfashionable Thomas Malthus may in the long run be shown to have been on the right track, he responded: “Yes, John, I agree strongly with you that rising population is probably the greatest danger. If we had stayed at Malthus’s numbers, one billion, there would be no climate problem.”

Like nearly all economists, most greens insist that Malthus was wrong. The problem, they say, lies in the resource intensity of the western way of life; what we need to counter this is a global redistribution of power and wealth. I am not sure if Lovelock shares my view that this is an entirely utopian prospect, but he is clear that sustainable development – the current mantra – cannot deal with the challenges posed by a rising population. What is needed instead, he suggests, is sustainable retreat: a strategy of reducing the human impact on the planet by abandoning old modes of food production and embracing high-density urban living. (There are parallels between Lovelock’s ideas on these issues and those of Stewart Brand, the editor of the Whole Earth Catalog.)