A sweary—and expertly punctuated—weblog.

Tuesday, December 21, 2010

Nowell

Here is one of my very favorite Christmas arrangements. Take a listen!



(Also: the color changes throughout the video. That is neat!)

Sunday, December 12, 2010

A skeptical Christmas

I have almost no memory of a belief in Santa Claus.

The single memory I have is nebulous at best. One Christmas Eve, when I was four or five, my uncle pointed out the window and told us he could see Rudolph's nose twinkling in the distance. By now the memory is so old and worn-out that I can't say how much I believed him, but I recall that I scoured the sky with at least some expectation of finding a red glow. Of course, whatever hopes I had were for naught.

On the other hand, I have precisely zero memory of finding out "the truth" about Santa. To the best of my knowledge my parents never sat me down to let me in on the secret, and I never accidentally stumbled upon the presents hidden in my parents' closet. As far as I remember, I just grew out of it. (By the time we got our first Nintendo—1988 or so—I didn't believe. I know this because my brother and I found the Nintendo a week or two before Christmas, a discovery which challenged no one's worldview.) I was lucky never to deal with the emotional trauma or feelings of betrayal that (I hear) other kids have to overcome when they learn their parents have been pulling the wool over their eyes. I just stopped believing.

I'm sure my Santa skepticism was facilitated by the fact that my parents didn't try artificially to keep us believing. (Thanks, Mom and Dad!) But even so, the fact that I can hardly remember believing in Santa Claus is a symptom of a more general condition: I'm a skeptical fellow, and I always have been. In that spirit, perhaps you will not be surprised at the following announcement, which after all this time I can no longer keep bottled up:

One year ago, I left the Church of Jesus Christ of Latter-day Saints.

I've agonized for months over how to tell this story, writing (and frequently discarding) pages and pages of explanation of my decision to leave. Perhaps sometime soon I will post a detailed justification, but for now let it suffice to say that I no longer find the church's truth claims compelling. It's no more complicated than that. There was no pint of cream nor transgression to conceal. I simply no longer believe.

As with Santa Claus, losing my faith in God has been a surprisingly natural process. That's not to say everything has been easy; if nothing else, managing relationships with friends and loved ones who still believe is a work in progress. But I am the same person I was a year ago, the same person most of you know in the flesh. While I do occasionally mourn my lost faith, on the whole I'm as happy now as I ever was inside Mormonism. Certainly I am as fundamentally good and honest an individual as I ever have been. Indeed, other than the occasional adult beverage (for the record: beer is okay, wine is nice, and whiskey is wonderful) my lifestyle is largely indistinguishable from that of an active Latter-day Saint.

I consider myself rather fortunate to have survived apostasy with so few emotional scars. It means that I need not reject all of the tradition and culture with which I was raised. It means that, while I no longer believe in God, I still love Christmastime. People have wondered at this, and occasionally I have been accused of inconsistency or even hypocrisy on this point. I admit that my accusers have a point, and while I refuse to apologize I will attempt to explain.

Having experienced one-and-a-half Christmases as a non-believer, I now realize that religious belief is only tangential to what makes Christmas special. Christmastime is an opportunity to spend time with the people we love. To eat food and listen to music that connects us to our childhood. To participate in traditions that not only bring us closer to our loved ones, but also reinforce connections to our shared past. I maintain that one does not need a belief in God to sing Christmas carols, to cook and eat festive food, or even to go to a Christmas service. This morning I played the cello in Amanda's ward's Christmas program, and last year we attended watchnight services at a 13th-century cathedral. These experiences were not even slightly cheapened by my unbelief.

But of course I can't ignore the religious aspect entirely, and as an apostate from Mormonism I can testify firsthand of the strife faith can cause. Yet I am entirely untroubled by the religious underpinnings of Christmas. Christmas is religion divested of its propensity for ill. It brings a simple, universal message of peace and goodwill, a God-figure as benign and innocent as a newborn babe. Probably there is nothing true about Christmas's religious message. But there is no harm in this.

This is my third Christmas post. If you look back to previous posts, you'll notice that every year I have made some mention of religious skepticism. Christmas has lately been a particular opportunity to explore the crisis of faith with which I have been dealing for several years now. But never—never—has my lack of faith interfered with my ability to enjoy the season.

Merry Christmas to you all.

Friday, October 29, 2010

When worlds collide

You guys know that I love Dinosaur Comics, and by now many of you will have (correctly) surmised that I have a possibly-unholy man-crush on its creator, Ryan North. You also know that I hate political extremism, especially as exhibited by porcine ideologue Glenn Beck. Yesterday, in a singular amalgam of rage and glee, those passions merged.

A few days ago, Ryan North released his new anthology Machine of Death. Inspired by this comic, it's a collection of short stories about a machine that tells people—accurately but obliquely—how they will die. The machine might tell you "old age", for example, but instead of settling down for a comfortable, long life you are murdered in your twenties by a raging octogenarian! The collection prominently features the work of the webcomics community: David Malki ! of the wonderful Wondermark co-edited, Kate Beaton of the historically hilarious Hark! A Vagrant provided illustrations, and even Randall Munroe of the overrated and frequently abysmal xkcd contributed a story.

While Ryan North and friends might be darlings of the net-savvy world, they don't have tons of real-world clout. Machine of Death was therefore self-published, and most of its publicity came via the webpages of its various collaborators. Imagine their surprise when their scrappy opus went straight to #1 on amazon.com! It's a feel-good story for the ages.

The release of Machine of Death, however, coincided with the release of Glenn Beck's latest book Broke. And instead of debuting at #1 as he had come to expect, Glenn Beck was beaten out by a ragtag group of independent artists (and, adding insult to injury, Keith Richards).

But instead of accepting third place graciously, Beck decided that his loss was due to a liberal "culture of death"—never mind that he was beaten out by both "Death" and "Life", which must indeed be demoralizing!—that threatens to destroy our very way of life. You can find the audio clip from Beck's radio show here. What's that? You don't want to listen to Beck's sonorous, mellifluous voice? Very well then; here's the salient quote:

These are the — this is the left, I think, speaking. This is the left. You want to talk about where we’re headed? We’re headed towards a culture of death. A culture that, um, celebrates the things that have destroyed us. Not that the Rolling Stones have destroyed us — I mean, you can’t always get what you want. You know what I’m saying? Brown sugar. I have no idea what that means.

This is where we are heading, you guys! If they are not stopped, small, independent groups of creative, sincere, kind (seriously: check out North's and Malki's twitter feeds; they are populated with enthusiastic interactions with fans), and entrepreneurial artists will, for dozens of hours, succeed in selling more books than pink-faced, corporate-sponsored propagandists. This is a threat to us all.

In response to the controversy, Malki created an infographic helping us to distinguish political thinkers from opportunistic shysters:

Here's how you can tell: Instead of accepting defeat like a man—which, after all, conforms to his purported political principles—the shyster will cry like a baby when his followers don't give him enough money, complaining that subversive, insidious forces are to blame.

Quoth my lovely wife: "Aw, poor Glenn Beck. Here, have a lollipop to make everything feel better."

Tuesday, September 21, 2010

Noisemaker

I found this quote on Facebook today:

Intelligence is the ability to hear someone's opinion and not be swayed by it.

It was left unattributed, prompting me to consider the sad possibility that the page's proprietor authored it himself, feeling sufficiently proud of his brainchild to inflict it on the internet-going public. Too bad for him. His quote is blindingly, breathtakingly stupid, and completely antithetical to every intellectual endeavor ever. (Also: it makes me angry.)

At the risk of insufferable white-knighting, let me clear something up. Intelligence never exists in a vacuum. Intelligence is routinely mistaken. Intelligence neither has nor pretends to have all the answers. Intelligence is sufficiently confident that it cheerfully admits its limitations. Above all, intelligence craves further understanding, relishing opportunities to refine and revise and be swayed by the opinion of another.

Here is a tip for you, O anonymous peddler of Facebook quotations. When you argue, you can nearly always find common ground with your opponent, a reasonable component of his argument that causes you to adjust — ever so slightly — your thinking. When this proves infeasible, there are two possibilities: either your opponent is an intransigent, incoherent noisemaker, or you are. Take care that it isn't you.

I feel guilty having subjected the internet to the above quotation, and my grandstanding is not penance enough. Here is compensatory wisdom, quotes that ought to be on our anonymous friend's profile:

The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts. - Bertrand Russell

The first principle is that you must not fool yourself — and you are the easiest person to fool. - Richard Feynman

The truth is always a compound of two half-truths, and you never reach it, because there is always something more to say. - Tom Stoppard

In all affairs it's a healthy thing now and then to hang a question mark on the things you have long taken for granted. - Bertrand Russell

There. Now I feel better.

Thursday, September 9, 2010

The best just keeps on getting better

Sigh. Many of you will remember Jon McNaughton, whose unfettered artistry I sampled in a recent post. Well, he's at it again:

The Forgotten Man!
(Incidentally, one of the cutest parts of McNaughton's site is his right-click blocker, which soberly informs you that the images on his site are copyrighted. I just want to pat him on the head and ruffle his hair!)

Perhaps he realized that playing artist-in-residence to the fringe right generates notoriety and its corresponding profits. I can hear Ayn Rand's post-mortem exultation from all the way over here.

I seriously considered not writing about this painting—there isn't much to say that doesn't also apply to McNaughton's previous offering—yet after the enthusiastic response to my previous post I feel duty-bound to share it. It's all here, just as before: the crushingly heavy-handed political message, the workmanlike copy-and-paste historical portraiture, the inarticulate rebuttal of "liberal" criticism. So as much fun as it would be to tear The Forgotten Man limb from limb, allow me instead to offer summary criticism. It's a better use of everyone's time.

Art transcends the prosaic machinations of day-to-day politics. It may reflect on its times, but when it does so it captures their essence rather than regurgitates their tiresome details. Decades from now we will have largely forgotten the political minutiae responsible for the controversies over which we so bitterly disagree, their particulars no more noteworthy than the 1791 whiskey tax or the merits of bimetallism. Yet McNaughton glorifies the petty conflicts, disgorging one-sided talking points as though they were timeless truths plucked from the tree of knowledge. He panders to the immediate present, and the result has a correspondingly short shelf life. He wants us to accept it as art, but it isn't; it's a political cartoon whose medium happens to be oil on canvas.

Sunday, August 22, 2010

Deconstructive criticism (or: The Pixar model)

Getting a Ph. D. is hard, you guys.

As an undergraduate, I could be a guy who knew stuff. Classes gave me material to learn, and as long as I mastered that material I could confidently expect success. It wasn't easy, necessarily—learning subtle and unfamiliar concepts demands effort—but it was clearly defined; there was never any mystery in how to succeed. Classes had well-encapsulated curricula, insulating me from my ignorance, distracting me with newfound knowledge while keeping me from knowing what I didn't know.

Graduate school is exactly the opposite. Your first task is to become familiar with the literature in your discipline, which is no small feat. Even within a single discipline there are more ideas than you can internalize, more papers than you can ever possibly read. And since graduate research is highly individualized, there can be no master syllabus of necessary and sufficient papers. You must therefore decide for yourself what to learn. Rather than being led through a carefully-crafted curriculum, you get a shocking, unstructured look at your ignorance—an ignorance you can only selectively remedy. And, for good measure, there's no one to administer a quiz at the end to make sure you understand it correctly. All of this means that even after reading a bunch of papers it's hard to have confidence that you know enough to carry out successful research.

It's even harder when doing your own research. This isn't just because the problems to be solved are difficult. In fact, I bet most researchers would agree that actually doing the work is the comparatively easy part of graduate school. Instead, the hard part is knowing what to work on. The great struggle of a scientific Ph. D. is finding a problem that is simultaneously important, unsolved, and solvable. And there's no formula—at least, none that works—for finding such a problem. It's relatively straightforward (again, not easy, but usually straightforward) to sit down and start doing a little math. It's hard to know whether or not that math is going to lead to anything important. The path to Ph. D. is a ragged serpentine, full of blind alleys and unintended excursions.

And you have to do it alone. One of the most terrifying realizations of graduate school is that your advisor does not have the answers. He doesn't fully understand what you're working on. He doesn't know whether or not that work will yield publishable results. Hell, he usually doesn't even know whether or not there are mistakes in your work. He doesn't have time to hover over your shoulder and micromanage your progress. He can give you valuable advice from his experienced (but information-poor) perspective, but no one knows your research as well as you.

This shifts the burden of criticism back onto you. You have to look through your own work, scour it for flaws and weaknesses, and decide whether or not you're doing fruitful research. In part this is a healthy exercise: anyone who hopes eventually to head his own research program needs to learn to discern good work from bad. And, in any case, healthy self-criticism is an important part of being a well-adjusted member of society.

But let's be honest: self-criticism is hard. Every talented person wants to believe he is talented, secretly fears he is not, and furtively goes about proving to himself and others that his fears are unjustified. Graduate students routinely suffer crises of confidence, and even honest self-criticism can aggravate the symptoms. Yet academic research is sufficiently demanding that confidence—perhaps even unreasoning, absurd overconfidence—is an essential component of success. It's nearly impossible to fruitfully chase down an idea while tending nagging fears of making mistakes or wasting time down blind alleys.

So, as a researcher you face a dilemma similar to that of an artist. Pay too much heed to your internal critic, and you end up paralyzed by self-reflection. Ignore it entirely, and you produce flawed, unimportant, or otherwise self-indulgent output. Every artist, composer, scientist, and writer faces this dilemma, each walking the knife-edge in his own way.

But some walk it really, really well. I recently came across an article describing the creative process at Pixar. (If you don't like Pixar films, then you're either too cool, have a heart made of stone, or otherwise suffer from crippling personality flaws. Seek professional help.) Pixar has consistently output innovative, high-quality films, and they achieve that success through interesting means. Every morning, animators and directors gather to examine the work completed the previous day and pick it apart in excruciating detail—deciding what works, what doesn't, and how it should be improved. The animators then take the criticism back to the drawing room and implement these changes. Rinse and repeat until you have a masterpiece on your hands.

In one sense it's not terribly surprising that Pixar's formula is successful. Take a bunch of creative and talented people, create a highly collaborative work environment, and you get phenomenal success! How... obvious, right? But their story struck me in a non-obvious way. How much easier is it to silence the self-critic when you know that a group of your smartest, most talented friends is going to go over your work tomorrow, looking for flaws you overlooked? How much easier is it to be freer, more daring, and more innovative when you know that other talented people are going to keep you from going too far off the deep end?

As someone who suffers from an overactive internal critic, it sounds liberating to me. It's not that I want to avoid the responsibility of evaluating my own work but that I could do it much more effectively with constant, systematic oversight from respected, talented colleagues. As I look forward to the (hopeful) future of running my own research group, the Pixar model appeals to me.

Of course, implementing it presents serious challenges. In describing the model I've tacitly assumed that you have a group of talented people, each of whom is secure enough in his talents and secure enough in his colleagues' that he will not only accept criticism without feeling attacked but also give criticism without attacking. That's a work environment that must be difficult to create and even more difficult to maintain—especially among smart, successful people whose overdeveloped superegos have gotten them far in life. It probably requires a strong sense of shared objective that's hard to foster among creative, independent thinkers.

I don't know what the answer is, but Pixar's success gives me optimism. They've proven that such an environment is possible to create and to maintain for well over a decade. Can't I hope to achieve a creative environment that crusty ol' Steve Jobs has pulled off?

[PS: Made on a Mac!]

Saturday, July 31, 2010

Retrospective / Autobiography of a face

One year ago today, we moved into our new house. After a few hectic days of packing and painting and paperwork, assisted by our gracious friends, we finally moved everything into our townhouse and officially became homeowners. Buying your first home is a life-changing experience, the kind of achievement you mark for the rest of your life.

But we're not here to celebrate that anniversary. Houses are great and all, but a much more momentous change came upon me that day. For on July 31st, 2009, I began growing a beard. Those of you fortunate enough to see me regularly in person know the glory of which I speak. Some of you may even be jealous. Never mind that; today is a day for celebration. My beard has accompanied me through much life in the past 365 days. It saw me across the finish line of my first marathon, journeyed with me on a South African safari, and yes, grew with me as I settled into becoming a homeowner. It may be my truest friend.

Yet it is hard, even for me, to go on for paragraphs and paragraphs about the unfathomable wonder of my facial hair. Instead, allow me to turn this moment of celebration into a public service: a how-to guide, written for all of you—and I trust it is indeed all of you—who secretly yearn to emulate my success.

When I first set out upon my beardly quest there were few references available. Esquire had a useless article, and I found a YouTube video or two, but I could not find sufficiently specific information, and I was forced to make it up as I went. And I made some bad mistakes. Fortunately, after a year of experimentation I have become a fully qualified beardologist. Allow me to share with you my wisdom, lest you fall into the same traps that so ensnared me.

Preliminaries

Before discussing technique, let's first talk about why you should grow a beard. It's imperative that I dispel the all-too-common myth of beard teleology: you need no specific reason to grow a beard. You need no rebellion against the draconian standards of your youth. You need no desire to augment your manliness. These are lesser justifications, put forward by lesser men growing lesser beards. The true beard-wearer grows his beard simply because he wants to. He is nothing more than a dude with an adventurous spirit and a desire to let his hair follicles in on the adventure.

It's also imperative that you maintain confidence through the early stages of beard growth. The neophyte beard-grower is often disheartened by the time necessary to grow out a fully magnificent beard. I will not sugar-coat the truth: as much as a month without shaving will be required to grow a respectable beard, and in the meantime your proto-beard will not be an attractive testament to testosterone but a sparse, scruffy embarrassment. Take courage, friend; all who would achieve beard-dom must pass through this ordeal. Given time, your paltry prickles will blossom into a majestic mane.

Once your follicles have produced the raw material, it must be molded into beardly greatness. I will devote the remainder of my words to introducing the three core principles of beardology: shape, contour, and blend. Each corresponds to one of three tools you will need: razor, trimmer, and scissors. Let's discuss each one individually.


The instruments of beardology. All photos courtesy Amanda.

I. Shape

The first step in growing a beard is choosing the shape, as defined by the portion of your face you choose not to shave. One of the benefits of having a beard is that you won't have to shave as much or as often, but it's still important that you keep your beard's shape well-defined by shaving—with a razor—the area around your beard.

While this chart does a pretty good job of ranking beards from best to worst, shape is ultimately a personal decision determined by the natural coverage of your facial hair and your personal sense of facial aesthetics. For example, I have pretty full coverage (as well as a classic, timeless aesthetic), so I grow a standard full beard. Villainous folks will be interested in goatees and soul patches, and those of you looking to make use of your aviators will want to grow a mustache.

Don't grow a mustache.

For a full beard, shape is defined primarily by the neckline. There are a few schools of thought on proper neckline position. Esquire mandates a neckline one inch above the Adam's apple. Others suggest a neckline that closely follows the jawline, resulting in a mostly bare underjaw. I prefer to place my neckline at the corner where the underjaw intersects the neck; it results in a clean, anatomically-defined line rather than one arbitrarily and artificially imposed.

There is room for honest disagreement on this issue. A lower neckline, for example, can accentuate the turkey-neck, so a higher neckline may be preferable. On the other hand, I have seen successful necklines that stretch well below the neck/underjaw corner. Regardless of position, the key to a successful neckline is geometry. Draw an imaginary line from the bottom of your ear down to whatever point you've chosen above or below your Adam's apple. The line should curve slightly to accommodate the shape of your neck, but it must be smooth. Shave along this line. This can be difficult, as part of the line will likely be obscured by your jaw, so it's often helpful to pull back the skin around your jaw so you can clearly see where you are shaving.

Some beard-growers will also define an upper shave-line on the cheeks. I think this is typically a bad idea: artificial lines make your face look evil and should be kept to a minimum. Instead, the occasional stray hair on your upper cheek can be shaved or plucked individually, preserving the beard's natural contours. However, if your beard ends raggedly on your upper cheek, you may be forced to define an upper boundary with your razor. In either case, allow your beard to define itself as much as possible.

If you aren't growing a full beard, the shaping principles will be different. In general a goatee's neckline will be much closer to the jaw, and a "philosopher's" beard is best grown with no neckline whatsoever. But for specifics you'll have to look elsewhere.


Left: the corner-defined neckline. Right: a natural upper boundary. There are a few individual hairs that probably could be plucked to clean up the boundary without making it look artificial.

II. Contour

To maintain a beard, you will need to purchase a beard trimmer. They aren't terribly expensive, and they are indispensable for a clean-looking beard. A beard should look classy, not sloppy. Your poorly-maintained beard ruins it for the rest of us.

Decide how closely you want to trim your beard. In general, shorter beards look cleaner and younger, while longer beards look serious and distinguished. Let your age and relative awesomeness be your guide. However: do not, under any circumstances, consider a stubble beard. No, it doesn't look good on you. No, you don't look rugged or roguish or dangerous or debonair. You just look like an ass.

My worst rookie move was to assume that I should trim all of my beard to the same length. This is a mistake. Part of the reason for this is evenness: different parts of your beard have different coverage densities, and trimming at different lengths allows you to create an illusion of uniformity across your face. In general, the more dense the coverage, the shorter you should trim the hair. The other part is aesthetics: some portions of your beard (such as your underjaw and neck) are unattractive on their own, and others (such as your mustache and soul patch) have evil connotations. By trimming these portions shorter than the rest of your beard you can de-emphasize them, resulting in a beard whose components fit together in a unity of form.

Contour is tricky to get right, and ultimately you'll simply have to experiment. It's useful to pay attention to photos of yourself, particularly ones taken from odd angles. This allows you to see your beard as others see it—instead of what you see in the mirror. In my case, I set the trimmer to 11 for most of the beard. I turn it down to 8 for the neck/underjaw, 5 for the mustache, and all the way down to 3 for the soul patch. It's nearly impossible for an honest man to trim his soul patch too short.

III. Blend

On the whole, your trimmer will do most of the heavy lifting of getting your facial hair the length you want it, but it is not a precision instrument. To finish the job you will need a decent pair of barber's scissors. Go out and buy a pair; you can get one at Target for $5 or so.

You need the scissors for two reasons. First, no beard trimmer works perfectly. Since your face is irregularly shaped, there will be sections of beard that your trimmer is not nimble enough to handle. My trimmer, for example, has a particularly hard time with the area just under the hinge of my jaw. Your trimmer will also leave the occasional stray hair. This looks particularly bad on the mustache; mo-hairs encroaching on the lip are particularly unsightly. In either of these cases, you'll need to go in manually with scissors. There's no magic technique to scissor-trimming, and unfortunately it's a bit clumsy at first, but it isn't difficult to get the hang of it.

Second, we've trimmed different sections of beard to different lengths as prescribed by Contour. To forge these disparate sections into a coherent unit of magnificence you must blend them together with your scissors. The trick is to look along the section boundaries for mismatched hair lengths. Most of my blending efforts are spent either fading my jawline into my underjaw or matching my mustache into the rest of my face. Again, there's no trick to blending, but fortunately it's easy to get into a quick rhythm. Be bold; an accidental short patch will grow out quickly.



Left: my trimmer can't get at these hairs under my jaw, so I go at them manually with scissors. Right: notice the individual hairs drooping onto my lip. Not attractive.

Conclusion

There you have it. This guide, while incomplete, gives you the tools and knowledge necessary to traverse the path to beard-dom. Go forth, o my brothers, and make majesty of your faces.

Sunday, July 18, 2010

Pushing

Written last week during a fit of insomnia, and delivered against my better judgement into the cruel hands of the internet:
On my twenty-sixth birthday, I joked that I was "pushing thirty". It was a casual, throwaway technicality, designed to poke fun at the neuroses of those nearer to the mark, borne of an arrogance only possible because my own senescence was a hypothetical whose realization I had never confronted. Now, scant years later I am, unequivocally, pushing thirty. No longer merely discernible on the horizon, it hurtles toward me (or I toward it) relentlessly, and I—fattening, incipiently balding, cognitively ossifying—scrabble for handholds as my horizontal trajectory tilts savagely to vertical, seeing not only the mark but the headlong path beyond it, hanging momentarily weightless at the precipice as fear and anxiety are swallowed up in a single, overriding despair:

I don't want to die.

Sunday, July 11, 2010

Meta-post: Rebranded

You've hopefully noticed that the blog has a new look. I worked at a web design company during undergrad, and after the successful redesign of my ECE webpage and Amanda's now-mostly-photo blog, I decided to apply my vestigial skills to these humble pages. It turns out that Blogger really doesn't want you to design your own template; they'd rather have you use one of their prefab designs. But after I wrestled Blogger to the floor and put it in a half-nelson, my custom design emerged victorious.

I hope you like the new design—I stole from a lot of good pages to put it together! After a year or so of resisting it, I finally incorporated the "lowercase" idea (which was never intended as a pun; "lowercase profanity" refers to mild swearing, the kind a good Mormon boy might use when he gets upset) into the template. I've tested everything out on a few different browsers, but if anyone notices any layout issues I'd be grateful to hear about them.

On an unrelated note, I've decided to de-list this blog from Google Buzz. Most of those following me are co-workers/colleagues, and I'd prefer to have the freedom to write whatever I like without worrying about the fact that I'm explicitly broadcasting it to the ECE department. Everyone is welcome and encouraged to follow (and comment at!) this blog, but I'd rather not deliver it to everyone's inbox. It feels too exhibitionistic to me. I hope that many of you will still keep up with it. That'd be real swell.

Thursday, June 24, 2010

The best art of all the art

We're in Utah this week, attending my sister's wedding and hoping to see some old friends while we're here. We moved away from Provo nearly two years ago, and I'm surprised to find just how nice it is to be back. Not only are the surroundings much, much more beautiful than anything Houston has to offer, but Provo also has a bright, cheerful air that I both admire and miss. Fueled as it may be by equal parts naïveté and self-delusion, Provo's happy-go-lucky optimism makes me feel at home in a way I never could have anticipated while I lived here.

Reconnecting with our Utah roots, Amanda and I wandered around BYU campus for a few hours, eating lunch at the Cougareat, visiting old classroom halls, and eventually perusing the BYU bookstore. In addition to the usual university bookstore fare—hats, T-shirts, and textbooks—there's also a candy store, a floral shop, and a gallery where you can purchase art frames and (mostly LDS-themed) paintings.

You can also purchase terrible, terrible shit.

While browsing the gallery I came across this painting, prominently displayed, by Utah-based painter Jon McNaughton:

Initially I just laughed at what I considered a simplistic, oh-so-Utah expression of religion-cum-patriotism, appropriately rendered in the artless schlock of Thomas Kinkade. As I looked closer and realized the specificity of the artist's "message", however, my emotions began to vacillate between acute annoyance and a long-shot hope that this thing might be a marvelously subtle joke.

Sadly, McNaughton earns no points for irony. His painting may look like an exercise in self-caricature, but the humor is unintentional. That you might understand my frustration—and that I might blow off a little steam—allow me to turn my trained artistic eye on this painting and provide a critical exposition.

The central focus of the painting is Jesus Christ holding the U. S. Constitution up to the world. This makes sense because Jesus actually wrote the Constitution and revealed it to the founding fathers—devout Christian men like Benjamin Franklin, Thomas Paine, and Thomas Jefferson. Divine authorship of our Constitution is the main reason that the U. S. is the best country of all the countries. So it's important that we immortalize that in our art.

Along with the requisite historical figures flanking the Author and Finisher of our Constitution, there are a few "modern" presidents whose presence is worth mentioning. Obviously Ronald Reagan, who by construction was the most benevolently badass President, supports Jesus and His pro-American agenda. Curiously, however, JFK is also represented among the Constitutional vanguard. As the only righteous representative of American liberalism, his inclusion can only be explained by his willingness to kill godless Communists.


The lower half of the painting is given over to a depiction of the modern American public, divided into two groups who, significantly, are on the right- and left-hand sides of Jesus. On His right hand, obviously, are the ordinary, decent Americans who Believe in and Uphold the Constitution. Their simple patriotism is rendered in stereotype: there's a soldier in uniform, a mother with child in arms, and a simple, working-class man in plaid and overalls.

In a laudable effort at racial inclusion, a lone black man is counted among the righteous—presumably because he's got his copy of Skousen in hand. (Non-LDS readers should be advised that W. Cleon Skousen was an influential LDS author in the 50s and 60s, writing on both political and religious topics. Whatever his other accomplishments, politically he was a conspiracy-theoretic crank. To wit: Glenn Beck has recently promoted his books in an ill-conceived effort at instigating a Skousian renaissance.)

Finally, we have a school teacher, who reminds us that education is an acceptable vocation among the righteous, but only if you restrict yourself to no further than secondary education and appear as mousy as possible while actually in the act of teaching.

On His left hand are the wicked, unpatriotic individuals whose nation-hating nature is indicated by their association with The Devil Himself!

Most of the evil are easy to identify. The secular scientist looks smugly down on the proceedings through trendy rectangular frames, his arrogance and godlessness manifest in the way he clutches his copy of Darwin's Origin of Species. (Never mind that over 100 years ago Darwinian evolution was officially declared to be compatible with LDS doctrine, or that modern evolutionary synthesis is taught as a matter of course in BYU biology classes.)

The activist judge buries his face in despair, realizing that Jesus is here to save the Constitution by eradicating his evil, liberal rulings (including Marbury v. Madison, which set the precedent for judicial review; I wonder what McNaughton thinks of Brown v. Board of Education?).

Others are harder to puzzle out. I suppose the microphone-toting blonde is an agent of the MSM, peddling her liberal propaganda to a populace of proletarian sheeple? It's hard to say.

It's even harder to guess at the money-counting businessman or the near-to-bursting pregnant woman. Their politics seem ambiguous at worst and Jesus-friendly at best. Why are they condemned to kick it with The Devil Himself?


All sarcasm aside, this painting is beyond absurd; it's odious. It seeks to legitimize a narrow, nasty, and monolithic ideology—one that rewrites history, cheapens patriotism, and demonizes disagreement—under the guise of fine art. It's an affront to any who believe that the LDS faith comes with no political strings attached, that Mormonism neither prescribes nor proscribes any political platform. It's discouraging enough that this sort of painting generates enough demand to keep McNaughton's studio solvent; that it's popular enough to be featured at an educational institution is pathetic.

It has been famously asked why the LDS community, while over-represented in business, law, and politics, produces so few great artists. I believe the answer is bound up in the kind of art the LDS community wants to consume, which, based on the preceding, isn't very good. Art challenges, is subtle, is occasionally subversive or controversial. And the rank-and-file LDS community isn't interested in controversy or subtlety, but in consuming media that brazenly reinforces its worldview. So for every Orson Scott Card, Minerva Teichert, or even Arnold Friberg (who managed a much more tasteful synthesis of spirituality and patriotism), there are dozens of Stephenie Meyers, Janice Kapp Perrys, and Michael McLeans. Jon McNaughton is merely a particularly egregious example of the countless LDS artists whose work does not inspire, but ploddingly reinforces stale, suffocating orthodoxy.

And that isn't art. It's kitsch. It's the opposite of art. It destroys art. It destroys souls.

[PS: It turns out you can read McNaughton's interpretation of the painting, as well as his response to "liberal" criticism. I think you will find his rhetorical chops exactly commensurate with his artistry!]

Friday, June 11, 2010

In me you trust

A quick observation:

Over the last twelve months I've climbed six full rungs up the ladder of facial hair trustworthiness, which places me ahead of Abe Lincoln, Wilford Brimley, and Aristotle. I suggest you all take a moment to celebrate me and my meaningful achievements!


(Image credit goes to Matt McInerney, and thanks go to David Malki ! of Wondermark for sharing the image with the beard-going public.)

Sunday, May 2, 2010

Fist of fury

Sometimes I just want to punch people in the face.

Let me emphasize that this is a new feeling for me. I've never been a physically violent or even hot-tempered person. I never got into a fight in school—not because I was afraid of getting beat up (although that's very likely what would have happened), but because it isn't in my nature. I DO like to argue, as everyone reading this must already know, so I don't shy away from conflict, but typically in an argument my emotions remain in check. I've always felt that arguments come to blows only when people are either too stupid or too cowardly to articulate their ideas verbally. In other words, people resort to violence only when their words are impotent. Turns out I'm a fan of neither stupidity nor cowardice, and I'm certainly not cheering for verbal impotence, so you'd expect anger management to come to me naturally. And usually it does.

But sometimes I still want to punch people in the face.

Not very often, of course. It's actually a very specific set of circumstances that boil my blood, and I've spent a reasonable amount of time trying to figure out exactly why they set me off when ordinarily it's not in my nature. I found a common thread: arguments in which I've gone to considerable effort to explain myself, yet the other person almost deliberately refuses to understand me. In these arguments, my words are involuntarily rendered impotent—not because I can't articulate myself, but because I'm dealing with someone who has already deemed unimportant something he doesn't care to understand.

In some ways my frustration is probably obvious and commonplace—no one likes to have their hard work casually tossed aside—but for me it's more personal and not at all trivial. In all my interpersonal interactions—with my wife, family, friends, colleagues, whatever—my overwhelmingly top priority is to be understood. Not to be praised or to dominate or to be the smartest, not even to be comforted or loved, but to be understood. That's why I love to teach, why I sincerely appreciate it when people dissent on this blog, and particularly why I enjoy arguing. Done properly, disagreement gives me an opportunity both to understand someone else and to be understood. THAT is the miracle of human interaction—that after hours of discussion two friends arguing over dinner can breach the lonely barrier of solipsism and arrive at a mutually edifying understanding. For me it's the only really authentic way of connecting with another person: through his ideas. Everything else is superficial by comparison. So when you stomp on that connection because you're too busy pushing your agenda, defending your pride, or just being angry simply because I disagree with you, you stomp on an innate part of me, and you deny me the only meaningful way I have to connect with you.

And when that happens, don't get angry if I want to punch you in the face. Maybe you deserve it.

Wednesday, March 31, 2010

The excluded middle

A week ago I posted a link to this article, written by David Frum, to my Facebook profile. There are parts of his analysis that I disagree with, of course, but on balance I found the article to be a thoughtful, constructive, and pragmatic take on healthcare reform from a conservative perspective. Most appealingly, Frum didn't engage in the petty histrionics of the tin-foil crowd: his article delightfully misses the entire {'Marxist','bloodless coup','fascist','government takeover','armageddon'} set. I was impressed enough to check out his blog, where I found a collection of interesting, articulate political pieces written from a more-or-less conservative perspective. I also found a relatively intelligent commentariat who, in spite of ideological differences, manage impressively civil disagreement. I thought to myself that, in an age of Limbaugh, Beck, and Palin, Frum is exactly the sort of thing the political right ought to promote: reasonable, self-critical, even academic arguments for conservative ideas without anti-intellectualism and scorched-earth demagoguery.

Alas, it was too good to be true. On Tuesday, two days after the publication of the linked piece above, Frum was fired from his position at the conservative American Enterprise Institute.

I won't lie: I was really angry when I found out about it. Of course I can't say whether or not Frum was fired over his politics (if you like you can read his take as well as that of one of his AEI colleagues), but in any case it's disheartening when a political group rejects its moderate elements. People like to complain about the hulking inagility of the two-party system, but one of its chief benefits is that it discourages extremism. Radical ideas are first taken up by third parties, and if they become sufficiently mainstream they are picked up by one of the major parties. This forces the ideas through an incubation period, moderating them before they have any chance at becoming policy. This system works relatively well because the major parties have incentive to appeal to as large a base as possible. Empty political slogans or not, Reagan's big tent, Clinton's third way, and even Bush's compassionate conservatism sought common ground among an ideologically diverse electorate, thereby forcing moderation on the relevant party.

But this system doesn't work if a party caters to its extreme elements. If it continues to pander to the tea party bloc and push out moderates like Frum, the Republican party will give its stupidest and most reactionary elements control over its agenda, which is bad for everybody involved. Energizing a narrow, vocal portion of the base may garner short-term political capital, but it's a losing strategy in the long-term—one that poisons the political atmosphere in the meantime. White-hot partisan noise deepens divides while alienating moderate voters, and it takes more than angry paranoiacs to win elections.

My own politics are a hodgepodge of left- and right-leaning ambivalences, but my loyalties are beside the point: regardless of the party in power, we need a strong, moderating opposition. But when the opposition chooses downtown Glennbeckistan as its ideological epicenter, it relinquishes its claims to credibility and does real damage to democracy.

Wednesday, March 10, 2010

Here we come a-qwantzling

I've made it no secret on this blog that I love Dinosaur Comics. I find the strip extremely funny—both intellectually and viscerally—in a way that I can't properly explain to people who don't share my appreciation. If you aren't familiar, you should give it a fair hearing. It could change your life for the awesomer.

Ryan North, the writer of Dinosaur Comics, embedded a puzzle in one of his recent comics. Inspired by the cryptographic messages of early modern scientists like Newton and Hooke, he encoded the strip's punchline as an anagram: "12t10o8e7a6l6n6u5i5s5d5h5y3I3r3fbbwwkcmvg", meaning that there are 12 't's, 10 'o's, etc. He left it to his readers to decode the message and offered prizes to the first person to return the correct punchline. That was over a week ago.
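
If you want to play along, here's a minimal sketch of how that notation can be unpacked into per-letter counts. This is my own illustration, not Ryan's code; it assumes the format is an optional decimal count followed by a single letter, with a bare letter meaning one occurrence.

```python
# Hedged sketch: expand "12t10o8e..." into a per-letter budget.
# Assumes <count><letter> pairs, where a missing count means 1.
import re
from collections import Counter

def parse_anagram(code):
    counts = Counter()
    for num, letter in re.findall(r"(\d*)([A-Za-z])", code):
        counts[letter] += int(num) if num else 1
    return counts

budget = parse_anagram("12t10o8e7a6l6n6u5i5s5d5h5y3I3r3fbbwwkcmvg")
print(budget)                 # e.g. Counter({'t': 12, 'o': 10, ...})
print(sum(budget.values()))   # total letters the solution must use
```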

So far no one has solved it. It's like Excalibur. Or the Riemann hypothesis.

Realizing that his "qwantzle" is challenging, Ryan has been slowly giving out clues. So far, this is all we know:
  1. The solution is a single, reasonably grammatical sentence that fits the context of the strip. It begins with the word "I", contains a colon and a comma (in that order), and ends with a double exclamation mark!!
  2. Letters in the solution are capitalized as in the code, and there are no proper nouns; thus, combined with the first clue, all instances of capital I must be the word "I".
  3. All words in the solution have been used previously in Dinosaur Comics. (DC is searchable, and readers have put together a dictionary of all possible words. My untrimmed dictionary has 14,000 unique words.)
  4. The longest word in the puzzle has 11 letters, and the next-longest word has 8; these words appear sequentially in the solution.
  5. [RN recently posted a final clue: the longest word is 'fundamental'. It helps, I suppose, but I think most people had already guessed that, and in any case the search space is still obscenely large!]

Even with the clues, qwantzle is maddeningly hard. Naively, there are 97! letter combinations, and even if you incorporate all of the hints the number of possible word combinations is staggering—far too many for a computer to enumerate. So there's a small community of readers working on heuristic approaches to the problem, trying to combine human intuition with brute-force computational strength. But so far, most solutions (interestingly, readers have submitted many grammatical sentences that meet the criteria, but none of them has been correct) have come simply by guess-and-check.
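
For the curious, the guess-and-check core is tiny. Here's a hedged sketch of the validity test I imagine everyone is using in some form: a candidate sentence passes if its letters (ignoring spaces and punctuation, but keeping case, since capital 'I' is distinct from lowercase 'i') exactly match the budget derived from the anagram above. Again, this is my own illustration, not anyone's actual solver.

```python
# Hedged sketch of the guess-and-check test: does a candidate sentence
# spend the puzzle's letter budget exactly? (Budget re-derived inline so
# this snippet stands on its own.)
import re
from collections import Counter

ANAGRAM = "12t10o8e7a6l6n6u5i5s5d5h5y3I3r3fbbwwkcmvg"
TARGET = Counter()
for num, letter in re.findall(r"(\d*)([A-Za-z])", ANAGRAM):
    TARGET[letter] += int(num) if num else 1

def is_valid_candidate(sentence):
    # Keep only letters; capitalization matters ('I' vs 'i').
    return Counter(ch for ch in sentence if ch.isalpha()) == TARGET
```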

I won't lie: I've spent more time than I care to admit on qwantzle. I've taught myself a new programming language and spent a few idle hours crash-coursing on computational linguistics. For my trouble, I've developed two approaches that I thought were clever. One takes a valid solution, randomly deletes a few words, and forms a new anagram with the deleted words; this gives you a way to automatically explore variations on a solution that you think might be pretty close. The other runs a genetic algorithm on letter ordering alone. The letter orderings are ranked according to how well they correlate with DC dialogue, randomly mutated and crossed over, and made to compete in a pseudo-Darwinian process intended to improve the overall quality of the solutions.
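
To make the first idea concrete, here's a rough sketch of how it might look. This is my own reconstruction of the approach described above, not the code I actually ran; the names (`fits`, `refill`, `explore`) and the `dictionary` argument—which would be the word list scraped from Dinosaur Comics—are illustrative.

```python
# Hedged reconstruction of the "delete a few words and re-anagram" idea.
# Not the actual implementation; names and structure are illustrative.
import random
from collections import Counter

def fits(word, budget):
    """True if `word` can be spelled from the letters remaining in `budget`."""
    need = Counter(word)
    return all(budget[ch] >= n for ch, n in need.items())

def refill(budget, dictionary, start=0, partial=()):
    """Yield word tuples from `dictionary` whose combined letters equal `budget`."""
    if sum(budget.values()) == 0:
        yield partial
        return
    for i in range(start, len(dictionary)):
        word = dictionary[i]
        if fits(word, budget):
            # Words may repeat, but each combination is generated only once.
            yield from refill(budget - Counter(word), dictionary, i, partial + (word,))

def explore(candidate, dictionary, n_drop=2):
    """Drop `n_drop` random words from a candidate and yield re-anagrammed variants."""
    keep = list(candidate)
    dropped = [keep.pop(random.randrange(len(keep))) for _ in range(n_drop)]
    freed = Counter("".join(dropped))
    for replacement in refill(freed, dictionary):
        yield keep + list(replacement)
```

The output is a bag of words rather than an ordered sentence, so a human still has to decide whether anything grammatical can be made of it—which is exactly where the brute-force approach runs out of steam.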

But, despite (what I consider) reasonable creativity, these approaches don't work all that well. They WILL spit out technically valid solutions, but they aren't terribly grammatical. My next step should be to include natural language processing techniques—NLP is a new field with surprising success at computationally characterizing language as it is spoken and written—but I'm having a hard time being optimistic. In general, computers are far inferior to human brains at pattern recognition, and I struggle to believe that a computer could be made to recognize the right answer even if it found it.

I'm certainly on the lookout for new solution ideas, of course. But I think that the problem will remain unsolved until Ryan North finally gives out enough clues—at which time it will be solved by a human brain performing (computer-aided) guess-and-check.

[Note: I originally mistyped the anagram, so if any of you were working on the puzzle using my copy—and I hope you weren't—I'm very sorry and it's fixed now!]

Monday, February 15, 2010

Meta-post

You probably noticed: I didn't do it. It turns out I picked a pretty poor week (maybe the week of your birthday celebration(s) isn't the best one for a challenge of creativity) for daily blogging, and I greatly underestimated how much effort it would take to write a post every day. Perhaps you noticed that I posted the entries just before midnight, or that I edited them significantly the next day, or that I had a hard time simultaneously keeping up with multiple threads of comments while writing the next day's post. I didn't run out of things to say--I have a post or two waiting in the wings--but rather struggled to get the ideas adequately expressed. Finally, on Thursday I just didn't get it done. Originally I intended to pick myself back up and not let it deter me from finishing out strong for the rest of the week, but once you've failed at your goal, it's hard not to just give up. So I did.

Failures aside, it was a useful experiment. Although I ultimately succumbed to the pressure, I managed to write my way through a few complex posts in much shorter time than I usually would. I now also see the value in a few days' wait between posts. It takes a day or two for people to see the post, decide what they don't like about it, and respond; it takes a further day or two for arguments and counter-arguments to converge. A new post's arrival simply stifles the process.

So: where to go from here? Inspired by Marie and Gareth's suggestions, I'll try to occasionally post shorter, lighter, even more superficial posts. Although it's quite difficult for me, probably I can share a thought or an idea or an experience without picking its bones clean with over-analysis.

Finally, inspired by Grant's suggestion, I want to spend some time writing a series of introductory articles to particularly important (or interesting) mathematical ideas. Consider it a sort of "greatest hits" of mathematics, with an emphasis on accessibility to the lay person--sacrificing precision for intuition. Such posts will likely take a good chunk of effort from me, so I probably won't do it unless there's sufficient interest. Thus I ask: is there sufficient interest?

Wednesday, February 10, 2010

Je pense, donc je suis

Every modern scientist owes an intellectual debt to René Descartes. In addition to his mathematical contributions--Descartes invented the coordinate axes and made early contributions to calculus--his philosophy instigated the rationalist movement, which arguably forms the basis of the scientific method. Though he was a Catholic, Descartes founded his philosophy on ultimate skepticism. Sense experience is subjective and therefore unreliable, he argued, so everything is open to doubt until it can be proven logically. He posited that for all he knew an "evil genius" was manipulating his sensory input, constructing a false reality. His only given (either an axiom or a tautology depending on your perspective) was his famous pronouncement "I think, therefore I am". He (ostensibly) constructed his entire philosophy from this single premise, eliminating the possibility of the evil genius and logically establishing the existence of God.

'Ostensibly', of course, is the key word. I have to give Descartes credit, as his efforts at radical doubt were seminal, but by modern standards he was a lightweight skeptic. His proof for the existence of God is essentially an elaborate reworking of Anselm's ontological argument. Put very simply, Descartes' argument goes as follows: I can imagine a perfect, benevolent God; something cannot come from nothing, but since I am imperfect, the idea of a perfect God cannot come from within me; thus the idea must come from something perfect; that something is God. Regardless of your theological loyalties, this argument is not terribly convincing. It's not at all obvious, for example--and Descartes leaves it unproven--that an imperfect being cannot conceive of something perfect. Yet Descartes rejected critics' arguments and maintained throughout his life that his proof was complete.

Descartes was no fool; in fact, I'm convinced that he was highly intelligent as well as fully sincere in his philosophical quest. In spite of his extreme efforts at skepticism, however, he ended up more fully convinced of his convictions even though the evidence was far short of watertight. His goal was to discard everything he couldn't justify logically, but he ended up unilaterally embracing a logically unprovable proposition. I have no quarrel with Descartes' beliefs, but it's disheartening to see a brilliant and sincerely skeptical man peg his philosophy to a facile, credulous argument simply because it confirms his preconceptions.

Descartes' example highlights a worrying reality: smart people rationalize their preconceptions just like the rest of us. In fact, they may be even worse about it. Intelligence and education can encourage us to be honest with ourselves, but just as often they provide us merely with the sophistry we need to talk ourselves into believing whatever we choose. Every think tank with an innocuous name and a viciously partisan agenda testifies to the fact that, with enough intellectual effort, we can study ourselves into whatever ideological corner we prefer. It's a sobering thought, one that should give us particular pause in our religious and political affairs where intransigence so often prevails. I'm not arguing that we can't believe in anything, of course--I'm nobody's nihilist--but we must be careful with the arguments we accept for belief, ensuring that we don't simply follow tortured intellectual paths because they lead to the conclusion we wanted all along. Our instinct is to confirm our biases, and intelligence and education are not enough to counter it; only sustained, self-skeptical honesty can successfully keep it in check.

Tuesday, February 9, 2010

B sides

When I read an article online, I usually spend about twice as long reading the comments section as I do the original article. Partially this is because the sheer bulk of comments--nonsensical or otherwise--easily outweighs that of the article. But mostly I'm fascinated to read other people's arguments--again, nonsensical or otherwise--particularly when the parent article is an opinion piece or somehow controversial. An article, no matter how nuanced or multifaceted its arguments, is almost always a single monologue from a single writer's perspective, and I'm convinced that I learn more from the interdependent swarm of commenters who respond not only to the original article, but to each other's arguments.

I feel the same way about this blog: I think my most concise, coherent writing has actually been in response to your comments, since another person's ideas (a) offer an alternative perspective and (b) give me concrete objections on which to focus my thoughts. They're like B-sides: not as polished and audience-friendly as the featured tune, but often a better look into the artist's soul.

Based on the preceding, I've constructed a method for evaluating the quality of a website: instead of trying to judge the content directly, it's easier and more accurate to judge the quality of the discussion it generates. A site that inspires misspelled, all-caps ranting is likely to serve up marginal content, whereas comment threads with meaningful debate usually appear on sites with quality material. "Meaningful debate" is in the eye of the beholder, of course, but I've decided to rank five corners of the web according to this criterion, from worst to best:

5. YouTube

YouTube comments are perhaps the only compelling argument I've ever heard in favor of voluntary extinction. It's been argued (persuasively, I say) that it's impossible to post a comment on YouTube so stupid that people will realize you're kidding. Spelling and grammar are nonexistent, and any disagreement devolves immediately into name-calling. My only consolation is the assumption that it's mostly teenagers who have the time and temperament to watch videos all day. Oh, I hope.

4. Ordinary news outlets (CNN, NYT, MSNBC)

Comments here aren't terrible. Given the mass-appeal nature and easy controversy of news sites, comments tend towards angry posts telling people to "wake up" and "stop drinking kool-aid". But there are usually enough calm, reasonable comments to steady my wavering faith in humanity.

3. Online "magazines" (Slate, The Economist, etc.)

Comments here are often quite good. Magazines tend to cater to a somewhat niche audience--often a particular portion of the political spectrum--and the general readership is well-informed, articulate, and capable of quality debate. The biggest drawback is homogeneity: the vast majority of commenters come from the appropriate political niche, and so the minority opposition feels the need to comment both loudly and inarticulately. Reading comments on Slate (respectively The Economist) therefore gives the impression that conservatives (respectively Keynesians) are ignorant screamers, which isn't good for leveling one's biases.

2. AV Club

The Onion's sister site is not a joke, but its commenters are very, very funny. They are also mean, crude, and pretentious. Funny trumps nice, however (and anyway, AV Club posters expect to be ridiculed--it's an understood part of the fun), so I give the AV Club a hearty recommendation.

1. Wikipedia

Whenever I encounter a Wikipedia naysayer, I invite them to check out the talk pages behind the articles they read. First, the comments there are the best possible way to discern the quality of an article--of course there are bad articles on Wikipedia, but an article with lots of discussion has probably been pored over, its content verified, and its controversial sections moderated by compromise. Second, talk page debates are fascinating examples of quality argument. While debates occasionally degenerate into personal attacks and edit wars, contributors--for no compensation other than (semi-)professional pride--typically succeed in finding common ground, achieving consensus, and turning out a reasonable article. It's a soul-sustaining specimen of human dignity.

Conclusion and obvious subtext: comments here are encouraged!

Monday, February 8, 2010

How wide the divide?

[Note: yes, I changed the title on this post. I actually thought of this title last night, but I couldn't remember it when it came time to actually post. So the original title went in as a placeholder until inspiration re-struck.]

So, it turns out that I get made fun of sometimes, which probably comes as no surprise to most of you. I'm usually happy to laugh at myself, but often these attacks are unfair and petty--I'll use a word too large for my audience, and ridicule follows. (I get it; sometimes people are insecure, and maybe sometimes I'm unintentionally intimidating. I'm sorry about that, and I've even gone to considerable lengths to avoid it, but there's nothing I can do to help your problem; you need to learn to believe in yourself, just like the last scene in all movies!) But sometimes the ridicule is justified. Recently I was made fun of by a friend for using a semicolon in a text message. I'm not sure that I feel guilty, exactly, but I have to admit that this act was completely mockable: only a laughably curmudgeonly prescriptivist would go to all that trouble just to avoid a comma splice.

You read correctly: today is the day we talk about prescriptivism!

The prescriptivist/descriptivist debate perennially pits high school-vintage grammar snobs against college-educated linguists. Simply put, prescriptivists argue that a language is a collection of officially-defined words which are ordered into sentences and paragraphs according to standardized rules. In other words, language is defined by its rules, and "correct" usage simply follows those rules. I admit that I harbor prescriptivist leanings. I believe that correct usage increases the precision and concision of language, thus making written communication more effective. I appreciate it when objective pronouns are rendered 'whom', when the subjunctive is used properly, and, yes, when comma splices are mended with a simple semicolon.

[Side note: any grammar or usage mistakes in this post are placed there intentionally. For irony!]

Linguists, who are almost universally descriptivists, point out that prescriptivism is largely a modern phenomenon. Language has evolved continuously ever since its advent, and it's only been in the last few hundred years that, with high literacy rates and the ubiquity of print, it's even been possible to standardize language. Since rich, complex languages existed before prescriptivist rulesets--the average Roman, for example, didn't run around worrying about the difference between the indicative and imperative moods--the languages must therefore not be defined by such rulesets. You can make a sort of Platonic argument: the true language exists independent of the ruleset, so the ruleset only approximates the language as actually used.

Despite my prescriptivist bias, I'm forced to accept the logic of the descriptivist argument. On one hand, prescriptivist standardization facilitates communication. On the other hand, descriptivists are obviously right in that language transcends standardization. Pure prescriptivism gives us the boorish grammatical hair-splitter; pure descriptivism gives us teh intertubes.

Given my penchant for reconciling conflicting viewpoints (yes, kids, I'm inviting you to read my thesis; while you're at it, if you could cite it in your academic work, that would really help me out), allow me to postulate some common ground: what prescriptivists and descriptivists really care about is that, rather than just spouting out words, we think about language as we use it. A prescriptivist, I argue, doesn't care so much that I use quotation marks as dictated by the Chicago Manual of Style, but that I pay attention to the issues when making choices with punctuation. A descriptivist doesn't really advocate wildly ungrammatical writing, but simply points out that a consistent definition of 'grammatical' is a chimera. So, I can use "standard" usage to write as precisely as possible, while acknowledging that language is inherently fluid and mutable.

Sunday, February 7, 2010

Meta-post

After more than a year of maintaining this blog, I've come to a sad realization: I don't post here often enough. Seven posts in an entire year--no matter how significant I might consider those posts individually--is not many. The issue isn't a lack of material--I have plenty of ideas to write about and opinions to (try to) articulate--but carrying the ideas from the back of my brain to the tips of my fingers increasingly requires far too much effort. Today's post, for example, took days of writing and re-writing, and after all that it still doesn't say quite what I want it to.

I'm not sure exactly why I struggle so much lately, but--along with today's Jungian theme (this post really is meta)--I blame my internal editor, an ever-fattening curmudgeon who blocks the path between my intuitive ideas and the conscious crafting of words. My rational, quasi-perfectionistic mind judges and discards my prose before I get the chance to chew on it.

Therefore, I'm going to conduct an experiment. For the next seven days, I will write one post per day. The posts will probably be shorter, less polished, and devoted to shallower topics, but I'll have an opportunity to work on getting ideas on paper quickly. Hopefully it will kick off a trend toward more frequent posting, because let's be honest: people want to hear what is inside my brain.

So, I have a request for my meager readership. I believe that I have enough topics to fill seven consecutive days of lightweight blogging, but I'd rather not come up short and have to write a completely superficial post just to make a self-imposed deadline. Thus, I invite you to submit a topic or two about which you would like to see me spout off a few paragraphs. I won't put hard guidelines on the topics, but make them something worth thinking and arguing about. Political topics are acceptable only if there is a meaningful non-partisan component; I'm not interested in becoming anyone's ideologue. Philosophical and/or theological topics earn bonus points.

Thanks in advance for your participation.

Archetypical

[Editor's note/excuse/statement of victory: I haven't used this blog for personal storytelling--if you want to know what's going on day-to-day, try Facebook--but recently I've been distracted by training for, running, and recovering from the Houston Marathon. I thought about writing about it here, but I decided to spare you the treacly life-is-like-my-sport routine and point you to the official photos instead. I'll leave up the 'marathonous' widget for a few more days to give you all one more chance to admire me.

End note. On with the post.]

A few months ago, W. W. Norton published the so-called "Red Book" by the Swiss psychiatrist Carl Jung. Jung worked closely with Sigmund Freud in the early 20th century, but disagreements eventually led Jung to establish his own school of psychology. Shortly after his break with Freud, Jung began to suffer from major psychoses--voices, hallucinations, and the like--causing him to worry that he was going insane. Rather than shunning his episodes or trying to 'cure' them, however, he embraced and even tried to induce them. He viewed his hallucinations as valuable opportunities to become acquainted with his unconscious self, which was necessary to maintain (or recover) his mental health. For sixteen years he kept a detailed record of his hallucinations, transcribing notes and images into a red leather book. After his death, his family kept the book hidden until finally agreeing in 2007 to publish it.

(I promise I'm going somewhere with this. Please hang on.)

Colored by these experiences, Jung developed an unusual theory of psychology. He saw modern man as a victim of too much rational, logical thinking--too much 'consciousness'--which prevented him from realizing his true self. Only by tapping into the unconscious, letting it rule over the rational mind, could a person realize his true self. In contrast to Freud's 'rational' psychoanalysis, Jung saw psychology as a 'spiritual' endeavor (it's not clear--to me, at least--whether Jung meant this in a religious sense or merely figuratively), and his theories are characterized by a focus on the mystical and metaphysical. So, while Jung made several contributions to modern psychology, his theories are often marginalized as pseudoscientific.

However, the very qualities that render Jungian psychology unscientific simultaneously render it fascinating. The most notable of Jung's theories, for example, is the collective unconscious, the idea that all of humanity shares a common psychic structure that governs our primordial thoughts. To Jung, the commonality of our myths, folklore, and archetypes is due to a deeply entrenched, genetically heritable psychological framework. In other words, the orphan hero and the wise old sage resonate with us not because George Lucas is a genius (because let's be honest, he isn't! [Cue angry comments.]), but because these characters have been stamped over millennia into our deepest, most immutable subconsciousness.

(Here's where I get to my point!)

Jung may be discredited among psychologists, his ideas useful only as fodder for works of science fiction genius, but I don't care; I'm more interested in the practical aspects of his ideas. While Jung argued that the unconscious mind should rule over the conscious, most of our success comes by doing the opposite. Whenever we overcome a fear, change our habits, or learn a skill, we make a conscious, deliberate effort to alter our unconscious, instinctive selves. This is particularly true for educational efforts: we consciously struggle with complex ideas until we successfully embed them into our natural intuition.

With that last point comes a dilemma. As my education progresses, my capacity for precise, disciplined, hyperconscious thought improves tremendously. But I feel my creative, impulsive, illogical brain grinding slowly away--not rusting shut with disuse, but being crushed by the weight of ever-present rationality. I love graduate school, don't get me wrong, but as I sharpen my talent for proofmaking and paper writing--in other words, as I expand my rational self--I crowd out vital elements of my mind. I've argued elsewhere that science is not a mechanical process, but one that demands passion and creativity, in which the most valuable ideas spring from the irrational and are then honed under the care of the rational. So I am left with an important question. How do I expand the capacity of my rational mind without squashing the sparks of creativity? How do I go forward in training without neutering the very neuroticism that nurtures the innovation necessary for successful research?