A sweary—and expertly punctuated—weblog.

Monday, February 15, 2010

Meta-post

You probably noticed: I didn't do it. It turns out I picked a pretty poor week for daily blogging (maybe the week of your birthday celebration(s) isn't the best one for a challenge of creativity), and I greatly underestimated how much effort it would take to write a post every day. Perhaps you noticed that I posted the entries just before midnight, or that I edited them significantly the next day, or that I had a hard time simultaneously keeping up with multiple threads of comments while writing the next day's post. I didn't run out of things to say--I have a post or two waiting in the wings--but rather struggled to get the ideas adequately expressed. Finally, on Thursday I just didn't get it done. Originally I intended to pick back up and not let the lapse deter me from finishing the week strong, but once you've failed at your goal, it's hard not to just give up. So I did.

Failures aside, it was a useful experiment. Although I ultimately succumbed to the pressure, I managed to write my way through a few complex posts in much shorter time than I usually would. I now also see the value in a few days' wait between posts. It takes a day or two for people to see the post, decide what they don't like about it, and respond; it takes a further day or two for arguments and counter-arguments to converge. A new post's arrival simply stifles the process.

So: where to go from here? Inspired by Marie and Gareth's suggestions, I'll try to occasionally post shorter, lighter, even superficial pieces. Although it's quite difficult for me, I can probably share a thought or an idea or an experience without picking its bones clean with over-analysis.

Finally, inspired by Grant's suggestion, I want to spend some time writing a series of introductory articles on particularly important (or interesting) mathematical ideas. Consider it a sort of "greatest hits" of mathematics, with an emphasis on accessibility to the lay person--sacrificing precision for intuition. Such posts will likely take a good chunk of effort from me, so I probably won't do it unless there's sufficient interest. Thus I ask: is there sufficient interest?

Wednesday, February 10, 2010

Je pense, donc je suis

Every modern scientist owes an intellectual debt to René Descartes. In addition to his mathematical contributions--Descartes invented the Cartesian coordinate system and made early contributions to calculus--his philosophy instigated the rationalist movement, which arguably forms the basis of the scientific method. Though a Catholic, Descartes founded his philosophy on ultimate skepticism. Sense experience is subjective and therefore unreliable, he argued, so everything is open to doubt until it can be proven logically. He posited that for all he knew an "evil genius" was manipulating his sensory input, constructing a false reality. His only given (either an axiom or a tautology, depending on your perspective) was his famous pronouncement "I think, therefore I am". He (ostensibly) constructed his entire philosophy from this single premise, eliminating the possibility of the evil genius and logically establishing the existence of God.

'Ostensibly', of course, is the key word. I have to give Descartes credit, as his efforts at radical doubt were seminal, but by modern standards he was a lightweight skeptic. His proof for the existence of God is essentially an elaborate repackaging of Anselm's ontological argument. Put very simply, Descartes' argument goes as follows: I can imagine a perfect, benevolent God; something cannot come from nothing, and since I am imperfect, the idea of a perfect God cannot come from within me; thus the idea must come from something perfect; that something is God. Regardless of your theological loyalties, this argument is not terribly convincing. It's not at all obvious, for example--and Descartes leaves it unproven--that an imperfect being cannot conceive of something perfect. Yet Descartes rejected critics' arguments and maintained throughout his life that his proof was complete.

Descartes was no fool; in fact, I'm convinced that he was highly intelligent as well as fully sincere in his philosophical quest. In spite of his extreme efforts at skepticism, however, he ended up more firmly convinced of his convictions even though the evidence was far short of watertight. His goal was to discard everything he couldn't justify logically, but he ended up unilaterally embracing a logically unprovable proposition. I have no quarrel with Descartes' beliefs, but it's disheartening to see a brilliant and sincerely skeptical man peg his philosophy to a facile, credulous argument simply because it confirmed his preconceptions.

Descartes' example highlights a worrying reality: smart people rationalize their preconceptions just like the rest of us. In fact, they may be even worse about it. Intelligence and education can encourage us to be honest with ourselves, but just as often they provide us merely with the sophistry we need to talk ourselves into believing whatever we choose. Every think tank with an innocuous name and a viciously partisan agenda testifies to the fact that, with enough intellectual effort, we can study ourselves into whatever ideological corner we prefer. It's a sobering thought, one that should give us particular pause in our religious and political affairs where intransigence so often prevails. I'm not arguing that we can't believe in anything, of course--I'm nobody's nihilist--but we must be careful with the arguments we accept for belief, ensuring that we don't simply follow tortured intellectual paths because they lead to the conclusion we wanted all along. Our instinct is to confirm our biases, and intelligence and education are not enough to counter it; only sustained, self-skeptical honesty can successfully keep it in check.

Tuesday, February 9, 2010

B sides

When I read an article online, I usually spend about twice as long reading the comments section as I do the original article. Partially this is because the sheer bulk of comments--nonsensical or otherwise--easily outweighs that of the article. But mostly I'm fascinated to read other people's arguments--again, nonsensical or otherwise--particularly when the parent article is an opinion piece or somehow controversial. An article, no matter how nuanced or multifaceted its arguments, is almost always a single monologue from a single writer's perspective, and I'm convinced that I learn more from the interdependent swarm of commenters who respond not only to the original article, but to each other's arguments.

I feel the same way about this blog: I think my most concise, coherent writing has actually been in response to your comments, since I have another person's ideas to (a) offer an alternative perspective and (b) give concrete objections on which to focus my thoughts. They're like B-sides: not as polished and audience-friendly as the featured tune, but often a better look into the artist's soul.

Based on the preceding, I've constructed a method for evaluating the quality of a website: instead of trying to judge the content directly, it's easier and more accurate to judge the quality of the discussion it generates. A site that inspires misspelled, all-caps ranting is likely to serve up marginal content, whereas comment threads with meaningful debate usually appear on sites with quality material. "Meaningful debate" is in the eye of the beholder, of course, but I've decided to rank five corners of the web according to this criterion, from worst to best:

5. YouTube

YouTube comments are perhaps the only compelling argument I've ever heard in favor of voluntary extinction. It's been argued (persuasively, I say) that it's impossible to post a comment on YouTube so stupid that people realize you're kidding. Spelling and grammar are nonexistent, and any disagreement devolves immediately into name-calling. My only consolation is the assumption that it's mostly teenagers who have the time and temperament to watch videos all day. Oh, I hope.

4. Ordinary news outlets (CNN, NYT, MSNBC)

Comments here aren't terrible. Given the mass-appeal nature and easy controversy of news sites, comments tend towards angry posts telling people to "wake up" and "stop drinking the Kool-Aid". But there are usually enough calm, reasonable comments to steady my wavering faith in humanity.

3. Online "magazines" (Slate, The Economist, etc.)

Comments here are often quite good. Magazines tend to cater to a somewhat niche audience--often a particular portion of the political spectrum--and the general readership is well-informed, articulate, and capable of quality debate. The biggest drawback is homogeneity: the vast majority of commenters come from the appropriate political niche, and so the minority opposition feels the need to comment both loudly and inarticulately. Reading comments on Slate (respectively, The Economist) therefore gives the impression that conservatives (respectively, Keynesians) are ignorant screamers, which isn't good for leveling one's biases.

2. AV Club

The Onion's sister site is not a joke, but its commenters are very, very funny. They are also mean, crude, and pretentious. Funny trumps nice, however (and anyway, AV Club posters expect to be ridiculed--it's an understood part of the fun), so I give the AV Club a hearty recommendation.

1. Wikipedia

Whenever I encounter a Wikipedia naysayer, I invite them to check out the talk pages behind the articles they read. First, the comments there are the best possible way to discern the quality of an article--of course there are bad articles on Wikipedia, but an article with lots of discussion has probably been pored over, its content verified, and its controversial sections moderated by compromise. Second, talk page debates are fascinating examples of quality argument. While debates occasionally degenerate into personal attacks and edit wars, contributors--for no compensation other than (semi-)professional pride--typically succeed in finding common ground, achieving consensus, and turning out a reasonable article. It's a soul-sustaining specimen of human dignity.

Conclusion and obvious subtext: comments here are encouraged!

Monday, February 8, 2010

How wide the divide?

[Note: yes, I changed the title on this post. I actually thought of this title last night, but I couldn't remember it when it came time to actually post. So the original title went in as a placeholder until inspiration re-struck.]

So, it turns out that I get made fun of sometimes, which probably comes as no surprise to most of you. I'm usually happy to laugh at myself, but often these attacks are unfair and petty--I'll use a word too large for my audience, and ridicule follows (I get it; sometimes people are insecure, and maybe sometimes I'm unintentionally intimidating. I'm sorry about that, and I've even gone to considerable lengths to avoid it, but there's nothing I can do to help your problem; you need to learn to believe in yourself, just like the last scene in all movies!). But sometimes the ridicule is justified. Recently a friend made fun of me for using a semicolon in a text message. I'm not sure that I feel guilty, exactly, but I have to admit that this act was completely mockable: only a laughably curmudgeonly prescriptivist would go to all that trouble just to avoid a comma splice.

You read correctly: today is the day we talk about prescriptivism!

The prescriptivist/descriptivist debate perennially pits high school-vintage grammar snobs against college-educated linguists. Simply put, prescriptivists argue that a language is a collection of officially defined words which are ordered into sentences and paragraphs according to standardized rules. In other words, language is defined by its rules, and "correct" usage simply follows those rules. I admit that I harbor prescriptivist leanings. I believe that correct usage increases the precision and concision of language, thus making written communication more effective. I appreciate it when objective pronouns are rendered 'whom', when the subjunctive is used properly, and, yes, when comma splices are mended with a simple semicolon.

[Side note: any grammar or usage mistakes in this post are placed there intentionally. For irony!]

Linguists, who are almost universally descriptivists, point out that prescriptivism is largely a modern phenomenon. Language has evolved continuously ever since its advent, and it's only been in the last few hundred years that, with high literacy rates and the ubiquity of print, it's even been possible to standardize language. Since rich, complex languages existed before prescriptivist rulesets--the average Roman, for example, didn't run around worrying about the difference between the indicative and imperative moods--the languages must therefore not be defined by such rulesets. You can make a sort of Platonic argument: the true language exists independent of the ruleset, thus the ruleset only approximates the language as actually used.

Despite my prescriptivist bias, I'm forced to accept the logic of the descriptivist argument. On one hand, prescriptivist standardization facilitates communication. On the other hand, descriptivists are obviously right in that language transcends standardization. Pure prescriptivism gives us the boorish grammatical hair-splitter; pure descriptivism gives us teh intertubes.

Given my penchant for reconciling conflicting viewpoints (yes, kids, I'm inviting you to read my thesis; while you're at it, if you could cite it in your academic work, that would really help me out), allow me to postulate some common ground: what prescriptivists and descriptivists really care about is that, rather than just spouting out words, we think about language as we use it. A prescriptivist, I argue, doesn't care so much that I use quotation marks as dictated by the Chicago Manual of Style, but that I pay attention to the issues when making choices with punctuation. A descriptivist doesn't really advocate wildly ungrammatical writing, but simply points out that a consistent definition of 'grammatical' is a chimera. So, I can use "standard" usage to write as precisely as possible, while acknowledging that language is inherently fluid and mutable.

Sunday, February 7, 2010

Meta-post

After more than a year of maintaining this blog, I've come to a sad realization: I don't post here often enough. Seven posts in an entire year--no matter how significant I might consider those posts individually--is not many. The issue isn't a lack of material--I have plenty of ideas to write about and opinions to (try to) articulate--but carrying the ideas from the back of my brain to the tips of my fingers increasingly requires far too much effort. Today's post, for example, took days of writing and re-writing, and after all that it still doesn't say quite what I want it to.

I'm not sure exactly why I struggle so much lately, but--along with today's Jungian theme (this post really is meta)--I blame my internal editor, an ever-fattening curmudgeon who blocks the path between my intuitive ideas and the conscious crafting of words. My rational, quasi-perfectionistic mind judges and discards my prose before I get the chance to chew on it.

Therefore, I'm going to conduct an experiment. For the next seven days, I will write one post per day. The posts will probably be shorter, less polished, and related to shallower topics, but I'll have an opportunity to work on getting ideas on paper quickly. Hopefully it will kick off a trend of more frequent posting, because let's be honest: people want to hear what is inside my brain.

So, I have a request for my meager readership. I believe that I have enough topics to fill seven consecutive days of lightweight blogging, but I'd rather not come up short and have to write a completely superficial post just to make a self-imposed deadline. Thus, I invite you to submit a topic or two about which you would like to see me spout off a few paragraphs. I won't put hard guidelines on the topics, but make them something worth thinking and arguing about. Political topics are acceptable only if there is a meaningful non-partisan component; I'm not interested in becoming anyone's ideologue. Philosophical and/or theological topics earn bonus points.

Thanks in advance for your participation.

Archetypical

[Editor's note/excuse/statement of victory: I haven't used this blog for personal storytelling--if you want to know what's going on day-to-day, try Facebook--but recently I've been distracted by training for, running, and recovering from the Houston Marathon. I thought about writing about it here, but I decided to spare you the treacly life-is-like-my-sport routine and point you to the official photos instead. I'll leave up the 'marathonous' widget for a few more days to give you all one more chance to admire me.

End note. On with the post.]

A few months ago, W. W. Norton published the so-called "Red Book" by the Swiss psychiatrist Carl Jung. Jung worked closely with Sigmund Freud in the early 20th century, but disagreements eventually led him to establish his own school of psychology. Shortly after his break with Freud, Jung began to suffer from major psychoses--voices, hallucinations, and the like--causing him to worry that he was going insane. Rather than shunning his episodes or trying to 'cure' them, however, he embraced and even tried to induce them. He viewed his hallucinations as valuable opportunities to become acquainted with his unconscious self, which was necessary to maintain (or recover) his mental health. For sixteen years he kept a detailed record of his hallucinations, transcribing notes and images into a red leather book. After his death, his family kept the book hidden until finally agreeing in 2007 to publish it.

(I promise I'm going somewhere with this. Please hang on.)

Colored by these experiences, Jung developed an unusual theory of psychology. He saw modern man as a victim of too much rational, logical thinking--too much 'consciousness'--which prevented him from realizing his true self. Only by tapping into the unconscious, letting it rule over the rational mind, could a person realize that true self. In contrast to Freud's 'rational' psychoanalysis, Jung saw psychology as a 'spiritual' endeavor (it's not clear--to me, at least--whether Jung meant this in a religious sense or merely figuratively), and his theories are characterized by a focus on the mystical and metaphysical. So, while Jung made several contributions to modern psychology, his theories are often marginalized as pseudoscientific.

However, the very qualities that render Jungian psychology unscientific simultaneously render it fascinating. The most notable of Jung's theories, for example, is the collective unconscious, the idea that all of humanity shares a common psychic structure that governs our primordial thoughts. To Jung, the commonality of our myths, folklore, and archetypes is due to a deeply entrenched, genetically heritable psychological framework. In other words, the orphan hero and the wise old sage resonate with us not because George Lucas is a genius (because let's be honest, he isn't! [Cue angry comments.]), but because these characters have been stamped over millennia into our deepest, most immutable subconscious.

(Here's where I get to my point!)

Jung may be discredited among psychologists, his ideas useful only as fodder for works of science fiction genius, but I don't care; I'm more interested in the practical aspects of his ideas. While Jung argued that the unconscious mind should rule over the conscious, most of our success comes by doing the opposite. Whenever we overcome a fear, change our habits, or learn a skill, we make a conscious, deliberate effort to alter our unconscious, instinctive selves. This is particularly true for educational efforts: we consciously struggle with complex ideas until we successfully embed them into our natural intuition.

With that last point comes a dilemma. As my education progresses, my capacity for precise, disciplined, hyperconscious thought improves tremendously. But I feel my creative, impulsive, illogical brain grinding slowly away--not rusting shut with disuse, but being crushed by the weight of ever-present rationality. I love graduate school, don't get me wrong, but as I sharpen my talent for proofmaking and paper writing--in other words, as I expand my rational self--I crowd out vital elements of my mind. I've argued elsewhere that science is not a mechanical process, but one that demands passion and creativity, in which the most valuable ideas spring from the irrational and are then honed under the care of the rational. So I am left with an important question. How do I expand the capacity of my rational mind without squashing the sparks of creativity? How do I go forward in training without neutering the very neuroticism that nurtures the innovation necessary for successful research?