America's Secular Religion

Everybody knows somebody who claims to be religious but adheres to his or her professed religion only inconsistently, or maybe hardly at all. Less widely noticed is the phenomenon of the person who claims to be irreligious but is actually driven by blind faith. Some of the most popular religious creeds fly under the radar thanks to shrewd marketing by their adherents (although they thereby miss out on the tax exemption). So it's important to keep in mind what "religion" is exactly. Well, it's a lot of things, among them:

1. Religion provides a moral code, a method for distinguishing right from wrong;
2. Religious truth flows from the top down (frequently originating with God at the very top);
3. The religious are loyal to their creeds. The creed becomes the prism through which they interpret the facts of the world around them.

Science, by comparison, has no concern with moral questions of right and wrong--its job is merely to tell what is or is not, not to make judgments of good or bad. Science does not demand belief on the basis of authority; any scientific theory should be verifiable (or falsifiable) by anyone willing to replicate the necessary experiments. It's the ultimate democracy (in theory): the truth is equally accessible to everyone.

(If at this point you object that real scientists don't always--or even usually--live up to this ideal, just bear with me until you've read the rest of this post.)

People think of the old Soviet Union as devoid of religion, but in fact its dominant religion was called Communism. Communism was a religion by every one of the criteria above: it provided the moral compass; the creed was defined and promulgated by the central authority of the State; and true believers shaped their entire world-view around the logical demands of Communism.

A recent report by the Pew Forum finds an increasing percentage of Americans profess adherence to no religious creed. I'm not so sure this would hold if we included some of the new "non-religious" religions. In particular, while communism never caught on big in the U.S., the younger creed of psychiatrology, as I call it, has a definite grip on a large segment of the population.

So many people who are skeptical of what they read in the newspaper or even in the Bible are suddenly credulous when it comes to a book written by a self-help guru, especially one with "Dr." in front of his or her name (which always strikes me as a sign of insecurity--like the Seinfeld character who insisted on being called "Maestro"). If you want to just make stuff up, and then have it become widespread conventional wisdom, psychology is the field for you (although politics gives it a run for its money).

So we end up with such "everyone knows" phenomena as the five stages of grieving, or the mid-life crisis. (Here's a fun thing to do. If you know a guy over the age of 35, start commenting on every change in his lifestyle--for example, if he decides to go to the gym, or seek a promotion at work, or maybe grow a beard: "You must be having a mid-life crisis." He'll love it.) The problem with these ideas is that if you look for actual research supporting them, it isn't there, or you might find a single study with marginal results, not supported by follow-up work.

Pure science doesn't touch on questions of right and wrong, but psychology, especially pop psychology, is not reticent about telling people what they should or should not do. The terms "right" and "wrong" are not used, but terms such as "healthy", "deviant", and "syndrome" are used with the same force. Consider, for example, the phenomenon of homosexuality. The "Bible" of psychology, that is to say the Diagnostic and Statistical Manual of Mental Disorders (let's call it the Psychobible for short), listed homosexuality as a mental disorder until 1973, when the classification was removed. Now I claim the word "disorder" carries a value judgment, in that it implies a condition that demands to be corrected, or is at least regrettable. As such, when we use a word like "disorder" we are outside the realm of science.

Was there some kind of scientific breakthrough in 1973 that suddenly revealed homosexuality to lie within the normal spectrum of human behavior instead of outside it? Of course not, just as there was no scientific basis for the original classification. The change simply reflected a change in psychiatrists' feelings about homosexuality.

The religion of psychiatrology is distinct from the legitimate science of psychology, which includes plenty of solid, fascinating research. Unfortunately the dividing line is not a sharp one. Even professionals in the field sometimes draw conclusions based on faith rather than facts. In 1973 David Rosenhan ran an experiment in which perfectly sane people were admitted to psychiatric hospitals on the strength of a single report of hearing voices. These healthy test subjects were thereafter judged insane and kept confined for weeks at a time.

Currently one of the most damaging tenets of psychiatrology is the concept of so-called Attention Deficit Hyperactivity Disorder (ADHD). This first appeared in the 1980 edition of the Psychobible as Attention Deficit Disorder, which was replaced by "ADHD" in 1987. The rate of diagnosis has exploded from essentially zero in 1979 to about 10% of schoolchildren today (and 13% of boys). But only in America (and lately in Great Britain as well, it seems)--this concept essentially doesn't exist in Japan, Russia, or other countries that routinely outperform us in education. The U.S. accounts for something like 90% of the world's consumption of Ritalin.

I hope the absurdity of labeling 13% of boys with a "disorder" is self-evident. If not, then consider: is a condition affecting 20% of the population a "disorder"? How about 50%? 90%? In this country, only 2% of the population is redheaded, but we don't label them as having some "disorder" of the hair. If 13% of children are unable to meet the schools' expectations for sitting in a chair and listening passively, does the fault lie with the children, or with the schools?







In Defense of Grammar

Here are some other language blogs worth checking out:

Steve Kaufmann at The Linguist
Benny at Fluent in 3 months (not sure about Benny's last name, although I have a theory that "Benny" is his last name and "Irrepressible" his first).

If you read both of these regularly (as I do) you will find some fundamental disagreements between them on the best ways to learn languages; on the other hand, they seem to agree on other points, specifically:

(1) Traditional language classes are a waste of time; and
(2) Studying grammar is a waste of time.

Today I'm out to refute hypothesis (2). I'll leave hypothesis (1) for the future.

As I understand it, the main arguments against grammar study run as follows:

1. Grammar is scary and frustrating. The terminology is unfamiliar and confusing.
2. Real fluency in a language demands speaking intuitively, without stopping to analyze what one is saying.
3. Listening and repeating (the way babies learn) is more "natural" than, and preferable to, the "artificial" approach of learning rules and memorizing vocabulary.

Overall, I think the attitude of wanting to study a language but not wanting to study the grammar is misguided. It's like wanting to learn one's way around a foreign city but not wanting to use a map, because maps are covered with intimidating symbols, and someone who really knows the city wouldn't need a map, and babies don't use maps anyway. However, to address these arguments point-by-point:

1. Grammar is scary and frustrating. This sounds to me like the real issue is that grammar doesn't yield immediate gratification. Someone looking to learn a new language ought to be the last person to object to having to learn new words. And of course the terminology is unfamiliar, because the concepts are unfamiliar. This is an important part of what you get with a new language anyway--a new way of looking at the world. And this particular new way of thinking ultimately streamlines language learning.

I experienced this myself just recently, in Arabic class, when it came to understanding why nouns end with the vowel a in some situations, u in others, and i in yet others. Having previously encountered noun cases in Russian and Sanskrit, and even Latin (after which Arabic noun cases are a day at the beach), I found the explanation immediately clear. My younger, nimbler, but naive classmates, unfamiliar with the concepts of "nominative", "accusative", and "genitive", had a vastly more difficult time understanding what was going on.

2. Real fluency demands speaking intuitively. It's true that a fluent speaker can't be stopping to think about "rules" in the course of formulating a sentence. But the ability to step back occasionally and consciously analyze a sentence is also exceedingly useful, and makes one a better communicator. Drivers instinctively keep to the right side of the road and stop at red lights, but when asked, all drivers can explain the rules underlying their behavior. This makes them better drivers, not worse.

For Sanskrit or Arabic, for example, the development of a formal grammar was an important cultural milestone, and indeed a major achievement of civilization--the realization that this "thing" (language) that everyone uses instinctively could be analyzed and codified. In the age of computers and software, the idea of "grammar" has become an essential ingredient of our technological civilization. Why take pride in one's ignorance of it?

3. Grammar study is "artificial." As pointed out by Khatzumoto (although perhaps in gentler terms), babies are actually lousy language learners. Who else could spend an entire year in a completely immersive environment, acquire only a handful of words, and still be unable to form even a simple sentence? Children do have a talent for mimicry and the ability to soak up large amounts of vocabulary, but adults can more than make up the difference with rational, analytical thinking.

What's more, what you hear in any language is only the surface layer of something that runs far deeper. Consider the English word "resign". The "g" is silent, so why not write the word as "resin"? Because the "g" still exists below the surface, as you see when you pronounce "resignation."

When I first started Romanian, I used the "child" method--just listening and repeating phrases. Meanwhile I tried to analyze what I was hearing (probably part of my personality, but it's a good idea for anyone). I noticed early on that nouns came in masculine and feminine, but a particular puzzle was that the same thing could be masculine in one sentence and feminine in another. The puzzle was cleared up only when I started reading about grammar and learned that Romanian nouns also have a third gender, which is masculine in the singular and feminine in the plural. Instant clarity.

Could I have figured this out on my own? Maybe, but only by piecing together clues from many different sentences and gradually figuring out the pattern of masculine versus feminine. Why not take advantage of the pioneers who did the analysis before you got there?

A final argument, which may not apply to everyone: I like grammar. It's the same pleasure I get from watching a seagull soar or a dolphin swim--a naturally designed structure, beautifully adapted for the task at hand. The only difference is that seagulls and dolphins exist in physical reality, whereas grammar exists in abstract ideas. But then I'm a mathematician--I like abstract ideas.

(Image above from Gardiner's Egyptian Grammar.)



Practical Joke #8

Photo by TheLichfieldBlog

1. Go to the thrift store. Buy a used baby car seat and a life-sized baby doll.

2. Epoxy the seat to the roof of your car and strap the doll into the seat.

3. Drive around town. Even better in the rain and/or snow.


A Face Language


You know that strange nagging feeling you get sometimes when you meet someone new--that you've seen this person somewhere before? Eventually you might figure it out--the person has a resemblance to someone else you know, either a celebrity or a personal acquaintance. I had just this nagging feeling about one of my students last year. Eventually, to my great relief, I figured it out--he looked just like Dwayne Johnson's (hypothetical) younger, slimmer brother.

When you "get" the resemblance, you might be a little surprised--it may span different races, different ages, even different genders. My advisor said it once: "there are fewer possible faces than there are people walking around."

I thought of a game one could play to test his theory. Take a collection of passport photos. Process them to eliminate differences in skin tone, hair style, or size. Cut each into a top half and a bottom half. The goal is to match up the top and bottom halves. Who wants to bet that you could do much better than random matching?
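For the curious, here's a minimal sketch of how one might prepare the materials for this game in Python, assuming a folder of roughly face-centered photos and the Pillow imaging library. The folder name and image dimensions are just placeholders, not a prescription.

```python
# Sketch: prepare the face-matching game from a folder of passport-style photos.
# Assumes the Pillow library (pip install Pillow); paths and sizes are illustrative.
import os
import random
from PIL import Image, ImageOps

PHOTO_DIR = "photos"      # hypothetical folder of roughly face-centered photos
SIZE = (200, 260)         # normalize every photo to the same frame

def normalize(path):
    """Flatten out differences in color and size: grayscale plus a fixed frame."""
    img = ImageOps.grayscale(Image.open(path))
    return ImageOps.fit(img, SIZE)   # crop/scale to uniform dimensions

def split_halves(img):
    """Cut a normalized photo into a top half and a bottom half."""
    w, h = img.size
    return img.crop((0, 0, w, h // 2)), img.crop((0, h // 2, w, h))

tops, bottoms = [], []
for i, name in enumerate(sorted(os.listdir(PHOTO_DIR))):
    top, bottom = split_halves(normalize(os.path.join(PHOTO_DIR, name)))
    tops.append((i, top))
    bottoms.append((i, bottom))

# The game: shuffle the bottom halves and try to reunite each with its top.
random.shuffle(bottoms)
```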

Another variation on this game: make two sets of photos and have two players (or teams of two players). The game this time is that one member of the team picks a photo at random and describes the face, and the other member of the team needs to find the matching photo based solely on the description.

I suspect that the best strategy in this game is not to describe someone as "eyes close together, nose 35% larger than normal, etc." but rather as "Dwayne Johnson's slender younger brother," or "the missing love child of Walter Cronkite and Britney Spears," and so on.

Since one of the things I'm not very good at is remembering names and faces, I've often wished for a concise, yet precise way of verbally describing a face--again, not in terms of dimensions but in terms of the impression on the viewer: a face language, so to speak. Think of what a boon such a language would be, for example, to describe bank robbers.

I have no good idea how such a language could be constructed, but the proof that it should exist lies in the art of the caricature. A good caricaturist (for example, Mort Drucker's work above) can draw a face recognizable as a unique individual with a few deft strokes. Hirschfeld's work was positively cartoonlike and yet instantly recognizable. And maybe one way to identify such a language is to set a pair of caricaturists to play the face matching game and analyze what they say to each other.