Tuesday, October 04, 2016

Statistical Projection vs. Scientific Generalization

When my daughter was quite young, before she was able to walk, she saw a ball bounce and roll. She laughed heartily. I don’t think she needed to observe a sample of five hundred round, spongy things bounce and roll in order to conclude that round, spongy things bounce and roll.

Similarly, neurologist V. S. Ramachandran, proponent of the value of individual cases to science, has remarked (quoted in Doidge, p. 178):

Imagine I were to present a pig to a skeptical scientist, insisting it could speak English, then waved my hand, and the pig spoke English. Would it really make sense for the skeptic to argue, “But that is just one pig, Ramachandran. Show me another, and I might believe you!”
The skeptical scientist, typical of nearly all scientists today, insists that the only way to establish knowledge is to observe five hundred cases, or a thousand, or two thousand. Anything less is an isolated instance, often denigrated as anecdotal evidence. In the absence of a sound theory of universals—because David Hume failed to find a necessary connection between cause and effect, and logical positivism picked up the banner of science, followed by Karl Popper’s notion of falsificationism—statistical “generalization” is said to be the only valid method of science.

It is this premise that allows modern psychologists to dismiss the entire Freudian psychoanalytic corpus, including the concept of repression, as unscientific, or worse, as pseudoscientific. Why? Because Freud’s evidence is “anecdotal” and the experimental methods of the physical sciences cannot validate his ideas. It is this premise that allows nearly all scientists to dismiss the notions of consciousness, free will, and introspection.

There is, however, a sound theory of universals: Ayn Rand’s theory of concepts, which I have summarized in my two books (In Defense of Advertising, pp. 147-52, and Montessori, Dewey, and Capitalism, pp. 82-86). Conceptualization is a process of universalization. It is based on Aristotle’s formal cause, which says that an entity’s actions are determined by its identity. Identifying universal relationships between entities and their actions gives us principles and laws.
Concepts identify the nature of entities. Their essential distinguishing characteristics are universal. It is not that hard.

Thus, my daughter’s laughter at witnessing the round spongy thing bounce and roll was her conceptualization of that entity, by observing its essential distinguishing characteristic. Of course, she did not have words to describe the process at the time, but her mind, nonetheless, was processing her perception. The same can be said about Ramachandran’s English-speaking pig (assuming no tricks of ventriloquism). One does not need a sample of five hundred English-speaking pigs to conclude that something quite unusual has just happened.

Statistical projection—and the correct word is “projection,” not generalization—has its place in our search for knowledge, but it does not replace scientific (inductive) generalization.

Statistical inference, as it is also correctly called, projects a finding from a sample to a population. Thus, if data in a sample of 500 American men show that two percent have red hair, and the researchers did not commit any flagrant methodological errors, then a projection (or inference) can be made, within a margin of error, that two percent of men in the entire country have red hair.
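The arithmetic behind that margin of error is simple enough to sketch. Here is a minimal illustration, assuming the standard normal approximation for a sample proportion; the 500-man sample and the two-percent figure are the hypothetical ones from the paragraph above, and the function name is my own:

```python
import math

def proportion_margin_of_error(p_hat, n, z=1.96):
    """Margin of error for a sample proportion at roughly 95%
    confidence, using the normal approximation."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# The hypothetical sample from the text: 2% red-haired men, n = 500
p_hat, n = 0.02, 500
moe = proportion_margin_of_error(p_hat, n)
print(f"Projected share: {p_hat:.1%} plus or minus {moe:.1%} "
      f"({p_hat - moe:.1%} to {p_hat + moe:.1%} of the population)")
```

Note what the calculation does and does not deliver: a range for the population figure, with a stated chance of being wrong. It never claims that redheadedness belongs to the nature of American men.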

A projection moves from some to some—from two percent of the sample to the same two percent in the population.

A scientific generalization, on the other hand, such as forming a concept of round, spongy things as entities that bounce and roll, or of human beings as beings that possess the capacity to reason, moves from all to all.

All of the balls I have observed bounce and roll; all humans that I have observed possess the capacity to reason. Therefore, all balls, past, present, and future, by their very nature, bounce and roll. The same conclusion is drawn for all humans.

The place of statistical projection? As I wrote in In Defense of Advertising (p. 157), “Statistics is a branch of mathematics and, as such, is a method of measurement. Statistical inference . . . is used only in contexts in which we do not know—or there do not exist—universal laws that could explain the causal relations of the variables.”

Meteorology represents the former, because of the large number of unknowns and difficult-to-measure variables in constructing weather forecasts (all of which, though given many different names, are forms of statistical projection).

Predictions of people’s behavior represent the latter, because free will precludes the existence of universal laws governing all of our behavior. We make statistical projections of what others will do in the future based on our current and past knowledge of them, albeit not projections based on randomized samples, unless we are professional researchers.

Statistical projection assists scientific research. It is not a substitute for it.

And one does not have to accept everything Freud said to acknowledge his accomplishments, not least of which is his presentation of the first comprehensive theory of psychology.

Freud was looking for universals, and he found a few: repression, defense mechanisms, and the significance of the subconscious in influencing our present behavior.

They may not be round, spongy things, but I am laughing heartily—at my discovery of these Freudian universals!

Saturday, September 10, 2016

Is Intelligence Inborn?

My IQ—the so-called intelligence quotient—is probably twelve.

(Psst! And I’m proud of it!)

I say “probably” because I have never known my score. One day in junior high school we were all herded into the auditorium to take a standardized test. After about the first page of questions, I decided, “This is stupid,” and stopped answering. Hence, my presumed score. Teachers never told us what the purpose of the test was.

Intelligence, as defined by the Oxford English Dictionary, is “understanding as a quality admitting of degree; spec. quickness or superiority of understanding, sagacity.”

There are two usages here. In one, “the intelligence” or “understanding” is a synonym for the rational faculty, our capacity to reason and think conceptually, which distinguishes us from the lower animals. The other usage, as the OED says, “admits of degree.” Thus, there are supposedly brighter, smarter, more intelligent people, and there also are the dull and dumb.

Degree of intelligence is not the same as quantity of knowledge or retained subject matter. I have met many uneducated blue collar workers and unskilled laborers who are more intelligent than college professors!

Intelligence is supposed to be an inborn ability, not an issue of how much knowledge one has accumulated and can spew out to impress those supposedly less endowed.

Ayn Rand has variously defined intelligence as “the ability to deal with a broad range of abstractions” and “the ability to grasp the facts of reality and to deal with them long-range (i.e., conceptually).”

I don’t doubt that this is a component of what we think of as smart, but a genius who has a greater degree of intelligence than the rest of us, to put it in the vernacular and to relate it to the OED’s definition, is a “quick wit,” a person who grasps an insight or makes a connection ahead of everyone else.

By analogy, an entrepreneur is someone who sees and seizes profit-making opportunities ahead of others. Some of us may also see the opportunity but we often do not act on it. The entrepreneur does.

Similarly, the highly intelligent person does not just make a previously unseen connection, but acts on it by conducting an experiment or writing a book. Some of us may have daydreamed about a fuzzy “what-if” linking but never got beyond the fuzziness.

The genius and entrepreneur both must hold in mind a great deal of knowledge related to their field, possess the ability to work with a broad range of abstractions, and think long-range (i.e., conceptually), but it is their “quick wit” that puts them out in front of others. (And contrary to what our Marxist-laden intelligentsia may think, entrepreneurs can be highly intelligent and even geniuses.)

The problem I have with the concept of intelligence, as with all the other traits I discussed in last month’s post on the genes/environment debate, is that other variables, such as that unmentionable one, free will, and especially interest, can explain what is supposedly inborn.

Interest is a desire that directs intensive and sustained attention to a particular goal or object. It is interest, when put into action as effort, that drives a five-year-old to become a concert pianist as an adult. It is interest that drives entrepreneurs to think day and night about the next profit-making opportunity. And it is interest that drives geniuses to uncover every stone until they have found that next important discovery.

Interest is a potent motivator that can separate the highly accomplished, whom we would also likely call highly intelligent, from the rest of us. Strong interest—which also has to mean here the absence of psychological inhibitions and presence of choice or free will—could well be the key variable to explain the “degree of understanding” that the highly intelligent possess.

Yes, knowledge, or a context of subject matter, is required to make great accomplishments and to make great discoveries possible, but I am convinced that anyone with a normal brain, a good teacher, and patience can learn that context of knowledge, however abstract it may be. Interest and will power, if present, can take such a student to the next level.

So is “quick wit” inborn?

Those twin studies don’t prove anything. For nearly a hundred years they have attempted to prove that many traits, including intelligence, are inherited. Clinical psychologist Jay Joseph has thoroughly examined the studies of identical and fraternal twins, both reared together and reared apart (1, 2, 3, plus three books, the latest here), and has declared them “one of the great pseudoscientific methods of our time . . . [that] will eventually be added to the list of discarded pseudosciences where we now find alchemy, craniometry, and mesmerism.”

What about IQ tests? Please! Aside from the fact that these tests, along with college entrance examinations, correlate with socioeconomic status, and that entrance examinations do not predict college success (high school grades are the better predictors), IQ testing is a contrived situation, as is paper-and-pencil testing of all kinds. Testing seldom corresponds to the reality it is supposed to represent.*

As I wrote in Montessori, Dewey, and Capitalism (note 19, p. 158), “Supermarket shoppers in one study performed arithmetic calculations far more accurately in the store than on a formal test. And one boy, considered the dumbest in his class, was discovered by his teacher to be a paid scorekeeper in a bowling alley, simultaneously tracking the progress of two teams of four players each. The teacher promptly created word problems, requiring students to calculate scores for games of bowling. The boy could not do the problems.”

I think I’ll go find that dumb kid and have him teach me how to score games of bowling. I never could figure that out.
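For what it’s worth, the scorekeeping that boy had mastered really is nontrivial arithmetic. Here is a minimal sketch of traditional ten-pin scoring, assuming the standard strike and spare bonuses; the function and the sample game are my own illustration, not anything from the study:

```python
def score_game(rolls):
    """Score one game of ten-pin bowling from a flat list of pins
    knocked down per roll. A strike earns 10 plus the next two rolls;
    a spare earns 10 plus the next roll; the tenth frame's bonus
    rolls simply appear at the end of the list."""
    total, i = 0, 0
    for _ in range(10):                      # ten frames per game
        if rolls[i] == 10:                   # strike
            total += 10 + rolls[i + 1] + rolls[i + 2]
            i += 1
        elif rolls[i] + rolls[i + 1] == 10:  # spare
            total += 10 + rolls[i + 2]
            i += 2
        else:                                # open frame
            total += rolls[i] + rolls[i + 1]
            i += 2
    return total

print(score_game([10] * 12))  # twelve strikes: a perfect game, 300
```

The boy in the anecdote was running two four-player versions of this bookkeeping in his head at once, in real time—and still failed the classroom word problems.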

No wonder my IQ is only twelve!

*And let us not forget that IQ testing originated in the eugenics era, designed to sort out the “dumb” and “feeble-minded” for isolation and perhaps sterilization. Today, IQ testing and college entrance examinations perform a similar function (sterilization excepted), shunting the “dumb” off to the less prestigious colleges and trade schools (and in many countries to a blue collar life that they cannot overcome). The ones who score a little better than the “really dumb” ones and live in government-created slum areas are given special favors and money to attend the prestigious universities. There, many suffer a mismatch with their classmates—and flunk out. The sorting continues. (See 1, 2, 3.)

Monday, August 15, 2016

Genes vs. Environment: Anyone for Free Will?

Do genes cause behavior? If they do, one would expect to see evidence of criminality, genius, schizophrenia, homosexuality, and evangelical Christianity in infants. All of these behaviors, plus many others, have been said to be inborn.

To expect an infant to exhibit these traits is absurd. To say that an infant has inherited the potential to become a criminal, or evangelical Christian, says nothing. We are all born with that potential, plus countless other potentialities.

Does environment cause behavior? The trouble with this assertion is that there are always exceptions to the good and bad things environment does to children when they are growing up.

Some children reared in crime-ridden, slum neighborhoods become criminals while others do not, even if they are siblings in the same family. The same can be said for children reared in safe, wealthy suburbs. Others raised in religious families follow their parents and become evangelical Christians, while some rebel and become atheists.

The determinism of the genes/environment axis is a self-contradiction—determinists have to acknowledge that they are determined to believe in determinism. Yet they pretend to be making a logical choice to believe in determinism.

Something other than genes or environment must be operating to cause our behavior.

Here’s a novel idea. How about thought, that processor of genetic inheritance and environment that generates our motivation and directs behavior?

Thought, or more broadly, consciousness, makes errors and has to control itself in order not to make mistakes. Free will is cognitive self-regulation, which means we may choose to focus on the facts or evade them, allowing other factors, such as emotions, presuppositions, or political doctrine, to interfere with correct perception.*

Our guide to the correct perception of reality is the 2500-year-old science of thinking called logic. As the discipline and art that regulates internal thought processes, logic is the quintessential introspective science. Proponents of the genes/environment axis, however, do not want to admit that logic is introspective, because then they would have to admit that consciousness controls behavior and that introspection is a valid method of science.

Psychologically, this means our personalities are self-created. The cause of behavior is the innumerable conclusions we have drawn—the myriad thoughts, logical or not, we have had—about our genetic inheritance and the environment in which we live, from the time we were able to process words right up to the present.

These innumerable conclusions and myriad thoughts accumulate and become the mental habits by which we live. As habits (or psycho-epistemologies), many have become so automated, buried in our subconscious with their origins largely forgotten, that they feel to us as if we were born that way, or that something external is making us act the way we do.

Lack of introspection, or more specifically, introspective skill, to examine our motivating premises—thoughts, evaluations, emotions—makes it hard to appreciate how much control we in fact have over our lives.

Habits can be good or bad, the good ones leading us to live a happy life, the bad ones not so happy. The examined life, to paraphrase Socrates, is worth living; the unexamined one leads to problems in living.

Mental habits are all learned.** We were not born knowing how to drive a car, for example, but as adults we can safely drive while carrying on a conversation and listening to music on the radio. All of our actions follow this pattern.

Certain habits, generated from core evaluations and other less fundamental but nevertheless significant evaluations, are usually acquired when very young, from toddlerhood on. We retain these early conclusions about ourselves (our sense of personal identity), the world, and other people and hold them as unquestioned absolutes.***

It is in toddlerhood that we begin to speak, which means we are beginning to think in concepts and words.

Young children do not usually form these important conclusions through explicit reasoning, but through a process of emotional generalization. At the risk of oversimplification, an emotion at this stage in life, if it could be put into words, might say something like, “That made me feel good about myself. I’ll do it again.” Or, “I didn’t like that and I’m not going to feel it again.”

Repeated many times over, the former, if based on a correct perception of reality, can lead to the development of self-esteem; the latter, which most likely includes errors, to repression and subsequent psychological problems.

If we were taught from an early age to look inward to identify our thoughts, evaluations, and emotions, and to correct the errors we have made, we would grow up with healthy psychologies. Most of us, however, have not been taught much of anything about psychology, in childhood or adulthood.

Thus, when the genes/environment axis comes along, its claim that our behavior is caused by something we have no control over seems to make perfect sense.

The irony is that genes and environment do have an influence on us, in the sense that genes give us gender and skin color and environment can make life easy or difficult, but we are the ones who develop attitudes—conclusions, evaluations—about gender, skin color, and environment.

To help us correctly perceive and evaluate what genetics has given us and what goes on in our environment, teaching is crucial. Parents and the schools need to instruct children in the skill of applying logic to their own psychologies.

The unfortunate consequence of the genes/environment debate is that the axis devalues the environmental influence of an education in sound psychology. For that is what is required to help us use our free will to assess genetic inheritance and environment and thereby make better choices to live a happier life.

* This is Ayn Rand’s theory of free will as volitional consciousness.
** All habits, at root, are mental. I use “mental” here to emphasize their psychological origin.
*** The concept of core evaluations was identified by psychologist Edith Packer and presented in her lecture “Understanding the Subconscious” in 1984. Lectures on Psychology, chapter 2.

Thursday, July 07, 2016

The Fascist Early Progressives

This post title may be a bit extreme, to call progressivism “fascist,” but not by much.

After all, Ludwig von Mises referred to fascism as “socialism of the German pattern” to distinguish it from the Russian version. Instead of expropriation of private property to achieve socialist states, Nazi Germany and Fascist Spain and Italy imposed extensive government regulations to control private life, both business and personal.

The Progressive Era in the United States, from about 1890 to 1930, established the same pattern, but it was rooted in its leaders’ training in democratic socialism at Prussian universities.

The early progressives’ specific policies, as comprehensively documented in Thomas C. Leonard’s book Illiberal Reformers: Race, Eugenics, and American Economics in the Progressive Era, would not be considered politically correct today, but their fundamental principles of using think-tank-guided “experts” and government guns to achieve socialist goals have been internalized by modern liberals and conservatives alike, and vastly extended to control nearly all aspects of private life.

Most of the early progressives were reared in old New England families, which made them evangelical white Anglo-Saxon Protestant males, and their program was largely independent of political affiliation: it was strongly supported, for example, by both the Republican Theodore Roosevelt and Democrat Woodrow Wilson.

Here’s a first taste of the progressives’ thinking, as stated in a review of Leonard’s book: “In the early twentieth century, progressives displayed an open contempt for individual rights. In a 1915 unsigned editorial at this magazine [The New Republic], the editors ridiculed the Bill of Rights as a joke.”

The reviewer continues, “If Leonard didn’t have the quotes from prominent progressives to back up his claims, this would read like right-wing paranoia . . . .” And the quotes are numerous.

This book is an important corrective to the history profession’s biased glamorizing of early progressivism.

The liberal individualism of the Scottish Enlightenment was viewed by these early progressives as selfish and therefore un-Christian and immoral. Their evangelical focus shifted from saving souls to saving society, from the individual to the collective. The “public” or “common good” became the standard for policy.

Indeed, one of the motivations for founding the American Economic Association in 1885 was to counter and exclude the ideas of classical liberals Herbert Spencer and William Graham Sumner.* The promoters of progressivism were mostly economists and sociologists: Richard T. Ely, John R. Commons, Edward A. Ross, and Irving Fisher, plus many more.

Elitism and social engineering, not democracy, were their motivating aspirations.

Their form of elitism sought to exclude certain groups, believed to be inferior, from participating in much of society. For example, they eagerly sought to preserve race purity and maintain a living wage for workers of northern European extraction.

Among the groups targeted for exclusion were African Americans, women, and immigrants—especially the Chinese and those from southern and eastern Europe, which especially meant Jews. The disabled, feeble minded, and insane were also inferiors who were excluded to asylums and special farms away from the cities; in some cases they were sterilized.

The first three groups (the early progressives were not Marxists and did not use the term “classes”) were less skilled than their white Anglo-Saxon Protestant male counterparts, so by their willingness to accept a lower wage, they threatened to reduce the earnings of the “more deserving” male workers.

Minimum wage and immigration laws were the progressives’ solution.

Maintaining racial purity was more of a challenge, but the “state-of-the-art science” of eugenics came to the rescue. “Well-born” is the meaning of the term, coined by Darwin’s half-cousin, Francis Galton. The aim of eugenics was hereditary control of the race through compulsory sterilization and euthanasia.

Up to 60,000 sterilizations were performed in the United States, as late as 1972. Justification for the practice was given in a 1927 Supreme Court decision, authored by Justice Oliver Wendell Holmes and joined by, among others, William Howard Taft and Louis Brandeis. The legal argument said compulsory sterilization was no different from compulsory vaccination.

Cancelling compulsion in either case was not an option—to those who knew best.

American psychiatrists promoted and supported the Nazi sterilization program that ran from 1934 to 1939. A few supported compulsory euthanasia. The euphemistically labeled “mercy killings” began in Nazi Germany in 1938—in gas chambers disguised as showers.

Eugenics is not much talked about today, or taught in the schools—for the obvious reason that modern progressives do not want to be associated with Nazi Germany. The eugenic connection to progressivism is also seldom mentioned or taught, but it fit the progressives’ program like a glove.

Fascist progressivism? Theodore Roosevelt saw race suicide as the greatest problem of civilization and, according to H. L. Mencken, whom Leonard quotes, “believed simply in government,” not democracy.

The “quality” of the vote, not the quantity, was what counted for progressives. Wealth and literacy tests were recommended to determine who should be allowed to vote. Voter turnout in national elections fell thirty percent between 1896 and 1924, even more in the Jim Crow South.

Woodrow Wilson praised those “sturdy stocks . . . [from] the North of Europe” and denigrated immigrants from southern and eastern Europe. He also derided inalienable rights as “nonsense.” The Wilson administration re-segregated the federal government.

Fascist progressivism? Much, much more can be found in Leonard’s book.

* “Social Darwinism,” Leonard points out in a journal article, was a less-than-accurate construct of Richard Hofstadter in 1944. It then became a favorite pejorative of modern historians, used to disparage the Progressive Era’s capitalism and capitalism’s advocates. The phrase was hardly used during the period, least of all by Spencer or Sumner.

Sunday, June 12, 2016

From the Stick Motivation Department, Part Two: Class Participation

In last month’s post I discussed the coercive parenting technique of assigning chores to children. The alleged lesson of such coercion is to teach children the value of work, though it likely teaches them to hate it.

Coercive teaching contributes a number of techniques to the stick motivation department. Let’s take a look at class participation.

For middle and high schoolers, and even college students, teachers feel obliged—and claim the unquestioned right—to coerce quiet members of the classroom to “come out of their shells,” lest they fail to succeed in life or live up to their potential. Grades based on class participation (by as much as fifty percent, I have heard) are the brass-knuckle approach to teaching this lesson.

Five to ten percent, perhaps as extra credit, may have some instructional value. But fifty percent? These classes of twenty to forty students are not courses in public speaking.

And grades, after all, are the carrots and sticks by which teachers maintain control of their charges. I’m not the first to suggest that grades be dumped from the classroom entirely.

In the old days, the traditional (and coercive) recitation technique of class participation required students to summarize the content of their reading assignment, or, frequently, to recite something from memory. If not accomplished to the liking of the teacher, the kids would have their knuckles rapped with a ruler, or worse. In ancient Rome, they were beaten with a stick.

The modern version is a mixture of old-style recitation and analysis. The former, as the new schoolmarms are wont to say, should be kept to a minimum, because “we have all read the assignment.” Memorization, of course, is scorned as authoritarian and having no place in school. The latter, analysis, can include putting the reading material into different words, evaluation, and, too often, the spewing of undefended opinion.

According to ed school edubabble, such discussions will help teach students how to think. They usually degenerate quickly into BS sessions. And some of the more talkative students have mastered the technique of impressing teachers with their glibness; the quiet ones are then marked down.*

In recent years, some class-participation teachers have discovered—often experiencing revelations in the process—that their heavy-handed approach to getting those pesky, resistant-to-talking quiet kids to speak up in class may not be the best thing for them.

This has come about largely due to Susan Cain’s 2013 book Quiet: The Power of Introverts in a World That Can’t Stop Talking and her subsequently established website Quiet Revolution. (See my related posts here and here.)

Jessica Lahey is one such reformed class-participation tyrant. Writing in The Atlantic, she firmly defended her conviction that quiet kids must be forced to speak up. When she received an “avalanche of angry comments,” many of which, to put it mildly, declared her “uninformed,” she wrote a softened article on Cain’s website. Lahey acknowledges that she was influenced to alter her teaching by Cain’s book and other articles on the topic. (See especially Schultz and Cain for examples of teaching quiet kids without putting them on the spot.)

The upshot of “class participation reform” is that introversion and shyness are not the same and that any behavior can be motivated by multiple causes, not just what the extroverted teacher assumes is operating in the quiet kid.

“Shyness is the fear of social disapproval or humiliation, while introversion is a preference for environments that are not overstimulating” (Cain, p. 12). A room filled with twenty to forty classmates can produce considerable overstimulation for an introvert. Extroverts prefer the stimulation.

And most extroverted teachers assume shyness and introversion are identical. They also do not recognize that quiet kids may be actively listening to the other talkers, waiting for the moment to speak up when they have formulated what they would like to say. They also might be taking notes, say, for a subsequently required paper. Or, something extroverted teachers usually do not want to hear or acknowledge, the student may not like the teacher or the class, or both.** Certainly, there are other motivations.

But just as not voting in an election is participation in the political process, so also is not speaking up in class a form of participation. Teachers need to respond to, and find techniques of reaching, all personality types sitting at their desks.

A “one size fits all” approach to teaching, such as the assumption that speaking up in class is good for everyone, invariably brings out the specter of stick motivation.

* I must point out one more time that John Dewey, the alleged father of progressive education—“alleged” because he gave the epithet to Francis W. Parker—lectured when he taught, expected excellent memorization from his students, and wrote a book in 1938 to repudiate many progressive techniques used in his name, such as the necessity of class discussion to teach students how to think. For Dewey, subject matter was fundamental, because it is the “working capital” of thought.

The premise of many teachers today, as one colleague said to me years ago, is that “we teachers talk too much as it is. We have to get the kids talking.” I took that to mean less work for the teacher, something that was explicitly stated by my grad school classmates as justification for group projects: one paper to grade instead of four or five.

** Yes, I know there are teachers who brag about how they don’t care whether or not students like them or their courses. But they should.

Wednesday, May 04, 2016

From the Stick Motivation Department: Chores

There are many ways in which adults lord their size and power over children.

It usually begins with spankings of a disobedient toddler by, say, a towering six-foot-plus dad who leers, yells, then hits the helpless tot. (See related posts 1, 2, 3.)

Why? Aside from the excuse that “that’s the way my parents treated me and their parents treated them,” etc., ad infinitum, the most common rationalization is that children have to learn to mind, lest they run out into the street and get killed, or turn into juvenile delinquents, or become criminals.

“Didn’t hurt me none” is the less than introspective response of some adults when asked why they did not question their parents’ stick-motivation techniques.*

The assignment of chores to kids so they allegedly will learn how to work hard and become responsible citizens of society is another form of stick motivation.

Webster’s Unabridged and the Oxford English Dictionary both define “chore” as a tedious task that must be performed regularly, such as washing the dishes or hoeing long rows. The OED also says the word is a colloquial Americanism.

What astounded me when examining these two reference works was the absence of any mention that chores are not optional. Yes, adults sometimes do refer to their own monotonous tasks, such as brushing their teeth twice a day, as chores, but the origin of the term in American culture surely is coercion of the young.

Chores for children are almost never voluntary. Just ask a child what happened the last time he or she refused to do one!

Fortunately, novelist Jane Smiley, writing in Harper’s Magazine (paywalled here, available here), has put the kibosh on the supposed benefits of this favorite of coercive parenting.

Smiley was born with a silver spoon in her mouth, so she never had to clean her room or wash the dishes. She did have a horse and, as she puts it, through her love of and interest in the animal learned to work hard to groom and feed it and clean its stall, which meant removing the poop.

Smiley’s husband, on the other hand, grew up in Iowa, less than wealthy, and was forced to do chores—mixing concrete with a stick at age five and, later, pushing wet, heavy wheelbarrow loads of it across the yard.

Guess which one, Smiley or her husband, enjoys life more today?

Smiley’s husband feels guilty playing golf when there is always more work—chores—to be done at home; when doing the chores, his motivation is to get them over with as quickly as possible.** He was taught well. Chores are tedious drudgery, which means work is drudgery.

And that’s because parents give kids the dirty work as chores. Says Smiley, “Mom cooks and Sis does the dishes; the parents plan and plant the garden, the kids weed it.”

In addition to teaching the “value” of work, chores, Smiley points out, have another apparent purpose: to make sure the children contribute to maintaining the family by sharing the work that needs to be done. Smiley comments:

According to this rationale, the child comes to understand what it takes to have a family, and to feel that he or she is an important, even indispensable member of it. But come on. Would you really want to feel loved primarily because you’re the one who gets the floors mopped? Wouldn’t you rather feel that your family’s love simply exists all around you, no matter what your contribution? And don’t the parents love their children anyway, whether the children vacuum or not? Why lie about it just to get the housework done?
Why lie indeed? It is really a threat to withdraw love if the child is not obedient, similar to the withdrawals of love for disobedience that result in time outs and being sent to one’s room.

Smiley concludes: “It’s good for a teenager to suddenly decide that the bathtub is so disgusting she’d better clean it herself. I admit that for the parent, this can involve years of waiting. But if she [mom] doesn’t want to wait, she can always spend her time dusting.”

Parenting, after all, is a twenty-plus year contract chosen and signed by the parents. Children are not their slaves.

Presumably, the American concept of chores originated on self-subsistent farms, where there certainly was a lot of heavy, tedious work to be done to maintain the homestead.

The “justification” of requiring pre-school children to lug heavy pails of milk and to pluck chickens, however, is not the assumed necessity of a division of labor in the family. It is the value system of nearly all American farmers, absorbed by their citified descendants: Puritanism. American culture still today is highly Puritanical.

And what might that value system be? The duty ethics of Christianity reinforced by philosopher Immanuel Kant. As one middle American farm-raised father said not too long ago: “You do your job because it is your duty, not because you enjoy it.”

As Kant said, never act from inclination, but always in accordance with duty. Fun and pleasure are out. Chores are in.

In contrast, visit a Montessori school to see how children are taught without coercion to love work, to associate pleasure with it, and to learn the skill of intense concentration.

*Corporal punishment in all settings, which includes spanking by hand in the home, is now banned in forty-nine countries of the world. The United States is not one of them. Sweden was the first, in 1979, and surprise, surprise, those children who were not smacked or beaten did not turn into juvenile delinquents or criminals!

The website corpun.com archives a large number of video clips from around the world of both adult and child corporal punishment. I could only stomach watching one: a Sri Lankan military trainer hitting female recruits with a long stick.

**Stemming from the same value system, this is the motivation for children who eat their peas first to get the disgusting stuff out of the way so they can enjoy the good-tasting meat and gravy last!

Wednesday, April 06, 2016

On Involuntary Servitude:
“You’ll Do Something, Mr. Cook. . . . If You Don’t, We’ll Make You.”

The March 28 Time magazine cover story about Apple Inc.’s legal battle against the FBI and the lengthy interview with CEO Tim Cook are well worth the read.* So also is the earlier February 25 column on this case by Judge Andrew Napolitano.

There are several takeaways from the three pieces.

The FBI in February had ordered Apple to create new software to hack the encrypted iPhone of a dead terrorist. Apple contested the order, saying it would be a violation of civil liberties and that such software would put a master key in the hands of bad guys all over the world, including authoritarian governments. This, Cook says, is tantamount to banning encryption.

The case is now moot, because the FBI did what it should have done in the first place: it hired an independent firm to hack the phone, presumably without creating new software. The order has been vacated at the FBI’s request, but the issues, including the possible future coercion of Apple and other tech firms, remain.

Tim Cook in the Time article and interview says that banning encryption means only the bad guys—such as terrorists—will have it, because encryption software is widely available beyond the borders of the United States. I doubt that Cook intended this, but he is making the same argument as the defenders of the Second Amendment: ban guns and only the bad guys will have them!

Cook says the court order amounted to a violation of the civil liberties of Apple’s customers, especially their right to privacy. Judge Andrew Napolitano made it an issue of due process, because Apple was not given proper notice, and, more significantly, a case of involuntary servitude.

Let’s take involuntary servitude first. The phrase comes from the Thirteenth Amendment to the US Constitution that outlaws slavery. The Supreme Court, however, has issued a number of rationalizations why a military draft and other forms of forced labor do not constitute servitude. The main excuse is that the amendment was passed specifically to apply to African slavery, not to other forms of forced labor. That is, all young, able-bodied men—and today, women—owe a duty, when so ordered, to perform work for their government and, if “necessary,” to go die for the old men (and women) in power in Washington.

The justices of the Supreme Court, not to mention legal experts and other intellectual leaders, both today and yesterday, have failed to understand that rights are absolute and universal. A freedom to take action, when not infringing anyone else’s freedoms, is a freedom to take action.

And slavery is slavery, as Judge Napolitano argued. Slave labor is precisely what Apple was asked to perform.

Indeed, the FBI vs. Apple case was an Atlas Shrugged moment on at least two counts. Several Apple engineers had stated that they would refuse to write the requested software for the FBI, risking fines and imprisonment, or would simply quit. In effect, they were threatening to strike.

The working title of Ayn Rand’s novel was The Strike.

The case most amazingly was a Hank Rearden moment. I’m referring to the passage in the novel where the steel titan is ordered by James Taggart and his cronies to produce at a loss and therefore make the irrational work. When Rearden asks how he is supposed to accomplish that, Taggart responds, “Oh, you’ll do something.”

A major theme of the novel is that creativity and innovation do not work at the point of a gun, but that was what the FBI was asking and expecting Apple to do.

What our country needs more of today are business CEOs with the integrity and courage of Mr. Cook—to stand up to their government.

In fact, this confrontation between the FBI and Apple would make an excellent business ethics case for future (or even current) executives to discuss.

The civil liberties issue that Cook talks about brings up the canard about privacy versus security. Cook wants to defend his customers’ privacy. The FBI and Washington don’t give a hoot.

When a crisis occurs, the politicians and bureaucrats scream security over privacy. Rights be damned. And the use of fear by the government usually succeeds in getting citizens to cough up their rights.

Cook points out that the government wanted Apple to create a master key and give up the privacy—which really means security and safety—of millions of people around the world in order to go after a “sliver” of bad guys.

Somehow the lawyers in Washington seem to have forgotten the training that taught them a most important principle of the free society, namely that it is better for a guilty person to go free—that would be Cook’s sliver of bad guys—than for an innocent one to be sent to jail.

Plus, as long as I am talking about involuntary servitude, this brings up the related Vietnam War era discussions of the prospects of an all-volunteer army. “There might not be enough volunteers,” the supporters of the draft yelped incredulously. Two answers were given, aside from the prickly issue of rights versus slavery: one, perhaps the war was not just and we shouldn’t be involved at all, or two, if the war is just and the country does not have enough volunteers, then the country deserves what is coming to it.

This last applies similarly to the FBI’s attempt to force Apple into involuntary servitude, for unjust means to a just end can never be moral. Coercing Apple to hack a dead terrorist’s phone to obtain information that might prevent the occurrence of a future event destroys the principle of justice and ethics.

If, however, in the name of justice the FBI refused to coerce Apple and, as consequence, failed to obtain such information, at the very least it could then stand tall and say that it upheld a cardinal principle of the free society.

The real—practical—issue here, though, is that the FBI (and government as a whole) needs to become proactive in creating better crossbows. In any weapons race, the bad guys will sooner or later obtain the latest crossbow, or encryption technology, which means the good guys must stay one step ahead of the bad. Apple has done, and is continuing to do, just that.

It is time for the government to do the same, instead of wasting money and resources trying to coerce Apple to correct the FBI’s own mistakes.

The FBI’s mistake was the order to reset the iPhone’s passcode, which resulted accidentally in the Bureau’s inability to access the phone’s information.

In the few weeks of this FBI standoff, Apple fortunately was not raided by gangs of armed, bulletproof-vested SWAT teams. Apple is a high profile, well-liked firm and escaped—for now—such inexcusable tyranny.

Tennessee-based Gibson Guitar a few years ago was not so fortunate.

After the SWAT teams left, Gibson CEO Henry Juszkiewicz, like Tim Cook, spoke up to defend his business. Gibson was never charged (with illegally importing wood from Madagascar and India), and, of course, no apologies were given; instead, the company was slapped with a fine and a gag order—never again to speak up about how unjust the US Justice Department is, which is to say: never again to attempt to defend itself.

*Dated March 17 in the digital versions.

Postscript. I cannot pretend to keep up with all the issues involved in this post’s encryption battle, but WhatsApp, the online messaging service, has just announced that it has encrypted all messages of its billion or so worldwide users. No one in the WhatsApp office can listen in to or hack what is being, or has been, said.

WhatsApp’s analogy to defend encryption is that what is now being done electronically has been done for centuries without the electronics, because it is just conversation that formerly was done at the water cooler or under an old oak tree. If the FBI wants the information that is being discussed, it either needs to subpoena the participants or send spies to the coolers and trees.

Spooks on the ground to gather intelligence. What a novel idea! It used to be done but, as I recall, budget cuts going back to the Clinton administration led to the post-9/11 hysteria about weapons of mass destruction in Iraq. The Bush administration had to rely on satellite photographs to verify information that should have been obtained with real people seeing with their own eyes.

Friday, March 04, 2016

The Communist Era and Capitalism vs. Democracy

Sidney Hook’s 600-page autobiography Out of Step provides a wealth of information about New York intellectual life in the twentieth century, especially the communist era from the 1930s to 1960s.

It also indicates that the main debate today is not, or should not be, capitalism vs. socialism, but capitalism vs. democracy.

As Marxist scholar, communist fellow traveler, anti-Stalinist, pro-Cold Warrior, anti-New Leftist, and adamant defender of democracy, Hook knew or was acquainted with nearly all of the players of the communist era.

The difference between Hook and his Communist Party colleagues is that he actually read and thoroughly understood Karl Marx, so if he had been a professor in a USSR university in the 1920s and ‘30s, he, like others before him, certainly would have been purged.

The names of many of these players should be familiar to anyone who has read about or lived through any portion of the communist era, for example, playwright Bertolt Brecht, who was worshipped by my 1960s New Left professors, and journalist Whittaker Chambers. Hook has stories about all of them.

Brecht, one day in Hook’s apartment in 1935, made a casual remark about a Stalin-assigned assassination: “The more innocent they are, the more they deserve to be shot.” Hook showed Brecht the door and never saw him again.

Chambers was a Stalinist spy in the 1930s who, when he came in from the cold, was immediately hired by Time magazine. Chambers later testified against Alger Hiss, accusing the high-level State Department official of also being a former spy. Hiss was subsequently convicted of perjury and went to his death denying it all. Hook concluded the evidence was against him.*

Never a card-carrying Party member, Hook became anti-communist after the Moscow Show Trial revelations of 1936-38. He supported US entry into World War II against the Nazis while Party members, who took their orders from the Kremlin, opposed any support for the evil capitalist regime of the United States.

During the Cold War, when communist apologists were advocating unilateral disarmament on the part of the US and saying it was better to be red than dead, Hook supported a strong defense and pushed the slogan “better free than slave.”

His chapter on the New Left’s spring 1969 uprising at New York University, that is, its occupation and disruption of academic life, is detailed and alarming. His description of the corresponding spinelessness of the school’s administration is equally detailed and alarming.

Not a friend of the New Left in the 1960s, Hook declared its campaigners “anti-intellectual” and “barbarians of virtue.”

And to set the record straight on whether or not, in earlier years, Communist Party members had infiltrated US educational institutions—the New Left had rewritten history to say otherwise—Hook cites Communist Party instructions to its members to teach Marxist-Leninism in every class without being caught or exposed. This confirms what I once heard Ayn Rand say, this time in Hook’s words: the duty of card-carrying communists was “to deceive and to cheat.”

The main political debate throughout Hook’s life, especially as stated by him, was democracy vs. totalitarianism. Nazism, fascism, and Soviet communism represented the latter, but as a lifelong socialist—on moral, not economic, grounds—the former meant democratic socialism.
For Hook and his socialist colleagues, socialism is the ultimate end of Jeffersonian democracy and the Bill of Rights. This is sometimes called social democracy, though more often in the US its close cousin is social or progressive—as opposed to classical—liberalism. Social liberalism is an alleged improvement on the classical type.

At the end of his life (and book), Hook acknowledged that collective ownership of the means of production—the socialist state as giant post office, to use Lenin’s metaphor—does not work. Thus, he describes himself as “an unreconstructed believer in the welfare state and in a steeply progressive income tax.” Interventionism, in other words, with a strong leftward bias.

Although he spent many of his last years at Stanford’s Hoover Institution and was awarded the Medal of Freedom in 1985 by President Reagan, Hook was no conservative. He was a secular humanist (and naturalist), which means he was a lifelong atheist and ardent supporter of science and scientific method.**

Democracy for Hook, however, was primary. He and nearly everyone else in the world today, including the Marxists and communists, seem to advocate democracy. So what does Hook mean by it?

In the absence of genus and differentia, he gives descriptions, such as “free discussion,” “freely given consent,” “voluntary [consent], not subject to coercion,” and, most importantly, the absence of economic obstructions to that consent and to the pursuit of education, jobs, and happiness.

Hook’s moral basis for being a socialist was, of course, his unexamined assumption that capitalism exploits workers. Socialists are more moral because they are “nicer” (meaning more altruistic, though Hook does not use the term) than the capitalists who are mean and selfish. Therefore, a crucial prerequisite of modern democracy is that economic power must be put under political control.

I say the main debate today is, or should be, capitalism vs. democracy, rather than vs. socialism, because of the near-universal endorsement of democracy and equally near-universal failure to define it. Hook’s somewhat muddled understanding is how most currently see it.

Socialism, to be sure, still needs to be refuted, though Ludwig von Mises did it thoroughly in 1922. And telling a naive voter that the government often abuses its legal monopoly on the use of physical force is likely to produce a “but we are the government and we can change it” response.

What percentage influence does a voter have in a typical US presidential election? Less than a millionth of a percent!

As I suggested in an earlier post, the vote is not unimportant in a free society, but it is neither primary nor fundamental. Hong Kong, after all, did quite well for decades with no general elections. What it did have was the English constitution and legal system.

This means that if democracy is a term to be endorsed at all, it must be defined as voting restrained by individual rights and those rights must be clearly distinguished from the collectivized versions of the social liberals. Individual rights are freedoms to take action, not entitlements to things, that is, to food, shelter, clothing, education, jobs, and happiness.

I’m tempted to say that the term democracy should be dropped entirely from defenses of the free society. If capitalism is understood as a social system, not just an economic one, it can be put where it belongs—in philosophy—and therefore cannot be dismissed as “just economic,” which is what most opponents and the ignorant alike do when the term comes up.

Discussions of social systems come from the fourth branch of philosophy called social (or political) philosophy. Social philosophy defines the nature and proper function of government, which brings rights and ethics into the discussion of capitalism, which means egoism should also be brought in, as well as a theory of human nature, and a theory of consciousness and universals, among other fundamental issues of epistemology and metaphysics.

Sidney Hook was a philosopher who knew about discussions of this sort, at the fundamental level, and used fundamentals to defend Marx and socialism. He was an advocate of Enlightenment values: reason, science, technology, freedom, and, of course, rights and democracy, as most socialists of his era were.

Defenders of the free society cannot just say they are advocates of the Enlightenment values of reason, science, technology, freedom, rights, and democracy . . . and expect to win arguments.

What is required today for a proper defense is the elevation—that is, the boosting, heightening, raising up—of discussion from our current concrete-bound, trivial, and disconnected mess to universal and fundamental principles.

Socialism was a moral ideal in the 1920s, ‘30s, and, according to Ayn Rand, until the end of World War II. As a practical ideal it died with the USSR collapse in 1991, yet its flotsam lingers in 2016 to obstruct passage to a genuinely free society.

It lingers by default because of the lack of principled opposition.

Sidney Hook was a significant member of the generation that sought to promote a moral ideal. His book provides lessons for anyone in 2016 who wishes to do the same, this time, one would hope, promoting the ideal of laissez-faire capitalism and all that it rests on.

* Chambers became a neoconservative and wrote the infamously sleazy review of Atlas Shrugged in William Buckley’s National Review. “From almost any page,” says Chambers, “a voice can be heard . . . commanding: ‘To a gas chamber–go!’” Chambers was posthumously awarded the Medal of Freedom in 1984 by President Reagan.

** F. A. Hayek, in “Why I Am Not a Conservative,” points out that conservatives are not averse to using coercion to achieve their goals and he even suggests that coercion is the common denominator uniting “repentant socialists” (like Whittaker Chambers) and conservatism. True (classical) liberalism, says Hayek, supports liberty over equality or democracy.

Wednesday, February 03, 2016

Why Don’t Facts Matter?

In several previous comments I have in one way or another attempted to answer the question that titles this post.

My first encounter with the issue occurred when I complained to a colleague about other associates whose selective memories seemed beyond the pale; I had assumed it was impossible for them to have forgotten what was said in a meeting held not long before the incident in question.

The colleague gave me a dead serious glare and said, “Facts don’t matter!” I briefly responded with an embarrassed “you can’t be serious” chuckle, but soon realized that the glare was not going away.

Naiveté aside—I am aware that there are dishonest people in the world—I nonetheless have a hard time understanding those who seem to be honest, yet clearly are not sticking to the facts.

In 2006 I wrote an academic paper about Harry Frankfurt’s little book On Bullshit, in which Frankfurt distinguishes liars from BS’ers. Liars care about facts in order to say the opposite. BS’ers, however, don’t care because their goal is to impress and sway whether or not what they are saying is true. Are BS’ers dishonest?*

In my paper I argue that there are a couple of continua operating here, the relevant one ranging from the deliberately dishonest to sloppy thinkers who are unaware of their premises or where those premises came from.

This may somewhat account for those who seem to be decent people but at the same time are habitual hyperbolizers and habitually selective rememberers. But where do these habits come from?

In a 2008 blog post I make the not too original point that we learn—that is, pick up habits—from our parents, teachers, and significant others, which means our significant others learned from their significant others who learned from theirs, etc. In the absence of an infinite regress, however, someone somewhere along the line had to have chosen to embellish his or her statements and selectively ignore certain facts. Why?

Free will, of course, dictates that anyone in the present, or past, can choose to ignore facts. Is that it? Isn’t there more to the sloppy thinking that many seem to exhibit?

Consider the following cases.

1. Philosopher Sidney Hook describes two instances from his travels in the mid-twentieth century (Out of Step, pp. 585-88). In Japan, Hook relates, he was confronted by his academic hosts and the Japanese press with nothing but complaints about the US bombing of Hiroshima, yet not a single word was said about the Japanese attack on Pearl Harbor. In India, the conversation centered on US race discrimination—without mention of India’s caste system. Near the end of his stay, Hook invited an academic host to dine with him at his hotel, but the host, after several evasions, finally admitted that he could not accept—because the waiters at the hotel were Muslim and the host was Brahmin.

Hook does not provide an explanation for the stark logical disconnects in either instance, other than to imply differences between Eastern and Western cultures.

My conclusion would have to specify the lack of Aristotelian logic in the East and its presence in the West. Most westerners, however precariously they may do so, cling to the notions of non-contradiction and non-fallacious thinking, which means they maintain some respect for facts that apparently the educated in the East do not.

Respect for logic means respect for facts.

2. Anthony Watts, former television meteorologist and current climate change doubter (to use the Associated Press’s preferred moniker for global warming skeptics), blogs on wattsupwiththat.com, a site that enjoys three to four million page views per month. Several highly qualified guest climatologists also regularly post their thorough, technical analyses of “climate change” issues.

Last June, Watts reported the details of a meeting he had with journalist, ardent environmentalist, and staunch global warming supporter Bill McKibben. Instead of fireworks and hostility between the two, Watts described their discussion as civil and friendly. They discussed their respective agreement and disagreement on numerous climate and environmental issues.

Concluding his report, Watts said, apparently to challenge strong opinions within the denier community, “I don’t think Bill McKibben is an idiot.” He then added, “But I do think he perceives things more on a feeling or emotional level and translates that into words and actions. People that are more factual and pragmatic might see that as an unrealistic response.”

Why don’t facts matter according to the scientist Watts? Because emotion sometimes trumps facts.

3. Ayn Rand in her article “To Dream the Non-Commercial Dream” (The Ayn Rand Letter, January 1, 1973, reprinted in The Voice of Reason) emphasizes the significance of emotion trumping fact. She says this about “impassioned advocates” of altruism and collectivism:

They are not hypocrites; in their own way, they are “sincere”; they have to be. They need to believe that their work serves others, whether those others like it or not, and that the good of others is their only motivation; they do believe it—passionately, fiercely, militantly—in the sense in which a belief is distinguishable from a conviction: in the form of an emotion impervious to reality. (Emphasis in original.)
Deep down, in their psychologies, it is emotion that dictates to these “sincere” people what is true. Facts don’t matter because emotion says otherwise. Altruism and collectivism have become their entrenched beliefs.

Rand adds that this “depth”—the “deep down” part of these unexamined psychologies—can be “measured by distance from reality” and that there exists a continuum, based on the distance, that runs from “sincere” to totalitarian dictator.

Rand puts “sincere” in scare quotes, which probably means she is not entirely endorsing the term, but I still have to ask: are those on the “sincere” end of the continuum . . . sincere? And honest? Who, really, after all is a bad dude?

Rand goes so far as to acknowledge that the “butcher of the Ukraine,” Nikita Khrushchev, was compelled to believe the “truth” (my quotes) and magical ritual of dialectical materialism. He had to, she says, lest he “face something more frightening than death” (Rand’s quotes).

This comment on Khrushchev takes me back to The Criminal Personality by Yochelson and Samenow. Criminals certainly are bad dudes. They lie (and BS) as a way of life and enjoy getting away with the forbidden. (“If rape were legalized today . . . I would do something else,” one offender told the researchers.)

And criminals, like Khrushchev, don’t have much deep down, that is, they are considerably deficient in self-esteem. What is there, as Rand puts it, is distant from reality. “I am a nothing, a zero,” several criminals confessed, but added that if they routinely thought that way, they would have to kill themselves. So they live by substitute thoughts, or rather rationalizations. Their accumulated mental habits have taught them to believe and say: “that guy deserved it” or “everything in the store belongs to me” or “she really wanted me.”

Khrushchev substituted the communist mantra.

So how can these bad dudes seem “sincere”? This brings me back to the liar and BS’er. The goal of the liar and BS’er is to sound good. Most criminals are con artists, which means they are consummate liars and BS’ers to make what they say sound good.

The same applies to dictators. Many have been charmers at cocktail parties. Hitler was.

So would I want to be friends with someone on the “sincere” end of Rand’s continuum?

Sidney Hook and Anthony Watts did not seem to find offensive the disagreements they had with their associates, but those associates were presumably not on the extreme end of Rand’s scale.

I would say that friendships, whether professional or personal, depend on how distant one’s contact is from reality. That is to say, on a scale of decency—by adapting Rand’s continuum—honest, fact-oriented people are at the top, scummy criminals and Khrushchevs are at the bottom, but most decent people, the “sincere” ones Rand was talking about, fit into the middle to upper tiers.**

The difficulty in forming professional and personal friendships is in understanding the other person’s psychology and discovering that distance from reality.

Facts do matter.

*Frankfurt thinks BS’ers are worse than liars—and more likely to be found among the highly educated because of their facility with language.

**In Rand’s article she was talking about a retired editor of the New York Times.

Thursday, January 21, 2016

Americanized Maoism, the “Narrative” of Political Correctness, and Racist Minimum Wage Legislation

Daniel Henninger in the Wall Street Journal has referred to our current political correctness madness as “a kind of Americanized Maoism.” This is an interesting characterization.*

China did not have a proletariat of factory workers, so Mao chose peasants as the oppressed class we should worship and model our lives on and, of course, protect from the evil capitalists.

Today’s American leftists certainly would not seem to mind having us all wear Mao tunics, nor would they mind reducing our standard of living to the level of Mao’s peasants.

Note a few of the consumer products that have been banned by those who know what is best for us: phosphates in laundry and dish detergents, high-flow water valves, incandescent light bulbs, plastic shopping bags, and the vent hole in the lowly gasoline can.

Jeffrey Tucker (1, 2, 3, 4, 5) has examined a number of these civilization killers. On the light bulb ban, he writes, “It’s the plot of [Ayn Rand’s] Anthem lived in real time.”

The gasoline can? Apparently, wealthy leftists have never had to mow their own lawns and don’t care to remember their elementary physics. That second hole makes it easy and spillage-free to pour gas into the mower’s tank. Tucker’s conclusion: the bureaucrats in power want us to reduce our lives to the misery of pre-capitalist eras.

The Americanized part of “Americanized Maoism,” however, is just another import from Europe. It is the post-modern rejection of Enlightenment values and establishment of what I referred to in a previous post as a virulent absolutism in an age of epistemological and moral relativism. (Some terms were borrowed from Stephen Hicks. See 1, 2.)

This is what has given us the word “narrative.” When challenging the left, the dismissive response will often be, “That’s only your narrative.” Which is another way of saying what’s true for you is not necessarily true for me. And it’s also Marx’s polylogism dressed up in new garb.

So why should we listen to the left? The unspoken and sometimes not so unspoken reply is, “We have the power. You don’t. Our narrative is in charge.”

One current “narrative” taken as a given is that opposition to minimum wage is racist. Fortunately, a recent column by Professor Williams has taught us an important history lesson about who really is the racist.

Minimum wage legislation in the United States, beginning with the Davis-Bacon Act of 1931, was originally designed to prevent African Americans from getting work. Nearly every economist in the United States knows that minimum wage laws prevent the least skilled—mostly African Americans at that time, and still today—from being hired. A similar motivation lay behind South Africa’s 1925 wage legislation, a precursor of Apartheid, which aimed to prevent the hiring of “Natives.”

The true racists are the advocates of minimum wage, and since capitalism is the cure for racism, anyone who opposes free markets should be labelled a hater of the minority disadvantaged and oppressed.

Trigger warning for the poor babies on college campuses:

The left has it wrong.

Capitalism—free markets and free speech—is what you should be studying and supporting. It’s time to get your feelings hurt. You might learn something in the process.

*Henninger also argues that the popularity of certain “outsiders” in the 2016 Republican presidential circus is a revolt of the politically incorrect, meaning that Americans, probably through their “you can’t push me around” sense of life, are sick of being badgered by the left and told what to think, feel, and do.

A Note on Correctness. The term usually means free from error, accurate, or precise, but in the pejorative sense in which the word is used today, it means conformity to an orthodoxy with deviation calling for punishment.

Penalties for failure to conform range from expressions of disapproval, shock, contempt, and condemnation, through the more serious excommunication, expulsion, or termination, to the ultimate penalties of imprisonment and death.

Today’s radical Marxist left—in the form of political correctness—is not unique in insisting on such conformity.

Just ask Socrates about Athenian correctness in 399 BC or Galileo about the Inquisition’s Catholic correctness in 1633.

Throughout history, religious, ideological, and intellectual movements have produced their share of correctness zealots. Christian and Islamic correctness, as in “radical Christianity” and “radical Islam,” are not inappropriate designations.

Nor is Freudian correctness. See Jeffrey Masson on his expulsion from the Freud Archives and other psychoanalytic societies over his view of Freud’s seduction theory.

The motivation for correctness zealotry is intolerance of difference, especially as manifested in language and behavior that deviates from the orthodoxy. The goal is control, initially censorship of language but in the end total control of thought and behavior.

Saturday, December 12, 2015

Crybullies, Non-Negotiable Demands, Micro-Totalitarianisms, Academic Fascism . . . and Cabaret

“It is nothing! Children on their way to school. Mischievous children! Nothing more!”—from the Broadway musical Cabaret.

The words are spoken by Herr Schultz, the Jewish fruit shop owner whose window has been smashed by a brick. The setting of the story is the eve of Hitler’s rise, 1931 Weimar Germany.

Too strong a comparison to make to the “children” on today’s college campuses?

Is it?

Roger Kimball, author of the 1990 book Tenured Radicals (2nd ed. 2008), has called protesting students crybullies. They are the ones making non-negotiable demands for trigger warnings, lest certain words or ideas they disagree with hurt their feelings.

The mothers of present-day crybullies apparently did not teach their children the familiar rhyme about sticks and stones . . . versus words.

Thomas Sowell prefers to call the new “micro-aggression” buzzword micro-totalitarianism. “Macro-aggression” supposedly means blatant physical force, including the battery of unwanted touching. But hurtful, offensive words are said to be small coercions that, if allowed, can accumulate to become just as bad as the macro ones.

More correctly, Sowell argues, the micro-censorships that the Marxist left is pushing are moving us “even if by small steps” more and more toward the macro silencing of dissent. This is the last step to dictatorship and total control.

The list of the left’s no-no’s that must be censored has now climbed to at least 80 and, were it not such a serious issue, would qualify for theater of the absurd. For example, “America is the land of opportunity,” “I believe the most qualified person should get the job,” and “Where are you from?” or “Where were you born?” are said to be racist micro-aggressions that should be banned from the home of academic freedom.

Violators of these proscriptions, the protesters demand, must be reprimanded, suspended, required to attend sensitivity training classes, or, preferably, forced to resign. Students at Emory University are demanding that course evaluations rate professorial micro-aggressions—the predictable ones that might offend (Marxist) class identities.

Walter Williams calls the current atmosphere on college campuses academic fascism.

From the Nazis to the Stalinists, tyrants have always started out supporting free speech, and why is easy to understand. Speech is vital for the realization of their goals of command, control and confiscation. Free speech is a basic tool for indoctrination, propagandizing, proselytization. Once the leftists gain control, as they have at many universities, free speech becomes a liability and must be suppressed. This is increasingly the case on university campuses.
Williams cites one English professor who in the process of expressing his opposition to what the left calls Israeli Apartheid (compare 1, 2) said we must “not be guided by cardboard notions of civility.”

The phrase means what it sounds like. Says Williams: “That professor's vision differs little from Adolf Hitler's brown-shirted thugs of the paramilitary wing of the Nazi Party in their effort to crush dissent.”

The resurrection of 1960s-style intolerance is not lost on older professors, including Sowell: “Storm trooper tactics by bands of college students making ideological demands across the country, and immediate preemptive surrender by college administrators — such as at the University of Missouri recently—bring back memories of the 1960s . . . .”

That is to say, non-negotiable demands followed by administrator capitulation are not new.

Not every university administration from the 1960s era, however, gave in. The University of Chicago (and my alma mater, the University of Denver) expelled and suspended numerous students who staged sit-ins at campus buildings.

Just mischievous children?

Some are ignorant, but the leaders are neither ignorant nor mischievous.

One classmate, circa 1968-69, gave an impromptu speech to a protest crowd on the steps of my alma mater’s administration building. His voice boomed about struggle and revolution, and his fist pumped the air.

The chilling thought that went through my mind was this: in 1917 Petrograd this classmate would have been on the front lines of Bolshevism.

Today, please, let us not stick our heads in the sand as did Weimar culture in interwar Germany.

Not seeing, or wanting to see, what was on the horizon of Germany’s future is the theme of Cabaret. Go see it, or if you have seen it, see it again.

Monday, November 02, 2015

Further Comment on Galileo’s Middle Finger

My previous post did not do justice to the Alice Dreger book Galileo’s Middle Finger. Here are a few additional comments.

Intersex people. Intersex infants, children, and adults, formerly referred to by the pejorative “hermaphrodite,” are born with ambiguous genitalia—for example, with external penis and vagina, usually of different sizes, or with an external vagina and internal testes but no uterus or ovaries.

Dreger’s doctoral dissertation focused on late nineteenth- and early twentieth-century hermaphroditism. Because such sexual differences were seldom talked about, most intersex people in that period lived relatively normal lives, presumably because they assumed that everyone else was built the same way. As Dreger put it, perhaps a little surprise on the doctor’s face when examining the patient was the only awareness anyone had of the medical issue!

Sometime during the twentieth century, doctors decided they should do something about the “shameful” condition. They decided, usually only telling the parents that some infant surgery was necessary, to play God and change intersex infants into boys or girls, based entirely on their judgment of which way the infant should go.

In recent times, it seems doctors have become more transparent by telling parents what they are doing . . . but rarely, even today, have doctors or parents told their patients and children what was done to them as infants.

“Shame, secrecy, and lies” is how Dreger describes the attitudes and behavior of doctors and parents. And it is this shame, secrecy, and lying that has incensed the human sexual identity activists. Intersex people are individuals with rights just like everyone else, but they have been denied honesty, have been discriminated against, and even denied choice—over which way they want to go, or whether to go at all.

Several early chapters of Dreger’s book detail her own activism to get the medical profession to fess up and change its ways. The stone wall she hit is part of the reason she felt the depression mentioned in my previous post.*

Congenital adrenal hyperplasia. Another stone wall was hit and described in the latter chapters of Dreger’s book. A doctor in New York City has made a career of administering dexamethasone, a powerful steroid, to fetuses in utero to prevent the formation of ambiguous genitalia and other sexual anomalies that can result from this inherited disease.

Dreger tallied a number of problems with this medical practice and lobbied hard, but failed, to stop it. The off-label drug—many drugs are so used—must be administered before there is any evidence the fetus is developing in an anomalous manner.

Dreger’s math found that only one out of ten fetuses so treated stood to benefit from the drug. And the risks? Only one study—one—has been conducted to discern long-term consequences. Its findings indicated that a significant minority of the sample suffered retardation, memory difficulties, and growth disorders; as a result, the study was shut down.

The controversy centered on informed consent, much of which seems not to have been obtained, and on bureaucratic approval to proceed with such a treatment.

At one point, charges of fraud for phantom research projects were brought up, but the whistleblower, like many operating in bureaucratic environments, was attacked and threatened with psychiatric treatment. The Feds, responsible for protecting the public from risky medical practice, did little to stop a prestigious and well-established doctor.

Dreger lost the battle.

Social justice. Dr. Dreger views herself as an activist fighting for social justice. This has pushed me to clarify in my mind the difference between social and individual justice. “Social justice” has a long history, so it is not unique to Karl Marx, but today’s advocates use it with a distinctively Marxian flavor.

Is Dreger an advocate of social justice? Not really, though I’m sure she would disagree with my interpretation of her work.

Social justice, as I define it in today’s Marxian flavor, is the virtue of fairly and accurately judging oppressed classes as underprivileged and granting them restitution in the form of additional wealth, education, employment, and other favors that they otherwise have not been able to attain. The underprivileged include anyone who is deemed unsuccessful, but especially African Americans, women, and LGBTs. This is the collectivist definition.

Individual justice is the virtue of fairly and accurately judging individuals—oneself and others—according to the standards of honesty, integrity, courage, independence, and especially productiveness. This is the individualist definition.

I think Dr. Dreger, because of her uncompromising commitment to facts, is closer to practicing the latter form of justice than the former. This, I would say, is why she could not accept her Marxist colleagues’ epistemological relativism. Yes, African Americans, women, and LGBTs have been badly discriminated against, even enslaved, but each individual must be judged on his or her own merits. No “class,” to use Marx’s terminology, owes any other “class” anything, especially when restitution is made at the point of a gun.

To use a reductio argument against the Marxists, one might say this: Ayn Rand wrote that the individual is the smallest minority on earth. Turning the thought around, can we not say that the group or “class” of individuals is the largest “class” on earth? And therefore the largest “class” on earth that has been discriminated against and oppressed?

Individuals of the world, unite! And fight off your oppressors!

Marxists should be advocates of individualism if they are seriously concerned about justice for the oppressed.

Free speech at Northwestern. An unwavering defender of First Amendment rights, Dreger has, since the publication of her book, performed a little flipping off herself. She has resigned from the Northwestern University Medical School over her dean’s attempts to censor the content of a faculty magazine she edited. The content? About sex, of course, but also possibly “offensive” content—to the hospital’s brand name!

Sigh! As a marketing prof, I have to make one last comment. Bureaucrats, whether in academia or government, have no clue what sound marketing, including branding, means. They believe the usual BS: that marketing is just that, and that a brand image is something made up and palmed off on the helpless, unsuspecting public. This is just good Marxist thinking about business.

Sound branding—that is, product identification—of a first-class hospital should run something like this.

We use the latest, most advanced knowledge and techniques to treat and cure our patients. In the process we entertain and examine all ideas—the wilder and more offensive the better.

The better because we will then know that we have left no stone unturned in order to come up with treatments and cures to do justice [there’s that word again!] for our patients.

*To the sheltered, like yours truly, this was an eye-opening read. It also struck me as the perfect “borderline case” in the philosophical problem of universals. The existence of intersex people (and animals) demonstrates that there is no intrinsic maleness or femaleness “out there, in the thing” as the intrinsic theory of essences claims. It also took my teenage daughter to explain the difference between gender, which is social (actually, psychological), and sex, which is biological. Now I understand!

Saturday, October 31, 2015

The Galilean Personality vs. Wall-to-Wall Marxism and Human Sexual Identity

Medical historian and bioethicist Alice Dreger, in her provocatively titled book Galileo’s Middle Finger, provides a variety of descriptions of what she calls the Galilean Personality:
It consists of “men and women who are smart, egotistical, innovative, and know they’re right” (p. 180), who “tend to believe that the truth will save them, and to insist on the truth even when giving up on it might reduce their suffering” (181). Such personalities are “pugnacious, articulate, [and] politically incorrect.” Like the namesake of their personalities, they believe they are “right in the fight but never infallible” (185).
Confidence, independence, integrity, and, above all, commitment to facts. These traits apply as much to Dreger as to the several heroes she chronicles in her book.

The title, as some reviewers have noted, is a bit misleading, because the book is not a history of scientists from Galileo’s day to the present who rebelled against dogmatic authorities. Nor is it particularly about Galileo’s middle finger, though after observing the scientist’s mummified digit in a Florence museum Dreger did get inspired by the thought of Galileo flipping off the Pope.*

Galileo’s science confirming the Copernican revolution, as Dreger observes, asserted that human identity is not what we thought it was: as a consequence of Galileo’s work, humans could no longer be understood as occupying the center of the universe. The Pope took exception.

Similarly, scientists today who assert their research outcomes on human sexual identity find themselves engaged in battles with the dogmatic authorities of sexual identity politics. This theme became central to Dreger’s book.

“Wall-to-wall Marxism”** refers to the activist intellectual context in which Dreger operated while researching and writing the book. Dreger would probably describe herself as a “moderate liberal,” but it was her Galilean commitment to facts that got her into hot water with the radical Marxist left. They didn’t like what she said and wrote, let alone what the scientists she wrote about had said and written.

In fact, in one depressed moment during her research—depressed because of the hostility and, at one point, a threat thrown at her—she captured the essence of her modern Marxist colleagues and reported her feeling in the book:

We have to use our privilege to advance the rights of the marginalized. We can’t let people [like two good guys] say what is true about the world. We have to give voice and power to the oppressed and let them say what is true. Science is as biased as all human endeavors, and so we have to empower the disempowered, and speak always with them. (p. 137)
These are Dreger’s words describing the way her Marxist colleagues think. The two good guys are J. Michael Bailey and Craig Palmer.

Bailey’s research reported that many men who have sex change operations do so for erotic reasons, not, as transgender political activists insist, because they are “born with the brain of one sex and the body of the other” (p. 9).

Palmer co-authored a book asserting that rape often includes a sexual component, meaning that rapists do not always rape solely for reasons of power and conquest, but also because they enjoy sex.

The activists fiercely attacked Bailey and Palmer, charging them, among other alleged crimes, with rights abuse of research subjects and falsifying data. One scientific journal, cited by Dreger, published an article saying Palmer and his co-author deserve to be hung (p. 116).

Dreger’s role in this, as a historian of fact, was to pore over everything relevant to the controversies, ranging from the works of the scientists involved to all of the various criticisms offered, some of them found in forgotten transcripts and archives.

Bailey and Palmer fought valiantly to defend themselves, which is why Dreger gave them the accolade of Galilean personality. Dreger’s work has cleared their names—at least, to anyone interested in reading the facts.

Bailey and Palmer are not the only ones profiled and defended in Dreger’s book. Napoleon Chagnon spent many years studying the Yanomamö tribe in Venezuela, describing them as a fierce, male dominated tribe that fought violently over females, practiced domestic brutality, used drugs ritualistically, and couldn’t care less about the environment.

This was not the right thing to say.

Chagnon’s enemies unleashed a torrent of character assassination, from the usual charges of cooked data to subtle and not-so-subtle hints of beliefs in eugenics and of the intentional use of a bad vaccine that infected the whole tribe.

Dreger’s indefatigable efforts to dig for facts also cleared this Galilean personality.

So what is Dreger’s conclusion from these stories? Facts don’t matter—to today’s identity activists, as summed up in her depressed feeling quoted above.

In a somewhat understated way, she does acknowledge that the activists get their motivating ideology straight from Karl Marx, but I would add: Marxist polylogism is emboldened by our current atmosphere of post-modern epistemological relativism. Only the “oppressed classes” have changed.

The premise remains that opposition to dogma must be silenced. And Dreger’s book makes it clear that relativism results in the same authoritarianism as does religion.

* The book’s dust jacket shows half of an 1873 painting with Galileo sitting in front of a globe, his right hand obscured. A student to whom Galileo is lecturing was cut out of the picture and it is Galileo’s index—not middle—finger that is extended in the original painting.

**The phrase “wall-to-wall Marxism” is from the feisty and indefatigable Christopher Monckton, Viscount Monckton of Brenchley. Monckton was referring to the National Socialist Workers’ Party in Scotland and the Royal Society in England, but the words seem an appropriate description of our current cultural environment. Monckton is a prominent “climate change doubter,” as the Associated Press’s revised stylebook now prefers to call “climate deniers.”