Friday, January 13, 2023

 On Science and the Scientific Method

The purpose of science is to explain and guide, and the method scientists use, in briefest essence, is observation to raise questions followed by frequent testing and trying of alternatives to find correct answers.
In a single word, scientific method is conceptualization, the mental process of generalizing to identify universals and applying previously formed universals to understand particular cases. The former is a process of induction, the latter a process of deduction (1, 2).
We all use both cognitive processes, but scientists use them more intensely and in more concentrated areas.
Scientific method may or may not involve controlled experimentation with an elaborate array of instruments and it may or may not produce mathematical equations and hosts of quantitative data.
The ultimate aim of science is practical, to enhance human life.
The most fundamental science is philosophy or, as some have said, philosophy is the science of all sciences, with epistemology providing the principles of method to guide scientists in their choices and actions. The current post is based on Ayn Rand’s epistemology, especially her theory of concepts.
There are three fundamental special sciences: the physical, which study inanimate matter and its actions; the biological, which study living organisms and their behavior; and the human, which study homo sapiens—the highest living organism, possessing a consciousness with the capacity to reason—and its behavior.
Special sciences representative of each are physics and chemistry, botany and zoology, psychology and economics.
All other sciences derive their basic premises from one or more of the fundamental ones. Derivative sciences are more applied than the fundamental ones and at some point may be called technology or applied sciences—engineering, medicine, psychotherapy.
Applied sciences are largely, though not exclusively, deductive. The fundamental sciences are largely, though not exclusively, inductive.
To return to our beginning concepts, the aim of science is not explanation and prediction as the logical positivists insist, but explanation and guidance. To explain means to describe the nature of an entity, its attributes, and its causal origin and effects. To guide means to provide principles of action to help human beings achieve specific goals.
Principles of guidance can be used to build spaceships and bridges; to diagnose diseases in plants, animals, and humans to recommend treatments; to manage small and large businesses; to make baskets (pp. 299-301); and to decide whether a specific action would be honest.
Ethics is a fundamental branch of philosophy that provides important principles of guidance throughout our lives.
Testing and trying are the etymological roots of the word “experiment,” and testing and trying are what anyone who possesses the capacity to reason does to conclude that an essential distinguishing characteristic has been identified.
A child, for example, may throw, bounce, and chew a round, spongy thing before concluding that it is a ball that bounces and rolls (but is not food). Aristotle opened twenty fertilized chicken eggs, one per day, to observe the development of chicks from a beating speck of blood to a live baby chick. In this way, he tested and confirmed the hypothesis that baby birds are not born pre-formed but grow gradually, stage by stage, within the egg (see Harré, chap. 1).
Today’s scientists test and try many options, sometimes using controlled experimentation to eliminate sources of extraneous variation. In the physical sciences, they may identify algebraic equations to explain the relationship between two or more entities or their attributes or actions.
The biological and human sciences, however, because of the nature of their subjects, cannot be so precise. Since all concept formation, as Ayn Rand has shown (chaps. 1 and 2), involves measurement—measurement omission to arrive at the universal—these sciences can be quantified to a limited extent, using approximate measurements, such as greater than or less than, higher or lower, more important or less important.
The human sciences, of course, have the added proviso that free will renders absurd the formulation of equations to predict behavior. All sciences work with universals by omitting—but certainly not forgetting—the measurements.
Technology or applied science uses measurements of individual cases, such as the engineer who needs to decide the best location of a new bridge or the sea captain who needs a reliable estimate of the tides in a specific harbor. The medical doctor needs to know a patient’s temperature and blood pressure and the psychotherapist needs an estimate of the kind and quantity of thinking errors an unhappy patient has made over the years.
Essentialization is the fundamental thinking method of scientists (and everyone else) because the essential distinguishing characteristic or essence of a concept explains and causes all or most of the other characteristics. The flat, level surface of a table explains why my glass of water sits stably on it and causes the liquid to stay within the glass. The motions of the sun and moon explain and cause the motions of water on earth.
Essentialization can be thought of as thinking in causes and effects.
Principles of guidance also explain the nature of the goals one is seeking and the steps that will cause that accomplishment. To sell a product, the salesperson must tailor his or her presentation to the prospect’s needs and wants. To build a bridge, the engineer needs to know many specifics, including precise measurements, of the width of the river and the nature and history of its soil.
Where does history fit into this? Science, whether basic or applied, can be referred to as theory. History requires and uses theory to identify the nature and causes of a specific case, such as sedimentary rock embedded at a forty-five degree angle several thousand feet up on a mountainside or the nature and cause of a specific war.
What is called natural history uses theory from the physical sciences, and human history uses theory from the human sciences, to explain the nature and causes of the cases each studies.
History is applied science.

Tuesday, December 13, 2022

How Do We Know Concretes?

To the layperson the title of this post must seem strange.
To academics, especially philosophers, the issue has been a big deal for a couple of millennia.
Of course we know concretes, says the layperson. We open our eyes and look out at the world—to see specific, individual, concrete people, animals, plants, and things. Reminiscing about our childhood, we can recall many specific, individual concrete events, ranging from games we played with relatives or next-door friends to vacations taken with the family to the years spent in various schools.
And historians specialize in reporting the concretes of the past, do they not? Well, throughout the nineteenth century, and still today, historians, along with psychotherapists, medical doctors—and anyone working in applied fields—are said not to be true scientists. Only theoreticians are scientists. This is the dispute over theory versus history.*
To put the issue in more philosophical terms, we live in a world of particulars, yet with the exception of proper names our knowledge consists of universals.
Theories, science, and principles are all expressed in terms of universals. How do we use those universals to know particulars?
The problem arises from the theory sometimes called naïve realism—Aristotle’s improvement on Plato—that says universals are external to our minds, lodged, so to speak, in the things of reality, “out there,” but not in Plato’s separate realm.
Aristotle said that the form is embedded in the matter, form being his word for what in more modern terms we call essence or essential distinguishing characteristic. Matter, as in the stuff that something is made of, is what makes the thing a unique, concrete individual entity.
How can we know concretes when nearly all knowledge is universal? The glass on my desk, so the theory goes, contains “glassness,” the universal essence that we grasp and understand as a thing we drink liquids out of. The glass, or in my case, plastic, that the container is made of is its matter, and this includes its color and the number of ounces of liquid it can hold.
The matter of the glass, we could say, is that the glass on my desk is plastic, its color is blue, and it holds twelve ounces. Does that mean we have understood the concreteness of this glass?
No, say the critics of Aristotle’s common sense realism, because we are using universals in every word (“plastic, blue, twelve ounces”) to describe this individual concrete. We fail to get to “the thing in itself” (the thing as it really is, its identity), a problem that led John Locke, in exasperation, to conclude that the individuality of a thing is “something I know not what.” Further problems led David Hume to fail to find a “necessary connection” between cause and effect, heading us down the road to Immanuel Kant who said we can never know true, noumenal reality (where the “thing in itself” is presumed to reside).
And that is where we are today—reality is unknowable, concepts are arbitrary, and nothing is universal—with bad consequences in ethics and politics: values are subjective and there is no objectively valid, justifiable political or economic system.
Thomas Hobbes said we need a strong “public sword” to keep the peace among warring groups, each with its own arbitrary values, but today’s postmoderns want to crush any group with different values, especially those who advocate capitalism.
In other words, dictatorship follows from philosophy’s failure to solve the so-called problem of universals.
But let’s go back to our layperson who looks out at the world and sees a myriad of concrete things and their attributes. Although such a layperson may still think in Aristotle’s realist tradition by saying, “I just look and see glassness in the glass,” the conclusion is not correct, because there is in fact no essence of glassness in the thing out there. The layperson’s mind is doing more than he or she thinks.
What Aristotle called abstraction—mentally separating the form from its matter—is a more complicated process than he knew.
Concept formation, or conceptualization, is the human being’s method of turning percepts into universal concepts. How does this occur?
All knowledge begins at birth with perception, by observing concrete particulars. Later, when we begin to talk, we learn to form concepts, which become our universals.
Ayn Rand’s theory (chaps. 1 and 2) holds that concepts and essential distinguishing characteristics are in our minds, not “out there” in the thing, but if we correctly identify what’s “out there,” the essences and concepts are objective, in an epistemological sense, not intrinsic or metaphysical as both Plato and Aristotle thought.
The process proceeds as follows. We perceive many similarities and differences among the things of the world. When we focus on one group of similar things that are somewhat similar to, but also different from, another group, we have isolated something we want to identify with a concept and word.
To form the concept, we focus on the characteristic(s) that explains and causes most of the others, then omit its measurements. Measurement omission in the process of abstraction is what enables us to identify the essence of the concept.
Using Rand’s example of forming the concept “table,” we isolate (from chairs and beds) those objects that have flat surfaces and are designed to hold smaller objects. Individual, concrete tables differ according to their measurements, but the measurements of their height, length, width, color, oval vs. rectangular top, etc., are omitted to make the concept of “tableness.”
The omitted measurements are still there, in reality and in the concept formation process, but they are not used (or necessary) to form the concept. Identifying that tables are flat and hold other objects is all we need to know to distinguish tables from chairs and beds and to give us the essence or essential distinguishing characteristic of tableness. Particular height, width, weight, color, shape of top, etc., are left out. A word and definition are finally assigned to complete the process of concept formation.
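The measurement-omission process just described can be sketched, purely as a toy illustration of my own (none of these class or field names come from Rand), in a few lines of code: the essential characteristics do all the classifying, while the measurements are carried along in the record of each concrete but never consulted.

```python
# Toy model of "measurement omission" in concept formation:
# classification uses only the essential distinguishing characteristics;
# the measurements exist in reality (and in each record) but are omitted.

from dataclasses import dataclass

@dataclass
class Item:
    name: str
    flat_surface: bool    # essential for "table"
    holds_objects: bool   # essential for "table"
    height_cm: float      # measurement -- omitted from the concept
    width_cm: float       # measurement -- omitted from the concept
    color: str            # quality with omitted measurements

def is_table(item: Item) -> bool:
    """Classify by essentials only; every measurement is ignored."""
    return item.flat_surface and item.holds_objects

desk = Item("desk", True, True, 75.0, 120.0, "oak")
bed = Item("bed", False, False, 50.0, 190.0, "white")

print(is_table(desk))  # True
print(is_table(bed))   # False
```

Tables of any height, width, or color all satisfy `is_table`, which is the point of measurement omission: the concept subsumes every concrete regardless of its particular measurements.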
All the varying aspects of each entity in the world, including the varying measurements of all its qualities, and including its location and time of existence, are what give each entity its unique individual, concrete identity.
How do we know the entity as a concrete? To say it one more time, initially through perception.
We then identify it with a concept and word. Then, our ever expanding accumulation of concepts—our increasing knowledge—enables us to know and describe the concretes in their extensive variety, their attributes and actions and their myriad differences according to their varying measurements.
Assuming our concepts accurately describe the concretes they represent, that is, their referents out there in reality, and are not detached or floating, we have objective knowledge.
And that objective knowledge can be theory or history, basic science or applied science, general knowledge or personal knowledge. In all cases, it is knowledge of the concretes of reality.
Our minds are not “mirrors of nature,” as critics of “naïve” Aristotelian realism often say. They are active processors of it. Rand’s recognition of the mind’s active nature and her incorporation of it into her theory provides a major improvement on and defense of Aristotle’s epistemology.
We might, although she may not like it, even call Ayn Rand’s theory a “mature” realism about knowledge.
* Ludwig von Mises clarified the issue by rejecting the Hegelian-Marxist view that there are laws of history. Rather, Mises recognized that there is a valid science of individual historical events using the theories of basic sciences, especially psychology and economics, to understand and describe those events. Psychotherapists and medical doctors, in addition, use the theory of basic sciences to identify and treat their patients’ problems, which to the therapist and doctor are de facto historical events. This is called applied science.

Thursday, November 17, 2022

Two Types of Knowledge: General and Personal

General knowledge, that is, book reading, school learning, and the ability to cite or quote sources at will, does not by itself make us interesting or even really knowledgeable people.
Personal knowledge does, though it often is not even considered knowledge.
Our emotions evoked by general knowledge, and more particularly our personal reactions to, and experiences with, all aspects of our lives, give us personal knowledge that no one else possesses.
In Independent Judgment and Introspection, I make passing reference to these two types of knowledge. I also discuss Edith Packer’s distinction (pp. 226-38) between universal and personal evaluations when identifying emotions. Personal evaluations constitute a significant part of our personal knowledge.
Consider two quotations from my book that summarize general and personal knowledge:

To survive and flourish, humans require general knowledge, or education, in the form of concepts and principles to guide their choices and actions. Each individual, in addition, requires specific knowledge, or a set of concepts and principles unique to his or her experience, to direct action to the achievement of health and happiness (p. 54).
Systematic bodies of concepts and principles constitute our sciences; specific bodies of concepts and principles, unique to our own experiences, constitute our personal knowledge (p. 73).
General knowledge comes largely from others; it is essentially our education broadly construed, not just what we learn in school. Personal knowledge, which includes the general, is what we have experienced over the years and makes us who we are as individuals.
Personal knowledge comes first. It is what we begin acquiring early in life, and continue to acquire throughout. It starts developing before we can speak, perhaps even in the womb, in reaction to pleasureful and painful experiences. The pleasure-pain aspects of these experiences provide the foundation of our emotions, and the emotions we associate with concrete experiences in childhood and youth influence how healthy and happy we will be as adults.
Personal knowledge, as a result, might also be called experiential knowledge, the acquisition of knowledge based on the objects, persons, events, and even ideas we come in contact with, or rather, experience firsthand.
When we learn to talk, we form concepts and elementary, childlike principles, such as “all animals that walk on two legs are human beings,” which is to say at this point we are beginning to form universals that constitute the basis from which we go on to acquire general knowledge. The principles are called “childlike” because they may not agree with what a more knowledgeable adult might say.
General knowledge is what we learn from older people, initially our parents, siblings, and others around us. More importantly, it comes from teachers and books.
General knowledge is our education and does not stop, one would hope, after receiving high school or college diplomas. It is the systematic bodies of concepts and principles that constitute our sciences, both basic and applied, and we learn and remember the portions of each that are relevant to our lives.
General knowledge becomes part of our personal knowledge when it is individualized to our interests and goals.
Personal knowledge indicates who we are as unique, individual human beings and forms the basis of our personalities, our distinctive ways of thinking and acting. Much of our personal knowledge is specific to concrete objects, persons, and events, such as a ball bouncing and rolling (seen as a young child), a recent performance of a Brahms symphony, or the practical competence of changing a tire on the family car.
Personal knowledge is not universal in the way general knowledge is, though it can become general if, say, we continually increase our general knowledge, enabling us to write a book about Brahms symphonies or the repair of certain automobiles. Most of what we do with our personal knowledge is apply the general knowledge we have learned previously (1, 2).
Personal knowledge is a collection of experiences—with parents and other relatives; with school subjects, teachers, and classmates; of the time we learned how to ride a bicycle; of happy and sad times at summer camp; and, as an adult, of work, family, and leisure. Throughout life.
The application of knowledge, whether general or personal, is not an emotion-free process.
All knowledge contains evaluations, universal and personal, and those evaluations are what produce our emotions.
Every emotion, as Edith Packer has written (1, pp. 226-38; 2, pp. 151-53), expresses a universal evaluation that is present in all instances of that one emotion. For example, quoting Packer, joy says “I have achieved one of my most important values” and anger says “an injustice has been done to me.”
Behind every emotion and its universal evaluation, we also make our own personal evaluation of the experience, one that draws a correct or incorrect conclusion about it. For joy: “I got into my first choice of school and feel like dancing around the room!” For anger: “The teacher said I’m no good at math—I’m never going to like or trust teachers.”
It is these inner conversations, or voices, as Packer describes the personal evaluations, that constitute a key part of the content of our personal knowledge.
The quantity of general knowledge that we each hold obviously varies from person to person, as does its quality, that is, its degree of truth or falsity. But the variation in personal knowledge is even more diverse, as we all come from different backgrounds, family cultures—and experiences. Which is to say, we’ve all felt diverse emotions throughout our lives, which essentially is what gives us unique personalities.
Personal knowledge is what makes us and other people interesting—happy or sad, funny or obnoxious, caring or mean.
Unfortunately, personal evaluations associated with the many emotions we have had in our lives are precisely what most of us are not aware of and have not been taught to identify. Nor have we been taught how to change mistaken evaluations.
General knowledge by itself can be thought of as a sort of sterile academic content in our brains. Personal knowledge makes us engaging—or repugnant.
Personal knowledge is what gives meaning to the expression “variety is the spice of life.”

Monday, October 17, 2022

The Two Senses of Self-Esteem and Pride—Moral and Psychological

Self-esteem and pride, respectively, are a moral value and virtue. Both rest on the more fundamental attributes of psychological self-esteem and pride.
The two kinds, moral and psychological, interact to produce the degree of self-respect or self-worth that we hold about ourselves.*
Self-esteem in the moral sense means holding one’s self as one’s own highest value, which means never sacrificing to others or others to oneself. The virtue of pride is the action of living up to one’s own moral values, which includes being true even to one’s non-moral, rational values, living for the sake of one’s own happiness.**
Moral values are what we seek to acquire and maintain to support our lives as beings that possess the capacity to reason. Virtues are the acts or practice of acquiring and maintaining those values.
Ayn Rand describes pride as “moral ambitiousness,” always doing what one judges to be right. This means holding, and always acting on, the more fundamental moral values (and corresponding virtues) of reason, independence, honesty, integrity, courage, justice, and productiveness.
For this reason, Aristotle calls pride the “crown of the virtues”—the climax or point of culmination, so to speak, of morality.
Self-esteem and pride are required to uphold and practice a rational egoism that promotes and exhibits one’s self-interest and joy in life—without harming oneself or others.
However, neither self-esteem nor pride in the moral sense can be achieved or maintained without a store of psychological self-esteem and pride.
As I wrote in Independent Judgment and Introspection (pp. 95-96), self-esteem in the psychological sense:

is the degree of confidence or certainty we have in ourselves as a valuable person and as someone competent to correctly and rationally choose values and actions to make us happy in life. The two interacting and reinforcing components of self-esteem are worthiness and efficacy. Both are mental, that is, psychological, not existential or physical as in our high or low competence in changing a tire, though existential competencies derive from and are influenced by the mental ones.
Psychological self-esteem is our conviction of worthiness and mental competence to live life to the fullest. Pride in the psychological sense is an emotion, the emotional consequence and expression of self-esteem.
Pretending to or hoping to be worthy and competent, along with protestations that “I am a proud person” or “boy, I’m great,” are not genuine. They are the defensive result of psychological problems and conflicts, making the moral counterparts more difficult to achieve and practice.
Genuine self-esteem and pride produce a “quiet confidence,” as psychologist Edith Packer (p. 230) says, and a feeling that “I am fit for life.”
The development of psychological self-esteem begins in childhood and requires, for the worthiness component, an unconditional love from parents and other adults around the child. For cognitive competency, the child needs to be taught what is necessary to use his or her mind properly, which in particular means an unconditional commitment to reason and facts, along with methods of identifying the nature and causes of his or her emotions.
A child and, later, adult who has been given strong doses of love in childhood will conclude “I am loved and am capable of being loved by others because I am confident in and reliant on myself.” The child who feels competent concludes “I can and do use my mind well to guide me throughout life.”
As psychologist Nathaniel Branden (p. 130) points out, self-esteem and pride in both senses of the words often begin at the same time early in life—crawling, walking, and banging a spoon on the table, for example, can produce an emotion of the efficacy of psychological self-esteem combined with the virtue of moral pride.
Unfortunately, most of us were not taught much of anything about our psychologies. Thus, the psychological problems we suffer in childhood arise from errors in thinking about ourselves, other people, and the world in general—Edith Packer’s mistaken core evaluations (chap. 1). These errors not only undercut developing self-esteem in the psychological sense, but they also confound our understanding and practice of the moral concepts.
A weak worthiness or confidence component of self-esteem—“I’m no good and the world is out to get me”—for example, may affect the adult’s practice of integrity and courage. A weak mental competence—“I’m so stupid I can’t do anything right”—may affect the adult’s independence, sense of justice, and pursuit of a productive career.
Mistakes in morality due to the influence of our psychologies, it must be emphasized, do not necessarily deserve condemnation, as long as our fears and failures do not cause harm to others. This is called an error in knowledge, not a breach of morality.
The difference, in briefest essence, between the two meanings of self-esteem and pride is as follows. In the moral sense: I am good (my own highest value) and I do good (always doing what I know to be right). In the psychological sense: I have confidence in myself as good and worthy and can think and do things well to make me happy, with the resulting pleasure in accomplishment.***
The worthiness component of psychological self-esteem gives us confidence essentially to do whatever we choose to do and is a direct connection to moral self-esteem. The pleasure of psychological pride feeds into the desire for “moral ambitiousness.”
* It is unfortunate and sometimes confusing, though not uncommon, that the English language uses the same word for essentially different referents. That the moral and psychological interact with each other only makes clarification of the present terms more challenging. A brief discussion of the moral and psychological meanings of self-esteem and pride can be found in Branden (pp. 298-99).
** Non-moral, rational values are optional in the sense that they do not have to be accepted and practiced by everyone in order to have a good character. They might be universal for our physical well-being, such as the generic value of food, but if we specify vanilla ice cream, then that value is clearly not one that everyone has to accept. This optional nature of non-moral values can also apply to choice of career, romantic partner, or the means of building a bridge, though in some cases universal moral values may be relevant. See my discussion here.
*** Criminal personalities, on the other hand, according to Stanton Samenow (chap. 3), think of themselves as both worthless (“I am a zero, a nothing”) and evil, but “If I thought of myself as evil,” said one offender, “I couldn’t live.” Criminals, thus, are masters at rationalization to prevent themselves from being aware of their inner selves.

Wednesday, September 14, 2022

Dictatorship by the Administrative State

In an earlier post, I wrote the following:

Bureaucracy encourages a legalistic, rule-bound mentality. It says, in effect, you can only do what has been codified. This leads to the generation of hundreds of thousands of rules and laws to control behavior, coupled with the impossible-to-follow proviso that ignorance of the law is no defense. This is why the bureaucratic state has become the modern form of dictatorship, a system of excessive law (Applying Principles, p. 328; see also, pp. 117-21).
Bureaucracy and the administrative state are essentially the same thing. Bureaucrats are unelected administrators and policy makers who run the various governmental bureaus—some 432 of them in the United States. (Another, more recent term used disparagingly for these bureaus is “deep state,” though that term is broader, including think tanks, military leaders, and the national media.)*
In contrast to the administrative state, as I continue in the above earlier post:
A truly free society . . . says you can do whatever has not been codified, i.e., you can do whatever you choose provided you do not violate the rights of others. Rules and laws are few and they are abstract principles. Communication, persuasion, and appeals to inner values become the primary means of relating to others.
And according to defense attorney Alan Dershowitz, criminal law during the early days of this country was simple enough to be read and understood while running!
What we have discovered in recent years, however, is that excessive law, vague and overly broad law, and complicated law that cannot be “read and understood while running” are not the only obstacles to a truly free society.
Many top bureaucrats in the administrative state are highly partisan and, in some cases, power-mongers who have little respect for their duties as administrators and policy makers, often following the progressive mantra of “logic, consistency, and truth be damned.”
In our American bureaucracy, to administrate means to execute or implement the policies of the sitting president. As policy makers, bureaucrats are obligated to make recommendations to the current administration. Nothing in the job description says that bureaucrats may refuse to comply with the Executive’s orders.
“Refusal,” however, is the least of what occurred in our previous presidential administration.
James Sherk, domestic policy advisor to President Trump, states that there are about 4,000 “at will” political appointees in the bureaucracy, but a total of 2.2 million essentially tenured bureaucrats, because it is nearly impossible to fire them. The “at will” appointees, in contrast, can be dismissed by the Executive. The actual work is usually delegated down to those senior, unfireable bureaucrats.**
It is the higher echelons of this semi-permanent deep state that can cause problems for a sitting president. Sherk, for example, summarizes what such career bureaucrats have often been guilty of:
• Withholding information;
• Refusing to implement policies;
• Intentionally delaying or slow-walking priorities;
• Deliberately underperforming;
• Leaking to Congress and the media; and
• Outright insubordination.
Among the worst perpetrators of bureaucratic subterfuge and insubordination were the self-appointed dictators of public health who led the charge to establish, and who continue to rule over, the past two years of covid totalitarianism. Their refusal to listen to, implement policies for, or cooperate with the sitting president was more than palpable. Deception, manipulation, and glaring incompetence due to lack of medical qualification are now being bragged about in post-covid books (see analytical reviews by Lerman and Tucker).
Not reading the science or respecting the judgments of such people as Dr. Scott Atlas was typical. Atlas, for example, would walk into meetings of the coronavirus task force with arms full of recently read and analyzed scientific studies—only to be greeted with blank stares and sometimes explicit statements that said, “We don’t care about that.”
And Dr. Paul Alexander reports that the bureaucrats he dealt with told him that neither he nor the President had any power, which meant he should toe the deep-state line or resign. When his career was threatened with destruction—initially, the bureaucrats at Health and Human Services, who had hired him, refused to pay him—he resigned.
To counter the disingenuousness of these career bureaucrats, Sherk spearheaded the development of the Schedule F executive order, signed a couple of weeks before the 2020 election but rescinded almost immediately by the incoming administration. The order focused on making it easier to remove poorly performing or insubordinate career bureaucrats. Sherk comments that when the deep state caught wind of the order, “it lit their hair on fire.” Hence, the immediate cancellation.
The growth of the bureaucracy began with the Pendleton Act of 1883, which eliminated the “spoils” (crony) system, though the difficulty of firing bureaucrats did not begin in earnest until the 1940s.
The solution to this creeping dictatorship is to start closing down government agencies. As columnist Jeffrey Tucker said of the US Department of Education:
Pull the plug on the whole thing and sell the real estate. . . . New CEOs do it all the time. They shut down whole divisions, let go thousands of employees, end relationships with suppliers, sell off properties, and do anything possible to save the company. They do it in order to survive. The company in this case is the United States and it too needs saving.
Send the bureaucrats packing to find honest jobs in the marketplace where their primary task is to satisfy paying customers. Most have likely never held such a job. They have no clue what working in business is really like.
Once accomplished, the government may then get back to its constitutional function of protecting individual rights—rather than violating them.
* Regulatory agencies in particular are unconstitutional because they combine the executive, legislative, and judicial functions in one governmental organization, violating the separation of powers. The progressive Supreme Court of the 1920s and ‘30s disagreed, allowing the alphabet soup of agencies to grow seemingly exponentially (see Philip Hamburger, The Administrative Threat; also Applying Principles, pp. 81-83; and here).
** The bureaucratic process of trying to fire someone allows for a variety of appeals, requests for reconsideration, and, ultimately, arbitration, which after several months or years often ends in declaring it unjust to fire the employee. Not unlike the academic world.

Sunday, August 14, 2022

The Meaning of Justice

The challenge of defining justice is that it is both a moral and a legal concept.
Reason, courage, integrity, and independence, for example, are moral values with no direct legal counterpart. Each is, to use Ayn Rand’s succinct definition (p. 15), “that which one acts to gain and/or keep,” and each derives from the fundamental standard of human life as a rational being. Reason, therefore, is the highest moral value and the standard of ethics. The corresponding virtue is rationality, namely the use of reason as “one’s only source of knowledge, one’s only judge of values and one’s only guide to action” (p. 25). The other three virtues, as I have written before, are, respectively, “acting against great odds or opposition, remaining loyal to one’s fundamental values, [and] relying on one’s own mind to perceive reality.”
“Value” is what we act to attain. “Virtue” is the action of acquiring and sustaining the value.
In that earlier post, I said the virtue of justice means “judging oneself and others by conformity to moral and legal standards.” But there is an acting component to both the moral and legal concepts that I did not include. Justice is not only a matter of judging, but also a matter of acting in relation to the person being judged.
Let’s take the legal concept first, as it is relatively easy to explain. Justice in our legal system means judging a person based on existing law and exonerating or punishing accordingly.
Laws in the current system, however, can themselves be judged by more fundamental moral principles, such as by identifying whether the laws violate or support individual rights. Thus, we can also judge what should be law in a proper, more rational capitalistic society, then praise or condemn based on projected law. Big businesses, for example, may be praised for all of the wealth they create, rather than condemned by the non-objective regulations, as happens today.
Definitions or at least descriptions of justice in the moral sense abound in the Ayn Rand literature. Let’s examine several.
Rand in The Virtue of Selfishness (p. 26) writes “that one must never seek or grant the unearned and undeserved, neither in matter nor in spirit.” But aren’t earned and deserved synonyms of, or closely related terms to, justice?
In Introduction to Objectivist Epistemology (p. 51), she writes that justice is “the act of judging a man’s character and/or actions exclusively on the basis of all the factual evidence available, and of evaluating it by means of an objective moral criterion” (Rand’s emphasis). This does not include the component of acting in relation to the person being evaluated. Note that justice is itself a moral value and virtue that requires other more fundamental moral values and virtues as criteria by which a person is to be judged, for example, “That person was not honest with me!” Justice is derivative from those other values.
Leonard Peikoff (p. 276) does mention the acting component: justice means “judging men’s character and conduct objectively and . . . acting accordingly, granting to each man that which he deserves” (Peikoff’s emphasis). But, again, isn’t “deserve” a synonym of justice? And we might also ask what “acting accordingly” means.*
Galt’s speech in Atlas Shrugged (p. 1019) implies justice means “that every man must be judged for what he is and treated accordingly, that just as you do not pay a higher price for a rusty chunk of scrap than for a piece of shining metal, so you do not value a rotter above a hero.” Galt here is speaking in elegant, dramatic terms about the principle of trade, which indeed seems to be an acting component of justice.
Rand (p. 31), finally, drives the point home: “The principle of trade is the only rational ethical principle for all human relationships, personal and social, private and public, spiritual and material. It is the principle of justice.” Trade is an action, so it seems to be the acting component we are looking for.
Let us now try to sort out from these various statements what justice is and concisely state its genus and differentia.
As in the legal system, justice is a judgment, which means identifying facts about the person in relation to a standard. And as in the legal system, justice calls for action in relation to the person judged. Thus:

Justice is the value and virtue of accurately identifying facts about oneself and others and comparing those facts to the standard of what is beneficial or harmful to human life as a rational being, then, when rationally appropriate, engaging the others by exchanging value for value or rejecting any such exchange, which can include punishment.**
In short, justice means accurately judging and—when rationally appropriate—praising or condemning. (“Oneself” must be included in the formal definition because we can be just or unjust to ourselves.)
“Value” and “virtue” constitute the genus; judging and acting in a certain way, the differentia. The qualification of “when rationally appropriate” must be included because we judge people, events, businesses, and governments all the time, but do not have a “duty” to praise or, especially, to condemn them. Much of the time we are not even close to the person being judged, such as a talking head on cable news! And as I often told my students—when judging people we are close to, including ourselves, or when judging events, businesses, governments, and talking heads—be sure to dig, dig, dig for the facts. “Do you have all of them?” I would ask.
One final moral concept of justice must be mentioned and dispensed with: social justice. I have defined it as the virtue of “accurately judging oppressed classes as underprivileged and granting them restitution” with a variety of handouts taken from those who are morally competent and successful (Applying Principles, p. 96). In other words, from those who have earned their wealth and property. Social justice is the moral, and by extension, legal concept of collectivist victimology.
What it really means is to take from the alleged thieving rich and give to the alleged underprivileged poor. It is a dishonest concept and is therefore unjust!
* “Fairness” is another word that is often thrown in when discussing justice, but it, too, is a synonym.
** “Exchanging value for value” is the general meaning of trade, but it is important to note that in economics and business “trade” has a narrower meaning: the buyer values the product more than the money paid and the seller values the money received more than the product sold. In the more general moral sense of the term, the exchange is still a value-for-value trade.

Monday, July 18, 2022

What Americans Need to Learn about the Left

A retired English professor from Emory University recently wrote: “Stop wasting your time yelling, ‘Hypocrisy!’ Don’t bother pointing out the contradiction. They don’t care. Consistency is not a liberal virtue. Only the outcomes matter.”
Instead of “liberal” virtue, a more correct designation would be “progressive,” as in “far left progressive” virtue. Liberals are still around who think of themselves as moderate (and honest) mixed economy Democrats.
The professor was talking to conservatives and others who still think the communist-fascist leftists in our midst pay attention to things like logic, consistency, and truth.
“Don’t be so naïve,” psychologist Edith Packer, who herself escaped the Nazis, would often say. If bad people are going to kill you, throw you in solitary confinement with no recourse to habeas corpus, remove you from your tenured professorship without just cause, etc., ad nauseam, they will find a way to get rid of you. They are not going to pay attention to logic, consistency, or truth. “You can’t reason with these people, can you?” Dr. Packer would add. The answer to her question was rather obvious.
Logic, consistency, and truth be damned, say the leftists! It’s the outcome, the socialist Garden of Eden, that matters.*
There are three points many fail to understand about the left. (1) Today’s far left progressives are in fact advocates of communism or fascism or some combination. (2) The communist-fascist progressive leftists are convinced that they are the ones who are doing what is moral and everyone else is not. And (3) the campaign to destroy capitalism and replace it with socialism, communism, or fascism has been going on in the United States for over a century (Applying Principles, pp. 110-13).
Thus, it appears to us that “they just don’t care.” And they don’t care—about anything that relates to capitalism, individualism, or egoism. We are evil and, consequently, they hate us. Tear down the statues of America’s founders. That’s moral in their view. Oppose teaching children that they are racist oppressors. That’s immoral.
Today’s leftists are following Marx’s premise of the inevitability of socialism and the necessity of capitalism’s eradication. The sooner capitalism collapses—literally through physical destruction, preferably turned to ashes—the better.
To attempt a logical argument with the communist-fascist left is futile because Marx gave us that theory of many logics called polylogism (Applying Principles, pp. 309-310). We subscribe to bourgeois (updated to white racist) logic and socialists to proletarian (updated to victim) logic. The two groups—us and them—cannot talk to each other. The “logics” are contradictory.
Sound familiar? “Your truth versus my truth”?? Postmodern epistemology is not so new!
Let me now elaborate point three above with a historical sketch of the left’s attempt to take over the United States over the last 120 to 130 years.
As I have written before (Applying Principles, pp. 110-13), the first progressives, from the 1880s to the early twentieth century, were educated in Germany by democratic socialists. They brought those ideas back to the United States to replace the move toward classical liberalism with a more “moderate” or “compassionate” social liberalism (Applying Principles, pp. 36-39; see also 1, 2). This gave us, among other increases in government power, the Pendleton Act of 1883, unelected “expert” and difficult-to-fire bureaucrats, and regulatory agencies to “regulate” businesses to make them more “compassionate.”
By the 1920s and ‘30s, with the latter called the “red decade,” communism and fascism were openly recognized and admired as replacements for what was understood to be American capitalism. Communists and fascists at the time were bosom buddies until Hitler invaded Russia in 1941. Thereafter, the communists started calling anyone who disagreed with them a fascist. They continue to do so.
Nikita Khrushchev’s revelations about Joseph Stalin in 1956 shook the American communist world such that the likes of David Horowitz’s parents (card-carrying communists) stopped calling themselves communists and resorted to progressivism as their preferred political moniker.
Horowitz himself became cofounder of the New Left, editor of Ramparts magazine, and a participant throughout the 1960s and early ‘70s in such communist organizations as the Black Panthers and Students for a Democratic Society (SDS). Indeed, Horowitz, who has since become a conservative, asserts that the turmoil and riots at the 1968 Democratic convention in Chicago enabled the left to take over the Party, which, he says, it still controls.**
In the years since, progressives have only become stronger and stronger, and more and more irrational, moving their cause further and further left, with seemingly little rational or articulate opposition from the right. Early Party leaders had to moderate their views and intentions. In 2011, however, Barack Obama talked explicitly about “fundamentally transforming” the United States. To what? To socialism, following the lead of one of his influencers, Saul Alinsky, the community activist and organizer who advocated open confrontation.
Though attributed to a member of SDS, the following could be the motto of Alinsky: The issue is never the issue, the issue is always the revolution. Which means, as Alinsky clearly acknowledges: the end justifies the means. As in: logic, consistency, and truth be damned, and physically destroy capitalism in order to rebuild a society of socialism.
The election of Donald Trump in 2016 exposed the intellectual bankruptcy of the left and their panic at the thought of losing the political war, especially to an unintellectual businessman who represented the American sense of life and catered to that sense of life in his constituents.
The 1960s erupted all over again, only worse. Putsch (German for coup) is the word Ayn Rand used to describe the “revolution” the 1960s leftists wanted to achieve. Putsch is the correct description of today’s mob terrorism. Its purpose, as Rand says, is to establish tyranny.
Logic, consistency, and truth be damned.
It’s the outcome, the socialist Garden of Eden, that matters. As the communist-fascist progressive leftists say, “We are the ones who are doing good. You capitalists are evil and need to be destroyed by any means that works.”
* Many links in this post are references to previous posts where I have touched on the topics discussed. The purpose of the present blog is to give a more historical perspective on progressivism and its rise in the United States.
** Horowitz is not the only person to turn away from the socialist Garden of Eden. Max Eastman, a prolific writer and editor on the left in the early twentieth century, admired Lenin and visited Russia in 1922 and ‘23. Over twenty years or so, he gradually abandoned socialism and started writing free-market articles, many in The Freeman, the publication of the Foundation for Economic Education (FEE). During the years he was affiliated with FEE, he came to know Ludwig von Mises.

Wednesday, June 22, 2022

The Benevolence of Advertising

[After a delay this month dueling with a medical monster (whom I believe I have defeated), I decided to dig into my archives for the current post. It is the final chapter of my 1994 book In Defense of Advertising. The chapter’s title is similar to one used by George Reisman in some of his writing. I highly recommend not just Dr. Reisman’s magnum opus Capitalism: A Treatise on Economics (esp. pp. 471-73), but also his Kindle monograph The Benevolent Nature of Capitalism and Other Essays. As Professor Reisman eloquently points out, benevolence and capitalism go together. Here is chapter 8 of In Defense of Advertising.]
Advertising is just salesmanship.
It is not a drooling ogre, waiting to feed on the helpless consumer. Nor is it a vaudevillian’s hook that has the power to yank consumers out of their socks (and wallets) to force-feed them unwanted products. Nor is it a vaudeville show, as many people, including some advertisers, seem to want it to be.
Advertising is just salesmanship, the product and expression of laissez-faire capitalism. Unfortunately, this is precisely why the critics hate advertising: it is the means by which millions of self-interested individuals become aware of the self-interested, productive achievements of millions of other individuals. Advertising is the means by which millions of people learn how to enhance their tastes and increase their standard of living above the ordinary, humdrum existence of their forebears. It is the means by which the masses—including the “proletariat,” the “bourgeoisie,” and the “intelligentsia”—are given the opportunity to live far beyond the wildest fantasies of the rich nobility of earlier years. Advertising, indeed, is the intellectual conduit by which everyone can seek the good life.
Daniel Boorstin calls advertising the symbol of American “voluntariness.” “It is an educational device to provide opportunities for freedom of choice.” In societies in which there is no such opportunity, states Boorstin, there also is no need to advertise. Advertising’s presence, he says, is a “clue to the increasing opportunities for choice.”*  These opportunities, which originate as political and economic freedom from government-initiated coercion, manifest themselves to consumers as the many new products the entrepreneurs offer for sale.
It was through newspaper advertisements in 1652 that English consumers were first introduced to coffee. In 1657 they were similarly introduced to chocolate and in 1658 to tea. Indeed, advertising, as Boorstin points out, played a critical role in the founding and settling of the United States:

Advertising, of course, has been part of the mainstream of American civilization, although you might not know it if you read the most respectable history books on the subject. It has been one of the enticements to the settlement of this new world; it has been a producer of the peopling of the United States; and in its modern form, in its worldwide reach, it has been one of our most characteristic products.**
Boorstin sees advertising “perhaps even as a prototype of American epistemology . . . a touchstone of the sociology of knowledge, of the ways in which Americans have learned about all sorts of things.”***
If advertising is as valuable as Boorstin maintains, and as I have argued throughout this book, then when will it begin to gain the respect it deserves? Not, I am afraid, until egoism and capitalism are no longer defiled as unquestioned evils, and thus are allowed to gain the respect that they deserve. Not until intellectuals of all types acknowledge that man, as an integrated being of mind and body, possesses not only the capacity to reason, but also a consciousness that is volitional. Not until an objective theory of concepts—the foundation of objectivity and scientific induction—becomes internalized on a wide scale. And not until the objectivity of values and the existence of rational options become accepted and understood.
To borrow a phrase from Ayn Rand, I ask you to “check your premises”—to introspect and examine the ideas on which your value appraisal of advertising rests. If you do this conscientiously, I think you will find that your negative evaluations stem from the anti-reason, anti-man, anti-life, authoritarian world view that permeates our culture. It is this world view that paints such a satanic, malevolent picture of advertising. It is this world view that also paints such a satanic, malevolent picture of capitalism.
If, on the other hand, you examine these ideas in light of Ayn Rand’s pro-reason, pro-man, pro-this-earth philosophy of Objectivism, and in light of the pro-individualist laissez-faire economics of Ludwig von Mises—that is, in light of a truly liberal world view—I think you will begin to look at advertising differently and begin to react to it differently. You will begin to see that advertising and capitalism both are life-giving and benevolent institutions. You will begin to see that capitalism is the social system that provides man with continuous economic progress. And you will begin to see that advertising is the beacon that guides man to the fruits of this progress.
Nothing, as far as I am concerned, could be more benevolent than advertising, beacon of the free society.

* Daniel J. Boorstin, “The Good News of Advertising,” Advertising Age, November 13, 1980, 20. The recent lifting of the American Bar Association’s ban on advertising by attorneys has brought “opportunities for choice” in legal aid to many more people, especially the middle classes. Prior to this change in attitude toward advertising, legal help was available primarily to the wealthy, who could afford the monopoly prices lawyers were (and still are) able to charge because of their government-granted privileges, and to the poor, who received legal aid from lawyers who were paid for their time under other government-granted privileges. The middle classes simply went without legal services.
Studies of attitudes toward advertising by professionals provide revealing insight into the motivation of some of these professionals. One study of dentists showed that the majority of older, established dentists opposed advertising, while the majority of younger, unestablished dentists—the ones who most needed some means of finding new customers—not surprisingly favored advertising. So much for principled thought among licensed professionals—not that they are more pragmatic than any other segment of our society.
** Daniel J. Boorstin, “Advertising and American Civilization,” in Yale Brozen, ed., Advertising and Society (New York: New York University Press, 1974), 11.
*** Ibid., 13.


Professor Boorstin, whom I cite extensively throughout this book, was trained in law but was attracted to history, especially “for the achievements of the ‘amateur’ in history,” according to the New Georgia Encyclopedia. That made him a match for my understanding of advertising as a knowledge tool of everyday buyers and sellers, including the benevolence of America and its sense of life. Which is not to say Boorstin was without blemish: he had a brief foray into the U.S. Communist Party during the 1930s, and he makes some negative comments about advertising (the usual ones: that it corrupts American culture), which are part of the litany I discuss throughout the book. Professor Boorstin ended his career as the prestigious Librarian of Congress, serving from 1975 to 1987.

Wednesday, May 04, 2022

Personality and Style versus Honesty and Justice

Here is a statement sometimes heard, sometimes vociferously: “I can’t stand that person’s personality or style.” The person referenced might be a client, a coworker, a relative—or a former US president.
Is the person referred to immoral? That is, dishonest or unjust?
Sometimes an elaboration follows about a business or personal relationship: “It’s a personality conflict. We just clash too much and can’t get along!”
At my midtown Manhattan job several decades ago, it was clients who were said to have those “bad” personalities and, therefore, were impossible to get along with. Somehow I ended up with three such clients, one abandoned by each of my two coworkers and my boss. After working with the clients for a short time, I concluded, “There’s nothing wrong with these people. They are quite nice!”
On the surface, getting upset over a “personality conflict” or “style difference” seems bizarre. After all, personality or style may mean that one person likes to talk a lot and the other is quiet. These two personalities cannot get along?? Admittedly, the talker may have to ask a few questions to draw the quiet person out. But seriously?
Without saying so explicitly, one person in the personality or style conflict—probably both—thinks the other is immoral, requiring the end of the relationship.
Let us define our terms. As I wrote in Independent Judgment and Introspection (pp. 55-56), personality

is our distinctive method of thinking and acting; it includes all of our beliefs and values—the moral ones, as well as the ones that form our psychologies, that is, beliefs and values about who we are as a person, beliefs and values about other people, and beliefs and values about the environment in which we live. The outward behavioral manifestation of beliefs and values are called traits and the traits that stand out, the distinctive ones, define our personality.
Moral character, then, is a part of personality but must not be confused with our psychologies or our other traits that might be moral or immoral. Gregariousness and shyness are not immoral.
The outward behavioral traits define our style, or as the unabridged dictionary says, “an individual’s typical way of life.”
Personality and style are obviously related, but not identical. “Style” can be called the acting part of personality. The thinking part is what gives us mental habits that guide our choices and actions.
Two more terms to define: honesty means telling the truth, though not always. Exceptions would occur when someone is pointing a gun at you or threatening your privacy, or when the truth might unnecessarily hurt the other person. Justice means correctly judging a person as good or bad or somewhere in between and responding appropriately (1, 2).
Judging how honest or just a person is can be challenging in everyday situations, especially if you are not aware of, or allow for, the influence of psychology operating in the other person. Subconscious defensive habits often cancel free will to such an extent that the acting person, the one with the “style,” is not aware of what he or she is doing, such as talking too much or hyperbolizing or feeling afraid to speak up. None of these traits is immoral, or dishonest or unjust in any way.
The so-called difficult clients in my experience may have been demanding or even pedantic or confused and unclear about what they wanted. They may have been distrustful and angry over poor service in the past and wanted to be assured that this time everything will go smoothly.
My conclusion was that patience and anticipation of possible problems were key to keeping clients happy; my coworkers and boss did not always exercise such patience, which apparently I did. I believed that my clients just needed to be listened to and that I had to practice that old business adage: “promise only what you can deliver and deliver what you promise.”
No BS or hyperbole, in other words. Our clients were not dishonest or unjust, though they certainly exhibited many different personalities!
So what about the “style,” let’s say, of a former US president? One who called the country’s press the enemy of the people or expressed harsh criticism of anyone who did not do a good job or who criticized him. And who was also known to hyperbolize.
I heard all of the above types of statements, sometimes with elevated volume, in my years living and working in New York City. Our former president was simply a New York businessman. And that is precisely what his naysayers did not like. He got things done and called his critics names. They declared his personality and style inappropriate, meaning immoral.
After all, he did not believe in turning the other cheek, and that was intolerable to anyone who believes self-sacrifice is the highest virtue in personal and professional life. And the “anyone” in this statement is bipartisan, as the doctrine of self-sacrifice is practiced by both leftists and conservatives.
Our former US president was not an altruist. He was an egoist who expressed, though not always articulately, the American sense of life. His constituents who also represent the American sense of life responded by electing him president. (See 1, 2, 3.) He was a rare and unusually strong supporter of the US’s national self-interest. Unfortunately, he leaned toward mercantilistic ideas when it came to international trade, but he also used tariffs as bargaining chips in his negotiations.
What is essential about our former president is that his “personality” and “style” expressed what is uniquely American by producing significant accomplishments and keeping us out of war for four years.
The appropriate response to all of the above types of statements, whether from a coworker or a president, is: “So what?”
Get back to work and on with your life.
A final note about the mainstream press. We now know how biased, not to mention dishonest and unjust, they were during the previous presidential administration and how eager they are today to silence alternative points of view.
Media that advocate censorship are indeed enemies of the people. They are the ones who will welcome a dictatorship to our country (1, 2).
See “How Free Speech Dies,” by Wall Street Journal columnist Mary Anastasia O’Grady and my post on the same issue.

Saturday, April 09, 2022

On the Separation of Church, Science, Education, and Business from the State: Avoiding Repressive Fascism

A suggested revision of the First Amendment of the US Constitution:

Congress shall make no law respecting an establishment of religion, scientific research, education, or business activity, or prohibiting the free exercise thereof.
When the state meddles, bad things happen—besides violating our rights.
The origin of the notion of a dividing line between church and state, or more correctly, “a theory of two powers,” as one writer puts it, goes back to Mark 12:13-17, when Jesus replied to questioning by the Pharisees, who were attempting to trap him in a dilemma: either offend his followers by saying taxes should be paid to Rome or be arrested for treason for telling them not to pay.
Jesus replied: “Give to Caesar what belongs to Caesar, and give to God what belongs to God” (Mark 12:17, New Living Translation).
Prior to this statement, church and state were inseparable. Throughout the early Middle Ages, the Church continued to dominate life, though by the tenth century numerous secular rulers had arisen to compete with and manipulate the Church. Over the centuries, conflict between church and state, as well as conflicts between the newly founded religious sects, led to many bloody wars. In the eighteenth century the notion of individual rights and separation of religion and state became expressed in the US’s First Amendment.
Classical liberals of today understand the separation as complete, as in “leave us (the citizens) alone” to pursue religion or not and in the manner we choose. The state should stay totally out of religious life.
As writer Collin Killick put it: “Laws that establish religion in government, even if created with the most benign intent, could put our nation on a path toward repressive theocracy” (emphasis added).
And “repressive” is how the state has been relating to science and business.
Former Harvard epidemiologist Martin Kulldorf, though he stops short of calling for full laissez-faire in science, is calling for the decentralization of scientific research. Kulldorf challenges the government’s string-pulling domination of science: the government, especially in public health as controlled by the National Institutes of Health and the Centers for Disease Control, dispenses most of the research money, deciding who gets it and which problems will be studied. Two-thirds of research money comes from federal, state, and local government sources, with well over half from the US government alone.
The gatekeeping, not to mention censorship, by the government of scientific research became apparent throughout the past two years of covid totalitarianism, an ordeal I have described before.
Kulldorff’s coauthor of the Great Barrington Declaration, Sunetra Gupta, calls the science-controllers cartels: government agencies, journal editors, and peer reviewers, all of whom determine promotion, tenure, and research in academia.*
“Repressive scientism,” to use F. A. Hayek’s term for a “pretense at science,” is what we seem to have been given. The disastrous effect of logical positivism on science today cannot be overstated. Quoting from the description of Hayek’s book The Counter-Revolution of Science:
There was once such a thing as the human sciences of which economics was part. The goal was to discover and elucidate the exact laws that govern the interaction of people with the material world. It had its own methods and own recommendations.
Throughout the twentieth century, however,
the economy and people began to be regarded as a collective entity to be examined as if whole societies should be studied as we study planets or other non-volitional beings.
As molecules, in other words, or billiard balls and other inanimate objects. “Science had turned from being a friend of freedom into being employed as its enemy.” The method shifted from methodological individualism, in which the individual entity or person is the unit of analysis, to methodological collectivism, in which the group, or collective, is the unit.
The new, repressive method now applies to all sciences. And that is the collectivization and herd conformity (or groupthink) of science that we have today with the government in charge.
What we are left with is a narrow range of conventional research, sometimes flawed (or even fraudulent), and neglect or repression of creative thinking and disagreement with the establishment.
Decentralize all research to the university level, says Kulldorff. Let universities distribute the money and publish their own scientists’ findings through open (not blind) peer review. The process would speed up research and publication and perhaps lead to innovative findings.
The best solution, of course, would be to separate education completely from the state, but that would mean making universities businesses, which they are, as are churches. They just are not profit-making businesses, which they should be. (See Applying Principles, pp. 187-90.)
The fundamental issue is to completely separate business and state. Paraphrasing Killick, “Laws that regulate and control businesses could put our nation on a path toward repressive . . . fascism.”
Which is what fascism in its essence is. Socialism owns everything and everyone; fascism, a variant of socialism (perhaps we should call it the “Omicron” of socialism?), leaves some property private, but only in a nominal sense. The government still controls everything and everyone.
It is the total, airtight control we have endured over the past two years.
* See my discussion of academic research, the peer review process, and its effects on science in Applying Principles, pp. 123-32, 140-42.

Wednesday, March 09, 2022

Science and Great Experiments: The Search for Universals

The 1981 book Great Scientific Experiments by Rom Harré presents in popular format and in only 200 pages the gist of twenty influential scientific experiments, from Aristotle to the twentieth century.

Interestingly, none of the experiments cited uses statistical samples of any size divided into experimental and control groups. All samples are small and one is a sample of one! How is this possible?

And if true, may we conclude that Sigmund Freud and Jane Goodall were great experimenters?

It is the logical positivism of modern science that tells us a sound theory of induction does not exist, because universals do not exist, only statistical probabilities. Therefore, anything close to a causal claim must be established through at least two groups of large samples to control sources of “extraneous variation.” Hence, all we can find through our many studies are “successive approximations.” This is what John Stuart Mill’s followers called the hypothetico-deductive method, which drives nearly all research today and condemns Freud and Goodall to the realm of “pseudo-science.”

Let us back up and define our terms. Science is a systematic study of an aspect of reality that explains its domain descriptively and causally, and if applied (as technology) provides guidance for human choices and actions to achieve specific goals.

The product of any science is a body of knowledge—a collection of integrated, universal concepts and principles the aim of which is to enhance life.

The second term to look at is experimentation. The Latin root of the word experiment means to try. Paraphrasing the Oxford English Dictionary, to experiment means to test or try something to identify what has previously been unknown. This means experimentation is trial and error. It may involve experimental and control groups (and the manipulation of one variable to determine its effects on another), but it does not necessarily have to. It also may involve an elaborate apparatus or it may be entirely conceptual—in the head, sometimes called a thought experiment.

Science historian John P. McCaskey has traced the history of induction and the role of experimentation in it (1, 2). Up to the last couple of hundred years, McCaskey found, induction and experimentation were based on the work, among others, of Francis Bacon, who based his work on Aristotle’s formal cause. And Aristotle in turn developed his theory of induction from what during the Renaissance was called “Socratic induction.”

Socrates, when taunting his know-it-all Athenian conversationalists, was looking for universal essences that apply in all instances.

Universals, those exceptionless concepts and principles, not correlations or probabilities, are what constitute the essence and foundation of science.

Thus, conceptualization, or concept formation, is the fundamental method of science, because large samples are often not necessary to identify the universals. McCaskey, as do I, gives the nod to Ayn Rand for understanding the role of concept formation in science, specifically the inductive process of identifying essential distinguishing characteristics.

Examples from McCaskey: Robert Koch’s identification of the comma bacillus (Spirillum cholerae asiaticae) as the defining, universal characteristic and cause of cholera, over such competing but merely correlational hypotheses as season or foul water; Charles Wells’ conclusion, after many small experiments testing a wide variety of independent variables, that water condensation was the essence of dew; and Lord Kelvin’s reasoning to describe tides by their definition and causes, namely “motions of water on the earth, due to the attractions of the sun and of the moon” (quoted in McCaskey).*

Harré acknowledges in his book (p. 191) that one or a few cases, with careful experimentation, that is, trial and error, can yield “defining properties of all samples similar to them.” He calls the small samples “intensive design.” In contrast, he calls the larger (and statistical) versions “extensive.”

The intensive design and sample of one in Harré’s book (chap. 2) is the nine years of experimentation conducted by army doctor William Beaumont on Alexis St. Martin, whose stomach did not close completely after a musket wound. Access to St. Martin’s stomach contents, especially his gastric juices, enabled Beaumont to answer the question, are gastric juices chemical solvents or is the process of digestion some vital force and the juices just “inert water”?

Feeding different foods to his subject and observing his digestive processes, and removing gastric juices to test their effects in a glass jar, were the experiments that Beaumont conducted to reach the well-respected conclusion of chemical solvency, even of “the hardest bone” (quoted in Harré, p. 41).

The concept correctly identified by Beaumont was that gastric juices were indeed—and universally so—chemical solvents, not inert water.

In my book Independent Judgment and Introspection (pp. 78-79), I praise Sigmund Freud for his work in clarifying and defining psychological repression and Jane Goodall for her discoveries about chimpanzees. Both conducted great experiments, the former in psychology, the latter in biology. They both used the method of conceptualization and deserve to be recognized as great experimenters. Science, in other words, is not confined to the fields of physics and chemistry, or imitations of those fields, as too many positivists are prone to assume.

Freud spent over thirty years in talk therapy with many different patients before he finally formulated the essential meaning of repression as an unconscious (or subconscious) response to anxiety that mutes the experience of thoughts, memories, or emotions, or blocks them entirely from conscious awareness. He also had to distinguish repression from defense mechanisms in general, with which for many years he tended to equate it.**

Jane Goodall’s work, beginning in 1960, was to observe the behavior of chimpanzees in the Gombe Stream Chimpanzee Reserve (now Gombe National Park) in Tanzania. Over the course of many years, she discovered that chimps eat meat, make elementary tools to obtain food, can be violent and even cannibalistic, and have personalities. She gave the chimps names and interacted with them after they had gained her trust.

Goodall’s work required patient observation and, especially, application of analogous concepts from human psychology to the higher mammals, “patient observation” meaning testing and trying different ideas before drawing conclusions.

It is the logical positivists who dismissed Freud’s work as anecdotal at best, literature at worst. And it was the “men of hard science” at the beginning of Goodall’s career who dismissed her as an uncredentialed woman frivolously and anthropomorphically giving names to the nonrational, dumb animals. Paleontologist Stephen Jay Gould, however, called Jane Goodall’s work “one of the Western world’s great scientific achievements” (quoted in Goodall, p. xvii).

It is the logical positivists and many of those “men of hard science” who are not particularly scientific, or great experimenters!

* In the study of tides there may have been confusion over the difference between theoretical and applied science, or technology. Sea captains need to know the timing and heights of tides at a specific location, and statistics can help make these predictions. But this is applied science where the universals of theory are used to uncover correlations, tolerances, and averages to guide concrete, practical decision making.

** Repression is a defense mechanism, or defensive habit, as I prefer to call it, but it is not identical to one. Freud’s daughter, Anna, said it was in his 1926 book Inhibitions, Symptoms and Anxiety that Freud identified the correct place of repression in psychology. The muting or barring of an emotion is accomplished by muting or barring the thought that stands behind it. Traumatic memories, I have suggested (fn 11, p. 146), are not repressed and should not be thought of as part of the definition. Subconscious is the preferred modern term and is discussed in my appendix to Independent Judgment and Introspection.

Thursday, February 10, 2022

Mass Psychological Conformity

Thinking about our current covid totalitarianism, columnist Roger Simon recently wrote:

What we have witnessed throughout the world [today] is millions, really billions, of people taking orders without thinking or, in the majority of cases, even seriously investigating what they have been told.
Simon states that witnessing this has helped him answer his long-held question (also held by many, including me): how could educated people in World War II Germany do what the Nazis did to Jews?

The best he could come up with:
We live in a culture of pervasive obedience…. It’s everywhere—people giving up their personal agency, even their ability to reason, out of fear and willingly adhering to the mass.
A generation of conformists has been created as never before in our history.
Which certainly is true, but leaves the question “why?” unanswered. The best I can come up with is “mass psychological conformity,” or to elaborate, “mass psychologically-generated callous conformity, indifferent to harm caused to others.” The harm can range from “lockdowns” to mass extermination. In the case of the latter, “inhumanity” may be substituted for callous conformity.

Which still leaves the questions: “Why?” and “How?”

Various attempts to name this syndrome have been made: group psychology, groupthink, the madness of crowds, or true believers in a mass movement. Tulip mania, the current situation is not, though “herd conformity” is a phrase I have used before to describe it. Most recently, the terms “mass delusional psychosis” or “mass formation psychosis” have been offered.

“Delusion” is correct because in its simplest definition delusion means belief in something that is false, whether it’s “I’m probably going to die if you breathe anywhere near me” or “I’m Jesus Christ.”

And the syndrome is “mass” because so many people worldwide have gone along with their authoritarian public health and political leaders.

But these followers are not psychotic in the sense of completely withdrawing from reality, suspending conscious control over life and allowing the subconscious to take over.* Because thinking errors are the cause of neuroses, all psychological problems can be said to be delusional to some extent.

“Inhumanity,” according to Merriam-Webster, means “being cruel or barbarous” and “the absence of warmth or geniality.” The past two years have certainly seemed inhumane, at least to those who have been harmed the most: children; small businesses that have closed; and workers, many of them single moms, who, if they still have a job, have to mask up to serve the laptop elite while the latter dine and shop. Then there are the non-virus-related deaths, many by suicide.

Callous indifference, indeed! (See my post “They Just Don’t Care—Rationalization and the Need to Look Good.”)

Who exactly am I talking about? The intelligentsia in particular: mainstream media, leftist teachers and professors, certain left-leaning entertainers and business leaders, politicians, and the public health cadre of unelected deep staters.

The latter two may even have a worse psychology. The past two years seem to have brought out their inner totalitarian, as in “we’re telling you who you can have in your home and when or if you can travel. Obey!”

Other people, such as the laptop elite who support the intelligentsia’s doctrines, would have to be included. The causes of such psychologies vary widely, in this case likely ranging from plain ignorance of the doctrines’ consequences to deliberate envious glee at those harmed (akin to many Germans during World War II).

Today, most people are just scared, thanks to the unrelenting propaganda campaign waged by the intelligentsia.

The root of the syndrome is psychological dependence, a psychology that does not depart completely from reality as a psychotic does, but one that shifts reality to other people as their source of beliefs and values. It is a passive acceptance of what those significant others think, feel, and do—a suspension of independent judgment to go along to get along . . . with the crowd.

We all often fail to seriously investigate, because we learn from books and other people and can’t escape the need to rely on experts.

But why the suspension of independent judgment? On the mass scale, this is where there is again a wide variety of reasons. The culture’s philosophy contributes in large part to explaining Simon’s observation of “pervasive obedience.” Germany’s duty ethics of self-sacrifice to the state (or Führer) eclipses independence. “It’s your duty to obey.”

The United States holds a nearly as strong duty ethics based on its predominant Protestantism. Just look at attitudes toward the military draft: “It’s your duty to die for your country.” (Should a big war break out, a new draft would be passed in a heartbeat by Congress and supported by the public. See The Ominous Parallels.)

For over 120 years, American culture has been assaulted and battered unendingly by the progressive left demanding that to be moral we must sacrifice ourselves to the collective, the group, the state.

And in non-Judeo-Christian cultures, asceticism and self-denial are widely held doctrines, with authoritarianism not even questioned.

Hence, worldwide, pervasive obedience.

Psychologically, independence derives from a strong sense of personal identity and self-responsibility. Courage, integrity, and self-esteem are consequences. Parental and formal education (the “how” of this issue) are both crucial in helping us develop these traits, but preaching self-sacrifice and victimhood erodes or blocks the development of independence.

“Dependent personalities,” as I have written before (p. 105), “gravitate to groups as the source of their identity, such as their religion, nation, race, class, ethnicity, or private clubs. They gravitate to the government as their caretaker.”**

Depending on the level of deficiency in self-esteem, dependent personalities will blindly accept whatever the government and its public health officials say, even if they are asking us to give up our rights.

It takes a confident mind to stand up to the irrational onslaught we have been going through over the past two years.

Perhaps the best explanation of mass gutlessness is the bureaucratic state. Bureaucracy is how governments manage their affairs, and rules and laws are their tools. “Rules are rules” is the battle cry, and “I don’t make ‘em, I just enforce ‘em” is what we have been up against for many years.

Those who respond by saying “Oh, okay”—without understanding or questioning what is being asked of them—encourage the totalitarians to continue with more total control.

The rules and laws—far too many of both in the United States, for about a century—all allow both citizens and bureaucrats to rationalize what they are doing as good. “I’m just following (or enforcing) the law.”

Rationalization is a strong defensive habit that allows us to make excuses for our behavior. Criminals thrive on it. So did many Germans in Nazi Germany.

Deference to authority comes first. Then, the obedience. Rationalization does not require or allow examination.

The bureaucratic state of Nazi Germany had gangs of secretaries typing orders to send Jews to the death camps. How could they do it? The explanation has to be that they thought they were doing something good! See “The Reductio of Bureaucracy” (Applying Principles, pp. 117-21) and William L. Shirer’s book.

Deference sacrifices independent judgment. Obedience makes one a follower and in extreme cases a killer.

Are we going to reclaim our rights, our personal agency, and, most importantly, our ability to reason to assert independence from the madness of crowds?

Or are we going to continue to go along to get along?

* Psychotics usually have episodes. They are not constantly “out of their minds,” living in a “waking dream,” as psychosis has sometimes been described. Even catatonics, frozen like statues, are aware of their surroundings and occasionally will respond to a nearby conversation before retreating to their trance-like states. And one psychiatrist asked a psychotic to “stop acting crazy now so I can talk to you.” The response? “Oh, okay.”

** Psychologist Edith Packer (p. 264): “Such people want to be taken care of, and in return they will gladly obey. A nation that breeds a dictator is a nation of people who are afraid of life.”