Infantile Egoism and Environmental Science

Psychologists -- perhaps most notably Jean Piaget in the early-to-mid 20th century -- have established the existence of a psychological phenomenon sometimes referred to as “infantile egoism.” This phenomenon has been identified, defined, and studied in small children, and is characterized by an excessively personal perspective in which children tend to see themselves as causing the events around them. Examples generally involve guilt, and include children blaming themselves for the deaths of close friends or family members or, in cases where their parents divorce, thinking the divorce is a result of something they have done. Interestingly, and pertinently for present purposes, the children will often respond by undertaking childlike efforts to repair the rift they see themselves as having caused in the first place. They try to draw their parents back together to heal the parents’ breakup, for example, thinking they have some idea of how to do so.

A species of infantile egoism appears to occur amongst humans more generally in cases where they see themselves as similarly responsible for natural environmental dysfunctions and destructions. The primary symptom is a perception that human activity is the probable cause of dysfunction and destruction in otherwise fully functional and harmonious natural systems. The now-famous and largely, if not entirely, debunked mass extinction of honeybees is a notable recent example: honeybees have been said to be dying in the mysterious but certainly, according to believers, human-triggered phenomenon known as colony-collapse disorder (CCD). Its colloquial names include “bee-pocalypse” and “bee-mageddon.” People throughout the U.S. have been warned to prepare for life without honeybees, including the dietary deprivations that would result from an end to both honey production and, far more destructively, natural crop pollination.

Human development and use of neonicotinoid pesticides, known colloquially as neonics, and of genetically modified crops are the best-known alleged culprits. The neonics have by now been largely ruled out. In 2010, CNN published a less widely known but equally human-centered suggestion: CCD may be, at least in part, the result of cell-phone use. But no evidence has ever been produced in support of the cell-phone theory.

A less well-publicized but perhaps more probable CCD cause may be a function of chaotic population decline-and-expansion patterns, which are poorly understood and unpredictable. They are also not discussed in media coverage, as they provide no guilt with which humans may rationalize imposing burdensome penitential power upon themselves. As Bjorn Lomborg pointed out in 2013,

Honeybee deaths are also nothing new. The Breakthrough Institute reports that, in 1853, Lorenzo Langstroth, the 19th-century bee-keeper who invented the modern hive, described colonies that were ‘found, on being examined one morning, to be utterly deserted. The comb was empty, and the only symptom of life was the poor queen herself.’ In 1891 and 1896, large clusters of bees vanished in a case known as May Disease. In the 1960s, bees vanished mysteriously in Texas, Louisiana and California. In 1975, a similar epidemic cropped up in Australia, Mexico and 27 U.S. states. There were heavy losses in France from 1998 to 2000 and also in California.

In a 1979 report, Nancy Wertheimer and Ed Leeper suggested a correlation between high-voltage power lines and the incidence of childhood leukemia around Denver, CO. Since then, there have been various, and variably panicked, studies expanding on the correlation claim and reclassifying it as causative. Despite creative thinking having led to extensive testing of the idea, no one has been able to document anything other than statistical correlations. It does not take an expert in risk assessment to realize that the people who employ statistics for studies such as these are really engaged in nothing more than a high-stakes numbers racket. No causative mechanism has ever been identified or verified that would link any power lines with specific cancer cases; instead, the matter rests on population studies and probability calculations. The whole thing may simply be a matter of coincident proximity -- or there may be an unrecognized confounding variable behind the observed effect. Potential confounding variables are generally controlled for in such studies, but those variables must first be identified. In short, no one knows what unrecognized factor might be at play in the evident absence of direct causation.

On the other hand, in 2013 Epidemiology published a study by P. Elliott et al. on the incidence of various adult cancers near high-voltage power lines. The study reached the following conclusion: “Our results do not support an epidemiologic association of adult cancers with residential magnetic fields in proximity to high-voltage overhead power lines.” The study did not address the potentially non-trivial extent, if any, to which results covering adult cancers could be generalized to children. It did, however, tangentially suggest the extent to which humans default to invoking human activity as the cause of damage.

The tendency to see humans and human activity as causative in natural-world disruptions has also been extended to include the prehistoric phenomenon of megafauna extinctions, which fossil evidence places in the Pleistocene epoch during a time when regional continental glaciation was increasing and temperatures were dropping. There are several suggestions for causation, or at least acceleration and completion, of the extinctions. Two continue to hold enough credibility to support paleontologists’ ongoing research. The first draws a causal connection between the extinctions and natural climate change -- it was getting colder, and the megafauna died out because their optimal climate disappeared. The other generally states that Ice Age humans hunted the beasts to extinction, killing them for both sustenance and sport.

No one has been able to produce evidence ruling either theory out. However, the scientific advantage appears to lie with the advocates of climate change being causative. Among the most interesting things about the competing theories may be their difference regarding whether or not humans are the most likely destructive agents -- some people seem to think that if humans could have done it, then humans almost certainly did do it. It is infantile egoism, writ large (if old).

And of course there is the current, consequential matter of so-called anthropogenic global warming -- more recently and, conveniently for its supporters, dubbed “climate change.” Now, Earth’s climate has been changing since as far back as geological methods allow us to look -- some hundreds of millions of years, by both direct and proxy evidence. There have been times in the geological past when there was no glaciation anywhere on Earth. Because there were no humans during those times, humans cannot have had anything to do with causing those changes.

Much of what is adduced to support a finding of human-caused climate change in the present Holocene epoch has been altered, if not deliberately leaked and/or outright faked, rendering most (if not all) of it highly suspect. Other evidence has been denied, such as the by-now nearly 20-year period without warming that should, following the scientific method, be considered to have ruled the entire theory out -- as Richard Feynman famously said, “It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.” I would expand his statement to include a theory’s being wrong if it does not agree with real-world evidence. Instead, frequent attempts are made to neuter the contrary evidence by claiming that the theory itself accounts for it, rendering the theory unfalsifiable -- it is credited with predicting everything that happens, so nothing can rule it out. There are a great many sources confirming these statements; daily high-level and in-depth updates, drawing on numerous research authorities on the matter, are available at the Watts Up With That blog, among others.

Most people have little understanding of geological time -- what John McPhee winsomely dubbed “Deep Time.” If you tell them you are talking about things that happened as recently as the 1500s, they will sense that this is about as far back as it gets, even when they know better. There are impertinent questions that would present themselves and force attention if they were not so roundly ignored. For example, how do the “warmists” account for the fact that there have been times in the geological past when there were no glaciers anywhere on Earth? In the Cambrian period, for example -- a span of over 50 million years that began roughly 540 million years ago -- the entire Earth was warm enough to prevent any glacier formation. Few people ever bother to point this fact out, much less to ponder accounting for it. There were no humans then, and no attendant fossil-fuel-sourced carbon entering the atmosphere.

As we see in this abbreviated presentation of examples, there is evidence that humans tend to see themselves as the causes of a great many natural-world disruptions -- a perception that opens the door to political pressures imposing behavioral control on humanity. I suggest we may be looking at a kind of infantile egoism extending into individual and collective maturity and throughout our species; claims of human agency as causative should accordingly be looked upon with skepticism. The view seems to be that if it were not for human activity, the natural world would not change much, and certainly not much for the worse. When challenged with specifics on such matters, most people would be quick to acknowledge that there are indeed other engines of destruction, but there is a default tendency to blame people, and the things they make and do, first. Any other possible causes are generally evaluated later -- not only after human agency has been ruled out, but also after media attention on the matter in question has faded.

Some few of these people almost certainly know better. We can only speculate about their motives. I myself find it almost incomprehensible that they think their own accumulation and exercise of power will be unimpeded by reality. In similar fashion, it is difficult to believe that they can be motivated by the gaining of wealth. They already have wealth in nearly monopoly quantities, and while it is clear that they are not merely interested in living extravagantly and intend, instead, to purchase more power, they face few serious opponents in this world for that, either. They come perilously close to being whisperers -- the kinds of faint, seductive whisperers who inspire humanity with, “You will be like God!”

Others, a greater number, think they are indeed like God. They think they understand science; they think science tells them how things must be by telling them how things are.

Betsy Gorisch is a professional geologist with an interest in current events.
