The scientific method and ‘real science’

The following is a comment left by a reader at Vox Popoli about a year ago, in response to another reader who was concerned about the current state of science. I had written the following response with the intention of posting it here, and then forgot about it. Surak is about to offer some commentary on a disturbing development in science that bears on this, so I figured now was a good time to dig it up and post it.

To answer your question as to whatever happened to the scientific method, here’s the shocking truth: Science does not operate according to the scientific method unless there’s a crisis. Never did.

Science, just like every other avenue of human endeavor (why should it be different, honestly?) operates under the thrall of a power structure. Always has.

The scientific method only applies when challenges come up against prevailing paradigms. Then it is utilized, and don’t be a fool; understand that every effort is made, always, to doom the challenger and to favor the prevailing paradigm.

The great merit of the scientific method is that under these rare conditions reason and proof hold sway. But please do not be so foolish as to assume that science is governed by the scientific method on a day-to-day basis, because it is not.

Science is governed by egos. And nothing more.

It is true in a grand Kuhnian sense that crisis precedes advancement. It is also true that egos are a factor in science. But so what? Science is the triumph of the human mind over ego and a multitude of other human failings—limited perspective, misleading emotions, dominant philosophies that act as closed boxes, and the corrupting effects of the universal desire for fame, fortune, and/or political power. The scientific method is the means by which these frailties are remedied. Since these obstacles to advancements in knowledge will always be with us, there will always be a turbulent interplay between human nature and the pursuit of science.

The key element of the scientific method that keeps it from flying off in the direction of wild, unsubstantiated speculation is the peer-review process. If you want to know if the scientific method is alive and well in any branch of science, simply observe how rigorously the peer-review process is being used. I go through the peer-review process on several levels every time I submit a research paper for publication.

The first hoop I have to jump through is the judgment of the referee assigned by the journal in which I hope to have my paper published. The most important thing the referee does is check how well I have accomplished the observe -> hypothesize -> predict -> test -> theorize part of the process. If the judgment is that my work is scientifically sound, the paper is published. Then the whole body of my profession passes judgment on my work by deciding whether or not to cite it. At the next level of the peer-review process, decisions are made about which scientists are deserving of funding, tenure, and promotions. At the final level, judgments are made about which work is deserving of awards. The end result of this in physics is a steady advance in knowledge in which occasional detours from truth are corrected and dead ends are usually recognized and reversed.

I accept that there are some areas of science in which the scientific method does not currently function as it should. So-called “climate change science” is the most obvious example of science being corrupted by politics, money, and dogma. Surak will have something to say about this soon with regard to a disturbing development in this field. Meanwhile, there is a simple test one can apply in this regard: any time the name Al Gore or the terms “scientific consensus” and “the debate has been settled” are used in regard to any branch of science, it has undoubtedly strayed from the scientific method.

Biology certainly suffers from an ego problem to the extent that it is nearly impossible to get a mainstream biologist to utter the words, “Darwin was wrong about some important things.” He was wrong about some important things, and a paradigm shift is long overdue in the field of evolution. But, it must be acknowledged that a multitude of biologists are doing very good work that is firmly based on the scientific method.

The real test of any field’s application of the method is whether that field petrifies into dogma or if it routinely accepts change. I must speak in defense of my field of physics/astrophysics. It has a long history that includes the initial establishment of the scientific method as well as continuous successful applications of its process. After the Copernican revolution and the invention of precision clocks, experimental methods were sufficiently advanced that it didn’t take all that long to accumulate enough evidence to overthrow old ideas and adopt new paradigms. To name but a very few examples: Newton’s uniting the heavens and the earth under one set of laws, Maxwell’s unification of electricity and magnetism, Poincaré’s relativity of time and space, Planck’s quantum, Hubble’s confirmation of other galaxies and the expanding universe, Einstein’s new view of gravity, Lemaître’s big bang theory, Zwicky’s dark matter, and the supernova teams’ accelerating universe.

You say this is rare, but how often do you think this is supposed to happen? How often can it happen on such a large scale? The Hubble/Lemaître paradigm is an especially important example of the scientific method working as well as it possibly can. Most physicists did not like the idea of a universe with a beginning, but the scientific method is so firmly established in physics that the vast majority of them accepted it once there was sufficient evidence to overcome all reasonable objections. Those who clung to the notion of the eternal universe for reasons of ego and non-scientific concerns were discredited for straying from the scientific path.

The application of the scientific method does not have to be perfect to be functional. My own everyday experience in the field of astrophysics has been that the method sometimes proceeds as the classic observe -> hypothesize -> predict -> test -> theory. But quite often it is something very different: observe -> huh? -> observe -> what the … ?! -> hypothesize -> predict -> test -> getting close to a theory! -> test again -> wait, what? -> OH! -> hypothesize -> test, and so on. As long as it is evidence- and prediction-driven throughout the confusion, that’s good enough.

As for the system being set up to doom the challenger, how else would you have it? That’s the way it should be, as long as this resistance is not rooted in ideology (e.g. “climate change science”). It’s not unlike a court of law, where the presumption should be the innocence of the accused and the burden of proof lies with the accuser.

Egos, admittedly, often get in the way of true science, but on the other hand I doubt science could proceed without them. Scientists will always be fully human and infinitely closer in nature to Captain Kirk than to Mr. Spock. The vast majority of people I work with are truly driven by a desire for truth, but also the competitive hope for recognition and reward (which is why science has always been a traditionally masculine endeavor). And yes, they also have an understandable instinct to protect the fruits of their labor.

The point of all this: do not confuse the inevitable imperfect application of the scientific method with its absence.

Zombie science

There’s a simple reason for the corruption of biology and the social sciences: these studies are not based on Christian beliefs and faith the way science originally was and must always be. Modern science developed in only one place—Christian Europe. If you look up the great pioneers of physics and astronomy, you will find that they were almost all devout Christians, from Copernicus to Galileo to Newton to Maxwell to Planck to Lemaître.

The one glaring exception was Einstein, but even he famously said, “I want to know His thoughts; the rest are details.” Even though Einstein was not Christian, he was the product of the Christian European culture that gave birth to science, and he was a willing participant in a process based on Christian principles:

But science can only be created by those who are thoroughly imbued with the aspiration toward truth and understanding. This source of feeling, however, springs from the sphere of religion. To this there also belongs the faith in the possibility that the regulations valid for the world of existence are rational, that is, comprehensible to reason. I cannot conceive of a genuine scientist without that profound faith. The situation may be expressed by an image: science without religion is lame, religion without science is blind. (Albert Einstein, 1941)

The prime motivation of Einstein and so many other great figures in science was to uncover divine truth and know the mind of God. People who feel they are doing God’s work are far less likely to succumb to human frailties and engage in activities that corrupt the search for truth. That tradition remains strong in physics, the original science. That is why the field of astrophysics was able to resist the degenerative effects of an increasingly atheist society. When the devout Lemaître conceived of the primeval atom (aka big bang theory) and demonstrated that the Genesis account of a universe with a beginning was scientifically sound, the stubborn resistance of scientists with a hatred for the idea of God was quickly overcome by the evidence.

The other branches of science have not fared as well. Atheists stole science from Christians in the mid- and late 19th century with the false social science of Marx and the behavioral science of Freud, as well as the misuse of Darwin’s theory of evolution and the gross misrepresentation of Christian scripture. Over the last century and a half, secular humanists have successfully alienated Christians from the scientific method the faithful created and taken over most of its areas of study. Physics still has a substantial minority of Christians (and people with a general belief in God), and much good work is still being done. The social and behavioral studies, on the other hand, are the tools of secular humanism and the zombies of the scientific world—active but not alive. Biology was bitten long ago and is gradually succumbing to the humanist infection. There is an easy way to tell a zombie biologist from a true biological scientist: ask him to say the following words, “Darwin was seriously wrong about some important things.” If he can’t bring himself to say this, you are speaking with one of the walking dead. Climate change ‘scientists’ are just garden-variety corrupt hacks who have sold out for money, prestige, and political favors. Bundle up for the coming ice age or thank the polluters for preventing it.

The lesson here is that the further any area of study is from the Christian foundations of true science, the more corrupt it is. The United States has been the source of a great deal of the productive science done in the 20th and early 21st centuries. It is also the most Christian of all developed countries. If atheists succeed in turning the United States into anything similar to what the formerly Christian European nations now are, science will die and humankind will experience a dark age.

Replay: The free frontier

Traffic’s up after the informal announcement of the publication of our Astronomy and Astrophysics curriculum, so we’re replaying some of our more important posts from the archives for our new readers.

Yesterday [April 12, 2011], on the 50th anniversary of the first man in space, The Atlantic featured an article by Jim Hodges lamenting the decline of American exceptionalism in space:

[In the 1960s] Americans didn’t talk of their exceptionalism. They did exceptional things, and the world talked about it. In many places around the world, in science labs and classrooms, the NASA “meatball” was as recognizable as the Stars and Stripes.

People remember that President Kennedy said, “I believe that this nation should commit itself to achieving the goal, before this decade [of the 1960s] is out, of landing a man on the moon and returning him safely to the Earth.”

Forgotten is that just before that challenge, he said this as a preamble to it: “I believe we possess all of the resources and talents necessary [to lead the world into space]. But the facts of the matter are that we have never made the national decisions or marshaled the national resources required for such leadership. We have never specified long-range goals on an urgent time schedule, or managed our resources and our time as to insure their fulfillment.”

The government is certainly not doing that now, and we can’t count on it to do these things ever again.

However, I do not see this as an occasion to despair. As well-intentioned as NASA has been, government almost always does things slower, costlier, and with less innovation than private enterprise. In fact, while government has been slashing NASA’s budget and scaling back its goals, private companies out in Mojave have been quietly innovating like crazy.

Twisted history

Alex Berezow and James Hannam systematically dismantle a post by the atheist evolutionary biologist Jerry Coyne, who manages to get nearly every one of his claims about science and religion wrong. Example:

Coyne:

If you think of science as rational and empirical investigation of the natural world, it originated not with Christianity but with the ancient Greeks, and was also promulgated for a while by Islam.

Berezow and Hannam:

This is only half-true. Science is a lot more than just reason and observation. You need experiments too. For example, the Greeks, following Aristotle, thought that heavy objects must fall faster than light ones. It takes two seconds to disprove that by an experiment that involves dropping a pebble and a rock. But for a thousand years, no one did. There didn’t seem to be much point in testing a theory they already thought to be true. That’s probably why the Greeks were so good at geometry, as Dr. Coyne notes, because progress in mathematics is largely based on reason alone.

I’ll further point out that Aristotle — hero of humanism and champion of reason — was wrong about nearly everything in terms of science, and the acceptance of his model of an eternal geocentric universe in particular held back progress in science for nearly two thousand years. Until it was revolutionized by a bunch of Christians.

Read the whole rebuttal.

The authors have not addressed all of Coyne’s claims because, as they point out, there is “an impressive amount of error and misunderstanding [in] a very small space.” He certainly manages to cram a lot of error into the following unaddressed point:

If religion promulgated the search for knowledge, it also gave rise to erroneous, revelation-based “scientific” conclusions that surely impeded progress. Those include creation ex nihilo, the Great Flood, a geocentric universe, and so on.

By all appearances, the universe was created ex nihilo. Physicists have struggled to explain the origin of the universe in a way that avoids an ex nihilo creation event, without success. As this Reasons to Believe article points out, the Bible describes a worldwide flood, not a global flood. A Great Flood, as described in Genesis, that wiped out all of human and animal life in the Mesopotamian region — the entire known world at the time — is scientifically plausible. And geocentric theory began with the ancient Greeks. I suppose you could say that since the Greeks were religious, religion is therefore responsible for geocentric theory, but that would be a gross oversimplification. In any case, since Coyne is lumping this in with other biblical conclusions, one can reasonably assume he’s pinning this one specifically on Christianity. But as we all know, Aristotle was responsible for promulgating the idea, which was later elaborated upon by Ptolemy. Yet the erroneous notion persists that Christians were to blame for this faulty cosmology. As with the Galileo and Bruno affairs, this is the result of atheist myth-making.

The more commentary I read from atheists, the more I’m convinced that these self-styled champions of fact and reason are anything but.

Christians must reclaim science

Modern science exists because of the Christian faith. That is a provable fact. So, why is there so much controversy over the supposed conflict between science and Christianity? In terms of explaining the atheist myth-making about this supposed conflict — having once been an arrogant atheist myself — I can tell you that it’s born of either total ignorance (as was the case with me) or the kind of hostility that makes a person blind to the truth or willing to distort it. In terms of Young-Earth Creationism, however, I’m still trying to figure that one out. Modern science is one of the many blessings of the Christian faith, and I can only surmise that YECs have allowed the atheists to frame the argument and have accepted a gross distortion — and outright omission — of historical facts.

The Stand to Reason Blog explains that, in contrast to atheist fables, science and Christianity go way back:

The myth begins with the notion of the “dark ages,” a time when the church suppressed education. It’s just not true. Scholarship was alive and well prior to Copernicus. In fact, scholars were working on heliocentric theories before Copernicus. He learned these in university and built on them when he published his final work. His theory didn’t emerge from a dark vacuum, but from rich science that had been nurtured in the universities, many of them established by the church.

In fact, as the article goes on to point out, the sociologist of religion Rodney Stark found that 50 of the 52 key figures of the scientific revolution were religious.

Hugh Ross goes even further and explains how the scientific method comes straight from the Bible:

The Bible not only commands us to put everything to the test, it shows us how. Christian scholars throughout church history, from early church fathers to present-day evangelical scientists, philosophers, and theologians, have noted a pattern in biblical narratives and descriptions of sequential physical events such as the Genesis creation account. Bible authors typically preface such depictions by stating the narrative or description’s frame of reference or point of view. In the same statement or immediately thereafter comes a listing of the initial conditions for the narrative or description. The narrative or the description itself follows. Finally, the author describes final conditions and conclusions about what transpired.

Furthermore, there is not just one narrative or description of physical events in the Bible. There are dozens. Because the Bible is inspired by God, for whom it is impossible to lie or deceive, these dozens must be consistent with one another. Therefore, each of these dozens of descriptive accounts can be used to test the validity of the interpretation of the others.

In the near future, I’ll be posting an article about the concept of linear time that’s necessary for the emergence of modern science, and how it comes from Christianity.

It’s simple: the pillars upon which modern science stands — the notion of scholarship as a form of true worship, the scientific method, and the concept of linear time / cause-and-effect — were all built by the Christian faith. As the influence of the Christian worldview wanes in the West, replaced by a worldview that actively hammers away at the pillars of science, so too will the quality of science diminish. This is why Christians must reclaim science instead of turning away from it.

Why are Americans skeptical of some scientific ideas?

A recent poll indicates that Americans are skeptical of evolution by natural selection, global warming, and the big bang theory. Surprisingly (for me, anyway), the biggest scientific loser is the big bang, with 51% of the respondents skeptical that the universe was created approximately 14 billion years ago.

Frankly, this astonishes me. There are sound reasons to be skeptical of the theory of evolution by natural selection (TENS) and anthropogenic (man-made) global warming, but the evidence and arguments for the big bang theory are excellent — and also consistent with the Bible, which is no small thing, since the poll indicated that religiosity is correlated with disbelief in the big bang.

So, why are most Americans skeptical of the big bang?

This is not a rhetorical question; it’s something I’m striving to understand. Some Christians make the argument that a literal interpretation of the Bible requires a young Earth and young universe, but it appears to me that this belief is inspired, or supported, by an argument against evolution. Many people who are skeptical of TENS (particularly evangelical Christians) believe that it requires billions of years to work; therefore, if the universe and the Earth are only thousands of years old, TENS doesn’t work. Never mind that billions of years can’t even begin to help TENS; that still doesn’t explain why more people are skeptical of the big bang than of evolution.

In any case, scientists are, understandably, distressed by these results. Randy Schekman, a Nobel laureate in medicine at UC Berkeley, said, “Science ignorance is pervasive in our society, and these attitudes are reinforced when some of our leaders are openly antagonistic to established facts.”

Schekman is both right and wrong. I’m confident that, if quizzed on why he or she disbelieves certain scientific ideas, the average individual would not be able to explain the best evidence and arguments for and against them. However, I don’t believe it has anything to do with leaders (presumably, he means religious and political leaders) being antagonistic to facts, but rather with a vocal minority of scientists and their advocates being openly antagonistic to religious belief.

The poll highlights “the iron triangle of science, religion and politics,” said Anthony Leiserowitz, director of the Yale Project on Climate Change Communication.

And scientists know they’ve got the shakiest leg in the triangle.

To the public “most often values and beliefs trump science” when they conflict, said Alan Leshner, chief executive of the world’s largest scientific society, the American Association for the Advancement of Science. [emphasis added]

Of course values and beliefs trump science in a conflict. Unlike science, values and beliefs comprise an entire worldview, one that has been around much longer than modern science and has been much more influential.

But there was a time, at the beginning of the era of modern science up until the mid-20th century, when the Christian worldview and science largely went hand-in-hand. In the 17th century, many if not most natural philosophers (what scientists were called at the time) were Christians, and they saw their work as glorifying God. Mitch Stokes, in his brief biography of Newton, writes:

According to metaphor, God has written two books—Scripture and Nature—and He is glorified by the study of either one. This view, this “belief in the sacral nature of science,” was prevalent among natural philosophers of the seventeenth century. As Frank Manuel, one of Newton’s most important twentieth-century biographers, says:

“The traditional use of science as a form of praise to the Father assumed new dimensions under the tutelage of Robert Boyle and his fellow-members of the Royal Society, and among the immediate disciples of Isaac Newton. … In the Christian Virtuoso, demonstrating that experimental philosophy [experimental science] assisted a man to be a good Christian, Boyle assured readers that God required not a slight survey, but a diligent and skilful scrutiny of His works.”

Although Newton’s intensity while pursuing his work ranges from humorous to alarming, it is put into a different light if we see it as a measure of his devotion to God. For Newton, “To be constantly engaged in studying and probing into God’s actions was true worship.” This idea defined the seventeenth-century scientist, and in many cases, the scientists doubled as theologians. [emphasis added]

There was only occasional conflict for scientists like Newton in the form of struggling to understand how certain aspects of nature are consistent with their interpretation of scripture.

The antagonistic sort of conflict we see today goes back at least as far as Thomas Huxley using Darwinian evolution to undermine Christian belief. Huxley knew TENS had insurmountable problems, but he saw it as a useful weapon to attack Christianity, which he despised.

Unfortunately, this sort of practice has become increasingly commonplace into the 20th and 21st centuries. Global warming isn’t by its nature useful as a direct attack on Christian belief, but it does represent an attack on the Christian ideal of limited government. The historical misuse of biology as a weapon against Christian belief began with Huxley and continues with modern biologists and their supporters — so much so that the public has little idea how much the most recent findings of evolutionary biology support the Christian view of creation. The misuse of physics to undermine Christian belief, however, is relatively new. I find it distressing not only because it is my field of study, but because the field of physics has historically led the way for the other sciences and represents the greatest scientific support for the Christian view of creation.

As a scientist — and irrespective of my Christian beliefs — I find the behavior of the attackers perplexing. The majority of Americans are either Christian or hold some general belief in a supreme being, so why do some scientists go out of their way to alienate a majority of people who support science by sending their children to universities and by paying taxes for government-supported science programs? At some point, they’re just not going to see the value of either. And they’re certainly not going to make the effort to become more literate in a topic that they’re told is in opposition to their faith. Modern scientists like Stephen Hawking who use their considerable scientific knowledge to attack religious belief are therefore doing a tremendous disservice to science. I don’t know what Hawking’s motivation is, but if he dislikes Christianity to the extent that he’s trying to undermine it, as Huxley did, then he is only indirectly realizing this goal and at the cost of eroding confidence in good science.

Poll results notwithstanding, big bang theory is good science — in fact, it is arguably the crowning achievement of modern science — and it is not only compatible with Christian belief, but in my opinion mandated by it. (I will expand on this in a future post.)

Meanwhile, there’s no use blaming political and religious “leaders” for the lack of confidence in science because, if history has taught us anything, it’s that they don’t tend to lead the way, but jump out in front of the direction in which people are already going. If good science is going to flourish in America, two things must happen: Christians must become scientifically literate — which is something I hope to encourage with my ministry — and scientists have got to stop the public antagonism toward Christian belief.

Self-correction in science

A common claim about the superiority of science over other ways of knowing is that science is self-correcting; science may take wrong turns from time to time, but it eventually finds its way back on the right road. As a supporter of science, I believe in the power of the scientific method; and generally speaking, it’s true that science self-corrects. However, it’s important to understand how human limitations—scientists are human, after all—sometimes undermine the process of self-correction.

Science will never give a full understanding of anything. All that we can hope for are useful approximations of the objective reality we hope is out there. Under ideal circumstances, science is certainly self-correcting in the sense that it provides a process for arriving at consistently closer approximations. But in too many instances, the self-correcting potential of ideal science cannot overcome common human frailties. The most famous example from the field of astronomy will illustrate this.

Physics, like all of the sciences, started out as ‘natural philosophy,’ which functioned as an integrated branch of the whole philosophy/religion of the ancient Greeks. Science in its rudimentary form was thus shackled to the Greek worldview that placed humans forever at the center of the universe and effectively limited scientific thought to what would become Ptolemaic theory. This geocentric view of mankind’s place in the universe also prevailed because it conformed nicely to what the ancients observed with their limited senses, and because it had a powerful appeal to human emotions that subsequent theories could never have.

It is testimony to the power of the human mind and the potential of science that at least one individual was able to overcome all of this and figure out a closer approximation of the truth. An ancient Greek astronomer named Aristarchus proposed a heliocentric universe in the 3rd century B.C. Unfortunately his hypothesis was quickly squashed by contemporaries who condemned his idea as impious and foolish—in other words, it didn’t conform to the dominant philosophy/religion of the day. The Copernican revolution did finally take place, 1,800 years later, but those who have faith that science is the best way (or only way) to know things shouldn’t take much comfort from this example. Yes, the scientific method was eventually successful, but the self-correction was at best tragically slow.

This example has some scary implications, because the weak sister of modern science, the study of human behavior, is currently at a stage comparable to physics 2,000 years ago and shows no signs of correcting itself. With all of the social and behavioral problems facing an increasingly complex and technological world, it is possible that modern society cannot survive another 2,000 years without viable theories of individual and group behavior. So, it is important that all of us who depend on science to solve (or at least mitigate) the world’s problems understand how the three major things that prevented physics from correcting itself for about 2,000 years—the debilitating effects of ideology, the limiting nature of human perspective, and the immense power of emotions to mislead—are still at work today preventing the newer branches of science from correcting themselves.

To appreciate the ways in which science’s ability to self-correct can be thwarted, one has to be very clear about the basics of the scientific method. They can be outlined in rudimentary form in the following manner:

  • Preliminary observations of some natural phenomenon are made
  • A scientist brainstorms possible explanations of what is observed
  • A workable hypothesis is formed
  • An organized plan for additional observations and experiments is made and carried out
  • If additional evidence for the hypothesis is found, it advances to a level of confidence higher than that of a hypothesis but lower than that of a theory (we can call it a conjecture)
  • The newly elevated conjecture is then submitted to the peer-review process, in which some scientists will find evidence to support the conjecture and others will try to tear it apart
  • If the conjecture survives the peer-review process and gains additional evidence and the support of a large number of scientists, it will eventually become accepted as a viable theory

The two parts of this process that make self-correction possible are the brainstorming and peer-review stages. Unfettered brainstorming makes it possible for scientists to consider all possibilities—that’s how we got Einstein and Georges Lemaître, the father of the big bang[1]. If religion or philosophy makes some ideas unthinkable, the brainstorming stage will be inhibited, and ‘unpopular’ possibilities will be missed. The humanist philosophy that dominates the behavioral and social science departments today is making self-correction in those fields impossible, just as the philosophy of the ancient Greeks once made physics impossible.

The peer review process makes it possible to challenge popular but false notions. In modern times this stage has become highly susceptible to the negative influences of politics and government funding. The controversy over global warming / climate change is a good example. Regardless of a person’s views on climate change, it should be deeply disturbing that one side of what should be a scientific debate has been corrupted by government funding into political advocacy. When any scientist becomes an advocate of policy, he is no longer a scientist, because science can only serve one master—the search for truth. It is even more disturbing that those on the other side of the scientific debate have been tagged with the vicious label of ‘deniers.’ Those who use this label in such a pejorative manner are trying to preemptively shut down the peer review process and mandate scientific orthodoxy.

It is dangerous, therefore, to assume that science has an inherent ability to overcome human failings to the point that we can depend on it to be self-correcting. That it can effectively reach that goal is demonstrated by the fairly rapid acceptance of big bang theory in the mid-20th century over the strong objections of those who were philosophically opposed to it. But what has only recently become true of a branch of science that is over 450 years old is not true of the newer sciences. Biology, psychology, and the social sciences are nowhere near the stage where self-correction is automatic.

Now, there is one assumption I’ve made in this discussion: that self-correction means we are making better and better approximations of reality. But what if there is not always an objective truth that we can approach through self-correction? Relativity and quantum mechanics suggest this might sometimes be the case, but that is a topic for another article.

[1] “Father” in more than one sense: Lemaître was also a priest.

Recommended reading:

  • The Structure of Scientific Revolutions by Thomas Kuhn

Christianity and the center of the universe

Not long ago, someone asked me if I’d seen the documentary, The God Who Wasn’t There (2005), which explores the “Jesus myth” and Christianity in general. It’s been out for several years, and despite the fact that it’s viewable for free on YouTube, I haven’t bothered to watch it, because it looks like an uninspired retread of common challenges to the Christian faith that tend to be very weak. However, from what I can tell, it does perpetuate one historical distortion that is worth refuting. From a partial transcript on IMDb, TGWWT puts forth the idea that it was primarily Christians who were wrong about the Earth-centered universe:

Narrator: The Earth revolves around the Sun. But it wasn’t always that way. The Sun used to revolve around the Earth. It was like that for hundreds of years, until it was discovered to be otherwise, and even for a few hundred years after that. But, ultimately, after much kicking and screaming, the Earth did, in fact, begin to revolve around the Sun. Christianity was wrong about the solar system. What if it’s wrong about something else, too? This movie’s about what happened when I went looking for Jesus.

Or, more likely, what happened when he went looking for anything but Jesus, but never mind. The problem with this statement is that it implies only Christians were wrong about the solar system, when the truth is that just about everyone was wrong about the solar system at one time or another. So why single out Christians? Without having seen the movie, I am fairly confident of the answer (hint: look at who appears in the movie). Unfortunately, the notion that the medieval Church was scientifically ignorant and held back scientific progress is a fairly easy misconception to perpetuate, because people who believe it are usually already eager to believe misconceptions about Christianity and/or they do not know enough to evaluate its validity.

I made a point to cover geocentric theory in my Astronomy 101 courses, so let’s explore what my college freshmen students knew about this subject that TGWWT’s writer/director Brian Flemming apparently did not (or did not want you to know about).

The geocentric model of the solar system, which places the Earth at the center of the universe, is an idea that is found in nearly every ancient culture. In Western Civilization, the idea is usually attributed to the ancient Greek philosopher Aristotle (384 BC – 322 BC), and was later systematized by the Alexandrian astronomer Claudius Ptolemaeus (aka Ptolemy, ~100 AD – ~170 AD). The geocentric model persisted for more than 1,700 years, and while medieval interpretation of biblical scripture seemed to loosely support the idea, its formulation had nothing to do with Christianity.

To understand why the geocentric model persisted for so long, I want you to place yourself, just for a moment, in the ancient world where there is no such thing as telescopes, astronauts, or satellites. Your only notion of the Earth’s place in the universe is based on what your human senses tell you about the apparent motions of the heavens. You notice that the Sun and Moon make daily journeys across the sky from east to west, and that the stars at night travel in the same daily east-west direction. The familiar constellations also seem to drift across the sky over the course of weeks and months. To your human senses, it appears that the Earth is stationary and that objects in the heavens move about it in very predictable cycles. Armed only with these observations, it is entirely reasonable to assume that the Earth is at the center of the universe.

We owe a tremendous debt of gratitude to the Greeks, who were the first to seek natural explanations for the phenomena they observed. This reliance on natural explanations heralded the birth of science. But what is science? It is actually a difficult concept to define. Most of us understand science to be the search for knowledge, but knowledge can be acquired by other means. The scientific method works by making observations and asking questions in a very systematic way. One observes a phenomenon in nature (say, the motions of the heavens) and posits an educated guess about the nature of the phenomenon (everything in the heavens orbits the Earth, which is stationary). This educated guess is referred to as a hypothesis. The hypothesis then makes a prediction (where objects in the sky will appear on a certain date), and one carries out tests or observations to determine how well the hypothesis performs. If the hypothesis fails the test or cannot account for new observations, then it must be revised or abandoned in favor of a new hypothesis.

One such test of the geocentric model came in the form of retrograde motions of the planets. The Greeks observed that a handful of objects in the heavens moved in a way that was different from the other objects. Their positions were not fixed like the stars, but appeared to wander against the starry background over a period of months. (The word “planet” comes from the Greek word for wanderer.) Stranger still, each planet periodically appeared to loop backward along its path. This retrograde motion is now understood in the context of the Sun-centered (heliocentric) model, but in ancient times it represented a significant challenge to the geocentric model. The challenge was resolved by placing each planet on a smaller circle, called an epicycle, which in turn rode along its larger orbit about the Earth. This was a key feature of the model put forth by Ptolemy, which is referred to as the Ptolemaic model.
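To see how an epicycle produces retrograde motion, here is a minimal numerical sketch. The radii and periods are illustrative values of my own choosing, not historical Ptolemaic parameters: the planet rides a small circle whose center travels a larger circle around a stationary Earth, and the apparent angle seen from Earth occasionally runs backward.

```python
import math

def ptolemaic_position(t, R=1.0, r=0.3, T_def=12.0, T_epi=1.0):
    """Planet position in a toy Ptolemaic model (illustrative values).

    The planet rides a small circle (the epicycle, radius r) whose center
    moves along a large circle (the deferent, radius R) centered on a
    stationary Earth at the origin.
    """
    # The epicycle's center travels along the deferent
    cx = R * math.cos(2 * math.pi * t / T_def)
    cy = R * math.sin(2 * math.pi * t / T_def)
    # The planet travels around the epicycle's center
    return (cx + r * math.cos(2 * math.pi * t / T_epi),
            cy + r * math.sin(2 * math.pi * t / T_epi))

# Track the apparent angular position as seen from Earth. Wrapped
# step-to-step differences reveal the direction of apparent motion.
diffs, prev = [], None
for step in range(200):
    x, y = ptolemaic_position(step * 0.05)
    angle = math.atan2(y, x)
    if prev is not None:
        diffs.append((angle - prev + math.pi) % (2 * math.pi) - math.pi)
    prev = angle

# Motion is mostly forward (positive), but it briefly reverses whenever
# the epicycle carries the planet backward faster than the deferent
# carries it forward.
print("retrograde intervals present:", any(d < 0 for d in diffs))
```

With circles like these, suitably tuned, Ptolemy could reproduce the observed loops without ever moving the Earth, which is why the model survived empirical scrutiny for so long.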

The Ptolemaic model persisted for almost two millennia because, clunky as it was, it made accurate predictions about the motions of the planets. Moreover, several key objections to the heliocentric model were unresolved. Centuries before Ptolemy, the Greek astronomer Aristarchus (310 BC – ~230 BC) proposed a Sun-centered solar system, but was ridiculed by his peers for it. First, the idea that the Earth was moving was counterintuitive, because of the apparent motions of the heavens. But the most significant objection was that stellar parallax was not observed. This is the apparent shifting of position of closer stars relative to more distant background stars, which must occur if the Earth is moving around the Sun. As this was not observed, it was reasonable for Aristarchus’ fellow Greeks to reject his idea.

Fast-forward almost two millennia to Nicolaus Copernicus (1473 – 1543 AD), who was a true Renaissance man. In addition to being an astronomer, he was also a physician, scholar, cleric, and military leader. Like Aristarchus before him, Copernicus went against popular sentiment and proposed a heliocentric system. There is evidence that Copernicus knew he was recycling Aristarchus’ ancient model, but his genius was in recognizing its potential as a much more elegant and compelling model than the geocentric model. It is true that Copernicus’ book stirred some controversy within the Church, but contrary to popular belief, the Church was not monolithically opposed to, but rather divided on, the subject of heliocentrism. Secular scientists at the time likewise held to the Aristotelian school of thought, and mostly rejected Copernicus’ ideas. There was good reason for this, as the major objections to the heliocentric model had not yet been overcome. In particular, since Copernicus used circular orbits for the planets, instead of what we now know to be elliptical orbits, the predictions of the Copernican model were less accurate than those of the Ptolemaic model. Heliocentrists also had to contend with the lack of observed stellar parallax, and there were still more objections based on Aristotelian notions about nature. For instance, long before Newton developed his laws of motion, Aristotle held that all objects naturally come to rest, which meant that if the Earth was moving it would leave airborne objects (birds, clouds, etc.) behind. It was not until Galileo anticipated Newton’s first law (objects in motion tend to stay in motion) with simple experiments and made some key observations with his telescope—among them, that the stars are too far away to observe parallax[1]—that these objections were overcome and the Copernican Revolution was solidified.

It is important to understand that there was as much objection to the Copernican model from secular scientists as from the Church. Perhaps more. (For instance, it was supposedly a secular rival who reported Galileo to the Inquisition, illustrating that scientific enterprise has always been a little cut-throat.) The objections of the Church were only partially founded on Christian doctrine, which was based at that time on interpretation of scripture that was consistent with the Aristotelian school of thought. There is, in fact, nothing in scripture that dictates an Earth-centered system. The politics of the time also complicated things, with the Catholic Church struggling to come to grips with the tremendous effects of the Reformation. The most influential figure of the Reformation, Martin Luther, strongly objected to the ideas of the “upstart astrologer” Copernicus, and the Catholic Church was anxious not to be outdone by Protestantism on such an important issue. It is also important to understand that Copernicus was eventually shown to be incorrect in his placement of the Sun at the center of the universe; we now understand that there is no ‘center’ to the universe, an idea that is difficult for many people to accept.

What can we conclude from all of this? We can conclude that the most important factor preventing widespread acceptance of the heliocentric model was simple human nature. As clever as we sometimes are, we are constrained by limited perspective and emotion. Limited perspective prevented scientists from perceiving the stellar parallax that was predicted by the heliocentric model. Human emotion means cherished ideas often have a powerful hold on people, especially when it comes to accepted ideas that have served mankind well for many centuries. Put these two constraints together and you have the very non-linear progression from old ideas to new ideas that is evident throughout human history.

Having not seen TGWWT, I can only surmise from the partial transcript that either Flemming knows very little about scientific history, classical thought, and theology, or he is being deliberately disingenuous to make Christians look bad. Which is unfortunate, because, with just a few changes to the quote from the transcript, I think we could have turned his movie into a much more interesting narrative on the fallibility of human reason:

Narrator: The Earth revolves around the Sun. But it wasn’t always that way. The Sun used to revolve around the Earth. It was like that for hundreds of years, until it was discovered to be otherwise, and even for a few hundred years after that. But, ultimately, after much kicking and screaming, the Earth did, in fact, begin to revolve around the Sun. Mankind was wrong about the solar system, but eventually figured it out. What is it today that we don’t yet understand that will be obvious to mankind hundreds of years from now? Let’s speculate…

[1] With the advent of larger and more sophisticated telescopes, stellar parallax was indeed observed.

Recommended reading:

  • What’s So Great About Christianity by Dinesh D’Souza

Questions from Christian Students, Part 8

Sarah was recently invited, along with two other scientists, to take part in a panel discussion for a group of mostly Christian students. After the main discussion, students were invited to submit questions via text message; there was very little time to address them, so only a few were answered. The questions were quite good, so over the next few weeks, Surak and Sarah will answer most of them here. All of the questions are listed in the Intro to this series. See also: Part 1; Part 2; Part 3; Part 4; Part 5; Part 6; Part 7

What is the most important piece of knowledge you have come to learn about evolution since becoming a believer?

Darwin was a great scientist and pioneer in the field of biology. In that regard he is similar to Copernicus in the field of physics. Physicists honor Copernicus, but they also recognize and readily admit his shortcomings. He helped accomplish the first great paradigm shift that set science on its present course of discovery. What a great thing to do! But he was wrong about some important things; the Sun is not the center of the universe, and planets do not travel in perfectly circular orbits. We forgive Copernicus for his mistakes, because he did his work before the basic tools and higher mathematics of astronomy were developed. How could he be expected to get everything right four hundred years ago?

Darwin worked under similar limitations, decades before the revolutionary discoveries of the Burgess Shale fossils and genetics. So, once again, how could the great pioneer in the field of biology have gotten everything right all the way back in the mid-1800s? That would not be a fair expectation on the part of either supporters or detractors of Darwin.

The main principles of Darwinism are common descent, random mutation, natural selection, and gradualism. Each of these components is a necessary part of current evolutionary theory, which is important because people often mistake just one of these parts—the common descent of all animal life—for the whole of evolutionary theory. It is true that common descent has all but been ‘proven,’ about as well as any scientific belief can be proven, and there can be little remaining doubt about it. But the massive evidence in favor of common descent neither establishes the truth of evolutionary theory as a whole nor undercuts Christian beliefs. The most that can be said is that Darwin’s championing of this principle counts as a great success of the same magnitude as Copernicus’ heliocentric theory.

But, Darwin, just like Copernicus, got some things wrong. The fossil evidence does not support gradualism. In the words of one of the most respected biologists of modern times, Niles Eldredge, curator at the American Museum of Natural History in New York City: “The fossil record we were told to find for the past 120 years does not exist.”

It was Darwin who told biologists what to expect in the fossil record, and this mistake was a significant failure on his part.

In fact, the fossil evidence contradicts Darwin so badly, it compelled Eldredge and his more famous partner, Stephen Jay Gould, to offer something they called ‘punctuated equilibrium’ as an alternative to strict Darwinism. There are other serious problems with Darwinism (the mathematics of random mutations doesn’t work and there is a fatal lack of empirical evidence for natural selection), but, for the sake of brevity, it is enough to say that without gradualism Darwinism is seriously undone. In other words, Darwin was wrong about at least one major thing.

In light of this, the most important thing any scientist can come to learn about evolution is that biologists are generally incapable of saying the following words: “Darwin was wrong.”

Physicists can say without hesitation that there were times when “Galileo was wrong, Newton was wrong, and Einstein was wrong.” In spite of their mistakes, Galileo, Newton, and Einstein are still considered giants in the field of physics. Why can’t biologists make the same and obviously true statement about Darwin? I believe it is because strict Darwinism has become anti-Christian gospel, and many biologists are betraying science by promoting and defending this dogma.

Would the discovery of intelligent life on another planet disprove the existence of God?

Why would God be limited to creating one group of beings with souls in just one part of a vast universe? While the Bible is addressed to and concerned about the conscious beings inhabiting the Earth, there is nothing in the Bible that says that life was created only on Earth. “In my Father’s house are many rooms” (John 14:2). See here and here for further discussion.

“All the evidence we have says that the universe had a beginning”

So says Tufts University physicist Alexander Vilenkin, who made this statement at a meeting in January in honor of Stephen Hawking’s 70th birthday. (I’m a little late getting around to this, but it’s worth commenting on.)

To fully appreciate the magnitude of this statement, consider that the prevailing view of cosmology for more than two thousand years was that of an eternal universe. This view began to change in the 1920s, when astronomer Edwin Hubble discovered that the spectra of most galaxies are redshifted, and the farther away a galaxy is from the Milky Way, the more its spectrum is redshifted. What this means in plain English is that almost all of the galaxies he observed are rushing away from each other, and those that are farther away are receding faster. Incredibly, it appeared the universe was not only changing, but expanding. If you imagine running the expansion in reverse, so that galaxies rush toward one another as you go back in time, you end up with a point at which the expansion started — a beginning in time and space.
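Hubble’s observation leads directly to a back-of-the-envelope age estimate: if the expansion rate had always been what it is now, the time since everything was together is simply one over the expansion rate. A minimal sketch using an approximate modern value of the Hubble constant (the numbers are my assumptions, not figures from the discussion above):

```python
H0_KM_S_PER_MPC = 70.0        # Hubble constant, approximate modern value
KM_PER_MPC = 3.086e19         # kilometers in one megaparsec
SECONDS_PER_GYR = 3.156e16    # seconds in one billion years

# Hubble time: 1/H0, with units converted so km and Mpc cancel
t_seconds = KM_PER_MPC / H0_KM_S_PER_MPC
t_gyr = t_seconds / SECONDS_PER_GYR

print(round(t_gyr, 1))  # roughly 14 billion years
```

The crude constant-rate estimate lands remarkably close to the age obtained from detailed cosmological models, which is part of why the big bang picture is so compelling.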

Belgian physicist and priest Georges Lemaître anticipated this discovery with what he called the “hypothesis of the primeval atom,” based on his solution to the Einstein field equations. The universe’s beginning was predicted to have been very energetic and violent, and was therefore dubbed the “big bang.” Four decades later, physicists Arno Penzias and Robert Wilson discovered the predicted afterglow of this big bang, which eventually earned them Nobel prizes. By the early 1990s, sophisticated satellites were mapping the tiny fluctuations in the intensity of the big bang afterglow, which allowed physicists to calculate an age for the universe. By the end of the 20th century, there was near-consensus that the universe had a beginning that occurred some 11-17 billion years ago. (The cosmological model-based number is ~14 billion years.)

The big bang has had its detractors. It was astrophysicist Fred Hoyle, out of deep skepticism for the idea, who sarcastically applied the term “big bang” to this cosmological model. (Let it not be said that physicists are overly sensitive — the term stuck and has been used in all seriousness ever since.) Hoyle’s collaborator, astrophysicist Geoffrey Burbidge, famously ridiculed physicists who had hopped on the big bang bandwagon as “rushing off to join the First Church of Christ of the Big Bang.” There were two reasons scientists reacted this way. First, some scientists found the idea of a universe with a beginning uncomfortably close to the Genesis account of creation. Second, from the point of view of physics, mathematics, and philosophy, a universe with a beginning is far more messy to deal with than an eternal universe, which requires no explanation. Even so, the evidence for a beginning is now so overwhelming that most physicists have come to accept it, and the big bang has become the prevailing paradigm governing all of physics.

Nevertheless, some physicists did not give up on the idea of an eternal universe; instead, the focus shifted to devising sophisticated models of an eternal universe that fit the observed data — in other words, an eternal universe that incorporated key features of the big bang model. Some of these features are explainable by invoking what’s called inflation, which refers to an early period of exceedingly rapid expansion. This idea was proposed by Alan Guth in the 1980s, and it can also be applied to an eternally inflating universe in which regions of the universe undergo localized inflation, creating “pocket universes.” This inflation continues forever, both in the past and into the future, and so in a sense it represents an eternal universe. Another idea was the cyclical universe, which posited that the universe is eternally expanding and contracting. In this way, the big bang that occurred 14 billion years ago would be just one of an infinite number of big bangs followed by ‘big crunches.’

All of the evidence indicates ours is a universe undergoing perpetual change. To replace Aristotle’s age-old idea of an eternal, unchanging universe, physicists came up with hypothetical eternal universes that were perpetually changing. This was an ingenious approach, but as Vilenkin announced last month, these models just don’t work. Guth’s idea turns out to predict eternal inflation in the future, but not in the past. The cyclical model of the universe predicts that with each big bang, the universe becomes more and more chaotic. An eternity of big bangs and big crunches would lead to a universe of maximum disorder with no galaxies, stars, or planets — clearly at odds with what we observe.

As the journal New Scientist reports, physicists can’t avoid a creation event. Vilenkin’s admission exemplifies the reason physics is the king of all the sciences — physicists are generally willing to admit when their cherished ideas don’t work, and they eventually go where the data and logic lead them. Whether this particular realization will pave the way to serious discussion of God and consistency with the Genesis account of creation remains to be seen. Physicists can be a stubborn bunch. As Nobel laureate George P. Thomson observed, “Probably every physicist would believe in a creation if the Bible had not unfortunately said something about it many years ago and made it seem old-fashioned.” Still, some physicists are open to the idea. Gerald Schroeder, who is also an applied theologian, has written profoundly on the subject. His book, The Science of God, is an illuminating discussion of how the Bible and biblical commentary relate to the creation of the universe.