Present-day science has serious limitations which most scientists ignore. The belief that science will eventually explain all there is to explain about Nature is a matter of faith that has no solid factual basis.
The title of this article is derived from a quotation by Sir John C. Eccles, a distinguished Australian neurophysiologist who was awarded the 1963 Nobel Prize in Physiology or Medicine for his pioneering work on synapses – the terminations of nerve cells on one another. Eccles was a Roman Catholic and a dualist, that is, a believer in the separation between mind and brain. On the human condition, Eccles states: “I maintain that the human mystery is incredibly demeaned by scientific reductionism, with its claim in promissory materialism to account eventually for all of the spiritual worlds in terms of patterns of neuronal activity. This belief must be classed as a superstition.”1 This is a pretty strong opinion that many would find quite unjustifiable. On the other hand, it cannot be denied that some claims by scientists are also unjustifiable. This article examines such claims in the context of the limitations of present-day science.
It must be stated at the outset that scientific thought and progress are assuredly among the greatest achievements of humanity, if not the greatest. Apart from expanding knowledge, science – through engineering – has spawned technologies of immense benefit to humans, as in diagnostic and therapeutic medicine (MRI, X-rays, vaccines, medications, etc.), agricultural technologies (mechanization, increased crop yields, etc.), and industrial products (airplanes, cars, household appliances, computers, mobile phones, etc.). It is the fault of humans, not of science, that some technologies, such as nuclear technology and genetic engineering, have been abused.
Basically, science seeks to understand natural phenomena. According to the Science Council (UK), “Science is the pursuit and application of knowledge and understanding of the natural and social world following a systematic methodology based on evidence.”2 What distinguishes science from other means of acquiring knowledge is a systematic scientific method that, in essence, involves the following steps:
(i) objective observation of a phenomenon, that is, observation that is not biased in any way;
(ii) formulation of a hypothesis that provides a plausible explanation of the observed phenomenon;
(iii) experimental, measurement-based testing of deductions and predictions that follow from the formulated hypothesis; and
(iv) rejection, modification, or generalization of the formulated hypothesis based on the experimental results.
It follows that phenomena amenable to scientific investigation, within the present framework of science, must be observable and repeatable, eventually leading to an explanatory hypothesis that allows testable deductions and predictions, or falsification. Repeatability is required not only to enhance the reliability of testing through repetition but also to enable other scientists to replicate the experiments and validate the results. However, this places a severe restriction on the range of phenomena that can be scientifically investigated. Evidently, science cannot prove or disprove the existence of a divine being, for example, as neither assertion can be tested experimentally.
In talking about science, it is important to distinguish between science as an institution, an ongoing work in progress conducted according to scientific methodology, and the current state of scientific knowledge, that is, present-day science. John Wheeler, a distinguished American theoretical physicist, postulated three eras of physics, a solidly empirical basic science:
(i) Era I physics, from the earliest times to that of the Italian scholar Galileo Galilei, who lived in the 16th and 17th centuries CE. During this era, natural phenomena were described, such as the attraction or repulsion between objects that had been rubbed together, but the laws they obeyed were not formulated. In describing the motion of objects, Galileo postulated that an object projected along a horizontal frictionless plane will continue in uniform motion if not disturbed. This contradicted Aristotle’s view, generally accepted at the time, that objects not acted upon by some force will come to rest. Galileo, however, did not formulate any general laws governing the motion of objects.
(ii) Era II physics, starting with the English scientist Isaac Newton, recognized as one of the most influential scientists of all time, with contributions in many fields, including mathematics, physics, astronomy, and theology. Newton was born in 1642, the year Galileo died. Newton made Galileo’s observation about the motion of objects more precise and generalized it into what became known as Newton’s First Law of Motion. But he went further. Inspired by a falling apple, according to legend, Newton realized that the apple’s velocity increased as it fell, that is, its motion was accelerated. In line with the first law, this required a force, albeit an invisible one. This contrasted sharply with Aristotle’s view that objects fell because they were naturally attracted to the earth, the center of the cosmos, with no force involved. Not only did Newton postulate a force of gravity, based on the knowledge accumulated hitherto, but he also argued that since heavy and light objects fall at the same rate in the absence of air resistance, as Galileo had observed, that is, they have the same acceleration, the force F must be directly proportional to the mass m, so that F = ma, where a is the acceleration. This became known as Newton’s Second Law of Motion. Newton further proposed what became known as Newton’s Universal Law of Gravitation, because it applied to all objects, from apples to planets. According to this law, every object attracts every other object with a gravitational force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between their centers (a brief numerical illustration follows this list). Newton verified the laws he formulated by applying them to the motion of the planets, as described early in the 17th century CE by the German astronomer Johannes Kepler. Newton found that he could reproduce the astronomical data using his equations. Thus, Era II physics, which continues to the present, is characterized by formulating the laws that govern natural phenomena without explaining the meaning of those laws. When challenged to explain how his postulated force of gravity acts at a distance between isolated objects, Newton responded: Hypotheses non fingo (I feign no hypotheses). It was sufficient that the laws worked, that is, that they could be applied to particular problems to give the correct answers.
(iii) Era III physics, which according to Wheeler is yet to come, is the meaning physics, or recognition physics, which deciphers physical law, that is, understands and explains the why of physical phenomena rather than just the how of Era II physics.
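As a brief aside, the Era II point can be made concrete with a small numerical sketch (not part of the original argument): it applies Newton’s Universal Law of Gravitation and Second Law to the proverbial apple, assuming standard textbook values for the gravitational constant and for the Earth’s mass and radius, and recovers the familiar acceleration of roughly 9.8 m/s², independent of the apple’s mass. The laws give the right numbers but say nothing about why gravity acts at a distance.

```python
# A minimal illustrative sketch of Era II physics in action: the laws "work"
# without explaining why.  Standard textbook values are assumed for G and for
# the Earth's mass and mean radius.

G = 6.674e-11        # gravitational constant, N m^2 kg^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean radius of the Earth, m

def gravitational_force(m1, m2, r):
    """Newton's Universal Law of Gravitation: F = G * m1 * m2 / r**2."""
    return G * m1 * m2 / r**2

m_apple = 0.10  # kg (an illustrative value)
force = gravitational_force(M_EARTH, m_apple, R_EARTH)
acceleration = force / m_apple   # Newton's Second Law rearranged: a = F / m

print(f"Force on the apple:   {force:.3f} N")
print(f"Apple's acceleration: {acceleration:.2f} m/s^2")   # about 9.82, independent of m_apple
```

Doubling m_apple doubles the force but leaves the acceleration unchanged, which is exactly Galileo’s observation that heavy and light objects fall at the same rate.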
The distinction between Era II physics and Era III physics can be illustrated by a long-standing debate between two friendly adversaries, Niels Bohr and Albert Einstein. Bohr was a distinguished Danish physicist who, early in the 20th century CE, made foundational contributions to the theory of atomic structure and to quantum theory, which was advanced to explain phenomena at the atomic level that contradicted classical physics. Although Einstein was an early contributor to quantum theory, he was disturbed by some of its implications, particularly those of the prevailing Copenhagen interpretation championed by Bohr and his associates. But no matter how hard he tried to dispute the correctness or completeness of quantum theory, Einstein was unsuccessful. The predictions of quantum theory have always proved correct, no matter how weird they seemed. Bohr’s viewpoint was: “It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature.”3 On the other hand, Einstein sought to understand Nature: “I want to know how God created this world. I am not interested in this or that phenomenon, in the spectrum of this or that element. I want to know His thoughts; the rest are details.”4 Despite their apparent differences, both Bohr and Einstein were in a sense right about physics and, implicitly, about science in general. Bohr was really talking about present-day physics, Era II physics, which describes what we can say about Nature but does not explain it. Einstein was thinking of Era III physics, whose ultimate goal is to understand and explain Nature.
Because of its nature, present-day, Era II science has some serious limitations. First, it is fragmented, that is, lacking in unity, and therefore incomplete. For example, the motion of bodies is described by three distinct sets of laws depending on the scale: quantum mechanics at the atomic level, general relativity at the cosmic level, and classical mechanics, to a high degree of accuracy, for everyday objects. These laws are very different in nature, and all attempts to reconcile quantum mechanics and general relativity have so far been unsuccessful. Yet material objects all have the same basic constituents, so one would expect a single set of laws to apply irrespective of scale. Action at a distance, even in the case of gravity alone, also has completely different interpretations. Conventionally, it is explained in terms of fields, whose exact nature is conjectural. According to general relativity, gravity is a warping, or deformation, of space-time, a conceptual model that combines the single dimension of time with the three dimensions of space to give a four-dimensional space-time. Quantum field theory explains gravitational attraction in terms of virtual particles that form for extremely short periods of time and then disappear again, even in a perfect vacuum; the reality of these virtual particles, however, remains controversial.
Second, to account for observations such as the accelerating expansion of the universe and the high orbital speeds of stars in the outer parts of galaxies, cosmologists postulated the existence of dark energy and dark matter, whose nature is still unexplained. Surprisingly, it has been estimated that the universe consists of about 70% dark energy and about 25% dark matter, leaving only about 5% for all the observed mass and energy of the universe, including Earth and all life on it, which is what present-day science, for all its glory, actually deals with. Even then, it would be some solace if present-day science could explain everything there is to know about this 5%, but sadly this is not the case. So much is still unknown about us, human beings, and our tangible world. Still very mysterious, for example, are the workings of the human brain, the nature of consciousness, and the essence of life.
A very basic question is: do humans have the mental capacity to understand everything about Nature? It is well known that some animals surpass us in various physical abilities. Dogs have a much stronger sense of smell, eagles have greater visual acuity, mice can hear higher frequencies, birds perceive ultraviolet light, sharks and some fish respond to electric fields, chimpanzees are considerably stronger physically, etc. If our physical endowments are limited, why should our mental capacities be considered unlimited? With limited mental capacity, any science produced by humans will necessarily be of limited scope and incapable of unraveling all of Nature’s mysteries. Yet there is an almost pervasive belief among scientists that science will eventually be able to figure it all out. It is a comforting belief, but it is not based on facts, which are the very foundations of science. It is more a matter of faith, much like the religious faith that most scientists reject as unsubstantiated. It rests essentially on the extrapolation of scientific achievement – which has unquestionably been great – into the vast unknown. However, extrapolation can be quite misleading, as the following historical example illustrates.
In 1791, Luigi Galvani, an Italian doctor, Professor of Anatomy, and researcher, postulated the existence of animal electricity based on his investigation of the contraction of frogs’ legs when exposed to a source of electricity. His nephew, Giovanni Aldini, who was also his research assistant, was a bit of a showman. He travelled all over Europe demonstrating to the public how a fresh corpse could be made to jerk an arm or a leg upon application of a sufficiently high voltage – the familiar electric shock that people experience when they accidentally touch a conductor at the mains voltage, for example. Audiences were enthralled by these demonstrations. Here was a dead body moving its arm or leg under the influence of electricity; surely, it was thought, it was only a matter of time before the dead body would be made to come back to life, that is, before humans could be raised from the dead. People were extrapolating from what they could see with their own eyes to what they imagined could happen, but without knowledge of what was really involved in this extrapolation. The jerking of an arm or leg is now well understood to be a local phenomenon: the passage of an electric current through a muscle in good condition elicits a contraction in that muscle. Bringing a dead body back to life is an entirely different proposition that requires, first, an understanding of the nature of life and, second, the ability to instil life in dead cells. This of course has not yet been realized and is unlikely to be realized in the foreseeable future. All sorts of similar extrapolations are currently being made about new forms of life to be produced in the laboratory, about bringing back to life dead human bodies that have been kept at extremely low temperatures for this purpose, or about robots having human-like consciousness.
A favourite fixation of many scientists is ascribing matters to chance, such as chance encounters of atoms and molecules under appropriate conditions to produce life, chance mutations of genes, etc. This seems like a satisfying and convincing explanation, but is it? Consider a typical example of a chance event, the tossing of a coin. The probability of a head (or a tail) over a large number of trials is taken to be 0.5, as a matter of chance. But in fact, if one were able to account for all the factors that influence the outcome, from the moment the coin is tossed until it comes to rest, it would be possible, in principle, to determine with certainty the outcome of any given trial. In other words, there is nothing intrinsically chancy about the outcome of tossing a coin. Ascribing the outcome to chance in such cases is not because chance is a fundamental mainspring of these events. Rather, it is essentially a means of avoiding having to consider influencing factors that are difficult to account for, or of bypassing unknown factors altogether. Moreover, relegating such matters to chance has an air of finality that is almost anti-scientific, in that it stifles the search for alternative explanations.
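To make the coin-toss point concrete, the toy sketch below (purely illustrative, not a physical model) uses a deterministic rule in which the face showing at the end is fixed entirely by the coin’s initial spin rate and flight time; the familiar near-50/50 statistics emerge only because those initial conditions vary slightly and go untracked from toss to toss.

```python
import random

def coin_outcome(spin_rate_hz, flight_time_s):
    """Toy deterministic rule: the coin starts heads up, and the face showing at
    the end depends only on how many half-turns it completes while in the air."""
    half_turns = int(2.0 * spin_rate_hz * flight_time_s)
    return "tails" if half_turns % 2 else "heads"

# Identical initial conditions always give the identical outcome -- nothing chancy.
print(coin_outcome(20.0, 0.45))   # the same result every time for these inputs

# Small, untracked variations in how the coin is flicked are what produce the
# familiar near-50/50 statistics over many tosses.
trials = 100_000
heads = sum(
    coin_outcome(random.uniform(18.0, 22.0), random.uniform(0.40, 0.50)) == "heads"
    for _ in range(trials)
)
print(f"Fraction of heads over {trials} tosses: {heads / trials:.3f}")   # close to 0.5
```

The 0.5 here reflects our ignorance of the initial conditions rather than any intrinsic property of the coin, which is precisely the distinction drawn above.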
Before ending this discussion, two reminders from the history of science should be mentioned. First, it is to the credit of science, as a work in progress, that scientific theories are never final but are subject to change or even rejection, even though they may be embraced by almost all scientists for a very long time. An example is the ether postulated by the French philosopher, mathematician, and scientist René Descartes in 1644 to explain action at a distance, such as that between magnetized bodies or the tidal motion due mainly to the Moon and, to a lesser extent, the Sun. He conceived of the ether as a whirlpool of rotating chains of particles that pervades all space and matter and transmits actions between bodies that are not in direct contact. Belief in the ether was pervasive for almost 250 years before it was finally abandoned early in the 20th century CE. Another example is Newtonian mechanics, which was universally adopted for almost 200 years before it was superseded early in the 20th century CE by quantum mechanics at the atomic level and by Einstein’s general relativity at the cosmic level.
Second, an idea may be ridiculed by scientists for a long time before it becomes scientific. A good example is the postulate of the ancient Greeks that disease could spread from a sick person to a healthy one by some invisible “seeds of plague”. That was considered totally absurd; how could disease spread by something intangible and invisible to the naked eye? More than two thousand years later, in the 19th century CE, the germ theory of disease was accepted following the work of the German physician Robert Koch and the French chemist Louis Pasteur.
In conclusion, a truly scientific outlook should shun conceited wishful thinking and should instead rest on a humble recognition of our limited mental capacity and of the limitations of present-day science. It is almost a truism that the more we know, the more we become aware of how much more we do not know. To paraphrase Wheeler,5 our knowledge is like an island surrounded by a sea of ignorance: as our island grows, so does our awareness of our ignorance (the perimeter of the island). We are still mired in Wheeler’s Era II of science. The question is: when and how will Era III be ushered in?
References
1. Eccles, J. C. (1989) Evolution of the Brain: Creation of the Self, London and New York, Routledge, p. 241.
2. ‘Our definition of science’, Science Council (UK) [Online]. Available at: https://sciencecouncil.org/about-science/our-definition-of-science/.
3. Petersen, A. (1963) ‘The philosophy of Niels Bohr’, Bulletin of the Atomic Scientists, vol. 19, no. 7, p. 12.
4. Todayinscience [Online]. Available at: https://todayinsci.com/E/Einstein_Albert/EinsteinAlbert-God-Quotations.htm.
5. Todayinscience [Online]. Available at: https://todayinsci.com/W/Wheeler_John/WheelerJohn-Quotations.htm.