Three Fundamental Limitations and Their Common Issues

Du Won Kang

PureInsight | October 13, 2003



Currently, there is a sense that modern science will continue to advance indefinitely and will eventually discover a complete and consistent theory of the universe. However, as much as modern science has been making great advances, it has also been discovering its limitations. As others have also realized, some of the greatest discoveries of modern science are the discoveries of its own limitations.

This paper will present fundamental limitations in three areas that influenced the development of modern science. Then we will see what those limitations have in common and introduce a way of understanding them.

A Limitation in Modern Physics

When a piece of matter is heated, it starts to glow, gets red hot, and at higher temperatures becomes white. For a long time, the known laws of radiation and heat failed to account for this common phenomenon. Max Planck struggled to provide a physical interpretation of the phenomenon at the atomic level. Finally, after intense work in 1900, Planck reluctantly concluded that a radiating atom can emit only discrete quanta of energy. He was reluctant about this conclusion because it goes against the well-established laws of classical physics, which impose no fixed constant on levels of energy. Later, Planck's conclusion about the quanta of energy became an important foundation of quantum theory, and it was only the beginning of the conflicts between quantum theory and the more intuitive classical theory of Newton. Classical mechanics is closely related to our everyday experience of the world. However, atoms and subatomic particles seem to have mysterious characteristics which are very different from our ordinary experience of the world. From persisting anomalies and accumulating experimental data that contradicted classical mechanics, physicists were forced to make a radical departure from the classical physics of Newton and to venture onto a long and winding road toward quantum mechanics.

Werner Heisenberg wrote,

"I remember discussions with Bohr which went through many hours till very late at night and ended almost in despair; and when at the end of the discussion I went alone for a walk in the neighboring park I repeated to myself again and again the question: 'Can nature possibly be as absurd as it seemed to us in these atomic experiments?'" (From Physics and Philosophy, p42)

Nevertheless, in spite of conceptual difficulties, quantum mechanics has become one of the most successful formalisms in modern science. In principle, quantum mechanics can describe a myriad of physical phenomena and chemical properties of matter to incredible accuracy. And its applications have greatly influenced the development of our modern, technological society. Michio Kaku, a professor of theoretical physics, wrote,

"The consequences of quantum mechanics are all around us. Without quantum mechanics, a plethora of familiar objects, such as television, lasers, computers, and radio, would be impossible. The Schrödinger wave equation, for example, explains many previously known but puzzling facts, such as conductivity. This result eventually led to the invention of the transistor. Modern electronics and computer technology would be impossible without the transistor, which in turn is the result of a purely quantum mechanical phenomenon." (From Beyond Einstein, p40)

The enormous success of quantum mechanics comes from its formalism, which accurately describes a myriad of phenomena of microscopic things. And it is also in that microcosm that quantum mechanics has fundamental limitations.

A central feature of quantum mechanics is Heisenberg's uncertainty principle. According to this principle, it is impossible to simultaneously measure both the position and the momentum of an atomic or subatomic thing with arbitrary precision. The more accurately the position is measured, the less accurately the momentum can be measured, and vice versa. If the position is measured with absolute accuracy, then the momentum becomes completely unknown, and vice versa.
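In symbols, this trade-off is commonly expressed as a relation between the uncertainty in position, Δx, and the uncertainty in momentum, Δp:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

Here ħ is the reduced Planck constant. Because ħ is extremely small (about 1.05 × 10⁻³⁴ joule-seconds), the trade-off is negligible for everyday objects but decisive for atoms and subatomic particles.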

Although Werner Heisenberg introduced the uncertainty principle in 1927, it is just as relevant today. The inability to accurately measure both the position and momentum of microscopic things is not due to some limitation with current technology. According to many physicists, this is an inherent limitation, which cannot be resolved by any future advances in technology. Michio Kaku wrote,

"The Uncertainty Principle makes it impossible to predict the precise behavior of individual atoms, let alone the universe." (From Beyond Einstein, p44).

And according to Brian Greene, who is one of the world's leading string theorists, future advances in string theory will have to incorporate the uncertainty principle in order to become a complete theory that accounts for observable quantum phenomena. Brian Greene explains that the uncertainty principle is not just an issue of disruptions caused by measuring techniques:

"Even without 'direct hits' from an experimenter's disruptive photon, the electron's velocity severely and unpredictably changes from one moment to the next... Even in the most quiescent setting imaginable, such as an empty region of space, the uncertainty principle tells us that from a microscopic vantage point there is a tremendous amount of activity... Even in an empty region of space... the uncertainty principle says that the energy and momentum are uncertain." (From The Elegant Universe, p119)

Werner Heisenberg believed that the uncertainty principle arises from the dualism between the wave description and the particle description. This dualism is not just embedded in the mathematical scheme of quantum mechanics; the duality can also be inferred from simple experiments. Experiments seem to demonstrate that atomic and subatomic things have the characteristics of both a particle and a wave. A particle occupies a small area in space and can collide with other particles, like solid objects. A wave, on the other hand, is spread out in space and can pass through other waves. The particle and wave descriptions appear to be opposite and conflicting notions. How can something be a particle and a wave at the same time? When a single electron is considered to be either a particle or a wave but not both, the result can be an incomplete explanation of the observed phenomena. On the other hand, when the particle and wave aspects are combined to form a complete theory of the observed phenomena, the result can be contradictions. According to Heisenberg, attempts to describe atomic events in terms of classical physics lead to contradictions because those microscopic things are not like the ordinary objects of our everyday experience.

In Newtonian mechanics, every object has a definite position and momentum at any given time, and the object will follow only a single path of motion according to mathematical laws. In other words, motion of matter is fully deterministic, where there is only one future outcome. When the position and momentum of an object are known, then its motion can be predicted with precise mathematical calculations. Newtonian mechanics has been very successful in describing and predicting planetary motions in the heavens as well as events on earth. However, it fails to describe the phenomena of atomic and subatomic events in the microcosm.

In contrast to the classical physics of Newton, atomic events are, according to Heisenberg, like the concept of potentiality in the philosophy of Aristotle: "a strange kind of physical reality just in the middle between possibility and reality." In quantum mechanics, atomic and subatomic events are described in probabilities or tendencies. Quantum mechanics introduced the concept of indeterminacy into the foundation of modern physics. This was a huge leap from the classical mechanics of Newton, which had dominated physics for centuries. And it was also a radical departure from the theory of relativity. Einstein rejected this interpretation of quantum mechanics on this very point of indeterminacy, saying that "God does not play dice." Heisenberg wrote,

"...the change in the concept of reality manifesting itself in quantum theory is not simply a continuation of the past; it seems to be a real break in the structure of modern science." (From Physics and Philosophy, p29)

Although quantum mechanics has been very successful, we must remember that quantum mechanics only describes and predicts observable physical phenomena; it does not describe the inner reality of physical matter. In fact, as quantum mechanics advanced, different and conflicting interpretations of quantum mechanics developed, even among eminent physicists.

One of the earliest interpretations of quantum mechanics is the Copenhagen interpretation, led by the Danish physicist Niels Bohr. According to this interpretation, "there is no deep reality", and atoms, electrons, and photons do not exist like objects in our everyday experience. A phenomenon fully comes into existence only when it is observed. Bohr said, "There is no quantum world. There is only an abstract quantum description."

On the other hand, Einstein was a "realist": he believed that quantum mechanics is simply incomplete and that there is a hidden deterministic reality behind quantum phenomena that may be discovered in the future. Although Einstein was in a very small minority of physicists with this view, other eminent physicists who made great contributions to the development of quantum mechanics were realists too. Max Planck, who is considered the originator of quantum theory, believed in an objective world independent of any observer and adamantly opposed the indeterministic worldview of Heisenberg, Niels Bohr, and Max Born. Louis de Broglie, who is best known for his discovery of the wave nature of electrons, was aligned with the statistical interpretation for many years, but after struggling with it, he finally settled on a realist position. Erwin Schrödinger, who developed wave mechanics, was also a realist, and he devoted much of his later life to opposing the statistical interpretation of the quantum theory that he had done so much to create. Schrödinger said,

"Physics takes its start from everyday experience, which it continues by more subtle means. It remains akin to it, does not transcend it generically; it cannot enter into another realm. Discoveries in physics cannot in themselves – so I believe – have the authority of forcing us to put an end to the habit of picturing the physical world as a reality."

About a decade after the passing of Einstein, John Stewart Bell demonstrated that the realist position requires certain influences to travel faster than the speed of light in order to account for observable quantum phenomena. And since this contradicts the foundation of the well-established theory of relativity, many physicists reject the realist position.

In 1957, Hugh Everett introduced the many-worlds interpretation, which seems to resolve the quantum measurement problem. In the many-worlds interpretation, parallel universes are created for the different possible outcomes of each act of measurement. For example, when a coin is tossed, although we observe only one outcome, the other possible outcomes are supposed to occur in parallel universes that are instantly created. This interpretation is considered absurd by some notable physicists and philosophers.

These are only a small sample of the issues that arise in attempts to give a complete interpretation of quantum mechanics. There are many interpretations. Nick Herbert compared eight of them (including the ones mentioned above) and wrote,

"An astonishing feature of these eight quantum realities, however, is that they are experimentally indistinguishable. For all presently conceivable experiments, each of these realities predicts exactly the same observable phenomena [...] All of them without exception are preposterous." (From Quantum Reality, p28)

Some physicists believe that quantum theory is incomprehensible and that it should be employed just as a means of calculating and predicting physical phenomena for practical use.

A Limitation in Formal Logic

Some of the greatest thinkers wanted to determine the nature of mathematical reasoning in order to improve their understanding of the notion of "proof" in mathematics. Toward that end, they attempted to codify the thought process of human reasoning as it applies to mathematics. They surmised that logic and mathematics are interrelated and that mathematics might be a branch of logic, or vice versa. They thought that the logical deductive method of geometry might be employed for mathematics, where all true statements of a system can be derived from a small set of axioms:

"The axiomatic development of geometry made a powerful impression upon thinkers throughout the ages; for the relatively small number of axioms carry the whole weight of the inexhaustibly numerous propositions derivable from them... For these reasons the axiomatic form of geometry appeared to many generations of outstanding thinkers as the model of scientific knowledge at its best." (From Gödel's Proof, p3)

However, inherent paradoxes were known to exist in logic. And a variety of paradoxes were also discovered in set theory, such as Russell's paradox. Those paradoxes all have two things in common: self-reference and contradiction. A simple and well-known paradox is the liar paradox, such as "I always lie". From such a statement it follows that if I am lying, then I am telling the truth; and if I am telling the truth, then I am lying. The statement can be neither true nor false. It simply does not make sense. From the discovery of paradoxes in set theory, mathematicians suspected that there might be serious imperfections in other branches of mathematics.
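Russell's paradox can be stated in one line. Consider the set of all sets that are not members of themselves:

```latex
R = \{\, x \mid x \notin x \,\}, \qquad \text{whence} \quad R \in R \iff R \notin R.
```

Asking whether R belongs to itself yields a contradiction either way, exhibiting the same combination of self-reference and contradiction as the liar paradox.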

"These types of issues in the foundations of mathematics were responsible for the high interest in codifying human reasoning methods which was present in the early part of [the 20th century]. Mathematicians and philosophers had begun to have serious doubts about whether even the most concrete of theories, such as the study of whole numbers (number theory), were built on solid foundations. If paradoxes could pop up so easily in set theory – a theory whose basic concept, that of a set, is surely very intuitively appealing – then might they not also exist in other branches of mathematics?" (From Gödel, Escher, Bach, p23)

Logicians and mathematicians tried to work around these issues. One of the most famous of these efforts was conducted by Bertrand Russell and Alfred North Whitehead in their mammoth work, Principia Mathematica. They realized that all paradoxes involve self-reference and contradiction, and they devised a hierarchical system to disallow both. Principia Mathematica basically had two goals: (1) provide a complete formal method of deriving all of mathematics from a finite set of axioms; and (2) be consistent, with no paradoxes.

At the time, it was unclear whether or not Russell and Whitehead really achieved their goals. A lot was at stake. The very foundation of logic and mathematics seemed to be on shaky ground. And there was a great effort, involving leading mathematicians of the world, to verify the work of Russell and Whitehead:

"...[David Hilbert] set before the world community of mathematicians... this challenge: to demonstrate rigorously... that the system defined in Principia Mathematica was both consistent (contradiction-free), and complete (i.e. that every true statement of number theory could be derived within the framework drawn up in [Principia Mathematica])." (From Gödel, Escher, Bach, p24)

In 1931, the hope in that great effort was destroyed by Kurt Gödel with the publication of his paper On Formally Undecidable Propositions of Principia Mathematica and Related Systems. Gödel demonstrated an inherent limitation, not just in Principia Mathematica, but in any conceivable axiomatic formal system that attempts to model the power of arithmetic. Arithmetic, the theory of whole numbers under operations such as addition and multiplication, is the oldest and most basic part of mathematics, and one of great practical importance.

Gödel proved that such an axiomatic formal system that attempts to model arithmetic cannot be both complete and consistent at the same time. This proof is known as Gödel's Incompleteness Theorem. There were only two possibilities in such a formal system:

(1) If the formal system is complete, then it cannot be consistent. And the system will contain a contradiction analogous to the liar paradox.
(2) If the formal system is consistent, then it cannot be complete. And the system cannot prove all the truth of the system.

For very simple formal systems, the limitation does not exist. Ironically, as a formal system becomes more powerful, at least powerful enough to model arithmetic, the limitation of Gödel's Incompleteness Theorem becomes unavoidable.
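Gödel's construction hinges on a sentence that is self-referential but, unlike the liar paradox, not contradictory. Schematically, for a formal system F he built a sentence G that asserts its own unprovability in F:

```latex
G \iff \neg\,\mathrm{Prov}_{F}\!\left(\ulcorner G \urcorner\right)
```

If F is consistent, G cannot be proved in F; but then what G asserts is true, so F contains a true statement that it cannot prove.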

Some scientists say that Gödel's proof has little importance in actual practice. However, Roger Penrose pointed out that another theorem, called Goodstein's theorem, is actually a Gödel theorem that demonstrates the limitation of mathematical induction in proving certain mathematical truths. Mathematical induction is a purely deductive method that can be very useful in proving an infinite series of cases with finite steps of deduction.
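As a concrete illustration, the first terms of a Goodstein sequence can be computed directly: write the current value in "hereditary" base-b notation, replace every b with b+1, subtract 1, and repeat with the next base. Goodstein's theorem asserts that every such sequence eventually reaches zero, yet this fact cannot be proved within ordinary Peano arithmetic. The following is a minimal sketch (the function names are my own):

```python
def bump_base(n, b):
    """Rewrite n in hereditary base-b notation, then replace every b with b+1.

    Hereditary notation writes the exponents themselves in base b,
    recursively; e.g. 3 = 2^1 + 2^0 in base 2 becomes 3^1 + 3^0 = 4.
    """
    if n == 0:
        return 0
    total, exponent = 0, 0
    while n > 0:
        digit = n % b
        if digit:
            total += digit * (b + 1) ** bump_base(exponent, b)
        n //= b
        exponent += 1
    return total

def goodstein(start, max_terms=20):
    """Return the first terms of the Goodstein sequence beginning at start."""
    terms, n, base = [start], start, 2
    while n > 0 and len(terms) < max_terms:
        n = bump_base(n, base) - 1   # bump the base, then subtract 1
        base += 1
        terms.append(n)
    return terms

print(goodstein(3))   # [3, 3, 3, 2, 1, 0]
```

For small starting values the sequence terminates quickly, but for a start of 4 it already runs for an astronomical number of steps before reaching zero, which hints at why finitary induction cannot tame it.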

There was a deeper motivation behind Gödel's efforts, beyond the issues of Principia Mathematica and other more practical formal methods. Like other great mathematicians and logicians of his time, Gödel wanted a better understanding of basic questions about mathematics and logic: what is mathematical truth, and what does it mean to prove it? These questions still remain largely unresolved. Part of the answer came with the discovery that some true statements in mathematical systems cannot be proved by formal deductive methods. An important revelation of Gödel's achievement is that the notion of proof is weaker than the notion of truth.

Gödel's proof seems to demonstrate that the human mind can understand certain truths that axiomatic formal systems can never prove. From this, some scientists and philosophers claim that the human mind can never be fully mechanized.

Although Gödel's Incompleteness Theorem is not well known by the public, it is regarded by scientists and philosophers as one of the greatest discoveries in modern times. The profound importance of Gödel's work was recognized many years after its publication:

"Gödel was at last recognized by his peers and presented with the first Albert Einstein Award in 1951 for achievement in the natural sciences – the highest honor of its kind in the United States. The award committee, which included Albert Einstein and J. Robert Oppenheimer, described his work as 'one of the greatest contributions to the sciences in recent times.' " (From Gödel's Proof)

A Limitation in Philosophy

Philosophy is very dynamic, and it seems to have no strict rules and limitations. Philosophy seems to have no predefined boundaries in areas to explore, and it can just as well critically study the nature of science, art, and morality. Philosophy gave birth to modern science, and influenced the development of the most trusted and essential tools of modern science such as logic and the scientific method. It seems that philosophy is free, as free as the mind, limited in scope only by the imagination. And some philosophers hoped that, with the aid of science, they could eventually understand the nature of the universe.

However, after thousands of years of philosophical speculation since the ancient Greeks, that great hope and optimism in philosophy was finally destroyed forever by the philosophy of one man. Immanuel Kant, born on April 22, 1724, caused a virtual Copernican revolution in philosophy in his later years. He is considered the most influential modern philosopher. Of him, the poet Heine wrote:

"The history of the life of Immanuel Kant is hard to write, inasmuch as he had neither life nor history, for he lived a mechanically ordered and abstract old bachelor life in a quiet retired street in Koenigsberg, an old town on the northeast border of Germany. I do not believe that the great clock of the cathedral there did its daily work more dispassionately and regularly than its compatriot Immanuel Kant. Rising, coffee drinking, writing, reading college lectures, eating, walking, all had their fixed time, and the neighbors knew that it was exactly half past three when Immanuel Kant in his grey coat, with his bamboo cane in his hand, left his house door and went to the Lime tree avenue, which is still called, in memory of him, the Philosopher's Walk...Strange contrast between the external life of the man and his destroying, world-crushing thought! In very truth, if the citizens of Koenigsberg had dreamed of the real meaning of his thought, they would have experienced at his sight a greater horror than they would on beholding an executioner who only kills men." (From The Age of Ideology, p27-28)

In 1781, Kant published The Critique of Pure Reason, his world-crushing thought. It was over eight hundred pages long: a critical and rigorous examination of "pure reason". According to Kant, when pure reason goes beyond the possibility of human experience, it inevitably falls into contradictions in which a thesis and its antithesis are both equally valid. For example, consider a question such as "Is the universe finite or infinite?" For the thesis that "the universe is finite", there is an equally valid and unavoidable antithesis that "the universe is infinite". Without the support of experience, pure reason becomes inherently speculative and doubtful about how it relates to reality. With this, Kant crushed the validity of some of the most important philosophical works of metaphysics, works that had been trusted by many for generations.

In metaphysics, generations of philosophers made various attempts to expound the ultimate nature of the universe. According to Kant, such attempts to give a complete picture of the universe, well beyond human experience, always result in inevitable contradictions. And before Kant, philosophers had been debating them without end.

Actually, Kant did not intend to destroy metaphysics. Instead, he wanted to save it by establishing the secure methods of natural science for metaphysics. Kant knew a great deal about science because, in many ways, he was a scientist. And he is considered to be the founder of a major field in modern science. Allen Wood wrote,

"As a researcher, for a time Kant devoted his intellectual labors mainly to questions of natural science: mathematical physics, chemistry, astronomy, and the discipline (of which he is now considered the founder) of 'physical geography' – what we would now call 'earth sciences'." (From Basic Writings of Kant, p. xi)

Kant wanted to raise the status of metaphysics to the level of a genuine science. Ironically, the only way that Kant was able to make the first step toward that goal was to drastically reduce the scope of metaphysics by demonstrating its inherent limitations. On this view, metaphysics should not speculate about such things as the ultimate nature of the universe. Instead, it should confine itself to more practical things that can be grounded in human experience.

The Critique of Pure Reason was also a critical examination of the faculty of pure reason, the nature and structure of the human mind. By "pure reason", Kant was referring to a pure form of a priori (before experience) knowledge, which involves no a posteriori (after experience) knowledge. Kant believed that the notions of space and time, as in Euclidean geometry and classical Newtonian mechanics, are derived from a necessary synthesis of a priori knowledge, determined by innate characteristics of the human mind. However, long after Kant's passing, we now know that he was wrong about this. Advances in mathematics have shown that very different kinds of geometry can be just as valid as Euclidean geometry. And Einstein's theory of relativity has revealed a very different perspective on space and time. Kant would have been even more surprised by quantum theory, which introduces the notion of indeterminacy that challenges the most basic notion of cause and effect.

Although some of Kant's views are outdated, his main principles may be just as valid today as when he first published his world-crushing thought. According to Kant, the human mind is not like a mirror that passively reflects reality from the senses of the external world. Instead, the mind actively engages in governing and organizing the sense data into perceptions and concepts. Kant's distinction between a priori and a posteriori knowledge is important here. A priori knowledge predisposes the mind to what it can perceive. So the perception of the external world is not merely derived directly from the senses. Instead, the mind shapes and adds to the perceptions. According to Kant, what our mind perceives and constructs is different from external things. Although Kant believed that things objectively exist outside of our minds, he concluded that the mind can never know the "things in themselves".

Some scientists believe that they are studying real things. Other scientists are more sophisticated, and they say that they are studying only phenomena of things. Kant goes further and says that scientists are people too with the same faculty of reason, and they can only study their own constructed perceptions of their senses. If that is true, then it could undermine the entire foundation of the empirical methods of modern science.

The Common Issues

The limitations of the three areas may appear to be very different. However, they share common issues. In each of the three areas, attempts for absolute completeness and consistency of knowledge led to contradictions:

In physics, scientists tried to develop a complete and consistent mathematical theory that could in principle describe and predict all physical phenomena. But after decades of struggling towards that goal, the uncertainty principle emerged at the heart of physics. And although quantum mechanics may be the most successful theory in modern science, any complete physical interpretation of it seems to lead to contradictions and absurdities.

In formal logic, there were serious attempts to completely and consistently model all of mathematics in an axiomatic formal system. And just when such a solution was proposed, Gödel proved that no consistent axiomatic formal system can completely prove all truths of arithmetic. If a formal system is able to completely prove all truths of arithmetic, then it would have to be inconsistent.

In philosophy, generations of thinkers tried to determine the ultimate nature of things through reasoning, assuming that a logically consistent chain of reasoning would lead to correct conclusions. Finally, Kant showed that such metaphysical endeavors involve pure reason extended well beyond human experience, which leads to inevitable contradictions.

All these limitations seem to involve some innate duality. In quantum mechanics, the dualism between the wave and particle nature of matter gave rise to the uncertainty principle, where accuracy in observing one aspect of a thing leads to proportional inaccuracy in observing the other aspect. And the duality seems to cause difficulties in developing a complete and consistent interpretation of quantum theory. In logic, attempts to develop a formal system that would completely and consistently determine all true statements of mathematical systems were unsuccessful. Instead, paradoxes were discovered with the dual characteristics of being both true and false at the same time. Furthermore, in formal logic, power of expression seems to give rise to weakness in consistency. Similarly, in philosophy, attempts to extend knowledge beyond the scope of human experience led to a lack of consistency and inevitable contradictions. And the only way to avoid such contradictions seems to be to drastically reduce the scope of knowledge. In all these endeavors, attempts to achieve absolute completeness and consistency seem to give rise to the opposite.

Some Principles of Falun Dafa

The above understanding of the limitations of the three areas was inspired by an understanding of the principle of mutual-generation and mutual-inhibition in Falun Dafa (also known as Falun Gong):

"In a very high and very microscopic dimension of the universe there exist two different kinds of substances...They pervade certain dimensions from top to bottom, or from the microscopic level to the macroscopic level...the lower the level, the greater the differences in the manifestations and variations of these two different substances...Descending further to the lower levels, these two kinds of matter with different properties become increasingly opposed to each other, and this then gives rise to the principle of mutual-generation and mutual-inhibition." (From Falun Dafa Essentials for Further Advancement)

At a certain level of interpretation, a principle of Falun Dafa can be like a general law. So, a single principle of Falun Dafa, such as mutual-generation and mutual-inhibition, can describe and predict a myriad of things.

The history of science has demonstrated again and again that major progress has involved struggles to break through well-established ideas. And each breakthrough from old ideas has required a willingness to let go of them.

The purpose of Falun Dafa is not about advancing science. However, it can have a positive influence on the development of science. Falun Dafa teaches cultivation, which involves letting go of attachments in one's mind, including attachments to various notions and ideas. Attachments are things that one is not willing to let go of, and they are irrational. Dogmatism can be a form of attachment that creates fundamental limitations in science. In cultivation, one should be aware of attachments and break through them. Then the mind can be free from dogmas and become more rational. Cultivation leads to a clear and objective mind, and such rationality is essential for a genuine science.

These are based on my current and limited understanding of some of the principles of Falun Dafa.


Long ago, philosophers and scientists had great aspirations to understand the ultimate nature of things. As modern science made great advances, it also made equally great discoveries of its limitations. Again and again, while striving for absolute completeness and consistency, it reduced its scope of knowledge of the universe. Now, a general method of modern science is to discard from knowledge any phenomenon that is not within its highly limited scope of competency. Today, mainstream scientists and philosophers generally study superficial aspects of things for practical purposes.

Many scientists and philosophers believe that modern physics is the most reliable and successful science, and other branches of modern science have tried to emulate it. In modern physics, theories are codified in the language of mathematics. Nevertheless, the most precise mathematical statements can be interpreted in very different ways as to how they relate to reality. As we saw with quantum mechanics, one of the most successfully verified theories of modern science, physicists still have many different and conflicting interpretations of what it means and how it relates to reality. Today, the most advanced physical theories that go beyond quantum mechanics are so abstract and extreme, so far beyond the possibility of human experience, that they have become infeasible to verify, and they are increasingly controversial. These issues seem to undermine the empirical methods of modern physics.

There are historic reasons for the emphasis on the codification of theory in the precise language of mathematics and the rigorous empirical methods in verifying theory with observable phenomena. Part of the reason is that, for hundreds of years, philosophers and scientists tried to weed out superstition and speculative metaphysics from the domain of science. They wanted to establish a pure body of scientific knowledge that was absolutely certain. It is ironic that the struggles to attain absolute certainty of knowledge have repeatedly led to opposite results. It seems that, if efforts are carried out to an extreme toward pure abstraction, as is done in theoretical physics, we may lose all knowledge and understanding. Tegmark and Wheeler wrote,

"A theory of everything would probably have to contain no concepts at all. Otherwise one would very likely seek an explanation of its concepts in terms of a still more fundamental theory, and so on in an infinite regress. In other words, the theory would have to be purely mathematical, with no explanations or postulates." (From "100 Years of Quantum Mysteries", Scientific American, February 2001)

In modern science, how far has pure reason ventured beyond actual experience, and how much has theoretical physics become like the metaphysics that scientists and philosophers have worked so hard to avoid? More than two centuries ago, Immanuel Kant warned about the limitations of applying pure reason to reality beyond experience. And more recently, Kurt Gödel demonstrated an inherent limitation of the most trusted tool of pure reason itself.

Scientists should recognize the limitations of modern science, and understand that it must not be used to reject certain possibilities beyond its sharply defined scope of knowledge. Although we must be careful that we do not fall into superstition or pseudoscience, at the same time, we must be equally careful to not reject certain possibilities of what science can be.

Modern science has discovered its own limitations. These limitations are fundamental. And no amount of time and hard work will resolve them by carrying on as usual. Although modern science will continue to make some discoveries, without a fundamental change, those discoveries will only be within the very limited scope of its competency.

Mr. Li Hongzhi once said,

"...humankind must fundamentally change its conventional thinking. Otherwise, the truth of the universe will forever remain a mystery to humankind, and everyday people will forever crawl within the boundary delimited by their own ignorance." (From "Lunyu")


Aiken, Henry D. (1956), The Age of Ideology. Houghton Mifflin Company.

Durant, Will (1961), The Story of Philosophy, Pocket Books.

Gödel, Kurt (1962), On Formally Undecidable Propositions of Principia Mathematica and Related Systems, Dover Publications, Inc.

Greene, Brian (1999), The Elegant Universe, Vintage Books.

Heisenberg, Werner (1958), Physics and Philosophy (The Revolution in Modern Science), Prometheus Books.

Heisenberg, Werner (1949), The Physical Principles of the Quantum Theory, Dover Publications, Inc.

Hofstadter, Douglas R. (1979), Gödel, Escher, Bach (an Eternal Golden Braid) , Basic Books, Inc.

Herbert, Nick (1985), Quantum Reality (Beyond the New Physics) , Anchor Books.

Kaku, Michio (1995), Beyond Einstein, Anchor Books.

Nagel and Newman (2001), Gödel's Proof (Revised Edition) , New York University Press.

Penrose, Roger (1989), The Emperor's New Mind, Oxford University Press.

Tegmark and Wheeler, "100 Years of Quantum Mysteries", Scientific American, February 2001.

Wood, Allen W. (2001), Basic Writings of Kant, The Modern Library, New York.
