The theoretical physicist Richard Feynman once stated: “Religion is a culture of faith, science is a culture of doubt.” This sentence contains considerable intellectual explosive power. For, upon closer examination, it describes a new relationship between science, religion, and spirituality. The first and the last in that list can then be broadly recognized as an attitude of wanting to get to the bottom of things. In particular, this understanding of spirituality moves us away from the gravitational field of traditional religions, which all too often pursue rather vulgar world-view politics instead of an honest reflection on “what really is”. A spiritual attitude, in contrast, aims at liberating us from rigid and dogmatic religious beliefs in something transcendent, and thus understands itself more as an inner attitude toward knowledge. We then see it as a mental (and immanent) motivational frame rather than a transcendental principle of faith, or, as the masters of conceptuality, the philosophers, would say: it becomes an “epistemic orientation”. In a nutshell: it is about wanting to know rather than wanting to believe. All the great scientists, from Copernicus and Newton to Darwin, Einstein, and Feynman, were animated by this motivation, and we can therefore easily describe them as “spiritually driven”. The relationship between scientific and spiritual thinking thus has many historical commonalities.
A closer look into this relationship requires a more in-depth consideration of the history of scientific thinking (which we have already undertaken several times in this blog). For over the last 100 years we can observe a radical change in the nature of science’s claim to explain the world. The historical origins of science lie in the philosophical desire and search for an absolute and ultimate truth. Already among the pre-Socratics, the ancient natural philosophers before Socrates, we can observe the development of the foundations of a metaphysics that looked for ultimate causes and contexts behind the phenomena of nature. Regardless of the philosophical problems arising with the idea of an absolute and final knowledge, this intellectual drive persisted into the modern age. It motivated Kepler in his theory of the movement of the planets, it was the basis for Newton’s mathematical system of mechanics, and at the beginning of the 20th century it still had physicists dreaming of the unity of science. Modern natural philosophy, too, beginning with Descartes and Leibniz, was led by the desire for, and the belief in the possibility of, absolute certainty – which ultimately can only be found beyond sensual perception, in the transcendent. It was only with the emergence of modern physics that a process accelerated in which the natural sciences systematically suppressed the idea of the absolute in favor of an empiricist-positivist orientation shaped by a Bayesian methodological framework.
The detachment from absolute certainty in quantum physics can be considered one of the greatest philosophical insights of the 20th century. We recognize that the success of science over the last 100 years won its central developmental momentum precisely through the elimination of the metaphysical dream of a universal truth. Or, to put it into one sentence: the methodological framework of science has become deeply ‘Bayesian’.
This term originates from a mathematical description of probabilities under incomplete information, the so-called Bayes’ theorem (named after the British mathematician and priest Thomas Bayes, who first formulated it in a certain form). The interesting thing about the Bayesian method is that statements referring to data or experiences always come with respective a priori assumptions (which has led to much criticism of it as being subjective). These are referred to as “prior credences”. These a priori probabilities correspond to the “degree of reasonable expectation”; they are then combined with the data or experiences and correspondingly adapted (according to Bayes’ formula). By this means, new and improved assumptions (more precisely: new estimates of their probability) are generated. These can now serve as new priors for further data. This process can be iterated, yielding ever better knowledge with more and more precise probability estimates.
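The iterative updating described above can be sketched in a few lines of code. This is a minimal illustration, not part of the original text: the “theories” here are possible biases of a coin on a simple grid, the prior is uniform, and the sequence of observations is made up for the example.

```python
# Bayesian updating sketch: a grid of hypotheses about a coin's bias p.
# Each observation turns the current prior into a posterior (Bayes' formula),
# and that posterior serves as the prior for the next observation.
hypotheses = [i / 10 for i in range(11)]           # p = 0.0, 0.1, ..., 1.0
prior = [1 / len(hypotheses)] * len(hypotheses)    # uniform "prior credence"

def update(credence, heads):
    """One Bayesian update: combine prior credence with a single coin flip."""
    likelihood = [p if heads else (1 - p) for p in hypotheses]
    unnorm = [c * li for c, li in zip(credence, likelihood)]
    total = sum(unnorm)
    return [u / total for u in unnorm]             # posterior = new prior

# Iterate over a run of observations: heads, heads, tails, heads, heads.
credence = prior
for obs in [True, True, False, True, True]:
    credence = update(credence, obs)

best = hypotheses[credence.index(max(credence))]
print(best)  # → 0.8, the bias with the highest posterior probability
```

No hypothesis ever reaches probability 1: even after many flips, the estimates only become more and more precise, never absolutely certain.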
In the context of scientific research, Bayes’ theorem tells us that statements about nature are expressed in the form of theories that are held true only to a certain (usually very high) degree of conviction, but never absolutely. Each theory can thus only be assigned a (Bayesian) probability of being true, which gives its “degree of reasonable expectation”. In this sense we “believe” in its respective validity. The aim is then to look for the theory which is best suited to explain the given observations and experiences, and which does not too easily lose its “degree of reasonable expectation” in the face of new measurements and experiences.
One of the key factors in the Bayesian method is how new data and experiences play into the “degree of reasonable expectation”. They represent nothing other than occasions to modify the respective theory (or opinion), and with them the probability of its being true is adapted. In other words, insights and knowledge can only be gained from the permanent re-evaluation of experiences, measurements, and experiments. Theories are constantly re-evaluated and ‘updated’. They are never absolutely valid (that is, their Bayesian probabilities never reach 100%). We therefore want to call this methodological and epistemic framework of science the “Bayesian spirit”.
The Bayesian method is related to an old methodological principle of science: “Ockham’s razor”. According to this principle, if confronted with two theories which explain a given observation equally well, one should choose the one which is simpler and more plausible, not the one that is more complicated but for whatever reason appears more pleasant to us. The simpler one is more likely to be the accurate one.
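Ockham’s razor even falls out of the Bayesian formalism itself: a theory with fewer free parameters concentrates its probability on fewer possible outcomes, so when the data fit it, it earns a higher “degree of reasonable expectation”. A small worked example, with coin flips standing in for the two competing theories (the numbers are illustrative, not from the original text):

```python
from math import factorial

# Ten coin flips: 5 heads, 5 tails (illustrative data).
n, k = 10, 5

# Simple theory: the coin is fair (no free parameter).
# Probability of this exact sequence of flips:
evidence_simple = 0.5 ** n

# More complicated theory: the bias p is unknown, uniform on [0, 1].
# Its evidence is the average likelihood over all possible p:
#   integral of p^k (1-p)^(n-k) dp = k! (n-k)! / (n+1)!   (a Beta integral)
evidence_complex = factorial(k) * factorial(n - k) / factorial(n + 1)

print(evidence_simple > evidence_complex)  # → True: the simpler theory wins
```

The complicated theory spreads its bets over every possible bias and pays for that flexibility; the simple theory commits itself and, when the data agree, is rewarded with the higher Bayesian evidence.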
It is in the Bayesian spirit and the mindset of Ockham’s razor that science and spirituality meet: in the ever-renewed reflection on what we believe we know, through the clear acceptance of “new data” and the commitment to an unconditional readiness to process new information rationally and honestly. This obligation constitutes the clearest line of demarcation from religion, which by no means wants to expose itself to this type of reflexivity.
The Bayesian spirit, however, also carries a normative-ethical dimension: we should try to grasp, unreservedly and at all times, what the anticipated consequences of a particular belief or action are, in the constant knowledge that absolute knowledge can never be achieved. In this attitude we find again the spiritual motivation for gaining knowledge: in the search for truth we avoid “lazy compromises”. This finally corresponds to a concrete practice and way of life. With such a practical dimension of spirituality, questions of both meaning and value automatically come into play. For the path from intellectual dishonesty, i.e. thinking (or believing) something despite better knowledge, to ethical corruptibility, i.e. acting against better knowledge, is rarely far.
With this commitment to the Bayesian spirit in science and spirituality, we therefore demand two things: intellectual integrity, which means admitting the permanent possibility of error and self-deception, and ethical integrity, which lies in detachment from self-interest. Each leads to the other. This is probably the most important point at which science and spirituality meet.
Today, more than ever, we need the Bayesian spirit and scientific openness when it comes to dealing with the technological developments arising from scientific research. For to develop and use high tech on the one hand, while rejecting scientific thinking on the other, is like a child who puts her hands over her eyes and claims that nobody can see her because she does not see herself.
Unfortunately, this spirit is not easy for non-scientists to understand. What constitutes normality for scientists, namely that all scientific knowledge is simultaneously doubted and controversially discussed by scientists, means uncertainty for the ordinary citizen and often leads to resignation and a mental departure from science. For the latter, the dialectical and often error-prone process by which scientific knowledge emerges carries little credibility.
It is therefore easy for populists and opponents of science to deny and reject unpleasant scientific insights, discrediting and defaming scientists, for the sole purpose of preaching their own faith, pursuing their own interests, and following their own ideological trajectories. It has hence become fashionable to season criticism of science with unsubstantiated claims, defamation, and slander – all the more so, the less tenable the criticism and the less competent the critic. And for every opinion and prejudice, a suitable “scientist” can be found and cited in the political and social debate, one who feigns competence and credibility on the questions discussed.
Science itself can deal well with such conduct, as with the Bayesian spirit it possesses a powerful method of sorting out nonsense. The problem arises where it meets political, social, and economic decision-making processes that are strongly ideologically driven or in which strong particular interests prevail. The consequence is too often uncertainty, doubt about the integrity of scientific work, and, finally, thinking and action detached from reality. An environment in which science is not only ignored but openly discredited is hardly suitable for tackling the complex problems of our time.