Raiders of the lost purpose (1): fine-tuning
For more than a century, every generation has had a moment in which religious believers experience an agonising urge to persuade themselves that the ‘truths’ of their religion are compatible with the sophisticated description of the universe that contemporary science is unfolding. Every now and then this produces a crop of best-selling books, with the number of copies sold usually inversely proportional to the intellectual quality of their arguments, in which the authors struggle to ‘prove’ that the supernatural tales of some religion are not only consistent with scientific knowledge, but even that science ‘demonstrates’ that those fables are ‘true’, if not literally, at least approximately so.
The latest example is the book co-written by a couple of Frenchmen under the presumptuous title Dieu – La science – Les preuves (“God – the science – the proofs”), with which I spent a few days last Christmas just to have something to have fun with. This entry will not be a review of that book: its ‘arguments’ are most often so obviously embarrassing from a logical and scientific point of view that it would be an insult to my readers to waste their precious time on them. Suffice it to note that the authors are not ashamed to supplement the ‘scientific proofs’ with some pieces of ‘historical evidence’ supposedly ‘impossible to explain’ without assuming the truth of the Bible: the divinity of Jesus (without which, how do you atheists explain His superb relevance in the history of humankind?), the amazing endurance of the Jewish people (though I’m tempted to reply: if Jews are so special in God’s plans, why haven’t they accepted the Christian faith by now?), and, no, I’m not pulling your leg, even the ‘miracle’ of Fatima (checkmate, infidels!).
Actually, what my reading of the book has led me to do is to resume my old engagement with the conflicts between science and faith, and in particular with the arguments that try to show that some aspects of the universe (its very existence, the presence of life within it, and in particular the existence of intelligent beings like us) can only be explained by a divine, supernatural purpose. In this and the following entries I will scrutinise some of the main aspects of this discussion, trying not to repeat what I have already said in this blog. In particular, I will concentrate in this series on what is known as the argument from fine-tuning. The best recent account of the argument is probably the book A fortunate universe: life in a fine-tuned cosmos, by G. F. Lewis and L. A. Barnes, in my opinion a convincing and very well crafted reply (at least regarding the purely physical facts, not the possible explanations, which the authors just entertain without committing themselves to any) to what the authors call ‘their anti-particle book’: Victor Stenger’s The fallacy of fine-tuning: why the universe is not designed for us.
First of all, we have to be clear that the expression ‘fine-tuning’ is essentially just a metaphor, like when economists talk of ‘the invisible hand’ of the market, or astronomers refer to distant supernovae as ‘standard candles’. When some scientists say that some physical constants ‘are fine-tuned’, they often simply mean two things: first, that if those numbers had been slightly different from what they are, some properties of the universe would very likely have changed in such a way that life would be impossible or nearly impossible; and second, that those constants having the values they have is somehow very improbable a priori, lacking, as we do, a convincing physical explanation of their values. So, the physical constants being ‘(apparently) fine-tuned’ just means that it is surprising that they have ended up being what they are, and the surprise is stronger the narrower the range within which those constants could vary while remaining consistent with the existence of life. It doesn’t mean, of course, that the only possible explanation (or even the best explanation) of those surprising facts is that the constants have been deliberately chosen by some kind of supernatural agent (what the hell the word ‘choice’ might exactly mean in this particular context is a topic I shall leave for another entry).
Leaving aside for now the question of the possible ‘(supernatural?) intentionality’ behind the ‘tuning’, the first important point is whether the physical constants under discussion are indeed ‘surprisingly precise’ or not. Victor Stenger had argued that they aren’t, since all of them could find a relatively easy explanation in the known physical laws. His book prompted a critique by Luke Barnes, which was followed by an exchange of responses and finally led to the publication of the Lewis & Barnes book. One case in which Stenger’s arguments seem right to me concerns the problem of the neutrinos’ mass: some people argue that, were this mass much higher or much lower, the gravitational effects on the shape of the universe would be so big that it would be unstable and unfit for life; but Stenger replies that what matters is not the mass of the individual neutrinos but their total mass or energy, and hence, if they were, e.g., much heavier, the physical laws would entail that far fewer of them would be produced, so that their total mass in the cosmos would in principle be similar to what it is. I doubt, however, that all the problems can be dispensed with so easily, for, after all, the values of the problematic constants can only be derived from more fundamental laws by using the values of other constants these laws happen to contain, and the latter can only be found by empirical measurement; so the fine-tuning problem would merely be transferred from some less fundamental constants to some more fundamental ones. The most we can expect is that the number of independent constants to be explained gets reduced when some more fundamental law allows us to predict the ratio between some of them, but this would always leave some constants unexplained. Some scientists dream of a future ‘theory of everything’ in which the only constants appearing have ‘natural’ values like 0, π or 1, but of course the current state of physical theory is very far from this, and besides, it doesn’t seem that any such simple explanation could lead (without the addition of some ‘unnatural’ figure) to numbers as apparently arbitrary as the ones our unexplained constants seem to have.
Just by way of example, here are some very surprising ‘coincidences’:
-The slight difference in mass between the neutron (heavier) and the proton must fall (as it does) within a window of less than 0.5 MeV in order that neutrons be stable within atomic nuclei, and that it is neutrons that transform into protons when they decay (either inside or, much more often, outside the nucleus) and not vice versa (for, in the latter case, almost all ordinary matter would consist of neutrons). These two masses derive, of course, from those of the two lightest quarks (u and d), but the precise values of the latter are equally unexplained (the worked numbers for neutron decay are given just after this list).
-The ratio between the strength of gravity and that of the electromagnetic force could not have been much bigger than it is (for in that case, i.e., if gravity were much stronger relative to the electromagnetic force, stars would be too unstable and wouldn’t last long enough to allow the evolution of life on their planets), nor much smaller (for, had gravity been slightly weaker, main-sequence stars such as the sun would have been significantly colder and would not explode in supernovae, which are the main source of many heavier elements; and had it been substantially weaker, galaxies, stars and planets would not have formed at all).
-The strong force (which keeps atomic nuclei stable) must be tuned to a precision of at most 0.5% for carbon to be produced stably within stars through the fusion of a beryllium and a helium nucleus (the relevant resonance figures are given after this list).
-The cosmological constant (Λ, which, expressed in Planck units, is a dimensionless number connecting the stability of the universe with its energy density) cannot be higher than 10⁻¹²², yet it must be bigger than 0 in order to explain the accelerated expansion of the cosmos. In this case, it is not only that we lack any fundamental explanation of the value, but that the most plausible predictions of Λ from quantum-field-theoretical considerations about the energy of the vacuum come out some 120 orders of magnitude larger (what has been called ‘the vacuum catastrophe’); the size of this mismatch is spelled out below.
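To put some flesh on the first of these coincidences, here is the standard energy bookkeeping for neutron decay; the figures are the currently measured values from standard particle-physics tables (not taken from the books discussed here), quoted only as an illustration:

$$m_n - m_p \approx 939.565\ \mathrm{MeV} - 938.272\ \mathrm{MeV} \approx 1.293\ \mathrm{MeV}$$

$$Q_{\,n \to p + e^- + \bar{\nu}_e} = m_n - m_p - m_e \approx 1.293\ \mathrm{MeV} - 0.511\ \mathrm{MeV} \approx 0.782\ \mathrm{MeV} > 0$$

Since Q is positive, a free neutron can decay into a proton, an electron and an antineutrino. Had the neutron-proton mass difference been smaller than the electron mass (about 0.511 MeV), free neutrons would have been stable; had it been negative (i.e., had the proton been the heavier of the two), hydrogen atoms themselves would have been unstable.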
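As for the carbon coincidence, what lies behind it is the well-known Hoyle resonance; the standard figures (again quoted only as an illustration, measured relative to the ground state of carbon-12) are:

$$E(\mathrm{Hoyle\ state\ of\ } {}^{12}\mathrm{C}) \approx 7.65\ \mathrm{MeV}, \qquad E({}^{8}\mathrm{Be} + {}^{4}\mathrm{He}\ \mathrm{threshold}) \approx 7.37\ \mathrm{MeV}$$

The excited state of carbon-12 thus sits only a few hundred keV above the beryllium-plus-helium threshold, which enormously enhances the rate of the reaction; a shift in the nuclear force of the order of the percentage mentioned above would move the resonance enough to suppress stellar production of carbon (or, shifting in the other direction, of oxygen).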
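Finally, the size of the vacuum catastrophe can be written down explicitly; these are textbook orders of magnitude, not figures taken from the books under discussion. In Planck units the observed value is roughly

$$\rho_{\Lambda}^{\mathrm{obs}} \sim 10^{-122}\, \rho_{\mathrm{Planck}},$$

while a naive quantum-field-theoretical estimate, summing the zero-point energies of the fields up to the Planck scale, gives

$$\rho_{\mathrm{vac}}^{\mathrm{QFT}} \sim \rho_{\mathrm{Planck}} \quad \Longrightarrow \quad \frac{\rho_{\mathrm{vac}}^{\mathrm{QFT}}}{\rho_{\Lambda}^{\mathrm{obs}}} \sim 10^{120},$$

a mismatch of some 120 orders of magnitude, often described as the worst prediction in the history of physics.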
In successive entries I will also mention some more of these ‘surprising coincidences’, but mainly I shall discuss, of course, the philosophical aspects of the fine-tuning problem. So, stay tuned.
References
Bolloré, M. Y., and O. Bonnassies, 2021, Dieu – la science – les preuves, Guy Trédaniel Editions.
Lewis, G. F., and L. A. Barnes, 2016, A fortunate universe: life in a fine-tuned cosmos, Cambridge University Press.
Stenger, V., 2011, The fallacy of fine-tuning: why the universe is not designed for us, Prometheus.