
Refuting the fine-tuning argument

Excellent article in eSkeptic refuting the fine-tuning-of-the-universe argument. It doesn't dismantle the argument beyond repair, but it attacks the premise that the physical constants are independent of one another.

To recapitulate, the fine-tuning argument holds that if the constants of nature (e.g. the gravitational constant, the cosmological constant, Planck's constant, the permeability of free space, etc.) weren't exactly what they are, then life could not exist. Stars, planets, molecules, and life as we know it would never have formed in the first place. The argument then calculates the probability of the constants being exactly what they are, and concludes that it is no accident that they take those values, or that we are here to argue about it.
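Schematically (my notation, not the article's or Ross's), the implied calculation treats each constant as an independent draw and multiplies the individual probabilities:

\[
P(\text{life-permitting universe}) = \prod_{i=1}^{n} P\big(C_i \in [c_i - \delta_i,\, c_i + \delta_i]\big),
\]

where C_i is the i-th constant, c_i its observed value, and \delta_i the tolerance within which life remains possible. Both the range of each C_i and the independence assumption are doing unacknowledged work here, which is where the trouble starts.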

Some, including Hugh Ross, then further argue that this is proof of a creator. That Ross finds him in the Bible, as the author remarks, does not logically follow: the deduced creator need not be synonymous with the one in the Bible.

But beyond that, I have never understood how these probabilities can be calculated. What is the range of possible values of each of the constants? G is 6.67428(67)×10⁻¹¹ m³·kg⁻¹·s⁻², but how small or large could it possibly be? Could it be zero? Negative? One? A billion? Is the range even finite? If it is infinite, we can dispense with the calculation, because the probability that G takes the value it does would be zero.
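To make the infinite-range point concrete (a sketch, assuming a uniform prior, which is itself an unargued choice): a uniform distribution over an infinite interval cannot be normalized, and on a finite interval [0, N] the probability of landing in a narrow window around the observed value g shrinks to zero as the range grows:

\[
P\big(G \in [g - \epsilon,\, g + \epsilon]\big) = \frac{2\epsilon}{N} \longrightarrow 0 \quad \text{as } N \to \infty.
\]

Without a principled choice of range and measure, the "probability" of G taking its value isn't even defined.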

Additionally, and this is the author's argument, could it not be that the constants are coupled? If they are not independent, then calculating the probability that they take the values they do by multiplying the probabilities of each is wrong. We don't know anything about this, so it could just as well be that, given that the speed of light is 299,792,458 m/s in vacuum, the values of all the other constants are constrained to exactly the values they actually take.
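In probability terms (again my notation): the joint probability factors through conditionals, and it equals the product of the individual probabilities only under independence:

\[
P(c_1, c_2, \ldots, c_n) = P(c_1)\, P(c_2 \mid c_1) \cdots P(c_n \mid c_1, \ldots, c_{n-1}).
\]

In the extreme case where fixing the speed of light determines every other constant, each conditional factor is 1 and the whole product collapses to P(c_1), which may not be small at all.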

The article then goes a little deeper and dismantles the argument from analogy, the claim that things that look complex, like those created by humans, must always have been purposefully created by an intelligent designer:
Almost all arguments for the existence of God are based on analogies to human performance. For ancient humans the analogy was this: Humans move objects, objects in nature move, and so these objects must be moved by an invisible intelligence or designer. For the 19th century theologian William Paley, the analogy was this: Humans make watches which are complex, objects in nature are complex, and so objects in nature must have been made by an invisible intelligence or designer. For Ross the analogy is this: Humans fine-tune their machines for a purpose, values of the universe’s physical constants are extremely improbable but consistent with the existence of human life, therefore the universe must have been finely-tuned by a hidden super intelligent agent for the purpose of producing humans.
The fine-tuning argument is in a sense the most serious one the creationists have in their very large bag of goodies, because it is the hardest to refute. This is because it concerns an aspect of nature that we don't have the slightest grasp of, namely the origin of the physical laws and constants. All other God-of-the-gaps arguments can ultimately be explained by these laws (though I am not advancing the conjecture that all science can be reduced to physics): the origin of life will be explained in terms of physics and chemistry, the complexity of life in terms of biology, and so on.

We may never reach an understanding of why the physical laws and constants are as they are, but the point here is that that isn't proof of anything, except that we don't know. Yet.

2 comments:

  1. I'm doing some research for this fine-tuning argument and ran into your post.

    Just a few questions... You say that you don't understand how they could have figured out the range or percentage that Hugh Ross is talking about. Maybe I'm not getting the finer points of abstract mathematics, but can't one just calculate the percentage by taking the value as the baseline and seeing how far you can get away from it? For example, if the value is 5.5, but I can still make things work with value 4, then we have a pretty big range (one would say (5.5-4)/5.5 = 27%). But if I find that the 5.5 value only works within the range of 5.49 to 5.50, then the percent "accuracy" is smaller: 0.18%.

    Am I missing something here? I thought that it was pretty straight-forward how they would figure that out. Of course, one could say that perhaps it's IMPOSSIBLE for that range to even go below 5.3, so in that case, the domain changes. But that's another issue that needs to be proven separately. So far, as far as I can tell, they haven't shown the physical/philosophical boundaries or limitations, so we have to assume that if a value is 5.5, then we can just take that at face value (no pun intended).

    I also have questions about your claim that these values could be "linked" (as in through a Grand Unifying Theory), but I will just hold off on those questions, because I'm not even sure if you're checking this blog anymore. :)

    Thanks in advance for the thoughtful posting. I do understand that the God-of-the-gaps argument is weak, but I thought that your last statement could also be characterized by creationists as science-of-the-gaps.

  2. The fine-tuning argument is not actually required for proving the existence of God. Please see the link below:

    https://sekharpal.wordpress.com/2016/01/11/is-fine-tuning-actually-required-for-proving-the-existence-of-god/

