Limitations of Reason: Is decoding reality about breaking things down ever smaller to see ever finer?

If science is the tool to decode reality, then breaking things down into smaller and smaller parts seems the obvious modus operandi. But is it? Is it simply a question of seeing finer, and finer still? Will refining and enhancing our observations reveal deeper truths? You might think greater technological advancement will uncover all truths in time. Let's see about that.

This modus operandi has worked fine for 400 years, since the birth of modern physics, but something has now gone gravely wrong. Talk to Professor Nima Arkani-Hamed of the Institute for Advanced Study in Princeton, touted by some as the next Einstein, and he will tell you precisely that. His words, though not verbatim: 'Not since the 1800s, when the idea of the ether was prominent, has such a large group of scientists been collectively wrong about something fundamental.'

Yes, this is the state of fundamental physics now, where the logical, natural expectations of the best minds in the world have failed.

You see, many of our leading physicists converged on a framework called 'supersymmetry', which predicts that certain additional elementary particles must exist for our reality to work the way it does. To confirm the theory, they designed experiments to find such particles. And to examine very small particles, you need experiments at extremely high energies. That is why the 27 km underground particle collider, the Large Hadron Collider (LHC), was built. Without getting into the technical details: these predicted particles, if found, would resolve many of the big questions about our universe, such as why it is so huge, why it is expanding at the rate it is, and other foundational puzzles.
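
Why exactly does seeing smaller demand higher energy? The link is the de Broglie relation, λ = h/p: a probe can only resolve features down to about its own wavelength, and wavelength shrinks as momentum (and hence energy) grows. Below is a minimal back-of-the-envelope sketch in Python; the function name and the example length scales are illustrative, not taken from any experiment's specification.

```python
# Rough sketch: energy needed to "see" a given length scale.
# The de Broglie relation (lambda = h / p) ties a probe's resolving
# power to its momentum; for an ultra-relativistic particle,
# E ~ p*c, so E ~ h*c / lambda. Halve the resolvable wavelength
# and you roughly double the required energy.

from scipy.constants import h, c, e  # Planck constant, speed of light, electron charge

def probe_energy_eV(length_scale_m: float) -> float:
    """Approximate energy (in eV) of an ultra-relativistic probe
    whose wavelength equals the given length scale."""
    return h * c / (length_scale_m * e)

print(f"{probe_energy_eV(1e-15):.2e} eV")  # proton-sized (~1e-15 m): ~1.2e9 eV, the GeV scale
print(f"{probe_energy_eV(1e-18):.2e} eV")  # 1000x smaller: ~1.2e12 eV, the TeV scale -- LHC territory
```

Resolving scales a thousand times smaller than a proton already lands you at TeV energies, which is why there is no cheap shortcut around ever-bigger machines.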

Sadly, the LHC has produced none of these predicted particles. It did find the Higgs boson, which was expected, but the Higgs does not answer those questions. Nothing consequential beyond that. Zilch. Zero.

Arkani-Hamed says this could mean one of two things. Either still greater energies are needed to reveal the particles that would verify the theory, or physicists have been dead wrong about the theory itself. Dead wrong in a conceptualisation of physical reality built up over 400 years.

Nonetheless, physicists are now planning another particle collider, 100 km long, in China, one designed to reach collision energies of 100 TeV (teraelectronvolts), roughly seven times the LHC's 14 TeV. Note that it will cost an estimated 10 billion dollars and that there is no guarantee any new particles will be found. 10 billion dollars, larger than some countries' GDPs, on hunting for particles that might not exist!
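
For circular proton colliders, the link between size and energy is roughly linear: the beam momentum the ring can hold scales with the bending magnets' field strength times the bending radius (the textbook rule of thumb p [GeV/c] ≈ 0.3 · B[T] · r[m]). A hedged sketch follows; the field and radius figures are illustrative round numbers, not the design parameters of any actual proposal.

```python
# Why more energy means a bigger ring: the momentum that dipole
# magnets can bend around a circle grows linearly with both the
# field strength and the bending radius.

def beam_energy_TeV(field_tesla: float, bend_radius_m: float) -> float:
    """Approximate per-beam energy (TeV) for an ultra-relativistic
    proton, via p [GeV/c] ~ 0.3 * B [T] * r [m]."""
    return 0.3 * field_tesla * bend_radius_m / 1000.0

# LHC-like: ~8.3 T magnets, ~2800 m bending radius -> ~7 TeV per beam
print(beam_energy_TeV(8.3, 2800))     # ~6.97
# A ~100 km ring with ~16 T magnets -> ~50 TeV per beam, 100 TeV head-on
print(beam_energy_TeV(16.0, 10500))   # ~50.4
```

Double the target energy and, for the same magnets, you need roughly double the radius: the arithmetic itself dictates ever-larger rings.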

Expensive particle colliders keep getting larger

Frankly, the endeavour of dividing space into smaller and smaller pieces with bigger and bigger machines is becoming ridiculous. Taken to its logical conclusion, as Arkani-Hamed himself says, observing the smallest possible constituents of the universe would require an infinitely large machine. What will physicists do if they stay on this path: build machines as big as galaxies?

How logical is that? By their own admission, scientists could be mistaken in their very methodology of decoding reality.

Another remarkable thing to note is this. Since seeing smaller requires ever more energy, at a particular scale (the Planck scale) the energy (or equivalently mass, since E = mc²) concentrated in so tiny a region would be high enough to create a black hole. No, not an Earth-swallowing one; that is a pop-culture myth. But it does mean that space-time itself breaks down there, and this empirical method becomes nonsensical.
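
This can be made quantitative with two textbook formulas. A probe confined to a region of size λ carries energy of order ħc/λ, and any energy E gravitates like a mass m = E/c². Shrink λ until the probe's own Schwarzschild radius (≈ 2Gm/c²) catches up with λ and you have, in effect, made a black hole; equating the two lengths picks out the Planck scale. A rough sketch, assuming only standard constants from SciPy:

```python
# Where does probing space create a black hole? Equate a probe's
# quantum size (Compton wavelength ~ hbar / (m*c)) with its
# gravitational size (Schwarzschild radius ~ 2*G*m / c^2):
# both give the Planck scale, up to factors of order one.

from math import sqrt
from scipy.constants import hbar, c, G, e  # reduced Planck const, light speed, gravity, charge

planck_mass   = sqrt(hbar * c / G)      # ~2.18e-8 kg
planck_energy = planck_mass * c**2      # E = m c^2, ~1.96e9 J
planck_length = sqrt(hbar * G / c**3)   # ~1.6e-35 m

print(f"Planck energy: {planck_energy / e:.2e} eV")  # ~1.22e28 eV
print(f"Planck length: {planck_length:.2e} m")
```

The LHC's roughly 1.4 × 10¹³ eV falls short of the Planck energy by about fifteen orders of magnitude; the breakdown is far away in practice, but it is a limit in principle, and it caps the 'see finer' programme.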

So you see, the current system of empirical observation fails on more than one count. It seems the empirical methods of science are approaching their limit, a dead end. It is possible that the method of dividing things ever smaller to reach the fundamentals, reductionism as it is called, could itself be fundamentally wrong.

Are we creating complex experiments simply to suit our scientific models?

Also, look at this problem from a different angle. Is it possible that such an enormous expense of time and resources has been incurred just to fit reality into a theory, rather than the other way around? In a previous blog we discussed how the same phenomena can be described in more than one logical framework, and how those frameworks could be mere approximations. We also saw that rationality and logic are incomplete measures of truth. What if the current theories of science are just that: illusory approximations with no true basis in reality? Well, think about it.

So there are two takeaways here. First, 'seeing finer' might not be the real issue when it comes to decoding reality. Maybe only a radical, paradigmatic shift in methodology could coax nature into yielding more, into loosening its stubborn grip on further revelations.

Second, current science might be trying to force reality to concur with its models, and it is no surprise that it is failing. It is possible that the model itself is illusory, even if logically impeccable.

So it seems that either science's conceptualisation of reality or its methods could be false. Perhaps even both, since one depends on the other. Think about that.
