The Science Delusion: Limits of reductionism


Scientists are befuddled by a strange new development in almost every field of study. Almost simultaneously, an elaborate degree of complexity is being encountered everywhere. The 'fundamental stuff', which was supposed to get simpler and simpler with each division, is behaving in unexpected, freaky ways: biology with its intractable epigenetics, physics with its crazy multiverse excesses, neuroscience with its trillions of neural connections, even pure mathematics with its omega number. Massive data sets are pouring in, so massive that they defy comprehension, to the point where scientists do not even know where to begin!

They are looking at humongous chunks of data, but do not know what to do with them.

Reductionism, the central methodological value of modern science, is failing now, even though it has yielded fantastic progress over the last few centuries. Reductionism means breaking down a thing, a process or a phenomenon into its most fundamental components. The technological marvels of our world run on the principles of reductionism: complex systems are created by combining individual, basic parts. A car, a mobile phone, an aeroplane and many more useful contraptions are built bottom-up, from their most fundamental parts. In essence, then, reductionistic science is more of an 'engineering project' than a 'science project'. Technological innovation is achieved by creating permutations and combinations of these fundamental units to produce many things with different uses. Today's scientists are busy combining known basics in every possible way, in the hope that one combination might just work.

However, technology and engineering are only a part of the scientific paradigm, not the entirety of it. Unfortunately, we are so deeply conditioned in the method of breaking things into their basic parts that we cannot imagine studying, or even looking at, things in any other way! Simplification must be achieved by reducing to basics, we automatically think. Paradoxically, though, science today seems to have descended, down through the steps of reductionism, into a queer, formless terrain. Rather than yielding more solid ground, reductionism is producing the exact opposite effect!

The smaller they go, the more complex and the more inscrutable things become.


The deeper they go, the more nebulous and imprecise it becomes!

The Human Genome Project, a 3.8-billion-dollar, 13-year-long exercise, enthusiastically endorsed by Bill Clinton himself, was supposed to find the cure for all the common diseases that plague our species, one lofty goal among many others. Launched in 1990, it was an international research project to decode the nucleotide base pairs forming human DNA, i.e. to map the human genome. Paradoxically, more than a decade after this mapping, there are many more unanswered questions than before. It seems the more they know, the less they understand.

A prominent scientist has said of the connection between genes and diseases, "In pointing at everything, genetics is pointing at nothing." Far from eradicating cancer or Alzheimer's, research suggests that we humans have traits which are effectively decoupled from our DNA sequences, not affected by them at all. For example, researchers had originally identified 54 genetic locations that contribute to height variation between people. On closer statistical analysis, however, it turned out that all 54 of these 'genetic contributors' together accounted for only about 4% of the height variation across thousands of people! This meant that other mechanisms or phenomena were at work, apart from, and in spite of, the 3 billion well-known base pairs of human DNA. Scientists are calling this strange outcome the 'dark matter of genomics'! After spending so much time and so many resources to proudly map the human genome, not much can be said about our traits with certainty. Ironically, before the entire human genome was sequenced, most scientists would have proclaimed, without batting an eyelid, that our genetic make-up determines everything about us. However, it does not, and scientists have now arrived at what they call the 'missing heritability problem'. Can you believe it?
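
To make the size of that gap concrete, here is a rough back-of-the-envelope comparison; the roughly 80% heritability of height is a figure commonly cited from classical twin studies and is my assumption here, not a number taken from this article:

$$\underbrace{h^{2} \approx 0.8}_{\text{heritability of height (twin-study estimates)}} \quad \text{versus} \quad \underbrace{R^{2} \approx 0.04}_{\text{variance explained by the 54 mapped loci}} \;\;\Rightarrow\;\; \text{unaccounted-for share} \approx 0.8 - 0.04 = 0.76.$$

In other words, under these assumed figures, the overwhelming majority of the heritable variation in height is simply not accounted for by the identified genetic locations.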


No, things are not getting simpler as we break them down further and further!

Exhaustively mapping a system to build a model is not the same as finding out how the system itself works; it might not yield answers at all. In fact, we might be asking the wrong questions if asking 'how' no longer yields the mechanics.

Leonard Susskind, string theorist and director of the Stanford Institute for Theoretical Physics, remarked at a World Science Festival, "The smaller one goes now, the fuzzier and fuzzier things get, and much more vastly spread out." Not only have we arrived at complexity impenetrable to our cognition, it seems to be impenetrable to our highly refined instruments of observation too. It took technology many decades to scale observation down just a few orders of magnitude, from atoms to sub-atomic particles. Yet to detect the fundamental strings of string theory, a further scaling down of some fifteen orders of magnitude would be required.

Fifteen orders of magnitude, do you understand? A string is to an atom roughly what a tree is to the entire observable universe! Imagine that. Can you?
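
A rough back-of-the-envelope check of that analogy, using round figures that are my assumptions rather than numbers from the article (an atom at about $10^{-10}$ m, a string near the Planck scale of about $10^{-35}$ m, a tree at about $10$ m, the observable universe at about $10^{26}$ m):

$$\frac{\ell_{\text{atom}}}{\ell_{\text{string}}} \approx \frac{10^{-10}\,\mathrm{m}}{10^{-35}\,\mathrm{m}} = 10^{25} \qquad \text{and} \qquad \frac{\ell_{\text{universe}}}{\ell_{\text{tree}}} \approx \frac{10^{26}\,\mathrm{m}}{10^{1}\,\mathrm{m}} = 10^{25}.$$

Both ratios come out around twenty-five orders of magnitude, which is what makes the tree-to-universe comparison apt.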

Any technological advancement that can scale observation down another fifteen orders of magnitude is certainly not possible within decades, perhaps not even within generations. It is simply out of the question, one hopes and fears, in economic terms as well, to build machines of that precision and efficiency. Our current particle accelerators took at least 10 billion dollars to build. What will we aim for next?

Analogous to the genome project, another few billion dollars are being allocated to the Human Connectome Project. A connectome is a network map meant to determine the structural and functional connections between the neurons in the brain. Note, please, that the human brain has 90 billion neurons and 150 trillion synapses. The bizarre fact is that neuroscientists have so far only been able to map the neural framework of the nematode C. elegans, an organism with just 302 neurons! To add to the outlandishness, even this map is an incomplete, static description: it does not explain the dynamics of how those neurons interact with one another. It also took scientists five years to model 3 billionths of a mouse's brain.

 Imagine how much time the unimaginably complex human brain will take.
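
To put a number on it, here is a deliberately naive extrapolation from the article's own figure, assuming (unrealistically) that the modelling rate never improves:

$$\text{time for one whole mouse brain} \approx \frac{5\ \text{years}}{3\times 10^{-9}} \approx 1.7\times 10^{9}\ \text{years},$$

and the human brain, with its tens of billions of neurons, is orders of magnitude larger still. The rate will of course improve, but the arithmetic shows the sheer scale of the task.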

In fact, so inscrutable is the human brain that neurobiology still depends on brain models drawn up over a hundred years ago! Functional complexities apart, neuroscientists do not yet have a working theory of the brain as a whole. When the core principles are unknown, what substantial understanding will be gained by mapping it with massive amounts of data? What would the neuroscientist do with all that data?


Humour me just for a moment and consider this. How should this complexity be interpreted? Could it mean that we are not meant to decode what we are trying to, or at least not in the way we are going about it? Maybe decoding reality through reductionism is simply not meant to be; maybe reality will not yield any further to breaking things down into ever smaller pieces. Maybe reductionism as an epistemological value of science needs to be re-examined. Maybe new methodologies of experimentation are required.

 Maybe new ‘things’ need to be discovered in a ‘new’ way. 

It’s possible.
