The Way Out

Satyashri Mohanty
6 min read · Jul 28, 2021
New Approach in Science

A young boy of 17 once wondered whether a ray of light would appear stationary to him if he travelled at the same speed as the ray. Of course it would, he thought. Doesn’t a train appear stationary to passengers in another train moving at the same speed and in the same direction? The precocious boy was well aware of Maxwell’s equations, which imply that the speed of light appears the same to all observers, regardless of their own speed. This was rather counter-intuitive, he knew. The boy postulated that the only solution to this paradox was time dilation: time slows down for an observer moving at high speed.

The boy was Einstein, and this was the famous “thought experiment” that sowed the seeds of the theory of relativity.

This theory was not birthed through empirical observation. The starting point was a conjecture about a conceptual gap in a well-accepted theory. The “solution”, too, was first conceptualized without any empirical basis. Then Einstein, along with other scientists, deduced the effects that must materialize if the theory were correct. This predicted-effect thinking was, in essence, deduction. They could not have used empiricism and inductive thinking to detect time dilation directly.
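Einstein’s postulate can be made quantitative. As a minimal sketch (the standard Lorentz factor from special relativity, not anything stated in the article beyond “time slows down”), the factor by which a moving clock runs slow is 1/√(1 − v²/c²):

```python
import math

def dilation_factor(v, c=299_792_458.0):
    """Lorentz factor gamma for speed v (m/s).

    A clock moving at speed v ticks slower than a stationary one
    by this factor: an indirect, deduced effect, since no one can
    ride alongside a beam of light to watch it happen.
    """
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# At 80% of light speed, time runs slow by a factor of ~1.667
print(round(dilation_factor(0.8 * 299_792_458.0), 3))  # 1.667
```

At everyday speeds the factor is indistinguishable from 1, which is why the effect eluded direct observation for so long.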

The scientists introduced a new method for understanding the mystery of the “invisible world”, the world that cannot be sensed directly and empirically. The solution they arrived at, however, met the criteria of being ‘well-posed’ and ‘falsifiable’. The starting point, it must be noted, was a mental exercise.

With this radical approach, pure empiricism and inductive logic gave way to a new method of doing science.

Should management adopt this method to arrive at solutions that are well-posed and falsifiable?

As discussed in chapter 2, organizations also have their own versions of “dark-matter.”

1. Direct data is absent for many business variables in organizations. It is impossible to get data for every variable relevant to every potential decision, even with highly sophisticated IT infrastructure. Managers will always be ‘handicapped’ by a lack of direct data in many situations.

2. To third-party researchers, organizations will remain opaque. Companies reveal only selected information in the public domain.

Clearly, there is a need to develop methods for working in an environment where relevant information is in short supply. This calls for experimentation, which can be expensive, especially for organizations enjoying hard-earned stability in their businesses. Hence, before they begin, it is important that leaders in these organizations are convinced of the conceptual validity of the new paradigm or solution. They need a set of rules for non-experimental, non-empirical filtration and validation of ideas.

Let us learn how the limitations of empiricism were surmounted in science and a new method was discovered.

Limitations of Empiricism and the Need for a New Approach in Science

Empiricism fell short as science began exploring objects and phenomena that are “too big” (stars, quasars, nebulae, black holes), “too small” (subatomic particles such as protons, electrons, quarks, and bosons), or “too rare” (a giant meteor strike). Scientists encountered obstacles to empiricism, which they had always upheld as the most reliable method for testing their theories. Experiments became expensive and time-consuming (it took billions of dollars and decades to build the particle colliders). In many cases, experiments could not be repeated. In some, it was impossible to conduct experiments at all (studying prehistoric events such as the Big Bang, or the Mesozoic era when dinosaurs lived, is a challenge for these reasons). Even where experimental apparatus was built, direct observation of numerous phenomena was, and continues to remain, impossible (some subatomic particles reveal themselves only to decay in an instant). At best, the evidence is indirect and circumstantial. It is, however, conclusive for most theories in modern science.

As for theories that explain the “too big or too small”, “the very rare”, or “the distant past”, science evolved to rely on rigorous non-empirical parameters to evaluate them. Now, only the “sure-shots” survive for expensive experimentation.

What aided this method of non-empirical evaluation of theories was an even older school of philosophy — Rationalism.

Rationalism: Another Approach

Philosophers from this school claim that sound and valid reasoning helps us understand reality. Rationalism offers more depth than can be accessed through observation or empiricism. A pure rationalist of the Greek era would argue that all knowledge of the world around us can be acquired through thinking and reasoning.

This approach does not start with data or observation. It begins with a brilliant idea, conceptualized in the mind to resolve a theoretical paradox. Next, scientists conjecture predicted effects that must occur for the idea to hold true. Finally, they set up experiments to verify the predicted effects.

To observe “unseen” entities, rationalists rely on a ladder of irrefutable statements connecting the “unseen” with the “seen”. (For example, an electron can only be “seen” as a blip in a cloud chamber, and the mass of a star can only be inferred by measuring the intensity of light emanating from it.) The evidence is indirect and circumstantial.
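The star example above can be made concrete. As a hedged sketch, assuming the standard empirical mass–luminosity relation for main-sequence stars (L ≈ M^3.5, both in solar units), the “unseen” mass is climbed down to from the “seen” light:

```python
def mass_from_luminosity(l_solar, exponent=3.5):
    """Infer a main-sequence star's mass (in solar masses) from its
    observed luminosity (in solar luminosities) via L ~ M**3.5.

    This is the rationalist ladder in miniature: we never measure
    mass directly; we measure light and climb a chain of accepted
    statements from the seen to the unseen.
    """
    return l_solar ** (1.0 / exponent)

# A star shining ~11.3 times brighter than the Sun has ~2 solar masses
print(round(mass_from_luminosity(11.3), 2))
```

The exponent 3.5 is an approximation valid only for a band of main-sequence stars; the point is the inference pattern, not the astrophysical precision.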

Rules were framed to keep a check on conjectures (or conspiracy theories) born of biases. These rules helped theories meet the criteria of ‘well-posed’ and ‘falsifiable’.

The rules are -

1. Explanatory power of the new theory: The explanation the new theory provides should be stable. Its proponents are not allowed to change explanations or arbitrarily invent new conditions in the light of strong logical refutations. The explanatory depth should have no holes: from cause to effect, the ladder of causation should cover all intermediate steps without gaps.

2. Logical compatibility with all other existing and successful theories: A new theory cannot arbitrarily replace an old one without explaining why the old one held up in the first place. Proponents of the new theory have to make a case for the old one as well. The logic has to be irrefutable.

These conditions ensure that all good new theories are built on the gaps of older ones. This creates a network of theories with clear boundaries of applicability. Any new theory that falls “totally” outside this web and does not explain, with its own new facets, the success of previous theories is rejected.

3. New Explanations: The theory should explain an existing phenomenon, one which had remained unexplained by old theories.

o This is justification for the new theory. Without this, there is a possibility that the new theory is just a differently worded version of the older theory (old wine in a new bottle!)

4. New Effects: The new theory should predict a novel phenomenon, hitherto unobserved. This leaves the door open for a potential empirical test in the future.

o If there is no new prediction, the new theory cannot be tested empirically. The new prediction justifies the novelty.

Has this method helped?

The fields of quantum physics, modern cosmology, and evolutionary biology depend on this method. Interestingly, the existence of atoms, the Higgs boson, radio waves, antimatter, curved space, etc. was postulated in the minds of scientists many years before being proven by sophisticated and expensive experiments. For further validation of this approach, one need only recount the numerous instances in history where two or more theoretical physicists, with no communication with each other, reached exactly the same conclusions using only their grey cells!

Hence, the real big deal here is the fact that scientists gained significant confidence in the validity of theories much before the theories could be subjected to empirical tests.

It is this confidence that draws billions of dollars in funding, the mindshare of independent scientists, and years of effort. These resources are used to set up difficult experiments that help weed out bad theories. This method can be termed rationalism combined with indirect empiricism.

Management can reap rich rewards if it explores similar techniques for gathering information.

This approach can bring the following benefits -

· Expose important albeit ‘immeasurable’ variables that have been brushed under the carpet.

· Weed out bad theories without depending on empiricism, or with minimal indirect empiricism.

· Help management academia develop new theories faster. Most research in management is handicapped by a lack of empirical data.

· Facilitate problem solving in organizations through risk-free experimentation using first principles, without depending on data of “where it was done before”. This will facilitate rapid innovation.

This is the third chapter in a series of articles that explores the limitations of management theory and practice as they currently exist, in order to propose a radical, new, and structured thought process for bias-free decision making in a variety of scenarios. These techniques can help companies instil a culture of open dialogue, critical thinking, and innovation: all necessary conditions for developing agility in the current uncertain business environment.

Click here to start the series from chapter 1.

Click here for the previous chapter, chapter 2.

Satyashri Mohanty

Founding Partner at Vector Consulting Group. Loves to solve wicked problems in supply chain and operations. A voracious reader, deeply interested in philosophy