The introduction chapter of The Computational Beauty of Nature by Gary William Flake uses principles of computer science to address significant biological phenomena. He starts with reductionism, or more simply, comprehension through dissection, and contrasts it with the view that complex behavior arises from the interaction of many agents across distinct levels. We have various pieces of naturally occurring evidence that support this perspective. For example, ant colonies exhibit behavior that cannot be understood by examining each ant in isolation; it is the interaction between individuals that produces a colony's complex patterns. Evolution operates by a similar mechanism: over time, interactions among individual organisms can give rise to new species. Likewise, our high-level consciousness and intelligence cannot be reduced to the properties of individual neurons. These are some of the examples I found most interesting, and they align with and deliver the author's main message: to understand such systems fully, we must consider parallelism, iteration, feedback, adaptation, and other system-level properties.
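The "complex patterns from simple interactions" idea can be made concrete with a toy model. Below is a minimal sketch of an elementary cellular automaton (Rule 30), a standard example of this kind of emergence; the code and parameter choices here are my own illustration, not taken from the book. Each cell updates using only its two immediate neighbors, yet the global pattern that unfolds is intricate, much like the ant-colony point: the complexity lives in the interactions, not in any single unit.

```python
# Illustrative sketch (my own, not from the book): Rule 30,
# an elementary cellular automaton. Each cell sees only its
# immediate neighbors, yet a complex global pattern emerges.

def step(cells, rule=30):
    """Apply one synchronous update; boundaries wrap around."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # 0..7
        out.append((rule >> neighborhood) & 1)  # look up bit of the rule number
    return out

def run(width=31, steps=15):
    cells = [0] * width
    cells[width // 2] = 1  # a single seed cell in the middle
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)

run()
```

No single cell's update rule hints at the triangle-laced pattern the printout shows; examining one cell in isolation, as a strict reductionist would, misses it entirely.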
The author does bring many examples in support of this holistic approach, but I feel it undermines the significance of reductionism. At its simplest, reductionism is a very important scientific tool used by scientists all over the world; even I used it repeatedly in school, where it remained a core component of analyzing many phenomena. While on its own it might not offer the full picture, dismissing it is not the right approach. I think a combination of reductionist and interactionist approaches works best, with each making up for the other's weaknesses. The author acknowledges this, but the overall tone still feels biased toward interactionism. The author also argues that the widespread introduction of computers has unified many disciplines by enabling the combination of theory and experiment. Such methods include the use of fractals (for modeling plant growth) and chaos (applied in physics, biology, economics, and elsewhere) to bridge those disciplines. Despite this, significant fragmentation persists, as scientists may use these methods yet struggle to connect them to domain-specific insights. In theory the unification is possible, but how far can we extend the computational metaphor before it loses its predictive power? I find the author's arguments compelling, but more nuance is needed in some cases.
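The chaos methods mentioned above can be illustrated with the logistic map, x_{n+1} = r·x_n·(1 − x_n), the textbook example of chaotic dynamics. The sketch below is my own, with parameters I chose (r = 4, a perturbation of one part in a billion), not code from the book; it shows sensitive dependence on initial conditions, which is exactly what limits the predictive power the question above worries about.

```python
# My own sketch of chaos via the logistic map x_{n+1} = r*x*(1-x).
# At r = 4 the map is chaotic: two nearly identical starting points
# diverge rapidly, so long-term prediction breaks down even though
# the rule itself is simple and deterministic.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)  # perturbed by one part in a billion
gap = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gap[0]:.1e}")
print(f"gap after 10 steps: {gap[10]:.2e}")
print(f"gap after 50 steps: {gap[50]:.2e}")
```

The tiny initial difference grows roughly exponentially until the two trajectories are effectively unrelated: the model explains the behavior qualitatively while forecasting individual states quickly becomes hopeless, which is one concrete sense in which the computational metaphor has limits.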