Biocomplexity at all Levels of Biological Organisation
This theme concentrates on complexity and organisational scale. It concerns the way biological systems form as nested hierarchies of structure, and the consequences of that. We all know the basic pattern: molecules, molecular interaction networks, cellular systems, cells, multicellular organisms with tissues, colonies, communities and the ecosystem. Which parts of that are genuine natural entities and which are just the way we have learned, perhaps through convention, to think about biological structure? Is there something fundamental about nested hierarchy? If so, is it just because large organisational structures can only form that way, or is there real function in the structuring (i.e. is more life-enhancing activity possible with this structure than without it)? Is there a continuity from molecule to ecosystem, or are biological levels real, discontinuous and distinct? In any case, what exactly is complexity and what does it have to do with hierarchical structure?
Complexity and Emergence
In the reductionist scientific paradigm, we believe that any supposed level of biological organisation (e.g. an organism) can be explained entirely in terms of interactions among its component parts. That is, we explain all biology, including the behaviour of dogs, the distribution of species on land and the anatomy of brains, in terms of molecules and their chemical interactions (and can of course further explain those with physics). It is noted, however, that some phenomena, especially in biology, are, even in principle, inexplicable from a knowledge of only the component parts, so reductionism is at best an incomplete answer.
This leads directly to the idea of emergence: the appearance of phenomena from the organisational structure acting as a whole. Emergence is central to most scientific definitions of complexity and, because we can now independently define emergence, circular arguments using both can be avoided. Emergence is easier to explain because it is a phenomenon with particular identifiable properties, whilst complexity is an attempt to identify the sort of system in which self-organisation spontaneously appears from among the interactions of component parts. Frankly, as an idea, complexity still has many definitions and different meanings for different branches of science. Happily, this diversity of meaning has been catalogued and reviewed by Ladyman, Lambert and Wiesner (2013), among others, and their work is elaborated in the (2020) book "What is a Complex System?". Unhappily, that work is far from comprehensive and, to date, scientific articles about complexity usually start by saying "there is no consensus about the definition of complexity".
We will note in passing that there are several well-established statistical metrics of complexity, such as Kolmogorov complexity (a generally incomputable measure of information compressibility), its algorithmic information theory 'children' including its approximation (Lempel-Ziv), Gell-Mann's 'effective complexity' and related Shannon-entropy-based metrics such as mutual information (among component states), Kullback-Leibler divergence, the Bertschinger and Olbrich (2006) measure of information closure, integrated information theory's Phi (Oizumi et al. 2014) and offshoots such as logical depth and thermodynamic depth (some of these are compared in application by Albantakis (2021)). In other words, lots of complicated maths to quantify, essentially, the amount of coherent pattern forming from the interaction of component parts. Notice I say complicated maths, not complex - we need to distinguish between the two.
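Two of these metrics are simple enough to illustrate directly. The sketch below computes the Shannon entropy of a symbol sequence and an LZ78-style phrase count, a crude, computable stand-in for Lempel-Ziv complexity (the function names and toy strings are mine, chosen for illustration):

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy (bits per symbol) of the symbol frequencies in s."""
    n = len(s)
    counts = Counter(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

def lz78_phrase_count(s: str) -> int:
    """Number of distinct phrases in an LZ78-style parse of s.
    Repetitive (compressible) strings parse into few phrases,
    irregular ones into many - a rough proxy for Lempel-Ziv complexity."""
    phrases, phrase = set(), ""
    for ch in s:
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

print(shannon_entropy("aaaaaaaa"))   # 0.0: no uncertainty at all
print(shannon_entropy("abababab"))   # 1.0: two equiprobable symbols
print(lz78_phrase_count("a" * 64))   # few phrases: highly compressible
print(lz78_phrase_count("ab" * 32))  # more phrases than the uniform string
```

Note the limits of such proxies: a truly random string scores highest of all, which is why entropy alone is not a satisfying definition of the organised complexity discussed here.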
Complicated things may have many components which interact in many ways, all at the same time, but can be fully described (and therefore understood) through a strictly reductionist analysis: breaking down the components and interactions into a set of fundamental pieces and rules for interaction. A good approach with merely complicated systems is to abstract them as a hierarchy of organisationally nested levels. Take for example a car: engine, transmission and suspension, braking system, electrics and bodywork; each can be further broken down into e.g. disc brake, disc, pad, piston, etc., and each of these into components with a specific shape and material, all interacting in ways that are set by the shape and material (the component's form). Most scientifically based medicine treats the human body that way too. The reason it works is that the behaviour of higher levels in the notional hierarchy is fully determined by the form and behaviour of their component parts.
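This kind of nested decomposition can be sketched as a tree. In the toy example below (the part names are illustrative, not a real engineering bill of materials), a recursive walk recovers the full list of fundamental parts - which, for a merely complicated system, together with the rules of interaction, is a complete description:

```python
# A merely complicated system abstracted as a nested hierarchy (toy example).
car = {
    "engine": {"block": {}, "pistons": {}, "valves": {}},
    "transmission": {"gearbox": {}, "clutch": {}},
    "braking system": {"disc brake": {"disc": {}, "pad": {}, "piston": {}}},
    "electrics": {"battery": {}, "wiring": {}},
}

def leaf_components(system: dict) -> list[str]:
    """Strictly reductionist analysis: flatten the hierarchy down to its
    fundamental parts. Nothing about the car is lost in doing so - the
    hallmark of a complicated, rather than complex, system."""
    leaves = []
    for name, parts in system.items():
        if parts:
            leaves.extend(leaf_components(parts))
        else:
            leaves.append(name)
    return leaves

print(leaf_components(car))
```

The contrast with a complex system is exactly that this flattening loses nothing: no information is 'hidden' in the whole-system pattern of interactions.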
Complex things are necessarily dynamic (can change in time) and embody information in their structure that influences the dynamics. But the defining characteristic
is that new properties emerge from their internal organisation -
properties that could not, even in principle, be predicted or
understood solely in terms of their component parts. These properties
must (like all properties) derive from embodied information - the
problem of complex systems is that some of the information is 'hidden'
in the whole-system level pattern of interactions. That is why the metrics mentioned above all attempt to quantify aspects of pattern (information) at the system level.
Francis Heylighen recognises "a common, 'objective' core in the different concepts of complexity" (Heylighen 1996). He says:
"Let us go back to the original Latin word complexus, which signifies 'entwined', 'twisted together'. This may be interpreted in the following way: in order to have a complex you need two or more components, which are joined in such a way that it is difficult to separate them. Similarly, the Oxford Dictionary defines something as 'complex' if it is "made of (usually several) closely connected parts". Here we find the basic duality between parts which are at the same time distinct and connected. Intuitively then, a system would be more complex if more parts could be distinguished, and if more connections between them existed."
This idea is especially relevant to living systems, which are readily interpreted as assemblies of different parts interacting through connections, many of which represent mutual dependencies, collectively making up a functioning whole. This applies equally well across the whole range of levels in biological organisation: from interactions among molecules, up to interactions between living processes and the non-living earth-systems for which the Gaia hypothesis is a potential explanation.
In my own view, there are two aspects of complexity. One is the 'richness' of relationships (entwinement): the inter-connectedness of the component parts, supremely elaborated in the brain, of course, but also in ecological communities: it is what Darwin referred to as a "tangled bank". The other aspect of complexity is the number and variety of different kinds of components that, so entwined, make up the whole. This number and variety is measured by diversity and in the biological context, that is of course biodiversity. That is why biodiversity has been given a theme within this project, and we should note that it is not just the number of species, but the number of all system components, including for example genes. In our interpretation it also quantifies the extent to which component parts are different and it includes the variety of interconnections as well. All these aspects can be quantified in terms of different kinds of entropy and as a consequence as information. To put it simply, any physical system is made up of components and the way they are interconnected - these are both quantifiable in terms of information because underlying both is pattern in configuration. The pattern may be called structure and it is this we look at next.
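Both aspects - the variety of components and the variety of their interconnections - can be quantified with the same entropy measure. The sketch below (with invented toy data, not a real survey) computes Shannon diversity from component counts and applies the same formula to the distribution of interaction types in a network:

```python
import math
from collections import Counter

def shannon_diversity(abundances: list[int]) -> float:
    """Shannon diversity H' (in bits) from component counts. The components
    may be species, genes, or any other class of system part."""
    n = sum(abundances)
    return sum(-(a / n) * math.log2(a / n) for a in abundances if a > 0)

# Toy community data (illustrative numbers only):
even_community = [25, 25, 25, 25]    # four equally abundant species
skewed_community = [97, 1, 1, 1]     # one species dominates

print(shannon_diversity(even_community))    # 2.0 bits: maximal for 4 types
print(shannon_diversity(skewed_community))  # much lower: little diversity

# The same measure quantifies the variety of interconnections, here the
# distribution of interaction kinds among links in a (toy) community network:
links = ["predation", "predation", "mutualism", "competition"]
print(shannon_diversity(list(Counter(links).values())))
```

In both cases the entropy is quantifying pattern in configuration, which is why both component diversity and connection diversity can be expressed as information.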
Structure and Identity
By structure, we particularly mean the way the components are connected together with causal relationships (links). More
specifically a structure is an ordered set of material components in
which the order (placement of each in relation to the others) creates a
whole (see 'The Ma of Ecology' for further explanation). But what, then, is a whole?
One very strong answer is the idea of a 'Kantian whole', named after the philosopher Immanuel Kant by Stuart Kauffman to represent a system in which "all the parts exist for and as a consequence of the whole". In other words, it is a system with closure to efficient causation, as Rosen and his followers (Hofmeyr and others) would describe it: a system whose components are made by the system, which in turn exists because of the functioning of those same components it made (see Circular Causation for further explanation of this). In this definition, a whole, and therefore a structure (which may or may not be complex), is its own cause (see Causal Closure).
Such a system has an identity and functions, and with those come the teleological (meaningful, purposeful) attributes that we can only legitimately attach to living things.
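The closure condition can be stated operationally. The toy check below (a sketch of the idea only, not Rosen's formal (M,R)-system, and with hypothetical component names) asks whether every component of a production network is produced by some component of the same network:

```python
# Toy test for closure to efficient causation: a system is a 'Kantian whole'
# only if every component is made by some component of the same system.
def is_kantian_whole(produces: dict[str, set[str]]) -> bool:
    """produces[x] = the set of components that x makes."""
    components = set(produces)
    made = set().union(*produces.values()) if produces else set()
    # "All the parts exist for and as a consequence of the whole":
    return components <= made

# A minimal self-producing loop (hypothetical components):
closed = {"enzyme": {"membrane"}, "membrane": {"ribosome"}, "ribosome": {"enzyme"}}
# A machine: its parts are made by an external factory, not by itself.
open_system = {"gear": set(), "spring": set()}

print(is_kantian_whole(closed))       # True: closed to efficient causation
print(is_kantian_whole(open_system))  # False: parts made from outside
```

The contrast between the two cases is the contrast between an organism and a machine: the car of the earlier example fails this test, however complicated it is.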
Summary
This theme contains the central application of our thinking: to
explain how life as a general phenomenon, independent of the scale we
look at it, is a kind of information processing. In recognising that
life’s information is embodied as functional complexity and using
information theory to understand how this naturally builds a hierarchy
of functional levels (see here) in which each of life’s defining phenomena plays out, we emphasise the unity over scales and through time. The rules
generating complexity apply continuously from atoms to whole ecosystems
and integrate life with the wider universe of information dynamics.
In
a fundamental sense, found through an information-theory perspective,
life as a whole is seen to be a single process, much elaborated by its
inherent complexity. But complexity is not a vague description of
diversity and intricacy: it has a formal and functional definition
which supports a deep and robust theory of life and its place in the
wider universe.
The
core idea here is that:
1) embodied information constrains the action of physical forces among ensembles of interacting components (for example, a lot of atoms or biological cells);
2) the constraint of forces makes efficient causes;
3) when causes are organised (by information embodied in the structure of an ensemble) into functional sets of interactions, the ensemble can become a complex system;
4) a set of complex systems can act as the components of higher-level complex systems, creating a hierarchy;
5) because all that is determined by information embodied in a nested hierarchy of organisational levels, we should be able to quantify biocomplexity and its functions in terms of information.
This theme gathers efforts to do exactly that.
The physical foundations of complexity
There are four physical forces: the strong and weak nuclear forces, electromagnetism and gravity (which is not quite the same, being a phenomenon of space-time). Still, the only force of much biological significance is the electromagnetic. These forces act upon, and emanate from, elementary particles, atoms (just not so much the noble gases) and ions. Without constraint they operate in all directions (A), but if constrained by a spatial configuration such as the regular lattice (B), they act coherently and so become effective at the macroscopic scale. The configuration is a limitation on where the particles are placed - very precisely in the crystal lattice shown. Constraint on the placement of items is equivalent to embodying information in the configuration of those items, so (B) embodies information. Crystals are entirely inert and simple - not complex
systems. But we do not have to stop there. In (C), atoms of several kinds
are arrayed in two different molecules which have an electron cloud
surface shape in which one of them fits rather well to the other. The
atoms at the surface also complement one another's left-over attractive
forces and so the two match and join together. The information
constraint of each causes the electrical forces of attraction to do
some work and information embodied in one effectively recognises the
information embodied in the other and they unite. This is exactly what
goes on when a signal molecule, such as a hormone, is recognised by its
receptor molecule. Not only that, but the configuration of the receptor
might spontaneously change when the connection is made (because of the
change in electrical charge distribution the joining causes). This
change can be arranged so that the receptor then releases or attracts
another kind of molecule and we have a signalling system. Or it may
change conformation (shape) to open a hole within it and let
small molecules through (as in the gated ion channel illustrated
in (D)). Either way, the matching of patterns which are embodied
information results in a functional unit - something that can perform a
useful function in the wider context of a system.
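The recognition-and-conformational-change step can be caricatured in code. In this toy model (my illustrative encoding, not real chemistry), each molecular surface is a string of charge signs, a ligand is 'recognised' when its pattern is the exact complement of the receptor's binding site, and recognition flips the channel's conformation:

```python
# Toy model of molecular recognition: surfaces are strings of charge signs,
# and binding requires an exactly complementary pattern.
COMPLEMENT = {"+": "-", "-": "+"}

def complements(ligand: str, receptor: str) -> bool:
    """True when every position on the ligand attracts its opposite number."""
    return len(ligand) == len(receptor) and all(
        COMPLEMENT[a] == b for a, b in zip(ligand, receptor))

class GatedChannel:
    """Information embodied as a binding-site pattern; recognition of the
    matching ligand changes the conformation (opens the pore)."""
    def __init__(self, binding_site: str):
        self.binding_site = binding_site
        self.open = False

    def encounter(self, ligand: str) -> None:
        if complements(ligand, self.binding_site):
            self.open = True  # conformational change: the gate opens

channel = GatedChannel("+-+-")
channel.encounter("+-++")  # wrong pattern: forces do no coherent work
print(channel.open)        # False
channel.encounter("-+-+")  # complementary pattern: recognised
print(channel.open)        # True
```

The point of the caricature is that the 'decision' to open is made by nothing but pattern matching: information embodied in one structure recognising information embodied in another.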
The idea that information is essential to life is familiar, but has
been largely confined to the molecular scale (considered in depth by
the Molecular Biology
theme). Here we extend the concept that life is an information
phenomenon to apply at every level of organisation, from molecules to
the global ecological system. Our synthesis
arrives at the conclusion that living is information processing: the
transformation of information by logical operations, together with its
transmission (in communications and reproduction) and storage. Memory
is maintained by both molecular states and ecological states as well as
the more obvious nucleic acids; more generally, information is stored
by life by embodying it in structure at multiple scales of
organisation, from the shape of biomolecules to the networks of
interaction among the populations of an ecosystem. The two main means life uses to process information are filtration (as in cognition), which selects by context, and synthesis, especially the combining of information at lower levels of organisation to appear at higher levels in complex systems (emergence). This information processing has one overall
function: it is to perpetuate itself as that is the ultimate function
of life.
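The two operations named above can be sketched minimally (with hypothetical signals and states, purely for illustration): filtration selects information against a context-set threshold, and synthesis coarse-grains many lower-level states into a single higher-level state:

```python
# Minimal sketch of life's two information-processing operations (toy data).

def filtrate(signals: list[float], context: float) -> list[float]:
    """Filtration: keep only the signals that the context selects."""
    return [s for s in signals if s >= context]

def synthesise(states: list[int]) -> int:
    """Synthesis: a majority vote coarse-grains many lower-level states
    into one higher-level state - a crude picture of emergence."""
    return 1 if sum(states) * 2 > len(states) else 0

receptor_inputs = [0.2, 0.9, 0.4, 0.8, 0.7]
selected = filtrate(receptor_inputs, context=0.5)  # context does the selecting
cell_states = [1, 1, 0, 1, 0]                      # five lower-level components
tissue_state = synthesise(cell_states)             # one higher-level state
print(selected, tissue_state)
```

Real biological filtration and synthesis are of course vastly richer than a threshold and a vote; the sketch only fixes the logical shape of the two operations.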
Life’s information is instantiated as pattern in the form of living structures, such as molecular and cellular structures. The
corresponding pieces of information are combined by the creation
of mutual context among forms: one form ‘means’ something to another
such that a process may take place when they encounter one another (for
example when a hormone meets its receptor). This context results in
apparently new information, but it is not in fact new, it is ‘revealed’
by the process as an emergent property of the system. This constructive
process forms arbitrarily large complexes of information, the combined
effects of which include the functions of life.
In terms of a computer analogy, life is both the data and the program
and its biochemical structure is the way the information is embodied. A
cell can be seen as a set of algorithms running on biochemistry; an
organism as a set of algorithms running on a community of cells and an
ecosystem as a set of algorithms running on a community of organisms.
This idea supports the seamless integration of life at all scales with
the physical universe.
References
Albantakis, L. (2021). Quantifying the Autonomy of Structurally Diverse Automata: A Comparison of Candidate Measures. Entropy 23, 1415. https://doi.org/10.3390/e23111415
Bertschinger, N., Olbrich, E., Ay, N., Jost, J. (2006). Information and closure in systems theory. Proceedings of the 7th German Workshop on Artificial Life (Jena, July 26-28, 2006): Explorations in the Complexity of Possible Life, pp. 9-19.
Heylighen, F. (1996). What is complexity? Principia Cybernetica Web. pespmc1.vub.ac.be/COMPLEXI.html
Hofmeyr, J-H. S. (2021). A biochemically-realisable relational model of the self-manufacturing cell. BioSystems 207: 104463. https://doi.org/10.1016/j.biosystems.2021.104463
Kauffman, S. A. and Clayton, P. (2006). On emergence, agency and organisation. Biology and Philosophy 21: 501-521.
Ladyman, J., Lambert, J., Wiesner, K. (2013). What is a complex system? European Journal for Philosophy of Science 3: 33-67.
Ladyman, J., Wiesner, K. (2020). What is a Complex System? Yale University Press.
Oizumi, M., Albantakis, L., Tononi, G. (2014). From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0. PLoS Computational Biology 10, e1003588.
This Theme seeks to:
- Define complexity in the biological context and show how it can be quantified;
- Develop a theory of how complexity at each level of organisation produces the phenomena of the next level, in terms that are independent of organisational level (see this page for new insights);
- Develop this into an integrated information-based
understanding of the complexity of life.
The Theme is led by
Dr Keith Farnsworth