Monday, June 18, 2012

What is Complexity?

This is going to be a long and winding post, as fundamental questions rarely have easy or short answers. Complexity is one of those concepts that may seem simple on its surface, but it encompasses a profoundly deep perspective on the nature of our existence. It is paired with simplicity in many respects, which I wrote about in:

http://theprogrammersparadox.blogspot.ca/2007/12/nature-of-simple.html

It would be helpful in understanding my perspective on complexity to go back and read that older post before reading further.

The first thing I need to establish is that this is my view of complexity. It is inspired by many others, and it may or may not be a common viewpoint, but I’m not going to worry in this posting about getting my facts or references into the text. Instead, I’m just going to give my own intuitive view of complexity and leave it to others to pick out from it what they feel is useful (and to disregard the rest).

Deep down, our universe is composed of particles. Douglas Hofstadter in “I Am a Strange Loop” used the term ‘epiphenomenon’ to describe how larger meta-behavior forms on top of this underlying particle system. Particles form atoms, which form molecules, which form chemicals, which form materials, which we manipulate in our world. There are many ‘layers’ going down between us and particles. Going upwards, we collect together as groups and neighborhoods, into cities and various regional collections, interacting with each other as societies. Each of these layers is another set of discrete ‘elements’ bound together by rules that control their interaction. Sometimes these rules are unbreakable; I’ll call these formal systems. Sometimes they are very malleable: these are informal systems. A deeper explanation can be found here:

http://theprogrammersparadox.blogspot.ca/2011/12/informal-ramble.html

If we were to look at the universe in absolute terms, the sum total of everything we know is one massive complex system. It is so large that I sincerely doubt we know how large it actually is. We can look at any one ‘element’ in this overall system and talk about its ‘context’: basically all of the other elements floating about, and any rules that apply to the element. That’s a nice abstract concept, but it is not very useful given that we can’t cope with the massive scale of the overall system. I’m not even sure that we can grok the number of layers it has.

Because of this, we narrow the context down to something more intellectually manageable. We pick some ‘layer’ and some subset of elements and rules in which to frame the discussion. So we talk about an element like a ‘country’, and we focus on what is happening internally in it, or we talk about how it interacts with the other countries around it. We can leverage the mathematical terminology ‘with respect to’ -- abbreviated to ‘wrt’ -- for this usage. Thus we can talk about a country wrt global politics, or wrt citizen unrest. This constrains the context down to something tangible.

A side-effect of this type of constraint is that we are also drawing a rather concrete border around what is essentially a finite set of particles. If we refer to a country, although there is some ambiguity, we still mean a very explicit set of particles at a particular (inferred) point in time.

So what does this view of the world have to do with complexity? The first point is that if we were going to craft a metric for complexity, then whatever form it takes must be relative. So it is ‘complexity wrt a, b, c, ..., z’. That is, some finite encapsulation of the underlying elements (possibly all the way down to particles or lower) and some finite encapsulation of all of the rules that control their behavior, at every layer specified. Complexity then relates to a specific subsystem, rather than to some type of absolute whole. Absolute complexity is rarely what we mean.

From that definition, we get a pretty strong glimpse of the underpinnings of complexity. We could just take it as some projection based on all of the layers, elements and rules. That is, of course, a simplification of its essence, and the simplification is itself subject to another set of constraints imposed by the reduction. Combined with the initial choice of subsystem, it is easy to see why any metric for complexity is subject to a considerable number of external factors.

Another harder, but perhaps more accurate, way of looking at complexity is as the size of some sort of multidimensional space. In that context we could conceive of what amounts to the equivalent of a ‘volume’: a spatial/temporal approach to looking at the space occupied by the system. This allows us to take two constrained subsystems and roughly size them up against each other, to be able to say that one is more ‘complex’ than the other.
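
To make the ‘volume’ idea a little more concrete, here is a toy sketch in Python. The Subsystem class and its three dimensions (elements, rules, layers) are purely my own illustrative assumptions, not a rigorous metric; the point is only that once each constrained subsystem occupies a measurable region, two of them can be roughly sized up against each other:

    # Toy model: treat a constrained subsystem as counts along a few
    # hypothetical dimensions, and its 'complexity' as the volume they
    # enclose. The dimensions chosen here are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Subsystem:
        name: str
        elements: int  # discrete parts at the chosen layer
        rules: int     # interactions binding those parts
        layers: int    # how many layers the 'wrt' frame spans

        def volume(self) -> int:
            # Rough 'volume' of the occupied space: the product of the
            # extents along each dimension.
            return self.elements * self.rules * self.layers

    gears = Subsystem("gearbox", elements=12, rules=5, layers=2)
    economy = Subsystem("regional economy", elements=10000, rules=300, layers=6)

    for s in (gears, economy):
        print(s.name, "volume:", s.volume())
    print("more complex:", max(gears, economy, key=Subsystem.volume).name)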

Complexity in this way of thinking has some interesting attributes. One of them is that while there is some minimum level of complexity within the subsystem, organization does appear to reduce the overall complexity. That is, in a very simple system, if the rules that bind it are increased, but the increase reduces the interactions of the epiphenomenon, the overall system could be less complex than the original one. There is still a minimum -- you can’t organize it down to nothing -- but chaos increases the size of the complexity (which is different from the way information theory sees the world). So there is some ‘organizational principle’ which can be used to push complexity down to its minimum; however, this principle is still bound by constraints similar to those that hold for any restructuring operation, such as simplification. That is, things are ‘organized’ wrt some attributes.
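
The parenthetical contrast with information theory can be seen in a small Python sketch (again a toy of my own, not anything rigorous). Standing in for ‘organization’ with a repeating pattern and for ‘chaos’ with random symbols, the rule-bound sequence has a much lower block entropy; information theory reads that as carrying less information, while in the view above the chaotic sequence is the more complex one to deal with:

    # Shannon entropy of length-k blocks: organized sequences occupy
    # few distinct blocks, chaotic ones spread across many.
    import math
    import random
    from collections import Counter

    def block_entropy(seq, k):
        blocks = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        n = len(blocks)
        counts = Counter(blocks)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    organized = "abab" * 250                                     # tightly rule-bound
    chaotic = "".join(random.choice("ab") for _ in range(1000))  # few binding rules

    print("organized:", block_entropy(organized, 4))  # ~1 bit: only 2 blocks occur
    print("chaotic:  ", block_entropy(chaotic, 4))    # ~4 bits: ~16 blocks occur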

Another interesting aspect of this perspective on complexity is how it relates to information. If complexity is elements and rules in layers, information is a path of serialization through these elements, rules and layers. That is, it is a linearized, syntactic cross-section of the underlying complexity. It is composed of details and relationships that are interconnected, but flattened. In that sense we can use some aspects of information theory to identify attributes of an underlying subsystem. There is an inherent danger in doing this, because the path through the complexity isn’t necessarily complete and may contain cycles and overlaps, but it does open the door to another method of navigating the subsystem besides ‘space’. We could also use compression techniques to show that a particular information path is near a minimal information path, so that the traversal and the underlying subsystem are, in essence, as tightly woven as they could possibly be.
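
As a rough sketch of that last idea (a Python toy, assuming that the output size of a general-purpose compressor can stand in for the length of a minimal description): if zlib can barely shrink a serialization, that linear path is already close to minimal.

    import os
    import zlib

    def compression_ratio(data: bytes) -> float:
        # Compressed size over original size; near 1.0 means near minimal.
        return len(zlib.compress(data, 9)) / len(data)

    structured = b"element:rule;" * 1000  # repetitive cross-section
    near_minimal = os.urandom(13000)      # random bytes: already incompressible

    print("structured  :", compression_ratio(structured))    # well below 1.0
    print("near minimal:", compression_ratio(near_minimal))  # about 1.0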

A key point is that complexity is subject to decomposition. That is, things can appear more or less complex simply by ignoring or adding different parts of the overall complexity. Since we are usually referring to some form of ‘wrt’, what we are referring to is subject to where we drew the lines in the space. If we move the lines substantially, a different subsystem emerges. Since there are no physical restrictions on where the lines are drawn, they are essentially arbitrary.

Subsystem complexities are not mutually independent of the overall complexity. We like to think they are, but in truth all things are interrelated. However, some influences are so small that they can be considered negligible. So, for instance, fluctuations in the temperature of Pluto (the planetoid) are unlikely to affect local city politics. The two seem unrelated; however, they both exist in the same system of particles floating about in space, and they are both types of epiphenomena, although one is composed of natural elements while the other is a rather small group of humans interacting together in a regional confrontation. It is possible (but highly unlikely) that some chunk of Pluto could come crashing down and put an end to both an entire city and any of its internal squabbling. We don’t expect this, but there is no rule forbidding it.

The way we as a species deal with complexity is by partitioning it. We simply ignore what we believe is on the outside of the subsystem and focus on what we can fit within our brains. So we tend to think that things are significantly less complex than they really are, primarily because we have focused on some layer and filtered down the elements and rules. Where we often get into trouble with this is with temporal issues. For a time, two subsystems appear independent, but at some point that changes. This often misleads people into incorrectly assessing the behaviors.

Because we have to constrain complexity, we choose not to deal with large systems, but they still affect the complexity. For the largest absolute overall system, it seems likely that there is a fixed amount of complexity possible. One has to be careful with that assumption though, because we already know from Gödel’s Incompleteness Theorem that there is essentially an infinite amount of stuff theoretically out there as it relates to abstract formal systems. One could get caught up in a discussion about issues like the tangibility of ‘infinite’, but I think I’ll leave that for another post and just state an assumption: there appears to be a finite number of particles, a maximum size, an end to time, and thus a finite number of interactions possible in the global system. For now we can just assume it is finite.

Because of the sheer size of the overall system, there is effectively no upper limit on how complex things in our world can become. We could apply the opposite of the earlier ‘organizational principle’ to build in what amounts to artificial complexity and make things more complicated. We could shift the boundaries of the subsystem to make it more complex. We could also add in new abstract layers, which again would increase the complexity. It is fairly easy to accomplish, and from our perspective there is effectively an infinite amount of space (wrt a lifetime) to extend into.

One way of dealing with complexity is by encapsulating it. That is, cleaving off a subsystem and embedding it in a ‘black box’. This works so long as the elements and rules within the subsystem are not influenced by things outside of the subsystem in any significant way. This restriction means that working encapsulation is limited to what are essentially mutually independent parts. While this is similar to how we as people deal internally with complexity, it requires a broader degree of certainty about independence to function correctly. You cannot encapsulate human behavior away from the rules governing economies, for instance, and these days you cannot encapsulate one economy from any other on the planet; the changes in one are highly likely to affect the others. Encapsulation does work in many physical systems, and often in many formal systems, but again only wrt elements in the greater subsystem. That is, a set of gears in a machine may be independent of a motor, but both are subject to outside influences, such as being crushed.
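
In code the same idea is familiar. Here is a minimal Python sketch, using a hypothetical Gearbox class whose internals sit behind a narrow interface; the black box only holds while nothing outside reaches in and influences those internals directly:

    # Toy 'black box': callers depend on the ratio() behavior, never on
    # the internal gear arrangement. The names here are hypothetical,
    # purely for illustration.
    class Gearbox:
        def __init__(self, teeth_counts):
            self._gears = list(teeth_counts)  # encapsulated subsystem

        def ratio(self) -> float:
            # The only rule exported to the outside world.
            return self._gears[0] / self._gears[-1]

    box = Gearbox([12, 24, 48])
    print(box.ratio())  # 0.25 -- the internals stay invisible

    # An outside influence like being crushed is not modelled at all,
    # which is exactly the limit of encapsulation described above.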

Overall, complexity is difficult to define because it is always relative to some constraints, and it is inherently woven through layers. We don’t tend to be able to deal with the whole, so we ignore parts and then try to convince ourselves that those parts are not affecting things in any significant way. It is evident from modern societies that we do not collectively deal with complexity very well, and that we certainly can’t deal with all of the epiphenomena currently interacting on our planet right now. Rather, we just define very small artificial subsystems, tweak them, and then hope for or claim positive results. Given the vast scale of the overall system, we have no realistic way of confirming that some element or some rule is really and truly outside of what we are dealing with, or that the behavior isn’t localized or subject to scaling issues.

Mastering complexity comes from an ever-increasing stretching of our horizons. We have to accept external influences, and move to either partition them or accept their interactions. In software, the complexity inherent in the code comes from the environment of development and the environment of operations. Both of these influence the flow and significance of the details within the system. Fluctuations in the outside needs and understanding drive the types of instructions we are assembling to control the computer. Our internal ‘symbols’ for the physical world align or disconnect with reality based on how well we understand their influences. As such, we are effectively modelling limited aspects of informal systems in the real world with the formal ones in a digital world. Not only is the mapping important, but so are the outside subsystems that we use to design and build it. As the boundaries increase, only encapsulation and organization can help control the complexity. They provide footholds for taming the problems. The worst thing we can do when managing complexity is to draw incorrect, artificial lines and then just blind ourselves to things crossing them. Ignoring complexity does not make it go away; it is an elementary property of our existence.
