Glossary of Terms
The following definitions are brief, loose, and approximate. But despite these features, they should give readers a decent understanding of what we mean when we use these terms.
Is there a term you would like to see here? Send us your suggestion!
Adaptive system engineering- the general enterprise of engineering a system to be adaptable. The mechanism for adaptation can be evolution, learning, phenotypic plasticity, swarm morphogenesis, etc. The system can be deployed with locked-in adaptation (e.g. an evolved static antenna) or with a continuing ability to adapt. Applications include communication networks, transportation networks, evolutionary computation, adaptive software, reconfigurable computing, video game design, smart materials, etc.
Applies, Fails to Apply, or Does not Apply- This is an important distinction in philosophy. A property X applies to an object (or phenomenon) A iff "A is X" is true. Property X fails to apply to A iff "A is X" is false. Property X does not apply to A iff "A is X" is neither true, false, nor vague. This happens when there is a conceptual gap between the possible exhibitors of the property and the category of the proposed exhibitor; i.e. "A is X" just doesn't make any sense. An example of not applying is "honesty is orange". That's not false; it just doesn't mean anything.
Autocatalytic Set- A collection of entities (molecules, people, nations, institutions, whatever) that produces as outputs the same elements which are necessary inputs for expanding the collection. This special kind of feedback is a key process in Stuart Kauffman's theory for the origin of life. Note the distinction between self-generating and self-maintaining processes.
Autopoietic Set- Autopoiesis is the process of dynamic self-maintenance. These collections of entities (molecules, people, cars, stars, whatever) interact in such a way that, while the constituting elements may be replaced, the system as a whole keeps the same cohesive structure. This is clearly an important concept in complex systems that gets very little attention (by name at least).
Building Block- When parts of a collection organize themselves into patterns that we recognize as coherent phenomena, we call those patterns emergent phenomena. Those coherent systems sometimes act as parts of a larger system. When systems are built from hierarchies in this way, we refer to the coherent subsystems as building blocks and to the system as modular.
Cellular automata (CA)- a system of simple rule-based
computational cells, often connected in a regular, static topology. Each cell
acts as a finite state automaton. The next state of each cell depends on
the state of cells in a local neighborhood (local in space and in time)
according to the transition rule (aka update rule, aka state table). The
cells are updated according to a schedule and an ordering scheme.
Clock-driven scheduling is typically used so that each cell is updated once per
system time interval. Within a system time
interval, the cell updates can be modeled as synchronous (as if all updates occur in parallel), or
asynchronous (order dependent). CAs were pioneered by John von Neumann and
many others, with memorable results such as the first self-replicating computing
system, and the popular Game of Life. CAs are generally thought of as a
different class of system than an agent-based model, although one could view
each cell as an immobile reactive agent.
There are a variety of neighborhoods such as von Neumann (5), Moore (9),
circular, hexagonal, and extensions in time (using state history). Many CAs
define a neighborhood such that it includes the cell being updated, but this is not necessary.
Popular CA topologies include 1-dimensional (a line or a ring, which allows one
to easily visualize the time dimension) and 2-dimensional (grids, tori).
While a CA system can be simply specified (via a neighborhood, rule table, and
topology), the state space increases exponentially with the number of cells;
the set of initial conditions can be enormous, and the dynamic computational patterns can be remarkably complex.
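As a concrete illustration, here is a minimal one-dimensional CA in Python. The specific rule (Rule 90: each cell becomes the XOR of its two neighbors), the fixed zero boundaries, and the grid width are illustrative choices, not part of the definition:

```python
def step_rule90(cells):
    """One synchronous update of an elementary CA under Rule 90:
    each cell becomes the XOR of its two neighbors, with fixed
    zero boundaries at the ends of the line."""
    n = len(cells)
    return [
        (cells[i - 1] if i > 0 else 0) ^ (cells[i + 1] if i < n - 1 else 0)
        for i in range(n)
    ]

# A single live cell unfolds into the Sierpinski triangle pattern.
row = [0, 0, 0, 1, 0, 0, 0]
history = [row]
for _ in range(3):
    row = step_rule90(row)
    history.append(row)
```

Even this tiny system shows the point above: the rule is trivially specified, yet the space of possible configurations grows as 2^n with the number of cells.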
Complex- Though the whole field is named after this property of some systems, no accepted definition exists. And since very few philosophers are engaged in complex systems, progress in elucidating the concepts behind complex phenomena has been slow. Among many others, John Holland has identified some important hallmarks of complex systems in the book "Hidden Order". Many of the articles written for this blog will focus on investigating complexity, e.g. see blog entries Complexity vs Complication and Complexity as a Homeostatic Property Cluster.
Complicated- Sometimes complication is considered a degenerate counterpart of complexity. High levels of complication are associated with large numbers of moving parts with (perhaps) specialized functions. Despite having these elements in common with complex systems, complicated systems do not exhibit the interesting behavior that often arises in complex systems. This is currently not a well-defined distinction.
Discrete event simulation- simulation of a model that has discrete states, where the model is updated
according to an event-driven schedule. The schedule contains a current
list of future events and the times that those events will occur. Event-driven
scheduling differs from clock-driven scheduling (which can involve either
synchronous ordering or asynchronous ordering of state element updates).
Event-driven scheduling can be appropriate for modeling expected future events:
births that imply deaths, isotopes that decay over time, balls that go up and
predictably come down. It can be more efficient to
use event scheduling than to compute the next state for a large number of
(mostly unchanging) state elements over a series of clock-driven steps. An event-driven schedule may need to be adjusted for contingencies;
the arrival time of a train may be impacted by unexpected weather, requiring us
to delete or adjust the arrival event in the schedule. Lastly, it is
possible to implement both event-driven and clock-driven scheduling in a single simulation.
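A minimal sketch of an event-driven schedule in Python, using a priority queue as the future event list. The "birth implies a later death" rule and the time offsets are hypothetical, chosen to echo the examples above:

```python
import heapq

def run(events):
    """Minimal event-driven scheduler: events are (time, name) pairs
    popped in time order; a handler may schedule further future events."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, name = heapq.heappop(queue)
        log.append((time, name))
        # Expected future event: a birth implies a death later on
        # (an assumed rule, standing in for any domain logic).
        if name == "birth":
            heapq.heappush(queue, (time + 80, "death"))
    return log

log = run([(0, "birth"), (5, "arrival")])
```

Note that the simulation jumps directly from time 5 to time 80; a clock-driven scheme would have computed 75 intervening steps in which nothing happens.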
Discrete event system- a system with discrete states, and for which state changes are driven by events. Examples include queuing systems
such as a bank or a communication network, in which the set of events includes the arrival or departure of customers/packets. Many technological systems are discrete event
systems, often driven by human events; imagine a copy machine that changes state only when a button is pressed. Note that the word discrete refers here to the
system states, not to the events themselves (which would be redundant) or to time; time can be modeled as either continuous or discrete. The events are assumed to occur instantaneously. A discrete event system is often modeled using a
finite state automaton.
Dispositional Property- Some properties do not exhibit themselves in an active way. Properties such as fragile or soluble apply to objects and substances whether or not they are ever broken or dissolved. A vase is fragile whenever it is the case that IF it gets struck then it will break. So a dispositional property identifies how an object reliably reacts in certain conditions. Systems exhibiting evolution and/or adaptation will generally have system-level dispositional properties (including evolvable and adaptive).
Dissipative Structures- Systems wherein a continual flow of energy and/or matter is necessary to maintain structure and performance. These are systems where there are no equilibria, or where all equilibria are states to be avoided (e.g. death). Biological systems are like this, constantly changing and adapting to maintain functionality, and so are most other complex systems. In such a system sustainability is very different from stability.
Downward Causation- The misguided idea that phenomena (objects, behaviors, etc.) at higher levels of organization can cause change in lower levels of organization. This is now a common mistake in complex systems pseudo-science: the idea that emergence implies that higher-level phenomena create their own rules, which their constituent parts must follow. The macro-phenomena are lossy compressions of micro-phenomena, and though different cohesive patterns can be identified (aka emerge), they are wholly constituted by and explained by the lower-level phenomena. Causation, and therefore explanation, can only operate laterally through a system because going up is defining and going down is ridiculous.
El Farol Bar Problem- Now frequently known in its more general form as the minority game, it originates from Brian Arthur, an economist who helped start the Santa Fe Institute. Once a week the El Farol bar had live Irish music that was enjoyable only if the place wasn't too crowded. So the trick was to decide when to go and when to stay home to maximize expected utility. Different rules, including various mixes of rules, worked to produce the (on average) optimal distribution. This is one of the best, earliest examples of agent-based modeling of complex systems.
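The flavor of the problem can be sketched as a deliberately crude toy model (this is not Arthur's actual model; the prediction rule, the capacity, the bias range, and all parameters are assumptions):

```python
import random

def el_farol(n_agents=100, capacity=60, weeks=52, seed=1):
    """Toy El Farol sketch: each agent predicts this week's crowd as
    the average of past attendance plus a fixed personal bias, and
    goes only if the prediction stays under the comfortable capacity."""
    rng = random.Random(seed)
    biases = [rng.uniform(-20, 20) for _ in range(n_agents)]
    history = []
    for _ in range(weeks):
        expected = sum(history) / len(history) if history else capacity
        going = sum(1 for b in biases if expected + b < capacity)
        history.append(going)
    return history
```

The interesting feature, which this sketch only gestures at, is that no single shared prediction rule can work: if everyone predicted a quiet night, everyone would go, falsifying the prediction. Heterogeneous rules are essential.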
Emergent- Emergence is one of the core, yet undefined, concepts of complex systems. Some common slogans are "More than the sum of its parts" and "More is different (Anderson)". The general underlying idea of emergence is that the interactions of a system's parts generate (under certain circumstances) coherent behavior of the whole system. Sometimes 'emergence' refers to the whole system itself (e.g. a hurricane), and sometimes to emergent behavior (e.g. the spiraling of hurricanes) or emergent properties of the system (liquidity). I'm working on a project to refine this concept and you can look through the philosophy section of the actual blog to find posts on this subject.
Emergent Hierarchies- A hierarchy is a category of levels of organization where lower levels are proper subsets of higher levels. Furthermore, in an emergent hierarchy the emergent behavior at any given level of organization emerges from the behavior of the subsystems at the immediately lower level of organization.
Evolvable hardware- A hardware system that was created or optimized with an evolutionary algorithm
(evolved hardware), or hardware that has the additional capability of evolving in-situ (evolving hardware). Evolvable hardware is a specific example of the more general adaptive system engineering.
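The evolutionary loop behind evolved hardware can be sketched with a toy (1+1) evolutionary algorithm, where a bitstring stands in for a device configuration and counting 1s (the classic OneMax problem) stands in for a measured fitness; all parameters here are illustrative:

```python
import random

def one_plus_one_ea(n_bits=24, generations=300, seed=42):
    """Toy (1+1) evolutionary algorithm: mutate a single parent
    bitstring and keep the child whenever it is at least as fit.
    The bitstring stands in for a hardware configuration; sum of
    bits stands in for a measured fitness."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n_bits)]
    for _ in range(generations):
        # Flip each bit independently with probability 1/n_bits.
        child = [b ^ (rng.random() < 1 / n_bits) for b in parent]
        if sum(child) >= sum(parent):
            parent = child
    return parent
```

In evolved hardware the fitness call would be a physical measurement of the candidate device; in evolving hardware, this loop keeps running in the field.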
Feedback- In the context of most of the material on this site, feedback refers to a sequence of flows such that the output of a process acts again as input for that same process. It is often the case that the output goes through several other processes before feeding back into the originator. John Holland identifies feedback as one of the primary hallmarks of complex systems.
Game of Life- Two completely separate ideas have this name: 1) John Conway's famous and pioneering cellular automata model for artificial life, and 2) iterated game-theoretic models used in the evolution of morality and cooperation (e.g. by Ken Binmore). The first was actually just called 'Life', but people treated it like a game of sorts, so it sometimes gets called this. The game-theoretic usage aims to identify the series of games one has to play to survive in the world with others - the games that everybody plays every day just to get by.
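Conway's rules are compact enough to state in a few lines of Python; here is a sparse-set implementation (the set-of-live-cells representation is just one convenient encoding):

```python
from collections import Counter

def life_step(live):
    """One synchronous step of Conway's Life: a cell is live next step
    iff it has exactly 3 live neighbors, or it is live and has 2."""
    counts = Counter(
        (r + dr, c + dc)
        for r, c in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The glider: after four steps it reappears shifted one cell
# down and one cell right, gliding across the grid forever.
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)
```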
Game Theory- The mathematical technique for modeling strategic interaction, invented by John von Neumann and the dominant methodology for constructive models in the social sciences today. Games consist of a set of players (agents, parts), each of which has a set of actions and a utility function over the space of possible outcomes. It's strategic because the outcome each player cares about depends on the actions of all the agents. Games come in many forms, each suited to particular applications. The technique is connected to complexity because it can (to a minimal degree) capture agent interaction, dynamics, learning, and other features of complex systems. In principle, however, only the simplest games can be solved, and they offer little insight into realistically complicated systems. Also, the assumptions necessary to model using game theory are almost never true of interesting systems; this puts devastating limitations on the fidelity of game-theoretic models. Though still a useful tool for learning about complexity in limited contexts (and fully appropriate for some specific applications), it is often stretched beyond its range of usefulness, and more robust modeling techniques need to be adopted.
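For the simplest games, solutions can be found by brute force. The sketch below enumerates pure-strategy Nash equilibria of a two-player game; the payoff numbers are a standard prisoner's dilemma, chosen purely for illustration:

```python
from itertools import product

def pure_nash(payoffs, actions):
    """Brute-force pure-strategy Nash equilibria of a two-player game.
    payoffs maps (row_action, col_action) -> (row_payoff, col_payoff).
    A profile is an equilibrium iff neither player can gain by
    unilaterally switching actions."""
    equilibria = []
    for a, b in product(actions, repeat=2):
        row_best = all(payoffs[(a, b)][0] >= payoffs[(x, b)][0] for x in actions)
        col_best = all(payoffs[(a, b)][1] >= payoffs[(a, y)][1] for y in actions)
        if row_best and col_best:
            equilibria.append((a, b))
    return equilibria

# Prisoner's dilemma: defection dominates, so (D, D) is the unique
# equilibrium even though (C, C) would make both players better off.
pd = {
    ("C", "C"): (3, 3), ("C", "D"): (0, 5),
    ("D", "C"): (5, 0), ("D", "D"): (1, 1),
}
```

The exhaustive search is exactly why only small games are tractable: the profile space grows exponentially in players and actions.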
Genotype- An organism's genotype is the gene sequence that it carries. Gene expression is a complicated process, and not every change in the gene sequence generates a change in the organism's physical characteristics or behavior (called the phenotype); Walter Fontana models such changes as "neutral mutation networks". The idea of a genotype applies to artificial systems as well as biological ones; a binary string that translates into agent behavior in a computational model is also called that agent's genotype.
Group Selection- The idea that the process of natural selection can operate on collections of individuals such that collections with a certain property promulgate more than collections without the property. The idea was originally introduced by Darwin himself in 1871, but had fallen into ill repute in the twentieth century. Recent work is reviving the notion, and at least one of us thinks that selection processes can operate equally effectively at any level of organization. See the post on The Return of Group Selection for more details.
Homeostatic Property Cluster- Within realist (naturalist) philosophy it is a definition of a category of objects such that there is no uniform set of necessary and sufficient conditions that all and only the member objects satisfy. Though it is a conceptual matter which properties are included, there is room for ambiguity or indeterminacy in the set of objects satisfying the definition. Despite this "imprecision", the picked-out objects "correspond to inductively and explanatorily relevant causal structures" (Richard Boyd).
Homo Economicus- This term is one way to refer to agents who have complete information and act perfectly rationally; that is, agents that satisfy the requirements of neo-classical economics. The term predates game theory, arising in nineteenth-century critiques of political economy, though the model is closely associated with the `minimax theorem' John von Neumann developed for the theory of games in 1928. A good deal of current research investigates variations on the assumptions constituting homo economicus, all under the name of bounded rationality.
If and Only If (iff)- The standard abbreviation for "if and only if" is "iff". This is used to indicate a biconditional relationship, i.e. if A then B and if B then A. It can also be stated "A just in case B" and is sometimes associated with unique cause and effect relationships.
Learning agent- An agent that encodes learned information into its internal structure (rules). Example: a classifier system. A learning agent differs from a reactive agent or a model-based reactive agent.
Level of Organization- The metaconcept for hierarchies that allows for non-fully-nested series of layers; different aggregates of a single collection of parts may generate different emergent phenomena at different levels, or for different models at the same level. While some models induce or require true hierarchies, a definition of emergence requires the broader notion of levels of organization. The levels are just an ordinal rank of parts aggregating to make wholes at a higher level.
Level-Sensitive- Also known as hierarchy-sensitive. A property of aggregates that changes its criterion of application, truth conditions, or other such properties depending upon the organizational level at which it is being attributed. Level-insensitive aggregate properties (like mass, velocity, etc.) apply and can be evaluated independently of any levels of organization (aka hierarchies) in the system. See Identifying Levels through Emergence for related information.
Lossy Compression- Compressions are alternative representations of information that convey some function/benefit. Lossless compressions utilize repeated structures in the data to reduce the number of parts in the compressed representation following two-way rules so that the original representation can be regained with no loss of information (like zip files). Lossy compression samples data from the original system following some rules and throws the rest of the information away (like a jpg or mp3). Phenomena examined at higher levels of organization are typically lossy compressions of their constituent parts.
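The distinction can be made concrete with a toy example, using run-length encoding as the lossless case and naive downsampling as the lossy one (both chosen purely for illustration):

```python
def rle_encode(data):
    """Lossless run-length encoding: repeated structure shrinks the
    representation, and a two-way rule recovers the original exactly."""
    runs = []
    for x in data:
        if runs and runs[-1][0] == x:
            runs[-1][1] += 1
        else:
            runs.append([x, 1])
    return runs

def rle_decode(runs):
    """The inverse rule: expand each (value, count) run back out."""
    return [x for x, n in runs for _ in range(n)]

def downsample(data, k=2):
    """Lossy compression: keep every k-th sample and discard the rest;
    no rule can restore the thrown-away information."""
    return data[::k]

signal = [5, 5, 5, 5, 2, 2, 7]
```

Round-tripping through the lossless pair returns `signal` exactly; the downsampled version is smaller but the dropped samples are gone for good, just as a macrolevel description drops microlevel detail.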
Macrolevel and Microlevel- When comparing two levels of organization it is often convenient to refer to the lower (higher resolution) level as the microlevel and the higher (lower resolution) level as the macrolevel. This nomenclature makes it easier to state such things as "the macrolevel supervenes on the microlevel" or "this macro level is a lossy compression of its microlevel".
Metamodel- A model of a model is what we call a "metamodel". The most common metamodeling technique is to summarize a model's output as a distribution of values; this is a statistical model of a (say) mechanistic model. The Temporal Web and Non-Abelian Operator techniques are also metamodels. One use of metamodels is to track and measure parameter changes (like the above three), but one could also use mathematical modeling or even another ABM technique (such as a network) to capture more elaborate details of the model.
Mathematical Proof- The exact nature of a mathematical proof can vary wildly just as experiments within physics or biology can be quite different depending upon the subfield and what is being proved. Though technological and conceptual advances may alter the techniques available to scientists and mathematicians, the underlying principle behind the methodology of proof has remained unchanged for thousands of years. I quote Ian Stewart, "a proof is a logically coherent story, deriving new theorems from existing ones in a manner that can withstand the closest scrutiny by skeptical experts."
Mereology- Mereology is the philosophical study of part-whole and part-part relationships. There are multiple distinct ways that wholes can be related to their constituting parts, and these can be teased apart by examining the logical relations that they employ. Mereological principles have a lot to offer the study of complex systems, especially with respect to an analysis of what emergence (a kind of mereological relationship) amounts to. For an in-depth discussion of mereology, refer to the Stanford Encyclopedia of Philosophy.
Methodology vs Methods- Though the two words are often used interchangeably without confusion, it may sometimes help to draw the distinction. A methodology is a coherent collection of ways to approach problems, while a method is a particular technique or tool to collect or transform data. For example, statistics is a methodology and maximum likelihood estimation is a method; agent-based computer modeling is a methodology and 2D cellular automata modeling is a method.
Model-based reactive agent- An agent that reacts to input based upon an internal model (state). Examples: a neural net with a static structure but feedback loops; sequential logic (circuit with state elements). A model-based reactive agent differs from a reactive agent or a learning agent.
Modularity- A system is modular if it is made up of building blocks, i.e. coherent subsystems that can reoccur in different configurations in other systems to produce different phenomena. Herbert Simon listed modularity as a hallmark of complex systems; hierarchies of increasing complexity evolve by combining subsystems which are themselves composed of a stable collection of subsystems which are composed of … all the way down. Harnessing the modular nature of systems is probably the key to engineering complex systems (in the short term). In network theory 'modularity' refers to a structure where there are discernible groups: the nodes are sparsely connected between separate groups, but inside each group the nodes are highly connected.
Multilevel Emergence- It is possible that some emergent phenomena at different levels of organization are generated by the same collection of parts. Not just in the sense that they can both be reduced to the same level, but further that, despite being at different levels themselves, the highest level capable of generating each phenomenon is the same. There could be differences in the number of parts required or in the relevant interagent behavior, but the point is that the same agents can generate different emergent behavior at different levels without there being any intermediary level of organization to act as subsystems.
Multiplier Effect- When an agent (i.e. an element of a system) produces an output that serves as an input for another agent, which gets processed and again used as input by yet another agent and so forth down the line such that the total output generated is (much) greater than the initial input, we say that the process exhibits a multiplier effect. It is borrowed from economics where it often refers to the net effect on GDP of some policy (like a tax cut) beyond the amount of the tax cut; it is due to how money flows through value-added processes. It is also one of the hallmarks of complex systems identified by John Holland.
Natural Kind- A natural kind is a category for which there is a natural definition in the form of a clear set of necessary and sufficient conditions such that any object meeting them belongs to that category. The typical example (due to Hilary Putnam) is 'water = H2O'. Other common (though debated) examples of natural kinds are gold, tigers, and apples. One can think of natural kinds as the categories we would find if we 'cut nature at its joints' (though one might argue that we may not have a category for everything, i.e. it wouldn't be a partition of the things in nature). The definition of a natural kind should include those properties that all and only members of that category have.
Phenotype- The physical characteristics (including behavior, dispositions, development, etc.) of an organism are called its phenotype. Under the standard evolutionary model it is the phenotype that expresses the properties that determine the organism's fitness. It is distinguished from an organism's genotype, the code that generates the phenotype. Some species exhibit phenotypic plasticity, the ability to drastically change physical features (such as sex, size, breathing apparatus, etc.) independent of the development process in reaction to environmental stresses.
Phenotypic Plasticity- Genes only partially encode for the features of an adult organism. Environmental factors (such as nutrition) play an important role in development. But even fully developed adults of some species can dramatically alter their gene expression under appropriate environmental pressures. One common example of phenotypic plasticity is that adults of some fish and amphibian species can change their sex (whether they produce eggs or sperm) under certain conditions. This must be differentiated from learning (e.g. nest location), regeneration (e.g. salamander tails), and dispositional abilities (e.g. chameleon or octopus color changing, language).
Phylogenetic Tree- A graph diagram that represents how long ago species became differentiated is called a phylogenetic tree or evolutionary tree. Branching points in the tree are speciation events that may actually lead to entirely new lineages (e.g. mammals versus birds). These diagrams and genetic markers can be used to determine how far back two species diverged, i.e. what their last common ancestor was and when it lived.
Punctuated Equilibrium- It was originally proposed by Niles Eldredge and Stephen Jay Gould to explain patterns of new species in the fossil record that did not fit Darwin's principle of gradualism. The pattern can be described as long periods of evolutionary stability with brief periods of rapid evolutionary change found (either periodically or randomly) along the timeline. Its validity for speciation is contentious, but the pattern of fits and starts in rates of change in evolutionary processes has been identified in several (natural and artificial) evolutionary systems. Many independent factors (either endogenous or exogenous) are sufficient for producing punctuated equilibria.
Rational- As used in social science it is a term of art describing agents who act to maximize expected utility. It frequently comes under fire as being an unrealistic assumption of human cognitive abilities and/or tendencies. These are usually straw man arguments since rationality is not supposed to be realistic. Rationality merely claims that for any set of observed actions, there exists a utility function such that maximizing that utility function produces the same actions as the observed ones. It is a model, an approximation, a useful fiction that has proven to be very helpful in analyzing otherwise intractable problems.
Reactive agent- An agent that simply reacts to its input. Examples: stimulus-response; feedforward neural net; lookup table; combinational logic. A reactive agent differs from a model-based reactive agent or a learning agent.
Recycling- In a complex systems context, recycling refers to the combination of feedback and a multiplier effect. Outputs are enhanced in some way by other elements of the systems and then fed back into the originating element. Through this process a collection of entities can bootstrap itself towards larger collections and greater complexity.
Resolution- When comparing two levels of organization we often wish to speak of the difference in resolution, which is, roughly speaking, the ratio of the number of parts (the scope) in the microlevel to the number of parts in the macrolevel. As such, resolution is a relative measure, although it can be relative to some third level or implicit baseline (like actual size, or taking the macrolevel scope to be 1). In most models of complex systems the resolution of a system can actually be measured explicitly, but the idea remains useful in looser contexts in discussions of emergence and reduction.
Satisfice- To choose an option that is good enough (i.e. that satisfies some condition). It is generally taken to be a technique of bounded rationality that operates instead of maximizing. However, one should notice that we can convert any satisficing condition into a maximization one. For example, instead of searching for the best possible paper to read for the reading group this week, I just selected the first paper that was interesting enough for the purpose at hand. But that's the same thing as maximizing if we consider a desire to keep search times low in the analysis.
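The paper-picking example can be sketched directly (the papers, scores, and threshold are made up for illustration):

```python
def satisfice(options, score, threshold):
    """Return the first option that is good enough, in encounter order;
    the search stops as soon as the threshold is met."""
    for option in options:
        if score(option) >= threshold:
            return option
    return None

def maximize(options, score):
    """Return the best option, which requires scoring every option."""
    return max(options, key=score)

papers = [("A", 4), ("B", 7), ("C", 9), ("D", 7)]
quality = lambda p: p[1]

picked = satisfice(papers, quality, threshold=6)  # first good-enough paper
best = maximize(papers, quality)                  # global optimum
```

The satisficer stops at paper B without ever looking at C or D; the maximizer must examine all four. That saved search effort is exactly the implicit term that, once added to the objective, turns satisficing into a form of maximizing.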
Scope- The scope of a level of organization is the number of "parts" included at that level. Parts can be any kind of phenomenon (e.g. objects, behaviors, properties), so it is sometimes difficult to specify an exact number here. The scope of the system determines which elements are endogenous (included in the scope) and exogenous (not within the scope). Some models can change scope (e.g. through birth and death processes), and there is certainly some link between microlevel scope and macrolevel emergence.
Speciation- can be considered as either a process by which new species arise over time or an event by which one species becomes two. Several mechanisms can result in speciation: evolution during physical isolation, genetic divergence within a multiple-niche environment, and mutations to key genes, to name a few. Speciation events are the branching points of phylogenetic trees and are therefore also phylogenetic branching events. A lineage may evolve into a distinct species without ever "splitting", in which case a new species comes about without a speciation event; the speciation process, however, can be said to track niche movements, and the general concept still applies.
Stable Polymorphic Equilibrium- When an evolutionary process results in the persistence of multiple traits, properties, behaviors, etc. instead of a single homogeneous population, it is often referred to as a stable polymorphic equilibrium. The name probably derives from differential equation models, but it's malapropos since real evolving systems are neither stable nor in equilibrium. It is better to think of them as robust distributions or persistent heterogeneous populations.
Strongly Emergent- A phenomenon is strongly emergent if the macrolevel behavior/object/property is conceptually distinct from any description in terms of lower-level phenomena. The existence of strongly emergent phenomena is questionable because they require a metaphysical, rather than just an epistemological, gap between levels; they would count as new stuff/things in the world beyond what is accounted for at any microlevel.
Supervenience- To say that A supervenes on B means that there can be no change in A without a change in B. One common example is that the mind supervenes on the brain: any change in one's mental state implies that there has been a change in one's brain state. This is an important concept in philosophy that has a clear connection to emergent phenomena; if phenomenon A (say a hurricane) emerges from microbehavior B (movement of water vapor molecules), then whatever the interactions of B that produce A are, any change in A must be brought about via changes in B. For a very detailed explanation of the types and uses of supervenience, see the Stanford Encyclopedia of Philosophy.
System- A standard technical definition of a system is the collection of elements for which interactions are considered internal. Any forces, relations, etc. not among the included elements are considered external. A more elegant definition of what is generally meant comes from John Heskett's Design: A Very Short Introduction: "a group of interacting, interrelated, or interdependent elements that forms, or can be considered to form, a collective entity (p.97)."
Utility Function- There are many different instantiations of utility functions in the literature, but basically they are all a map from states to the real line. One advantage of using utility functions is that all states of affairs can be compared, and one state can be found to yield greater, lesser, or equal utility than the others. This should be distinguished from a broader class of techniques known as preference functions, which only require that any two possible options can be pair-wise compared (so a preference need not even be mappable onto the real line).
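The distinction can be sketched as follows (the states, utility values, and preference relation are all made up for illustration):

```python
def utility_rank(states, utility):
    """A utility function maps each state to the real line, so any set
    of states can be totally ordered by a single number."""
    return sorted(states, key=utility, reverse=True)

def prefers(a, b, pairwise):
    """A preference relation only needs pairwise comparisons; it need
    not come from (or be mappable onto) a real-valued utility."""
    return pairwise.get((a, b), False)

states = ["rain", "sun", "snow"]
u = {"rain": 1, "sun": 5, "snow": 3}
ranked = utility_rank(states, lambda s: u[s])  # best state first

# A bare preference relation: only the comparisons we actually assert.
pairwise = {("sun", "rain"): True}
```

Note that the dictionary `pairwise` can encode preferences (even cyclic ones) that no real-valued utility function could reproduce, which is exactly why preference functions are the broader class.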
Vagueness- The application of truth conditions for some properties is cut and dried; for properties with fuzzy boundary conditions, however, the truth conditions are vague. The two canonical examples are baldness and the Sorites (heap) argument. Put one grain of sand on a table, is it a heap? No. Add a grain, is it a heap now? No… (thus adding one grain never makes the difference) …But wait! Now there are a billion grains and it's a heap. For some numbers and arrangements it's clear whether or not the collection is a heap, and for others it is unclear. If it's ever unclear, then that property is vague.
Weakly Emergent- A phenomenon is weakly emergent if the macro behavior/object/property is unambiguously generated from its parts (i.e. the micro behavior/objects/properties). This is one step above simple aggregates in that it involves some micro-level interaction producing coherent dynamics at the macro level. Institutions (like universities and governments) are examples of weakly emergent phenomena.
Wireless mesh network- A self-configuring, self-healing network, with wireless links between nodes that act as routers. Also referred to as mobile ad hoc network (MANET). A wireless mesh network is often also a peer-to-peer network.
Yield Throughput- A process' output divided by its error rate is its yield throughput. It is used as a performance metric for comparing fabrication processes in industry, computer science, and biomedical engineering. Similarities among those processes and computer models hint at an analogous measure for simulations, prediction resolution (projected data points per time) divided by the error rate of those predictions.