
vesica attractor; fossil evidence of gaia egg


metatron

Recommended Posts

Read my post it is about (Punctuated Equilibrium)

Punctuated equilibrium: archetypal life forms were born on the cusp of two worlds. I believe the purpose of this is to establish an informational feedback loop anchored at a central Source.

From this evolutionary still point, genetic novelty could be collected and recombined so that new species could be created.

Link to comment
Share on other sites

  • Replies 59

Read my post it is about (Punctuated Equilibrium)

 

Done, most unilluminating. You don't seem to actually understand what the idea means.

 

Punctuated Equilibrium; archetypal life forms were born on the cusp of two worlds.

 

Mars and Pluto for instance?

 

I believe the purpose of this is to establish an informational feedback loop anchored at a central Source.

 

Ever heard of the teleological fallacy?

 

From this evolutionary still point genetic novelty could be collected and recombined so new species can be created.

 

Genetic novelty doesn't need to be 'collected' or 'recombined'. Your ideas are pointlessly overcomplex, redundant and unrigorous.


Done, most unilluminating. You don't seem to actually understand what the idea means.

 

 

"In case you hadn't noticed, most of the quotes are referring to the development of evolutionary theory called 'punctuated equilibrium', which is not a challenge to evolutionary theory, only a development and refinement"

Mars and Pluto for instance?

 

Ever heard of the teleological fallacy?

 

Genetic novelty doesn't need to be 'collected' or 'recombined'. Your ideas are pointlessly overcomplex, redundant and unrigorous.

 

 

'Punctuated equilibrium' refers to a sudden change after a long period of morphological stasis. Are you going to admit your mistake and apologize, or just edit your previous post?

 

 

 

Macrocosm/Microcosm

 

 

 

there is a center to all things


Metatron, if Aardvark, Mokele and Ophiolite are having problems extracting a cogent argument from your posts, then you have a problem. As previously noted, either:

a) your content is rubbish

b) your communication skills are letting you down.

 

Let's try a different tack: post 42, wherein you post many quotes with no apparent purpose. Please explain, in three or four paragraphs, without additional quotes or side issues, what those quotes mean in relation to your theory.


Context can only be achieved when one refers to the information as a whole. Let me be as clear as possible: the fossil record does not reflect Darwinian evolutionary models. The reason I am becoming redundant is that I have been accused of having only one out-of-context post to back this up.

 

There are many very good questions to be asked; why do you keep asking the same questions over and over? Challenge me!

 

I have actually made some very controversial claims with this model, but no one seems to catch them. The gaps in our understanding of the basic fossil record are very well known, and are not even ones that need defending. What is worth discussing is possible solutions to these gaps, not whether there are gaps.

 

It appears you are defending rather than inquiring.

 

What I know about the scientific approach is that you ask many questions on many levels in order to achieve perspective. What is your function here?


Metatron I am trying to understand your underlying thesis. I don't get it. I don't understand your argument. I am not being deliberately coy to draw you out to an untenable position.

I am not defending anything, for one thing I see nothing under attack.

The reason I keep asking the same questions over and over is that so far you haven't answered any of them.

I might find it diverting to challenge you, but I can't do that if I don't know what you stand for.

My function here is to learn.

 

I am not unintelligent (IQ 145+).

I am not uneducated (B.Sc. (Hons) Geology)

I am not inexperienced (fifty six years and counting)

I am utterly confused as to your central thesis - please respond to my request in post 54: in three or four paragraphs, without additional quotes or side issues, what do those quotes [in post 42] mean in relation to your theory.


I believe you are being honest, and I apologize if I am not reading you correctly.

I may be reacting to others that are not as sincere as yourself.

I will try to clarify and condense my view.

 

The fossil record shows a disparity in the formation of complex body plans. The individual eukaryote cannot build these structures; it does not carry within itself a blueprint for an overall structure. Science today is attempting to answer these questions [via systems science] through genomic constraints. My discovery shows that the missing information in the original body design was provided by a wave function acting on a mass of oolitic spheres bound by a microbial substrate.

This substrate crystallized into an archetypal pattern, the first complex animal life [the source of a body plan pattern], which then spawned an entire phylum.

This central archetype then becomes a sustained, central information bank for the phylum.

It releases new genetic information in pulses once that information has accumulated over time.

This model accounts not only for the original forms but also for the genetic control patterns of punctuated equilibrium. This is what the fossil is saying to me in the context of the fossil record. I hope this has clarified my point.


Excellent. I now get your basic thesis. I need to return to your earlier posts and digest the details, then come back with specific comments or questions. That may take some time, as I do not wish to jump to conclusions.

In the meantime:

do you have a better image of the object?

where was it found? I'm not looking for lat/long, but formation/horizon.

Ophiolite


This photo and Photoshop rendering are all I can find at the moment. What I really need is a Photoshop animation of the dynamic.

 

It is a beautiful system, but at the moment I am the only one that can imagine it. I find this frustrating.

 

This is going to be a long, step-by-step process; I am still trying to locate the other photos. I do not have the software and time that I used to have, so be patient about further photos.

The fossil has an opening all the way through the center, just as in the Photoshop rendering.

 

This representation is what I think this fossil would have looked like when it was alive. The right intake aperture became dominant over the left, resulting in an asymmetrical growth of extruding mineralization around the left aperture.

 

This particular vesica attractor would have resulted in a conch, or gastropod design.

The dominant right intake would develop a gill, while the left developed a spiraling shell and the central axis of the [columella].

 

This would keep spiraling until the shell enclosed the left aperture completely. This left spiraling point then became what most would assume is the front, myself included.

 

If both chambers keep a symmetrical flow, which would have been very rare, the result would be a symmetrical body plan and two gills.

 

If the attractor retained the shell and a symmetrical flow through the apertures, the result would be a cephalopod. This shell is not a genetic adaptation but, more precisely, the receipt from paying the {Schrödinger entropy debt}: http://64.233.167.104/search?q=cache:FKs97eM3WIoJ:www.entropylaw.com/thermoevolution7.html+Schr%C3%B6dinger+entropy+debt%7D&hl=en

 

{The oolitic mass would shrink[dissipate] during this pulse into a higher ordered state.}

 

A fish’s body plan is the most perfect of all the possible outcomes, and it looks as though it only occurred once. All the myriad shell designs now appear to me as beautiful attempts at a fish’s body plan. Even nature’s screw-ups are geometrical marvels.

 

The fossil came from a creek bed cutting down through early Cambrian strata. These strata are made up of dolomitic limestone. The stratum this originated from developed layers of microbial mats in fine silty mud that is devoid of any particles that would induce the growth of stromatolites, so instead you just find layers of cyanobacteria. When fine quartz particles are introduced, oolites are formed.


  • 3 weeks later...

I just found this very abbreviated but really well written intro to chaos theory; this should help anyone who is having trouble understanding this post.

 

 

 

Quote

Chaos and Complexity

 

One of the themes straddling both biological and physical sciences is the quest for a mathematical model of phenomena of emergence (spontaneous creation of order), and in particular adaptation, and a physical justification of their dynamics (which seems to violate physical laws).

 

The physicist Sadi Carnot, one of the founding fathers of Thermodynamics, realized that the statistical behavior of a complex system can be predicted if its parts are all identical and their interactions weak. At the beginning of the century, another French physicist, Henri Poincaré, realizing that the behavior of a complex system can become unpredictable if it consists of few parts that interact strongly, invented "chaos" theory. A system is said to exhibit the property of chaos if a slight change in the initial conditions results in large-scale differences in the result. Later, Bernard Derrida showed that a system goes through a transition from order to chaos if the strength of the interactions among its parts is gradually increased. But then very "disordered" systems spontaneously "crystallize" into a higher degree of order.
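The "slight change in initial conditions" can be made concrete with a toy simulation (not part of the quoted essay; the logistic map and the parameter r = 4.0 are a standard textbook example of a chaotic system):

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n); r = 4.0 puts the map in its chaotic regime.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-10   # two starting points differing by one part in 10^10
gap = 0.0
for n in range(60):
    a, b = logistic(a), logistic(b)
    gap = max(gap, abs(a - b))

print(gap)   # the tiny initial difference has grown to order 1
```

Two starting points that differ by one part in ten billion end up on completely different trajectories within a few dozen iterations.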

 

First of all, the subject is "complexity", because a system must be complex enough for any property to "emerge" out of it. Complexity can be formally defined as nonlinearity.

 

The world is mostly nonlinear. The science of nonlinear dynamics was originally christened "chaos theory" because from nonlinear equations unpredictable solutions emerge.

 

A very useful abstraction to describe the evolution of a system in time is that of a "phase space". Our ordinary space has only three dimensions (width, height, depth) but in theory we can think of spaces with any number of dimensions. A useful abstraction is that of a space with six dimensions, three of which are the usual spatial dimensions. The other three are the components of velocity along those spatial dimensions. In ordinary 3-dimensional space, a "point" can only represent the position of a system. In 6-dimensional phase space, a point represents both the position and the motion of the system. The evolution of a system is represented by some sort of shape in phase space.

 

The shapes that chaotic systems produce in phase space are called "strange attractors" because the system will tend towards the kinds of state described by the points in the phase space that lie within them.
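As a rough illustration (again, not from the quoted essay), here is a minimal sketch of trajectories falling onto a strange attractor, using forward-Euler integration of the Lorenz system with its classic parameters:

```python
# Two trajectories of the Lorenz system, started from very different
# points in phase space, both fall onto the same bounded attractor region.

def lorenz_step(x, y, z, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def run(x, y, z, steps=20000):
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

p = run(1.0, 1.0, 1.0)
q = run(-8.0, 4.0, 30.0)
print(p, q)   # both end points lie inside the attractor's bounded region
```

The point of the sketch is that trajectories started far apart both end up wandering inside the same bounded region of phase space, the attractor, without ever settling into a fixed point or a repeating cycle.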

 

The program then becomes that of applying the theory of nonlinear dynamic systems to Biology.

 

Inevitably, this implies that the processes that govern human development are the same that act on the simplest organisms (and even some nonliving systems). They are processes of emergent order and complexity, of how structure arises from the interaction of many independent units. The same processes recur at every level, from morphology to behavior.

 

Darwin's vision of natural selection as a creator of order is probably not sufficient to explain all the spontaneous order exhibited by both living and dead matter. At every level of science (including the brain and life) the spontaneous emergence of order, or self-organization of complex systems, is a common theme.

 

Koestler and Salthe have shown how complexity entails hierarchical organization. Von Bertalanffy's general systems theory, Haken's synergetics, and Prigogine's non-equilibrium Thermodynamics belong to the class of mathematical disciplines that are trying to extend Physics to dynamic systems.

 

These theories have in common the fact that they deal with self-organization (how collections of parts can produce structures) and attempt at providing a unifying view of the universe at different levels of organization (from living organisms to physical systems to societies).

 

 

Holarchies

 

The Hungarian writer and philosopher Arthur Koestler first brought together a wealth of biological, physical, anthropological and philosophical notions to construct a unified theory of open hierarchical systems.

 

Language has to do with a hierarchical process of spelling out implicit ideas in explicit terms by means of rules and feedbacks. Organisms and societies also exhibit the same hierarchical structure. In these hierarchies, each intermediary entity ("holon") functions as a self-contained whole relative to its subordinates and as one of the dependent parts of its superordinates. Each holon tends to persist and assert its pattern of activity.

 

Wherever there is life, it must be hierarchically organized. Life exhibits an integrative property (that manifests itself as symbiosis) that enables the gradual construction of complex hierarchies out of simple holons. In nature there are no separated, indivisible, self-contained units. An "individual" is an oxymoron. An organism is a hierarchy of self-regulating holons (a "holarchy") that work in coordination with their environment. Holons at the higher levels of the hierarchy enjoy progressively more degrees of freedom and holons at the lower levels of the hierarchy have progressively fewer degrees of freedom. Moving up the hierarchy, we encounter more and more complex, flexible and creative patterns of activity. Moving down the hierarchy, behavior becomes more and more mechanized.

 

A hierarchical process is also involved in perception and memorization: it gradually reduces the percept to its fundamental elements. A dual hierarchical process is involved in recalling: it gradually reconstructs the percept.

 

Hierarchical processes of the same nature can be found in the development of the embryo, in the evolution of species and in consciousness itself (which should be analyzed not in the context of the mind/body dichotomy but in the context of a multi-levelled hierarchy and of degrees of consciousness).

 

They all share common themes: a tendency towards integration (a force that is inherent in the concept of hierarchic order, even if it seems to challenge the second law of Thermodynamics as it increases order), an openness at the top of the hierarchy (towards higher and higher levels of complexity) and the possibility of infinite regression.

 

 

Hierarchies from Complexity

 

Stanley Salthe, by combining the metaphysics of Justus Buchler and Michael Conrad's "statistical state model" of the evolutionary process, has developed what amounts to a theory of everything: an ontology of the world, a formal theory of hierarchies and a model of the evolution of the world.

 

The world is viewed as a determinate machine of unlimited complexity. Within complexity, discontinuities arise. The basic structure of this world must allow for complexity that is spontaneously stable and that can be broken down in things divided by boundaries. The most natural way for the world to satisfy this requirement is to employ a hierarchical structure, which is also implied by Buchler's principle of ordinality: Nature (i.e., our representation of the world) is a hierarchy of entities existing at different levels of organization. Hierarchical structure turns out to be a consequence of complexity.

 

Entities are defined by four criteria: boundaries, scale, integration, continuity. An entity has size, is limited by boundaries, and consists of an integrated system which varies continuously in time.

 

Entities at different levels interact through mutual constraints, each constraint carrying information for the level it operates upon. A process can be described by a triad of contiguous levels: the one it occurs at, its context (what the philosopher Mario Bunge calls "environment") and its causes (Bunge's "structure"). In general, a lower level provides initiating conditions for a process and an upper level provides boundary conditions. Representing a dynamic system hierarchically requires a triadic structure.

 

Aggregation occurs upon differentiation. Differentiation interpolates levels between the original two and the new entities aggregate in such a way that affects the structure of the upper levels: every time a new level emerges, the entire hierarchy must reorganize itself.

 

Salthe also recalls a view of complexity due to the physicist Howard Hunt Pattee: complexity as the result of interactions between physical and symbolic systems. A physical system is dependent on the rates at which processes occur, whereas a symbolic system is not. Symbolic systems frequently serve as constraints applied to the operation of physical systems, and frequently appear as products of the activity of physical systems (e.g., the genome in a cell). A physical system can be said to be "complex" when a part of it functions as a symbolic system (as a representation, and therefore as an observer) for another part of it.

 

These abstract principles can then be applied to organic evolution. Over time, Nature generates entities of gradually more limited scope and more precise form and behavior. This process populates the hierarchy of intermediate levels of organization as the hierarchy spontaneously reorganizes itself. The same model applies to all open systems, whether organisms or ecosystems or planets.

 

By applying principles of complex systems to biological and social phenomena, Salthe attempts to reformulate Biology on development rather than on evolution. His approach is non-Darwinian to the extent that development, and not evolution, is the fundamental process in self-organization. Evolution is merely the result of a margin of error. His theory rests on a bold fusion of hierarchy theory, Information Theory and Semiotics.

 

Salthe is looking for a grand theory of nature, which turns out to be essentially a theory of change, which turns out to be essentially a theory of emergence.

 

 

General Systems Theory

 

"General Systems Theory" was born before Cybernetics, and cybernetic systems are merely a special case of self-organizing systems; but General System Theory took longer to establish itself. It was conceived in the 1930s by the Austrian biologist Ludwig Von Bertalanffy. His ambition was to create a "universal science of organization". His legacy is to have started "system thinking", thinking about systems as systems and not as mere aggregates of parts.

 

The classical approach to the scientific description of a system's behavior (whether in Physics or in Economics) can be summarized as the search for "isolable causal trains" and the reduction to atomic units. This approach is feasible under two conditions: 1. that the interaction among the parts of the system be negligible and 2. that the behavior of the parts be linear. Von Bertalanffy's "systems", on the other hand, are those entities (or "organized complexities") that consist of interacting parts, usually described by a set of nonlinear differential equations. Systems Theory studies principles which apply to all systems, properties that apply to any entity qua system.

 

Basic concepts of Systems Theory are, for example, the following: every whole is based upon the competition among its parts; individuality is the result of a never-ending process of progressive centralization whereby certain parts gain a dominant role over the others.

 

General Systems Theory looks for laws that can be applied to a variety of fields (i.e., for an isomorphism of law in different fields), particularly in the biological, social and economic sciences (but even in history and politics).

 

General Systems Theory mainly studies "wholes", which are characterized by such holistic properties as hierarchy, stability, teleology.

 

"Open Systems Theory" is a subset of General Systems Theory. Because of the second law of Thermodynamics, a change in entropy in closed systems is always positive: order is continually destroyed. In open systems, on the other hand, entropy production due to irreversible processes is balanced by import of negative entropy (as in all living organisms). If an organism is viewed as an open system in a steady state, a theory of organismic processes can be worked out.

 

Furthermore, a living organism can be viewed as a hierarchical order of open systems, where each level maintains its structure thanks to continuous change of components at the next lower level. Living organisms maintain themselves in spite of continuous irreversible processes and even proceed towards higher and higher degrees of order.

 

Ervin Laszlo's take on a "theory of natural systems" (i.e., a theory of the invariants of organized complexity) is centered around the concept of "ordered whole", whose structure is defined by a set of constraints. Laszlo adopts a variant of Ashby's principle of self-organization, according to which any isolated natural system subject to constant forces is inevitably inhabited by "organisms" that tend towards stationary or quasi-stationary non-equilibrium states. In Laszlo's view, the combination of internal constraints and external forces yields adaptive self-organization. Natural systems evolve towards increasingly adapted states, corresponding to increasing complexity (or negative entropy).

 

Natural systems sharing an environment tend to organize in hierarchies. The set of such systems tends to become itself a system, its subsystems providing the constraints for the new system.

 

Laszlo offered rigorous foundations to deal with the emergence of order at the atomic ("micro-cybernetics"), organismic ("bio-cybernetics") and social levels ("socio-cybernetics").

 

A systemic view also permits a formal analysis of a particular class of natural systems: cognitive systems. The mind, just like any other natural system, exhibits an holistic character, adaptive self-organization, and hierarchies, and can be studied with the same tools ("psycho-cybernetics").

 

 

Synergetics

 

"Synergetics", as developed in Germany by the physicist Hermann Haken, is a theory of pattern formation in complex systems. It tries to explain structures that develop spontaneously in nature.

 

Synergetics studies cooperative processes of the parts of a system far from equilibrium that lead to an ordered structure and behavior for the system.

 

Haken's favorite example was the laser: how do the atoms of the laser agree to produce a single coherent wave flow? The answer is that the laser is a self-organizing system far from equilibrium (what Prigogine would call a dissipative structure).

 

A "synergetic" process in a physical system is one in which, when energy is pumped into the system, some macroscopic structure emerges from the disorderly behavior of the large number of microscopic particles that make up the physical system. As energy is pumped into the system, initially nothing seems to happen, other than additional excitation of the particles, but then the system reaches a threshold beyond which structure suddenly emerges. The laser is such a synergetic process: a beam of coherent light is created out of the chaotic movement of particles. What happens is that energy pushes the system of particles beyond a threshold, and suddenly the particles start behaving harmoniously.
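The threshold behavior described above can be sketched with a one-variable toy model in the spirit of Haken's order-parameter equation for the laser (the equation dE/dt = (pump - loss)E - E³ and the parameter values are illustrative assumptions, not taken from the quoted text):

```python
# Below threshold (pump < loss) a seed fluctuation in the field amplitude E
# decays to zero; above threshold a macroscopic coherent amplitude appears.

def steady_amplitude(pump, loss=1.0, dt=0.01, steps=10000):
    E = 0.01                                    # small seed fluctuation
    for _ in range(steps):
        E += dt * ((pump - loss) * E - E**3)    # toy order-parameter equation
    return E

below = steady_amplitude(pump=0.5)   # below threshold -> decays to ~0
above = steady_amplitude(pump=2.0)   # above threshold -> settles near 1.0
print(below, above)
```

Below the threshold the seed fluctuation dies out; above it, a macroscopic coherent amplitude appears suddenly, which is the qualitative picture of a non-equilibrium phase transition.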

 

Since order emerges out of chaos, and chaos is not well defined, synergetics employs probabilities (to describe uncertainty) and information (to describe approximation).

 

Entropy becomes a central concept, relating Physics to Information Theory.

 

Synergetics revolves around a number of technical concepts: compression of the degrees of freedom of a complex system into dynamic patterns that can be expressed as a collective variable; behavioral attractors of changing stabilities; and the appearance of new forms as non-equilibrium phase transitions.

 

Synergetics applies to systems driven far from equilibrium, where the classic concepts of Thermodynamics are no longer adequate. It expresses the fact that order can arise from chaos and can be maintained by flows of energy/matter.

 

Systems at instability points (at the "threshold") are driven by a "slaving principle": long-lasting quantities (the macroscopic pattern) can enslave short-lasting quantities (the chaotic particles), and they can force order on them (thereby becoming "order parameters").

 

The system exhibits a stable "mode", which is the chaotic motion of its particles, and an unstable "mode", which is its macroscopic structure and behavior of the whole system. Close to instability, stable modes are "enslaved" by unstable modes and can be ignored. Instead of having to deal with millions of chaotic particles, one can focus on the macroscopic quantities. De facto, the degrees of freedom of the system are reduced.
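A minimal two-variable sketch of this enslaving (an illustrative toy model, not Haken's actual laser equations) shows a strongly damped fast mode collapsing onto the value dictated by the slow order parameter:

```python
# Adiabatic elimination: the fast mode q is "enslaved" by the slow order
# parameter x, so q(t) tracks the quasi-static value A*x/GAMMA.

GAMMA = 50.0   # strong damping of the fast (enslaved) mode
A = 2.0        # coupling of the fast mode to the slow one

def simulate(steps=20000, dt=0.001):
    x, q = 1.0, 0.0
    for _ in range(steps):
        dx = -0.1 * x                # slow decay of the order parameter
        dq = -GAMMA * q + A * x      # fast mode driven by the slow one
        x += dt * dx
        q += dt * dq
    return x, q

x, q = simulate()
print(q, A * x / GAMMA)   # q has collapsed onto the slaved value
```

After a short transient, q is fully determined by x, so only the slow macroscopic variable needs to be tracked: the degrees of freedom have effectively been reduced from two to one.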

 

Haken shows how one can write the dynamic equations for the system, and how such mathematical equations reflect the interplay between stochastic forces ("chance") and deterministic forces ("necessity").

 

 

Hypercycles

 

The German chemist Manfred Eigen was awarded the Nobel Prize in 1967 for discovering that very short pulses of energy could trigger extremely fast chemical reactions. In the following years, he started looking for how very fast reactions could be used to create and sustain life.

 

Indirectly, he ended up studying the behavior of biochemical systems far from equilibrium.

 

Eventually, Eigen came up with the concept of a "hypercycle". A hypercycle is a cyclic reaction network, i.e. a cycle of cycles of cycles (of chemical reactions). Then he proved that life can be viewed as the product of a hierarchy of such hypercycles.

 

A catalyst is a substance that favors a chemical reaction. When enough energy is provided, some catalytic reactions tend to combine to form networks, and such networks may contain closed loops, called catalytic cycles.

 

If even more energy is pumped in, the system moves even farther from equilibrium, and then catalytic cycles tend to combine to form closed loops of a higher level, or hypercycles, in which the enzymes produced by a cycle act as catalysts for the next cycle in the loop. Each link of the loop is now a catalytic cycle itself.

 

Eigen showed that hypercycles are capable of self-replication, which may therefore have been a property of nature even before the invention of living organisms.

 

Hypercycles are capable of evolution through more and more complex stages. Hypercycles compete for natural resources and are therefore subject to natural selection.

 

The hypercycle falls short of being a living system because it defines no "boundary": the boundary is the container where the chemical reaction is occurring. A living system, on the other hand, has a boundary that is part of the living system (e.g., the skin).

 

Catalysis is the phenomenon by which a chemical reaction is sped up: without catalysis, all processes that give rise to life would take a lot longer, and probably would not be fast enough for life to happen. Then Eigen shows that they can be organized into an autocatalytic cycle, i.e. a cycle that is capable of self-reproducing: this is the fundamental requirement of life. A set of autocatalytic cycles gets, in turn, organized into a catalytic hypercycle. This catalytic hypercycle represents the basic form of life.

 

Formally: "hypercycles" are a class of nonlinear reaction networks. They can originate spontaneously within the population of a species through natural selection and then evolve to higher complexity by allowing for the coherent evolution of a set of functionally coupled self-replicating entities. A hypercycle is based on nonlinear autocatalysis, which is a chain of reproduction cycles which are linked by cyclic catalysis, i.e. by another autocatalysis. A hypercycle is a cycle of cycles of cycles. A hypercycle can be viewed as the next higher level in the hierarchy of autocatalytic systems.
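A minimal numerical sketch of an elementary hypercycle (my own illustration, not from the quoted text) uses the replicator form dx_i/dt = x_i(x_{i-1} − φ), where each member is catalyzed by its predecessor in the cycle and the flux φ holds the total concentration constant:

```python
# Three-member hypercycle: species i grows in proportion to the
# concentration of its predecessor (cyclic catalysis); the mean flux phi
# keeps the total concentration fixed at 1 (replicator dynamics).

N = 3

def step(x, dt=0.01):
    growth = [x[i] * x[(i - 1) % N] for i in range(N)]   # cyclic catalysis
    phi = sum(growth)                                    # dilution flux
    return [xi + dt * (g - xi * phi) for xi, g in zip(x, growth)]

x = [0.6, 0.3, 0.1]
for _ in range(100000):
    x = step(x)
print(x)   # all three members coexist, each near 1/3
```

With three members (and, in general, up to four) the cycle settles into stable coexistence of all its members, which is the cooperative behavior that distinguishes a hypercycle from plain competitive selection.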

 

Systems can be classified in four groups according to their stability with respect to fluctuations: stable systems (the fluctuations are self-regulating), indifferent systems (the fluctuations have no effect), unstable systems (self-amplification of the fluctuations) and variable systems (which can be in any of the previous states). Only the last type is suitable for generation of biological information because it can play all best tactics: indifference towards a broad mutant spectrum, stability towards selective advantages and instability towards unfavorable configurations. In other words, it can take the most efficient stance in the face of both favorable and adverse situations.

 

Eigen’s model explains the simultaneous unity (due to the use of a universal genetic code) and diversity (due to the "trial and error" approach of natural selection) in evolution. This dual process started even before life was created. Evolution of species was preceded by an analogous stepwise process of molecular evolution.

 

Whatever the mathematics, the bottom line is that natural selection itself turns out to be inevitable: given a set of self-reproducing entities that feed on a common and limited source of energetic/material supply, natural selection will spontaneously appear. Natural selection is a mathematical consequence of the dynamics of self-reproducing systems of this kind.
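This "mathematical consequence" can be seen in a few lines (a hedged sketch with made-up fitness values, not Eigen's own formalism): the standard replicator equation dx_i/dt = x_i(f_i − φ), where φ is the average fitness, encodes nothing but self-reproduction under a shared, limited supply, yet selection falls out automatically:

```python
# Selection among self-replicators competing for a common, limited supply:
# dx_i/dt = x_i * (f_i - phi), with phi the population-average fitness.
# The fitness values are made up for illustration.

fitness = [1.0, 1.5, 2.0]

def step(x, dt=0.01):
    phi = sum(f * xi for f, xi in zip(fitness, x))        # average fitness
    return [xi + dt * xi * (f - phi) for xi, f in zip(x, fitness)]

x = [1.0 / 3.0, 1.0 / 3.0, 1.0 / 3.0]
for _ in range(10000):
    x = step(x)
print(x)   # the fastest replicator (f = 2.0) takes over
```

No selection rule is imposed from outside; the constraint that total concentration is fixed is enough to make the fitter replicators displace the others.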

 

 

Dissipative Systems

 

By far, though, the most influential school of thought has been the one related to Ilya Prigogine's non-equilibrium Thermodynamics, which redefined the way scientists approach natural phenomena and brought self-organizing processes to the forefront of the study of complex systems. His theory found a stunning number and variety of fields of application, from Chemistry to Sociology. In his framework, the most difficult problems of Biology, from morphogenesis to evolution, found a natural model.

 

Classical Physics describes the world as a static and reversible system that undergoes no evolution, whose information is constant in time. Classical Physics is the science of being. Thermodynamics, instead, describes an evolving world in which irreversible processes occur. Thermodynamics is the science of becoming.

 

The second law of Thermodynamics, in particular, describes the world as evolving from order to disorder, while biological evolution is about the complex emerging from the simple (i.e. order arising from disorder). While apparently contradictory, these two views show that irreversible processes are an essential part of the universe.

 

Furthermore, conditions far from equilibrium foster phenomena such as life that classical Physics does not cover at all.

 

Irreversible processes and non-equilibrium states turn out to be fundamental features of the real world.

 

Prigogine distinguishes between "conservative" systems (which are governed by the three conservation laws for energy, translational momentum and angular momentum, and which give rise to reversible processes) and "dissipative" systems (subject to fluxes of energy and/or matter). The latter give rise to irreversible processes.

 

The theme of science is order. Order can come either from equilibrium systems or from non-equilibrium systems that are sustained by a constant source (or, dually, by a persistent dissipation) of matter/energy. In the latter systems, order is generated by the flux of matter/energy. All living organisms (as well as systems such as the biosphere) are non-equilibrium systems.

 

Prigogine proved that, under special circumstances, the distance from equilibrium and the nonlinearity of a system drive the system to ordered configurations, i.e. create order. The science of being and the science of becoming describe dual aspects of Nature.

 

What is needed is a combination of factors that are exactly the ones found in living matter: a system made of a large collection of independent units which are interacting with each other, a flow of energy through the system that drives the system away from equilibrium, and nonlinearity. Nonlinearity expresses the fact that a perturbation of the system may reverberate and have disproportionate effects.

 

Non-equilibrium and nonlinearity favor the spontaneous development of self-organizing systems, which maintain their internal organization, regardless of the general increase in entropy, by expelling matter and energy into the environment.

 

When such a system is driven away from equilibrium, local fluctuations appear. This means that in places the system gets very unstable. Localized tendencies to deviate from equilibrium are amplified. When a threshold of instability is reached, one of these runaway fluctuations is so amplified that it takes over as a macroscopic pattern. Order appears from disorder through what are initially small fluctuations within the system. Most fluctuations die along the way, but some survive the instability and carry the system beyond the threshold: those fluctuations "create" new form for the system. Fluctuations become sources of innovation and diversification.

 

The potentialities of nonlinearity are dormant at equilibrium but are revealed by non-equilibrium: multiple solutions appear and therefore diversification of behavior becomes possible.

 

Technically speaking, nonlinear systems driven away from equilibrium can generate instabilities that lead to bifurcations (and symmetry breaking beyond bifurcation). When the system reaches the bifurcation point, it is impossible to determine which path it will take next. Chance rules. Once the path is chosen, determinism resumes.
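The role of chance at a bifurcation point can be sketched numerically (a minimal illustration, not part of Prigogine's formalism): in the toy system dx/dt = &mu;x - x&sup3;, the single equilibrium at 0 gives way to two symmetric equilibria as &mu; crosses 0, and an arbitrarily small fluctuation decides which branch the system takes.

```python
import random

def settle(mu, x0, dt=0.01, steps=20000):
    """Euler-integrate the toy system dx/dt = mu*x - x**3 and return
    the state it settles into."""
    x = x0
    for _ in range(steps):
        x += dt * (mu * x - x**3)
    return x

# Below the bifurcation point (mu < 0) every trajectory decays to the
# single equilibrium at 0.
print(settle(-1.0, 0.5))

# Beyond it (mu > 0) two symmetric equilibria appear at +/-sqrt(mu); an
# arbitrarily small random fluctuation decides which branch is taken.
eps = random.choice([1e-9, -1e-9])
print(settle(1.0, eps))   # close to +1.0 or -1.0, depending on eps
```

Once the branch is chosen, the trajectory is fully deterministic again, mirroring the "chance rules, then determinism resumes" description above.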

 

The multiplicity of solutions in nonlinear systems can even be interpreted as a process of gradual "emancipation" from the environment.

 

Most of Nature is made of such "dissipative" systems, of systems subject to fluxes of energy and/or matter. Dissipative systems conserve their identity thanks to the interaction with the external world. In dissipative structures, non-equilibrium becomes a source of order.

 

These considerations apply very much to living organisms, which are prime examples of dissipative structures in non-equilibrium. Prigogine's theory explains how life can exist and evolution work towards higher and higher forms of life. A "minimum entropy principle" characterizes living organisms: stable near-equilibrium dissipative systems minimize their rate of entropy production.

 

From non-equilibrium Thermodynamics a wealth of concepts has originated: invariant manifolds, attractors, fractals, stability, bifurcation analysis, normal forms, chaos, Lyapunov exponents, entropies. Catastrophe and chaos theories turn out to be merely special cases of nonlinear non-equilibrium systems.

 

In conclusion, self-organization is the spontaneous emergence of ordered structure and behavior in open systems that are in a state far from equilibrium, described mathematically by nonlinear equations.

 

 

Catastrophe Theory

 

Rene' Thom's catastrophe theory, originally formulated in 1967 and popularized ten years later by the work of the British mathematician Christopher Zeeman, became a widely used tool for classifying the solutions of nonlinear systems in the neighborhood of stability breakdown.

 

In the beginning, Thom, a French mathematician, was interested in structural stability in topology (stability of topological form) and was convinced of the possibility of finding general laws of form evolution regardless of the underlying substance of form, as already stated at the beginning of the century by D'Arcy Thompson.

 

Thom's goal was to explain the "succession of form". Our universe presents us with forms (that we can perceive and name). A form is defined, first and foremost, by its stability: a form lasts in space and time. Forms change. The history of the universe, insofar as we are concerned, is a ceaseless creation, destruction and transformation of form. Life itself is, ultimately, creation, growth and decaying of form.

 

Every physical form is represented by a mathematical quantity called "attractor" in a space of internal variables. If the attractor satisfies the mathematical property of being "structurally stable", then the physical form is the stable form of an object. Changes in form, or morphogenesis, are due to the capture of the attractors of the old form by the attractors of the new form. All morphogenesis is due to the conflict between attractors. What catastrophe theory does is to "geometrize" the concept of "conflict".

 

The universe of objects can be divided into domains of different attractors. Such domains are separated by shock waves. Shock wave surfaces are singularities called "catastrophes". A catastrophe is a state beyond which the system is destroyed in an irreversible manner. Technically speaking, the "ensembles de catastrophes" are hypersurfaces that divide the parameter space into regions of completely different dynamics.

 

The bottom line is that dynamics and form become dual properties of nonlinear systems.

 

This is a purely geometric theory of morphogenesis: its laws are independent of the substance, structure and internal forces of the system.

 

Thom proves that in a 4-dimensional space there exist only 7 types of elementary catastrophes. Elementary catastrophes include: "fold", destruction of an attractor which is captured by a lesser potential; "cusp", bifurcation of an attractor into two attractors; etc. From these singularities, more and more complex catastrophes unfold, until the final catastrophe. Elementary catastrophes are "local accidents". The form of an object is due to the accumulation of many of these "accidents".
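The "cusp" can be made concrete with a toy computation (an illustrative sketch, not Thom's construction): for the potential V(x) = x&#8308;/4 + ax&sup2;/2 + bx, the equilibria are the real roots of V'(x) = x&sup3; + ax + b, and their number jumps from one to three as the control parameters (a, b) cross the cusp curve 4a&sup3; + 27b&sup2; = 0.

```python
def equilibria(a, b, lo=-10.0, hi=10.0, n=100000):
    """Find the real equilibria of V(x) = x**4/4 + a*x**2/2 + b*x,
    i.e. the real roots of V'(x) = x**3 + a*x + b, by scanning a fine
    grid for sign changes of V'."""
    f = lambda x: x**3 + a*x + b
    step = (hi - lo) / n
    roots = []
    for i in range(n):
        x0, x1 = lo + i*step, lo + (i+1)*step
        if f(x0) == 0.0 or f(x0) * f(x1) < 0.0:
            roots.append(0.5 * (x0 + x1))
    return roots

# Inside the cusp region there are two competing attractors separated by
# an unstable equilibrium; outside it, a single attractor remains.
print(len(equilibria(-3.0, 0.0)))   # 3 equilibria (roots 0 and +/-sqrt(3))
print(len(equilibria(3.0, 0.0)))    # 1 equilibrium
```

Crossing the boundary between the two regions is the "fold": one of the attractors is annihilated and the state jumps discontinuously to the other.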

 

 

The Origin of Regularity

 

Prigogine's "bifurcation theory" is a descendant of the theory of stability initiated by the Russian mathematician Aleksandr Lyapunov. Rene' Thom's catastrophe theory is a particular case of bifurcation theory, so they all belong to the same family. They all elaborate on the same theorem, Lyapunov's theorem: for isolated systems, thermodynamic equilibrium is an attractor of nonequilibrium states.

 

Then the story unfolds, leading to dissipative systems and eventually to the reversing of Thermodynamics' fundamental assumption, the destruction of structure. Order emerges from the very premises that seem to deny it.

 

Jack Cohen and Ian Stewart are among those who study how the regularities of nature (from Cosmology to Quantum Theory, from Biology to Cognitive Psychology) emerge from the underlying chaos and complexity of nature: "emergent simplicities collapse chaos". They proved that external constraints are fundamental in shaping biological systems (DNA does not uniquely determine an organism) and defined new concepts: "simplexity" (the tendency of simple rules to emerge from underlying disorder and complexity) and "complicity" (the tendency of interacting systems to coevolve leading to a growth of complexity). Simplexity is a "weak" form of emergence, and is ubiquitous. Complicity is a stronger form of emergence, and is responsible for consciousness and evolution.

 

Emergence is the rule, not the exception, and it is shaped by simplexity and complicity.

 

 

Emergent Computation

 

Emergent computation is to standard computation what nonlinear systems are to linear systems: it deals with systems whose parts interact in a nontrivial way. Both Turing and Von Neumann, the two mathematicians who inspired the creation of the computer, were precursors in emergent computation: Turing formulated a theory of self-catalytic systems and Von Neumann studied self-replicating automata.

 

Alan Turing (in the 1950's) advanced the reaction-diffusion theory of pattern formation, based on the bifurcation properties of the solutions of differential equations.

 

Turing devised a model to generate stable patterns:

 

 

X catalyzes itself: X diffuses slowly

 

 

X catalyzes Y: Y diffuses quickly

 

 

Y inhibits X

 

 

Y may or may not catalyze or inhibit itself

 

Some reactions might be able to create ordered spatial patterns from disordered ones. The function of genes is purely catalytic: they catalyze the production of new morphogens, which will catalyze more morphogens until eventually form emerges.
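The core of Turing's mechanism, a self-amplifying but slowly diffusing activator coupled to a fast-diffusing inhibitor, can be checked with a short linear-stability computation (an illustrative sketch with made-up coefficients, not Turing's original system): a steady state that is stable without diffusion becomes unstable at some spatial wavelength once the inhibitor diffuses much faster than the activator.

```python
def turing_unstable(J, Dx, Dy, kmax=10.0, n=1000):
    """Decide whether diffusion destabilizes a stable homogeneous steady
    state. J is the 2x2 Jacobian [[fx, fy], [gx, gy]] of the reaction
    terms at the steady state; Dx, Dy are the diffusion coefficients of
    the activator X and the inhibitor Y."""
    (fx, fy), (gx, gy) = J
    tr, det = fx + gy, fx * gy - fy * gx
    assert tr < 0 and det > 0, "steady state must be stable without diffusion"
    # A spatial mode of wavenumber k grows iff det(J - k^2*diag(Dx,Dy)) < 0.
    for i in range(1, n + 1):
        k2 = (kmax * i / n) ** 2
        if (fx - Dx * k2) * (gy - Dy * k2) - fy * gx < 0:
            return True          # some spatial wavelength is amplified
    return False

# Self-catalyzing X, inhibiting Y (illustrative numbers):
J = [[1.0, -1.0], [2.0, -1.5]]
print(turing_unstable(J, Dx=0.05, Dy=1.0))  # True: slow activator, fast inhibitor
print(turing_unstable(J, Dx=1.0, Dy=1.0))   # False: equal diffusion, no pattern
```

The amplified wavelength sets the spacing of the resulting pattern (spots, stripes), which is why the same chemistry can produce different forms at different scales.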

 

Von Neumann saw life as a particular class of automata (of programmable machines). Life's main property is the ability to reproduce. Von Neumann proved that a machine can be programmed to make a copy of itself.

 

Von Neumann's automaton was conceived to absorb matter from the environment and process it to build another automaton, including a description of itself. Von Neumann realized (years before the genetic code was discovered) that the machine needed a description of itself in order to reproduce. The description itself would be copied to make a new machine, so that the new machine too could copy itself.

 

In Von Neumann's simulated world, a large checkerboard was a simplified version of the world, in which both space and time were discrete. Time, in particular, was made to advance in discrete steps, which meant that change could occur only at each step, and simultaneously for everything that had to change.

 

Von Neumann's studies of the 1940s led to an entire new field of Mathematics, called "cellular automata". Technically speaking, cellular automata are discrete dynamical systems whose behavior is completely specified in terms of a local relation. In practice, cellular automata are the computer scientist's equivalent of the physicist's concept of field. Space is represented by a uniform grid and time advances in discrete steps. Each cell of space contains bits of information. Laws of nature express what operation must be performed on each cell's bits of information, based on its neighbor's bits of information. Laws of nature are local and uniform. The amazing thing is that such simple "organisms" can give rise to very complex structures, and those structures recur periodically, which means that they achieve some kind of stability.
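A minimal cellular automaton in this sense (a standard textbook example, not Von Neumann's own 29-state construction) fits in a few lines: space is a ring of cells, the "law of nature" is a lookup table over each cell's neighborhood, and all cells update simultaneously at each discrete time step.

```python
def step(cells, rule=110):
    """One step of an elementary cellular automaton: each cell's next
    state depends only on itself and its two neighbors (a local, uniform
    law), applied to every cell simultaneously."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 40 + [1] + [0] * 40      # a single "on" cell
for _ in range(20):
    print(''.join('#' if c else '.' for c in row))
    row = step(row)
```

Rule 110 is a well-known case in which this trivial local law produces persistent, interacting structures rather than uniform order or pure noise.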

 

Von Neumann's idea of the dual genetics of self-reproducing automata (that the genetic code must act both as instructions on how to build an organism and as data to be passed on to the offspring) was basically the idea behind what would later be called DNA: DNA encodes the instructions for making all the enzymes and proteins that a cell needs to function, and DNA makes a copy of itself every time the cell divides in two. Von Neumann indirectly understood other properties of life: the ability to increase its complexity (an organism can generate organisms that are more complex than itself) and the ability to self-organize.

 

When a machine (e.g., an assembly line) builds another machine (e.g., an appliance), there occurs a degradation of complexity, whereas the offspring of living organisms are at least as complex as their parents, and their complexity increases over evolutionary time. A self-reproducing machine would be a machine that produces another machine of equal or higher complexity.

 

By representing an organism as a group of contiguous multi-state cells (either empty or containing a component) in a 2-dimensional matrix, Von Neumann proved that a Turing-type machine that can reproduce itself could be simulated using 29-state cells.

 

John Conway is the inventor of the game of "Life", staged in Von Neumann's checkerboard world (in which the state of a square changes depending on the adjacent squares). Conway proved that, given enough resources and time, self-reproducing patterns will occur.
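A sketch of Conway's rules (the standard birth-on-3, survive-on-2-or-3 rules, here on an unbounded grid) shows the kind of stable, self-propagating pattern that emerges: the "glider" reappears, translated diagonally by one cell, every four steps.

```python
from collections import Counter

def life_step(alive):
    """One generation of Conway's Life: `alive` is a set of (x, y) cells.
    A live cell survives with 2 or 3 live neighbors; an empty cell with
    exactly 3 live neighbors is born."""
    counts = Counter((x + dx, y + dy) for (x, y) in alive
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in alive)}

# The glider: after 4 steps it reappears translated by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = life_step(g)
print(g == {(x + 1, y + 1) for (x, y) in glider})   # True
```

The glider is exactly the kind of recurring macroscopic structure mentioned above: a stable pattern that persists and moves even though every individual cell obeys only the local rule.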

 

Turing proved that there exists a universal computing machine. Von Neumann proved that there exists a universal computing machine which, given a description of an automaton, will construct a copy of it, and, by extension, that there exists a universal computing machine which, given a description of a universal computing machine, will construct a copy of it, and, by extension, that there exists a universal computing machine which, given a description of itself, will construct a copy of itself.
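Software offers a compact analogue of this self-description argument: the "quine" (a standard programming curiosity, not Von Neumann's construction), a program whose text serves both as instructions to execute and as data to copy, so that running it prints its own source.

```python
# The string s plays both roles: it is *data* (copied into the output
# via %r) and *instructions* (it is the body of the program itself).
s = 's = %r\nprint(s %% s)'
print(s % s)   # prints the program's own two-line source code
```

Just as in Von Neumann's automaton, the trick is that the description is used twice: once interpreted (to build the copy) and once copied verbatim (so the copy can reproduce in turn).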

 

The two most futuristic topics addressed by Cybernetics were self-reproducing machines and self-organizing systems. They are pervasive in nature, and modern technologies make it possible to dream of building them artificially as well. Still, they remained merely speculative. The step that made emergent computation matter to the real world came from the computational application of the two pillars of the synthetic theory of evolution, namely the genetic code and adaptation.

 

 

Genetic Algorithms

 

The momentum for the computational study of genetic algorithms and adaptive systems was created in large part by the work of the American computer scientist John Holland. In the 1970s, Holland had the intuition that the best way to solve a problem is to mimic what biological organisms do to solve their own problem of survival: to evolve (through natural selection) and to reproduce (through genetic recombination). Genetic algorithms apply recursively a series of biologically-inspired operators to a population of potential solutions of a given problem. Each application of the operators generates a new population of solutions which should better and better approximate the best solution. What evolves is not the single individual but the population as a whole.

 

Genetic algorithms are actually a further refinement of search methods within problem spaces. Genetic algorithms improve the search by incorporating the criterion of "competition".

 

Recalling Newell and Simon's definition of problem solving as "searching in a problem space", David Goldberg defines genetic algorithms as "search algorithms based on the mechanics of natural selection and natural genetics". Unlike most optimization methods, that work from a single point in the decision space and employ a transition method to determine the next point, genetic algorithms work from an entire "population" of points simultaneously, trying many directions in parallel and employing a combination of several genetically-inspired methods to determine the next population of points.

 

One can employ simple algorithms such as "reproduction" (that copies chromosomes according to a fitness function), "crossover" (that switches segments of two chromosomes) and "mutation", as well as more complex algorithms such as "dominance" (a genotype-to-phenotype mapping), "diploidy" (pairs of chromosomes), "abeyance" (shielded against overselection), "inversion" (the primary natural mechanism for recoding a problem, by switching two points of a chromosome); and so forth.

 

Holland's classifier system (which learns new rules to optimize its performance) was the first practical application of genetic algorithms. A classifier system is a machine learning system that learns syntactic rules (or "classifiers") to guide its performance in the environment. A classifier system consists of three main components: a production system, a credit system (such as the "bucket brigade") and a genetic algorithm to generate new rules. Its emphasis on competition and cooperation, on feedback and reinforcement, rather than on pre-programmed rules, sets it apart from knowledge-based models of Artificial Intelligence.

 

A measure function computes how "fit" an individual is. The selection process starts from a random population of individuals. For each individual of the population, the fitness function provides a numeric value for how far the solution is from the ideal solution. The probability of selection for that individual is made proportional to its "fitness". On the basis of such fitness values a subset of the population is selected. This subset is allowed to reproduce itself through the biologically-inspired operators of crossover, mutation and inversion.

 

Each individual (each point in the space of solutions) is represented as a string of symbols. Each genetic operator performs an operation on the sequence or content of the symbols.
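The loop described above, fitness-proportional reproduction plus crossover and mutation over a population of symbol strings, can be sketched on a toy problem (maximizing the number of 1s in a bit string; the population size, rates and seed are arbitrary choices, not Holland's):

```python
import random
random.seed(0)

def evolve(pop_size=30, length=20, generations=100, p_mut=0.02):
    """A minimal genetic algorithm: fitness-proportional selection
    ("reproduction"), one-point crossover, and point mutation, applied
    to bit-string "chromosomes". Fitness is the number of 1s."""
    fitness = lambda ind: sum(ind)
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        weights = [fitness(ind) + 1e-9 for ind in pop]
        new_pop = []
        while len(new_pop) < pop_size:
            # reproduction: parents chosen with probability ~ fitness
            a, b = random.choices(pop, weights=weights, k=2)
            # crossover: switch segments of the two chromosomes
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]
            # mutation: flip each bit with small probability
            child = [bit ^ (random.random() < p_mut) for bit in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = evolve()
print(sum(best), "ones out of 20")
```

Note that no individual is ever "improved" directly: only the population-level statistics change, which is exactly the sense in which the population, not the individual, evolves.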

 

When a message from the environment matches the antecedent of a rule, the message specified in the consequent of the rule is produced. Some messages produced by the rules cycle back into the classifier system, some generate action on the environment. A message is a string of characters from a specified alphabet. The rules are not written in the first-order predicate logic of expert systems, but in a language that lacks descriptive power and is limited to simple conjunctive expressions.

 

Credit assignment is the process whereby the system evaluates the effectiveness of its rules. The "bucket brigade" algorithm assigns a strength (a measure of its past usefulness) to each rule. Each rule then makes a bid (proportional to its strength and to its relevance to the current situation) and only the highest-bidding rules are allowed to pass their messages on. The strengths of the rules are modified according to an economic analogy: every time a rule bids, its strength is reduced by the value of the bid, while the strengths of its "suppliers" (the rules that sent the messages matched by this bidder) are increased. The bidder's strength will in turn increase if its consumers (the rules that receive its messages) become bidders. This leads to a chain of suppliers/consumers whose success ultimately depends on the success of the rules that act directly on the environment.
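The flow of credit can be sketched for a fixed chain of three rules (a toy rendering of the economic analogy, not Holland's full algorithm; the rule names, bid ratio and payoff are arbitrary):

```python
def bucket_brigade(chain, payoff, rounds=50, bid_ratio=0.1):
    """Credit flow along a fixed chain of rules, where rule i's message
    triggers rule i+1 and the last rule earns an external payoff. Each
    activation, a rule pays a bid (a fraction of its strength) to its
    supplier, and later collects bids from its consumer."""
    strength = {r: 10.0 for r in chain}
    for _ in range(rounds):
        for i, rule in enumerate(chain):
            bid = bid_ratio * strength[rule]
            strength[rule] -= bid                # the bidder pays...
            if i > 0:
                strength[chain[i - 1]] += bid    # ...its supplier collects
        strength[chain[-1]] += payoff   # environment rewards the last rule
    return strength

s = bucket_brigade(["detect", "plan", "act"], payoff=2.0)
print(s)   # reward earned by "act" has propagated back along the chain
```

After enough rounds, even "detect", which never touches the environment, has gained strength: the external payoff has trickled backwards through the supplier/consumer chain, which is the point of the scheme.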

 

Then the system replaces the least useful (weak) rules with newly generated rules that are based on the system's accumulated experience, i.e. by combining selected "building blocks" ("strong" rules) according to some genetic algorithms.

 

Holland then went on to focus on "complex adaptive systems". Such systems are governed by principles of anticipation and feedback. Based on a model of the world, an adaptive system anticipates what is going to happen. Models are improved based on feedback from the environment.

 

Complex adaptive systems are ubiquitous in nature. They include brains, ecosystems and even economies. They share a number of features: each of these systems is a network of agents acting in parallel and interacting; the behavior of the system arises from cooperation and competition among its agents; each of these systems has many levels of organization, with agents at each level serving as building blocks for agents at a higher level; such systems are capable of rearranging their structure based on their experience; they are capable of anticipating the future by means of innate models of the world; new opportunities for new types of agents are continuously being created within the system.

 

All complex adaptive systems share four properties (aggregation, nonlinearity, flows, diversity) and three mechanisms (categorization by tagging, anticipation through internal models, decomposition into building blocks).

 

Each adaptive agent can be represented by a framework consisting of a performance system (to describe the system's skills), a credit-assignment algorithm (to reward the fittest rules) and a rule-discovery algorithm (to generate plausible hypotheses).

 

 

The Edge of Chaos

 

A new theoretical breakthrough occurred when Chris Langton demonstrated that physical systems achieve the prerequisites for the emergence of computation (i.e., transmission, storage, modification) in the vicinity of a phase transition ("at the edge of chaos"). Specifically, information becomes an important factor in the dynamics of cellular automata in the vicinity of the phase transition between periodic and chaotic behavior, i.e. between order and chaos.

 

The idea is that systems undergo transformations, and while they transform they constantly move from order to chaos and back. This transition is similar to the "phase transitions" undergone by a substance when it turns solid, liquid or gaseous. When ice turns into water, the atoms have not changed, but the system as a whole has undergone a phase transition. Microscopically, this means that the atoms are behaving in a different way. The transition of a system from chaos to order and back is similar in that the system is still made of the same parts, but they behave in a different way.

 

The state between order and chaos (the "edge of chaos") is sometimes a very "informative" state, because the parts are not as rigidly assembled as in the case of order and, at the same time, they are not as loose as in the case of chaos. The system is stable enough to keep information and unstable enough to dissipate it. The system at the edge of chaos is both a storage and a broadcaster of information.

 

At the edge of chaos, information can propagate over long distances without decaying appreciably, thereby allowing for long-range correlation in behavior: ordered configurations do not allow for information to propagate at all, and disordered configurations cause information to quickly decay into random noise.
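This order/chaos boundary can be probed in the simplest nonlinear system, the logistic map (a standard illustration, not Langton's cellular-automaton measure): the average log-derivative along an orbit (the Lyapunov exponent) measures whether a small perturbation, i.e. a bit of information, decays or is amplified.

```python
import math

def lyapunov(r, x=0.31, warmup=500, steps=2000):
    """Lyapunov exponent of the logistic map x -> r*x*(1-x): negative
    means perturbations die out (order), positive means they are
    amplified into noise (chaos)."""
    for _ in range(warmup):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(steps):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / steps

print(lyapunov(2.9))    # negative: ordered regime, a fixed point
print(lyapunov(3.99))   # positive: chaotic regime
print(lyapunov(3.569))  # near zero: close to the boundary between the two
```

Near the boundary the exponent approaches zero: perturbations neither vanish immediately nor explode into noise, which is the regime in which information can be both stored and transmitted.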

 

This conclusion is consistent with Von Neumann's findings. A fundamental connection therefore exists between computation and phase transition.

 

The edge of chaos is where the system can perform computation, can metabolize, can adapt, can evolve. In a word: these systems can be alive.

 

Basically, Langton proved that Physics can support life only in a very narrow boundary between chaos and order. In that locus it is possible to build artificial organisms that will settle into recurring patterns conducive to an orderly transmission of information.

 

Langton also related phase transitions, computation and life, which means that he built a bridge between Thermodynamics, Information Theory and Biology.

 

The edge of chaos is also the locus of Murray Gell-Mann's speculations. Gell-Mann, a physicist who was awarded the Nobel prize for the theory of quarks, thinks that biological evolution is a complex adaptive system that complies with the second law of Thermodynamics once the entire environment, and not only the single organism, is taken into account.

 

Living organisms dwell "on the edge of chaos", as they exhibit order and chaos at the same time, and they must exhibit both in order to survive. Living organisms are complex adaptive systems that retrieve information from the world, find regularities, compress them into a schema to represent the world, predict the evolution of the world and prescribe behavior for themselves. The schema may undergo variants that compete with one another. Their competition is regulated by feedback from the real world under the form of selection pressure. Disorder is useful for the development of new behavior patterns that enable the organism to cope with a changing environment.

 

Technically speaking, once complex adaptive systems establish themselves, they operate through a cycle that involves variable schemata, randomness, phenotypic consequences and feedback of selection pressures to the competition among schemata.

 

 

Complex Systems

 

The American biologist Stuart Kauffman is the prophet of "complex" systems. Kauffman's quest is for the fundamental force that counteracts the universal drift towards disorder required by the second law of Thermodynamics. His idea is that Darwin was only half right: systems do evolve under the pressure of natural selection, but their quest for order is helped by a property of our universe, the property that "complex" systems just tend to organize themselves. Darwin's story is about the power of chance: by chance life developed and then evolved. Kauffman's story is about destiny: life is the almost inevitable result of a process inherent in nature.

 

Kauffman's first discovery was that cells behave like mathematical networks.

 

In the early 1960s, Monod and others discovered that genes are assembled not in a long string of instructions but in "genetic circuits". Within the cell, there are regulatory genes whose job is to turn other genes on or off. Therefore genes are not simply instructions to be carried out one after the other: they realize a complex network of messages. A regulatory gene may trigger another regulatory gene, which may trigger another gene, and so on. Each gene is typically controlled by two to ten other genes. Turning on just one gene may trigger an avalanche of effects.

 

The genetic program is not a sequence of instructions but rather a regulatory network that behaves like a self-organizing system.

 

By using a computer simulation of a cell-like network, Kauffman proved that, in any organism, the number of cell types must be approximately the square root of the number of genes.
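Kauffman's model of such genetic circuits, the random Boolean network, is easy to simulate (a minimal sketch; the network size, connectivity and seed are arbitrary): with each gene regulated by K = 2 others, trajectories quickly fall onto short attractor cycles rather than wandering through the astronomically large state space, and Kauffman identified these attractors with cell types.

```python
import random
random.seed(1)

def random_boolean_network(n, k=2):
    """A Kauffman-style network: each of n genes gets k regulator genes
    and a random Boolean rule (a truth table over its regulators)."""
    inputs = [random.sample(range(n), k) for _ in range(n)]
    tables = [[random.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    def net_step(state):
        return tuple(tables[i][sum(state[g] << j for j, g in enumerate(inputs[i]))]
                     for i in range(n))
    return net_step

def attractor_length(net_step, state):
    """Iterate until a state repeats: the repeating cycle is the attractor."""
    seen = {}
    t = 0
    while state not in seen:
        seen[state] = t
        state = net_step(state)
        t += 1
    return t - seen[state]

step = random_boolean_network(16, k=2)
state = tuple(random.randint(0, 1) for _ in range(16))
print(attractor_length(step, state))  # typically a short cycle, far below 2**16
```

The striking observation is that order (a small set of short cycles) appears for free in an entirely random wiring, as long as the connectivity K is low.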

 

He starts where Langton ended. His "candidate principle" states that organisms change their interactions in such a way to reach the boundary between order and chaos.

 

For example, the Danish physicist Per Bak studied the sandpile, whose collapse under the weight of a new grain is unpredictable: the pile self-organizes. No external force is shaping the pile of sand; it is the pile that organizes itself.
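Bak's sandpile is itself a small cellular automaton (the Bak-Tang-Wiesenfeld model; the grid size and number of grains here are arbitrary): grains are dropped one at a time, and any cell holding four grains topples onto its neighbors, possibly triggering them in turn. Avalanches of all sizes emerge without any external tuning.

```python
import random
random.seed(2)

def topple(grid, threshold=4):
    """Relax a sandpile: any cell holding `threshold` or more grains
    topples, shedding one grain to each neighbor (grains falling off
    the edge are lost). Returns the avalanche size (number of topplings)."""
    n = len(grid)
    avalanche = 0
    unstable = [(i, j) for i in range(n) for j in range(n)
                if grid[i][j] >= threshold]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < threshold:
            continue
        grid[i][j] -= threshold
        avalanche += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n:
                grid[a][b] += 1
                if grid[a][b] >= threshold:
                    unstable.append((a, b))
    return avalanche

n = 11
grid = [[0] * n for _ in range(n)]
sizes = []
for _ in range(3000):
    grid[random.randrange(n)][random.randrange(n)] += 1   # drop one grain
    sizes.append(topple(grid))
print(max(sizes))   # occasional large avalanches among many tiny ones
```

Most drops cause no avalanche at all, while a few trigger system-wide collapses: the pile has driven itself to a critical state poised between stability and collapse.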

 

Further examples include any ecosystem (in which organisms live at the border between extinction and overpopulation) and the price of a product (which is set by supply and demand, at the border between where nobody wants to buy it and where everybody wants to buy it). Evolution proceeds towards the edge of chaos. Systems on the boundary between order and chaos have the flexibility to adapt rapidly and successfully.

 

Living organisms are a particular type of complex adaptive systems. Natural selection and self-organization complement each other: they create complex systems poised at the edge between order and chaos, which are fit to evolve in a complex environment. At all levels of organization, whether of living organisms or ecosystems, the target of selection is a type of adaptive system at the edge between chaos and order.

 

Kauffman's mathematical model is based on the concept of "fitness landscapes" (originally introduced by Sewall Wright). A fitness landscape is a distribution of fitness values over the space of genotypes.

 

Evolution is the traversing of a fitness landscape. Peaks represent optimal fitness. Populations wander driven by mutation, selection and drift across the landscape in their search for peaks. It turns out that the best strategy for reaching the peaks occurs at the phase transition between order and disorder, or, again, at the edge of chaos. The same model applies to other biological phenomena and even nonbiological phenomena, and may therefore represent a universal law of nature.

 

Adaptive evolution can be represented as a local hill climbing search converging via fitter mutants toward some local or global optimum. Adaptive evolution occurs on rugged (multipeaked) fitness landscapes. The very structure of these landscapes implies that radiation and stasis are inherent features of adaptation. The Cambrian explosion and the Permian extinction (famous paradoxes of the fossil record) may be the natural consequences of inherent properties of rugged landscapes.
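Kauffman's NK model and the adaptive walk on it can be sketched directly (an illustrative sketch; the sizes are arbitrary and the random fitness contributions are generated lazily): each gene's contribution depends on its own allele and those of K other genes, and a walk via fitter one-gene mutants climbs until it is trapped on a local peak of the rugged landscape.

```python
import random
random.seed(3)

def nk_landscape(n, k):
    """Kauffman's NK model: gene i's fitness contribution depends on its
    own allele and the alleles of k other genes; total fitness is the
    mean contribution. Larger k makes the landscape more rugged."""
    neighbors = [random.sample([j for j in range(n) if j != i], k)
                 for i in range(n)]
    tables = [{} for _ in range(n)]   # memoized random contributions
    def fitness(genome):
        total = 0.0
        for i in range(n):
            key = (genome[i],) + tuple(genome[j] for j in neighbors[i])
            if key not in tables[i]:
                tables[i][key] = random.random()
            total += tables[i][key]
        return total / n
    return fitness

def hill_climb(fitness, genome):
    """Adaptive walk via fitter one-gene mutants, ending on a local peak."""
    improved = True
    while improved:
        improved = False
        for i in range(len(genome)):
            mutant = genome[:i] + (1 - genome[i],) + genome[i + 1:]
            if fitness(mutant) > fitness(genome):
                genome, improved = mutant, True
    return genome, fitness(genome)

f = nk_landscape(n=12, k=3)
start = tuple(random.randint(0, 1) for _ in range(12))
peak, height = hill_climb(f, start)
print(height >= f(start))   # True: the walk never loses fitness
```

Different starting genomes generally end on different local peaks, which is the model's picture of stasis: a population trapped on a peak stays there until something changes the landscape.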

 

Kauffman also noted how complex (nonlinear dynamic) systems which interact with the external world classify and know their world through their attractors.

 

Kauffman's view of life can be summarized as follows: autocatalytic networks (networks that feed themselves) arise spontaneously; natural selection brings them to the edge of chaos; a genetic regulatory mechanism accounts for metabolism and growth; attractors lay the foundations for cognition. The requirements for order to emerge are far easier than traditionally assumed.

 

The main theme of Kauffman's research is the ubiquitous trend towards self-organization. This trend causes the appearance of "emergent properties" in complex systems. One such property is life.

 

There is order for free.

 

Far from equilibrium, systems organize themselves. The way they organize themselves is such that it creates systems at higher levels, which in turn tend to organize themselves. Atoms organize in molecules that organize in autocatalytic sets that organize in living organisms that organize in ecosystems.

 

The whole universe may be driven by a principle similar to autocatalysis. The universe may be nothing but a hierarchy of autocatalytic sets.

 

 

Autonomous Systems

 

The Chilean neuroscientist Francisco Varela has adapted Maturana's thought to the theory of autonomous systems, by merging the themes of autonomy of natural systems (i.e. internal regulation, as opposed to control) and their informational abilities (i.e., cognition) into the theme of a system possessing an identity and interacting with the rest of the world.

 

The organization of a system is the set of relations that define it as a unity. The structure of a system is the set of relations among its components. The organization of a system is independent of the properties of its components. A machine can be realized by many sets of components and relations among them. Homeostatic systems are systems that keep the values of their variables within a small range of values, i.e. whose organization makes all feedback internal to them.

 

An autopoietic system is a homeostatic system that continuously generates its own organization (by continuously producing components that are capable of reproducing the organization that created them). Autopoietic systems turn out to be autonomous, to have an identity, to be unities, and to compensate external perturbations with internal structural changes.

 

Living systems are autopoietic systems in the physical space.

 

The two main features of living systems follow from this: self-reproduction can only occur in autopoietic systems, and evolution is a direct consequence of self-reproduction.

 

Every autonomous system is organizationally closed (it is defined as a unity by its organization).

 

The structure constitutes the system and determines its behavior in the environment; therefore, information is a structural aspect, not a semantic one. There is no need for a representation of information. Information is "codependent". Mechanisms of information and mechanisms of identity are dual. The cognitive domain of an autonomous system is the domain of interaction that it can enter without loss of closure.

 

An autonomous unit always exhibits two aspects: it specifies the distinction between self and non-self, and it deals with its environment in a cognitive fashion.

 

The momentous conclusion that Varela reaches is that every autonomous system (ecosystems, societies, brains, conversations) is a "mind" (in the sense of cognitive processes).

 

 

A Science of Prisms

 

Alternatives to traditional science now abound. One is interesting because it starts with a completely different approach towards reality and it encompasses more than just matter.

 

In the 1970's the American inventor and architect Buckminster Fuller developed a visionary theory, also called "synergetics", that attacked traditional science at its very roots.

 

"Synergy" is the behavior of a whole that cannot be explained by the parts taken separately. Synergetics, therefore, studies systems in a holistic (rather than reductionistic) way.

 

The way it does this, is by focusing on form rather than internal structure. Because of its emphasis on shape, Synergetics becomes a branch of Geometrics, the discipline of configurations (or patterns). Synergetics employs 60-degree coordination instead of the usual 90-degree coordination. The triangle (and tetrahedron) instead of the square (and the cube) is the fundamental geometric unit. Fuller's thought is inspired by one of his own inventions, the "geodesic" dome (1954), a structure that exploits a very efficient way of enclosing space and that gets stronger as it gets larger.

 

The bottom line is that reality is not made of "things", but of angle and frequency events. All experience can be reduced to only angles and frequencies.

 

Fuller finds "prisms" to be ubiquitous in nature and in culture. All systems contained in the universe are polyhedra, "universe" being the collection of all experiences of all individuals.

 

Synergetics rediscovers, in an almost mystical way, most of traditional science, mainly through topological considerations (with traditional topology extended to "omnitopology"). For example, Synergetics claims to prove that the universe is finite and expanding, and that Planck's constant is a "cosmic relationship".

 

 

The Emergence of a Science of Emergence

 

Prigogine's non-equilibrium Thermodynamics, Haken's synergetics, Von Bertalanffy's general systems theory and Kauffman's complex adaptive systems all point to the same scenario: the origin of life from inorganic matter is due to emergent processes of self-organization. The same processes account for phenomena at different levels in the organization of the universe, and, in particular, for cognition. Cognition appears to be a general property of systems, not exclusive to the human mind.

 

A science of emergence, as an alternative to traditional, reductionist science, could possibly explain all systems (living and not).
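The kind of emergence these theories describe can be illustrated with a toy model. The sketch below is an elementary cellular automaton in the spirit of the Langton and Toffoli & Margolus references further down; the particular rule (Rule 30) and grid size are arbitrary choices for illustration. Each cell's next state depends only on itself and its two neighbours, yet a single seed cell generates a complex, aperiodic global pattern: simple local rules, emergent global structure.

```python
RULE = 30  # Wolfram's rule number: bit k gives the next state
           # for a neighbourhood whose three cells encode k in binary

def step(cells):
    """One synchronous update of a ring of cells (wrap-around ends)."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1  # a single "on" cell in the middle
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Running it prints a triangular cascade of nested, irregular structure that no inspection of the three-cell update rule would suggest, which is the point of "synergy": the behavior of the whole is not explained by the parts taken separately.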

 

Further Reading

 

Buchler Justus: METAPHYSICS OF NATURAL COMPLEXES (Columbia University Press, 1966)

 

Bunge Mario: TREATISE ON BASIC PHILOSOPHY (Reidel, 1974-83)

 

Cohen Jack & Stewart Ian: THE COLLAPSE OF CHAOS (Viking, 1994)

 

Coveney Peter: FRONTIERS OF COMPLEXITY (Fawcett, 1995)

 

Dalenoort G.J.: THE PARADIGM OF SELF-ORGANIZATION (Gordon & Breach, 1989)

 

Dalenoort G.J.: THE PARADIGM OF SELF-ORGANIZATION II (Gordon & Breach, 1994)

 

Davies Paul: GOD AND THE NEW PHYSICS (Penguin, 1982)

 

Eigen Manfred & Schuster Peter: THE HYPERCYCLE (Springer Verlag, 1979)

 

Forrest Stephanie: EMERGENT COMPUTATION (MIT Press, 1991)

 

Fuller Richard Buckminster: SYNERGETICS: EXPLORATIONS IN THE GEOMETRY OF THINKING (Macmillan, 1975)

 

Fuller Buckminster: COSMOGRAPHY (Macmillan, 1992)

 

Gell-Mann Murray: THE QUARK AND THE JAGUAR (W.H.Freeman, 1994)

 

Gleick James: CHAOS (Viking, 1987)

 

Goldberg David: GENETIC ALGORITHMS (Addison Wesley, 1989)

 

Haken Hermann: SYNERGETICS (Springer-Verlag, 1977)

 

Holland John: ADAPTATION IN NATURAL AND ARTIFICIAL SYSTEMS (Univ of Michigan Press, 1975)

 

Holland John: HIDDEN ORDER (Addison Wesley, 1995)

 

Kauffman Stuart: THE ORIGINS OF ORDER (Oxford University Press, 1993)

 

Kauffman Stuart: AT HOME IN THE UNIVERSE (Oxford Univ Press, 1995)

 

Koestler Arthur: THE GHOST IN THE MACHINE (Henry Regnery, 1967)

 

Langton Christopher: ARTIFICIAL LIFE (Addison-Wesley, 1989)

 

Laszlo Ervin: INTRODUCTION TO SYSTEMS PHILOSOPHY (Gordon & Breach, 1972)

 

Lewin Roger: COMPLEXITY (Macmillan, 1992)

 

Mandelbrot Benoit: THE FRACTAL GEOMETRY OF NATURE (W.H.Freeman, 1982)

 

Nicolis Gregoire & Prigogine Ilya: SELF-ORGANIZATION IN NON-EQUILIBRIUM SYSTEMS (Wiley, 1977)

 

Nicolis Gregoire & Prigogine Ilya: EXPLORING COMPLEXITY (W.H.Freeman, 1989)

 

Nicolis Gregoire: INTRODUCTION TO NONLINEAR SCIENCE (Cambridge University Press, 1995)

 

Pattee Howard Hunt: HIERARCHY THEORY (Braziller, 1973)

 

Prigogine Ilya: INTRODUCTION TO THERMODYNAMICS OF IRREVERSIBLE PROCESSES (Interscience Publishers, 1961)

 

Prigogine Ilya: NON-EQUILIBRIUM STATISTICAL MECHANICS (Interscience Publishers, 1962)

 

Prigogine Ilya & Stengers Isabelle: ORDER OUT OF CHAOS (Bantam, 1984)

 

Salthe Stanley: EVOLVING HIERARCHICAL SYSTEMS (Columbia University Press, 1985)

 

Salthe Stanley: DEVELOPMENT AND EVOLUTION (MIT Press, 1993)

 

Thom René: STRUCTURAL STABILITY AND MORPHOGENESIS (Benjamin, 1975)

 

Thom René: MATHEMATICAL MODELS OF MORPHOGENESIS (Horwood, 1983)

 

Toffoli Tommaso & Margolus Norman: CELLULAR AUTOMATA MACHINES (MIT Press, 1987)

 

Turing Alan Mathison: MORPHOGENESIS (North-Holland, 1992)

 

Varela Francisco: PRINCIPLES OF BIOLOGICAL AUTONOMY (North Holland, 1979)

 

Von Bertalanffy Ludwig: GENERAL SYSTEM THEORY (Braziller, 1968)

 

Von Neumann John: THEORY OF SELF-REPRODUCING AUTOMATA (University of Illinois Press, 1966)

 

Waldrop Mitchell: COMPLEXITY (Simon & Schuster, 1992)

 

Zeeman Erik Christopher: CATASTROPHE THEORY (Addison-Wesley, 1977)
