Tuesday, February 8, 2011

Tao without method

The impossibility of defining a paradigm for complexity, and the consequent lack of general computational or descriptive methodologies for the solution of complex problems in a given field, result in great difficulty in the formulation, description, and solution of problems involving complex systems.
A methodology is the scientific procedure that enables the classic problem-solving process:

definition       theory/model
                       ↕
problem →→→ calculation/description →→→ solution
data

In classical science, in what Weaver defines as problems of simplicity and problems of disorganized complexity, the problem is always well defined: the theory or model provides the methodology of calculation/description, and with it one can find the solution, which is not known in advance. So of the three terms problem / calculation-description / solution, two are known (the problem and the calculation/description) and the solution is unknown, and for this reason it is calculated/described. The whole procedure takes place in the presence of a paradigm, which provides the definition of the problem, the data one needs to know to solve it, and the theory/model for its solution. If the problem is at the physical level, the presence of a formal theory (i.e. mathematics) allows the solution in the form of a number or a function. For higher levels, such as chemistry, biology, etc., the solution is an acceptable and complete description within the paradigm that contains the problem. It is worth noting, among other things, that only the presence of a shared paradigm makes it possible, for example, to carry out the familiar assessment of school students, or to judge the validity of scientific and academic careers.

Take for example a simple problem of simplicity known to any elementary-school student:
in a cubic tank of side L, IN liters of water per second come in from a tap and OUT liters per second go out through an outlet; assuming that IN is greater than OUT, after how much time will the water reach the edge of the tank?
The theory that allows one to do the simple calculation needed to find the solution is basic physics, and the calculation is carried out by a branch of mathematics called arithmetic, yielding as the solution to the problem a number expressed in units of time.
Even in this elementary case it is noteworthy that the underlying assumptions are non-elementary, such as the principle of conservation (of mass, in this case) and the logical competence to perform arithmetic calculations, that is, knowing how to use Peano's axioms and the rules of arithmetic.
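A minimal sketch of this calculation in Python, assuming the side L is given in meters and both flows in liters per second (the values in the example call are hypothetical):

def fill_time(side_m, rate_in, rate_out):
    """Seconds for the water to reach the edge of a cubic tank.

    side_m: side of the cubic tank, in meters
    rate_in, rate_out: tap inflow and outlet outflow, in liters/second
    """
    if rate_in <= rate_out:
        raise ValueError("the tank never fills: IN must exceed OUT")
    capacity_liters = side_m ** 3 * 1000   # 1 cubic meter = 1000 liters
    net_inflow = rate_in - rate_out        # net liters gained per second
    return capacity_liters / net_inflow

print(fill_time(2, 5, 3))   # hypothetical: L = 2 m, IN = 5 l/s, OUT = 3 l/s -> 4000.0 seconds

Conservation of mass enters exactly where the sketch computes the net inflow: what comes in, minus what goes out, is what accumulates.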

For problems of disorganized complexity the same solution methodology applies, but one passes from a deterministic paradigm, and therefore a deterministic method, to a probabilistic one. For example, in the simple case of tossing a coin, if the question is posed in a deterministic way, as "will the toss come up heads or tails?", the answer is impossible; framed in terms of probability, instead, the problem is easily solved, and the complete answer is that the probability of each outcome is exactly 50%. In this case the solution is still expressed as a number or a function, as before, but as a probability.
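A minimal sketch of the coin example, assuming a fair coin simulated with Python's standard random module: the deterministic question has no answer for a single toss, but the relative frequency of heads converges to the probabilistic answer of 50% as tosses accumulate.

import random

random.seed(0)   # arbitrary fixed seed, so the run is reproducible
for n in (100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>9} tosses: frequency of heads = {heads / n:.4f}")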
Entire, very complex sectors of science and its applications are exactly solvable in this way: statistical mechanics, that is, the application of probability theory to the thermodynamic behavior of systems composed of a large number of elements, which provides a model linking the properties of individual atoms and molecules to the macroscopic properties of the system they compose; and information theory, developed by Shannon with contributions from Weaver himself, which is the theoretical basis for the description and implementation of any telecommunication system.


Even in the case of problems of disorganized complexity, exactly solvable in a statistical sense, there are examples of the emergence of collective complex properties not immediately related to the properties of the individual elements.
The most significant example is the following: there is a game in which a ball bouncing down a slope falls among small cylinders arranged at random, which block the most direct route, and at the end of this jungle of obstacles it slips into one of a row of boxes placed at the bottom of the hill. Guessing where the ball will end up is a difficult task: the system is not integrable and there is chaos, unpredictability.


Yet there is a bet one can make with a good chance of winning: that, launching a thousand balls, the boxes below the middle will fill much more than those at the edges. We set off from the world of determinism, but trying it one would see that, as the number of launches increases, the profile of the heights of the columns of balls gets closer and closer to a Gaussian.


This result is based on one of the most important theorems of probability theory, the Central Limit Theorem, which states that the sum of a very large number of independent random variables, suitably normalized, tends to the standard normal distribution, that is, a Gaussian; and the approximation improves as the number of balls grows.
One can read the result in many ways, attributing the cause to the different number of paths that lead to the individual boxes, or to the "fraying" of the initial conditions along the paths. But in fact we are facing a new phenomenon. We cannot help but recognize that this is something different from the motion of a single ball: it is a collective effect, found only over many repeated launches, which requires the introduction of collective variables regulated by new laws of nature, different from the deterministic laws of motion. They are the statistical laws, which by their nature apply only to systems composed of many elements. Laws partly linked to those of the motion of the individual elements, but largely new and independent. Laws that no longer allow certain forecasts, but only probable ones.

(freely adapted from a presentation by Prof. Mario Rasetti, "Theory of Complexity", 2008)
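As an illustrative sketch (not part of the original presentation), the experiment is easy to reproduce in Python: each ball makes a fixed number of independent left/right bounces, and the box it lands in is the sum of those bounces, so the Central Limit Theorem predicts the Gaussian profile. The numbers of rows and balls below are arbitrary choices.

import random
from collections import Counter

random.seed(1)                 # arbitrary seed for a reproducible run
ROWS, BALLS = 12, 1000         # obstacles crossed per ball, and balls launched

boxes = Counter(
    sum(random.choice((0, 1)) for _ in range(ROWS))   # rightward bounces
    for _ in range(BALLS)
)

for box in range(ROWS + 1):    # one '#' per 5 balls, to keep the lines short
    print(f"box {box:2d} | {'#' * (boxes[box] // 5)}")

Running it, the central boxes pile up far higher than those at the edges, and the profile of the bars approaches the bell curve more and more closely as BALLS grows.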


In the Science of Complexity, that is, for problems of organized complexity, things are radically different: in this case the problem is almost always well defined; the theory/model can be known, although it may correspond to the union of many theories/models from different disciplines; the solution, in many cases but not all, is already known; what is lacking is the calculation/description procedure, since there is no general methodology for the solution of the problem.
In Weaver's classic example, "What makes an evening primrose open when it does?", the problem is very well defined; the necessary data (climate, temperature variation, soil composition, structure, morphology, stage of plant growth, etc.) can all be known with great precision; and the solution is known to anyone who walks through a meadow in spring: in continental Europe, at a given latitude and altitude, in a place that in previous years has hosted primroses and has not undergone major ecological changes, the probability that some primroses will bloom between late February and the beginning of May is 100%. Yet one cannot define the solution even in a probabilistic/statistical way, because no theory/model is able to provide a probability function in time, and still less is it possible to answer what makes the primroses bloom or not. A number of topics in physics, biophysics, chemistry, geochemistry, biochemistry, and biology are capable of describing many of the processes involved, but the total process, being complex, cannot be described in complete form.
