
THE LOGIC OF TRANSDISCIPLINARITY

Luiz de Carvalho

Recife – PE 2007

MINISTÉRIO DA EDUCAÇÃO UNIVERSIDADE FEDERAL RURAL DE PERNAMBUCO


Doctoral thesis project proposal to be presented to Dr. Basarab Nicolescu, with co-supervision by Professors Dr. Romildo Nogueira and Dr. George C. Jimenez.


Category Theory Language

1 Category theory and local and nonlocal toposes

1.1 Local theory

1.2 Local language interpretation

1.3 Toposes and levels of reality

1.4 Local toposes and sheaves

1.5 Logic fibring

2 Logic of transdisciplinarity/complexity

2.1 Intuitive characteristics, language, and formalization

2.2 Set theory paradoxes and type theory

2.3 Ascending hierarchy of types

2.4 Higher-order logic

2.5 Complex plurality and algorithmic complexity

2.6 Law of excluded middle: local and nonlocal semantic toposes

2.7 The nonlocal and the inconsistency

3 Transdisciplinary logic's fibring and algebraization

3.1 Possible-translations semantics

3.2 Morphism between two signature systems in Hilbert style, H1 ⊕ H2

Conclusion


Presentation: a brief historical-scientific introduction

The aim of this work is to offer a logico-formal and axiomatic characterization of the theory of transdisciplinarity. However, an introduction to transdisciplinarity requires a brief reading of the historical-scientific events that justify the need for a new epistemic approach, suited to the changes that have occurred over almost two centuries. These changes have progressively broken with some, or almost all, of the principles of classical logic, which have motivated scientific concepts up to now. Those principles are: the principle of identity, which holds that anything is itself, or formally, A is A; the law of non-contradiction, which holds that, between two contradictory propositions, one is false, or formally, ¬(P ∧ ¬P); and the law of excluded middle, or formally, P ∨ ¬P.
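
As a concrete illustration of these three principles, here is a minimal Python sketch, with a helper function of our own, that checks them by exhausting all truth-value assignments:

```python
from itertools import product

def is_tautology(formula, n_vars=1):
    """Check a propositional formula (given as a Python function of booleans)
    against every possible truth-value assignment."""
    return all(formula(*vals) for vals in product([True, False], repeat=n_vars))

# Principle of identity, read as A -> A
print(is_tautology(lambda a: (not a) or a))           # True
# Law of non-contradiction: not(P and not P)
print(is_tautology(lambda p: not (p and (not p))))    # True
# Law of excluded middle: P or not P
print(is_tautology(lambda p: p or (not p)))           # True
```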

The history of the classical logico-deductive method starts with Aristotle and was then refined up to the beginning of modernity. Nevertheless, from the nineteenth century onward, cracks begin to appear in classical Aristotelian-Euclidean thinking, with the so-called non-Euclidean geometries. Euclid (330-277 B.C.), with his Elements, gave a systematic form to Greek knowledge, gathering propositions and demonstrations in a deductive way, according to Aristotle's lesson, which holds that, in a theory T, a proposition P and its negation, non-P (¬P), cannot both be deduced, and that only P or non-P is demonstrable.

Logic is the practical study of valid arguments and, for centuries, following the Elements, the geometric model of theory prevailed, until the emergence of the completeness and movement problems, linked to the fifth postulate (that, if a straight line falling on two straight lines makes the interior angles on the same side less than two right angles, the two straight lines, if produced indefinitely, meet on that side on which the angles are less than the two right angles) and to the congruence axiom, that is:

a) (Completeness): the axiomatization of the Elements is not sufficient to deduce its theorems.

b) (Movement): the congruence axioms depend on the concept of movement.

For many years, two tendencies tried to solve these problems: adding more axioms to complete geometry, or showing that this attempt would be impossible. Archimedes stood up for the first tendency, and others for different versions of the fifth postulate (given any straight line and a point not on it, there "exists one and only one straight line which passes" through that point). Nonetheless, Lobachevski, Gauss and Bolyai argued that geometry does not represent physical reality, and that its postulates and theorems are not necessarily true in the physical world.

The fifth postulate was attacked, culminating in Lobachevski's postulate (Lp): there exist two lines parallel to a given line through a given point not on the line. A completely new geometry emerged from this novel axiom, presenting no contradictions relative to Euclidean geometry. From these advances, it could no longer be held that the axioms of mathematics are intuitive constructions based on some kind of non-mathematical evidence.

In this way, the evolutionary process of the deductive axiomatic system leaves the Aristotelian grammatical and discursive scope and characterizes itself as an algebra of reasoning with G. Boole (1815-1864). After that, how is it possible to establish the consistency of the axioms in a reliable way? It is necessary to build up mathematics from itself. On the other hand, the congruence axioms in Book V of the Elements, from which Eudoxus' (408 B.C. - 355 B.C.) method of exhaustion and the Cauchy-Riemann integral emerged, were initially inspiring for the conception of real numbers as a geometric abstraction (Axiom XI: all right angles are equal); it was necessary to give consistency to the reals, free from the infinite approach of the infinitesimals, which Berkeley used to call ghosts of departed quantities; this was performed by Weierstrass, with his formal theory of limits, and by Dedekind and Cantor, with their construction of the real numbers based on the natural numbers, known as the arithmetization of analysis. Such constructions involve some use of mathematical infinity. In this context, David Hilbert's program arose: he wanted mathematics to be formulated on a solid and complete logical foundation.

The axiomatic method was encouraged by these developments, and this perspective was also adopted by Isaac Newton (1642-1727), who organized classical dynamics in an axiomatic and deductive way, in his PRINCIPIA, through three axioms or laws of motion. Newton's axiomatics relied on the method of the infinitesimal calculus. However, the presumed self-evidence of the three axiomatic principles that validated Newtonian principles, particularly the fundamental assertion regarding the principle of inertia and its parameters, absolute time and space, ended up as a naive pretension. Despite its foundational problems, the Newtonian model prevailed during the 1700s with a scientific revolution that was born to fight Aristotle's authority, but that turned into a closed and powerful model, whose principles evoke a privileged observer, Laplace's demon, and that, according to Basarab Nicolescu, validated the consistency of the postulates of the newborn science:

(I) The existence of universal laws, mathematically characterized

(II) The discovery of these laws by means of scientific experiment

(III) The perfect reproducibility of the experimental data

As to what would assure the consistency of mathematics as an instrument for the new research program of science, Cantor's theory of sets, on which elementary arithmetic is based, retained Aristotelian classical logic as its underlying logic (D'OTTAVIANO, 1994).

The successes of classical physics, on the other hand, kept confirming the three postulates, even though the first postulate had never been definitively proved. The movement of arithmetization of analysis, which through set theory would lead to the so-called set-theoretic paradoxes, or the Cantor-Russell paradox (if such a set is not a member of itself, it qualifies as a member of itself by the same definition), culminated in Hilbert's program, which aimed to prove that such problems could be surpassed through a suitable formalization that would allow a meta-theoretical demonstration of the consistency of arithmetic and, as a consequence, of mathematics (HILBERT & BERNAYS, 1934).

During the first half of the 20th century, Kurt Gödel published his incompleteness theorems, which would limit Hilbert's program:

A: (First theorem) To every ω-consistent recursive class κ of formulas there correspond recursive class signs r, such that neither v Gen r nor Neg(v Gen r) belongs to Flg(κ) (where v is the free variable of r).

B: (Second theorem) For any formal recursively enumerable (i.e. effectively generated) theory T including basic arithmetical truths and also certain truths about formal provability, T includes a statement of its own consistency if and only if T is inconsistent.

One of the consequences of these theorems is that any formal system that involves elementary arithmetic admits true but non-demonstrable sentences. Newton C. A. da Costa and Francisco A. Dória (da COSTA and DORIA, 1991) have extended this result to classical physics. Nevertheless, what do Gödel's theorems mean for science and, particularly, for physics? The philosopher of science Maria Luiza Dalla Chiara understands that, from the point of view of the development of logical theories in their non-independent relation to any research content, Gödel's theorems have produced a situation of plurality in the treatment of any physical theory (DALLA CHIARA, 1980).

However, if Gödel's theorems regard only the formal aspect, in what sense could the theorems be extended to physical theories? Before we continue, let us take a look at another consequence of Gödel's theorems, undecidability, which will make our question clearer:

C: For any consistent formal theory that proves basic arithmetical truths, it is possible to construct an arithmetical statement that is true but not provable in the theory. That is, any consistent theory of a certain expressive strength is incomplete. The meaning of "it is possible to construct" is that there is some mechanical procedure that produces such a statement.

Albert Einstein, concerning the debate about the incompleteness of quantum physics in the famous EPR argument (Einstein-Podolsky-Rosen), which we are going to consider shortly, gives us the necessary elements to understand Gödel's results regarding physical theories, which Lokenath Debnath synthesized in these words:

The totality of concepts and propositions constitutes a physical theory provided it satisfies two essential requirements: external justification, which means the agreement between theory and experiment, and internal perfection, which implies that the theory can be inferred from the most general principles without additional concepts and assumptions. (DEBNATH, 1982).

Page 7: 29364360 the-logic-of-transdisciplinarity-2

What Einstein understands as a theory's inner perfection is its consistency as a mathematical structure, which leads us to Gödel's theorem A. However, external justification demands an axiomatic characterization of physical theories, which we can obtain from Patrick Suppes' structure (SUPPES, 1967), linking mathematical theory and physical phenomena:

A physical theory T is a triple T = ⟨M, D, R⟩, where M is the theory's mathematical structure, D is the application domain, and R is the set of rules that links M and D, according to da Costa and Sant'Anna (da COSTA and SANT'ANNA, 2006).
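
Purely as an illustration of the shape of this triple, and not as a construction found in Suppes or in da Costa and Sant'Anna, a schematic Python sketch (all names are hypothetical):

```python
from dataclasses import dataclass
from typing import Any, Callable, Set

@dataclass
class PhysicalTheory:
    """Suppes-style triple T = <M, D, R> (schematic)."""
    M: Any                            # mathematical structure (e.g., a state space and dynamics)
    D: Set[Any]                       # application domain: the physical systems/phenomena covered
    R: Callable[[Any, Any], bool]     # correspondence rules linking elements of M to elements of D

# Hypothetical usage: R decides whether a mathematical state "accounts for" an observed phenomenon.
toy = PhysicalTheory(M={"states": [0, 1]},
                     D={"observed_click"},
                     R=lambda state, phenomenon: state in (0, 1))
print(toy.R(1, "observed_click"))   # True: the rule links model and domain
```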

On the other hand, mathematical concepts cannot, in principle, be considered a faithful picture of the physical world; but, through axiomatization, they become susceptible to logico-mathematical tools, so that undecidable propositions like C may hold physical meaning. In fact, according to Lycurgo (LYCURGO, 2004), there is a connection between these results and what we can observe in the physical world; what is relevant is the notion of completeness, according to the definition that "a physical theory is complete if every element of the physical reality has a counterpart in the physical theory", that is, there is more in the world than any specific theory can anticipate, according to Gödel's results. But these results do not give us hope of obtaining a complete axiomatization, since incompleteness is structural. According to Maria Luiza Dalla Chiara, the notion of an independent logic is historically connected to some kind of single logic (DALLA CHIARA, 1980).

Nevertheless, in the face of current demands, logic is no longer characterized in terms of the concept of valid formula, but is conceived, following Tarski, as a consequence relation, allowing a plural treatment of logics and of their interrelations.

At the same time, another revolution was happening in the world of scientific research, which would definitively place us in another perspective: quantum mechanics. This theory is the best example of a blend of semi-complete theories. Historically, we first have the quantum theories of Planck (1900), Einstein (1905), Bohr (1913) and De Broglie (1924); later, Heisenberg's matrix mechanics (1925), Schrödinger's wave mechanics (1926), and Pauli's and Dirac's theories. All these formulations incorporate, somehow, the controversial phenomena that turn quantum mechanics into a singularity in the history of science: wave-particle duality, its probabilistic aspect, and the concept of inseparability.

Quantum theory has its fundamental experiment, Young's double-slit experiment (http://www.upscale.utoronto.ca/GeneralInterest/Harrison/DoubleSlit/DoubleSlit.html#TwoSlitsElectrons), which distinguishes quantum mechanics from classical mechanics through the explanation of atomic phenomena. This experiment provides us with some odd results: if A denotes the experiment with slit A open and B the one with slit B open, the first case (A open, B closed) can be represented as a logical circuit, (A ∧ ¬B) → (¬(A ↔ B) ↔ (¬¬A ∧ ¬B)), that is, particles remain particles with a peculiar intensity distribution; and the same happens when B is open and A is closed, (B ∧ ¬A) → (¬(B ↔ A) ↔ (¬¬B ∧ ¬A)). But what do we observe when we open both slits A and B at the same time? Based upon the previous experiments, and on the assumption that the third one would keep the distribution consistent, we would expect the natural sum of both beams projected on the screen, as if each electron moved along its own well-drawn and fixed trajectory, passing through one slit without influencing the electrons that pass through the other. Therefore, we could argue that, if this holds for every experiment n, it also holds for n+1, as follows:

1) (A → [ ]) ∧ (B → [ ])    hypothesis
2) A → [ ]                   ∧-separation, from 1
3) B → [ ]                   ∧-separation, from 1
4) [ ]                       Modus Ponens, 1 and 2
5) [ ]                       Modus Ponens, 1 and 3
6) [ ] ∧ [ ]                 ∧-introduction, 4 and 5

where each [ ] stands for the corresponding single-slit intensity distribution.

However, this is not exactly what happens. The phenomenon of electron diffraction shows a picture on the screen that is due to interference and does not correspond to the sum of the pictures produced by each of the slits individually, as the above hypothesis assumes. Now let us imagine that this phenomenon is consistent with the first two experiments, i.e., that the third experiment maintains the properties of the electrons from the previous experiments. This would mean that the undulatory aspect observed is produced by the particles individually, suggesting some hypotheses:

a) electrons should irradiate electromagnetic waves continuously;

b) they would need an inexhaustible energy source in order not to collapse into their nuclei;

c) their speed would have to be higher than the speed of light c.

As these hypotheses are senseless, quantum mechanics is contradictory with classical electrodynamics, which proposes that the undulatory aspect is a consequence of the electron's trajectory.

We can conclude that light presents particle and wave features at the same time. The first two experiments obviously suggest the concept of continuity, according to Nicolescu: it is not possible to pass from one extreme to the other without passing through the intermediate space. However, the third experiment breaks continuity and places non-classical probability as a suitable mathematical tool to deal with diffraction interference.
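
The gap between the classically expected sum of the two single-slit distributions and the interference actually observed can be made concrete with a toy amplitude calculation; a minimal sketch, assuming idealized point-source amplitudes in arbitrary units:

```python
import numpy as np

x = np.linspace(-5, 5, 11)          # positions on the screen (arbitrary units)
k, d = 2.0, 1.0                     # wave number and slit separation (toy values)

psi_A = np.exp(1j * k * np.sqrt(x**2 + 1))            # amplitude through slit A alone
psi_B = np.exp(1j * k * np.sqrt((x - d)**2 + 1))      # amplitude through slit B alone

classical_sum = np.abs(psi_A)**2 + np.abs(psi_B)**2   # what the inductive argument above predicts
quantum = np.abs(psi_A + psi_B)**2                    # what diffraction actually shows: cross terms

print(np.allclose(classical_sum, quantum))            # False: the interference term does not vanish
```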

The diffraction phenomenon demonstrates that the continuity perceived in the previous experiments with individual particles is not valid, because diffraction presents bindings in which particles would not be split, according to the hypotheses shown above; this phenomenon is known as nonseparability or nonlocality. This condition made many researchers react, resulting in the EPR paradox, or Einstein-Podolsky-Rosen paradox. Essentially, EPR is a thought experiment showing that the result of a measurement performed on one part A of a quantum system has a non-local effect on the physical reality of another distant part B. As far as one can judge, this situation should not be possible, because special relativity asserts that no information can be transmitted faster than light. Nevertheless, John Bell, with his theorem, proved an absolute distinction between quantum and classical mechanics. Assuming separability and sustaining hypotheses a, b, and c, we would need hidden variables that would continuously determine the diffraction experiments, in a given interpretation of quantum phenomena.

Bell proved that the hypothesis of local separability (a particle holds defined values that do not depend on a contextualization, and the speed of propagation of physical effects is finite) is not in accordance with quantum physics; that is, the correlation coefficient satisfies C(θ) < N for local hidden-variable theories, while for quantum mechanics C(θ) > N for some θ. A new kind of causality was born: non-local causality. In our macrophysical world, if two objects interact at a certain moment in time and then move apart, they interact less. In the quantum world, entities keep interacting regardless of the distance (NICOLESCU, 2001).
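
The kind of violation alluded to here can be illustrated with the CHSH form of Bell's inequality, where the local hidden-variable bound is 2 while the quantum singlet correlation E(a, b) = -cos(a - b) reaches 2√2 at the standard angles; a minimal numerical sketch:

```python
import numpy as np

def E(a, b):
    """Quantum correlation of a spin singlet for analyzer angles a and b (radians)."""
    return -np.cos(a - b)

# Standard CHSH measurement angles
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)          # ~2.828 = 2*sqrt(2)
print(S > 2)      # True: exceeds the bound satisfied by every local hidden-variable model
```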

The idea of non-local causality questioned another principle of classical science: the existence of only one level of reality, by which Laplace's demon, a privileged observer, could grasp an invariant system governed by a number of general laws (NICOLESCU, 2001). In this context, over one hundred years of scientific development marked by a pluralization of perspectives, transdisciplinarity and complexity have emerged, along with several other perspectives with the same objective: to adjust to complexity and to the current disciplinary boom.

TRANSDISCIPLINARITY: COMPLEXITY, LOGIC OF THE INCLUDED THIRD AND LEVELS OF REALITY

What is transdisciplinarity/complexity as defined by Basarab Nicolescu? Transdisciplinarity is founded upon three pillars: levels of reality, the logic of the included third, and complexity. The revolution of quantum physics with its problems, especially the well-known measurement problem, plays a central role in the concepts of transdisciplinarity and complexity. Avoiding metaphysical subtleties, which are as rigorous as calculation, it is sufficient for us to take the idea of levels of reality as an ensemble of systems invariant under the action of a number of local general laws, or as something that offers resistance to our experiences. But we can use the principle that "the non-local is inconsistent with the local (NL)" and from this say that a level of reality is a structurally well-defined class, and is local, whose plurality constitutes the non-local classes that make up complexity. The NL principle means that two levels of reality are different if, in passing from one to the other, there is a breaking of the laws and a break in fundamental concepts (NICOLESCU, 2001). To deal with levels of reality from a logical point of view, which is our goal, we need the theory of locality studied within category theory: the local topos, to which Lawvere gave an axiomatic characterization similar to that of the category of sets, the so-called toposes, in reference to the toposes introduced by Grothendieck for algebraic geometry (LAWVERE, 1975). Then levels of reality and levels of perception form a local topos. A topos is a kind of category that has the following properties, similar to the category of sets:

- It should have a terminal object corresponding to the unitary element, the set {∗};

- It should be possible to form the Cartesian product A×B of two objects and the exponential object C^B of functions from B to C;

- It must have an object of truth values Ω with a distinguished element ⊤ and a one-to-one correspondence between subobjects B of A and their characteristic functions χ_B: A → Ω, such that, for any element a of A, a belongs to B if and only if χ_B(a) = ⊤; and it must have an object of natural numbers N equipped with an element 0 ∈ N and a successor function S: N → N defined by recursion, such that for each a ∈ A and each function h: A → A we can build a unique function φ: N → A with φ(n) = h^n(a) for any natural number n (LAMBEK, 2004).

Also according to Lawvere, these properties can be expressed in the language of category theory, without referring to elements, in such a way that ⊤ is seen as an arrow 1 → Ω and 0 as an arrow 1 → N. Thus, each topos Δ is associated with a type theory λ(Δ) as its internal language. Conversely, to every type theory ρ is associated a topos π(ρ), the topos generated by ρ, known as the Lindenbaum-Tarski category of ρ (LAMBEK, 2004). If we take levels of reality and their perception as local toposes, what relationship exists between complexity and transdisciplinarity? We can say that the possibility of dealing with multiple levels of the reality of knowledge, which constitutes a complex plurality in itself, is given by the logic of the included third, without the need for an included fourth, fifth, ..., nth.
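
The recursion property of the natural numbers object quoted above (the unique φ: N → A with φ(n) = h^n(a)) is ordinary iteration; a minimal sketch, with names of our own choosing:

```python
from typing import Callable, TypeVar

A = TypeVar("A")

def phi(a: A, h: Callable[[A], A]) -> Callable[[int], A]:
    """Return the unique map phi: N -> A with phi(0) = a and phi(n+1) = h(phi(n)),
    i.e. phi(n) = h^n(a), mirroring the natural numbers object of a topos."""
    def f(n: int) -> A:
        x = a
        for _ in range(n):
            x = h(x)
        return x
    return f

double_from_one = phi(1, lambda x: 2 * x)        # h = doubling, a = 1
print([double_from_one(n) for n in range(5)])    # [1, 2, 4, 8, 16] = 2^n
```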

What is assumed about levels of reality, for them to meet the above requirements as local toposes, is that they keep a kind of coherence asserted by the axiom of regularity of ZF (Zermelo-Fraenkel): any non-empty set X contains an element Y that is disjoint from X, formally, X ≠ ∅ → ∃Y (Y ∈ X ∧ Y ∩ X = ∅); that axiom forbids circularities such as the set of all sets that do not belong to themselves as members, formally M = {A | A ∉ A}, which would therefore belong to itself, that is, Russell's paradox. With the results of Peirce's theorem, that "every polyad higher than a triad can be analysed in terms of triads, but a triad cannot in general be analysed in terms of dyads", as Nicolescu does (NICOLESCU, 1998), and those of N. Rescher for many-valued logic (Many-Valued Logic, 1969), according to which any system of n-valued logics Ln will contain tautological formulas as theorems that are not necessarily theorems of Ln+1, and since, by the deduction theorem of classical logic (CL), every tautology is a theorem of CL, we have the local topos complete.

However, the transdisciplinary local toposes are generally incomplete, the generalized axiom of choice does not hold in them, and they could be formalized by intuitionistic logic (J), being classical insofar as the levels of reality are local. What constitutes the complexity of transdisciplinary logic, as we understand it, is the non-local interaction of local toposes in accordance with the NL principle. That is the sense of quantum discontinuity as Nicolescu understands it (NICOLESCU, 1998).

From this point on, the issues raised by Nicolescu about the sense of the open unity of the world are essential to understand what he calls the Gödelian structure, the correlation of levels of reality, for example: What is the nature of the theory that can describe the passage from one level of reality to another? Is there coherence, or even unity, in the ensemble of levels of reality? What should be the role of the subject-observer in the existence of an eventual unity of all the levels of reality? Is there a privileged level of reality? Would the unity of knowledge, if any, be of a subjective or objective nature? What is the role of reason in the eventual unity of knowledge? What would be, in the field of reflection and action, the predictive power of the new model of reality? Would it be possible to understand the present world? In the same text, Nicolescu refers to the ensemble of levels of reality as an evolutionary process that would possess a self-consistency which, by Gödel's second theorem, would imply, through this self-reference, a contradiction and a consequent separation into levels of reality, undoing that same self-consistency in local collapses in which classical logic holds. As we said above, what underlies this process of self-consistency is an interpretation of the measurement problem in quantum physics. In what, then, does the measurement problem consist?

OSVALDO (1992, pp. 177-217), in his study of the measurement problem, characterized quantum mechanics (QM) as a structure as follows: a closed system is described by a state Ψ(r, t) that evolves over time in a deterministic way, according to the Schrödinger equation. As we mentioned, this state provides only the probability of obtaining the different results of a measurement. However, after the measurement, the system is in another state, which depends on the outcome; thus, in the course of a measurement the system evolves in an indeterministic way, described by Von Neumann's projection postulate, also called wave-packet collapse or state reduction. The measurement problem therefore arises from this opposition between deterministic and indeterministic evolution associated with the projection postulate. And we may add two more hypotheses:

α) the measuring device itself belongs to the quantum domain (it can be assigned a quantum state);

Page 12: 29364360 the-logic-of-transdisciplinarity-2

β) the composite system object/device may be closed with respect to the environment, so its evolution should be deterministic; yet state reductions will continue to occur during measurements.

It is hardly necessary to say that this paradox is linked to the double-slit experiment and to its wave/particle paradox. To summarize, the question is to know how, during measurement, a quantum superposition may turn into states that do not superpose. Von Neumann's answer is the projection postulate accompanying any act of measurement, formally:

ζ): φ(τ′) ≈ Ω̂ φ(τ), where Ω̂ is the unitary operator and the state vector is expressed by the superposition φ ≈ α1 φ1 + α2 φ2, where φ1 and φ2 are eigenstates of an observable, which would reduce to η): φ1 ∨ φ2, depending on the observation/measurement result, the probability of each outcome being θ): |α1|² ∧ |α2|². The unitary operator composed with its adjoint (inverse) is equal to the identity, Ω̂† Ω̂ = 1.
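
The opposition between deterministic unitary evolution and the indeterministic projection postulate can be mimicked for a two-level system; a minimal numerical sketch, with a toy unitary Ω̂ and the Born rule, not tied to any specific model discussed here:

```python
import numpy as np

rng = np.random.default_rng(0)

phi = np.array([1.0, 0.0], dtype=complex)        # initial state in the {phi_1, phi_2} basis
theta = np.pi / 3
Omega = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])   # a unitary operator

print(np.allclose(Omega.conj().T @ Omega, np.eye(2)))  # True: Omega^dagger Omega = 1

phi = Omega @ phi                                 # deterministic evolution: phi(tau') = Omega phi(tau)
probs = np.abs(phi)**2                            # Born probabilities |alpha_1|^2, |alpha_2|^2
outcome = rng.choice([0, 1], p=probs)             # indeterministic measurement result
phi = np.eye(2)[outcome]                          # projection postulate: state reduces to phi_1 or phi_2
print(outcome, phi)
```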

From that solution, two problems appear. One is the problem of CHARACTERIZATION: what are the conditions for applying the projection postulate and, it being an inherent condition of measurement, what characterizes a measurement and/or observation? The other problem is COMPLETENESS: could the projection postulate be derived from other principles of quantum physics together with a physical model suitable for the measurement process? Without going into the details of the various currents that proposed to solve this problem, we note that a new thermodynamic axiom, linking a large number of particles and incoherent states, was introduced into the problem of completeness. SUCH AN ADJUNCTION, CALLED THERMODYNAMIC AMPLIFICATION, COULD CHARACTERIZE MORE SPECIFICALLY THE OBSERVATION OF ATOMIC PHENOMENA BASED ON RECORDS OBTAINED THROUGH AMPLIFICATION DEVICES PERFORMING IRREVERSIBLE WORK. This program would later remain unresolved, so that the completeness problem is unsolvable and thermodynamic models do not provide an exact solution to the measurement problem. If the completeness problem were solvable, it would provide an example of a measurement satisfying certain conditions:

- a) a quantum state can be attributed to the macroscopic measuring equipment (a device state in a Hilbert space of finite dimension);

- b) the composite quantum system (device/object) can be considered closed with respect to the environment and the rest of the universe (the evolution of the composite system is unitary);

- c) the different measurement results correspond to distinct final states of the device (pointer states);

- d) a more precise specification of this pointer supposition constitutes a condition of solvability;


- e) it should be precisely defined what is meant by measuring (a measurement is a subclass of the interactions governing the evolution of a composite system, device/object, that should be able to distinguish between classes of object states for which the mean values of a self-adjoint operator are different. In addition to this definition, there are the classes of repeatable and predictable measurements. A repeatable measurement is one such that, if it is made and shortly afterwards repeated, the result is certainly the same, or, the final state of the object after the measurement is the eigenstate corresponding to the eigenvalue obtained as a result. The predictable ones are those for which, for each possible outcome, there is a state of the object that leads to a predictable result.) A measurement satisfying these conditions would give a solution to the completeness problem. Meanwhile, as in (OSVALDO 1992, pp. 177-217), the evidence is that such a measurement cannot be defined, and so the completeness problem remains unsolved.

Let:

Σ – Unitary condition

Λ – Measurement

Ξ – Solvability

Φ – The a, b and c hypotheses

The proof of unsolvability in general can be represented as:

Φ, Σ, Ξ ⊬ Λ

Accepting Φ and Σ, there does not exist a measurement that meets the condition of solvability: Φ, Σ, Λ ⊬ Ξ.

The measurement problem has received several proposed solutions, one of them being the theory of open systems, in which the compound system that includes object/device cannot be completely isolated from the environment. In this conception, there is an approach in which the reduction process would be an objective physical phenomenon transcending the act of measurement, turning into the problem of collapse or decoherence, or the problem of the actualization of potentialities, ontologies of powers that could demand a supplementation. It is in this perspective that Nicolescu's Gödelian self-consistent universe is settled. Thus, the question for us is: is the inconsistency associated with the self-referentiality of the evolution of the ensemble of levels of reality the same inconsistency as that of the correlation between locality and non-locality? If the answer is yes, we have, from Gödel's second theorem, that the ensemble of levels of reality is inconsistent. If the answer is no, we have the following hypotheses:


- Either the local/non-local correlation would be one of extension, and therefore ψ: ∀x (x ∈ σ ↔ x ∈ τ) → σ ≈ τ would hold, or a conservative interpretation of the logical structure would be extended to other structures or toposes, violating the NL principle, and the ensemble of levels of reality would be inconsistent;

- Or self-referentiality does not infringe the axiom of regularity, and the local/non-local correlation would be inconsistent.

So, on the positive answer, the system as a whole would be equal to the local/non-local correlation, which collapses from its self-reproduction into consistent and incomplete local toposes. Answering the questions above, we could say that transdisciplinarity is consistent and inconsistent, making valid the contradictory pairs that experience and scientific theory, after quantum physics, have seen appear: local and non-local causality, reversibility and irreversibility, separability and non-separability, wave and particle, continuity and discontinuity, symmetry and broken symmetry, and so on. But Nicolescu himself answered such questions, particularly the first, about the nature of the theory that could describe the transition from one level to another, saying that no one has managed to find a mathematical formalism that allows the passage from one strict world to another. But we must differentiate logical-epistemic formalization from the formal mathematical structuring of model theories. A logical-epistemic formalization has no commitment to the standard algebraic model, as the formal mathematical structure does. Logical-epistemic formalization works with advanced features of category theory, creating logical-topological spaces of possible worlds from the Tarskian conception of logic as consequence relations.

In this way, we believe that the logic underlying transdisciplinarity is a combination of classical and inconsistent logics, a kind of non-Hegelian temporal dialectic, because we are always in some reality, and the self-position of the subject, as Hegel believes, is derived. However, the criticism of the Hegelian succession of contradictions seems to lack sense, because, if there is a contradiction, at some lapse of time this contradiction is simultaneous. In any case, transdisciplinary self-reference is not closed in the immanence of the self-position of the idealistic subject, the self that posits the non-self and comprehends it. The complexity of transdisciplinary self-reference opens onto the multiplicity of historical-existential experience, as a subject belonging to the worlds in which it is included. Two other properties are at the core of transdisciplinarity:

# The opening of reflexivity, in infinite regress, becomes the indeterminacy of totality: the whole, the same, as such while incomplete (inexhaustible);

# The opening of dialectic to difference, as an outcome of the relative incommensurability of different languages and/or levels of reality and horizons of meaning.

But is Nicolescu's sense of self-reference the same as Morin's?

In a revealing dialogue between B. Nicolescu and E. Morin at the Stéphane Lupasco international conference, we grasp the difference in the sense of self-reference involved in the "logic" of the included third, where Nicolescu asks:


- B.N.: And so, you were quite attracted by Gödel's theorem. Because the essential point is there: this organizational structure of the logic of the included third is of a Gödelian type. It never closes.

- E.M.: Of course. But Gödel arrives at a principle of uncertainty. That is, he says the system cannot give an account of itself with its own resources. It is not known whether it is consistent, and it is not known whether it contains a contradiction. First point: uncertainty. Second point: it is possible, however, to conceive a richer meta-system, able to understand the system, and which, of course, itself opens onto the infinite. But then, at this moment, I do not see the expression of a specific logic of the included third.

Nicolescu then takes up aspects of Gödel's theorems that are almost always overlooked in analytical approaches, introducing the meta-theoretical affirmation, inside T, of the consistency of T and, therefore, its self-reference as a contradiction which, according to E. Morin, is a priori: "Gödel's theorem tells us that a sufficiently rich system of axioms inevitably leads to results that are either undecidable or contradictory. This last assertion is often forgotten in popularizations of this theorem."

So let us see: La is the language of arithmetic, T an axiomatizable recursive extension of PA (Peano arithmetic), PrT(x) a proof predicate for T, and Com(T) the sentence ¬PrT(0=1). We assume that T is consistent and that T ⊢ Com(T), and let σ be a sentence of La satisfying T ⊢ σ ↔ ¬PrT(σ). We say that a model Λ of T is positive if Λ ⊨ σ, and negative otherwise. If T is consistent, there is a model Λ1 of T. If Λ1 is positive, then Λ1 ⊨ Com(T + ¬σ), and there is a negative model Λ2 of T which is a definable extension of Λ1; if not, Λ1 = Λ2. From the hypothesis that Λ2 ⊨ Com(T), we have a model Λ3 of T which is a definable extension of Λ2. If Λ2 is negative, Λ2 ⊨ PrT(σ), so Λ3 ⊨ σ. Meanwhile, Λ2 ⊨ ¬σ and Λ3 is an extension of Λ2, so Λ3 ⊨ ¬σ, which is a contradiction whenever the theory T proves its own consistency. (This proof is due to Jech, 1994.)

Since self-consistency reintroduces the subject into the system formulated by it, Nicolescu completes his approach to the logic of the included third with Lupasco's ontological conception: the subject of knowledge is implied in the logic it formulates. Experience is the subject's own experience. The circular character of the logical assertion, as its own logical experience, arises from the circular character of the subject: to define the subject we must take into account all the phenomena, elements, events, states and propositions that relate to our world and, furthermore, affectivity. An obviously impossible task: in Lupasco's ontology, the subject can never be defined. This Gödelian structure always opens, beyond the levels of reality, a ZONE OF NON-RESISTANCE, which we prefer to call a SEMIOTICALLY NON-FORMED FIELD, a field not finitely algebraizable by the AAL method (abstract algebraic logic) (BLOK and PIGOZZI, 1989), which we will describe shortly.

Here a problem arises about the measurement of complexity from the point of view of algorithmic complexity. Does the Kolmogorov measure of complexity apply to this semiotically non-formed field? Is it possible to consider transdisciplinary non-separability as a totality whose complement is not computable? Intuitively speaking, the Kolmogorov complexity of a number n, denoted K(n), is the size of the shortest program that generates n; and n is called random if n ≤ K(n). Kolmogorov proved that the set of non-random numbers is recursively enumerable but not recursive. Odifreddi (1989) considered this a version of Gödel's first incompleteness theorem. According to Kikuchi (1997), Kolmogorov complexity also leads to Gödel's second incompleteness theorem, extracted from Berry's paradox ("the least number not nameable in fewer than ten words"), since Gödel proved that the first theorem leads to the second. Then, to apply Kolmogorov complexity to the ensemble of levels of reality and to the semiotically non-formed field, we would have to consider the local/non-local inconsistency as null, or the self-reference of the ensemble of levels of reality as a recursive extension, which would give the ensemble of levels of reality a local character, since Kolmogorov complexity, like Gödel's incompleteness results, is valid for the local topos.

The question turns on knowing what type of contextuality is involved in the zone of non-resistance, or the transdisciplinary semiotically non-formed field: quantum or classical? In classical contextuality, according to Aerts (2002), the result is affected by various aspects of the environment, but not by an irreducible and specific unpredictability arising from the interaction between the system and the experimental disturbance. In classical contextuality, Kolmogorov's axioms are met, and a classical model of probability can be used. However, when contextuality is intrinsic, the system and the disturbance both stand in an internal relation of constitution, such that their interface creates a concrescence of emergence. The presence of intrinsic contextuality means that Kolmogorov's axioms are not met. Entanglement can be tested by determining whether correlation experiments on entities that have been together violate Bell's inequality (BELL, 1964). Pitowsky (1989) proved that if Bell's inequalities are satisfied by the range of probabilities concerning the outcomes of the considered experiment, then there is a classical Kolmogorovian probability model that describes those probabilities. In that case, the probability can be interpreted as a lack of knowledge about the precise state of the system. On the other hand, if Bell's inequality is violated, such a classical Kolmogorovian probability does not hold; rather, the violation of Bell's inequalities proves that the probabilities involved are not classical. In the case of transdisciplinary contextuality, we believe it to be classical and non-classical at the same time, which seems to confirm Lupasco's proposition cited by Nicolescu: quantum dialectic is an expansion of doubt; in this case it is not a matter of lack of knowledge, but of the generation of context as a potential, at the same time that some knowledge is actualized. So algorithmic complexity pertains to transdisciplinary complexity as a local topos. In the chapter "Transdisciplinarity and the open unity of the world", what Nicolescu calls the ensemble of levels of reality is, under the Gödelian conditions of its structure, a local topos in which classical Kolmogorovian probability holds.
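
Although K(n) itself is not computable, the intuition of "size of the shortest description that generates n" can be approximated from above by any concrete compressor; a minimal sketch using zlib, intended only as an illustration, not as a faithful measure:

```python
import random
import zlib

def description_length(s: bytes) -> int:
    """Crude upper bound on descriptive complexity: length of a compressed encoding of s."""
    return len(zlib.compress(s, 9))

regular = b"01" * 500                                                  # highly regular: admits a short description
noisy = bytes(random.Random(42).randrange(256) for _ in range(1000))   # pseudo-random bytes

print(description_length(regular))   # small, far below the raw length of 1000
print(description_length(noisy))     # close to 1000: the compressor finds no short description
```
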
But, on the other hand, in "The included third: from quantum physics to ontology", the problem, if it is not one of translation, is that the relationship between one level of reality and another does not preserve coherence, and then the application of Kolmogorovian complexity to a structurally incoherent ensemble of levels of reality does not hold. So we can say that:


Let Cn be the ensemble of levels of reality, and Cn ∪ Zr the union of the ensemble of levels of reality with the zone of non-resistance. Cn has a coherent structure and algorithmic complexity holds for it, ∧ Cn ∪ Zr is structurally incoherent and algorithmic complexity generally does not hold for it. The question that immediately arises is whether a coherent ensemble of levels of reality Cn is possible in general. In that case, the expression "ensemble of levels of reality" would, in general, mean a local extension. Then self-consistency refers to Cn ∪ Zr as the irreducibility of contradiction and the relativity of consistency. The union operator (∪) is here taken in Lupasco's sense of distinguishable non-separability, A ∧ ¬P (A is actualization, ¬P is non-potentialization), i.e., Cn ∪ Zr = A ∧ ¬P. This formulation presents us with a problem about the use of negation and implication in the logic of the included third. For example: is the included third a negation of the principle of the excluded middle (ETP) or not? Apparently, the negation of the excluded middle is not equivalent to intuitionistic negation, which is based on the open sets A(X) of a topological space X, in which the definition of implication (→) does not follow from (∧, ∨, ¬) as in classical logic, where A → B is ¬(A ∧ ¬B), and in which double negation, ¬¬A ⊢ A, does not hold.
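
The behaviour of this intuitionistic negation can be checked on a three-point topological space, taking ¬A as the interior of the complement of A; a minimal sketch (the space and the topology are our own toy choices) showing that A ∨ ¬A need not be the whole space and that ¬¬A need not equal A:

```python
X = frozenset({1, 2, 3})
opens = {frozenset(), frozenset({1}), frozenset({2}), frozenset({1, 2}), X}  # a topology on X

def interior(S):
    """Largest open set contained in S (unique, since opens is a topology)."""
    return max((U for U in opens if U <= S), key=len)

def neg(A):
    """Intuitionistic (Heyting) negation: interior of the complement."""
    return interior(X - A)

A = frozenset({1, 2})
print(A | neg(A) == X)      # False: excluded middle fails ({1,2} union empty set != X)
print(neg(neg(A)) == A)     # False: double negation fails (not-not-A = X != A)
```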

In transdisciplinary logic, the negation of the excluded middle, when it generates a contradiction, makes use of classical negation through De Morgan's laws, ¬(A ∨ ¬A) ≡ (¬A ∧ ¬¬A), which suggests that on A(X) there are operations induced from PX, the set of subsets of X, of finite intersection and union, which are closed and form a Boolean algebra. So there is no necessary link between incompleteness, which we will call ¬C, and A(X), the open sets of a topological space, that is, ¬(¬C → A(X)). Thus, we believe that the transdisciplinary local toposes form incomplete closed spaces in which classical logic holds, with an open complement in which a weaker negation holds, possibly of intuitionistic type, where (¬C → A(X)). These spaces, or logic fibring,

¬(¬C → A(X)) ⊕ (¬C → A(X)) ⊕ ϖ ⊕ (¬C → A(X)) ⊕ ¬(¬C → A(X)) = Ω,

where ϖ is the intrinsic contextualization, form a logic of the contradictory. So we have:

1) Ω ⊢ ¬(¬C → A(X)) ⊕ (¬C → A(X)) ⊕ ϖ

2) Ω ⊢ ϖ ⊕ (¬C → A(X)) ⊕ ¬(¬C → A(X))

From 1) and 2) one forms Ω1, and so on.

OBJECTIVE

This work aims to investigate the possibility of a logical construction of transdisciplinarity in the Hilbert style, LT1. In particular, we believe that a transdisciplinary logic is formed from a combination of classical-intuitionistic logic with non-classical inconsistent temporal logics. We also aim to give an algebraic formalization of the local/temporal-inconsistent interaction, whose underlying logic is intuitionistic and non-local, making use of W. Carnielli's possible-translations semantics and Tarski's concept of logic as a consequence relation.


METHODOLOGY

The methodology used to combine logics to form a transdisciplinary logic is categorical fibring. The process of composing logics can be synthetic (splicing logics), for example D. Gabbay's fibring, L = L1 ⊕ L2, where L is the unknown and L1 and L2 are known; or it can be analytic (splitting logics), for example W. Carnielli's possible-translations semantics, L = L1 ⊕ ... ⊕ Ln, in the case where L is known and is decomposed into simpler logics. The question that arises before combining logics is: what kind of structure do we want to capture with the concept of logical system: a proof system such as tableaux, an axiomatic system, natural deduction, or semantic methods such as logical matrices, valuations and Kripke semantics? As usual in this kind of methodology, A. Tarski's concept of logic, which is independent of the concept of valid formula, allows more flexibility in the combined treatment of logics. What is common to any system of logic is the concept of logical consequence, denoted by ⊢, defined over an ensemble of sentences or formulas of L.

Definition:

A Hilbert system is a pair H = ⟨C, R⟩ such that C is a signature and R is a set of pairs of the form ⟨Γ, Ψ⟩, such that Γ ∪ {Ψ} is a set of formulas of L(C). The elements ⟨Γ, Ψ⟩ of R are called rules; when Γ = ∅, the rule is called an axiom. We can fibre logics in Hilbert style to form the logic of transdisciplinarity in such a way that they become morphisms in the category of Hilbert systems, Hil. Therefore, the logic of transdisciplinarity forms a composition of classical and inconsistent logics in Hilbert style.
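
A schematic rendering of the definition just given and of the fibring used in the methodology: a Hilbert system as a pair ⟨C, R⟩ and the unconstrained fibring H1 ⊕ H2 built over the union of signatures and rules; the encoding and the sample rules are ours and purely illustrative:

```python
from dataclasses import dataclass
from typing import FrozenSet, Tuple

Rule = Tuple[FrozenSet[str], str]   # <Gamma, Psi>: premises and conclusion; empty Gamma => axiom

@dataclass(frozen=True)
class HilbertSystem:
    C: FrozenSet[str]               # signature (connectives)
    R: FrozenSet[Rule]              # rules over formulas of L(C)

def fibring(h1: HilbertSystem, h2: HilbertSystem) -> HilbertSystem:
    """Unconstrained fibring H1 (+) H2: combine signatures and rules (schematic)."""
    return HilbertSystem(C=h1.C | h2.C, R=h1.R | h2.R)

classical = HilbertSystem(
    C=frozenset({"->", "~"}),
    R=frozenset({(frozenset(), "A -> (B -> A)"),           # an axiom: empty premise set
                 (frozenset({"A", "A -> B"}), "B")}))       # modus ponens
paraconsistent = HilbertSystem(
    C=frozenset({"->", "~", "o"}),                          # 'o' marks consistency (da Costa style)
    R=frozenset({(frozenset({"o A", "A", "~A"}), "B")}))    # explosion only for consistent A

LT = fibring(classical, paraconsistent)
print(sorted(LT.C))                  # ['->', 'o', '~']
print(len(LT.R))                     # 3 combined rules
```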

JUSTIFICATION

There is no doubt that the twentieth century was the stage of great changes in all areas of knowledge, particularly in the scientific universe, resulting in remarkable victories in every field of technique and production. In spite of that, it became clear that the splitting and compartmentalization of knowledge, inherited from a tradition shaped by the thinking generated between the fifteenth and nineteenth centuries, was no longer enough to create the epistemological references required to address the features of knowledge itself. Many problems seemed to lie outside the entangled systems of theories, and a kind of blindness hovered over the attempts to understand many of nature's fundamental problems, as well as its more common ones. However, debates about topics like the structure of matter, objectivity, or the possible relationships between the observer and the levels of reality, brought to light seemingly insurmountable paradoxes. Therefore, new ideas emerged, such as the theory of relativity, quantum mechanics, and complexity theory, among others (Nicolescu, 2003).

In any case, the organization of the contents referring to complexity theory suggests a revaluation of the systems for selecting and determining conceptualization, as well as of the systems that configure logical operations. This affects even the structures designating the fundamental categories of intelligibility, as well as the mechanisms that control their application. It is accepted, therefore, that thinking about complex systems implies accepting the need to overcome major challenges, starting from the implications of multidimensional and hologrammatic structures, or even the interconnectivity and inseparability of the "one" and the "multiple" that pertain to it. Despite the expectation of opening up new possibilities of understanding and the sociocultural transformations this could bring about, it is known that it would be impossible to embrace the study of complex systems only from within the established contemporary disciplines.

During the First World Congress of Transdisciplinarity, held in the Convento de Arrábida, Portugal, from 2-7 November 1994, a Letter of Intent was prepared, outlining a set of fundamental principles, which stressed the need to consolidate transdisciplinary and transcultural thinking as the best way to approach the different aspects of complexity in distinct systems (Nicolescu, 2001). Transdisciplinarity would then represent a conception of research based on a new frame of comprehension, shared among several disciplines and accompanied by the mutual interpretation of the disciplinary epistemologies. Cooperation, in this case, would be directed toward problem solving, with transdisciplinarity emerging to build a new model that brings the realities of the object of study closer together (Hernández, .....). Turning to Article 14 of the Charter of Transdisciplinarity, whose terms refer to rigor in argumentation as the best barrier against possible conceptual deviations in articulating the data of a problem situation, we can justify the need for unfolding a logico-formal and axiomatic characterization of the theory of transdisciplinarity, mainly when we consider the need for another epistemic perspective suited to the changes in ways of thinking of the last 200 years. Further, considering that a logical system is always useful for choosing the prevailing operations, relevant and evident within its domain (exclusion-inclusion, disjunction-conjunction, implication-negation), this logical construction is intended to foster the inclusion of other features in the transdisciplinary way of thinking, such as "Opening", comprehending the acceptance of the unknown, the unexpected and the unpredictable; and "Tolerance", as the exercise of recognizing the right of ideas and truths opposed to our own. Finally, considering that logic keeps close connections with metaphysics, mathematics, philosophy and linguistics, we can appraise the impact of this project, especially regarding the reiteration of the disruption of almost every principle of classical logic, principles which sustain the ground theory in all the fields of study mentioned above.


In Brazil, many research groups are involved in the study of logic, notably Prof. Newton da Costa's group, in which I take part and which has specialized in non-classical logics. Nevertheless, with regard to the indication of a broad field of work, I intend, through this proposal, to seek deeper foundations for transdisciplinarity from a more authentic source, hoping that it will constitute, in the near future, an important reference for the development of theoretical and technological studies on semiotics. Therefore, we expect that the outcomes of this project will allow a contextualization in this promising scenario, and serve as support for approaches exploring new horizons, particularly those immersed in the invisible zone of the paradigms, so that we can overcome the determinism of the current explanatory models, which are associated with systems of convictions and beliefs in every scope. This should contribute to the collapse of cognitive and intellectual conformisms.

Bibliography

Alain de Benoist, L'œuvre de Stéphane Lupasco, Le Club Français de la Médaille, n° 53, Paris, 1976.

Aerts, D. and Aerts, S., 1994, Applications of quantum statistics in psychological studies of decision processes, Foundations of Science 1: 85-97.

Aerts, D., Aerts, S., Broekaert, J. and Gabora, L., 2000a, The violation of Bell inequalities in the macroworld. Foundations of Physics 30: 1387-1414.

Aerts, D., 1985, The physical origin of the EPR paradox and how to violate Bell inequalities by macroscopical systems. In P. Lathi and P. Mittelstaedt (eds) On the Foundations of Modern Physics (World Scientific: Singapore), pp. 305-320.


Aerts, D., Broekaert, J. and Gabora, L., 1999, Nonclassical contextuality in cognition: borrowing from quantum mechanical approaches to indeterminism and observer dependence. In R. Campbell (ed.) Dialogues Proceedings of Mind IV Conference, Dublin, Ireland.

Aerts, D., Broekaert, J. and Gabora, L., 2000b, Intrinsic contextuality as the crux of consciousness. In K. Yasue (ed.) Fundamental Approaches to Consciousness. (Amsterdam: John Benjamins Publishing Company), pp.173-181.

Atsushi Takahashi, Du logique et de l'être chez Stéphane Lupasco, The Review of Liberal Arts, n° 83, Otaru University of Commerce, Otaru, Hokkaido, 1992.


Bernard Dugué, Utilisation de la logique dynamique du contradictoire pour formaliser les systèmes : vers un paradigme ondulatoire en biologie ?, Revue Internationale de Systémique, vol. 5, n° 4, Paris, 1991.

Basarab Nicolescu, Lupasco et la génèse de la Réalité, 3e Millénaire, n° 3, Paris, 1982.

Trialectique et structure absolue, 3e Millénaire, n° 12, Paris, 1984

Nous, la particule et le monde, ch. "La génèse trialectique de la Réalité", Le Mail, Paris, 1985

Regrettables oublis, 3e Millénaire, n° 18, janvier-février 1985.

Préface à L'expérience microphysique et la pensée humaine, op. cit., 1989.

La transdisciplinarité , manifeste, Le Rocher, 1996

Levels of Complexity and Levels of Reality, in "The Emergence of Complexity in Mathematics, Physics, Chemistry, and Biology" , Proceedings of the Plenary Session of the Pontifical Academy of Sciences, 27-31 October 1992, Casina Pio IV, Vatican, Ed.Pontificia Academia Scientiarum, Vatican City, 1996 (distributed by Princeton University Press), edited by Bernard Pullman.

Bernard Morel, Dialectiques du mystère, La Colombe, Paris, 1962

Constantin Noïca, Préface à Logica dinamica a contradictoriului, op. cit., 1982.

Ed. Morot-Sir, S. Lupasco, Logique et contradiction , Revue des Sciences Humaines, n° 49, Faculté de Lettres de Lille, janvier-mars 1948.

Françoise Garoche, L'analyse paradoxale : Trialectique et système - Méthodologie de formation et d'évaluation, Revue Française de Pédagogie, n° 75, Paris, avril-mai-juin 1986.

Jean-Jacques Wunenburger, La raison contradictoire - Sciences et philosophie modernes : la pensée du complexe, Albin Michel, Paris, 1990.

HERNÁNDEZ, Fernando. Transgressão e mudança na educação: os projetos de trabalho; trad. Jussara Haubert Rodrigues. Porto Alegre: Artes Médicas, 1998.

Marc Beigbeder, Contradiction et nouvel entendement, Bordas, Paris, 1972 (thèse de doctorat).

Pierre Solié, Ouverture sur l'unité du monde, Cahiers de Psychologie Jungienne, n° 28 - "Synchronicité - Correspondance du psychique et du physique", Paris, 1er trimestre 1981.


Robert Amadou, "Le principe d'antagonisme et la logique de l'énergie" de Stéphane Lupasco, La Gazette des Lettres, 15 août 1951.

George Melhuish, The Paradoxical Universe, Rankin Bros Ltd. Bristol, 1959.

Gérard Moury, Stéphane Lupasco - Pour une nouvelle logique : la logique dynamique du contradictoire, Institut National de Recherche et de Documentation Pédagogiques, Paris, 1976.

Yves Barel, Le paradoxe et le système, Presses Universitaires de Grenoble, Grenoble, 1979.

W. A. Carnielli, J. Marcos e S. de Amo. Formal inconsistency and evolutionary databases. Logic and Logical Philosophy, 8:115–152, 2000.

M. E. Coniglio. Combinações de sistemas de consequência. Atas do XI Encontro Nacional de Filosofia (ANPOF). To appear, 2004.

W. A. Carnielli and C. Sernadas. Preservation of Interpolation by Fibring. In W. A. Carnielli, F. Miguel Dionísio and Paulo Mateus, editors, Proceedings of CombLog'04 - Workshop on Combination of Logics: Theory and Applications, pp. 151-158, Lisboa, 2004. Center for Logic and Computation - IST.

Carnielli, W. A. Possible-translations semantics for paraconsistent logics. Proceedings of the 1st World Congress on Paraconsistency (D. Batens, C. Mortensen, G. Priest, J. P. Van Bendegem, eds.). Research Studies Press, Baldock, UK: 149-163. (2000).

Goldblatt, R. Topoi: The Categorial Analysis of Logic. North-Holland, Amsterdam.

N. C. A. da Costa. Sistemas Formais Inconsistentes. Tese de doutorado, Universidade Federal do Paraná, Curitiba, PR, Brasil, 1963. Editado pela Editora UFPR, Curitiba, 1993.

C. Hifume. Uma Teoria da Verdade Pragmática: A Quase–Verdade de Newton da Costa. Dissertação de mestrado, Universidade Estadual de Campinas, Campinas, SP, 2003

R. A. Lewin, I. F. Mikenberg e M. G. Schwarze. Algebraization of paraconsistent logic P1. The Journal of Non-Classical Logic, 7(1/2):79–88, 1990.

J. Marcos. Semânticas de Traduções Possíveis. Dissertação de mestrado, IFCH-UNICAMP, Campinas, Brasil, 1999. URL = http://www.cle.unicamp.br/pub/thesis/J.Marcos/.

[Blok – Pigozzi] Blok, W., Pigozzi, D. Abstract algebraic logic and the deduction theorem. The Bulletin of Symbolic Logic. (To appear).


[Blok – Pigozzi, 1986] Blok, W., Pigozzi, D. Protoalgebraic logics. Studia Logica, 45:337– 369. (1986).

[Blok – Pigozzi, 1989] Blok, W., Pigozzi, D. Algebraizable Logics, volume 77 (396) of Memoirs of the American Mathematical Society. AMS, Providence, Rhode Island. (1989).

Bueno, J., Coniglio, M. E., Carnielli, W. A. Finite algebraizability via possible-translations semantics. Proceedings of CombLog'04 - Workshop on Combination of Logics: Theory and Applications (W. A. Carnielli, F. M. Dionísio, P. Mateus, eds.): 79-86. (2004).

“Range Theorems for Quantum Probability and Entanglement”, in Khrennikov, A. (editor) Quantum Theory: Reconsideration of Foundations, 299-308 Vaxjo, Vaxjo University press (2002).

(With Oron Shagrir) “The Church-Turing Thesis and Hyper Computation”, Minds and Machines 13, 87-101 (2003).

(With M. Hemmo) "Quantum Probability and Many Worlds" (Forthcoming in a special issue of Studies in the History and Philosophy of Modern Physics, edited by R. Frigg and S. Hartmann).

Le principe d'antagonisme et la logique de l'énergie - Prolégomènes à une science de la contradiction, Coll. "Actualités scientifiques et industrielles" , n° 1133, Paris, 1951 ; 2ème édition : Le Rocher, Coll. "L'esprit et la matière" , Paris, 1987, préface de Basarab Nicolescu.

