Journal: Wisdom (CC BY)

DOI: 10.24234/wisdom.v21i1.593

Andrey GALUKHIN, Elena MALAKHOVA, Irina PONIZOVKINA

METHODOLOGICAL PARADIGM OF NON-CLASSICAL SCIENCE

Abstract

Scientific theories and methods developed within the framework of quantum and relativistic physics are the most representative paradigmatic instantiations of non-classical science. The profile of non-classical science is exposed through the analysis of a set of epistemic ideals and methodological principles. The adoption of the principle of operational relativity of phenomenal descriptions showed that a reference to the means of observation had become an intrinsic part of scientific description strategies. The transformation of the concept of objectivity can be seen in a specific combination of operationalism with interactional phenomenalism and constructivism. The introduction of the principle of complementarity marked the deviation from the standards of a monologic and linear description of the objects under study. This principle provides the operational basis for the integration of different parts of our knowledge with regard to non-trivial cognitive situations featured by the indeterminacy relations. Another prominent methodological trend is the reconsideration of the value of strict deterministic explanation strategies in favour of probabilistically oriented approaches. Scientists have encountered a new class of regularities that are typically analysed in terms of various types of statistical and non-causal determination. Nevertheless, it would be wrong to assume that any probabilistic account of natural phenomena implies indeterminism.

Keywords: non-classical rationality, operational relativity, complementarity principle, symmetry, indeterminism.

Introduction

"Non-classical science" is an umbrella term used to denote various manifestations of a new type of scientific rationality, which dates back to the era of paradigmatic transformations associated with the revolutionary changes in the natural sciences during the first half of the 20th century. Due to progress in the prominent areas of scientific research, specifically in thermodynamics, electrodynamics, atomic physics, cosmology and microbiology, cardinal paradigmatic shifts in conceptual frameworks and methodological approaches occurred. The development of scientific research programs in those areas was accompanied by a growing recognition of the fundamental value of relativistic, quantum, systemic, probabilistic and synergetic ideas. This led to a crucial transformation of the methodological principles of scientific knowledge.

In this paper, the profile of non-classical science is articulated through the analysis of a set of epistemic ideals, norms and principles that form a unique methodological paradigm, representing a new type of scientific rationality. We take scientific theories and methods developed within the framework of quantum and relativistic physics as the most representative paradigmatic instantiations of non-classical science. The methodological paradigm of quantum and relativistic physics incorporated a set of principles that determined new ways of describing, explaining and predicting natural phenomena in a set of recently discovered physical domains. Among the principles that represent the most typical characteristics of the method of non-classical science are the following: the principle of operational relativity, which requires that scientific descriptions be made with reference to the features of the experimental arrangement and implies the necessity of explicating the operational foundations of a physical theory; the complementarity principle, according to which, in the case of contradictory phenomena, specifically when physical quantities cannot both have a well-defined value, the exposition of an object under study should be given through a combination of its complementary descriptions; the principle of symmetry, which has gained a more comprehensive application due to the increasing value of mathematical constructivism and the growing acceptance of the systemic approach; and a set of other principles that provided the basis for the development of new models of explanation deviating from classical deterministic schemes. In this paper, we analyse some genuinely innovative schemes of scientific description and explanation which were introduced into science with the establishment of a new methodological paradigm incorporating these principles. The primary goal is to reveal the epistemological implications of the most significant methodological norms that formed a regulative basis for the development of heuristic research programs in non-classical science, specifically in physics, and to show that they instantiate a rather specific type of scientific rationality, different from the classical one.

Operational Relativity and a New Standard of Knowledge Objectivity

It is worth considering the paradigm of cognitive attitudes to the world that was instantiated in the ideals and norms of Modern-age science. The classical ideal of the objectivity of scientific knowledge was specified by a set of principles implying the recognition of the universal value of the view which could be categorised as 'naturalistic object-centrism': "due to the universality of the discovered law, due to the integrity of the comprehended world, and due to the unity of the methods of research, science got rid of the presence of both an observer and observation instruments, and this was seen as a guarantee of its objectivity" (Romanovskaya, 1995, p. 105). The classical understanding of the principle of objectivity of scientific knowledge was based on a philosophically grounded combination of an abstract belief in the unrestricted cognitive powers of human reason with an 'object-centric view' of the structure of cognitive activity. This view suggested that there must be a reasonably strict distinction between an agent performing cognitive acts and an object which is accessible in its naturalistic immediacy and can be transformed into the target object of the agent's cognitive intentions. The core of this classical epistemological paradigm incorporated the essential assumption that any account of the objectivity of scientific knowledge should be abstracted from the identification of the position of the agent, which meant that any scientific description should be given without explicit reference to the instrumental, procedural and situational aspects of the agent's activity.

A crucial transformation of the concept of the objectivity of scientific knowledge marked the establishment of a non-classical kind of scientific rationality (Horgan & Tienson, 1994). The principles of non-classical scientific rationality can be revealed by generalising the features of the special research methods developed for the study of a new type of objects, specifically those identified and localised in the area of quantum-mechanical phenomena. In this area, objectivity (in the sense of a natural system's observable behaviour objectified 'before and independently' of that system's coming into interaction with the tools constituting the experimental arrangement) "turned out to be a rather rough approximation and had to give way to more abstract ideas" (Markova, 1998, p. 78). The emergence of non-classical scientific rationality was marked by the introduction and consolidation of epistemic normative frameworks and methodological approaches associated with the embracement of some "new cognitive ideals, according to which a clear fixation of the means and operations of cognitive, specifically, experimental activity, is not an obstacle to objective description and explanation of natural processes (as it was supposed in classical natural science), but a necessary condition for the adequacy and completeness of scientific descriptions and explanations" (Stepin, 1995, p. 65).

Thus, what is common to the methodological standards applied in non-classical, quantum and relativistic physics is a set of norms requiring the explication of the operational basis of theories; that is, an explicit reference to the means of observation and to the methods of measurement has become an intrinsic part of scientific description strategies. By these standards, neither the semantics of observational terms nor the ontological value of the relevant theoretical constructs could be determined without taking into account the specificity of measurement procedures and of the instrumental constituents of real and virtual experimental situations.

Consider, for instance, the difference between the phenomenology of measurements in quantum mechanics and the way of idealising measurements in classical mechanics: "In classical mechanics, measurements are idealised as testing whether a system lies in a certain subset of its phase space. This can be done in principle without disturbing the system, and the test result is, in principle, fully determined by the state of the system. In quantum mechanics, none of these idealisations can be made. Instead: (i) measurements are idealised as testing whether the system lies in a certain (norm-closed) subspace of its Hilbert space; (ii) a measurement, in general, disturbs a system: more precisely (and in the ideal case) unless the state of the system is either contained in or orthogonal to the tested subspace, the state is projected onto either the tested subspace or its orthogonal complement (this is known as the "collapse" of the quantum state, or the "projection postulate"); (iii) this process is indeterministic, with a probability given by the squared norm of the projection of the state on the given subspace (the "Born rule" or "statistical algorithm" of quantum mechanics)" (Bacciagaluppi, 2013, pp. 304-305).
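The three idealisations quoted above can be rendered as a small computational sketch. The following Python fragment is purely illustrative (the state, the subspace and all function names are our own assumptions, not drawn from the cited source): a measurement tests membership in a subspace, the Born rule gives the outcome probability as the squared norm of the projection, and the projection postulate yields the collapsed state.

```python
from math import sqrt

def inner(u, v):
    """Hermitian inner product <u|v> on lists of complex amplitudes."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def project(state, basis):
    """Project `state` onto the subspace spanned by an orthonormal `basis`."""
    proj = [0j] * len(state)
    for b in basis:
        c = inner(b, state)
        proj = [p + c * bi for p, bi in zip(proj, b)]
    return proj

def measure(state, basis):
    """Idealised yes/no measurement: Born-rule probability and collapsed state."""
    proj = project(state, basis)
    prob = inner(proj, proj).real          # squared norm of the projection
    if prob > 0:
        norm = sqrt(prob)
        collapsed = [p / norm for p in proj]   # the "projection postulate"
    else:
        collapsed = None
    return prob, collapsed

# A normalised state in C^3 and a two-dimensional tested subspace.
psi = [sqrt(0.5), sqrt(0.3) * 1j, sqrt(0.2)]
subspace = [[1, 0, 0], [0, 1, 0]]

p_yes, psi_after = measure(psi, subspace)
print(round(p_yes, 12))  # 0.8, i.e. |<e0|psi>|^2 + |<e1|psi>|^2
```

The collapsed state `psi_after` is renormalised and has no component outside the tested subspace, mirroring point (ii) of the quotation.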

The principle of building the picture of reality with reference to the means of observation, the operational constituents and the procedural factors of research activity was integrated into the regulatory basis of non-classical physical theories (Djidjian, 2016). It can be termed 'the principle of operational relativity'. The methodological value and epistemological implications of this principle can be revealed with regard to the non-trivial cognitive situations typically characteristic of quantum-relativistic physics. In the physics of atomic micro-processes, for example, it would be impossible to give a proper interpretation of atomic events without taking into account the interaction of an object (or an atomic system) with a measuring device. As Niels Bohr (1963b) noted, "while, within the scope of classical physics, the interaction between object and apparatus can be neglected or, if necessary, compensated for, in quantum physics this interaction thus forms an inseparable part of the phenomenon. Accordingly, the unambiguous account of proper quantum phenomena must, in principle, include a description of all relevant features of the experimental arrangement" (p. 4).

Fixing the features of measurement devices that interact with atomic systems and making operationally valid complementary descriptions pertaining to the phenomenal states of the objects under investigation in quantum physics are the most typical instances of implementing the principle of operational relativity.

And what about the ontological implications of this principle? It is through the interaction between an object and the relevant experimental device that the cognisable reality is revealed, that is, the object is exposed on the "cut" of its actual states and, therefore, the horizon of ontologically meaningful propositions becomes articulated: "the transition from the 'possible' to the 'actual' takes place as soon as the interaction of the object with the measuring device, and thereby with the rest of the world, has come into play" (Heisenberg, 1990, p. 21). Adopting the idea of the complex interactive-phenomenal nature of reality led to the further transformation of the ontological schemes and assumptions underlying classical scientific research programs.

A methodological extension of the principle of operational relativity can be seen in a set of standards (Bryanik, 2019) which require determining the operational value of the concepts used when elaborating theoretical models based on accepted idealisations (consider, for instance, Bridgman's operationalism and Smirnov's method of intensional and constructive interpretation of empirical concepts). Scientists who shared the view that basic concepts should be formulated in the language of accessible experience encountered the problem of finding operational criteria for establishing the commensurability of the macroscale of the experimental equipment and the microscale of the cognisable objects. Within the framework of relativistic physics, the application of fundamental physical concepts to the description of a physical system is restricted by the principle that requires taking into account the relation of that system to the state of movement of the observer; in terms of epistemological implications, "we are talking about the relativity of these concepts and quantities, meaning their relation to the means of their measurement... A similar situation, in general, was realised in quantum physics, which proclaimed the relation of the manifestations of wave and corpuscular properties of micro-objects to the means of their observation" (Zhdanov, 1995, p. 77).

So, in non-classical physics, the reformation of the concept of objectivity transforms the structure of cognitive intentions. The classical linear way of describing natural phenomena presupposed the homogeneity of experience; it was established under deterministic assumptions pertaining to classical Newtonian physics. In non-classical quantum physics, "one must include the measuring device as an active participator in the measurement, not just a recorder of a fixed value" (Whitaker, 1996, p. 217). The constitution of atomic systems and their reactions to external influence are "fundamentally determined by the quantum of action" (Bohr, 1963a, p. 11).

The phenomenology of quantum interference imposes further limitations on the schemes of scientific description: that is because "we may obtain different recordings corresponding to various individual quantum processes for the occurrence of which only statistical account can be given" (Bohr, 1963a, p. 12). The structure of scientific descriptions turns out to be complicated by the fact that any description would be incomplete without reference to the alternative phenomenal dimensions in which the quantum system is exposed. Thus, the alteration of the concept of objectivity in quantum physics is correlated with an essential deviation from the principle of the linearity of description: the same quantum system can be identified as being in a superposition with regard to its possible states, and under certain experimental conditions this 'same' system exhibits different and even incompatible properties (consider, for example, the cases of wave-corpuscle dualism). This phenomenal heterogeneity presupposes contrasting pieces of evidence and stresses the need for alternative and complementary descriptions. The cognitive strategy which allows for the irreducible complexity of a non-linear description of the objects of study indicates that there are essential possibilities of alternative and variable cognitive perspectives on the world.

Complementarity and Non-Linearity of Scientific Descriptions

Dispositions to the non-classical rationalisation of scientific knowledge manifested themselves in those epistemological shifts that involved a reconsideration of the classical standards of the linear description of natural phenomena and the subsequent acknowledgement of the value of the complementarity of different descriptions having divergent evidential bases.

The introduction of the principle of complementarity into the regulative foundations of science resulted from methodological analysis of those cognitive situations that appeared to be paradoxical from the classical point of view. As we have noted, the methodological paradigm of quantum physics was integrated on the basis of the general rule that objective description of atomic systems cannot be obtained without making explicit reference to the features of the experimental arrangement. However, in different experimental situations, atomic objects tend to manifest incompatible properties (e.g., the properties of a wave displayed by the atomic system under certain experimental conditions are incompatible with the properties of a particle displayed by the same object in a different experimental situation), and such an extraordinary mode of behaviour imposes restrictions on the consistency of the overall phenomenal description in a classical sense.

The search for an adequate mathematical expression for the incompatibility of a quantum object's wave and corpuscular properties was an essential part of the preliminary development of the apparatus of non-classical physical theory. The formulation of the uncertainty relations, that is, the indeterminacy relations, by W. Heisenberg was one of the preconditions for the effective completion of this search. The indeterminacy-relations principle was introduced as a heuristic device for giving formal expression to the relations between operators in quantum mechanics that do not commute: the problem is that "we cannot identify a function that would be an eigenfunction of both coordinate and momentum. As a consequence of the definition of the coordinate and momentum operators in quantum mechanics, there can be no state in which the physical quantities, coordinate q and momentum p, both have a well-defined value" (Prigogine & Stengers, 1984, p. 223).
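The non-commutativity underlying the indeterminacy relations can be illustrated with a toy computation. Since the coordinate and momentum operators admit no finite-dimensional matrix representation, the sketch below uses two spin components (Pauli matrices) as an illustrative stand-in of our own choosing: they likewise fail to commute and therefore share no complete set of common eigenfunctions.

```python
# Toy illustration of non-commuting observables, using the Pauli matrices
# sigma_x and sigma_z (units with hbar = 1) as 2x2 nested lists.

def matmul(a, b):
    """Multiply two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matsub(a, b):
    """Entry-wise difference of two matrices."""
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

sigma_x = [[0, 1], [1, 0]]
sigma_z = [[1, 0], [0, -1]]

# The commutator [sigma_x, sigma_z] = sigma_x sigma_z - sigma_z sigma_x.
commutator = matsub(matmul(sigma_x, sigma_z), matmul(sigma_z, sigma_x))
print(commutator)  # [[0, -2], [2, 0]] : non-zero, so no common eigenbasis
```

A non-zero commutator is precisely the formal circumstance the quotation describes: no state can be a simultaneous eigenfunction of both observables.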

In fact, the idea of indeterminacy relations was derived from purely formal mathematical developments of the apparatus of quantum mechanics. This idea was subjected to further explication and interpretation when indeterminacy relations were mapped onto the picture of the investigated reality.

On the one hand, the indeterminacy relations could be taken as being indicative of the phenomenal features of the objects under investigation. On the other, the whole situation in which indeterminacy relations are displayed could be interpreted in epistemic terms, that is, as a specific case of knowledge indetermination: "The knowledge of the position of a particle is complementary to the knowledge of its velocity or momentum. If we know the one with high accuracy, we cannot know the other with high accuracy; still, we must know both for determining the behaviour of the system" (Heisenberg, 1990, p. 17).

N. Bohr claims that in quantum mechanics, scientists encounter "a novel type of relationship, which has no analogue in classical physics and which may conveniently be termed "complementarity" in order to stress that in the contrasting phenomena we have to do with equally essential aspects of all well-defined knowledge about the objects" (Bohr, 1948, p. 314). In order to provide an adequate and complete description of such objects, one should rely on the evidence obtained by different, even mutually exclusive experimental arrangements and use "complementary" classes of concepts, interpretative frameworks and principles of representation. In other words, in order to develop an efficient methodological framework for scientific descriptions, one should take into account the features of experimental arrangement and include the reference to a specific operational structure of observation into the descriptive content, articulated in the interpretation of the variables of the quantal formalism.

Complementarity is the intrinsic feature of the method of building a paradoxically integral picture of the object under investigation. The complementarity of descriptions is relevant to the situation in which the absolute localisation of an object seems to be problematic from the classical point of view.

The complementarity principle can be considered a methodological principle with a relatively broad spectrum of applications. Thus, researchers quite often encounter situations in which determining the spectrum of possible states of a complexly organised system requires taking into account the complementarity of its structural, functional and genetic characteristics, which, in turn, implies that various complementary methods of its description should be used. Furthermore, the probability function, insofar as it is likely to include some additional variables, also tends to become more complex. The establishment of the principle of complementarity as one of the core principles of the non-classical methodology of science imposes significant limitations on classical reductionist-fundamentalist intentions: "The bottom line is that multilayered, polyfundamental variable systems cannot be conceptualised from any privileged positions. Complementarity from this point of view is a consequence of polymorphism, heterogeneity of the accepted ontology with its attributive potential" (Il'in, 1994, p. 76).

Among the possible specifications of the principle under consideration, there is one which assumes the complementarity of describing objects in parts and as a whole: "Such complementarity of a partial and holistic description pertains not only to accounting for the behaviour of micro-objects under the conditions of observation when a holistic macroscopic device is used, it is also relevant to approaching the more general problem of the relationship between reductionism and the systemic approach, as well as to recognising the advantages of the complementarity of the figurative form of intuitive ideas and the subsequent quantitatively developed theoretical model" (Zhdanov, 1995, p. 78).

This principle itself has become an object of epistemological analysis, which aims to determine the philosophical significance of the foundations of efficient methods of heuristically oriented scientific research programs.

In its philosophical value, the principle of complementarity can be interpreted as the principle of the polyvariant organisation of a system of cognitive procedures, characterised by an intrinsic capacity for producing alternative sets of representations of the same object. The implication is that cognitive strategies should be developed in such a way as to enable scientists to take into account the relation of an object of inquiry to a set of complementary expanded conceptual schemes, experimental arrangements and methods of description. This perspective has very little in common with the object-centred essentialism and linearism of classical science. Philosophical analysis and epistemological interpretation of the principle of complementarity reveal the preconditions for the transformation of knowledge about the features of the overall situation of observation into evidence essential to obtaining knowledge about the object under investigation.

The establishment of a new standard for the objective scientific description and the subsequent introduction of the principle of complementarity of descriptions marked the crucial transformation of epistemological foundations of science.

Symmetry and Systemic Complexity

The principle of symmetry was integrated into the methodological framework of scientific research due to the need to develop a model for describing natural processes of varying complexity which would be in accord with the general strategy of identifying invariants in a set of objective systems' transformations. The principle of symmetry also matters for the specification of a more fundamental principle of the causal unity of the physical world.

The concept of symmetry comprises a set of postulates of physical theory that are explicated through a set of relevant categories, among which the most significant are the categories of identity, conservation and invariance. These categories are used to identify and qualify the symmetric form which is manifested both in the structure of objects and in the transformative processes taking place in the physical world. The idea of symmetry is a genuinely valuable device for accounting for that which remains preserved throughout the changes in physical systems qualified by a group of transformations.

The considerations of symmetry come into play when scientists try to identify invariants in a set of transformations or to detect the transformational compatibility of a number of essential parameters of a system. In a broad sense, symmetry form is identified with the invariable elements in the evolution of spatiotemporal systems.

Symmetry can serve as a reliable indicator of regularities, specifically those that fall under the general category of the laws of nature. As Wigner (1964) notes, "there is a great similarity between the relation of the laws of nature to the events on the one hand, and the relation of symmetry principles to the laws of nature on the other" (p. 35). Wigner (1964) claims that the function of the invariance principles is "to provide a structure or coherence to the laws of nature, just as the laws of nature provide a structure and coherence to the set of events" (p. 36).

The generative effects of using the principle of symmetry on the development of the conceptual schemes employed in physics are manifested in many ways. The most representative cases are as follows: the discovery of the law of the conservation of parity in strong and electromagnetic interactions (the concept of parity corresponds to the enantiomorphism of the mathematical functions describing particles), which involved the identification of a relevant symmetric form corresponding to this conservation (the exception is the case of the "tau" and "theta" particles); the explication of the concept of the symmetry (in the case of fermions, antisymmetry) of the wave function with respect to changes in the coordinates of interchangeable particles, and the use of the Fermi-Dirac and Bose-Einstein statistical frameworks for the formal description and prediction of particle dynamics; and, finally, the application of the angular-momentum conservation theorem in quantum mechanics (Fano & Rao, 1996, pp. 41-46). Thus, "the gauge symmetry of classical electromagnetism can seem to be no more than a mathematical curiosity, specific to this theory; but with the advent of quantum theory the use of internal degrees of freedom, and the related internal symmetries, became fundamental" (Brading & Castellani, 2007, pp. 1344-1345).
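The sign behaviour of a fermionic wave function under exchange of particle coordinates can be made concrete with a schematic two-particle construction in the style of a Slater determinant. The one-particle functions below are arbitrary illustrative choices of our own, not taken from the cited literature.

```python
from math import exp, isclose

# Two arbitrary one-particle functions (illustrative choices only).
def phi_a(x):
    return exp(-x * x)

def phi_b(x):
    return x * exp(-x * x / 2)

def psi_fermion(x1, x2):
    """Antisymmetrised two-particle wave function (Slater-determinant form)."""
    return phi_a(x1) * phi_b(x2) - phi_a(x2) * phi_b(x1)

# Exchanging the particle coordinates flips the sign of the wave function,
# and the amplitude for two fermions at the same point vanishes identically.
assert isclose(psi_fermion(0.3, 1.1), -psi_fermion(1.1, 0.3))
assert psi_fermion(0.7, 0.7) == 0.0
```

The second assertion is the Pauli exclusion principle in miniature: antisymmetry alone forces the amplitude to zero whenever the two coordinates coincide.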

The development of post-classical science, instantiated in the heuristic theories of quantum physics and thermodynamics, led to paradigmatic shifts in the ontological assumptions that determine the vision of the structure of reality (Stefanov, 2018). Insofar as ontological assumptions were correlated with methodological principles, scientists had to reconsider the idea of symmetry in a less stringent deterministic way. The idea of symmetry was placed in a new context and acquired an almost transcendental status: it was used as a heuristic methodological device for developing theories. The most exciting fact is that the idea of symmetry was integrated into the regulatory basis for selecting deterministic postulates. Thus, "for classical physics in general, symmetries - such as spatial translations and rotations - were viewed as properties of the laws that hold as a consequence of those particular laws. With Einstein, that changed: symmetries could be postulated prior to details of the laws being known and used to place restrictions on what laws might be postulated. Thus, symmetries acquired a new status, being postulated independently of the details of the laws, and as a result having strong heuristic power" (Brading & Castellani, 2007, p. 1347).

Innovative methodological trends resulted in the emergence of a paradigmatic orientation towards the development and implementation of complex conceptual schemes, which predisposed scientists to obtain a system-holistic, intrinsically multidimensional and dynamic vision of the object of study. Within this systemic framework, the descriptive and explanatory methodologies of post-classical science have been formed. Furthermore, even nowadays, this systemic approach is employed in various disciplinary fields of science.

The methodological role of the principle of symmetry, combined with the principles of systemic integrity and complexity, is manifested in programs for the theoretical description of the behaviour of complex systems of the macro- and microworld. The methodological implications of the principles of systemic integrity and complexity are exposed in a set of standards best suited for an adequate description of integrated, structurally diverse and functionally heterogeneous formations, that is, systems characterised by a complexity of direct and inverse relations and featuring functional dynamics that cannot be accounted for in purely deterministic terms.

The recognition of the value of the systemic approach by scientists operating in various disciplinary fields raises the epistemological status of the principle of symmetry in the study of objects of a new type, specifically those investigated in physics and thermodynamics. When describing such objects as systemic, complexly organised dynamic wholes and revealing their structural invariants, causal relations and functional dependencies, scientists are governed by the general idea of identifying universal symmetry groups; that is, they follow the course of determining the class of symmetric forms that meet the conditions for a systemic description of the object under investigation. The very idea of symmetry has systemic implications.

The epistemological value of the principle of symmetry has also increased due to some newly displayed methodological trends characteristic of how theoretical knowledge is generated and developed in non-classical science. Consider, for instance, such trends as the growing effectiveness of applying a broad class of formal-axiomatic constructions and the regular use of the method of mathematical hypothesis. The use of group-theoretical approaches has become an intrinsic part of the methodological strategies employed for constructing theories in various fields of scientific knowledge. Consequently, the concept of the symmetry group has been adapted for theoretical work in physics. This concept has its own genealogy: it is related to the fundamental mathematical concept of the group, which was translated from the field of algebra and integrated into the conceptual system of theoretical physics: "In modern mathematised theory, the concept of a group, developed based on Galois ideas, has turned out to be an efficient means of theorising various fields of knowledge, because it is this concept which most accurately expresses the idea of conservation, which is so important in the theoretical constructions of physics" (Ovchinnikov, 1997, p. 138).
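In modern terms, the algebraic concept carried over from Galois theory is the following (a standard definition, added here for illustration): a set G of transformations with a composition operation forms a group when

```latex
\begin{aligned}
&g \circ h \in G                                  &&\text{(closure)}\\
&(g \circ h) \circ k = g \circ (h \circ k)        &&\text{(associativity)}\\
&\exists\, e \in G:\; e \circ g = g \circ e = g   &&\text{(identity)}\\
&\forall\, g \in G\;\exists\, g^{-1} \in G:\; g \circ g^{-1} = e &&\text{(inverse)}
\end{aligned}
```

The rotations of three-dimensional space, for instance, form the group SO(3), and a quantity such as x² + y² + z² that is left unchanged by every element of the group is precisely what "the idea of conservation" picks out.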

The further exploration of the relationship between the concept of symmetry and the conditions under which conservation laws operate (consider, for example, E. Noether's theorem) made it possible to explain the efficiency of using formal mathematical structures in the course of theory development and to determine the method of interpreting formalisms by relating variables to the theoretical models incorporating definitions of symmetry. One valid solution to the question of why mathematical structures and logical-algebraic methods are so effective in non-classical science consists in recognising the prominent role of the principles of conservation and symmetry in determining the design of theoretical models.
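Noether's first theorem, mentioned above, can be stated schematically (a standard textbook formulation, not the article's own): if the action of a system is invariant under a continuous transformation of the coordinates, a corresponding quantity is conserved along the motion,

```latex
S = \int L(q, \dot{q}, t)\, dt, \qquad
q \mapsto q + \varepsilon\, \delta q
\;\Rightarrow\;
\frac{d}{dt}\!\left( \frac{\partial L}{\partial \dot{q}}\, \delta q \right) = 0 .
```

Invariance under time translation yields conservation of energy, under spatial translation conservation of momentum, under rotation conservation of angular momentum; this is the precise sense in which symmetry "expresses the idea of conservation".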

So, on the one hand, the principle of symmetry has been reconsidered in terms of a systemic approach, which has proved to be very efficient in various disciplinary fields of science; on the other hand, as we have seen, the idea of symmetry was explicated in correlation with the development of mathematical ideas that shaped the methodology of constructive theoretical work in physics. It has become an intrinsic part of the fundamental methodological frameworks of non-classical science.

Probabilistic Elements of Scientific Explanation and Some Challenges to Classical Determinism

An intention to reveal the fundamental deterministic basis of all natural phenomena and to describe the world in terms of essential invariant structures and causal relations was one of the ultimate goals of almost every research project in classical science. Thus, F. Bacon intended to discover the proper forms of a given nature by employing a new method: "Through these forms, the natural philosopher understands the general causes of phenomena" (Kargon, 1966, p. 48). The elaboration of classical physical theories involved the development of standard explanatory frameworks centred on the idea of a strict causal-deterministic explanation of natural phenomena. The structure of cause-and-effect relationships involved in the mechanisms of causal determination was seen as the paradigmatic instance of the invariant law-like structure of natural processes. The idea of a mechanistic, deterministic explanation of natural phenomena had acquired a universal status in classical Newtonian physics. It was developed in line with general assumptions about the homogeneity of the properties and states of natural things and the explanatory reducibility of those states and properties to substantial elementary structures. The classical strategy of deterministic explanation was sustained by ontological assumptions, such as the belief that "the dynamic world is homogeneous, reducible to the concept of integrable systems" (Prigogine & Stengers, 1984, p. 72), characterised by the predetermined invariants of structure and motion. Such systems were capable of quite predictable behaviour. Moreover, a strict deterministic approach presupposed the equivalence of different points of view; that is, it assumed the mutual consistency and translatability of various descriptions of the realms in which the laws of nature, denoted by the explanatory statements of deterministic theories, operate.

The development of a whole class of new theories in the fields of statistical physics, thermodynamics, quantum mechanics, etc., exposed the limits of classical deterministic approaches and necessitated novel solutions to the problem of an objective and unambiguous description and explanation of the relevant phenomena in certain newly discovered areas of research. As J. Earman and J. Norton noted, "there are many ways in which determinism can and may, in fact, fail: space invaders in the Newtonian setting; the non-existence of a Cauchy surface in the general relativistic setting; the existence of irreducibly stochastic elements in the quantum domain, etc." (Earman & Norton, 1987, p. 524).

Relativistic theories revealed essential limitations of Newtonian mechanics, which was based on a combination of determinism with spatial absolutism and invariable space-time distinctions. The General Theory of Relativity (GTR) accounts for the relation between space and time and for the dependence of the spatial and temporal properties of material systems on their motion and interaction. Yet, most interestingly, there were cases in which GTR provided the theoretical background for articulating challenges to determinism. Consider, for instance, the Hole Argument, initially developed by Einstein in 1913. This argument was introduced as a challenge to any "generally covariant" theory of gravitation. The argument "shows a certain sense in which general relativity - or really, any theory of geometric objects formulated on a manifold - might be said to be "indeterministic"" (Weatherall, 2020, p. 79). Later, Einstein abandoned this view and developed his "point-coincidence argument" which, as he believed, could make the relevant implications of GTR immune to indeterministic conclusions. However, the Hole Argument was reformulated by J. Earman and J. Norton in the late 1980s. In its refined version, this argument was used to criticise the substantivalist view of space-time. Substantivalists (that is, realists) believe that unobservable spatial and temporal properties are not reducible to observable relational properties of matter. Earman and Norton argue that a space-time substantivalist who accepts general relativity cannot avoid facing the indeterminism dilemma which arises in local space-time theories, of which general relativity is the best one. Specifically, Earman and Norton show that "the equations of these theories are simply not sufficiently strong to determine uniquely all the spatiotemporal properties to which the substantivalist is committed" (Earman & Norton, 1987, p. 516). This can be qualified as an instance of 'radical local indeterminism'.

According to Norton, "The Hole Argument begins by first considering an open region of space-time (the hole). One then takes advantage of the invariant nature of general relativity (GR) under diffeomorphisms by shifting all the objects within the hole to new space-time locations also within the hole. ... In general, the point locations of fields get all mixed up. And yet, all our physically measurable quantities remain the same: the cat on the mat remains on the mat since both the cat and the mat are similarly shifted by diffeomorphisms. The predictions of GR are independent of substantival localisation properties: where on the substantival manifold physical objects are located. The crux of the argument is that now, supposedly, we have an indeterministic theory" (Norton, 2020, p. 361). The argument shows that space-time substantivalism leads to a radical form of indeterminism, so there are two options for substantivalists: "either (a) accept radical local indeterminism in local space-time theories or (b) deny their substantivalism and accept "Leibniz equivalence", which is a hallmark of relationism" (Earman & Norton, 1987, p. 524). Nevertheless, it is worth noting that in one of his recent works, Norton argues that the original form of the Hole Argument is unsound and that the notion of determinism used in the Hole Argument ought to be modified. Specifically, Norton insists on limiting determinism to scope over just those facts or properties of the world for which GR is a theory. This should prevent everything from falling prey to the Hole Argument. And since "substantival localisation properties do not belong to the physical content of GR, substantivalism no longer threatens GR with indeterminism" (Norton, 2020, p. 363).

The study of quantum effects of atomic systems in physics and the developments in thermodynamics, especially the study of the synergetic effects of complex and open systems, led to the revision of a number of fundamental positions of classical determinism. Reconsideration of the principles of determinism and the development of new explanatory frameworks became apparent in relativistic and quantum physics and in thermodynamics. In these fields, scientists encountered objects of a new kind, that is, objects characterised by systemic complexity, structural heterogeneity and nonlinear dynamism. In general, scientists met with "regularities of a novel type, incompatible with purely deterministic analysis" (Bohr, 1963b, p. 2). The universal applicability of deterministic approaches for predicting the behaviour of physical systems was also questioned. Researchers revealed the limitations of the principle of strict deterministic explanation. Thus, for example, the interpretation of quantum formalism involved the use of methods of statistical explication of objective uncertainties. Scientists had to consider the features of experimental situations which demonstrated the relevance of the uncertainty (or indeterminacy) relations. The uncertainty relations imply the impossibility of a simultaneous and accurate determination of the initial conditions of particle motion (such as coordinates and momentum). So, the study of quantum systems clearly showed the inadequacy of the classical (dynamic-mechanical) form of causality for describing the regularities inherent in the behaviour of quantum mechanical objects. As Niels Bohr (1963b) noted, "in the quantal formalism, the quantities by which the state of a physical system is ordinarily defined are replaced by symbolic operators subjected to a non-commutative algorism involving Planck's constant. This procedure prevents the fixation of such quantities to the extent required for the deterministic description of classical physics but allows us to determine their spectral distribution as revealed by evidence about atomic processes. In conformity with the non-pictorial character of the formalism, its physical interpretation finds expression in laws, of an essentially statistical type" (pp. 2-3). There was an urgent need to map the new type of regularities onto methods of conceptual representation that could be used to describe changes in the states of quantum mechanical systems. Classical stringent deterministic models were gradually supplemented or replaced by models based on statistical, systemic and diatropic principles.
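The "non-commutative algorism" Bohr refers to, and the indeterminacy relations discussed above, have the standard form (textbook notation, added here for illustration):

```latex
[\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i\hbar, \qquad
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2} .
```

Because the position and momentum operators do not commute, no quantum state assigns both quantities sharp values at once, which is exactly the obstacle to fixing the initial conditions that a classical deterministic description requires.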

The groundwork for the methodological elaboration of innovative ideas about the complexity of determination of the behaviour of complex dynamical systems had already been laid in the classical period of physics (consider, for instance, the development of classical statistical mechanics, specifically Boltzmann's statistical explanation of the second law of thermodynamics). The further paradigmatic shifts in the conceptual foundations of deterministic approaches occurred due to the transition of physics from the study of integrable mechanical systems, characterised by regularity, determinacy and reversibility of behaviour, to the study of systems of quite a different physical nature, characterised by structural complexity, superpositionality, indeterminacy and stochastic variability.

The introduction of probabilistic ideas into the explanatory frameworks of physical theories marked the development of methodological approaches that deviated from classical strict determinism. The use of statistical methods proved adequate for studying a wide class of phenomena and enabled the identification of some new patterns of regularity in nature.

The concept of a probabilistic form of determinism underlies the statistical laws that have become a substantial part of a comprehensive account of experience in thermodynamics and quantum physics. The principle of operational relativity and the principle of complementarity both imply that a quantum system can be exposed through a combination of its phenomenal states. Those states can be revealed with some probability, which depends on the specific conditions of interaction between the object and the measurement device. One should also take into account the quasi-contradictory relationship between the wave description of the motion of material particles, obeying the principle of superposition, and the persisting individuality of particles. All these features of the behaviour of atomic systems, including the wholeness inherent in atomic processes and the relevance of the indeterminacy relations, impose obvious limitations on deterministic causal models of scientific explanation. Situations similar to those in quantum physics necessitated the introduction of a new paradigm of scientific explanation, one compatible with probabilistic considerations pertaining to the observable effects that are indicative of the evolution of a quantum state.
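The probability with which a given phenomenal state is revealed under specified experimental conditions is supplied, in the standard formalism, by the Born rule (a textbook statement, included here for illustration):

```latex
|\psi\rangle = \sum_{a} c_{a}\, |a\rangle, \qquad
P(a) = |\langle a | \psi \rangle|^{2} = |c_{a}|^{2}, \qquad
\sum_{a} |c_{a}|^{2} = 1 .
```

The superposition principle enters through the expansion of the state over the eigenstates of the measured quantity; which outcome occurs is not determined, but the distribution of outcomes is.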

Yet, Earman argues against treating quantum mechanics as the paradigm example of an indeterministic theory. He rejects the view that "quantum indeterminism arises because the "reduction of the wave packet" is based on a controversial interpretation of the quantum measurement process". Earman (2007) claims that "in some respects, QM is more deterministic and more predictable than classical physics" (p. 1399). One should consider "a procedure for quantisation that starts with a Hamiltonian formulation of the classical dynamics for the system of interest and produces, modulo operator ordering ambiguities, a formal expression for the quantum Hamiltonian operator H that is inserted into the equation" (Earman, 2007, p. 1400). Earman claims that the quantum Hamiltonian operator can be essentially self-adjoint, and since it then has a unique self-adjoint (SA) extension, the evolution of the quantum state can be accounted for in deterministic terms. Of course, in many respects quantum determinism differs from the classical one. Thus, "quantum determinism surely does not require that all quantum magnitudes always have determinate values, for a similar requirement would falsify classical determinism" (Earman, 1986, p. 226). Furthermore, this view is quite coherent with the view of determinism as a doctrine about the evolution of set- or interval-valued magnitudes, as well as point-valued magnitudes.
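Earman's point rests on a standard result (Stone's theorem, stated here in textbook form): a unique self-adjoint Hamiltonian H generates a one-parameter unitary group, so the quantum state itself evolves deterministically,

```latex
i\hbar\, \frac{d}{dt}\, \psi(t) = H\, \psi(t)
\quad\Longrightarrow\quad
\psi(t) = e^{-iHt/\hbar}\, \psi(0) .
```

Given the state at one time, the state at any other time is fixed; the statistical element enters only when the Born rule is applied to extract outcome probabilities from the evolved state.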

In general, the introduction and interpretation of non-classical ideas about the wholeness, systemic complexity, topological multidimensionality and stochastic variability of the objects under investigation determined the conceptual framework for many scientific theories, mainly physical and even physical-biochemical theories. The ontological assumptions correlated with the models of description and schemes of explanation employed in these theories embrace a whole spectrum of types of statistical and non-causal determination. The latter might be accounted for in terms of functional relations between the essential properties of an object, or in terms of correlations between an object's various states, as well as in terms of structural genesis processes, genetic propensities and systemic-synergetic effects (take, for instance, the teleonomic synergism of biological systems). In some of the prominent research programs in contemporary science, priority is given to conceptualising the objects of study as structurally polymorphic, functionally heterogeneous and dynamically variable systems, the behaviour of which can contain both deterministic and stochastic elements and no longer lends itself to monological rationalisation from the predominant positions of classical causal determinism.

We can conclude that among the key indicators of the radical change in the paradigmatic foundations of natural science, there is a very special trend that consists in strengthening the functional status of methodological principles accumulating the explanatory potential of the models of the nonlinear world.

Conclusion

There is a set of methodological principles that are genuinely indicative of the standards of non-classical science. The integration of those principles into the methodological foundations of heuristic research programs in physics imposed essential restrictions on the positions of fundamentalism, essentialism, object-centrism, reductionism and strict determinism.

The principle of operational relativity of scientific descriptions determines the relevance of procedural and instrumental factors to the referential framework of any descriptive account of what is observed or exposed in actual experience, and it requires explicating the relation of theoretical constructs to the operational structure of experience. It is the hallmark of non-classical cognitive situations that the object can be grasped only within the framework of a specific interactive-phenomenal continuum which involves inevitable determination on the part of the experimental arrangement. Insofar as observation plays a constitutive role in the events, it has become a general requirement to take into account the procedural structure of experience and to include a reference to the methods of observation in the descriptive content of object representations (which is also essential to the interpretation of the variables of the quantal formalism in terms of statistical laws). These radical changes in the methodological foundations of research programs indicate an essential transformation of the very concept of the objectivity of scientific knowledge.

The introduction of the epistemic ideals and norms of non-classical science can be clearly seen in the deviation from the standards of the monologic-linear description of objects, marked by the development of alternative descriptive and interpretative frameworks which presuppose the implementation of the principle of complementarity. On the one hand, the introduction of the complementarity principle is correlated with the recognition of the ontological complexity of the objects under study: thus, in quantum physics, scientists encountered cases of phenomenal heterogeneity which transcend ordinary experience (e.g., a certain entity cannot at the same time appear as a particle and a wave) and became aware of the limitations of classical methodology (consider, e.g., the non-commutativity of certain variables required for the definition of the state of a system). On the other hand, the relevance of the complementarity principle can be explained in terms of the complexity of cognitive tasks: the completeness of scientific descriptions is likely to be obtained on the basis of the systemic exposition of the same object through a set of its complementary (structural, functional and dynamic) characteristics. This can be accomplished by using alternative experimental arrangements and systems of concepts.

The developments in non-classical physics also revealed the limitations of standard reductionist approaches and deterministic schemes of scientific explanation. In some of the newly discovered areas of research, scientists encountered objects of a rather specific nature and recognised that considerations of systemic complexity, structural heterogeneity, localisation ambiguity and nonlinear dynamism were exclusively relevant to the study of those objects. The reconsideration of the value of strict deterministic explanation strategies in favour of probabilistically oriented approaches is one of the prominent methodological trends in non-classical science. The recognition of the prominent value of the probabilistic framework for the description of the dynamics of physical systems can be clearly seen in the introduction of statistical laws in thermodynamics, as well as in the use of the probability function for the appropriate description of quantum phenomena, that is, in the interpretation of the quantal formalism based on the methods of statistical explication of objective uncertainties. Yet, it would be wrong to assume that any probabilistic account of natural phenomena implies indeterminism. Insofar as quantum laws allow certain degrees of freedom and indeterminacy concerning physical quantities, they can be taken as giving expression to regularities of a novel type. Nevertheless, we should acknowledge that in the case of these new regularities, scientists deal with a new, probabilistically reduced form of determinism. The development of explanatory frameworks that combine a deterministic account of phenomena with alternative views of the nature of regularities provides reliable indicators of the heuristic power of contemporary scientific theories.

References

Bacciagaluppi, G. (2013). Measurement and classical regime in quantum mechanics. In The Oxford handbook of philosophy of physics (pp. 304-335). Oxford, New York: Oxford University Press.

Bohr, N. (1948). On the notions of causality and complementarity. Dialectica, 2(3/4), 312-319.

Bohr, N. (1963a). The unity of human knowledge. In N. Bohr (Ed.), Essays 1958-1962 on atomic physics and human knowledge (pp. 8-16). New York: Interscience Publishers.

Bohr, N. (1963b). Quantum physics and philosophy - causality and complementarity. In N. Bohr (Ed.), Essays 1958-1962 on atomic physics and human knowledge (pp. 1-7). New York, London: Interscience Publishers.

Brading, K., & Castellani, E. (2007). Symmetries and invariances in classical physics. In J. Butterfield & J. Earman (Eds.), Handbook of the philosophy of science. Philosophy of physics. Part A (pp. 1331-1367). Amsterdam: Elsevier.

Bryanik, N. V. (2019). The comparative analysis of epistemological distinctions between laws in classical, non-classical and post-non-classical science. Tomsk State University Journal of Philosophy, Sociology and Political Science, 48, 5-14. doi:10.17223/1998863X/48/1

Djidjian, R. (2016). Paradoxes of human cognition. Wisdom, 7(2), 49-58. https://doi.org/10.24234/wisdom.v2i7.137

Earman, J., & Norton, J. (1987). What price spacetime substantivalism: The hole story. British Journal for the Philosophy of Science, 38, 515-525.

Earman, J. (1986). A primer on determinism. Dordrecht: Reidel.

Earman, J. (2007). Aspects of determinism in modern physics. In J. Butterfield & J. Earman (Eds.), Handbook of the philosophy of science. Philosophy of physics. Part A (pp. 1369-1434). Amsterdam: Elsevier.

Fano, U., & Rao, A. R. P. (1996). Symmetries in quantum physics. London: Academic Press.

Heisenberg, W. (1990). The Copenhagen interpretation of quantum theory. In W. Heisenberg, Physics and philosophy. The revolution in modern science (pp. 14-25). London: Penguin Books.

Horgan, T., & Tienson, J. (1994). A nonclassical framework for cognitive science. Synthese, 101(3), 305-345. doi:10.1007/BF01063893

Il'in, V. V. (1994). Teoriya poznaniya. Epistemologiya (Theory of knowledge. Epistemology, in Russian). Moscow: MGU.

Kargon, R. H. (1966). Atomism in England. Oxford: Oxford University Press.

Markova, L. A. (1998). O transformaciyakh logiki estestvennonauchnogo myshleniya v XX veke (On the transformation of the logic of scientific thinking in the XX century, in Russian). In Filosofija nauki (Philosophy of science, in Russian) (Vol. 4, pp. 73-87). Moscow: IFRAN.

Norton, J. (2020). The hole argument against everything. Foundations of Physics, 50(4), 360-378.

Ovchinnikov, N. F. (1997). Metodologicheskie principy v istorii nauchnoi mysli (Methodological principles in the history of scientific thought, in Russian). Moscow: Jeditorial URSS.

Prigogine, I., & Stengers, I. (1984). Order out of chaos. Man's new dialogue with nature. New York: Bantam Books.

Romanovskaya, T. B. (1995). Nauka XIX-XX vekov v kontekste istorii kul'tury (Science of XIX-XX centuries in the context of culture history, in Russian). Moscow: Radiks.

Stefanov, A. S. (2018). Approaches to the theme about non-classical science. Philosophy, 27(2), 152-159.

Stepin, V. S. (1995). Sistemnost' teoreticheskih modelej i operacii ih postroenija (Systemic Value of Theoretical Models and Operations of their Constructing, in Russian). In Filosofiya nauki (Philosophy of science, in Russian) (Vol. 1, pp. 26-57). Moscow: IFRAN.

Weatherall, J. O. (2020). Some philosophical prehistory of the (Earman-Norton) hole argument. Studies in History and Philosophy of Modern Physics, 70, 79-87.

Whitaker, A. (1996). Einstein, Bohr and the quantum dilemma. Cambridge / New York: Cambridge University Press.

Wigner, E. P. (1964). Symmetry and conservation laws. Physics Today, 17(3), 34-40. https://doi.org/10.1063/1.3051467

Zhdanov, G. B. (1995). Vybor estestvoznaniya: 8 principov ili 8 illyuzii racionalizma? (The choice of natural science: 8 principles or 8 illusions of rationalism?, in Russian). In Filosofija nauki (Philosophy of science, in Russian) (Vol. 1, pp. 58-86). Moscow: IFRAN.
