
PHILOSOPHICAL SCIENCES

ЭНЕРГИЯ И ИНФОРМАЦИЯ - ГЛАВНЫЕ КАТЕГОРИИ СОЗДАНИЯ И РАЗВИТИЯ МИРА И

СЛОЖНЫХ СИСТЕМ

Мошинская А.

Д.т.н., доцент, профессор Института телекоммуникационных систем Национальный технический университет Украины «Киевский политехнический институт имени Игоря Сикорского»

ENERGY AND INFORMATION ARE THE MAIN CATEGORIES OF CREATION AND EVOLUTION

OF THE WORLD AND COMPLEX SYSTEMS

Moshynska A.

Doctor of Technical Sciences, Associate Professor, Professor at the Institute of Telecommunication Systems, National Technical University of Ukraine «Igor Sikorsky Kyiv Polytechnic Institute»

Аннотация

В статье представлен энергоинформационный подход описания мира и сложных систем. Понятие информации имеет двойственную, не до конца изученную природу. С одной стороны понятие информации является ключевой категорией нового квантового подхода, с другой стороны информация, как философское представление мироздания является фундаментальным понятием науки.

Abstract

The article proposes an energy-informational approach to describing the world and complex systems. The concept of information has a dual, not fully understood nature. On the one hand, the concept of information is a key category of the new quantum approach, on the other hand, information, as a philosophical representation of the universe, is a fundamental concept in science.

Ключевые слова: информация, энергия, сложная система, энтропия.

Keywords: information, energy, complex system, entropy.

Introduction

It is safe to say that the end of the second millennium was a special stage in the history of civilization. Humanity's conception of the surrounding world and of its own role and place in this world changed drastically during the last two centuries of the outgoing millennium. The 19th century was marked by civilization's breakthrough into the wonderful world of energies. Humans learned to obtain, transform and use a variety of types of energy, and the energy picture of the world became established, seemingly for centuries, as the basis of the worldview.

But the next, 20th century opened a new world to mankind: the world of information. The 21st century, in turn, is marked by the expansion of information technologies into vast areas of human activity: the economy, industry, management, education and culture. The concepts of information and information processes are increasingly used in natural science, social science, the sciences of thinking and the philosophical sciences. Today, few doubt the decisive role of information in molecular biology, the theory of artificial intelligence, telecommunications and information technology, the theory of scientific information and management theory. And such derivative concepts as the "era of information", "global information revolution", "new information technology era", "information society", "information explosion" and "information crisis" are increasingly found not only in the scientific literature, but also in the mass media.

The rapid development of the information world is changing the existing picture of the world. Ultimately, information and energy, the two concepts that transformed civilization at the end of the 2nd millennium, rightly claim a worthy place in this picture.

What awaits civilization in the 3rd millennium? What will its values be, and how will the surrounding world and humanity change? To answer these and many other questions, a new concept of the worldview should be established: a concept that incorporates the experience of thousands of years of human evolution and, no doubt, the experience of recent centuries.

Complex systems and information

In the general case, a system is understood as a functionally complete set of elements and their interconnections, whose properties are not identical to the sum of the properties of the elements that form it. Functional completeness is understood as the ability of the system to perform a specific task (to have a complete function).

For information specialists, one of the fundamental concepts is a system or a technical system (a system implemented by technical means).

Information is a basic term and one of the most difficult to define. In the common sense, information is understood as any set of data about the manifestations of the material world.

However, in technical applications a subjective approach to the concept of information is more often used. In this case, information is understood as the totality of data extracted by the subject from the manifestations of the material world. The subjective approach to the concept of information eases the formulation and solution of specific problems. In particular, the concept of an informational environment is introduced as the part of the material world that the subject considers in order to extract the information he needs; the subject himself becomes a user of this informational environment.

The assignment of a given real system to the category of "complex" or "simple" is rather conditional and is largely determined by the tasks of studying the system. We will call a system "complex" when, owing to the properties of the system itself and the nature of the tasks arising in its study, it is necessary to take into account the presence in the system of a large number of mutually connected and interacting elements that enable the system to perform some rather complex function.

Thus, a complex system can be called both a technical system and the human body, the Solar system, etc.

The complexity and diversity of information explain the wide range of its definitions in the literature. Information is defined as a side of the reflection process used for control, as a characteristic of the complexity of objects, as an ordered structure of objects and interactions, and as negentropy (entropy with the opposite sign). In the most general interpretation, information is defined as any variety, difference or structure, which raises it to the rank of a philosophical category.

Attempts to fit the concept of "information" adequately into the system of philosophical categories have been undertaken repeatedly, both in our literature and abroad, but it remains a headache and the subject of fierce discussion among specialists. The founder of cybernetics, N. Wiener, expressed his attitude to this issue by stating that "the mechanical brain does not secrete thought as the liver secretes bile, and does not release it in the form of energy, as muscles do. Information is information, not matter and not energy." [2]

But if not matter, then perhaps "spirit", consciousness? However, information processes are found in various technical devices and in all living organisms, including unicellular ones. Thus, the concept of "information" has become broader in scope than the concept of "consciousness". So perhaps it makes sense to accept information as a fundamental concept of the surrounding world? For example, Academician A.N. Yakovlev believes that "the basis of the world is not matter, but information ... Information is primary, matter and spirit are secondary."

Meanwhile, at the end of the twentieth century, "matter" itself appeared in a new light. Experiments of recent decades in the field of quantum physics have revealed the dynamic nature of the world of particles. Material particles turned out to be, in fact, dynamic informational structures with a certain amount of energy contained in their mass. The appearance of material particles from pure energy, observed in laboratories millions of times, is the most extraordinary consequence of the theory of relativity, demonstrated experimentally at the end of the twentieth century. Energy can be converted into particles and vice versa; any particle can be transformed into another. Such concepts of classical physics as "elementary particle" and "isolated object" lose their meaning. The Universe appears as a mobile network of energy-informational processes.

Information belongs to those concepts that, on the one hand, are widely used in human culture, which testifies to their familiarity and indispensability, but whose scientific and philosophical understanding, on the other hand, involves certain difficulties. In particular, this is reflected in the various - and not always mutually exclusive - approaches to the subject, attempts to interpret the concept under study, and so on. The peculiarity of the situation with information lies in the fact that the scientific discipline known as "information theory" does not cover the entire range of problems associated with the issue. This testifies in favor of investigating such a phenomenon as information with an interdisciplinary approach, the basis of which should be philosophy.

So what is information after all? One of the possible answers to this question is as follows:

"Information in the broad sense of the term is an objective property of reality, which manifests itself in the heterogeneity (asymmetry) of the distribution of matter and energy, in the unevenness of all processes occurring in the world of animate and inanimate nature, as well as in human society and consciousness." [3] This definition is obtained by combining the definitions of the concept of information belonging to Academician V.M. Glushkov with the statement of G.V. Vstovsky, author of "Elements of Information Physics", that information is the result of symmetry breaking.

Information cannot exist on its own, in isolation from some communication process, whether between living or nonliving systems. The essence of this kind of exchange is the object of study of the theory of reflection.

Information and reflection

Reflection is one of the universal properties of matter. Reflection presupposes the presence of two interacting material objects, and such interaction presupposes a certain inequality between the two participants: one of them "reacts" to the other and repeats certain features of the reflected object. Reflective interaction may be merely dynamic, mechanical in nature, as, for example, in the reflection of a face in a mirror or a footprint in wet sand, where the reflection does not repeat the internal structure of the reflected object.

Under certain conditions, reactions of a special type arise: reactions not to the absolute value of the material-energy side of the influences, but to their relative magnitude and orderliness (organization, structure); at the same time, the one-sided relation of one thing (as primary, independent) to another, secondary thing, dependent on the first, comes to the fore. [6]

Reflection includes several types. First, there is the mechanical reflection inherent in inorganic nature (physical interactions: gravitational, electromagnetic, strong and weak). In plants and protozoa, reflectivity is provided by irritability, the ability to respond to biologically significant influences of the external environment by changing their own behavior or state. In animals and humans, the psyche develops: an active form of reflectivity capable of anticipating a complex external impact in its initial phase. The highest type of reflection, consciousness, is inherent only in humans; consciousness is the highest type of mental activity, and its main features include purposefulness, self-control and a creative approach to reflecting reality.

In reality, reflection is accompanied by reverse reflection, when the reflected object itself, in turn, reflects certain characteristics of the reflecting object, and it seems rather difficult to separate the reflection from the general background of interactions in which this body or object participates:

In any reflection process, including the reflection of the external world in the mind of a person, there is never a complete separation of the display from the total result of the interaction of objects, including the interaction of a person with the external world. [6]

A.G. Spirkin wrote the following lines about the reciprocity of the reflection process:

Reflection in all the diversity of its forms, ranging from the simplest mechanical traces to the human mind, occurs in the process of interaction of various real-world systems. This interaction results in interrelation, which in the simplest cases appears as a mutual restructuring of the internal state of the interacting systems: as a change in their connections or direction of movement, as an external reaction, or as a mutual transfer of energy and information. Reflection in the general case is a process whose result is the informational reproduction of the properties of the reflected object. Any interaction includes an informational process: it is informational interaction, mutual causation in the sense that each leaves a memory of itself in the other. In the broadest sense, information is being reflected in another, that is, as Hegel would put it, being in an other. Thus, information is the objective side of natural processes and as such is universal, which, of course, does not exclude but, on the contrary, presupposes its specificity in the various spheres of the real world: in inorganic nature, in living systems and in social processes.

Everything in the world is in direct or mediated interaction with everything else, and, extending into infinity, everything carries information about everything. In this regard, let us recall the deep insight of the ancients: everything is in everything! This presupposes a universal information field of the universe, a universal form of communication, a form of universal interaction and thus of the unity of the world: after all, everything in the world remembers everything! This follows from the principle of reflection as a universal property of matter. Figuratively speaking, each point of the universal field is a living mirror of the Universe. [5]

The universality of reflection and its reproductive role stand in logical connection with another group of philosophical categories, namely cause and effect, which constitute the main object of the study of determinism. Determinism is "the philosophical doctrine of the objective, lawful interconnection and causality of all phenomena."

Hence the inevitable connection between the theory and philosophy of information and the philosophical problem of determinism becomes clear: if one develops the thought contained in the above quote from Spirkin, reflection turns out to be an inevitable companion of the causal relationship, where the cause is reflected in the effect in a certain way, which, in particular, makes it possible, on the basis of the effect, to form some idea of the cause that produced it. In this case, information turns out to be what objects exchange in the process of reflection, namely, the form of the reflected object reproduced in the reflecting object.

Mathematical definition of information

At first glance, the concept of information would seem to belong to categories that cannot be quantified, primarily because of its subjectivity and ambiguity. However, there are quantitative definitions of information that have proven very successful. The impetus for the search for such definitions was progress in the technical means of communication, which required a clarification of the concept of information that takes into account the specifics of its coding and transmission through communication channels. It was communications engineers who stood at the origins of the new scientific discipline - information theory.

The first quantitative definition of information belongs to Ralph Hartley, who in his article "Transmission of Information" (1928) proposed calculating information as the logarithm of the total number of outcomes of an experiment (i.e., the more possible outcomes, the more information a single outcome can carry) [7].

The main flaw in Hartley's definition was that he proposed not to distinguish between outcomes with different probabilities, attributing the difference between them to insignificant "psychological factors." Nevertheless, it turned out that these factors must be taken into account: it was intuitively clear that a rare symbol and a frequent symbol can carry different informational loads. This shortcoming of Hartley's definition was corrected two decades later by the mathematician C. Shannon, who is rightfully considered the founder of information theory. Shannon proposed a measure of uncertainty equal to minus the sum, over all outcomes, of the probability multiplied by its logarithm: H = −Σ P · log P [8]. Shannon called this quantity entropy, by analogy with physical entropy, whose definition looks similar (and, as we will see below, this similarity has deeper roots). Note that the minus sign appears because a probability cannot exceed one, so the logarithm of a probability is less than or equal to zero; the sign was added so that uncertainty is not represented by a negative value. Information was defined by Shannon as a decrement of entropy: that which decreases entropy, i.e. uncertainty.
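The difference between the two measures is easy to see numerically. The sketch below (an illustration added here, not part of the original article) computes Hartley's measure and Shannon's entropy in Python: for equiprobable outcomes the two coincide, while a skewed distribution carries less information per outcome on average.

```python
import math

def hartley_information(n_outcomes: int) -> float:
    """Hartley (1928): information as the logarithm of the number of possible outcomes."""
    return math.log2(n_outcomes)

def shannon_entropy(probabilities) -> float:
    """Shannon: H = -sum(P * log2(P)), the average uncertainty of a source."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely outcomes: Shannon's entropy coincides with Hartley's measure.
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(uniform), hartley_information(4))  # 2.0 2.0

# A skewed source: rare and frequent symbols carry different informational loads,
# so the average per symbol falls below the Hartley value.
skewed = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(skewed))  # about 1.36 bits
```

Hartley's logarithm thus appears as the special case of Shannon's entropy for equiprobable outcomes.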

This definition of information, often called probabilistic, is the most common. There are other definitions of information, for example the algorithmic one. According to this definition, the amount of information in a text is approximately equal to the length of the shortest program that can print this text.
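The algorithmic definition is not computable in general, but the length of a compressed file gives a rough, practical upper bound on it. A small Python sketch, added here as an illustration (zlib serves purely as a stand-in for the "shortest program"):

```python
import random
import zlib

def compressed_length(text: str) -> int:
    """Size in bytes of the zlib-compressed text: a crude upper bound
    on the amount of algorithmic information the text contains."""
    return len(zlib.compress(text.encode("utf-8"), 9))

regular = "ab" * 500                 # 1000 characters with an obvious short description
random.seed(0)
noisy = "".join(random.choice("ab") for _ in range(1000))  # 1000 patternless characters

# The regular text compresses to a handful of bytes; the noisy one does not.
print(compressed_length(regular), compressed_length(noisy))
```

A text with a visible pattern has a short "program" and little algorithmic information; a patternless text of the same length does not.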

The main link in information theory is the quantitative (probabilistic) measure of information formulated by C. Shannon. Shannon's approach made it possible to match the performance of information sources against the individual characteristics of transmission channels and to determine the limit at which information can be transmitted with a given reliability.

The first generalizing property of information is that it acts as a measure of the elimination of uncertainty. A quantitative measure of uncertainty is the entropy H, which characterizes the degree of variability of the microstate of an object. The higher the entropy, the greater the number of substantially different microstates an object can occupy in a given macrostate. With the acquisition of information, uncertainty decreases, so the amount of information can be measured by the amount of removed uncertainty, i.e. entropy. In the case of a discrete random variable describing the state of an object of the informational environment, the entropy is determined by the Boltzmann formula

H(ξ) = −Σ_ξ P(ξ) · ln P(ξ),

where ξ is a random variable and P(ξ) is the probability distribution of this variable over the set of possible states [4].

In the late 1950s and early 1960s, Brillouin took up information theory, publishing two significant monographs on the topic: "Science and Information Theory" (Russian edition: Moscow, 1960) and "Scientific Uncertainty and Information" (Moscow, 1966). The first book dealt with many technical issues related to the transmission and coding of information, while the latter, developing the ideas expressed in the previous work, was more focused on a philosophical understanding of the new scientific method.

The monograph "Scientific Uncertainty and Information" begins with a brief introduction to the history and problems of thermodynamics, the branch of physics where classical determinism raises the greatest doubts and controversy. One of the most important concepts of thermodynamics is entropy, which is very similar to the concept of the same name introduced by Shannon. In thermodynamics, entropy is "the state function of a thermodynamic system whose change (dS) in an equilibrium process is equal to the ratio of the amount of heat (dQ) imparted to the system or removed from it to the thermodynamic temperature (T) of the system" [1]. One of the most important properties of entropy is that the entropy of a system in an irreversible process can only increase (due to heat loss, friction, etc.). An increase in entropy characterizes an increase in the system's chaos (which is why the term "entropy" is well on its way to becoming a colloquial designation for a measure of chaos in general). In the statistical interpretation, entropy is "a measure of the probability of a system being in a given state" [1]: the greater the entropy, the more likely the system is to be found in its most probable, typical state.
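The two formulations of entropy quoted above can be put side by side in a short sketch (illustrative values only, added here): the Clausius increment dS = dQ/T and Boltzmann's statistical form S = k·ln W, where W is the number of accessible microstates.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_increment(heat_j: float, temperature_k: float) -> float:
    """Clausius: dS = dQ / T for an equilibrium (reversible) process."""
    return heat_j / temperature_k

def statistical_entropy(microstates: int) -> float:
    """Boltzmann: S = k * ln W; the more accessible microstates, the higher the entropy."""
    return K_B * math.log(microstates)

# Adding 300 J of heat at 300 K raises the entropy by 1 J/K.
print(entropy_increment(300.0, 300.0))  # 1.0

# Doubling the number of microstates adds k*ln 2, one "bit" in physical units.
print(statistical_entropy(2) - statistical_entropy(1))
```

The second print shows why the thermodynamic and informational notions connect so naturally: one binary distinction corresponds to an entropy change of k·ln 2.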

Brillouin introduced the inverse of entropy, negentropy, defined as entropy taken with the opposite sign. If entropy symbolizes "the law of depreciation, the rule of the decreasing level," then negentropy is associated with value and scarcity, which are equivalent from the standpoint of physics. The evolution of a system with increasing entropy leads the system to lose its unique organization, becoming more similar to some average statistical standard. Brillouin sees the connection between thermodynamics and information theory as follows: the more we know about the state of the system, the less uncertainty we have about its structure, which reduces the number of elementary states, the probability and the entropy:

"Any added information increases the negentropy of the system. This leads to an important consideration: we can establish a quantitative measure of information through the corresponding increment of negentropy." [1]
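Brillouin's conclusion can be illustrated with a toy calculation (added here as a sketch, not taken from Brillouin): for a system with N equally likely states, each yes/no answer we learn halves the number of states consistent with our knowledge, and the drop in entropy, i.e. the increment of negentropy, is exactly the one bit of information gained.

```python
import math

def entropy_bits(n_states: int) -> float:
    """Uncertainty, in bits, of a system with n equally likely states."""
    return math.log2(n_states)

states = 64
h_before = entropy_bits(states)        # 6 bits of uncertainty

# One yes/no answer rules out half of the states...
h_after = entropy_bits(states // 2)    # 5 bits remain

# ...so the negentropy increment equals the information gained: 1 bit.
negentropy_gain = h_before - h_after
print(negentropy_gain)  # 1.0
```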

Mathematics considers how individual phenomena of the surrounding world are qualitatively and quantitatively reflected by individual mathematical structures. A pertinent question: how is one to reflect the relationships between the phenomena of the surrounding world, in effect, the structure of the world itself? It is reasonable to assume: by the relationships between the corresponding mathematical structures, that is, by the internal hierarchy of mathematics. Indeed, modern mathematics has a harmonious hierarchical structure whose elements are those very mathematical structures, whose applicability is often astonishing (the "principle of the hierarchy of structures" according to N. Bourbaki).

According to the research of N. Bourbaki's school, set theory is the foundation of modern mathematical knowledge. "It is possible to derive all modern mathematics," write Bourbaki, "from a single source: set theory." And set theory is based, as is known, on two concepts: "set" and "relation". A set is a collection of elements. An element of a set is the main "building block" for modeling the surrounding world. A relation is a connection between elements. The totality of the elements of a set and the connections, the relations, between them constitutes a mathematical structure, the very object of the philosophical discussions. The hierarchy of structural levels is formed by specifying relations between the mathematical structures of the previous levels. Does this hierarchy not reflect the fundamental structure of the surrounding world?

Thus, the set and the relation are the concepts underlying the mathematical model of the surrounding world. This is a key to understanding the world, historically given to humanity and until now not consciously recognized as such. Having a model of the surrounding world, one can take the next step: interpret the basic concepts of the model in terms of the concepts of the surrounding world. There is another recurring situation in science, the reasons for which are still the subject of endless philosophical discussion: an abstract model created within mathematics finds an unexpected interpretation in connection with new scientific discoveries.

Conclusions:

The article presents an energy-informational approach to describing the world and complex systems.

The concept of information has a dual, not fully understood nature. On the one hand, it is a key category of the new quantum approach; on the other hand, information, as a philosophical representation of the universe, is a fundamental concept of science. Today, the detailed study of information exchange by means of energy at the level of elementary particles is a topical scientific direction.

References

1. Бриллюэн Л. Наука и теория информации / Пер. с англ. А.А. Харкевича. - М.: Физматгиз., 1960. - 392 с.: ил., табл.

2. Винер Н. Кибернетика и общество. - М.: Наука, 1958.

3. Глушков В.М. О кибернетике как науке // Кибернетика, мышление, жизнь. - М., 1964. - С. 53.

4. Ильченко М. Е. Разграничение и слияние уровней эталонной модели взаимодействия для информационно-телекоммуникационных систем/ М. Е. Ильченко, А.В. Мошинская, Л. А. Урывский // Кибернетика и системный анализ. - 2011. - Т. 47, № 4. -С. 108-116. - Бiблiогр.: с. 116 (6 назв.).

5. Спиркин А.Г. Основы философии. - М., 1988.

6. Украинцев Б.С. Отображение в неживой природе, стр. 77. Сб. "Теория познания и современная наука". М., 1968, стр. 106.

7. Хартли Р. "Передача информации" // "Теория информации и её приложения", М., 1959.

8. Шеннон К. Работы по теории информации и кибернетике / Пер. с англ.; Под ред. Ф.А. Добрушина, О.Б. Лупанова. - М.: Изд-во иностр. лит., 1963. - 830 с.: ил., табл. - Библиогр.: с. 783-820, и библиогр. в конце статей.

МЕСТО КАТЕГОРИИ «СВОБОД» В ЖИЗНИ СРЕДНЕВЕКОВОГО ЧЕЛОВЕКА

Фархетдинова Ф. Ф.

Старший преподаватель Бирского филиала Башкирского государственного университета,

кандидат филологических наук

THE PLACE OF THE CATEGORY OF "FREEDOMS" IN THE LIFE OF A MEDIEVAL MAN

Farkhetdinova F.

Senior lecturer of the Birsky branch of the Bashkir State University,

Candidate of Philological Sciences

Аннотация

Свобода является важнейшей философской категорией, характеризующей сущность человеческой жизни. В разные эпохи это явление по-разному воспринималось и интерпретировалось, так как менялось отношение к человеку, без которого невозможно говорить о данной категории. В данной статье рассматривается категория свободы в контексте средневековой традиции.

Abstract

Freedom is the most important philosophical category that characterizes the essence of human life. In different epochs, this phenomenon was perceived and interpreted differently, as the attitude towards a person changed, without which it is impossible to talk about this category. This article examines the category of freedom in the context of medieval tradition.

Ключевые слова: свобода, Средние века, человек, общество, религия, христианство.

Keywords: freedom, Middle Ages, man, society, religion, Christianity.

Freedom is the most important philosophical category characterizing the essence of human life. As Sartre wrote, the essence of existence is freedom [1, p. 79]. Freedom is the potential ability to choose among alternatives, arrived at through an inner or outer search for oneself, the ability to think and act in accordance with one's own ideas and desires [5, p. 159]. Freedom is embodied in the realization of possibilities through the choice of specific goals and plans of action. In different epochs this phenomenon was perceived and interpreted differently, as the attitude towards the human being changed. Let us consider the category of freedom in the context of the medieval tradition.

In the fabric of Western European history, the Middle Ages occupy a large span of time.


The economic order, the relations between classes, state institutions and legal institutions, and the spiritual climate of medieval society are the factors that influenced the content, differentiation and social orientation of the political and legal ideas of the Western European Middle Ages [2, p. 243]. Having begun its march across Europe, Christianity posed anew the problem of human freedom in general and the question of religious freedom in particular. The theme of social freedom, so topical in slaveholding society, is revealed by Christianity in the broadest sense as the problem of the equality of all people before God and among themselves, as beings created in the image of God. The totality of such views of the world had a significant effect on
