-
Dynamical theory of information as a basis for natural-constructive approach to modeling a cognitive process
Computer Research and Modeling, 2017, v. 9, no. 3, pp. 433-447. The main statements and inferences of the Dynamic Theory of Information (DTI) are considered. It is shown that DTI makes it possible to reveal two essentially important types of information: objective (unconventional) and subjective (conventional) information. There are two ways of obtaining information: reception (perception of already existing information) and generation (production of new information). It is shown that the processes of generation and perception of information should proceed in two different subsystems of the same cognitive system. The main points of the Natural-Constructive Approach to modeling the cognitive process are discussed. It is shown that any neuromorphic approach faces the problem of the Explanatory Gap between the “Brain” and the “Mind”, i.e. the gap between objectively measurable information about the ensemble of neurons (“Brain”) and subjective information about human consciousness (“Mind”). The Natural-Constructive Cognitive Architecture developed within the framework of this approach is discussed. It is a complex block-hierarchical combination of several neuroprocessors. The main constructive feature of this architecture is the splitting of the whole system into two linked subsystems, by analogy with the hemispheres of the human brain. One subsystem is responsible for processing new information, learning, and creativity, i.e. for the generation of information. The other subsystem is responsible for processing already existing information, i.e. for the reception of information. It is shown that the lowest (zero) level of the hierarchy is represented by processors that record images of real objects (distributed memory) in response to sensory signals, which is objective information (and refers to the “Brain”). The next levels of the hierarchy are represented by processors containing symbols of the recorded images. It is shown that symbols represent subjective (conventional) information created by the system itself and providing its individuality. The highest levels of the hierarchy, containing the symbols of abstract concepts, provide the possibility to interpret the concepts of “consciousness”, “sub-consciousness”, and “intuition”, referring to the field of the “Mind”, in terms of the ensemble of neurons. Thus, DTI provides an opportunity to build a model that allows us to trace how the “Mind” could emerge based on the “Brain”.
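As a purely structural illustration of the two-subsystem, block-hierarchical organization described above, here is a minimal Python sketch; the class names, the level semantics and the build_architecture helper are assumptions introduced for illustration and do not reproduce the authors' architecture or its neuroprocessor dynamics.

```python
# Structural sketch only: two linked subsystems (generation vs. reception of
# information), each with a hierarchy of processor levels. Level 0 stands for
# the distributed memory of object images; higher levels hold symbols. Names
# and semantics are illustrative assumptions, not the authors' implementation.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ProcessorLevel:
    index: int                      # 0 = images of real objects, >0 = symbol levels
    contents: List[str] = field(default_factory=list)


@dataclass
class Subsystem:
    role: str                       # "generation" or "reception" of information
    levels: List[ProcessorLevel] = field(default_factory=list)


def build_architecture(n_levels: int = 3) -> List[Subsystem]:
    return [Subsystem(role, [ProcessorLevel(i) for i in range(n_levels)])
            for role in ("generation", "reception")]


if __name__ == "__main__":
    for sub in build_architecture():
        print(sub.role, [level.index for level in sub.levels])
```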
-
Tests for detecting the parallel organization of logical calculations, based on algebra and automata
Computer Research and Modeling, 2017, v. 9, no. 4, pp. 621-638. We construct new tests that make it possible to increase the human capacity for information processing through the parallel execution of several logical operations of a prescribed type. To check the causes of this increase in capacity, we develop control tests on the same class of logical operations, for which the parallel organization of the calculations is ineffective. We use the apparatus of universal algebra and automata theory. This article continues a cycle of works investigating the human capacity for parallel calculations; the main publications on this topic are given in the references. The tasks in the described tests can be defined as the calculation of the result of a sequence of operations of the same type from some algebra. If this operation is associative, then parallel calculation is effective through suitable grouping of the process. In the theory of computation this corresponds to the simultaneous work of several processors. Each processor transforms per unit time a certain known number of elements of the input data or of the intermediate results (the processor productivity). It is not known at present what kind of data elements the brain uses for logical or mathematical calculations, nor how many elements are processed per unit time. Therefore the test contains a sequence of presentations of tasks with different numbers of logical operations in a fixed alphabet; this number serves as the measure of task complexity. Analysis of the dependence of the solution time on the complexity makes it possible to estimate the processor productivity and the form of organization of the calculation. For sequential calculation only one processor works, and the solution time is a linear function of the complexity. If new processors begin to work in parallel as the complexity of the task increases, then the dependence of the solution time on the complexity is represented by a curve that is convex downward. To detect the situation in which a person increases the speed of a single processor as the complexity increases, we use series of tasks with similar operations, but in a non-associative algebra. In such tasks parallel calculation gives little gain in the sense of increasing efficiency by increasing the number of processors; this is the control set of tests. We also consider one more class of tests, based on the calculation of the trajectory of the states of a formal automaton for a given input sequence. We investigate a special class of automata (relays) for which the construction affects the effectiveness of parallel calculation of the final automaton state. For all tests we estimate the effectiveness of the parallel calculation. The article does not contain experimental results.
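The role of associativity described above can be illustrated with a small sketch: an associative operation over n operands can be regrouped into a balanced tree and evaluated in roughly log2(n) parallel steps, whereas a non-associative operation (subtraction is used below) must keep its left-to-right order and remains sequential. This is a generic illustration of the distinction, not the tests of the article; the function names are assumptions.

```python
# Minimal illustration: an associative operation over n operands can be
# regrouped into a balanced tree, so with enough processors it takes roughly
# log2(n) parallel steps; a non-associative operation (e.g. subtraction) must
# keep its left-to-right order and needs n-1 sequential steps.
def tree_reduce(op, xs):
    """Pairwise (tree) grouping; the result is valid only if op is associative."""
    level = list(xs)
    steps = 0
    while len(level) > 1:
        nxt = [op(level[i], level[i + 1]) for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:              # odd element is carried to the next level
            nxt.append(level[-1])
        level = nxt
        steps += 1                      # one parallel step per level
    return level[0], steps


def chain_reduce(op, xs):
    """Strict left-to-right evaluation, as required for a non-associative op."""
    acc = xs[0]
    for x in xs[1:]:
        acc = op(acc, x)
    return acc, len(xs) - 1


data = list(range(1, 17))                       # 16 operands
print(tree_reduce(lambda a, b: a + b, data))    # (136, 4): ~log2(16) parallel steps
print(chain_reduce(lambda a, b: a + b, data))   # (136, 15): sequential steps
# For subtraction the tree grouping would change the result, so only the
# sequential left-to-right order is correct:
print(chain_reduce(lambda a, b: a - b, data))   # (-134, 15)
```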
-
Hypergraph approach in the decomposition of complex technical systems
Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1007-1022. The article considers a mathematical model of the decomposition of a complex product into assembly units. This is an important engineering problem, which affects the organization of discrete production and its operational management. A review of modern approaches to the mathematical modeling and computer-aided design of decompositions is given. In these approaches, graphs, networks, matrices, etc. serve as mathematical models of the structures of technical systems. Such models describe the mechanical structure as a binary relation on a set of system elements. The geometrical coordination and integrity of machines and mechanical devices during the manufacturing process is achieved by means of basing. In general, basing can be performed on several elements simultaneously. Therefore, it represents a relation of variable arity, which cannot be correctly described in terms of binary mathematical structures. A new hypergraph model of the mechanical structure of a technical system is described. This model makes it possible to give an adequate formalization of assembly operations and processes. Assembly operations that are carried out by two working bodies and consist in the realization of mechanical connections are considered. Such operations are called coherent and sequential; this is the prevailing type of operation in modern industrial practice. It is shown that the mathematical description of such an operation is the normal contraction of an edge of the hypergraph. A sequence of contractions transforming the hypergraph into a point is a mathematical model of the assembly process. Two important theorems on the properties of contractible hypergraphs and their subgraphs, proved by the author, are presented. The concept of $s$-hypergraphs is introduced. $S$-hypergraphs are the correct mathematical models of the mechanical structures of any assembled technical systems. Decomposition of a product into assembly units is defined as the cutting of an $s$-hypergraph into $s$-subgraphs. The cutting problem is described in terms of discrete mathematical programming. Mathematical models of structural, topological and technological constraints are obtained. Objective functions are proposed that formalize the optimal choice of design solutions in various situations. The developed mathematical model of product decomposition is flexible and open. It allows for extensions that take into account the characteristics of the product and its production.
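As a rough illustration of the central operation, the sketch below contracts one hyperedge of a hypergraph stored as a set of frozensets: the vertices of the chosen edge are merged into a single new vertex and the remaining edges are rewritten through it. This is a generic edge contraction written for illustration; the normal contraction of the article and the defining conditions of $s$-hypergraphs impose additional restrictions that are not modeled here.

```python
# Generic sketch of contracting one hyperedge: its vertices are merged into a
# single new vertex and every other edge is rewritten through that vertex.
# The "normal contraction" of the article imposes extra conditions on which
# edges may be contracted; those conditions are not modeled here.
def contract_edge(vertices, edges, edge, merged):
    """vertices: set, edges: set of frozensets, edge: hyperedge to contract,
    merged: label of the new vertex replacing the vertices of edge."""
    assert edge in edges and edge <= vertices
    new_vertices = (vertices - edge) | {merged}
    new_edges = set()
    for e in edges:
        if e == edge:
            continue                        # the contracted edge disappears
        e2 = frozenset((merged if v in edge else v) for v in e)
        if len(e2) > 1:                     # drop edges that collapse to a point
            new_edges.add(e2)
    return new_vertices, new_edges


V = {"a", "b", "c", "d"}
E = {frozenset({"a", "b", "c"}), frozenset({"c", "d"}), frozenset({"a", "d"})}
# Contract the edge {a, b, c} into a single assembly unit "abc"; the result is
# the vertex set {abc, d} and the single edge {abc, d}.
print(contract_edge(V, E, frozenset({"a", "b", "c"}), "abc"))
```

Repeating such contractions until a single vertex remains corresponds to the assembly process mentioned in the abstract.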
-
Models of phytoplankton distribution over chlorophyll in various habitat conditions. Estimation of aquatic ecosystem bioproductivity
Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1177-1190. A model of phytoplankton abundance dynamics is proposed that depends on changes in the chlorophyll content of phytoplankton under the influence of changing environmental conditions. The model takes into account the dependence of biomass growth on environmental conditions as well as on photosynthetic chlorophyll activity. The light and dark stages of photosynthesis are distinguished. The processes of chlorophyll consumption during photosynthesis in the light and the growth of chlorophyll mass together with phytoplankton biomass are described. The model takes into account environmental conditions such as mineral nutrients, illumination and water temperature. The model is spatially distributed; the spatial variable corresponds to the mass fraction of chlorophyll in phytoplankton. Thereby the possible spread of chlorophyll content in phytoplankton is taken into consideration. The model calculates the density distribution of phytoplankton over the fraction of chlorophyll in it. In addition, the rate of production of new phytoplankton biomass is calculated. In parallel, point analogs of the distributed model are considered. The diurnal and seasonal (during the year) dynamics of the phytoplankton distribution over the chlorophyll fraction are demonstrated. The characteristics of the rate of primary production under daily or seasonally changing environmental conditions are indicated. Model characteristics of the dynamics of phytoplankton biomass growth show that in the light this growth is about twice as large as in the dark, which shows that illumination significantly affects the rate of production. The seasonal dynamics demonstrates an accelerated growth of biomass in spring and autumn. The spring maximum is associated with warming under the conditions of biogenic substances accumulated in winter, and the autumn maximum, slightly smaller, with the accumulation of nutrients during the summer decline in phytoplankton biomass. The biomass in summer decreases, again due to a deficiency of nutrients. Thus, in the presence of light, mineral nutrition plays the main role in phytoplankton dynamics.
In general, the model demonstrates dynamics of phytoplankton biomass qualitatively similar to classical concepts under daily and seasonal changes in the environment. The model appears suitable for assessing the bioproductivity of aquatic ecosystems. It can be supplemented with equations and terms for a more detailed description of the complex processes of photosynthesis. Introducing variables of the physical habitat space and coupling the model with satellite information on the water surface leads to model estimates of the bioproductivity of vast marine areas.
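A point analog of the kind mentioned above can be sketched, under strong simplifying assumptions, as a small system of equations for the biomass B, its chlorophyll mass C and a nutrient pool N, with growth limited by light and nutrients; every functional form and parameter value below is an illustrative assumption and not the calibrated model of the article.

```python
# Illustrative point-model sketch (not the article's calibrated model):
# biomass B grows at a rate limited by light I(t) and nutrient N and boosted by
# the chlorophyll-to-biomass ratio; chlorophyll C is consumed by photosynthesis
# in the light and synthesized together with new biomass.
import math


def light(t_hours):                        # simple day/night illumination
    return max(0.0, math.sin(2 * math.pi * t_hours / 24.0))


def step(B, C, N, t, dt=0.1,
         mu_max=0.08, K_N=0.5, k_chl=0.02, theta=0.03, m=0.01):
    I = light(t)
    mu = mu_max * I * (N / (K_N + N)) * (C / (theta * B + 1e-12))  # growth rate
    dB = (mu - m) * B
    dC = theta * dB - k_chl * I * C        # synthesis with growth, consumption in light
    dN = -0.1 * mu * B                     # nutrient uptake proportional to production
    return B + dB * dt, max(C + dC * dt, 0.0), max(N + dN * dt, 0.0)


B, C, N = 1.0, 0.03, 2.0
for i in range(240):                       # one day with dt = 0.1 h
    B, C, N = step(B, C, N, t=i * 0.1)
print(round(B, 3), round(C, 4), round(N, 3))
```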
-
Computational modeling of the thermal and physical processes in the high-temperature gas-cooled reactor
Computer Research and Modeling, 2023, v. 15, no. 4, pp. 895-906. The development of a high-temperature gas-cooled reactor (HTGR), constituting a part of a nuclear power-and-process station and intended for large-scale hydrogen production, is now in progress in the Russian Federation. One of the key objectives in the development of the high-temperature gas-cooled reactor is the computational justification of the accepted design.
The article gives a procedure for the computational analysis of the thermal and physical characteristics of the high-temperature gas-cooled reactor. The procedure is based on the use of state-of-the-art codes for personal computers (PC).
The objective of thermal and physical analysis of the reactor as a whole and of the core in particular was achieved in three stages. The idea of the first stage is to justify the neutron physical characteristics of the block-type core during burn-up with the use of the MCU-HTR code based on the Monte Carlo method. The second and the third stages are intended to study the coolant flow and the temperature condition of the reactor and the core in 3D with the required degree of detailing using the FlowVision and the ANSYS codes.
For the purpose of carrying out these analytical studies, computational models of the reactor flow path and the fuel assembly column were developed.
Based on the results of the computational modeling, the design of the support columns and the neutron physical characteristics of the fuel assembly were optimized. This resulted in a reduction of the total hydraulic resistance of the reactor and a decrease in the maximum temperature of the fuel elements.
The dependency of the maximum fuel temperature on the value of the power peaking factors determined by the arrangement of the absorber rods and of the compacts of burnable absorber in the fuel assembly is demonstrated.
-
On the kinetics of entropy of a system with discrete microscopic states
Computer Research and Modeling, 2023, v. 15, no. 5, pp. 1207-1236. An isolated system that possesses a discrete set of microscopic states is considered. The system performs spontaneous random transitions between the microstates. Kinetic equations for the probabilities of the system staying in various microstates are formulated. A general dimensionless expression for the entropy of such a system, which depends on the probability distribution, is considered. Two problems are stated: 1) to study the effect of possible unequal probabilities of different microstates, in particular when the system is in its internal equilibrium, on the value of the system entropy, and 2) to study the kinetics of the microstate probability distribution and the entropy evolution of the system in nonequilibrium states. The kinetics of the rates of transitions between the microstates is assumed to be first-order. Two variants of the effects of possible nonequiprobability of the microstates are considered: i) the microstates form two subgroups, the probabilities of which are similar within each subgroup but differ between the subgroups, and ii) the microstate probabilities vary arbitrarily around the point at which they are all equal. It is found that, under a fixed total number of microstates, the deviations of the entropy from the value corresponding to the equiprobable microstate distribution are extremely small. The latter is a rigorous substantiation of the known hypothesis about the equiprobability of microstates under thermodynamic equilibrium. On the other hand, based on several characteristic examples, it is shown that the structure of random transitions between the microstates exerts a considerable effect on the rate and mode of the establishment of the system's internal equilibrium, on the time dependence of entropy and on the expression for the entropy production rate. Under certain schemes of these transitions, fast and slow components may appear in the transients, and transients in the form of damped oscillations are possible. The condition of universality and stability of the equilibrium microstate distribution is that for any pair of microstates a sequence of transitions should exist that provides the passage from one microstate to the other and, consequently, any microstate traps should be absent.
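A minimal numerical illustration of the kinetics described above is the first-order master equation for the microstate probabilities together with the dimensionless entropy $S=-\sum_i p_i \ln p_i$; the transition-rate matrix below is an arbitrary example chosen only to show relaxation toward a stationary distribution and is not one of the schemes analyzed in the article.

```python
# First-order kinetics of microstate probabilities,
#   dp_i/dt = sum_j (k_ji p_j - k_ij p_i),
# and the dimensionless entropy S = -sum_i p_i ln p_i. The rate matrix is an
# arbitrary illustrative example, not one of the schemes analyzed in the article.
import numpy as np

k = np.array([[0.0, 1.0, 0.2],    # k[i][j] = rate of the transition i -> j
              [0.5, 0.0, 1.0],
              [0.3, 0.4, 0.0]])


def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))


def simulate(p0, dt=0.01, t_end=10.0):
    p = np.array(p0, dtype=float)
    trace = []
    for step in range(int(t_end / dt)):
        gain = k.T @ p                     # inflow into each state
        loss = k.sum(axis=1) * p           # outflow from each state
        p = p + dt * (gain - loss)         # total probability is conserved
        trace.append((step * dt, entropy(p)))
    return p, trace


p_final, trace = simulate([1.0, 0.0, 0.0])  # start from a single microstate (S = 0)
print(np.round(p_final, 4), round(trace[-1][1], 4))
```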
-
Modelling spatio-temporal dynamics of circadian rhythms in Neurospora crassa
Computer Research and Modeling, 2011, v. 3, no. 2, pp. 191-213. We derive a new model of circadian oscillations in Neurospora crassa, suitable for analyzing both the temporal and the spatial dynamics of the proteins responsible for the mechanism of the rhythms. The model is based on the nonlinear interplay between the proteins FRQ and WCC, which are products of transcription of the frequency and white collar genes and form a feedback loop comprising both positive and negative elements. The main component of the oscillation mechanism is supposed to be the time delay in the biochemical reactions of transcription. We show that the model accounts for various features observed in experiments on Neurospora, such as entrainment by light cycles, phase shift under a light pulse, robustness to fluctuations, and so on. Wave patterns excited during the spatial development of the system are studied. It is shown that a wave of synchronization of the biorhythms arises under basal transcription factors.
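The delay-based feedback mechanism can be illustrated with a generic delayed negative-feedback oscillator integrated by the Euler method with a history buffer; the equation and the parameter values below are a textbook-style stand-in chosen for illustration, not the FRQ/WCC model derived in the article.

```python
# Generic delayed negative-feedback oscillator as a stand-in for the delay
# mechanism discussed above; it is NOT the FRQ/WCC model of the article.
# Production is repressed by the variable's own value a delay tau ago.
from collections import deque


def simulate(tau=6.0, n=4, k_prod=1.0, k_deg=0.2, dt=0.01, t_end=200.0):
    lag = int(tau / dt)
    history = deque([0.1] * lag, maxlen=lag)   # constant pre-history x(t <= 0) = 0.1
    x = 0.1
    trace = []
    for step in range(int(t_end / dt)):
        x_delayed = history[0]                 # x(t - tau)
        dx = k_prod / (1.0 + x_delayed**n) - k_deg * x
        history.append(x)
        x += dx * dt
        trace.append((step * dt, x))
    return trace


trace = simulate()
xs = [x for _, x in trace]
# times of the last few local maxima of x(t): sustained oscillations appear
peaks = [round(t, 1) for (t, x), prev, nxt in zip(trace[1:-1], xs[:-2], xs[2:])
         if x > prev and x > nxt]
print(peaks[-3:])
```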
-
Solution of the problem of optimal control of the process of methanogenesis based on the Pontryagin maximum principle
Computer Research and Modeling, 2020, v. 12, no. 2, pp. 357-367. The paper presents a mathematical model that describes the process of obtaining biogas from livestock waste. This model describes the processes occurring in a biogas plant for mesophilic and thermophilic media, as well as for continuous and periodic modes of substrate inflow. The values of the coefficients of this model found earlier for the periodic mode, obtained by solving the problem of model identification from experimental data using a genetic algorithm, are given.
For the model of methanogenesis, an optimal control problem is formulated in the form of a Lagrange problem whose criterion functional is the biogas output over a certain period of time. The control parameter of the problem is the rate of substrate inflow into the biogas plant. An algorithm for solving this problem is proposed, based on the numerical implementation of the Pontryagin maximum principle. A hybrid genetic algorithm with an additional search in the vicinity of the best solution by the method of conjugate gradients was used as the optimization method. This numerical method for solving an optimal control problem is universal and applicable to a wide class of mathematical models.
In the course of the study, various modes of substrate feed to the digesters, temperature environments and types of raw materials were analyzed. It is shown that the rate of biogas production in the continuous feed mode is 1.4–1.9 times higher in the mesophilic medium (1.9–3.2 times in the thermophilic medium) than in the periodic mode over the period of complete fermentation, which is associated with a higher feed rate of the substrate and a greater concentration of nutrients in the substrate. However, the biogas yield over the period of complete fermentation in the periodic mode is twice as high as the yield over the period of a complete change of the substrate in the methane tank in the continuous mode, which means incomplete processing of the substrate in the second case. The rate of biogas formation for a thermophilic medium in the continuous mode and at the optimal rate of supply of raw materials is three times higher than for a mesophilic medium. Comparison of the biogas output for various types of raw materials shows that the highest biogas output is observed for poultry farm waste and the lowest for cattle farm waste, which is associated with the nutrient content in a unit of substrate of each type.
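For reference, the general structure of a Lagrange-type optimal control problem and of the Pontryagin maximum principle conditions used in such computations can be written as follows; this is the generic formulation, with $x$ standing for the state of the methanogenesis model and $u$ for the substrate inflow rate, while the specific right-hand sides of the article's model are not reproduced here.

```latex
% Generic Lagrange problem: maximize the output functional subject to the
% model dynamics, with the feed rate u(t) as the control.
\max_{u(\cdot)}\; J[u] = \int_{0}^{T} f_0\bigl(x(t),u(t)\bigr)\,dt,
\qquad \dot{x}(t) = f\bigl(x(t),u(t)\bigr),\quad x(0)=x_0,\quad u(t)\in U.

% Pontryagin maximum principle (normal case, free terminal state): Hamiltonian,
% adjoint system with transversality condition, and the maximum condition.
H(x,\psi,u) = f_0(x,u) + \psi^{\top} f(x,u),\qquad
\dot{\psi}(t) = -\frac{\partial H}{\partial x}\bigl(x(t),\psi(t),u^{*}(t)\bigr),\qquad
\psi(T)=0,\qquad
u^{*}(t) = \arg\max_{u\in U} H\bigl(x(t),\psi(t),u\bigr).
```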
-
System modeling, risk evaluation and optimization of a distributed computer system
Computer Research and Modeling, 2020, v. 12, no. 6, pp. 1349-1359. The article deals with the problem of the operational reliability of a distributed system. The system core is an open integration platform that provides the interaction of varied software for modeling gas transportation. Some of these components provide access through thin clients using the cloud technology “software as a service”. Mathematical models of operation, transmission and computing are intended to ensure the operation of an automated dispatching system for oil and gas transportation. The paper presents a system solution based on the theory of Markov random processes and considers the stage of stable operation. The stationary operation mode of the Markov chain with continuous time and discrete states is described by a system of Chapman–Kolmogorov equations with respect to the average numbers (mathematical expectations) of the objects in certain states. The objects of research are both system elements that are present in large numbers (thin clients and computing modules) and individual ones (a server and a network manager, i.e. a message broker). Together, they form interacting Markov random processes. The interaction is determined by the fact that the transition probabilities in one group of elements depend on the average numbers of elements of the other groups.
The authors propose a multi-criteria dispersion model of risk assessment for such systems (both in the broad and in the narrow sense, in accordance with the IEC standard). The risk is the standard deviation of an estimated object parameter from its average value. The dispersion risk model makes it possible to define optimality criteria and the risks of the functioning of the whole system. In particular, for a thin client the following are calculated: the risk of lost profit, the total risk of losses due to non-productive states of the element, and the total risk of losses over all system states.
Finally, the paper proposes compromise schemes for solving the multi-criteria problem of choosing the optimal operation strategy based on the selected set of compromise criteria.
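Schematically, the stationary balance (Chapman–Kolmogorov) equations for the average numbers of elements and the dispersion-type risk measure described above can be written in the following generic form; the concrete state sets and transition intensities of the article are not reproduced.

```latex
% Stationary balance equations for the average numbers \bar N_i of elements in
% state i, with transition intensities \lambda_{ij} (generic form), and the
% dispersion ("risk") of an estimated parameter X taken as its standard deviation.
\sum_{j \ne i} \lambda_{ji}\,\bar N_j \;=\; \bar N_i \sum_{j \ne i} \lambda_{ij}
\quad \text{for each state } i,
\qquad
R_X = \sqrt{\mathbb{E}\!\left[(X - \mathbb{E}[X])^{2}\right]}.
```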
-
Modelling of cytokine storm in respiratory viral infections
Computer Research and Modeling, 2022, v. 14, no. 3, pp. 619-645. In this work, we develop a model of the immune response to respiratory viral infections, taking into account some particular properties of the SARS-CoV-2 infection. The model represents a system of ordinary differential equations for the concentrations of epithelial cells, immune cells, virus and inflammatory cytokines. Conventional analysis of the existence and stability of stationary points is complemented by numerical simulations in order to study the dynamics of solutions. The behavior of solutions is characterized by large peaks of virus concentration specific for acute respiratory viral infections.
At the first stage, we study the innate immune response based on the protective properties of interferon secreted by virus-infected cells. On the other hand, viral infection down-regulates interferon production. Their competition can lead to bistability of the system, with regimes of infection progression of high or low intensity. In the case of an infection outbreak, the incubation period and the maximal viral load depend on the initial viral load and the parameters of the immune response. In particular, an increase of the initial viral load leads to a shorter incubation period and a higher maximal viral load.
In order to study the emergence and dynamics of the cytokine storm, we consider proinflammatory cytokines produced by cells of the innate immune response. Depending on the parameters of the model, the system can remain in the normal inflammatory state specific for viral infections or, due to positive feedback between inflammation and immune cells, pass to a cytokine storm characterized by excessive production of proinflammatory cytokines. Furthermore, inflammatory cell death can stimulate the transition to a cytokine storm; however, it cannot sustain it by itself without the innate immune response. The assumptions of the model and the obtained results are in qualitative agreement with experimental and clinical data.
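A toy two-variable caricature of the virus–interferon competition described above is sketched below: interferon inhibits viral replication while the virus down-regulates interferon production, which is enough to produce bistability between clearance and a high-intensity infection. The equations, parameter values and function names are illustrative assumptions and not the article's model.

```python
# Toy caricature of the virus-interferon competition: interferon I inhibits
# viral replication, and the virus V down-regulates interferon production.
# Equations and parameters are illustrative assumptions, not the article's
# model; they only demonstrate bistability with respect to the initial load.
def simulate(V0, I0=1.0, dt=0.01, t_end=100.0,
             k=2.0, Vmax=10.0, a=5.0, dV=0.5, b=1.0, c=2.0, dI=1.0):
    V, I = V0, I0
    for _ in range(int(t_end / dt)):
        dVdt = k * V * (1 - V / Vmax) / (1 + a * I) - dV * V   # inhibited replication
        dIdt = b / (1 + c * V) - dI * I                        # down-regulated production
        V = max(V + dVdt * dt, 0.0)
        I = max(I + dIdt * dt, 0.0)
    return V


print(round(simulate(V0=0.1), 3))   # small initial load: infection is cleared (V near 0)
print(round(simulate(V0=3.0), 3))   # large initial load: high-intensity stable level
```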