All issues
- 2025 Vol. 17
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
Stochastic transitions from order to chaos in a metapopulation model with migration
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 959-973
This paper focuses on the problem of modeling and analyzing dynamic regimes, both regular and chaotic, in systems of coupled populations in the presence of random disturbances. The discrete Ricker model is used as the initial deterministic population model. The paper examines the dynamics of two populations coupled by migration. Migration is proportional to the difference between the densities of the two populations, with a coupling coefficient responsible for the strength of the migration flow. Isolated population subsystems, modeled by the Ricker map, exhibit various dynamic modes, including equilibrium, periodic, and chaotic ones. In this study, the coupling coefficient is treated as a bifurcation parameter, while the parameters of the natural population growth rate remain fixed. Under these conditions, one subsystem is in the equilibrium mode, while the other exhibits chaotic behavior. The coupling of the two populations through migration creates new dynamic regimes that were not observed in the isolated model. This article aims to analyze the dynamics of the coupled system as the intensity of the flow between the population subsystems varies. The article presents a bifurcation analysis of the attractors in a deterministic model of two coupled populations, identifies zones of monostability and bistability, and gives examples of regular and chaotic attractors. The main focus of the work is on comparing the stability of dynamic regimes against random disturbances in the migration intensity. Noise-induced transitions from a periodic attractor to a chaotic attractor are identified and described using direct numerical simulation. Lyapunov exponents are used to analyze the stochastic phenomena. It is shown that in this model there is a range of the bifurcation parameter in which no transition from order to chaos occurs even as the intensity of the random perturbations increases. For the analytical study of noise-induced transitions, the stochastic sensitivity function technique and the confidence domain method are used. The paper demonstrates how this mathematical tool can be employed to predict the critical noise intensity that causes a periodic regime to transform into a chaotic one.
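Below is a minimal sketch of the kind of coupled map the abstract describes: two Ricker populations exchanging migrants at an intensity perturbed by noise. The growth rates r1 and r2, the coupling c, and the way the noise enters the migration term are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def ricker(x, r):
    """Ricker map: next density for current density x and growth rate r."""
    return x * np.exp(r * (1.0 - x))

def coupled_step(x, y, r1, r2, c, sigma, rng):
    """One step of two Ricker populations coupled by migration.

    The migration flow is proportional to the density difference, with
    coupling coefficient c; sigma scales a random disturbance of the
    migration intensity (illustrative noise model)."""
    flow = (c + sigma * rng.standard_normal()) * (y - x)
    return ricker(x + flow, r1), ricker(y - flow, r2)

rng = np.random.default_rng(0)
x, y = 0.5, 0.7
r1, r2 = 1.5, 3.0   # hypothetical rates: first subsystem regular, second chaotic
for _ in range(10_000):
    x, y = coupled_step(x, y, r1, r2, c=0.05, sigma=0.01, rng=rng)
print(x, y)
```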
-
Multistability for a mathematical model of a tritrophic system in a heterogeneous habitat
Computer Research and Modeling, 2025, v. 17, no. 5, pp. 923-939
We consider a spatiotemporal model of a tritrophic system describing the interaction between prey, predator, and superpredator in an environment with nonuniform resource distribution. The model incorporates superpredator omnivory (Intraguild Predation, IGP), diffusion, and directed migration (taxis), the latter modeled using a logarithmic function of resource availability and prey density. The primary focus is on analyzing the multistability of the system and the role of cosymmetry in the formation of continuous families of steady-state solutions. Using a numerical-analytical approach, we study both spatially homogeneous and inhomogeneous steady-state solutions. It is established that under additional relations between the parameters governing local predator interactions and the diffusion coefficients, the system exhibits cosymmetry, leading to the emergence of a family of stable steady-state solutions proportional to the resource function. We demonstrate that the cosymmetry is independent of the resource function in the case of a heterogeneous environment. The stability of stationary distributions is investigated using spectral methods. Violation of the cosymmetry conditions results in the breakdown of the solution family and the emergence of isolated equilibria, as well as prolonged transient dynamics reflecting the system’s “memory” of the vanished states. Depending on initial conditions and parameters, the system exhibits transitions to single-predator regimes (survival of either the predator or superpredator) or predator coexistence. Numerical experiments based on the method of lines, which involves finite-difference discretization in space and Runge – Kutta integration in time, confirm the system’s multistability and illustrate the disappearance of solution families when cosymmetry is broken.
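As an illustration of the method of lines mentioned at the end of the abstract, the sketch below integrates a strongly simplified 1D prey-predator reaction-diffusion system over a nonuniform resource. The kinetics, coefficients, and boundary conditions are assumptions for demonstration only and do not reproduce the paper's tritrophic IGP model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Method of lines: central differences in space, Runge-Kutta (RK45) in time,
# zero-flux boundaries on [0, L].  Kinetics and coefficients are illustrative.
N, L = 100, 1.0
h = L / (N - 1)
x = np.linspace(0.0, L, N)
resource = 1.0 + 0.5 * np.sin(np.pi * x)     # nonuniform resource distribution

def laplacian(v):
    lap = np.empty_like(v)
    lap[1:-1] = (v[2:] - 2.0 * v[1:-1] + v[:-2]) / h**2
    lap[0] = 2.0 * (v[1] - v[0]) / h**2       # Neumann (zero-flux) ends
    lap[-1] = 2.0 * (v[-2] - v[-1]) / h**2
    return lap

def rhs(t, z, d_u=1e-3, d_v=1e-3):
    u, v = z[:N], z[N:]                       # prey and predator densities
    du = d_u * laplacian(u) + u * (resource - u) - u * v
    dv = d_v * laplacian(v) + u * v - 0.5 * v
    return np.concatenate([du, dv])

z0 = np.concatenate([0.5 * resource, 0.1 * np.ones(N)])
sol = solve_ivp(rhs, (0.0, 200.0), z0, method="RK45", rtol=1e-6)
print(sol.y[:N, -1].round(3))                 # final prey distribution
```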
-
Current issues in computational modeling of thrombosis, fibrinolysis, and thrombolysis
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 975-995
The hemostasis system is one of the body’s key defense systems; it is present in all liquid tissues and is especially important in blood. The hemostatic response is triggered as a result of vessel injury. The interaction between specialized cells and humoral systems leads to the formation of the initial hemostatic clot, which stops bleeding. After that, the slow process of clot dissolution occurs. The formation of a hemostatic plug is a unique physiological process, because within several minutes the hemostatic system generates complex structures on a scale ranging from microns, for microvessel injury or damaged endothelial cell-cell contacts, to centimeters, for damaged systemic arteries. The hemostatic response depends on numerous coordinated processes, which include platelet adhesion and aggregation, granule secretion, platelet shape change, modification of the chemical composition of the lipid bilayer, clot contraction, and formation of the fibrin mesh due to activation of the blood coagulation cascade. Computer modeling is a powerful tool used to study this complex system at different levels of organization. This includes the study of intracellular signaling in platelets, modeling of the humoral systems of blood coagulation and fibrinolysis, and development of multiscale models of thrombus growth. There are two key issues in computer modeling in biology: the absence of an adequate physico-mathematical description of the existing experimental data, due to the complexity of the biological processes, and the high computational complexity of the models, which does not allow them to be used to test physiologically relevant scenarios. Here we discuss some key unresolved problems in the field, as well as the current progress in experimental research on hemostasis and thrombosis. New findings lead to a reevaluation of existing concepts and the development of novel computer models. We focus on arterial thrombosis, venous thrombosis, thrombosis in the microcirculation, and the problems of fibrinolysis and thrombolysis. We also briefly discuss the basic types of existing mathematical models, their computational complexity, and the principal issues in simulation of thrombus growth in arteries.
-
Modeling the impact of epidemic spread and lockdown on economy
Computer Research and Modeling, 2025, v. 17, no. 2, pp. 339-363
Epidemics severely destabilize economies by reducing productivity, weakening consumer spending, and overwhelming public infrastructure, often culminating in economic recessions. The COVID-19 pandemic underscored the critical role of nonpharmaceutical interventions, such as lockdowns, in containing infectious disease transmission. This study investigates how the progression of epidemics and the implementation of lockdown policies shape the economic well-being of populations. By integrating compartmental ordinary differential equation (ODE) models, the research analyzes the interplay between epidemic dynamics and economic outcomes, particularly focusing on how varying lockdown intensities influence both disease spread and population wealth. The findings reveal that epidemics inflict significant economic damage, but timely and stringent lockdowns can mitigate healthcare system overload by sharply reducing infection peaks and delaying the epidemic’s trajectory. However, carefully timed lockdown relaxation is equally vital to prevent resurgent outbreaks. The study identifies the key epidemiological thresholds, such as the transmission rate, the recovery rate, and the basic reproduction number $\mathfrak{R}_0$, that determine the effectiveness of lockdowns. Analytically, it pinpoints the optimal proportion of isolated individuals required to minimize total infections in scenarios where permanent immunity is assumed. Economically, the analysis quantifies lockdown impacts by tracking population wealth, demonstrating that economic outcomes depend heavily on the fraction of isolated individuals who remain economically productive. Higher proportions of productive individuals during lockdowns correlate with better wealth retention, even under fixed epidemic conditions. These insights equip policymakers with actionable frameworks for designing balanced lockdown strategies that curb disease spread while safeguarding economic stability during future health crises.
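A hedged sketch of the general modeling idea (a compartmental ODE with an isolated fraction plus a crude wealth index) is given below. The parameter values, the form of the economic term, and the split into productive and non-productive isolated individuals are illustrative assumptions, not the authors' equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir_lockdown(t, y, beta=0.4, gamma=0.1, q=0.3, p=0.5):
    """SIR-type dynamics with a fraction q of the population isolated.

    beta, gamma - transmission and recovery rates (illustrative values)
    q           - fraction of individuals kept in lockdown
    p           - share of isolated individuals who remain productive
    W is a crude wealth index: productive people add to it, infections
    subtract from it (toy bookkeeping, not the paper's economic model)."""
    S, I, R, W = y
    dS = -beta * (1.0 - q) * S * I
    dI = beta * (1.0 - q) * S * I - gamma * I
    dR = gamma * I
    dW = (1.0 - q) + p * q - 2.0 * I
    return [dS, dI, dR, dW]

R_eff = 0.4 / 0.1 * (1.0 - 0.3)   # reproduction number under lockdown
sol = solve_ivp(sir_lockdown, (0.0, 300.0), [0.99, 0.01, 0.0, 0.0], max_step=1.0)
print(f"R_eff = {R_eff:.2f}, peak infected fraction = {sol.y[1].max():.3f}")
```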
-
Running applications on a hybrid cluster
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 475-483
A hybrid cluster implies the use of computational devices with radically different architectures. Usually these are a conventional CPU architecture (e.g. x86_64) and a GPU architecture (e.g. NVIDIA CUDA). Creating and exploiting such a cluster requires some experience: in order to harness all the computational power of the described system and get substantial speedup for computational tasks, many factors should be taken into account. These factors include hardware characteristics (e.g. network infrastructure, the type of data storage, GPU architecture) as well as the software stack (e.g. MPI implementation, GPGPU libraries). So, in order to run scientific applications, GPU capabilities, software features, task size, and other factors should be considered.
This report discusses the opportunities and problems of hybrid computations. Some statistics from runs of test programs and applications are demonstrated. The main focus of interest is open-source applications (e.g. OpenFOAM) that support GPGPU (with some parts rewritten to use GPGPU directly or by replacing libraries).
There are several approaches to organizing heterogeneous computations for different GPU architectures, of which the CUDA library and the OpenCL framework are compared here. CUDA is becoming quite typical for hybrid systems with NVIDIA cards, but OpenCL offers portability, which can be a determining factor when choosing a framework for development. We also put emphasis on multi-GPU systems, which are often used to build hybrid clusters. Calculations were performed on a hybrid cluster of the SPbU computing center.
-
Biomathematical system of the nucleic acids description
Computer Research and Modeling, 2020, v. 12, no. 2, pp. 417-434
The article is devoted to the application of various methods of mathematical analysis, the search for patterns, and the study of the nucleotide composition of DNA sequences at the genomic level. New methods of mathematical biology that make it possible to detect and visualize the hidden ordering of genetic nucleotide sequences located in the chromosomes of cells of living organisms are described. The research is based on the work on algebraic biology of S. V. Petukhov, doctor of physical and mathematical sciences, who first introduced and justified new algebras and hypercomplex numerical systems describing genetic phenomena. This paper describes a new phase in the development of matrix methods in genetics for studying the properties of nucleotide sequences (and their physicochemical parameters), built on the principles of finite geometry. The aim of the study is to demonstrate the capabilities of the new algorithms and to discuss the discovered properties of genetic DNA and RNA molecules. The study includes three stages: parameterization, scaling, and visualization. Parameterization is the determination of the parameters taken into account, which are based on the structural and physicochemical properties of nucleotides as elementary components of the genome. Scaling plays the role of “focusing” and allows one to explore genetic structures at various scales. Visualization includes the selection of the axes of the coordinate system and the method of visual display. The algorithms presented in this work are put forward as a new toolkit for the development of research software for the analysis of long nucleotide sequences, with the ability to display genomes in parametric spaces of various dimensions. One of the significant results of the study is that new criteria were obtained for classifying the genomes of various living organisms in order to identify interspecific relationships. The new concept allows one to assess, visually and numerically, the variability of the physicochemical parameters of nucleotide sequences. It also makes it possible to substantiate the relationship between the parameters of DNA and RNA molecules and fractal geometric mosaics, and reveals the ordering and symmetry of polynucleotides, as well as their noise immunity. The results obtained justify the introduction of new terms: “genometry” as a methodology of computational strategies and “genometrica” as the specific parameters of a particular genome or nucleotide sequence. In connection with the results obtained, questions of biosemiotics and of the hierarchical levels of organization of living matter are raised.
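The sketch below illustrates the parameterization-scaling-visualization pipeline in the spirit of the abstract: each nucleotide is mapped to numeric physicochemical features, which are then averaged over windows for coarse-grained plotting. The specific feature axes and window size are assumptions, not the published scheme.

```python
import numpy as np

# Illustrative parameterization: each nucleotide is mapped to two numeric
# features (purine/pyrimidine class and number of hydrogen bonds).  These
# particular axes are an assumption, not the authors' exact scheme.
PARAMS = {
    "A": (1.0, 2.0),   # purine, 2 hydrogen bonds
    "G": (1.0, 3.0),   # purine, 3 hydrogen bonds
    "T": (-1.0, 2.0),  # pyrimidine, 2 hydrogen bonds
    "C": (-1.0, 3.0),  # pyrimidine, 3 hydrogen bonds
}

def parameterize(seq):
    """Turn a nucleotide string into an (N, 2) array of feature values."""
    return np.array([PARAMS[n] for n in seq.upper() if n in PARAMS])

def scale(points, window):
    """'Scaling' step: average the features over non-overlapping windows."""
    n = len(points) // window
    return points[: n * window].reshape(n, window, -1).mean(axis=1)

seq = "ATGCGTACGTTAGC" * 100                 # placeholder sequence
coarse = scale(parameterize(seq), window=16)
# coarse[:, 0] against coarse[:, 1] can now be plotted to inspect ordering
print(coarse[:3])
```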
-
Generating database schema from requirement specification based on natural language processing and large language model
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1703-1713
A Large Language Model (LLM) is an advanced artificial intelligence algorithm that utilizes deep learning methodologies and extensive datasets to process, understand, and generate human-like text. These models are capable of performing various tasks, such as summarization, content creation, translation, and predictive text generation, making them highly versatile in applications involving natural language understanding. Generative AI, often associated with LLMs, specifically focuses on creating new content, particularly text, by leveraging the capabilities of these models. Developers can harness LLMs to automate complex processes, such as extracting relevant information from system requirement documents and translating them into a structured database schema. This capability has the potential to streamline the database design phase, saving significant time and effort while ensuring that the resulting schema aligns closely with the given requirements. By integrating LLM technology with Natural Language Processing (NLP) techniques, the efficiency and accuracy of generating database schemas from textual requirement specifications can be significantly enhanced. The proposed tool will utilize these capabilities to read system requirement specifications, which may be provided as text descriptions or as Entity-Relationship Diagrams (ERDs). It will then analyze the input and automatically generate a relational database schema in the form of SQL commands. This innovation eliminates much of the manual effort involved in database design, reduces human errors, and accelerates development timelines. The aim of this work is to provide a tool that can be invaluable for software developers, database architects, and organizations aiming to optimize their workflow and seamlessly align technical deliverables with business requirements.
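A minimal sketch of such a pipeline is shown below: a requirement text is embedded into a prompt that asks for SQL DDL, and the language model's response is returned as the schema. The function call_llm is a placeholder for whatever model backend the tool actually uses, and the prompt wording is purely illustrative.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for the LLM backend actually used by the tool
    (e.g. a hosted chat-completion API); returns the generated text."""
    raise NotImplementedError("plug in a concrete model API here")

SCHEMA_PROMPT = """You are a database designer.
Read the following requirement specification and output a relational
schema as SQL CREATE TABLE statements with primary and foreign keys.

Requirements:
{requirements}
"""

def generate_schema(requirements: str) -> str:
    """Turn a textual requirement specification into SQL DDL via the LLM."""
    return call_llm(SCHEMA_PROMPT.format(requirements=requirements))

spec = ("The system stores customers and their orders. Each order belongs "
        "to one customer and contains one or more products.")
# generate_schema(spec) would return CREATE TABLE statements for
# customers, orders, products, and an order_items link table.
```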
-
Simulation of traffic flows based on the quasi-gasdynamic approach and the cellular automata theory using supercomputers
Computer Research and Modeling, 2024, v. 16, no. 1, pp. 175-194
The purpose of the study is to simulate the dynamics of traffic flows on city road networks as well as to systematize the current state of affairs in this area. The introduction states that the development of intelligent transportation systems as an integral part of modern transportation technologies is coming to the fore. The core of these systems contains adequate mathematical models that allow traffic to be simulated as close to reality as possible. The necessity of using supercomputers due to the large amount of calculations is also noted; therefore, the creation of special parallel algorithms is needed. The beginning of the article is devoted to an up-to-date classification of traffic flow models and a characterization of each class, including their distinctive features and relevant examples with links. Further, the main focus of the article is shifted towards the macroscopic and microscopic models developed by the authors and the place of these models in the aforementioned classification. The macroscopic model is based on the continuum approach and uses the ideology of quasi-gasdynamic systems of equations. Its advantages over existing models of this class are indicated. The model is presented in both one-dimensional and two-dimensional versions. Both versions feature the ability to study multi-lane traffic. In the two-dimensional version this is made possible by introducing the concept of “lateral” velocity, i.e., the speed of changing lanes. The latter version allows calculations to be carried out in a computational domain that corresponds to the actual geometry of the road. The section also presents the results of test simulations of vehicle dynamics on a road fragment with a local widening and on a road fragment with traffic lights, including several variants of traffic light regimes. In the first case, the calculations allow one to draw interesting conclusions about the impact of a road widening on road capacity as a whole, and in the second case, to select the optimal regime configuration to obtain the “green wave” effect. The microscopic model is based on the cellular automata theory and the single-lane Nagel – Schreckenberg model and is generalized by the authors to the multi-lane case. The model implements various behavioral strategies of drivers. Test computations for a real section of the transport network in the Moscow city center are presented. To achieve an adequate representation of vehicles moving through the network according to road traffic regulations, the authors implemented special algorithms adapted for parallel computing. The test calculations were performed on the K-100 supercomputer installed at the Centre of Collective Usage of KIAM RAS.
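For reference, the sketch below implements one update step of the classical single-lane Nagel – Schreckenberg cellular automaton on a ring road. The multi-lane generalization, driver strategies, and network topology described in the article are not reproduced, and the parameter values are illustrative.

```python
import numpy as np

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=None):
    """One update of the single-lane Nagel-Schreckenberg cellular automaton
    on a ring road.  pos and vel are integer arrays ordered so that car
    i+1 drives directly ahead of car i (no overtaking occurs)."""
    rng = rng or np.random.default_rng()
    gaps = (np.roll(pos, -1) - pos - 1) % road_len        # free cells ahead
    vel = np.minimum(vel + 1, v_max)                       # 1. acceleration
    vel = np.minimum(vel, gaps)                            # 2. braking
    slow = rng.random(len(vel)) < p_slow
    vel = np.where(slow, np.maximum(vel - 1, 0), vel)      # 3. random slowdown
    pos = (pos + vel) % road_len                           # 4. movement
    return pos, vel

rng = np.random.default_rng(1)
road_len, n_cars = 100, 20
pos = np.sort(rng.choice(road_len, n_cars, replace=False))
vel = np.zeros(n_cars, dtype=int)
for _ in range(1_000):
    pos, vel = nasch_step(pos, vel, road_len, rng=rng)
print("mean speed:", vel.mean())
```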
-
Valuation of machines at the random process of their degradation and premature sales
Computer Research and Modeling, 2024, v. 16, no. 3, pp. 797-815
A model of the process of using machinery and equipment is considered which takes into account the probabilistic nature of their operation and sale. It accounts for the possibility of random hidden failures, after which the condition of the machine deteriorates abruptly, as well as the randomly arising need for a premature sale of the machine (before the end of its service life), which, generally speaking, takes a random amount of time. The model is oriented toward assessing the market value and service life of machines in accordance with the International Valuation Standards. Strictly speaking, the market value of a used machine depends on its technical condition, but in practice appraisers take into account only its age, since generally accepted measures of the technical condition of machines do not yet exist. As a result, the market value of a used machine is assumed to be equal to the average market value of similar machines of the corresponding age. For these purposes, appraisers use coefficients that reflect the influence of the age of machines on their market value. Such coefficients are not always justified and take into account neither the degradation of the machine nor the probabilistic nature of the process of its use. The proposed model is based on the anticipation-of-benefits principle. In it, we characterize the state of the machine by the intensity of the benefits it brings. The machine is subjected to a compound Poisson failure process, and after a failure its condition abruptly worsens and may even reach its limit. Situations also arise that preclude further use of the machine by its owner. In such situations, the owner puts the machine up for sale before the end of its service life (prematurely), and the sale takes a random amount of time. The model allows us to take into account the influence of such situations, to construct an analytical relationship linking the market value of a machine with its condition, and to calculate the average coefficients of change in the market value of machines with age. It also makes it possible to take into account the influence of inflation and the scrap value of the machine. We have found that the rate of premature sales has a significant impact on the service life and market value of new and used machines. At the same time, the dependence of the market value of machines on age is largely determined by the coefficient of variation of the machines’ service life. The results obtained allow more reasonable estimates of the market value of machines, including for the purposes of the system of national accounts.
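The anticipation-of-benefits idea can be illustrated by a simple Monte Carlo sketch: simulate the benefit flow degraded by Poisson-arriving failures and cut short by an exponentially distributed premature-sale time, discount it, and average over paths. All rates below are illustrative assumptions, and the bookkeeping is far cruder than the analytical model of the paper.

```python
import numpy as np

def expected_value(n_paths=10_000, horizon=15.0, dt=0.05,
                   benefit0=1.0, fail_rate=0.3, damage=0.2,
                   sale_rate=0.05, discount=0.08, scrap=0.05, seed=0):
    """Monte Carlo estimate of the expected discounted benefit of a machine.

    Hidden failures arrive as a Poisson process (fail_rate) and each cuts
    the benefit intensity by 'damage'; a premature sale occurs at an
    exponentially distributed time (sale_rate), after which the owner
    receives only the scrap value.  All rates are illustrative."""
    rng = np.random.default_rng(seed)
    values = np.zeros(n_paths)
    for k in range(n_paths):
        benefit, t, value = benefit0, 0.0, 0.0
        sale_time = rng.exponential(1.0 / sale_rate)
        while t < min(horizon, sale_time) and benefit > 0.0:
            if rng.random() < fail_rate * dt:              # hidden failure
                benefit = max(benefit - damage, 0.0)
            value += benefit * np.exp(-discount * t) * dt   # discounted benefit
            t += dt
        if sale_time < horizon:                             # premature sale
            value += scrap * np.exp(-discount * sale_time)
        values[k] = value
    return values.mean()

print(expected_value())
```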
-
Changepoint detection in biometric data: retrospective nonparametric segmentation methods based on dynamic programming and sliding windows
Computer Research and Modeling, 2024, v. 16, no. 5, pp. 1295-1321
This paper is dedicated to the analysis of medical and biological data obtained through locomotor training and testing of astronauts conducted both on Earth and during spaceflight. These experiments can be described as the astronaut’s movement on a treadmill according to a predefined regimen in various speed modes. During these modes, not only the speed but also a range of parameters, including heart rate, ground reaction force, and others, are recorded. In order to analyze the dynamics of the astronaut’s condition over an extended period, it is necessary to perform a qualitative segmentation of the movement modes so as to assess the target metrics independently. This task becomes particularly relevant in the development of an autonomous life support system for astronauts that operates without direct supervision from Earth. The segmentation of the target data is complicated by the presence of various anomalies, such as deviations from the predefined regimen, arbitrary and varying durations of mode transitions, hardware failures, and other factors. The paper includes a detailed review of several contemporary retrospective (offline) nonparametric methods for detecting multiple changepoints, that is, sudden changes in the properties of the observed time series occurring at unknown moments. Special attention is given to algorithms and statistical measures that determine the homogeneity of the data and to methods for detecting change points. The paper considers approaches based on dynamic programming and on sliding windows. The second part of the paper focuses on the numerical evaluation of these methods using characteristic examples of experimental data, including both “simple” and “complex” speed profiles of movement. The analysis allowed us to identify the preferred methods, which will be further evaluated on the complete dataset. Preference is given to methods that ensure closeness of the markup to a reference one, can potentially detect both boundaries of transient processes, and are robust with respect to their internal parameters.
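As a minimal illustration of the sliding-window family of methods reviewed in the paper, the sketch below scores each point by the discrepancy between the left and right half-windows and picks local maxima above a threshold. The mean-shift score used here is a deliberately simple stand-in for the nonparametric cost functions discussed in the paper, and the synthetic speed profile is an assumption for demonstration.

```python
import numpy as np

def sliding_window_scores(x, w):
    """Score each point by the discrepancy between the w samples to its
    left and the w samples to its right (simple mean-shift statistic)."""
    x = np.asarray(x, dtype=float)
    scores = np.zeros(len(x))
    for t in range(w, len(x) - w):
        left, right = x[t - w: t], x[t: t + w]
        pooled = np.sqrt((left.var() + right.var()) / 2.0) + 1e-12
        scores[t] = abs(left.mean() - right.mean()) / pooled
    return scores

def detect_changepoints(x, w=50, threshold=3.0):
    """Flag local maxima of the window score above the threshold
    (crude peak picking, for illustration only)."""
    s = sliding_window_scores(x, w)
    return [t for t in range(1, len(s) - 1)
            if s[t] > threshold and s[t] >= s[t - 1] and s[t] >= s[t + 1]]

# Synthetic speed profile with two regime changes (walk - run - walk):
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(4.0, 0.3, 300),
                    rng.normal(8.0, 0.3, 300),
                    rng.normal(4.0, 0.3, 300)])
print(detect_changepoints(x))   # changepoints expected near 300 and 600
```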