All issues
- 2025 Vol. 17
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
- Nonlinear modeling of oscillatory viscoelastic fluid with variable viscosity: a comparative analysis of dual solutions
Computer Research and Modeling, 2024, v. 16, no. 2, pp. 409-431
The viscoelastic fluid flow model across a porous medium has captivated the interest of many contemporary researchers due to its industrial and technical uses, such as food processing, paper and textile coating, packed bed reactors, the cooling effect of transpiration and the dispersion of pollutants through aquifers. This article focuses on the influence of variable viscosity and viscoelasticity on the magnetohydrodynamic oscillatory flow of a second-order fluid through thermally radiating wavy walls. A mathematical model for this fluid flow, including the governing equations and boundary conditions, is developed using the usual Boussinesq approximation. The governing equations are transformed into a system of nonlinear ordinary differential equations using non-similarity transformations. The numerical results, obtained by applying a finite-difference code based on the Lobatto IIIa formula as implemented in the bvp4c solver, are compared to the semi-analytical solutions for the velocity, temperature and concentration profiles obtained using the homotopy perturbation method (HPM). The effect of the flow parameters on the velocity, temperature and concentration profiles, the skin friction coefficient, and the heat and mass transfer rates is examined and illustrated graphically. The physical parameters governing the fluid flow profoundly affect the resultant flow profiles except in a few cases. Using the slope linear regression method, the importance of considering the viscosity variation parameter and its interaction with the Lorentz force in determining the velocity behavior of the viscoelastic fluid model is highlighted. The percentage increase in the velocity profile of the viscoelastic model is calculated for different ranges of the viscosity variation parameter. Finally, the results are validated numerically for the skin friction coefficient and Nusselt number profiles.
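The abstract above cross-checks a bvp4c-based numerical solution against the homotopy perturbation method. As a hedged illustration of the same kind of workflow in open-source tooling, the sketch below solves a toy two-point boundary value problem with SciPy's collocation solver solve_bvp; the equation, the parameter Pr and the boundary values are placeholders chosen for demonstration, not the governing equations of the article.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Illustrative two-point boundary value problem (not the paper's equations):
# theta'' + Pr * theta' = 0 on [0, 1], theta(0) = 1, theta(1) = 0,
# a toy "temperature profile" used only to show the solver workflow.
Pr = 2.0

def odes(x, y):
    # y[0] = theta, y[1] = theta'
    return np.vstack([y[1], -Pr * y[1]])

def bc(ya, yb):
    # theta(0) = 1, theta(1) = 0
    return np.array([ya[0] - 1.0, yb[0]])

x = np.linspace(0.0, 1.0, 11)   # initial mesh
y0 = np.zeros((2, x.size))      # initial guess
sol = solve_bvp(odes, bc, x, y0)

print("converged:", sol.status == 0)
print("theta(0.5) =", sol.sol(0.5)[0])
```

In practice the transformed momentum, energy and concentration equations would be stacked into one first-order system and passed to the solver in the same way.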
- Sensitivity analysis and semi-analytical solution for analyzing the dynamics of coffee berry disease
Computer Research and Modeling, 2024, v. 16, no. 3, pp. 731-753
Coffee berry disease (CBD), caused by the fungal pathogen Colletotrichum kahawae, poses a severe risk to coffee crops worldwide. Attacking coffee berries, it triggers substantial economic losses in regions relying heavily on coffee cultivation. The devastating impact extends beyond agricultural losses, affecting livelihoods and trade economies. Experimental insights into coffee berry disease provide crucial information on its pathogenesis, progression, and potential mitigation strategies, offering valuable knowledge to safeguard the global coffee industry. In this paper, we investigate a mathematical model of coffee berry disease, focusing on the dynamics of the coffee plant and Colletotrichum kahawae pathogen populations, categorized as susceptible, exposed, infected, pathogenic, and recovered (SEIPR) individuals. To solve the system of nonlinear differential equations and obtain a semi-analytical solution of the coffee berry disease model, a novel analytical approach combining the Shehu transformation, the Akbari – Ganji method, and the Pade approximation (SAGPM) was utilized. A comparison of the analytical results with numerical simulations demonstrates that the novel SAGPM offers excellent efficiency and accuracy. Furthermore, a sensitivity analysis of the coffee berry disease model examines the effects of all parameters on the basic reproduction number $R_0$. Moreover, in order to examine the behavior of the model individuals, we varied some parameters of the CBD model. Through this analysis, we obtained valuable insights into the responses of the coffee berry disease model under various conditions and scenarios. This research offers valuable insights into the utilization of SAGPM and sensitivity analysis for analyzing epidemiological models, providing significant utility for researchers in the field.
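Since the abstract names the compartments (SEIPR) but not the equations, the following is only a structural sketch of how such a compartmental model is typically integrated numerically; every rate constant and coupling term here is an assumption for demonstration, not the system solved with SAGPM in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative SEIPR-type compartmental system (structure and rates are
# assumptions for demonstration; the article's actual equations differ).
beta, sigma, gamma, delta, mu = 0.4, 0.2, 0.1, 0.05, 0.02

def seipr(t, y):
    S, E, I, P, R = y
    dS = -beta * S * P             # susceptible plants infected via pathogen load
    dE = beta * S * P - sigma * E  # exposed plants progress to the infected stage
    dI = sigma * E - gamma * I     # infected plants recover or shed pathogen
    dP = delta * I - mu * P        # pathogen produced by infected plants, then decays
    dR = gamma * I                 # recovered plants
    return [dS, dE, dI, dP, dR]

y0 = [0.99, 0.0, 0.01, 0.0, 0.0]
sol = solve_ivp(seipr, (0.0, 200.0), y0, dense_output=True)
print("final infected fraction:", sol.y[2, -1])
```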
- Stochastic transitions from order to chaos in a metapopulation model with migration
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 959-973
This paper focuses on the problem of modeling and analyzing dynamic regimes, both regular and chaotic, in systems of coupled populations in the presence of random disturbances. The discrete Ricker model is used as the initial deterministic population model. The paper examines the dynamics of two populations coupled by migration. Migration is proportional to the difference between the densities of the two populations, with a coupling coefficient responsible for the strength of the migration flow. Isolated population subsystems, modeled by the Ricker map, exhibit various dynamic modes, including equilibrium, periodic, and chaotic ones. In this study, the coupling coefficient is treated as a bifurcation parameter, while the parameters of the natural population growth rate remain fixed. Under these conditions, one subsystem is in the equilibrium mode, while the other exhibits chaotic behavior. The coupling of the two populations through migration creates new dynamic regimes that were not observed in the isolated model. This article aims to analyze the dynamics of the coupled system as the flow intensity between the population subsystems is varied. The article presents a bifurcation analysis of the attractors in the deterministic model of two coupled populations, identifies zones of monostability and bistability, and gives examples of regular and chaotic attractors. The main focus of the work is on comparing the stability of dynamic regimes against random disturbances in the migration intensity. Noise-induced transitions from a periodic attractor to a chaotic attractor are identified and described using direct numerical simulation. The Lyapunov exponents are used to analyze the stochastic phenomena. It is shown that in this model there is a range of the bifurcation parameter in which, even with an increase in the intensity of random perturbations, there is no transition from order to chaos. For the analytical study of noise-induced transitions, the stochastic sensitivity function technique and the confidence domain method are used. The paper demonstrates how this mathematical tool can be employed to predict the critical noise intensity that causes a periodic regime to transform into a chaotic one.
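The deterministic core described above, two Ricker maps coupled by density-difference migration with a randomly perturbed coupling coefficient, is compact enough to sketch directly; the growth rates, coupling strength and noise level below are illustrative values, not the ones analyzed in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Ricker subsystems coupled by migration proportional to the density
# difference; r1, r2, the coupling k and the noise level eps are illustrative
# choices, not the parameter values used in the article.
r1, r2 = 1.5, 3.2      # one subsystem near equilibrium, the other chaotic
k, eps = 0.05, 0.01    # migration coefficient and noise intensity
steps = 5000

x = np.empty(steps); y = np.empty(steps)
x[0], y[0] = 0.5, 0.9
for n in range(steps - 1):
    kn = k + eps * rng.standard_normal()   # random perturbation of the migration intensity
    mig = kn * (y[n] - x[n])
    x[n + 1] = x[n] * np.exp(r1 * (1.0 - x[n])) + mig
    y[n + 1] = y[n] * np.exp(r2 * (1.0 - y[n])) - mig

print("last few states of subsystem 1:", x[-3:])
```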
- Multistability for a mathematical model of a tritrophic system in a heterogeneous habitat
Computer Research and Modeling, 2025, v. 17, no. 5, pp. 923-939
We consider a spatiotemporal model of a tritrophic system describing the interaction between prey, predator, and superpredator in an environment with nonuniform resource distribution. The model incorporates superpredator omnivory (Intraguild Predation, IGP), diffusion, and directed migration (taxis), the latter modeled using a logarithmic function of resource availability and prey density. The primary focus is on analyzing the multistability of the system and the role of cosymmetry in the formation of continuous families of steady-state solutions. Using a numerical-analytical approach, we study both spatially homogeneous and inhomogeneous steady-state solutions. It is established that under additional relations between the parameters governing local predator interactions and the diffusion coefficients, the system exhibits cosymmetry, leading to the emergence of a family of stable steady-state solutions proportional to the resource function. We demonstrate that the cosymmetry is independent of the resource function in the case of a heterogeneous environment. The stability of the stationary distributions is investigated using spectral methods. Violation of the cosymmetry conditions results in the breakdown of the solution family and the emergence of isolated equilibria, as well as prolonged transient dynamics reflecting the system’s “memory” of the vanished states. Depending on the initial conditions and parameters, the system exhibits transitions to single-predator regimes (survival of either the predator or the superpredator) or predator coexistence. Numerical experiments based on the method of lines, which involves finite-difference discretization in space and Runge – Kutta integration in time, confirm the system’s multistability and illustrate the disappearance of solution families when cosymmetry is broken.
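As a minimal sketch of the method-of-lines discretization mentioned at the end of the abstract, the code below integrates a single diffusion-reaction equation with a heterogeneous resource function; the article's model has three coupled equations with taxis terms, so this is only an assumption-laden illustration of the numerical scheme, not the tritrophic system itself.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Method-of-lines sketch: finite differences in space, an ODE integrator in time.
# A single logistic-diffusion equation with a nonuniform resource p(x) is used
# purely to illustrate the discretization.
N, L, D = 100, 1.0, 1e-3
x = np.linspace(0.0, L, N)
h = x[1] - x[0]
p = 1.0 + 0.5 * np.cos(2 * np.pi * x / L)   # heterogeneous resource function

def rhs(t, u):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / h**2
    lap[0] = 2 * (u[1] - u[0]) / h**2        # zero-flux boundary at x = 0
    lap[-1] = 2 * (u[-2] - u[-1]) / h**2     # zero-flux boundary at x = L
    return D * lap + u * (p - u)             # diffusion + logistic growth

u0 = 0.1 * np.ones(N)
sol = solve_ivp(rhs, (0.0, 50.0), u0, method="RK45")
print("steady-state density (first cells) tracks the resource:", sol.y[:5, -1])
```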
- Current issues in computational modeling of thrombosis, fibrinolysis, and thrombolysis
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 975-995
The hemostasis system is one of the body’s key defense systems; it is present in all liquid tissues and is especially important in blood. The hemostatic response is triggered by vessel injury. The interaction between specialized cells and humoral systems leads to the formation of the initial hemostatic clot, which stops bleeding. After that, the slow process of clot dissolution occurs. The formation of the hemostatic plug is a unique physiological process, because within several minutes the hemostatic system generates complex structures on scales ranging from microns, for microvessel injury or damaged endothelial cell-cell contacts, to centimeters, for damaged systemic arteries. The hemostatic response depends on numerous coordinated processes, which include platelet adhesion and aggregation, granule secretion, platelet shape change, modification of the chemical composition of the lipid bilayer, clot contraction, and formation of the fibrin mesh due to activation of the blood coagulation cascade. Computer modeling is a powerful tool used to study this complex system at different levels of organization. This includes the study of intracellular signaling in platelets, modeling of the humoral systems of blood coagulation and fibrinolysis, and development of multiscale models of thrombus growth. There are two key issues in computer modeling in biology: the absence of an adequate physico-mathematical description of the existing experimental data due to the complexity of the biological processes, and the high computational complexity of the models, which prevents their use for testing physiologically relevant scenarios. Here we discuss some key unresolved problems in the field, as well as the current progress in experimental research on hemostasis and thrombosis. New findings lead to a reevaluation of existing concepts and to the development of novel computer models. We focus on arterial thrombosis, venous thrombosis, thrombosis in the microcirculation, and the problems of fibrinolysis and thrombolysis. We also briefly discuss the basic types of existing mathematical models, their computational complexity, and principal issues in the simulation of thrombus growth in arteries.
- Modeling the impact of epidemic spread and lockdown on economy
Computer Research and Modeling, 2025, v. 17, no. 2, pp. 339-363
Epidemics severely destabilize economies by reducing productivity, weakening consumer spending, and overwhelming public infrastructure, often culminating in economic recessions. The COVID-19 pandemic underscored the critical role of nonpharmaceutical interventions, such as lockdowns, in containing infectious disease transmission. This study investigates how the progression of epidemics and the implementation of lockdown policies shape the economic well-being of populations. By integrating compartmental ordinary differential equation (ODE) models, the research analyzes the interplay between epidemic dynamics and economic outcomes, focusing in particular on how varying lockdown intensities influence both disease spread and population wealth. The findings reveal that epidemics inflict significant economic damage, but timely and stringent lockdowns can mitigate healthcare system overload by sharply reducing infection peaks and delaying the epidemic’s trajectory. However, carefully timed lockdown relaxation is equally vital to prevent resurgent outbreaks. The study identifies key epidemiological thresholds, such as transmission rates, recovery rates, and the basic reproduction number ($\mathfrak{R}_0$), that determine the effectiveness of lockdowns. Analytically, it pinpoints the optimal proportion of isolated individuals required to minimize total infections in scenarios where permanent immunity is assumed. Economically, the analysis quantifies lockdown impacts by tracking population wealth, demonstrating that economic outcomes depend heavily on the fraction of isolated individuals who remain economically productive. Higher proportions of productive individuals during lockdowns correlate with better wealth retention, even under fixed epidemic conditions. These insights equip policymakers with actionable frameworks to design balanced lockdown strategies that curb disease spread while safeguarding economic stability during future health crises.
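A minimal sketch of the kind of coupled epidemic-economy bookkeeping the abstract describes is given below: an SIR-type model with an isolated compartment and a wealth variable driven by the productive population. The lockdown fraction, the productivity of isolated individuals and all rates are assumptions for illustration, not the parameter values or exact equations of the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative SIR model with an isolated (locked-down) compartment Liso and a
# simple wealth equation; all values below are demonstration assumptions.
beta, gamma = 0.35, 0.1      # transmission and recovery rates (R0 = beta / gamma)
q = 0.4                      # fraction of susceptibles placed in lockdown
prod_isolated = 0.5          # relative economic productivity of isolated individuals
wage = 1.0

def model(t, y):
    S, Liso, I, R, W = y
    dS = -beta * S * I
    dL = 0.0                                  # isolated people stay isolated in this sketch
    dI = beta * S * I - gamma * I
    dR = gamma * I
    dW = wage * (S + R + prod_isolated * Liso)  # the sick do not contribute to wealth
    return [dS, dL, dI, dR, dW]

y0 = [0.99 * (1 - q), 0.99 * q, 0.01, 0.0, 0.0]
sol = solve_ivp(model, (0.0, 300.0), y0, dense_output=True)
print("peak infected fraction:", sol.y[2].max())
print("accumulated wealth:", sol.y[4, -1])
```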
- Running applications on a hybrid cluster
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 475-483
A hybrid cluster implies the use of computational devices with radically different architectures: usually a conventional CPU architecture (e.g. x86_64) and a GPU architecture (e.g. NVIDIA CUDA). Creating and exploiting such a cluster requires some experience: to harness the full computational power of such a system and obtain substantial speedup for computational tasks, many factors have to be taken into account. These factors include hardware characteristics (e.g. network infrastructure, type of data storage, GPU architecture) as well as the software stack (e.g. MPI implementation, GPGPU libraries). So, in order to run scientific applications, GPU capabilities, software features, task size and other factors should be considered.
This report discusses the opportunities and problems of hybrid computations. Some statistics from runs of test programs and applications are presented. The main focus is on open source applications (e.g. OpenFOAM) that support GPGPU (with some parts rewritten to use GPGPU directly or by replacing libraries).
There are several approaches to organizing heterogeneous computations for different GPU architectures; of these, the CUDA library and the OpenCL framework are compared. The CUDA library has become quite typical for hybrid systems with NVIDIA cards, but OpenCL offers portability, which can be a determining factor when choosing a framework for development. We also put emphasis on multi-GPU systems, which are often used to build hybrid clusters. Calculations were performed on a hybrid cluster of the SPbU computing center.
- Biomathematical system of the nucleic acids description
Computer Research and Modeling, 2020, v. 12, no. 2, pp. 417-434
The article is devoted to the application of various methods of mathematical analysis, the search for patterns, and the study of the nucleotide composition of DNA sequences at the genomic level. New methods of mathematical biology that make it possible to detect and visualize the hidden ordering of genetic nucleotide sequences located in the chromosomes of cells of living organisms are described. The research is based on the work on algebraic biology of S. V. Petukhov, Doctor of Physical and Mathematical Sciences, who first introduced and justified new algebras and hypercomplex numerical systems describing genetic phenomena. This paper describes a new phase in the development of matrix methods in genetics for studying the properties of nucleotide sequences (and their physicochemical parameters), built on the principles of finite geometry. The aim of the study is to demonstrate the capabilities of the new algorithms and to discuss the discovered properties of genetic DNA and RNA molecules. The study includes three stages: parameterization, scaling, and visualization. Parameterization is the determination of the parameters taken into account, which are based on the structural and physicochemical properties of nucleotides as elementary components of the genome. Scaling plays the role of “focusing” and allows genetic structures to be explored at various scales. Visualization includes the selection of the axes of the coordinate system and the method of visual display. The algorithms presented in this work are put forward as a new toolkit for the development of research software for the analysis of long nucleotide sequences, with the ability to display genomes in parametric spaces of various dimensions. One of the significant results of the study is that new criteria were obtained for classifying the genomes of various living organisms in order to identify interspecific relationships. The new concept allows the variability of the physicochemical parameters of nucleotide sequences to be assessed both visually and numerically. This concept also makes it possible to relate the parameters of DNA and RNA molecules to fractal geometric mosaics, and it reveals the ordering and symmetry of polynucleotides, as well as their noise immunity. The results obtained justified the introduction of new terms: “genometry” as a methodology of computational strategies and “genometrica” as the specific parameters of a particular genome or nucleotide sequence. In connection with the results obtained, questions of biosemiotics and of the hierarchical levels of organization of living matter are raised.
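To make the three-stage pipeline (parameterization, scaling, visualization) concrete, here is a generic sketch of a 2D "genomic walk"; the numeric encoding of nucleotides and the windowed averaging are common illustrative choices, not the specific parameters or algorithms introduced in the article.

```python
import numpy as np

# Generic parameterization of nucleotides by two structural/physicochemical
# traits (purine/pyrimidine = +1/-1, strong/weak pairing = +1/-1). These are
# illustrative assumptions, not the article's parameter set.
PARAMS = {
    "A": (+1, -1),  # purine, weak pairing (2 hydrogen bonds)
    "G": (+1, +1),  # purine, strong pairing (3 hydrogen bonds)
    "T": (-1, -1),  # pyrimidine, weak pairing
    "C": (-1, +1),  # pyrimidine, strong pairing
}

def parameterize(seq):
    """Map a nucleotide string to a (length, 2) array of numeric parameters."""
    return np.array([PARAMS[nt] for nt in seq if nt in PARAMS], dtype=float)

def scale(track, window):
    """'Focusing' step: average the parameter track over non-overlapping windows."""
    n = len(track) // window * window
    return track[:n].reshape(-1, window, track.shape[1]).mean(axis=1)

seq = "ATGCGGATTACAGGCTTACCGGT" * 10                    # placeholder sequence
coords = np.cumsum(scale(parameterize(seq), window=4), axis=0)
print(coords[:5])   # points of a 2D "genomic walk" ready for plotting
```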
- Generating database schema from requirement specification based on natural language processing and large language model
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1703-1713
A Large Language Model (LLM) is an advanced artificial intelligence algorithm that utilizes deep learning methodologies and extensive datasets to process, understand, and generate human-like text. These models are capable of performing various tasks, such as summarization, content creation, translation, and predictive text generation, making them highly versatile in applications involving natural language understanding. Generative AI, often associated with LLMs, specifically focuses on creating new content, particularly text, by leveraging the capabilities of these models. Developers can harness LLMs to automate complex processes, such as extracting relevant information from system requirement documents and translating it into a structured database schema. This capability has the potential to streamline the database design phase, saving significant time and effort while ensuring that the resulting schema aligns closely with the given requirements. By integrating LLM technology with Natural Language Processing (NLP) techniques, the efficiency and accuracy of generating database schemas from textual requirement specifications can be significantly enhanced. The proposed tool utilizes these capabilities to read system requirement specifications, which may be provided as text descriptions or as Entity-Relationship Diagrams (ERDs). It then analyzes the input and automatically generates a relational database schema in the form of SQL commands. This innovation eliminates much of the manual effort involved in database design, reduces human errors, and accelerates development timelines. The aim of this work is to provide a tool that can be invaluable for software developers, database architects, and organizations aiming to optimize their workflows and align technical deliverables with business requirements seamlessly.
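A hedged sketch of the pipeline the abstract outlines: pass a textual requirement specification to an LLM and return SQL DDL. The function call_llm is a hypothetical placeholder for whatever chat-completion client is actually used (here it returns a canned answer so the script runs offline), and the prompt wording is an assumption rather than the paper's.

```python
# Sketch only: the prompt, the helper call_llm and the example spec are
# illustrative assumptions, not the tool described in the article.

PROMPT_TEMPLATE = """You are a database designer.
Read the following system requirement specification and output only SQL
CREATE TABLE statements (with primary and foreign keys) that implement it.

Requirements:
{spec}
"""

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM client (e.g. an HTTP chat-completion call).
    Returns a canned answer here so the sketch runs without network access."""
    return ("CREATE TABLE member (id INTEGER PRIMARY KEY, name TEXT);\n"
            "CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT);\n"
            "CREATE TABLE loan (id INTEGER PRIMARY KEY,\n"
            "  member_id INTEGER REFERENCES member(id),\n"
            "  book_id INTEGER REFERENCES book(id), due_date DATE);")

def requirements_to_schema(spec: str) -> str:
    """Turn a requirement specification into SQL DDL via the LLM."""
    return call_llm(PROMPT_TEMPLATE.format(spec=spec))

if __name__ == "__main__":
    spec = "A library lends books to members; each loan records a due date."
    print(requirements_to_schema(spec))   # expected: CREATE TABLE ... statements
```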
- Simulation of traffic flows based on the quasi-gasdynamic approach and the cellular automata theory using supercomputers
Computer Research and Modeling, 2024, v. 16, no. 1, pp. 175-194
The purpose of the study is to simulate the dynamics of traffic flows on city road networks and to systematize the current state of affairs in this area. The introduction states that the development of intelligent transportation systems, as an integral part of modern transportation technologies, is coming to the fore. The core of these systems contains adequate mathematical models that allow traffic to be simulated as close to reality as possible. The need for supercomputers due to the large amount of computation is also noted; therefore, special parallel algorithms have to be created. The beginning of the article is devoted to an up-to-date classification of traffic flow models and a characterization of each class, including their distinctive features and relevant examples with references. The main focus of the article then shifts toward the macroscopic and microscopic models developed by the authors and the place of these models in the aforementioned classification. The macroscopic model is based on the continuum approach and uses the ideology of quasi-gasdynamic systems of equations. Its advantages over existing models of this class are indicated. The model is presented in both one-dimensional and two-dimensional versions. Both versions allow the study of multi-lane traffic. In the two-dimensional version this is made possible by introducing the concept of “lateral” velocity, i.e., the speed of changing lanes. The latter version allows calculations to be carried out in a computational domain that corresponds to the actual geometry of the road. The section also presents test results of modeling vehicle dynamics on a road fragment with a local widening and on a road fragment with traffic lights, including several variants of traffic light regimes. In the first case, the calculations allow interesting conclusions to be drawn about the impact of a road widening on the road capacity as a whole, and in the second case, the optimal regime configuration can be selected to obtain the “green wave” effect. The microscopic model is based on the cellular automata theory and the single-lane Nagel – Schreckenberg model, generalized for the multi-lane case by the authors of the article. The model implements various behavioral strategies of drivers. Test computations for a real transport network section in the Moscow city center are presented. To achieve an adequate representation of vehicles moving through the network according to road traffic regulations, the authors implemented special algorithms adapted for parallel computing. Test calculations were performed on the K-100 supercomputer installed at the Centre of Collective Usage of KIAM RAS.
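The microscopic model builds on the single-lane Nagel – Schreckenberg automaton, whose standard update rules (accelerate, keep a safe gap, brake randomly, move) can be stated in a few lines; the road length, density, maximum speed and braking probability below are illustrative values, not those of the Moscow test case or of the authors' multi-lane generalization.

```python
import numpy as np

rng = np.random.default_rng(1)

# Standard single-lane Nagel – Schreckenberg cellular automaton on a periodic road.
L, density, v_max, p_brake = 200, 0.2, 5, 0.3
n_cars = int(L * density)

pos = np.sort(rng.choice(L, n_cars, replace=False))   # cell index of each car
vel = np.zeros(n_cars, dtype=int)

def step(pos, vel):
    gaps = (np.roll(pos, -1) - pos - 1) % L            # empty cells ahead of each car
    vel = np.minimum(vel + 1, v_max)                   # 1) accelerate
    vel = np.minimum(vel, gaps)                        # 2) slow down to avoid collisions
    brake = rng.random(n_cars) < p_brake
    vel = np.maximum(vel - brake, 0)                   # 3) random braking
    pos = (pos + vel) % L                              # 4) move
    return pos, vel

for _ in range(1000):
    pos, vel = step(pos, vel)
print("mean speed:", vel.mean())
```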