-
Computational algorithm for solving the nonlinear boundary-value problem of hydrogen permeability with dynamic boundary conditions and concentration-dependent diffusion coefficient
Computer Research and Modeling, 2024, v. 16, no. 5, pp. 1179-1193
The article deals with the nonlinear boundary-value problem of hydrogen permeability corresponding to the following experiment. A membrane made of the target structural material, heated to a sufficiently high temperature, serves as the partition in a vacuum chamber. Degassing is performed in advance. A constant pressure of gaseous (molecular) hydrogen is built up at the inlet side, and the penetrating flux is determined by mass spectrometry in the vacuum maintained at the outlet side.
A linear model of dependence on concentration is adopted for the coefficient of dissolved atomic hydrogen diffusion in the bulk. The temperature dependence conforms to the Arrhenius law. The surface processes of dissolution and sorption-desorption are taken into account in the form of nonlinear dynamic boundary conditions (differential equations for the dynamics of surface concentrations of atomic hydrogen). The characteristic mathematical feature of the boundary-value problem is that concentration time derivatives enter both the diffusion equation and the boundary conditions with quadratic nonlinearity. In terms of the general theory of functional differential equations, this leads to so-called neutral-type equations and requires a more complex mathematical apparatus. An iterative computational algorithm of second- (higher-) order accuracy based on explicit-implicit difference schemes is suggested for solving the corresponding nonlinear boundary-value problem. To avoid solving a nonlinear system of equations at every time step, we apply the explicit component of the difference scheme to the slower sub-processes.
The results of numerical modeling are presented to confirm the fitness of the model to experimental data. The degrees of impact of variations in hydrogen permeability parameters ("derivatives") on the penetrating flux and on the concentration distribution of H atoms across the sample thickness are determined. This knowledge is important, in particular, when designing protective structures against hydrogen embrittlement or membrane technologies for producing high-purity hydrogen. The computational algorithm makes it possible to use the model in the analysis of extreme regimes for structural materials (pressure drops, high temperatures, unsteady heating), to identify the limiting factors under specific operating conditions, and to save on costly experiments (especially in deuterium-tritium investigations).
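As a rough illustration of the explicit-implicit idea, the sketch below advances a simplified permeation model one step at a time: bulk diffusion with D(c) = D0(1 + a·c) is treated implicitly with D frozen at the previous time level (so each step is one linear solve), while simplified surface ODEs with quadratic desorption are advanced explicitly as the slow sub-process. All parameter values, the surface laws, and the quasi-equilibrium coupling c = g·q are our illustrative assumptions, not the paper's actual boundary conditions or its second-order iterative scheme.

```python
import numpy as np

def permeation_step(c, q, dt, dx, D0, a, s_in, b, k, g):
    """One explicit-implicit step for a simplified permeation model.

    c : bulk concentration on a uniform grid
    q : [q_in, q_out] surface concentrations (the slow sub-process)
    Bulk: dc/dt = d/dx(D(c) dc/dx), D(c) = D0*(1 + a*c); D is frozen
    at the previous level so the implicit step is a single linear
    solve, with no nonlinear iteration.  Surfaces: illustrative ODEs
    with supply s_in, quadratic desorption b*q**2 and exchange rate k
    with the subsurface node; boundary values follow c = g*q.
    """
    n = c.size
    # explicit sub-step: slow surface dynamics
    q = q + dt * np.array([s_in - b * q[0]**2 - k * (q[0] - c[1]),
                                 - b * q[1]**2 - k * (q[1] - c[-2])])
    # implicit sub-step: bulk diffusion, face diffusivities from c^n
    Df = D0 * (1.0 + a * 0.5 * (c[:-1] + c[1:]))
    r = dt / dx**2
    A = np.eye(n)
    for i in range(1, n - 1):
        A[i, i - 1] -= r * Df[i - 1]
        A[i, i]     += r * (Df[i - 1] + Df[i])
        A[i, i + 1] -= r * Df[i]
    rhs = c.copy()
    rhs[0], rhs[-1] = g * q[0], g * q[1]   # Dirichlet values from surfaces
    return np.linalg.solve(A, rhs), q

# drive a degassed membrane toward a quasi-steady permeation regime
c, q = np.zeros(21), np.zeros(2)
for _ in range(2000):
    c, q = permeation_step(c, q, dt=1e-3, dx=0.05,
                           D0=1.0, a=0.5, s_in=1.0, b=1.0, k=1.0, g=1.0)
```

At quasi-steady state the inlet-to-outlet concentration gradient sets the penetrating flux, which is the quantity measured by mass spectrometry in the experiment.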
-
Special action and counter-terrorism models
Computer Research and Modeling, 2024, v. 16, no. 6, pp. 1467-1498
Special actions (guerrilla, anti-guerrilla, reconnaissance and sabotage, subversive, counter-terrorist, counter-sabotage, etc.) are organized and conducted by law enforcement and armed forces and are aimed at protecting citizens and ensuring national security. Since the early 2000s, the problems of special actions have attracted the attention of specialists in the field of modeling, sociologists, physicists and representatives of other sciences. This article reviews and characterizes the works in the field of modeling special actions and counter-terrorism. The works are classified by modeling methods (descriptive, optimization and game-theoretic), by types and stages of actions, and by phases of management (preparation and conduct of activities). The second section presents a classification of methods and models for special actions and counter-terrorism, and gives a brief overview of descriptive models. The method of geographic profiling, network games, models of the dynamics of special actions, and the victory function in combat and special actions (the dependence of the probability of victory on the correlation of forces and means of the parties) are considered. The third section considers the "attacker - defender" game and its extensions, the Stackelberg game and the Stackelberg security game, as well as issues of their application in security tasks. In the "attacker - defender" game and security games, known works are classified on the following grounds: the sequence of moves, the number of players and their objective functions, the time horizon of the game, the degree of rationality of the players and their attitude to risk, and the degree of awareness of the players. The fourth section is devoted to the description of patrolling games on a graph with discrete time and simultaneous choice of actions by the parties (a Nash equilibrium is computed to find optimal strategies).
The fifth section deals with game-theoretic models of transportation security as applications of Stackelberg security games. The last section is devoted to the review and characterization of a number of models of border security in two phases of management: preparation and conduct of activities. An example of effective interaction between Coast Guard units and university researchers is considered. Promising directions for further research are the following: first, modeling of counter-terrorist and special operations to neutralize terrorist and sabotage groups with the involvement of multidepartmental and heterogeneous forces and means; second, complexification of models by levels and stages of activity cycles; third, development of game-theoretic models of combating maritime terrorism and piracy.
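The core of the simplest Stackelberg security game can be made concrete in a few lines: the defender commits to a randomized coverage of targets, the attacker observes the commitment and hits the target with the highest expected payoff, so the optimal commitment equalizes ("water-fills") those payoffs. The sketch below is a toy version under our own assumptions (one divisible defender resource, zero-sum payoffs); the security games surveyed in the article are generally general-sum and are solved with LP/MILP formulations.

```python
def defender_coverage(u, resources=1.0, iters=100):
    """Water-filling coverage for a toy zero-sum security game.

    u[i] is the attacker's payoff for hitting target i uncovered; a
    covered attack pays 0, so the attacker's expected payoff under
    coverage p is u[i]*(1 - p[i]).  The defender spreads `resources`
    units of coverage to minimise the maximum of these payoffs: find
    the value v with sum(max(0, 1 - v/u[i])) == resources by bisection.
    """
    lo, hi = 0.0, max(u)
    for _ in range(iters):
        v = 0.5 * (lo + hi)
        need = sum(max(0.0, 1.0 - v / ui) for ui in u)
        if need > resources:
            lo = v            # value v needs more coverage than we have
        else:
            hi = v
    v = 0.5 * (lo + hi)
    return v, [max(0.0, 1.0 - v / ui) for ui in u]

# three targets: the defender ignores the low-value one and splits
# coverage so the two attractive targets look equally (un)attractive
value, cover = defender_coverage([1.0, 2.0, 3.0])
```

For payoffs 1, 2, 3 the game value is 1.2: target 1 gets no coverage (its uncovered payoff 1 is already below the value), while targets 2 and 3 are covered with probabilities 0.4 and 0.6.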
-
Deriving specifications of dependable systems
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1637-1650
Although human skills are heavily involved in the Requirements Engineering process, in particular in requirements elicitation, analysis and specification, methodology and formalism still play a determining role in providing clarity and enabling analysis. In this paper, we propose a method for deriving formal specifications that is applicable to dependable software systems. First, we clarify what a method itself is. Computer science has a proliferation of languages and methods, but the difference between the two is not always clear. This is a conceptual contribution. Furthermore, we propose the idea of Layered Fault Tolerant Specification (LFTS). The principle consists in layering specifications in (at least) two different layers: one for normal behaviors and others (if more than one) for abnormal behaviors. Abnormal behaviors are described in terms of an Error Injector (EI), which represents a model of the expected erroneous interference coming from the environment. This structure has been inspired by the notion of an idealized Fault Tolerant component, but the combination of LFTS and EI using rely-guarantee thinking to describe interference is our second contribution. The overall result is the definition of a method for the specification of systems that do not run in isolation but in the real, physical world. We propose an approach that is pragmatic for its target audience: techniques must scale and be usable by non-experts if they are to make it into an industrial setting. This article makes tentative steps, but recent trends in Software Engineering such as microservices, smart and software-defined buildings, M2M micropayments and DevOps are relevant fields in which to continue the investigation of dependability and rely-guarantee thinking.
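A minimal sketch of the layering idea, in our own notation rather than the paper's: the normal layer specifies a monotone counter, the abnormal layer is an Error Injector allowed by the rely condition to corrupt the primary copy (but not the redundant one), and the implementation's guarantee, that the counter still advances by exactly one per step, survives the interference.

```python
import random

class Counter:
    """Normal layer: a monotone counter.

    Rely condition: between our steps the environment (the EI) may
    corrupt `value` but never `shadow`.
    Guarantee: after each step both copies agree and the counter has
    advanced by exactly one.
    """
    def __init__(self):
        self.value = 0
        self.shadow = 0            # redundancy assumed by the rely condition

    def step(self):
        if self.value != self.shadow:   # abnormal layer: detect and recover
            self.value = self.shadow
        self.value += 1                 # normal layer: increment
        self.shadow = self.value

def error_injector(counter, rng):
    """EI: interference permitted by the rely condition."""
    if rng.random() < 0.3:
        counter.value = rng.randint(-100, 100)   # corrupt primary copy only

rng = random.Random(0)
c = Counter()
history = []
for _ in range(1000):
    error_injector(c, rng)   # environment acts
    c.step()                 # system acts; the guarantee must hold
    history.append(c.value)
```

Despite roughly 300 injected corruptions, the observed history is exactly 1, 2, ..., 1000: the guarantee of the layered specification holds under any interference the rely condition permits.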
Keywords: formal methods, dependability.
-
The model of switching mode of reproduction with a continuous set of production subsystems under the conditions of balanced growth
Computer Research and Modeling, 2025, v. 17, no. 3, pp. 501-519
This paper presents new results of research that has been conducted at the Institute of Economics of the Russian Academy of Sciences since 2011 under the leadership of Academician V. I. Mayevsky. These works are aimed at developing the theory of the switching mode of reproduction and the corresponding mathematical models, whose peculiarity is that they explicitly model the interaction of the financial and real sectors of the economy, while the country’s economy itself is disaggregated not according to the sectoral principle (engineering, agriculture, services, etc.) but into production subsystems that differ from each other in the age of their fixed capital. One of the mathematical difficulties of working with such models, called models of the switching mode of reproduction (SMR), is modeling competitive relationships between subsystems of different “ages”. Therefore, until now, SMR models have considered the interaction of a finite number of production subsystems; the models themselves have been of a discrete-continuous nature, calculations have been done exclusively on computers, and obtaining analytical dependencies has been difficult. This paper shows that for the special case of balanced economic growth and a continuum of production subsystems, it is possible to obtain analytical expressions that allow a better understanding of the impact of monetary policy on economic dynamics. In addition to purely scientific interest, this is of great practical importance, since it allows assessing the possible reaction of the real sector of the economy to changes in the monetary sphere without conducting complex simulation calculations.
-
Classifier size optimisation in segmentation of three-dimensional point images of wood vegetation
Computer Research and Modeling, 2025, v. 17, no. 4, pp. 665-675
The advent of laser scanning technologies has revolutionized forestry. Their use made it possible to switch from studying woodlands using manual measurements to computer analysis of stereo point images called point clouds.
Automatic calculation of some tree parameters (such as trunk diameter) using a point cloud requires the removal of foliage points. To perform this operation, a preliminary segmentation of the stereo image into the “foliage” and “trunk” classes is required. The solution to this problem often involves the use of machine learning methods.
One of the most popular classifiers used for segmentation of stereo images of trees is the random forest. This classifier is quite demanding in terms of memory. At the same time, the size of a machine learning model can be critical if it needs to be sent over a network, which is required, for example, when performing distributed learning. In this paper, the goal is to find a classifier that is less demanding in terms of memory but has comparable segmentation accuracy. The search is performed among classifiers such as logistic regression, the naive Bayes classifier, and the decision tree. In addition, a method for refining the segmentation produced by a decision tree using logistic regression is investigated.
The experiments were conducted on data from the collection of the University of Heidelberg. The collection contains hand-marked stereo images of trees of various species, both coniferous and deciduous, typical of the forests of Central Europe.
It has been shown that classification using a decision tree, adjusted using logistic regression, is able to produce a result only slightly inferior in accuracy to that of a random forest, while spending less time and RAM. The difference in balanced accuracy is no more than one percent on all the clouds considered, while the total size and inference time of the decision tree and logistic regression classifiers are an order of magnitude smaller than those of the random forest classifier.
-
Seismic wave fields in spherically symmetric Earth with high details. Analytical solution
Computer Research and Modeling, 2025, v. 17, no. 5, pp. 903-922
An analytical solution is obtained for seismic wave fields in a spherically symmetric Earth. In the case of an arbitrary layered medium, the solution, which includes Bessel functions, is constructed by means of a differential sweep method. Asymptotics of the Bessel functions are used for stable calculation of the wave fields. It is shown that the classical asymptotics give an error in the solution in the case of a sphere of large (in wavelengths) dimensions. New asymptotics are used for efficient, error-free calculation of the solution with high detail. A program has been created that makes it possible to carry out calculations of high-frequency (1 hertz and higher) teleseismic wave fields in a discrete (layered) sphere of planetary dimensions. Calculations can even be carried out on personal computers with OpenMP parallelization.
Burmin (2019) proposed a spherically symmetric model of the Earth. It is characterized by the fact that in it the outer core has a viscosity and, therefore, an effective shear modulus other than zero. For this model of the Earth, a highly detailed calculation was carried out with a carrier frequency of 1 hertz. As a result of the analytical calculation, it was found that high-frequency oscillations of small amplitude, the so-called “precursors”, appear ahead of the PKP waves. The analytical calculation also showed that the theoretical seismograms for this model of the Earth are in many respects similar to the experimental data. This confirms the correctness of the ideas underlying its construction.
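The failure mode of the classical asymptotics is easy to demonstrate: the large-argument formula for J_ν(x) is accurate when x is large compared with ν, but breaks down in the transition region ν ≈ x, which is precisely the regime reached when the sphere is large in wavelengths (high frequencies imply large orders ν comparable to the argument). A quick check with SciPy (our choice of tool, not the paper's):

```python
import numpy as np
from scipy.special import jv   # reference Bessel function J_nu(x)

def jv_classical_asymptotic(nu, x):
    """Classical large-argument asymptotic of the Bessel function:
        J_nu(x) ~ sqrt(2/(pi*x)) * cos(x - nu*pi/2 - pi/4),
    valid only when x is large *compared with nu*."""
    return np.sqrt(2.0 / (np.pi * x)) * np.cos(x - nu * np.pi / 2 - np.pi / 4)

# far from the transition region (x >> nu) the formula is accurate
err_far = abs(jv_classical_asymptotic(0, 50.0) - jv(0, 50.0)) / abs(jv(0, 50.0))

# near the transition region nu ~ x it fails badly (even the sign can
# be wrong), which is why a different asymptotic is needed for large,
# finely resolved spheres
err_near = abs(jv_classical_asymptotic(50, 52.0) - jv(50, 52.0)) / abs(jv(50, 52.0))
```

With these values the relative error is below a percent for ν = 0, x = 50, but exceeds 50 % for ν = 50, x = 52, illustrating why uniform (Airy-type) asymptotics are required in the transition region.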
-
Using Docker service containers to build browser-based clinical decision support systems (CDSS)
Computer Research and Modeling, 2026, v. 18, no. 1, pp. 133-147
The article presents a technology for building clinical decision support systems (CDSS) based on service containers using Docker and a web interface that runs directly in the browser, without installing specialized software on the clinician's workstation. A modular architecture is proposed in which each application module is packaged as an independent service container combining a lightweight web server, a user interface, and computational components for medical image processing. Communication between the browser and the server side is implemented via a persistent bidirectional WebSocket connection with binary message serialization (MessagePack), which provides low latency and efficient transfer of large data. For local storage of images and analysis results, browser facilities (IndexedDB with the Dexie.js wrapper) are used to speed up repeated data access. Three-dimensional visualization and basic operations with DICOM data are implemented with Three.js and AMI.js: this toolchain supports the integration of interactive elements arising from the task context (annotations, landmarks, markers, 3D models) into volumetric medical images.
Server components and functional modules are assembled as a set of interacting containers managed by Docker. The paper discusses the choice of base images, approaches to minimizing containers down to runtime-only executables without external utilities, and the organization of multi-stage builds with a dedicated build container. It describes a hub service that launches application containers on user request, performs request proxying, manages sessions, and switches a container from shared to exclusive mode at the start of computations. Examples of application modules are provided (fractional flow reserve estimation, quantitative flow ratio computation, aortic valve closure modeling), along with the integration of a React-based interface with a three-dimensional scene, a versioning policy, automated reproducibility checks, and the deployment procedure on the target platform.
It is demonstrated that containerization ensures portability and reproducibility of the software environment, dependency isolation and scalability, while the browser-based interface provides accessibility, reduced infrastructure requirements, and interactive real-time visualization of medical data. Technical limitations are noted (dependence on versions of visualization libraries and data formats) together with practical mitigation measures.
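A minimal multi-stage build along the lines described might look as follows. Every image name, path, and module name here is illustrative rather than taken from the paper, and the choice of a statically linked Go server is our assumption made purely for the sake of a concrete sketch:

```dockerfile
# --- build stages: full toolchains, never shipped ---------------------
FROM node:20 AS ui-build
WORKDIR /src
COPY ui/ .
RUN npm ci && npm run build          # React + Three.js bundle (hypothetical)

FROM golang:1.22 AS server-build
WORKDIR /src
COPY server/ .
# static binary so the runtime stage needs no libc or shell utilities
RUN CGO_ENABLED=0 go build -o /out/cdss-module ./cmd/server

# --- runtime stage: runtime-only executable, no external utilities ----
FROM scratch
COPY --from=server-build /out/cdss-module /cdss-module
COPY --from=ui-build /src/dist /static
EXPOSE 8080
ENTRYPOINT ["/cdss-module"]          # serves the UI and the WebSocket API
```

The build stages carry the full Node and Go toolchains, while the shipped stage is runtime-only (here `FROM scratch` with a single static binary), which corresponds to the minimization of containers down to runtime-only executables discussed in the paper.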
-
Investigation of complex formation of flavodoxin and photosystem 1 by means of direct multiparticle computer simulation
Computer Research and Modeling, 2009, v. 1, no. 1, pp. 85-91
The kinetics of complex formation between components of the photosynthetic electron transport chain, flavodoxin and the membrane complex photosystem I, has been studied using a computer model based on methods of multiparticle simulation and Brownian dynamics. We simulated the Brownian motion of several hundred flavodoxin molecules, taking into account electrostatic interactions and the complex shape of the molecules. Our model could describe the experimental nonmonotonic dependence of the association rate constant for flavodoxin and photosystem I. This lets us conclude that electrostatic interactions are sufficient to produce this kind of nonmonotonic dependence.
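The elementary propagator behind such models is the Euler-Maruyama Brownian step; adding a deterministic drift F/γ·dt turns it into the Brownian dynamics used for charged molecules. The free-diffusion sketch below (no electrostatics, no molecular shape, illustrative reduced units) checks the propagator against the exact mean-squared displacement 4Dt in two dimensions:

```python
import numpy as np

rng = np.random.default_rng(1)
D, dt, steps, n = 1.0, 1e-3, 1000, 4000   # illustrative reduced units
pos = np.zeros((n, 2))                    # ensemble of 2-D particles

for _ in range(steps):
    # Euler-Maruyama free-diffusion step; a drift term F/gamma*dt would
    # be added here to include electrostatic forces between molecules
    pos += np.sqrt(2.0 * D * dt) * rng.standard_normal(pos.shape)

t = steps * dt
msd = np.mean(np.sum(pos**2, axis=1))     # expect 4*D*t in two dimensions
```

In the actual multiparticle model, thousands of such trajectories with forces and excluded volume are run until molecules reach the docking site, and the association rate constant is estimated from the encounter statistics.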
-
Simulation of copper nanocrystal plastic deformation at uniaxial tension
Computer Research and Modeling, 2013, v. 5, no. 2, pp. 225-230
Computer simulation of the plastic deformation of an FCC copper nanocrystal in the process of uniaxial tension in the [001] direction is performed by the methods of molecular dynamics and static relaxation. It is shown that a thermoelastic martensitic transformation is responsible for the plastic deformation: the FCC lattice is reconstructed into an HCP lattice. The orientation relationship of the contacting phases is identified.
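The workhorse of such simulations is the velocity-Verlet integrator. As a self-contained illustration (not the copper potential or crystal geometry of the study), the sketch below integrates a single particle in a Lennard-Jones well in reduced units and checks that the scheme conserves total energy, the basic sanity test of any MD setup:

```python
def lj(r):
    """Lennard-Jones in reduced units: U = 4*(r**-12 - r**-6);
    returns (force, potential) with force = -dU/dr."""
    return 24.0 * (2.0 * r**-13 - r**-7), 4.0 * (r**-12 - r**-6)

dt, m = 0.002, 1.0
r, v = 1.2, 0.0                 # start from a stretched bond, at rest
f, u = lj(r)
e0 = u + 0.5 * m * v**2         # total energy, should be conserved

for _ in range(5000):           # velocity-Verlet loop
    v += 0.5 * dt * f / m       # half-kick
    r += dt * v                 # drift
    f, u = lj(r)                # new force at the updated position
    v += 0.5 * dt * f / m       # half-kick

e = u + 0.5 * m * v**2
```

The particle oscillates between the turning points of the well while the total energy stays constant to within the scheme's second-order error; a tension simulation applies the same integrator to every atom of the crystal while the box is strained.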
-
Deformation model of polymer nanocomposites based on cellular automata
Computer Research and Modeling, 2014, v. 6, no. 1, pp. 131-136
This paper discusses the modeling of the deformation of polymer nanocomposites containing "hard" and "soft" inclusions using cellular automata and parallel computing. The paper describes an algorithm based on the model, presents a comparison with experimental data, and describes the software for the numerical experiment.
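The flavor of a deformation cellular automaton can be conveyed by a one-dimensional toy (our construction, much simpler than the paper's model): cells with "hard" and "soft" moduli loaded in series exchange strain locally in proportion to their stress difference, and the automaton relaxes to the uniform-stress state in which the soft inclusions carry most of the strain.

```python
import numpy as np

# 1-D cellular-automaton sketch of strain redistribution in a chain of
# "hard" and "soft" cells loaded in series.  Local rule: each pair of
# neighbouring cells moves a small amount of strain from the
# higher-stress cell to the lower-stress one; total strain is
# conserved, and the fixed point is the uniform-stress state of
# springs in series.
E = np.array([10.0, 1.0] * 4)        # alternating hard / soft moduli
eps = np.full(E.size, 0.01)          # uniform applied strain
eta = 0.02                           # exchange rate of the local rule

for _ in range(20000):
    sigma = E * eps                  # local stress in each cell
    flow = eta * np.diff(sigma)      # pairwise strain exchange
    eps[:-1] += flow                 # left cell of each pair
    eps[1:]  -= flow                 # right cell of each pair

sigma = E * eps
```

After relaxation the stress is uniform and each cell's strain equals that stress divided by its modulus, so soft cells deform an order of magnitude more than hard ones; the paper's automaton applies the same idea on a larger lattice with parallel updates.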
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index