All issues
- 2026 Vol. 18
- 2025 Vol. 17
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
Migration processes modelling: methods and tools (overview)
Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1205-1232
Migration has a significant impact on the demographic structure of a territory's population and on the state of regional and local labour markets. As a rule, a rapid migration-driven change in the working-age population of a territory leads to an imbalance of supply and demand on labour markets and a change in the demographic structure of the population. Migration is also, to a large extent, a reflection of socio-economic processes taking place in society. Hence, questions related to the study of migration factors, the direction, intensity and structure of migration flows, and the prediction of their magnitude have become topical.
Mathematical tools are often used to analyze and predict migration processes and to assess their consequences, since they allow reasonably accurate modelling of migration for different territories on the basis of available statistical data. In recent years, a considerable number of scientific papers on modelling internal and external migration flows with mathematical methods have appeared both in Russia and abroad. Consequently, there is a need to systematize the methods and tools currently most commonly used in migration modelling in order to form a coherent picture of the main trends and research directions in this field.
The presented review considers the main approaches to migration modelling and the main components of the migration modelling methodology: stages, methods, models and model classifications. A comparative analysis of these components was conducted, and general recommendations on the choice of mathematical tools for modelling were developed. The review contains two sections: migration modelling methods and migration models. The first section describes the main methods used in model development: econometric methods, cellular automata, system dynamics, probabilistic methods, balance methods, optimization methods and cluster analysis. Based on an analysis of recent Russian and foreign publications on migration, the most common classes of models — regression, agent-based, simulation, optimization, probabilistic, balance, dynamic and combined — were identified and described. The features, advantages and disadvantages of the different types of migration process models are considered.
-
Model for building of the radio environment map for cognitive communication system based on LTE
Computer Research and Modeling, 2022, v. 14, no. 1, pp. 127-146
The paper is devoted to the secondary use of spectrum in telecommunication networks. One solution to this problem is the use of cognitive radio and dynamic spectrum access technologies, whose successful operation requires a large amount of information, including the parameters of base stations and network subscribers. This information should be stored and processed in a radio environment map: a spatio-temporal database of all activity in the network that makes it possible to determine the frequencies available for use at a given time. The paper presents a two-level model for building the radio environment map of an LTE cellular communication system, with distinct local and global levels, described by the following parameters: a set of frequencies, signal attenuation, a signal propagation map, the grid step and the current time count. The key objects of the model are the base station and the subscriber device. The main parameters of the base station include its name, identifier, cell coordinates, band number, radiated power, the numbers of connected subscriber devices and the allocated resource blocks. The parameters of a subscriber device are its name, identifier, location, the current coordinates of the device's cell, the base station identifier, the frequency band, the numbers of the resource blocks used for communication with the station, the radiated power, the data transmission status, the list of numbers of the nearest stations, and the movement and communication session schedules. An algorithm implementing the model is presented that takes into account the movement and communication session scenarios of subscriber devices. A method is given for calculating the radio environment map at a point of the coordinate grid, taking into account the propagation losses of radio signals from emitting devices.
The model is implemented in software using the MatLab package, and approaches that increase its execution speed are described. In the simulation, the parameters were chosen taking into account the data of existing communication systems and the need to economize computing resources. Experimental results of the map-building algorithm are demonstrated, confirming the correctness of the developed model.
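The per-point map calculation described above can be sketched in Python as a loose analogue of the MatLab implementation. Here free-space path loss stands in for the paper's propagation model, and the grid step, frequencies and station parameters are illustrative values, not those of the paper:

```python
import numpy as np

def path_loss_db(d_m, f_hz):
    """Free-space path loss in dB; a stand-in for the paper's propagation model."""
    d = np.maximum(d_m, 1.0)  # clamp distance to avoid log(0) at the emitter cell
    return 20 * np.log10(d) + 20 * np.log10(f_hz) - 147.55

def rem_map(grid_x, grid_y, emitters, f_hz):
    """Received power (dBm) at every grid point from the strongest emitter."""
    xx, yy = np.meshgrid(grid_x, grid_y, indexing="ij")
    best = np.full(xx.shape, -np.inf)
    for (ex, ey, p_dbm) in emitters:
        d = np.hypot(xx - ex, yy - ey)
        best = np.maximum(best, p_dbm - path_loss_db(d, f_hz))
    return best

grid = np.arange(0.0, 1000.0, 50.0)                       # 50 m grid step
emitters = [(200.0, 300.0, 43.0), (800.0, 700.0, 43.0)]   # (x, y, EIRP in dBm)
m = rem_map(grid, grid, emitters, 1.8e9)                  # an LTE band near 1.8 GHz
```

A dynamic-spectrum-access layer would then threshold such a map per time step to mark frequencies as occupied or free.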
-
Mathematical modeling the kinetics and calculation of dosimetric characteristics of osteotropic radiopharmaceutical drugs
Computer Research and Modeling, 2022, v. 14, no. 3, pp. 647-660
Two radiopharmaceuticals are currently used in Russian medicine for radionuclide therapy of bone metastases: 89Sr-chloride and 153Sm-oxabifor. The first has many side effects, so its use is limited. The second is available only to clinics that can be supplied without long transportation times. A third radiopharmaceutical, 188Re-solerene, is currently undergoing clinical trials. Owing to the generator method of obtaining 188Re, this radiopharmaceutical should become available for use in many regions of the country. There is therefore a need for a comparative analysis of the characteristics of these radiopharmaceuticals, including one based on mathematical modeling.
The article discusses the features of mathematical modeling of the kinetics of osteotropic radiopharmaceutical drugs in the human body with bone metastases. Based on a four-compartment model, a software suite for modeling and calculating the pharmacokinetic and dosimetric characteristics of radiopharmaceuticals for radionuclide therapy of bone metastases was developed and tested. Using clinical data, the transport constants of the model were identified, and the individual characteristics of the Russian radiopharmaceuticals labeled with 89Sr, 153Sm and 188Re were calculated: effective half-lives, the maximum activities in the compartments and the times at which they are reached, and the absorbed doses to bone tissue and metastases, the endosteal bone layer, red bone marrow, blood, kidneys and bladder. The time-activity curves for all compartments of the model were obtained and analyzed. A comparative analysis of the pharmacokinetics and dosimetry of the three radiopharmaceuticals (89Sr-chloride, 153Sm-oxabifor, 188Re-solerene) was carried out.
The comparative analysis of the pharmacokinetic and dosimetric characteristics of these radiopharmaceutical drugs shows that, taking into account the generator method of obtaining 188Re in a hospital, 188Re-solerene is the best candidate for widespread use in many regions of the country.
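The kind of compartment kinetics described above can be illustrated with a minimal sketch. The compartment structure (blood, bone, metastases, excretion, with radioactive decay acting everywhere) mimics the setting of the abstract, but all rate constants are placeholders, not the transport constants identified from clinical data:

```python
import numpy as np
from scipy.integrate import solve_ivp

LAM = np.log(2) / 17.0  # 188Re physical decay constant, 1/h (half-life ~17 h)

def rhs(t, a, k_bb, k_bm, k_be, k_ret):
    """Linear four-compartment kinetics with decay LAM in every compartment."""
    blood, bone, met, excr = a
    return [
        -(k_bb + k_bm + k_be + LAM) * blood + k_ret * (bone + met),
        k_bb * blood - (k_ret + LAM) * bone,   # uptake into healthy bone
        k_bm * blood - (k_ret + LAM) * met,    # uptake into metastases
        k_be * blood - LAM * excr,             # renal excretion pathway
    ]

# hypothetical rate constants (1/h): blood->bone, blood->metastases,
# blood->excretion, and return from bone/metastases to blood
sol = solve_ivp(rhs, (0.0, 72.0), [1.0, 0.0, 0.0, 0.0],
                args=(0.15, 0.05, 0.40, 0.01), dense_output=True)
t = np.linspace(0.0, 72.0, 721)
bone = sol.sol(t)[1]
t_max_bone = t[np.argmax(bone)]  # time of maximum bone activity, hours
```

From such time-activity curves the maximum activities, the times of their achievement, and (after weighting by emitted energy) absorbed doses can be computed per compartment.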
-
Analysis of predictive properties of ground tremor using Huang decomposition
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 939-958
A method is proposed for analyzing the tremor of the Earth's surface, measured by means of space geodesy, in order to extract effects that are prognostic of seismic activation. The method is illustrated by a joint analysis of a set of synchronous time series of daily vertical displacements of the Earth's surface on the Japanese Islands over the interval 2009–2023. The analysis is based on dividing the source data (1047 time series) into blocks (clusters of stations) and sequentially applying the principal component method. The station network is divided into clusters by the K-means method using the criterion of maximum pseudo-F-statistic; for Japan the optimal number of clusters was found to be 15. The Huang method of decomposition into a sequence of independent empirical oscillation modes (EMD, Empirical Mode Decomposition) is applied to the time series of principal components of the station blocks. To ensure stable estimates of the EMD waveforms, averaging over 1000 independent additive realizations of white noise of limited amplitude was performed. Using the Cholesky decomposition of the covariance matrix of the waveforms of the first three EMD components in a sliding time window, indicators of anomalous tremor behavior were defined. By calculating the correlation function between the averaged indicators of anomalous behavior and the seismic energy released in the vicinity of the Japanese Islands, it was established that bursts in the measure of anomalous tremor behavior precede releases of seismic energy. The purpose of the article is to examine the common hypothesis that the crustal movements recorded by space geodesy may contain predictive information. That displacements recorded by geodetic methods respond to earthquakes is widely known and has been demonstrated many times.
Isolating geodetic effects that predict seismic events, however, is much more challenging. In this paper we propose one method for detecting predictive effects in space geodesy data.
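The first two stages of the pipeline (clustering the station network, then extracting the leading principal component per cluster) can be sketched as follows. The station coordinates and displacement series here are synthetic stand-ins, and a fixed number of clusters replaces the pseudo-F-statistic selection used in the paper:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# synthetic stand-in: 60 stations in two spatial groups, 500 daily samples,
# sharing a common signal plus independent station noise
coords = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(5, 1, (30, 2))])
common = np.sin(np.linspace(0, 20, 500))
series = common + 0.3 * rng.normal(size=(60, 500))

# stage 1: cluster stations by position (the paper selects k by pseudo-F)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)

# stage 2: leading principal component of each cluster's block of series
principal = {}
for c in range(2):
    block = series[labels == c]                         # stations of one cluster
    principal[c] = PCA(n_components=1).fit_transform(block.T).ravel()
```

Each `principal[c]` series would then be fed to the noise-assisted EMD and sliding-window covariance analysis described above.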
-
Computational algorithm for solving the nonlinear boundary-value problem of hydrogen permeability with dynamic boundary conditions and concentration-dependent diffusion coefficient
Computer Research and Modeling, 2024, v. 16, no. 5, pp. 1179-1193
The article deals with a nonlinear boundary-value problem of hydrogen permeability corresponding to the following experiment. A membrane made of the target structural material, heated to a sufficiently high temperature, serves as the partition in a vacuum chamber; degassing is performed in advance. A constant pressure of gaseous (molecular) hydrogen is maintained at the inlet side, and the penetrating flux is determined by mass spectrometry in the vacuum maintained at the outlet side.
A linear concentration dependence is adopted for the diffusion coefficient of dissolved atomic hydrogen in the bulk, and the temperature dependence follows the Arrhenius law. The surface processes of dissolution and sorption-desorption are taken into account in the form of nonlinear dynamic boundary conditions (differential equations for the dynamics of the surface concentrations of atomic hydrogen). The characteristic mathematical feature of the boundary-value problem is that time derivatives of the concentration enter both the diffusion equation and the boundary conditions with quadratic nonlinearity. In terms of the general theory of functional differential equations, this leads to so-called neutral-type equations and requires a more complex mathematical apparatus. An iterative computational algorithm of second- (and higher-) order accuracy based on explicit-implicit difference schemes is suggested for solving the corresponding nonlinear boundary-value problem. To avoid solving a nonlinear system of equations at every time step, the explicit component of the difference scheme is applied to the slower sub-processes.
The results of numerical modeling are presented, confirming the agreement of the model with experimental data. The degrees of impact of variations of the hydrogen permeability parameters ("derivatives") on the penetrating flux and on the distribution of the H-atom concentration through the sample thickness are determined. This knowledge is important, in particular, when designing protective structures against hydrogen embrittlement or membrane technologies for producing high-purity hydrogen. The computational algorithm makes it possible to use the model in the analysis of regimes that are extreme for structural materials (pressure drops, high temperatures, unsteady heating), to identify the limiting factors under specific operating conditions, and to save on costly experiments (especially in deuterium-tritium investigations).
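The explicit-implicit idea can be conveyed by a schematic one-dimensional analogue, not the authors' algorithm: the bulk diffusion step is implicit with the concentration-dependent coefficient D(c) frozen at the previous layer, while the quadratically nonlinear surface ODEs (the slower sub-process) are advanced explicitly. All coefficients, the source term and the flux coupling are illustrative:

```python
import numpy as np

def step(c, q_in, q_out, D0, a, dx, dt, s_in, b):
    """One explicit-implicit step: implicit linearised diffusion in the bulk,
    explicit update of the quadratically nonlinear surface concentrations."""
    n = c.size
    D = D0 * (1.0 + a * c)                   # linear D(c), frozen at the old layer
    Dh = 0.5 * (D[:-1] + D[1:])              # interface diffusivities
    A = np.zeros((n, n))
    rhs = c.copy()
    for i in range(1, n - 1):                # assemble (I - dt*L) c_new = c_old
        A[i, i - 1] = -dt * Dh[i - 1] / dx**2
        A[i, i + 1] = -dt * Dh[i] / dx**2
        A[i, i] = 1.0 + dt * (Dh[i - 1] + Dh[i]) / dx**2
    A[0, 0] = A[-1, -1] = 1.0                # bulk ends pinned to surface values
    rhs[0], rhs[-1] = q_in, q_out
    c_new = np.linalg.solve(A, rhs)
    # explicit step for the surface ODEs with quadratic desorption b*q**2
    flux_in = Dh[0] * (c_new[0] - c_new[1]) / dx       # surface -> bulk, inlet
    flux_out = Dh[-1] * (c_new[-2] - c_new[-1]) / dx   # bulk -> surface, outlet
    q_in += dt * (s_in - b * q_in**2 - flux_in)
    q_out += dt * (flux_out - b * q_out**2)
    return c_new, q_in, q_out

c, q_in, q_out = np.zeros(21), 0.0, 0.0
for _ in range(2000):
    c, q_in, q_out = step(c, q_in, q_out, D0=1.0, a=0.5,
                          dx=0.05, dt=1e-3, s_in=1.0, b=1.0)
```

In a production scheme the dense solve would be replaced by a tridiagonal sweep, and the surface step would carry the full dissolution-desorption balance of the paper's boundary conditions.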
-
Comprehensive analysis of copper ions effect on the primary processes of photosynthesis in Scenedesmus quadricauda based on chlorophyll a fluorescence measurements in suspension and on single cells
Computer Research and Modeling, 2025, v. 17, no. 2, pp. 293-322
The effect of copper ions on the primary processes of photosynthesis in the freshwater microalga Scenedesmus quadricauda was studied using a set of biophysical and mathematical methods. Chlorophyll a fluorescence transients were recorded both in cell suspensions and at the level of single cells after incubation at copper concentrations of 0.1–10 $\mu$M under light and dark conditions. Copper was found to have a dose-dependent effect on the photosynthetic apparatus of the microalgae. At a low copper concentration (0.1 $\mu$M), a stimulating effect on a number of the studied parameters was observed, whereas significant disruption of Photosystem II activity was detected at 10 $\mu$M. The analysis of single-cell fluorescence proved more sensitive than traditional suspension measurements, allowing the detection of heterogeneous cellular responses to the toxicant. Analysis of the fast chlorophyll a fluorescence kinetics showed that the JIP-test parameters $\delta_{Ro}$ and $\varphi_{Ro}$ were the most sensitive to copper exposure and differed significantly from the control not only at the high but also at the medium (1 $\mu$M) copper concentration. The decrease in the photochemical activity of cells during light incubation was less pronounced than under dark conditions. Normalizing the data to the optical density at $\lambda = 455$ nm significantly increased the sensitivity of the method and the accuracy of result interpretation. The use of L1-regularization (LASSO) by the least angle regression method (LARS) for the spectral multi-exponential approximation of the fluorescence transients allowed us to reveal their temporal characteristics. Mathematical analysis of the obtained data suggests that copper exposure leads to increased non-photochemical quenching of fluorescence, which serves as a protective mechanism for dissipating excess excitation energy.
The revealed heterogeneity of cellular responses to copper action may have important ecological significance, ensuring the survival of part of the population under stress conditions. The obtained data confirm the promise of using fluorescent analysis methods for early diagnosis of heavy metal stress effects on photosynthesizing organisms.
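The L1-regularised multi-exponential approximation mentioned above can be sketched as follows: a transient is expanded over a dictionary of decaying exponentials with candidate lifetimes, and LARS selects a sparse subset. The transient, noise level, lifetime grid and regularisation strength are all synthetic stand-ins, not the paper's data:

```python
import numpy as np
from sklearn.linear_model import LassoLars

t = np.linspace(0.0, 5.0, 400)
taus = np.geomspace(0.01, 10.0, 60)              # candidate lifetimes (dictionary)
X = np.exp(-t[:, None] / taus[None, :])          # one decaying exponential per column

# synthetic two-component "fluorescence transient" with additive noise
true = 1.5 * np.exp(-t / 0.1) + 0.7 * np.exp(-t / 2.0)
rng = np.random.default_rng(1)
y = true + 0.01 * rng.normal(size=t.size)

# LARS path with an L1 penalty keeps only a few active lifetimes
model = LassoLars(alpha=1e-3).fit(X, y)
active = taus[np.abs(model.coef_) > 1e-6]        # lifetimes actually selected
```

The positions of the surviving lifetimes in `active` play the role of the temporal characteristics extracted from the measured transients.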
-
Classifier size optimisation in segmentation of three-dimensional point images of wood vegetation
Computer Research and Modeling, 2025, v. 17, no. 4, pp. 665-675
The advent of laser scanning technologies has revolutionized forestry. Their use has made it possible to switch from studying woodlands by manual measurements to computer analysis of three-dimensional point images called point clouds.
Automatic calculation of some tree parameters (such as trunk diameter) using a point cloud requires the removal of foliage points. To perform this operation, a preliminary segmentation of the stereo image into the “foliage” and “trunk” classes is required. The solution to this problem often involves the use of machine learning methods.
One of the most popular classifiers used for segmentation of stereo images of trees is the random forest. This classifier is quite demanding in terms of memory. At the same time, the size of a machine learning model can be critical if it needs to be transmitted over a network, as is required, for example, in distributed learning. In this paper, the goal is to find a classifier that is less demanding in terms of memory while having comparable segmentation accuracy. The search is performed among such classifiers as logistic regression, the naive Bayes classifier, and the decision tree. In addition, a method for refining the segmentation produced by a decision tree using logistic regression is investigated.
The experiments were conducted on data from the collection of the University of Heidelberg. The collection contains hand-marked stereo images of trees of various species, both coniferous and deciduous, typical of the forests of Central Europe.
It is shown that classification by a decision tree refined with logistic regression produces a result only slightly inferior in accuracy to that of a random forest, while requiring less time and RAM. The difference in balanced accuracy is no more than one percent on all the point clouds considered, while the total size and inference time of the decision tree and logistic regression classifiers are an order of magnitude smaller than those of the random forest classifier.
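The size-versus-accuracy trade-off can be reproduced on synthetic data standing in for per-point geometric features. The refinement step here (feeding the tree's class probability into a logistic regression alongside the raw features) is one possible reading of the paper's scheme, not its exact method:

```python
import pickle
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in for per-point features of "foliage" vs "trunk" points
X, y = make_classification(n_samples=4000, n_features=10, random_state=0)
Xtr, ytr, Xte, yte = X[:3000], y[:3000], X[3000:], y[3000:]

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
dt = DecisionTreeClassifier(max_depth=8, random_state=0).fit(Xtr, ytr)
# refinement: logistic regression over raw features plus the tree's probability
lr = LogisticRegression(max_iter=1000).fit(
    np.c_[Xtr, dt.predict_proba(Xtr)[:, 1]], ytr)

size = lambda m: len(pickle.dumps(m))        # serialized model size, bytes
acc_rf = rf.score(Xte, yte)
acc_dl = lr.score(np.c_[Xte, dt.predict_proba(Xte)[:, 1]], yte)
```

On data like this, the pickled tree-plus-regression pair is far smaller than the forest while remaining close in accuracy, mirroring the paper's conclusion.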
-
Seismic wave fields in spherically symmetric Earth with high details. Analytical solution
Computer Research and Modeling, 2025, v. 17, no. 5, pp. 903-922
An analytical solution is obtained for seismic wave fields in a spherically symmetric Earth. In the case of an arbitrary layered medium, the solution, which involves Bessel functions, is constructed by a differential sweep method. Asymptotics of the Bessel functions are used for stable calculation of the wave fields. It is shown that the classical asymptotics give an error in the solution for a sphere of large dimensions (in wavelengths). A new asymptotic expansion is used for efficient, error-free calculation of the solution with high detail. A program has been created that makes it possible to compute high-frequency (1 Hz and above) teleseismic wave fields in a discrete (layered) sphere of planetary dimensions. The calculations can be carried out even on personal computers, with OpenMP parallelization.
Burmin (2019) proposed a spherically symmetric model of the Earth characterized by an outer core with nonzero viscosity and, therefore, a nonzero effective shear modulus. For this model of the Earth, a highly detailed calculation was carried out at a carrier frequency of 1 Hz. The analytical calculation showed that high-frequency oscillations of small amplitude, the so-called "precursors", appear ahead of the PKP waves, and that the theoretical seismograms for this model of the Earth are in many respects similar to the experimental data. This confirms the correctness of the ideas underlying its construction.
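The failure of the classical asymptotics can be demonstrated numerically: the standard large-argument formula for J_nu(x) is accurate only when the argument greatly exceeds the order, while for a planet-sized sphere the relevant orders are comparable to the argument. The specific orders and arguments below are illustrative:

```python
import numpy as np
from scipy.special import jv

def jv_classical(nu, x):
    """Classical large-argument asymptotics of J_nu(x), valid only for x >> nu."""
    return np.sqrt(2.0 / (np.pi * x)) * np.cos(x - nu * np.pi / 2 - np.pi / 4)

x = np.linspace(5000.0, 5010.0, 200)
# small order, huge argument: the classical formula is excellent
err_small_order = np.max(np.abs(jv(2.0, x) - jv_classical(2.0, x)))
# order comparable to the argument: the classical formula breaks down
err_large_order = np.max(np.abs(jv(4000.0, x) - jv_classical(4000.0, x)))
```

Uniform asymptotics in both order and argument (e.g. Debye- or Airy-type expansions) are the standard remedy in this regime.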
-
Using Docker service containers to build browser-based clinical decision support systems (CDSS)
Computer Research and Modeling, 2026, v. 18, no. 1, pp. 133-147
The article presents a technology for building clinical decision support systems (CDSS) based on service containers using Docker and a web interface that runs directly in the browser, without installing specialized software on the clinician's workstation. A modular architecture is proposed in which each application module is packaged as an independent service container combining a lightweight web server, a user interface, and computational components for medical image processing. Communication between the browser and the server side is implemented via a persistent bidirectional WebSocket connection with binary message serialization (MessagePack), which provides low latency and efficient transfer of large data volumes. For local storage of images and analysis results, browser facilities (IndexedDB with the Dexie.js wrapper) are used to speed up repeated data access. Three-dimensional visualization and basic operations with DICOM data are implemented with Three.js and AMI.js; this toolchain supports the integration of interactive elements arising from the task context (annotations, landmarks, markers, 3D models) into volumetric medical images.
Server components and functional modules are assembled as a set of interacting containers managed by Docker. The paper discusses the choice of base images, approaches to minimizing containers down to runtime-only executables without external utilities, and the organization of multi-stage builds with a dedicated build container. It describes a hub service that launches application containers on user request, performs request proxying, manages sessions, and switches a container from shared to exclusive mode at the start of computations. Examples of application modules are provided (fractional flow reserve estimation, quantitative flow ratio computation, aortic valve closure modeling), along with the integration of a React-based interface with a three-dimensional scene, a versioning policy, automated reproducibility checks, and the deployment procedure on the target platform.
It is demonstrated that containerization ensures portability and reproducibility of the software environment, dependency isolation and scalability, while the browser-based interface provides accessibility, reduced infrastructure requirements, and interactive real-time visualization of medical data. Technical limitations are noted (dependence on versions of visualization libraries and data formats) together with practical mitigation measures.
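The binary serialization choice can be illustrated on the server side with the `msgpack` Python package (an environment assumption; the paper does not specify its server language). The message fields and identifier are hypothetical, and the WebSocket transport itself is omitted:

```python
import msgpack  # pip install msgpack

# a hypothetical server->browser message: a small header plus a raw voxel
# payload; MessagePack carries the bytes natively, with no base64 overhead
message = {
    "type": "slice",
    "series_uid": "1.2.840.0000",   # placeholder identifier, not a real UID
    "shape": [4, 4],
    "voxels": bytes(range(16)),     # raw binary payload
}
packed = msgpack.packb(message, use_bin_type=True)      # compact binary frame
restored = msgpack.unpackb(packed, raw=False)           # lossless round trip
```

In the browser, the same frame would be decoded from the WebSocket's binary message by a JavaScript MessagePack implementation before being written to IndexedDB.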
-
Modeling of helix formation in peptides containing aspartic and glutamic residues
Computer Research and Modeling, 2010, v. 2, no. 1, pp. 83-90
In the present work we used molecular dynamics simulations and quantum chemistry methods to study the concept according to which aspartic and glutamic residues play a key role in initiating helix formation in oligopeptides. It has been shown that the first turn of the alpha-helix can be formed from various amino acid sequences with Asp and Glu residues at the N-terminus. The thermodynamic properties of this process were analyzed. The results obtained do not contradict known experimental and statistical data, and they substantially elaborate current views on the early stages of peptide folding.
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index