-
Numerical-analytical modeling of gravitational lensing of the electromagnetic waves in random-inhomogeneous space plasma
Computer Research and Modeling, 2024, v. 16, no. 2, pp. 433-443
A numerical-analytical tool for modeling the propagation characteristics of electromagnetic waves in chaotic space plasma, with gravitational effects taken into account, is developed for interpreting measurement data from new-generation astrophysical precision instruments. The problem of wave propagation in curved (Riemannian) space is solved in Euclidean space by introducing an effective refractive index of vacuum. The gravitational potential can be calculated for various models of the mass distribution of astrophysical objects by solving Poisson's equation, after which the effective refractive index of vacuum can be evaluated. An approximate model of the effective refractive index is proposed under the assumption that individual objects contribute additively to the total gravitational field. The characteristics of electromagnetic waves in the gravitational field of astrophysical objects are calculated in the geometrical-optics approximation, under the condition that the spatial scales of the refractive index are much larger than the wavelength. Ray differential equations in Euler form are the basis of the numerical-analytical tool for modeling the trajectory characteristics of the waves. Random inhomogeneities of the space plasma are introduced through a model of the spatial correlation function of the refractive index. Refractive scattering of the waves is also calculated in the geometrical-optics approximation. Integral equations for the statistical moments of the transverse deviations of rays in the observer's picture plane are obtained. Using analytical transformations, the integrals for the moments are reduced to a system of first-order ordinary differential equations for the joint numerical computation of the mean and mean-square deviations of the rays. Results of numerical-analytical modeling of the trajectory picture of electromagnetic wave propagation in interstellar space are presented, taking into account the gravitational fields of space objects and the refractive scattering of waves on inhomogeneities of the refractive index of the surrounding plasma. Based on the modeling results, a quantitative estimate of the conditions for stochastic blurring of the gravitational lensing effect of electromagnetic waves in various frequency ranges is obtained. It is shown that operating frequencies in the meter wavelength range represent a conditional low-frequency limit for observing the gravitational lensing effect in stochastic space plasma. The proposed numerical-analytical modeling tool can be used to analyze the structure of electromagnetic radiation of a quasar propagating through a group of galaxies.
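The abstract does not reproduce the formulas, but the approach it outlines can be illustrated by a minimal sketch: in the weak-field limit the gravitational potential $\Phi$ is commonly mapped to an effective vacuum refractive index $n \approx 1 - 2\Phi/c^2$, and rays are then traced in the geometrical-optics approximation from the ray equation $d(n\,\mathbf{t})/ds = \nabla n$. The point-mass potential, the Euler integration scheme, the step sizes and all function names below are illustrative assumptions, not the authors' code.

```python
import numpy as np

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s

def refractive_index(r, mass, center):
    """Effective vacuum refractive index n = 1 - 2*Phi/c^2 for a point-mass potential."""
    d = np.linalg.norm(r - center)
    return 1.0 + 2.0 * G * mass / (C**2 * d)

def grad_n(r, mass, center):
    """Analytic gradient of the effective refractive index (points toward the mass)."""
    d = r - center
    return -2.0 * G * mass * d / (C**2 * np.linalg.norm(d)**3)

def trace_ray(r0, direction, mass, center, ds=1.0e8, n_steps=20000):
    """Euler integration of the ray equation d(n t)/ds = grad n in the geometric-optics limit."""
    r = np.array(r0, dtype=float)
    t = np.array(direction, dtype=float)
    t /= np.linalg.norm(t)
    for _ in range(n_steps):
        t += ds * grad_n(r, mass, center) / refractive_index(r, mass, center)
        t /= np.linalg.norm(t)   # renormalization drops the longitudinal component to first order
        r += ds * t
    return r, t

# Illustrative call: a ray passing a solar-mass lens (all numbers are placeholders).
r_end, t_end = trace_ray(r0=[-1.0e12, 1.0e10, 0.0], direction=[1.0, 0.0, 0.0],
                         mass=2.0e30, center=np.zeros(3))
print("final direction:", t_end)
```

Random plasma inhomogeneities, which the paper handles through a correlation function of the refractive index and moment equations for ray deviations, are not included in this sketch.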
-
Modeling of rheological characteristics of aqueous suspensions based on nanoscale silicon dioxide particles
Computer Research and Modeling, 2024, v. 16, no. 5, pp. 1217-1252
The rheological behavior of aqueous suspensions based on nanoscale silicon dioxide particles strongly depends on the dynamic viscosity, which directly affects the use of nanofluids. The purpose of this work is to develop and validate models for predicting the dynamic viscosity from independent input parameters: silicon dioxide concentration SiO2, pH, and shear rate $\gamma$. The influence of the suspension composition on its dynamic viscosity is analyzed. Groups of suspensions with statistically homogeneous composition have been identified, within which the compositions are interchangeable. It is shown that at low shear rates the rheological properties of the suspensions differ significantly from those obtained at higher shear rates. Significant positive correlations of the dynamic viscosity of the suspension with SiO2 concentration and pH were established, along with negative correlations with the shear rate $\gamma$. Regularized regression models of the dependence of the dynamic viscosity $\eta$ on the concentrations of SiO2, NaOH, H3PO4, surfactant and EDA (ethylenediamine) and on the shear rate $\gamma$ were constructed. For more accurate prediction of the dynamic viscosity, models based on neural network and machine learning algorithms (MLP multilayer perceptron, RBF radial basis function network, SVM support vector machine, RF random forest) were trained. The effectiveness of the constructed models was evaluated using various statistical metrics, including the mean absolute error (MAE), the mean squared error (MSE), the coefficient of determination $R^2$, and the average absolute relative deviation (AARD%). The RF model proved to be the best model on both the training and test samples. The contribution of each component to the constructed model is determined. It is shown that the SiO2 concentration has the greatest influence on the dynamic viscosity, followed by pH and shear rate $\gamma$. The accuracy of the proposed models is compared with that of previously published models. The results confirm that the developed models can be considered a practical tool for studying the behavior of nanofluids based on aqueous suspensions of nanoscale silicon dioxide particles.
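As an illustration of the modeling workflow described above (not the authors' code), the sketch below fits a random forest regressor, the model reported as the most accurate, to predict dynamic viscosity from SiO2 concentration, pH and shear rate, and evaluates it with the metrics named in the abstract. The file name, column names and units are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Hypothetical data layout: one row per rheological measurement.
df = pd.read_csv("suspension_rheology.csv")     # columns assumed below
X = df[["SiO2", "pH", "shear_rate"]]            # inputs named in the abstract
y = df["viscosity"]                             # dynamic viscosity (assumed units)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

mae = mean_absolute_error(y_test, pred)
mse = mean_squared_error(y_test, pred)
r2 = r2_score(y_test, pred)
aard = 100.0 * np.mean(np.abs((y_test - pred) / y_test))   # AARD, %

print(f"MAE={mae:.3f}  MSE={mse:.3f}  R^2={r2:.3f}  AARD%={aard:.2f}")
print("feature importances:", dict(zip(X.columns, model.feature_importances_)))
```

The feature importances in the last line correspond to the per-component contribution analysis mentioned in the abstract.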
-
The task of trajectory calculation with the homogeneous distribution of results
Computer Research and Modeling, 2014, v. 6, no. 5, pp. 803-828
Citations: 3 (RSCI).
We consider a new set of tests designed to detect the human capability for parallel calculation. Unlike the tests discussed in our previous works, the new tests yield a homogeneous statistical distribution of results. This feature simplifies the analysis of the test results and decreases the estimate of the statistical error. The new experimental data are close to the results obtained in previous experiments.
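As a purely illustrative aside (not from the paper), the homogeneity of a distribution of test results can be checked, for example, with a chi-square goodness-of-fit test against a uniform expectation; the scores and binning below are synthetic.

```python
import numpy as np
from scipy.stats import chisquare

# Hypothetical test scores, assumed to lie in [0, 1).
scores = np.random.default_rng(1).uniform(size=300)

# Bin the scores and test the observed counts against a uniform expectation.
counts, _ = np.histogram(scores, bins=10, range=(0.0, 1.0))
stat, p_value = chisquare(counts)   # expected frequencies default to equal counts

print(f"chi-square = {stat:.2f}, p-value = {p_value:.3f}")
# A large p-value is consistent with a homogeneous (uniform) distribution of results.
```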
-
Kinetic model of DNA double-strand break repair in primary human fibroblasts exposed to low-LET irradiation with various dose rates
Computer Research and Modeling, 2015, v. 7, no. 1, pp. 159-176
Views (last year): 4. Citations: 3 (RSCI).
Here we present the results of kinetic modeling of DNA double-strand break induction and repair and of the formation of phosphorylated histone H2AX ($\gamma$-H2AX) and Rad51 foci in primary human fibroblasts exposed to low-LET ionizing radiation (IR). The model describes the two major pathways of DNA double-strand break repair, non-homologous end joining (NHEJ) and homologous recombination (HR), and considers interactions between DNA and several repair proteins (DNA-PKcs, ATM, Ku70/80, XRCC1, XRCC4, Rad51, RPA, etc.) using mass-action equations and Michaelis–Menten kinetics. Experimental data on DNA rejoining kinetics and on $\gamma$-H2AX and Rad51 foci formation in the vicinity of double-strand breaks in primary human fibroblasts exposed to low-LET IR with various dose rates and exposure times were used for training and statistical validation of the model.
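The published model couples many species; as a minimal illustration of the kind of mass-action and Michaelis–Menten rate equations it is built from (not the model itself), the sketch below integrates a two-variable fragment: double-strand breaks induced at a constant rate, removed with Michaelis–Menten kinetics, and driving $\gamma$-H2AX focus formation. All rate constants are hypothetical.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-variable fragment of a DSB repair model.
K_PROD = 0.5     # DSB induction rate, breaks per hour (hypothetical)
V_MAX = 2.0      # maximal repair rate, breaks per hour (hypothetical)
K_M = 5.0        # Michaelis constant, breaks (hypothetical)
K_FOCI = 0.8     # gamma-H2AX focus formation rate constant (hypothetical)
K_LOSS = 0.3     # focus resolution rate constant (hypothetical)

def rhs(t, y):
    dsb, foci = y
    d_dsb = K_PROD - V_MAX * dsb / (K_M + dsb)      # Michaelis-Menten removal of breaks
    d_foci = K_FOCI * dsb - K_LOSS * foci           # mass-action focus kinetics
    return [d_dsb, d_foci]

sol = solve_ivp(rhs, (0.0, 24.0), [0.0, 0.0], t_eval=np.linspace(0.0, 24.0, 25))
print(sol.y[0][-1], sol.y[1][-1])   # DSBs and foci after 24 h of continuous exposure
```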
-
Running applications on a hybrid cluster
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 475-483
Views (last year): 4.
A hybrid cluster implies the use of computational devices with radically different architectures: usually a conventional CPU architecture (e.g. x86_64) and a GPU architecture (e.g. NVIDIA CUDA). Creating and exploiting such a cluster requires some experience: in order to harness all the computational power of the described system and obtain substantial speedup for computational tasks, many factors should be taken into account. These factors include hardware characteristics (e.g. network infrastructure, type of data storage, GPU architecture) as well as the software stack (e.g. MPI implementation, GPGPU libraries). So, in order to run scientific applications, GPU capabilities, software features, task size and other factors should be considered.
This report discusses the opportunities and problems of hybrid computations. Some statistics from runs of test programs and applications are presented. The main focus is on open-source applications (e.g. OpenFOAM) that support GPGPU (with some parts rewritten to use GPGPU directly or by replacing libraries).
There are several approaches to organizing heterogeneous computations for different GPU architectures, of which the CUDA library and the OpenCL framework are compared. The CUDA library is becoming quite typical for hybrid systems with NVIDIA cards, but OpenCL offers portability, which can be a decisive factor when choosing a framework for development. We also put emphasis on multi-GPU systems, which are often used to build hybrid clusters. Calculations were performed on a hybrid cluster of the SPbU computing center.
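As an illustration of the portability argument for OpenCL made above (not code from the report), the PyOpenCL sketch below runs the same vector-addition kernel on whatever OpenCL device is available, NVIDIA GPU or otherwise; the kernel and buffer names are illustrative.

```python
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()          # picks an available OpenCL device (GPU or CPU)
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out)
{
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

program.add(queue, a.shape, None, a_buf, b_buf, out_buf)   # enqueue the kernel

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)                    # copy the result back to the host
assert np.allclose(result, a + b)
```

The same source runs unchanged on different vendors' devices, which is the portability trade-off against CUDA discussed in the report.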
-
Estimation of model parameters for time series with Markov switching regimes
Computer Research and Modeling, 2018, v. 10, no. 6, pp. 903-918
Views (last year): 36.
The paper considers the problem of estimating the parameters of time series described by regression models with Markov switching between two regimes at random instants of time and with independent Gaussian noise. For the solution, we propose a variant of the EM algorithm based on an iterative procedure in which the regression parameters are estimated for a given regime-switching sequence and the switching sequence is estimated for the given parameters of the regression models. In contrast to the well-known methods for estimating regression parameters in models with Markov switching, which are based on calculating the posterior probabilities of the discrete states of the switching sequence, here the estimates of the switching sequence are computed to be optimal by the criterion of maximum posterior probability. As a result, the proposed algorithm turns out to be simpler and requires fewer calculations. Computer modeling reveals the factors influencing the estimation accuracy. These factors include the number of observations, the number of unknown regression parameters, the degree of their difference across operating regimes, and the signal-to-noise ratio, which is associated with the coefficient of determination in the regression models. The proposed algorithm is applied to the problem of estimating parameters in regression models for the daily return of the RTS index as a function of the returns of the S&P 500 index and Gazprom shares for the period from 2013 to 2018. The parameter estimates found with the proposed algorithm are compared with the estimates produced by the EViews econometric package and with ordinary least squares estimates that ignore regime switching. Taking regime switching into account provides a more accurate representation of the structure of the statistical dependence between the studied variables. In switching models, an increase in the signal-to-noise ratio reduces the differences between the estimates produced by the proposed algorithm and by the EViews program.
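A minimal sketch of the iterative scheme described above, under simplifying assumptions (two regimes, a fixed transition matrix, a common noise variance, a crude initialization), is given below. It alternates per-regime least squares with a MAP (Viterbi-type) estimate of the switching sequence; it is an illustration of the idea, not the authors' algorithm.

```python
import numpy as np

def viterbi_regimes(y, X, betas, sigma, P, pi0):
    """MAP (Viterbi-type) estimate of the regime-switching sequence."""
    T, K = len(y), len(betas)
    # Per-observation Gaussian log-likelihood under each regime.
    ll = np.stack([-0.5 * ((y - X @ b) / sigma) ** 2 - np.log(sigma) for b in betas], axis=1)
    delta = np.log(pi0) + ll[0]
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        trans = delta[:, None] + np.log(P)     # trans[i, j]: best score ending in i, then i -> j
        back[t] = trans.argmax(axis=0)
        delta = trans.max(axis=0) + ll[t]
    s = np.zeros(T, dtype=int)
    s[-1] = delta.argmax()
    for t in range(T - 2, -1, -1):
        s[t] = back[t + 1, s[t + 1]]
    return s

def fit_switching_regression(y, X, n_iter=50):
    """Alternate per-regime OLS and MAP regime estimation (assumes both regimes stay populated)."""
    s = (y > np.median(y)).astype(int)                 # crude initial split into two regimes
    P = np.array([[0.95, 0.05], [0.05, 0.95]])         # assumed (fixed) transition matrix
    pi0 = np.array([0.5, 0.5])
    for _ in range(n_iter):
        betas = [np.linalg.lstsq(X[s == k], y[s == k], rcond=None)[0] for k in (0, 1)]
        resid = np.concatenate([y[s == k] - X[s == k] @ betas[k] for k in (0, 1)])
        sigma = resid.std() + 1e-12
        s_new = viterbi_regimes(y, X, betas, sigma, P, pi0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return betas, s

# Synthetic check: one regime switch in the middle of the sample (all numbers hypothetical).
rng = np.random.default_rng(0)
T = 300
X = np.column_stack([np.ones(T), rng.normal(size=T)])
true_s = (np.arange(T) >= T // 2).astype(int)
beta_true = np.array([[0.1, 0.5], [-0.2, 1.5]])
y = np.sum(X * beta_true[true_s], axis=1) + 0.1 * rng.normal(size=T)
betas_hat, s_hat = fit_switching_regression(y, X)
# Report the share of correctly labeled observations, allowing for regime relabeling.
print(np.round(betas_hat, 2), max((s_hat == true_s).mean(), (s_hat != true_s).mean()))
```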
-
Cytokines as indicators of the state of the organism in infectious diseases. Experimental data analysis
Computer Research and Modeling, 2020, v. 12, no. 6, pp. 1409-1426
When a person's disease is the result of a bacterial infection, various characteristics of the organism are used to observe the course of the disease. One of these indicators is the dynamics of the concentrations of cytokines, which are produced mainly by cells of the immune system. There are many types of these low-molecular-weight proteins in the human body and in many animal species. The study of cytokines is important for interpreting functional disorders of the body's immune system, assessing the severity of disease, monitoring the effectiveness of therapy, and predicting the course and outcome of treatment. The cytokine response of the body thus indicates characteristics of the course of the disease. To study the regularities of such indication, experiments were conducted on laboratory mice. Experimental data on the development of pneumonia and on treatment with several drugs after bacterial infection of mice are analyzed. The immunomodulatory drugs "Roncoleukin", "Leukinferon" and "Tinrostim" were used. The data are the concentrations of two types of cytokines in lung tissue and in the blood of the animals. Multifaceted statistical and non-statistical analysis of the data allowed us to find common patterns of changes in the "cytokine profile" of the body and to link them with the properties of the therapeutic preparations. The studied cytokines, interleukin-10 (IL-10) and interferon gamma (IFN$\gamma$), deviate in infected mice from the normal level of intact animals, indicating the development of the disease. Changes in cytokine concentrations in groups of treated mice are compared with those in a group of healthy (not infected) mice and a group of infected untreated mice. The comparison is made for groups of individuals, since cytokine concentrations are individual and differ significantly between individuals. Under these conditions, only groups of individuals can reveal the regularities of the course of the disease. These groups of mice were observed for two weeks. The dynamics of cytokine concentrations indicates the characteristics of the disease course and the efficiency of the therapeutic drugs used. The effect of a drug on the organisms is monitored by the location of these groups of individuals in the space of cytokine concentrations. The Hausdorff distance between the sets of cytokine concentration vectors of individuals is used in this space, based on the Euclidean distance between the elements of these sets. It was found that the drugs "Roncoleukin" and "Leukinferon" have a generally similar effect on the course of the disease, which differs from that of the drug "Tinrostim".
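As an illustration of the group comparison described above (not the authors' code), the sketch below computes the symmetric Hausdorff distance, built on Euclidean distances, between two sets of per-individual cytokine concentration vectors; the numbers are hypothetical.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Hypothetical cytokine concentration vectors (IL-10, IFN-gamma) per individual.
treated = np.array([[12.0, 35.0], [15.0, 40.0], [11.0, 33.0]])
control = np.array([[5.0, 20.0], [6.0, 22.0], [4.5, 19.0]])

# Symmetric Hausdorff distance = max of the two directed distances (Euclidean metric).
d = max(directed_hausdorff(treated, control)[0],
        directed_hausdorff(control, treated)[0])
print(f"Hausdorff distance between groups: {d:.2f}")
```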
Keywords: data processing, experiment, cytokine, immune system, pneumonia, statistics, approximation, Hausdorff distance.
-
Game-theoretic and reflexive combat models
Computer Research and Modeling, 2022, v. 14, no. 1, pp. 179-203
Modeling combat operations is an urgent scientific and practical task aimed at providing commanders and staffs with quantitative grounds for decision making. The authors propose a victory function for combat and military operations based on G. Tullock's contest function and taking into account the scale of combat (military) operations. The scale parameter was estimated on a sufficient volume of military statistics, and its values were found for the tactical, operational and strategic levels. Game-theoretic "offensive – defense" models are investigated, in which the sides solve the immediate and subsequent tasks with their troops formed in one or several echelons. At the first stage of modeling, the solution of the immediate task is found: the breakthrough (or holding) of defense points; at the second stage, the solution of the subsequent task: the defeat of the enemy in the depth of the defense (counterattack and restoration of the defense). For the tactical level, using the Nash equilibrium, solutions of the immediate task (distribution of the sides' forces over the defense points) were found in an antagonistic game for three criteria: a) breakthrough of the weakest point, b) breakthrough of at least one point, and c) the weighted average probability. It is shown that the attacking side should use the criterion of breaking through at least one point, which, all other things being equal, ensures the maximum probability of breaking through the defense points. At the second stage of modeling, for a particular case (both sides follow the criterion of breaking through the weakest point when breaking through and holding defense points), the problem of distributing forces and means between tactical tasks (echelons) was solved for two criteria: a) maximizing the probability of breaking through a defense point and the probability of defeating the enemy in the depth of the defense, b) maximizing the minimum of these probabilities (the guaranteed-result criterion). Awareness is an important aspect of combat operations. Several examples of reflexive games (games characterized by complex mutual awareness) and of information management are considered. It is shown under what conditions information control increases the player's payoff, and the optimal information control is found.
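As a purely illustrative sketch (not the authors' model), the code below uses a Tullock-type victory function $p = x^\mu/(x^\mu + y^\mu)$ with a hypothetical scale exponent $\mu$ and brute-forces a pure-strategy maximin allocation of attacking forces over defense points under the "break through at least one point" criterion. The force levels are placeholders, and the pure-strategy maximin value coincides with the Nash equilibrium value only when a saddle point in pure strategies exists.

```python
import itertools

MU = 1.5   # hypothetical scale exponent of the Tullock-type victory function

def p_win(x, y, mu=MU):
    """Probability that attacking force x breaks through a point defended by force y."""
    if x == 0 and y == 0:
        return 0.5
    return x**mu / (x**mu + y**mu)

def p_at_least_one(attack, defense):
    """Probability of breaking through at least one defense point."""
    q = 1.0
    for x, y in zip(attack, defense):
        q *= 1.0 - p_win(x, y)
    return 1.0 - q

def allocations(total, points):
    """All integer allocations of `total` force units over `points` positions."""
    for combo in itertools.combinations_with_replacement(range(points), total):
        yield tuple(combo.count(i) for i in range(points))

def attacker_maximin(total_attack, total_defense, points):
    """Best pure-strategy attacker allocation against the worst-case defense."""
    best_value, best_alloc = -1.0, None
    defenses = list(allocations(total_defense, points))
    for a in allocations(total_attack, points):
        worst = min(p_at_least_one(a, d) for d in defenses)
        if worst > best_value:
            best_value, best_alloc = worst, a
    return best_alloc, best_value

print(attacker_maximin(total_attack=6, total_defense=6, points=3))
```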
-
Numerical modeling of physical processes leading to the destruction of meteoroids in the Earth’s atmosphere
Computer Research and Modeling, 2022, v. 14, no. 4, pp. 835-851
Within the framework of the topical problem of comet and asteroid hazard, the physical processes causing the destruction and fragmentation of meteor bodies in the Earth's atmosphere are numerically investigated. Based on the developed physical-mathematical models that describe the motion of natural space objects in the atmosphere and their interaction with it, the falls of three bolides, among the largest and in some respects unusual in the history of meteoritics, are considered: Tunguska, Vitim and Chelyabinsk. Their peculiarity lies in the absence of any material meteorite remains and craters in the area of the presumed crash site for the first two bodies, and in the failure to find, as is assumed, the main parent body for the third (the mass of the recovered fragments is too small compared to the estimated mass). The effect of aerodynamic loads and heat fluxes on these bodies is studied, which leads to intensive surface mass loss and possible mechanical destruction. The velocities of the studied celestial bodies and the change in their masses are determined from a modernized system of equations of meteor physics. An important factor taken into account here is the variability of the mass-loss parameter under the action of the radiative and convective heat fluxes along the flight path. The fragmentation of meteoroids is considered within a model of progressive crushing based on the statistical theory of strength, taking into account the influence of the scale factor on the ultimate strength of the objects. The phenomena and effects arising for various kinematic and physical parameters of each of these bodies are revealed, in particular, a change in the ballistics of their flight in the denser layers of the atmosphere, consisting in a transition from descent to ascent. In this case, the following scenarios of the event can be realized: 1) the return of the body to outer space if its residual velocity exceeds the escape (second cosmic) velocity; 2) the transition of the body to an Earth-satellite orbit if the residual velocity exceeds the first cosmic velocity; 3) at lower values of the residual velocity, the return of the body after some time to the descent mode and its fall at a considerable distance from the presumed crash site. The realization of one of these three scenarios explains, for example, the absence of material traces, including craters, in the vicinity of the forest fall in the case of the Tunguska bolide. Assumptions about the possibility of such scenarios have been made earlier by other authors, and in this paper their realization is confirmed by the results of numerical calculations.
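The abstract refers to the equations of meteor physics; as an illustration only (not the paper's model, which also includes a variable ablation parameter and progressive fragmentation), the sketch below integrates the classical single-body equations for deceleration by drag and mass loss by ablation along a straight-line entry in an isothermal atmosphere. All coefficients and initial conditions are hypothetical, order-of-magnitude values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters for an illustrative stony body on a straight-line entry.
CD = 1.0                   # drag coefficient
CH = 0.1                   # heat-transfer coefficient
Q = 8.0e6                  # effective heat of ablation, J/kg
RHO_B = 3300.0             # body density, kg/m^3
RHO_0 = 1.225              # sea-level air density, kg/m^3
H_SCALE = 7160.0           # atmospheric scale height, m
THETA = np.radians(18.0)   # entry angle below the horizon

def midsection(m):
    """Cross-sectional area of a sphere of mass m and density RHO_B."""
    radius = (3.0 * m / (4.0 * np.pi * RHO_B)) ** (1.0 / 3.0)
    return np.pi * radius**2

def rhs(t, y):
    v, m, h = y
    rho = RHO_0 * np.exp(-h / H_SCALE)
    s = midsection(m)
    dv = -CD * rho * s * v**2 / (2.0 * m)          # deceleration by aerodynamic drag
    dm = -CH * rho * s * v**3 / (2.0 * Q)          # ablation driven by the heat flux
    dh = -v * np.sin(THETA)                        # altitude along a straight trajectory
    return [dv, dm, dh]

def hit_ground(t, y):
    return y[2]
hit_ground.terminal = True

# Order-of-magnitude initial conditions: ~19 km/s, ~1.2e7 kg, from 100 km altitude.
sol = solve_ivp(rhs, (0.0, 60.0), [1.9e4, 1.2e7, 1.0e5], events=hit_ground, max_step=0.05)
print(f"final speed {sol.y[0, -1]:.0f} m/s, residual mass {sol.y[1, -1]:.2e} kg")
```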
-
Development of and research into a rigid algorithm for analyzing Twitter publications and its influence on the movements of the cryptocurrency market
Computer Research and Modeling, 2023, v. 15, no. 1, pp. 157-170
Social media are a crucial indicator of the position of assets in the financial market. The paper describes a rigid solution of the classification problem of determining the influence of social media activity on financial market movements. Reputable crypto-trader influencers are selected, and packages of their Twitter posts are used as data. Preprocessing of the texts, which are characterized by heavy use of slang words and abbreviations, consists of lemmatization with Stanza and the use of regular expressions. In solving the binary classification problem, each word is treated as an element of the feature vector of a data unit. The best markup parameters for processing Binance candles are searched for. Feature selection, which is necessary for a precise description of the text data and the subsequent process of establishing the dependence, is performed by machine learning and statistical analysis. The first approach to feature selection is based on an information criterion; it is implemented in a random forest model and is relevant to the task of selecting features for splitting nodes in a decision tree. The second is based on the rigid compilation of a binary vector during a rough check of the presence or absence of a word in the package and on counting the sum of the elements of this vector; a decision is then made depending on whether this sum exceeds a threshold value determined beforehand by analyzing the frequency distribution of mentions of the word. The algorithm used to solve the problem was named "benchmark" and analyzed as a tool. Similar algorithms are often used in automated trading strategies. In the course of the study, observations of the influence of frequently occurring words, which are used as a basis of dimension 2 and 3 in vectorization, are described as well.
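As an illustration of the rigid "benchmark"-style rule described above (not the authors' implementation), the sketch below builds a binary presence vector for a fixed word list over a package of tweets, sums it and compares the sum with a predetermined threshold; the word list, threshold and tweets are hypothetical, and the real preprocessing (Stanza lemmatization) is replaced by a crude regular-expression tokenizer.

```python
import re

KEYWORDS = ["pump", "bullish", "moon"]     # hypothetical signal words
THRESHOLD = 2                              # hypothetical, chosen from frequency analysis

def normalize(text):
    """Lowercase and keep simple tokens only (crude stand-in for real preprocessing)."""
    return re.findall(r"[a-z#$]+", text.lower())

def benchmark_signal(tweets, keywords=KEYWORDS, threshold=THRESHOLD):
    """Binary decision: does the package of tweets mention enough signal words?"""
    tokens = set()
    for tweet in tweets:
        tokens.update(normalize(tweet))
    presence = [1 if w in tokens else 0 for w in keywords]   # rigid binary presence vector
    return int(sum(presence) >= threshold)

package = ["BTC to the moon!", "Feeling bullish today", "Just had lunch"]
print(benchmark_signal(package))   # 1 -> treat the package as a positive market-movement signal
```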