-
Simulation of equatorial plasma bubbles starting from plasma clouds
Computer Research and Modeling, 2019, v. 11, no. 3, pp. 463-476
Experimental, theoretical, and numerical investigations of equatorial spread F, equatorial plasma bubbles (EPBs), plasma depletion shells, and plasma clouds continue in a variety of recent articles. Nonlinear growth, bifurcation, pinching, and atomic and molecular ion dynamics are considered in these articles. However, the authors of this article believe that not all reported parameters of EPB development are correct; for example, EPB bifurcation is highly questionable.
The maximum speed inside EPBs and the development time of an EPB are defined and studied. The EPBs start from one, two, or three zones of increased density (initial plasma clouds). The development mechanism of EPBs is the Rayleigh-Taylor instability (RTI). The initial stage of EPB development falls within the time interval favorable for EPBs (when the linear growth increment is greater than zero), which is 3000–7000 s for the Earth's equatorial ionosphere.
Numerous computing experiments were conducted using the original two-dimensional mathematical and numerical model MI2, similar to the US standard model SAMI2. The MI2 model is described in detail. The results obtained can be used both in other theoretical works and for planning and carrying out field experiments on the generation of F-spread in the Earth's ionosphere.
Numerical simulation was carried out for geophysical conditions favorable for EPB development. The numerical studies confirmed that the development time of EPBs from initial irregularities with increased density is significantly longer than the development time from zones of lowered density. It is shown that the developed irregularities interact with each other strongly and nonlinearly, even when the initial plasma clouds are widely separated from each other. Moreover, this interaction is stronger than the interaction of EPBs starting from initial irregularities with decreased density. The results of the numerical experiments showed good agreement of the developed EPB parameters with experimental data and with the theoretical studies of other authors.
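For orientation, the sketch below estimates the linear growth increment of the collisional Rayleigh-Taylor instability from the textbook relation gamma ~ g/(nu_in * L), where L is the bottomside density-gradient scale length. All parameter values are illustrative assumptions and are not taken from the MI2 model; the point is only that the resulting e-folding time is of the same order as the 3000–7000 s interval quoted above.

```python
# Illustrative estimate of the linear growth increment of the collisional
# Rayleigh-Taylor instability, gamma ~ (g / nu_in) * (1/n) * dn/dh = g/(nu_in*L).
# All parameter values below are assumptions for illustration only and are
# not taken from the MI2 model described in the article.

g = 9.8        # gravitational acceleration, m/s^2
nu_in = 0.5    # ion-neutral collision frequency, 1/s (assumed)
L = 50e3       # bottomside plasma density gradient scale length, m (assumed)

gamma = g / (nu_in * L)                       # linear growth rate, 1/s
print(f"growth rate gamma = {gamma:.2e} 1/s")
print(f"e-folding time    = {1/gamma:.0f} s")  # same order as the 3000-7000 s window
```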
-
Comparative analysis of human adaptation to the growth of visual information in the tasks of recognizing formal symbols and meaningful images
Computer Research and Modeling, 2021, v. 13, no. 3, pp. 571-586
We describe an engineering-psychological experiment that continues the study of how a person adapts to the increasing complexity of logical problems, by presenting a series of problems of increasing complexity determined by the volume of initial data. The tasks require calculations in an associative or non-associative system of operations. From the way the solution time changes with the number of required operations, we can conclude whether the subject uses a purely sequential method of solving problems or connects additional brain resources to the solution in parallel mode. In previously published experimental work, a person solving an associative problem recognized color images with meaningful content. In the new study, a similar problem is solved for abstract monochrome geometric shapes. Analysis of the results showed that in the second case the probability of the subject switching to a parallel method of processing visual information is significantly reduced.
The research method is based on presenting a person with two types of tasks. One type contains associative calculations and allows a parallel solution algorithm. The other type is a control, containing problems in which the calculations are not associative and parallel algorithms are ineffective. The task of recognizing and searching for a given object is associative; a parallel strategy significantly speeds up the solution with relatively small additional resources. As a control series of problems (to separate parallel work from the acceleration of a sequential algorithm), we use, as in the previous experiment, a non-associative comparison problem in cyclic arithmetic, presented in the visual form of the game "rock, paper, scissors". In this problem, a parallel algorithm requires a large number of processors with a small efficiency coefficient. Therefore, a person's transition to a parallel algorithm for this problem is almost impossible, and the processing of input information can be accelerated only by increasing its speed.
Comparing the dependence of the solution time on the volume of source data for the two types of problems allows us to identify four strategies for adapting to the increasing complexity of the problem: uniform sequential, accelerated sequential, parallel computing (where possible), or an undefined (by this method) strategy. The reduction in the number of subjects who switch to a parallel strategy when the input information is encoded with formal images shows the effectiveness of codes that evoke associations in the subject: they increase the speed of human perception and processing of information. The article contains a preliminary mathematical model that explains this phenomenon. It is based on the appearance of a second set of initial data, which arises in a person as a result of recognizing the depicted objects.
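A minimal sketch of why the control task resists parallelization: the cyclic "rock, paper, scissors" comparison is not associative, so regrouping the operations (as a parallel tree reduction would) can change the result. The 0/1/2 encoding below is an illustrative assumption, not the encoding used in the experiment.

```python
# The cyclic "rock, paper, scissors" comparison is not associative, so a
# tree-shaped (parallel) reduction generally changes the result, unlike an
# associative operation such as max or addition.

def rps(a, b):
    """Return the winner of one round: 0 = rock, 1 = paper, 2 = scissors."""
    return a if (a - b) % 3 == 1 else b

# Left-to-right (sequential) grouping vs. a different grouping of the same inputs:
seq = rps(rps(0, 1), 2)   # (rock vs paper) -> paper; paper vs scissors -> scissors (2)
par = rps(0, rps(1, 2))   # (paper vs scissors) -> scissors; rock vs scissors -> rock (0)
print(seq, par)           # 2 0  -> the operation is not associative
```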
-
Image classification based on deep learning with automatic relevance determination and structured Bayesian pruning
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 927-938
Deep learning’s power stems from complex architectures; however, these can lead to overfitting, where models memorize training data and fail to generalize to unseen examples. This paper proposes a novel probabilistic approach to mitigate this issue. We introduce two key elements: Truncated Log-Uniform Prior and Truncated Log-Normal Variational Approximation, and Automatic Relevance Determination (ARD) with Bayesian Deep Neural Networks (BDNNs). Within the probabilistic framework, we employ a specially designed truncated log-uniform prior for noise. This prior acts as a regularizer, guiding the learning process towards simpler solutions and reducing overfitting. Additionally, a truncated log-normal variational approximation is used for efficient handling of the complex probability distributions inherent in deep learning models. ARD automatically identifies and removes irrelevant features or weights within a model. By integrating ARD with BDNNs, where weights have a probability distribution, we achieve a variational bound similar to the popular variational dropout technique. Dropout randomly drops neurons during training, encouraging the model not to rely heavily on any single feature. Our approach with ARD achieves similar benefits without the randomness of dropout, potentially leading to more stable training.
To evaluate our approach, we have tested the model on two datasets: the Canadian Institute For Advanced Research (CIFAR-10) dataset for image classification and a dataset of Macroscopic Images of Wood, compiled from several datasets of macroscopic wood images. Our method is applied to established architectures such as Visual Geometry Group (VGG) and Residual Network (ResNet). The results demonstrate significant improvements: the model reduced overfitting while maintaining, or even improving, the accuracy of the network’s predictions on classification tasks. This validates the effectiveness of our approach in enhancing the performance and generalization capabilities of deep learning models.
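For readers who want a concrete picture of the mechanism, the sketch below implements the closely related scheme of sparse variational dropout (Molchanov et al., 2017) in PyTorch: each weight receives multiplicative Gaussian noise with a learned per-weight variance, and weights whose effective dropout rate grows large are pruned. This is a hedged stand-in, not the authors' exact truncated log-uniform / truncated log-normal construction.

```python
# Sketch of ARD via variational dropout (Molchanov et al., 2017), shown only
# to illustrate the mechanism; NOT the truncated-distribution construction
# used in the paper.
import torch
import torch.nn as nn

class ARDLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.log_sigma2 = nn.Parameter(torch.full((out_features, in_features), -10.0))

    def forward(self, x):
        if self.training:
            # Local reparameterization: sample the pre-activations directly.
            mean = x @ self.weight.t()
            var = (x ** 2) @ self.log_sigma2.exp().t()
            return mean + var.clamp(min=1e-12).sqrt() * torch.randn_like(mean)
        # At test time, prune weights with a large effective dropout rate alpha.
        log_alpha = self.log_sigma2 - (self.weight ** 2).clamp(min=1e-12).log()
        mask = (log_alpha < 3.0).float()   # keep weights with log(alpha) < 3
        return x @ (self.weight * mask).t()

    def kl(self):
        # Approximate KL(q || log-uniform prior) from Molchanov et al. (2017).
        log_alpha = self.log_sigma2 - (self.weight ** 2).clamp(min=1e-12).log()
        k1, k2, k3 = 0.63576, 1.87320, 1.48695
        neg_kl = (k1 * torch.sigmoid(k2 + k3 * log_alpha)
                  - 0.5 * torch.log1p(torch.exp(-log_alpha)) - k1)
        return -neg_kl.sum()
```

In training, the loss would be the usual cross-entropy plus a (possibly annealed) weight on the sum of kl() over all such layers.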
-
Detecting large fractures in geological media using convolutional neural networks
Computer Research and Modeling, 2025, v. 17, no. 5, pp. 889-901
This paper considers the inverse problem of seismic exploration: determining the structure of a medium from the wave response recorded from it. Large cracks, whose size and position are to be determined, are considered as the target objects.
The direct problem is solved using the grid-characteristic method, which allows physically based algorithms to be used for treating the outer boundaries of the region and the contact boundaries inside it. The crack is assumed to be thin; a special condition on the crack faces is used to describe it.
The inverse problem is solved using convolutional neural networks. The input data of the neural network are seismograms interpreted as images, and the output data are masks describing the medium on a structured grid. Each element of this grid belongs to one of two classes: either an element of a continuous geological massif or an element through which a crack passes. This approach makes it possible to consider a medium with an unknown number of cracks.
The neural network is trained using only samples with one crack. The final testing of the trained network is performed on additional samples with several cracks, which are not involved in the training process. The purpose of testing under such conditions is to verify that the trained network has sufficient generality, recognizes the signs of a crack in the signal, and does not overfit to samples with a single crack in the medium.
The paper shows that a convolutional network trained on samples with a single crack can be used to process data with multiple cracks. The network detects fairly small cracks at great depths if they are sufficiently separated from each other in space; in this case, their wave responses are clearly distinguishable on the seismogram and can be interpreted by the neural network. If the cracks are close to each other, artifacts and interpretation errors may occur, because the wave responses of nearby cracks merge on the seismogram. This causes the network to interpret several closely spaced cracks as one. It should be noted that a human would most likely make a similar error during manual interpretation of the data. The paper provides examples of such artifacts, distortions, and recognition errors.
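As a rough illustration of this pipeline, the sketch below shows a small convolutional encoder-decoder in PyTorch that maps a single-channel image (a seismogram) to a two-class per-cell mask. The architecture, layer sizes, and input shape are assumptions chosen for illustration; the paper's actual network is not reproduced here.

```python
# Minimal encoder-decoder mapping a seismogram "image" to a per-cell mask:
# class 0 = intact medium, class 1 = cell crossed by a crack.
import torch
import torch.nn as nn

class CrackSegmenter(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 2, 2, stride=2),  # 2 class logits per cell
        )

    def forward(self, seismogram):
        return self.decoder(self.encoder(seismogram))

model = CrackSegmenter()
x = torch.randn(1, 1, 128, 128)   # one synthetic single-channel seismogram
logits = model(x)                 # shape (1, 2, 128, 128)
mask = logits.argmax(dim=1)       # 0 = intact cell, 1 = crack cell
```

Training such a model would use a per-cell cross-entropy loss against the known crack masks of the synthetic samples.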
-
System to store DNA physical properties profiles with application to the promoters of Escherichia coli
Computer Research and Modeling, 2013, v. 5, no. 3, pp. 443-450
A database to store, search, and retrieve DNA physical-property profiles has been developed, and its use for the analysis of E. coli promoters has been demonstrated. A unique feature of the database is its ability to handle a whole profile as a single internal object type, in a way similar to integers or character strings. To demonstrate the utility of such a database, it was populated with data on 1227 known promoters: their nucleotide sequences, electrostatic potential profiles, and transcription factor binding sites. Each promoter is also connected to all genes whose transcription it controls. The content of the database is available for search via a web interface. The source code of the profile datatype and a library to work with it from R/Bioconductor are available on the internet; a dump of the database is available from the authors on request.
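A minimal sketch of the access pattern that a profile-as-a-value datatype enables, using sqlite3 and JSON as a stand-in. The real system implements a dedicated profile type inside the DBMS; the table and column names and the toy values here are invented for illustration.

```python
# Storing a whole physical-property profile as one column value, so a
# promoter row carries its electrostatic-potential profile the way it
# carries an integer or a string. Illustrative stand-in only.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE promoter (
    name     TEXT PRIMARY KEY,
    sequence TEXT,
    profile  TEXT   -- whole electrostatic-potential profile as one value
)""")

profile = [0.12, 0.15, 0.11, 0.09]  # toy potential values along the sequence
conn.execute("INSERT INTO promoter VALUES (?, ?, ?)",
             ("toyP1", "TTGACATATAAT", json.dumps(profile)))

row = conn.execute("SELECT profile FROM promoter WHERE name = ?",
                   ("toyP1",)).fetchone()
print(json.loads(row[0]))           # the profile comes back as one object
```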
-
Seismic wave fields in a spherically symmetric Earth in high detail. Analytical solution
Computer Research and Modeling, 2025, v. 17, no. 5, pp. 903-922
An analytical solution is obtained for seismic wave fields in a spherically symmetric Earth. In the case of an arbitrary layered medium, the solution, which involves Bessel functions, is constructed by means of a differential sweep method. The asymptotics of the Bessel functions are used for stable calculation of the wave fields. It is shown that the classical asymptotic, in the case of a sphere of large (in wavelengths) dimensions, introduces an error into the solution. A new asymptotic is used for efficient, error-free calculation of the solution in high detail. A program has been created that makes it possible to calculate high-frequency (1 Hz and higher) teleseismic wave fields in a discrete (layered) sphere of planetary dimensions. The calculations can even be carried out on personal computers with OpenMP parallelization.
In the work of Burmin (2019), a spherically symmetric model of the Earth was proposed. It is characterized by the fact that its outer core has a viscosity and, therefore, an effective shear modulus other than zero. For this model of the Earth, a highly detailed calculation was carried out with a carrier frequency of 1 Hz. The analytical calculation showed that high-frequency oscillations of small amplitude, the so-called "precursors", appear ahead of the PKP waves. It also showed that the theoretical seismograms for this model of the Earth are in many respects similar to the experimental data, which confirms the correctness of the ideas underlying its construction.
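As a small numerical aside on the asymptotics discussed above, the sketch below compares SciPy's spherical Bessel function with the classical large-argument asymptotic j_n(x) ~ sin(x - n*pi/2)/x, which degrades as the order n becomes comparable to the argument x (the regime of a sphere that is large in wavelengths). The chosen values of n and x are illustrative assumptions.

```python
# Illustration: the classical large-argument asymptotic of the spherical
# Bessel function is accurate only while n**2 << x; for orders comparable
# to the argument it fails badly.
import numpy as np
from scipy.special import spherical_jn

x = 1000.0                               # dimensionless radius k*r (assumed)
for n in (10, 300, 900):
    exact = spherical_jn(n, x)
    classical = np.sin(x - n * np.pi / 2) / x
    print(f"n={n:4d}  exact={exact:+.3e}  classical={classical:+.3e}")
```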
-
Biohydrochemical portrait of the White Sea
Computer Research and Modeling, 2018, v. 10, no. 1, pp. 125-160
The biohydrochemical portrait of the White Sea is constructed from CNPSi-model calculations based on long-term mean annual observations (average monthly hydrometeorological, hydrochemical, and hydrobiological parameters of the marine environment), as well as on updated information on the nutrient input to the sea with the runoff of the main river tributaries (Niva, Onega, Northern Dvina, Mezen, Kem, Keret). The marine-environment parameters are temperature, light, transparency, and nutrient load. The ecological characteristics of the sea "portrait" were calculated for nine marine areas (Kandalaksha, Onega, Dvina, and Mezen Bays, the Solovetsky Islands, the Basin, the Gorlo, the Voronka, and Chupa Bay). These include: the concentration changes of organic and mineral compounds of the biogenic elements (C, N, P, Si); the biomass of organisms of the lower trophic level (heterotrophic bacteria, diatom phytoplankton, herbivorous and predatory zooplankton); and other characteristics (rates of change of substance concentrations and organism biomass, internal and external substance flows, balances of individual substances and of nutrients as a whole). Parameters of the state of the marine environment (water temperature, ratio of mineral N and P fractions) and of the diatom phytoplankton dominant in the sea (abundance, production, biomass, chlorophyll a content) were calculated and compared with the results of individual surveys (for 1972–1991 and 2007–2012) of the White Sea water areas. The methods for estimating these parameters from observations and from calculations differ; nevertheless, the calculated values of the phytoplankton state are comparable with the measurements and similar to the data in the literature. For example, according to literature data, the annual production of diatoms in the White Sea is estimated at 1.5–3 million tons C (for a vegetation period of 180 days), while according to the calculations it is ~2 and 3.5 million tons C for vegetation periods of 150 and 180 days, respectively.
Keywords: White Sea ecosystem, nutrients, heterotrophic bacterioplankton, diatom phytoplankton, herbivorous and predatory zooplankton, detritus, trophic chain, CNPSi-model of nutrient biotransformation, ecological portrait of the White Sea, comparison of observed and calculated parameters of diatoms (abundance, production, biomass, chlorophyll a).
-
Application of a kinetic-type model to the study of the spatial spread of COVID-19
Computer Research and Modeling, 2021, v. 13, no. 3, pp. 611-627
A simple model based on a kinetic-type equation is proposed to describe the spread of a virus in space through the migration of virus carriers from a certain center. The consideration is carried out using the example of three countries for which such a one-dimensional model is applicable: Russia, Italy, and Chile. The geographical location of these countries and their elongation away from the centers of infection (Moscow; Milan and Lombardy as a whole; and Santiago, respectively) make it possible to use such an approximation. The aim is to determine the dynamic density of the infected in time and space. The model has two parameters. The first is the average spreading speed associated with the transfer of infected people by transport vehicles. The second is the rate at which the number of infected decreases as they move through the country, which is associated with passengers reaching their destinations as well as with quarantine measures. The parameters are determined from the actual data for the first days of the spatial spread of the epidemic. An analytical solution is constructed; simple numerical methods are also used to obtain a series of calculations. The geographical spread of the disease is the factor taken into account in the model; a second important factor, contact infection on site, is not taken into account. Therefore, the calculated values coincide with the actual data in the initial period of infection, after which the real data become higher than the model data. Nonetheless, the model calculations allow us to make some predictions. In addition to the speed of infection, a similar "speed of recovery" can be considered; when such a speed is found for the majority of the country's population, a conclusion is made about the beginning of a global recovery, which coincides with the real data.
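A minimal sketch, under assumed parameter values, of the transport-with-removal structure the abstract describes: if f(x, t) is the density of infected carriers, a kinetic-type balance with mean speed u and removal rate lam, df/dt + u df/dx = -lam * f, has the characteristic solution f(x, t) = f0(x - u t) exp(-lam t). The exact equation and the parameters fitted in the paper are not reproduced here.

```python
# Characteristic solution of a transport-decay (kinetic-type) equation:
# carriers drift from the center at mean speed u and are removed at rate lam
# (arrival at destination, quarantine). All values are assumptions.
import numpy as np

u = 60.0     # mean spreading speed, km/day (assumed)
lam = 0.2    # removal rate of carriers, 1/day (assumed)

def f0(x):
    """Initial density of infected carriers near the center (assumed Gaussian)."""
    return np.exp(-(x / 50.0) ** 2)

def f(x, t):
    """Analytical solution along characteristics: f0(x - u*t) * exp(-lam*t)."""
    return f0(x - u * t) * np.exp(-lam * t)

x = np.linspace(0.0, 2000.0, 5)   # distance from the infection center, km
for t in (0.0, 5.0, 10.0):
    print(f"t = {t:4.1f} days:", np.round(f(x, t), 4))
```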
-
Cytokines as indicators of the state of the organism in infectious diseases. Experimental data analysis
Computer Research and Modeling, 2020, v. 12, no. 6, pp. 1409-1426
When a person's disease is the result of a bacterial infection, various characteristics of the organism are used to monitor the course of the disease. Currently, one of these indicators is the dynamics of the concentrations of cytokines, which are produced mainly by cells of the immune system. There are many types of these low-molecular-weight proteins in the human body and in many animal species. The study of cytokines is important for interpreting functional disorders of the body's immune system, assessing the severity of a disease, monitoring the effectiveness of therapy, and predicting the course and outcome of treatment. The body's cytokine response thus indicates the characteristics of the course of a disease.
To study the regularities of such indication, experiments were conducted on laboratory mice. We analyze experimental data on the development of pneumonia and on the treatment of bacterially infected mice with several drugs: the immunomodulatory preparations "Roncoleukin", "Leikinferon", and "Tinrostim". The data comprise the concentrations of two types of cytokines in the lung tissue and blood of the animals. A many-sided statistical and non-statistical analysis of the data allowed us to find common patterns of change in the "cytokine profile" of the body and to link them with the properties of the therapeutic preparations. The studied cytokines, interleukin-10 (IL-10) and interferon gamma (IFN$\gamma$), deviate in infected mice from the normal level of intact animals, indicating the development of the disease.
Changes in cytokine concentrations in the groups of treated mice are compared with those in a group of healthy (uninfected) mice and in a group of infected untreated mice. The comparison is made for groups of individuals, since cytokine concentrations are individual and differ significantly between individuals; under these conditions, only groups of individuals can reveal the regularities of the course of the disease. These groups of mice were observed for two weeks. The dynamics of the cytokine concentrations indicates the characteristics of the disease course and the efficiency of the therapeutic drugs used. The effect of a drug on the organisms is monitored by the location of these groups of individuals in the space of cytokine concentrations; in this space, the Hausdorff distance between the sets of cytokine-concentration vectors of the individuals is used, based on the Euclidean distance between the elements of these sets. It was found that the drugs "Roncoleukin" and "Leikinferon" have a generally similar effect on the course of the disease, different from that of the drug "Tinrostim".
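A minimal sketch of the group-comparison metric described above: the symmetric Hausdorff distance between two sets of per-individual cytokine-concentration vectors, built on Euclidean distances between set elements. The toy numbers are invented, not experimental data.

```python
# Symmetric Hausdorff distance between two groups of individuals in the
# space of cytokine concentrations, via SciPy's directed Hausdorff distance.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Rows are individual mice, columns are concentrations of two cytokines
# (e.g., IL-10 and IFN-gamma). Toy values for illustration only.
group_treated = np.array([[1.2, 0.8], [1.0, 1.1], [0.9, 0.7]])
group_control = np.array([[2.0, 1.5], [1.8, 1.9]])

d_forward = directed_hausdorff(group_treated, group_control)[0]
d_backward = directed_hausdorff(group_control, group_treated)[0]
hausdorff = max(d_forward, d_backward)   # symmetric Hausdorff distance
print(f"Hausdorff distance between groups: {hausdorff:.3f}")
```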
Keywords: data processing, experiment, cytokine, immune system, pneumonia, statistics, approximation, Hausdorff distance.
-
Development of and research into a rigid algorithm for analyzing Twitter publications and its influence on the movements of the cryptocurrency market
Computer Research and Modeling, 2023, v. 15, no. 1, pp. 157-170
Social media activity is a crucial indicator of the position of assets in the financial market. The paper describes a rigid solution to the classification problem of determining the influence of social media activity on financial market movements. Reputable crypto-trading influencers are selected, and packages of their Twitter posts are used as data. The preprocessing of the text, which is characterized by heavy use of slang words and abbreviations, consists of lemmatization with Stanza and the use of regular expressions. A word is treated as an element of the vector representing a data unit when solving the binary classification problem. The best markup parameters for processing Binance candles are searched for. Feature selection, which is necessary for a precise description of the text data and the subsequent process of establishing the dependence, is represented by machine learning and statistical analysis methods. The first approach is feature selection based on an information criterion; it is implemented in a random forest model and is relevant to selecting features for splitting nodes in a decision tree. The second is based on the rigid compilation of a binary vector by a rough check of the presence or absence of each word in the package and counting the sum of the elements of this vector; a decision is then made depending on whether this sum exceeds a threshold value determined in advance by analyzing the frequency distribution of mentions of the word. The algorithm used to solve the problem was named the benchmark and analyzed as a tool; similar algorithms are often used in automated trading strategies. The study also describes observations of the influence of frequently occurring words, which are used as a basis of dimension 2 and 3 in the vectorization.
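A minimal sketch of the rigid benchmark rule as described: build a binary presence/absence vector over a fixed word list, sum its elements, and compare the sum with a pre-set threshold. The word list, threshold, and sample texts are illustrative assumptions, not the vocabulary or threshold used in the paper.

```python
# Rigid binary-vector decision rule: presence/absence of listed words,
# sum compared with a predetermined threshold. Illustrative values only.
import re

KEYWORDS = ["bullish", "moon", "buy", "pump", "breakout"]   # assumed list
THRESHOLD = 2                                               # assumed threshold

def rigid_signal(tweet: str) -> bool:
    """Return True if the tweet is classified as market-moving."""
    words = set(re.findall(r"[a-z]+", tweet.lower()))       # crude tokenization
    binary_vector = [1 if w in words else 0 for w in KEYWORDS]
    return sum(binary_vector) > THRESHOLD

print(rigid_signal("BTC looking bullish, time to buy the breakout and pump"))  # True
print(rigid_signal("quiet day on the markets"))                                # False
```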