All issues
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
Numerical modeling of physical processes leading to the destruction of meteoroids in the Earth’s atmosphere
Computer Research and Modeling, 2022, v. 14, no. 4, pp. 835-851
Within the framework of the topical problem of comet and asteroid hazard, the physical processes causing the destruction and fragmentation of meteor bodies in the Earth's atmosphere are numerically investigated. Based on the developed physical-mathematical models that describe the motion of natural space objects in the atmosphere and their interaction with it, the falls of three of the largest and, in some respects, most unusual bolides in the history of meteoritics are considered: Tunguska, Vitim and Chelyabinsk. Their singularity lies in the absence of any material meteorite remains and craters in the area of the presumed crash site for the first two bodies, and in the failure to find, as is assumed, the main parent body for the third (the mass of the recovered fragments is too small compared to the estimated mass). The effects of aerodynamic loads and heat fluxes on these bodies are studied; they lead to intensive surface mass loss and possible mechanical destruction. The velocities of the studied celestial bodies and the change in their masses are determined from a modernized system of equations of meteor physics. An important factor taken into account here is the variability of the mass-loss parameter under the action of radiative and convective heat fluxes along the flight path. The fragmentation of meteoroids is considered within a progressive crushing model based on the statistical theory of strength, taking into account the influence of the scale factor on the ultimate strength of objects. The phenomena and effects arising at various kinematic and physical parameters of each of these bodies are revealed; in particular, the flight ballistics can change in the denser layers of the atmosphere, with the body switching from descent to ascent.
In this case, the following scenarios can be realized: 1) the body returns to outer space if its residual velocity exceeds the second cosmic (escape) velocity; 2) the body transfers to an Earth satellite orbit if its residual velocity exceeds the first cosmic (orbital) velocity; 3) at lower residual velocities, the body returns after some time to the descent mode and lands at a considerable distance from the presumed crash site. The realization of one of these three scenarios explains, for example, the absence of material traces, including craters, in the vicinity of the forest collapse in the case of the Tunguska bolide. Such scenarios have been suggested earlier by other authors; in this paper their realization is confirmed by the results of numerical calculations.
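The single-body equations of meteor physics referred to above can be sketched numerically. The sketch below uses the classical drag and ablation equations with an exponential atmosphere; all coefficients and the Tunguska-like initial conditions are illustrative assumptions, not the paper's values.

```python
import math

# Classical single-body meteor physics (illustrative sketch):
#   dv/dt = -(c_d * rho_a * S / (2 m)) * v^2 + g * sin(gamma)
#   dm/dt = -(c_h * rho_a * S / (2 H*)) * v^3
# All parameter values below are assumed for illustration only.

RHO0 = 1.225        # sea-level air density, kg/m^3
H_SCALE = 7000.0    # atmospheric scale height, m
G = 9.81            # gravitational acceleration, m/s^2

def air_density(h):
    """Isothermal exponential atmosphere."""
    return RHO0 * math.exp(-h / H_SCALE)

def simulate(v=19000.0, m=1.2e7, h=90000.0, gamma=math.radians(18),
             c_d=1.0, c_h=0.1, H_star=8e6, rho_body=3300.0, dt=0.01):
    """Euler integration of velocity and mass along a straight trajectory.
    Stops at 20 km altitude or when the body has ablated away."""
    while h > 20000.0 and m > 1.0:
        # midsection area of an equivalent sphere of density rho_body
        S = math.pi * (3 * m / (4 * math.pi * rho_body)) ** (2 / 3)
        rho_a = air_density(h)
        dv = (-c_d * rho_a * S * v ** 2 / (2 * m) + G * math.sin(gamma)) * dt
        dm = -c_h * rho_a * S * v ** 3 / (2 * H_star) * dt
        v += dv
        m += dm
        h -= v * math.sin(gamma) * dt
    return v, m, h
```

Running `simulate()` shows the qualitative behavior discussed in the abstract: the body gains a little speed in the rarefied upper atmosphere and then loses both speed and mass rapidly as the air density grows.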
-
Fuzzy knowledge extraction in the development of expert predictive diagnostic systems
Computer Research and Modeling, 2022, v. 14, no. 6, pp. 1395-1408
Expert systems imitate the professional experience and reasoning of a specialist to solve problems in various subject areas. An example of a problem that is expedient to solve with an expert system is forming a diagnosis, a task that arises in technology, medicine and other fields. When solving the diagnostic problem, it is necessary to anticipate critical or emergency situations, that is, situations requiring the timely intervention of specialists to prevent severe consequences. Fuzzy set theory provides one approach to ill-structured problems, to which diagnostic problems belong. It supplies the means for forming linguistic variables, which help describe the modeled process. Linguistic variables are elements of fuzzy logical rules that simulate the reasoning of professionals in the subject area. Developing fuzzy rules requires surveying experts: knowledge engineers use expert opinion to evaluate the correspondence between a typical current situation and the risk of a future emergency. The result of knowledge extraction is a description of linguistic variables that includes a combination of signs. Experts are involved in the survey to create descriptions of linguistic variables and to present a set of simulated situations. When building such systems, the main problem of the survey is the laboriousness of the interaction between knowledge engineers and experts, chiefly because of the multiplicity of questions the expert must answer. The paper substantiates a method that allows the knowledge engineer to reduce the number of questions posed to the expert and describes the experiments carried out to test the applicability of the proposed method.
An expert system for predicting risk groups for neonatal and pregnancy pathologies, built using the proposed knowledge extraction method, confirms the feasibility of the approach.
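The notions of a linguistic variable and a fuzzy rule base mentioned above can be illustrated with a minimal sketch. The variable name, term set, membership functions and rule consequents below are illustrative assumptions, not those of the expert system described in the paper.

```python
# Minimal sketch of a linguistic variable with triangular terms and a
# Sugeno-style weighted-average evaluation of fuzzy rules. All names and
# numbers are hypothetical placeholders.

def triangular(a, b, c):
    """Membership function of a triangular fuzzy term on [a, c] with peak b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)
    return mu

# Linguistic variable "sign severity" with three terms (assumed scale 0..10)
severity = {"low": triangular(-1, 0, 4),
            "medium": triangular(2, 5, 8),
            "high": triangular(6, 10, 11)}

def risk_of_emergency(x):
    """Each rule maps a severity term to a crisp risk level; the output is
    the membership-weighted average of the rule consequents."""
    rules = {"low": 0.1, "medium": 0.5, "high": 0.9}
    num = sum(severity[t](x) * rules[t] for t in rules)
    den = sum(severity[t](x) for t in rules)
    return num / den if den else 0.0
```

A knowledge engineer's survey would, in effect, populate the term boundaries and rule consequents; reducing the number of expert questions means filling such tables with fewer elicited points.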
-
Utilizing multi-source real data for traffic flow optimization in CTraf
Computer Research and Modeling, 2024, v. 16, no. 1, pp. 147-159
The problem of optimal control of traffic flow in an urban road network is considered. The control is carried out by varying the duration of the working phases of traffic lights at controlled intersections. A description of the developed control system is given. The system supports three types of control: open-loop, feedback and manual. In feedback control, road infrastructure detectors (video cameras, inductive loops and radar detectors) are used to determine the quantitative characteristics of the current traffic flow state. These characteristics are fed into a mathematical model of the traffic flow, implemented in the computer environment of an automatic traffic control system, in order to determine the moments for switching the working phases of the traffic lights. The model is a system of finite-difference recurrent equations that describes the change in traffic flow on each road section at each time step, based on retrieved data on traffic flow characteristics in the network, the capacity of maneuvers and the distribution of flow among alternative maneuvers at intersections. The model has scaling and aggregation properties. The structure of the model depends on the structure of the graph of the controlled road network: the number of nodes in the graph equals the number of road sections in the network under consideration. Simulating traffic flow changes in real time makes it possible to determine the optimal duration of traffic light phases and to provide feedback control of the traffic flow based on its current state. The system for automatic collection and processing of the model's input data is presented. To model the states of traffic flow in the network and to solve the optimal control problem, the CTraf software package has been developed, a brief description of which is given in the paper.
An example of solving the optimal traffic flow control problem using real data from the Moscow road network is given.
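The finite-difference recurrent equations described above can be sketched as one balance equation per road section, with maneuver capacities and a traffic-light phase gating the outflow. The section chain, capacities and split fractions below are illustrative, not CTraf's actual model.

```python
# One time step of a toy section-balance model: each road section gains the
# discharge of its upstream neighbor and loses its own discharge, which is
# limited by maneuver capacity and by the traffic-light phase.

def step(n, inflow, capacity, green, split):
    """n        - vehicles currently on each section
    inflow   - external inflow to section 0 during this step
    capacity - max vehicles a section can discharge per step
    green    - 1 if the section's exit has a green phase, else 0
    split    - fraction of each discharge routed to the next section"""
    out = [min(n[i], capacity[i]) * green[i] for i in range(len(n))]
    new_n = n[:]
    new_n[0] += inflow - out[0]
    for i in range(1, len(n)):
        new_n[i] += split[i - 1] * out[i - 1] - out[i]
    return new_n
```

Iterating `step` over the phase plan simulates queue build-up and discharge, which is what lets an optimizer pick phase durations that minimize, say, total vehicles stored in the network.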
-
Synchronization of circadian rhythms at the scales of a gene, a cell and a whole organism
Computer Research and Modeling, 2013, v. 5, no. 2, pp. 255-270
The paper proposes three characteristic scales of a biological system: microscopic (the size of a gene), mesoscopic (the size of a cell) and macroscopic (the size of an organism). For each case, an approach to modeling circadian rhythms is discussed on the basis of a time-delay model. At the gene scale, a stochastic description is used, and the robustness of the rhythm mechanism to fluctuations is demonstrated. At the mesoscopic scale, we propose a deterministic description within a spatially extended model; an effect of collective synchronization of rhythms in cells is found. Macroscopic effects are studied within a discrete model describing the collective behavior of a large number of cells. The problem of matching the results obtained at different scales is discussed, and a comparison with experimental data is given.
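A deterministic time-delay model of the kind mentioned in the abstract can be sketched as delayed negative feedback: a gene product represses its own synthesis with a delay. The specific repression equation and all parameter values below are illustrative assumptions, not the paper's model.

```python
# Toy delayed-repression oscillator:
#   dx/dt = alpha / (1 + x(t - tau)^n) - gamma * x(t)
# Integrated by the Euler method with a history buffer for x(t - tau).
# Parameters are illustrative; with steep repression (large n) and a
# sufficient delay, the fixed point loses stability and x(t) oscillates.

def simulate(alpha=4.0, gamma=1.0, n=9, tau=2.0, dt=0.01, t_max=100.0):
    steps = int(t_max / dt)
    delay = int(tau / dt)
    x = [0.5] * (delay + 1)          # constant history on [-tau, 0]
    for _ in range(steps):
        x_tau = x[-delay - 1]        # value a delay tau ago
        dx = alpha / (1.0 + x_tau ** n) - gamma * x[-1]
        x.append(x[-1] + dx * dt)
    return x
```

Such self-sustained oscillations are the building block; coupling many of these units is what produces the collective synchronization studied at the mesoscopic and macroscopic scales.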
-
A method for handling legacy information systems
Computer Research and Modeling, 2014, v. 6, no. 2, pp. 331-344
This article proposes a method for handling legacy information systems. In their professional activities, specialists in various industrial domains face the problem that the software involved at the product development stage becomes obsolete much more quickly than the product itself, while switching to modern software may be impossible for various reasons. This is known as the "legacy system" problem: it appears when the product life cycle is significantly longer than that of the software systems used to create the product. The author offers an approach to solving this problem, along with a computer application based on it.
-
An integrated model of an eco-economic system using the example of the Republic of Armenia
Computer Research and Modeling, 2014, v. 6, no. 4, pp. 621-631
This article presents an integrated dynamic model of the eco-economic system of the Republic of Armenia (RA). The model is constructed using system dynamics methods, which make it possible to account for the major feedbacks related to the key characteristics of the eco-economic system. The model leads to a two-objective optimization problem in which the level of air pollution and the gross profit of the national economy serve as objective functions: air pollution is minimized through the modernization of stationary and mobile pollution sources while the gross profit of the national economy is simultaneously maximized. At the same time, the eco-economic system under consideration is characterized by internal constraints that must be taken into account when making strategic decisions. As a result, we propose a systematic approach that allows sustainable solutions to be formed for the development of the production sector of the RA while minimizing the impact on the environment. With the proposed approach, in particular, one can form a plan for optimal enterprise modernization and predict the long-term dynamics of harmful emissions into the atmosphere.
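The two-objective trade-off described above can be sketched schematically: a single modernization-rate control lowers emissions per unit of output but diverts part of the gross profit. Every coefficient and the stock-flow structure below are illustrative assumptions, not the paper's calibrated model.

```python
# Schematic system-dynamics sketch: one pollution stock with natural
# absorption, one cumulative-profit stock, and a control u in [0, 1]
# (the modernization rate). All numbers are hypothetical.

def run(u, years=10):
    pollution, profit = 100.0, 0.0
    for _ in range(years):
        output = 50.0
        emissions = output * 0.4 * (1.0 - 0.6 * u)  # modernization cuts emissions
        pollution += emissions - 0.1 * pollution     # natural absorption of stock
        profit += output * (1.0 - 0.3 * u)           # modernization cost
    return pollution, profit
```

Sweeping `u` traces the Pareto front of the two objectives: more modernization means less accumulated pollution but less gross profit, which is exactly the tension the integrated model is built to resolve.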
-
Features of DNA kink motion under asynchronous switching on and off of constant and periodic fields
Computer Research and Modeling, 2018, v. 10, no. 4, pp. 545-558
The investigation of the influence of external fields on living systems is one of the most interesting and rapidly developing areas of modern biophysics, yet the mechanisms of such influence are still not entirely clear. One approach to this question is to model the interaction of external fields with the internal mobility of biological objects. In this paper, that approach is used to study the effect of external fields on the motion of local conformational distortions (kinks) in the DNA molecule. Bearing in mind that this task is closely connected with the problem of the mechanisms regulating the vital processes of cells and cellular systems, we set out to investigate the physical mechanisms regulating the motion of kinks and to answer the question of whether constant and periodic fields can play the role of regulators of this motion. The paper considers the most general case, in which the constant and periodic fields are switched on and off asynchronously. Three variants of asynchronous switching are studied in detail: in the first, the time intervals during which the constant and periodic fields act do not overlap; in the second, they overlap; and in the third, one interval is nested inside the other. The calculations were performed for the sequence of the plasmid pTTQ18. The kink motion was modeled by the McLaughlin–Scott equation, whose coefficients were calculated in a quasi-homogeneous approximation. Numerical experiments showed that constant and periodic fields exert a significant influence on the character of the kink motion and regulate it.
Thus, switching on a constant field leads to a rapid increase of the kink velocity and the establishment of a stationary velocity of motion, while switching on a periodic field leads to steady oscillations of the kink at the frequency of the external periodic field. It is shown that the behavior of the kink depends on the mutual arrangement of the intervals during which the external fields act: events occurring in one of the two intervals can affect events in the other, even when the intervals are far apart. Overlapping the intervals of action of the constant and periodic fields significantly increases the path traversed by the kink before it comes to a complete stop, and the maximal growth of the path is observed when one interval is nested inside the other. In conclusion, we discuss how the obtained model results could relate to one of the most important problems of biology, the mechanisms regulating the vital activity of cells and cellular systems.
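The behavior described above can be sketched with the McLaughlin–Scott equation in its commonly cited dimensionless form for the kink velocity u(t) in a damped, driven sine-Gordon model. The field amplitudes, damping coefficient and switching windows below are illustrative assumptions, not the paper's pTTQ18 values.

```python
import math

# McLaughlin-Scott equation (dimensionless, commonly cited form):
#   du/dt = (1 - u^2) * (pi * F(t) / 4 * sqrt(1 - u^2) - beta * u)
# where F(t) combines a constant and a periodic field, each switched on and
# off over its own (here overlapping) time window. All values are assumed.

def field(t, f_const=0.1, f_per=0.05, omega=2.0,
          const_window=(0.0, 40.0), per_window=(20.0, 60.0)):
    """Total driving field: constant and periodic parts gated by windows."""
    f = 0.0
    if const_window[0] <= t < const_window[1]:
        f += f_const
    if per_window[0] <= t < per_window[1]:
        f += f_per * math.cos(omega * t)
    return f

def kink_velocity(beta=0.1, dt=0.001, t_max=80.0):
    """Euler integration of the kink velocity; returns the trajectory."""
    u, traj = 0.0, []
    for i in range(int(t_max / dt)):
        f = field(i * dt)
        du = (1 - u * u) * (math.pi * f / 4 * math.sqrt(max(0.0, 1 - u * u))
                            - beta * u)
        u += du * dt
        traj.append(u)
    return traj
```

With the constant field on, u relaxes to the stationary value set by the balance of drive and dissipation; while only the periodic field acts, u oscillates at the driving frequency; after both windows close, the velocity decays, matching the regimes listed in the abstract.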
-
Analysis of the effectiveness of machine learning methods in the problem of gesture recognition based on electromyographic signal data
Computer Research and Modeling, 2021, v. 13, no. 1, pp. 175-194
Gesture recognition is an urgent challenge in developing human-machine interface systems. We analyzed machine learning methods for gesture classification based on electromyographic muscle signals to identify the most effective one. The methods considered were the naive Bayesian classifier (NBC), logistic regression, decision tree, random forest, gradient boosting, support vector machine (SVM), the $k$-nearest neighbors algorithm, and ensembles (NBC with decision tree, NBC with gradient boosting, and gradient boosting with decision tree). Electromyography (EMG) was chosen as the method of obtaining information about gestures: it does not require the hand to be in the field of view of a camera and can be used to recognize finger movements. To test the effectiveness of the selected methods, a device was developed for recording the EMG signal, comprising three electrodes and an EMG sensor connected to a microcontroller and a power supply. The following gestures were chosen: a clenched fist, “thumb up”, “Victory”, squeezing an index finger and waving a hand from right to left. Accuracy, precision, recall and execution time were used to evaluate the effectiveness of the classifiers. These parameters were calculated for three options for the placement of the EMG electrodes on the forearm. According to the test results, the most effective methods are the $k$-nearest neighbors algorithm, random forest and the ensemble of NBC and gradient boosting; the average accuracy of the ensemble over the three electrode positions was 81.55%. The electrode placement at which the machine learning methods achieve maximum accuracy was also determined: one of the differential electrodes is located at the intersection of the flexor digitorum profundus and flexor pollicis longus, and the second above the flexor digitorum superficialis.
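The $k$-nearest neighbors algorithm, one of the best performers in the comparison above, is simple enough to sketch from scratch. The toy "EMG feature" vectors and gesture labels below are synthetic placeholders, not real electromyographic data.

```python
import math
from collections import Counter

# Minimal k-nearest-neighbors classifier: the predicted label is the
# majority vote among the k training samples closest in Euclidean distance.

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Synthetic two-feature "EMG" samples for two hypothetical gestures
train = [((0.0, 0.0), "fist"), ((0.1, 0.2), "fist"), ((0.2, 0.0), "fist"),
         ((5.0, 5.0), "victory"), ((5.2, 4.9), "victory"), ((4.8, 5.1), "victory")]
```

In a real pipeline the feature vectors would be statistics of the EMG window (e.g. amplitude features) per electrode pair, and accuracy/precision/recall would be computed on a held-out set.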
-
Extracting knowledge from text messages: overview and state-of-the-art
Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1291-1315
In general, solving the information explosion problem can be delegated to systems for automatic processing of digital data. These systems are intended to recognize, sort, meaningfully process and present data in formats readable and interpretable by humans. The creation of intelligent knowledge extraction systems that handle unstructured data would be a natural solution in this area. At the same time, the evident progress on these tasks for structured data contrasts with the limited success of unstructured data processing, and in particular document processing. This research area is currently undergoing active development and investigation. The present paper is a systematic survey of both Russian and international publications dedicated to the leading trend in automatic text data processing: Text Mining (TM). We cover the main tasks and notions of TM, as well as its place in the current AI landscape. Furthermore, we analyze the complications that arise during natural language processing (NLP) of texts, which are weakly structured and often provide ambiguous linguistic information. We describe the stages of preparing and cleaning text data and selecting features which, alongside the data obtained via morphological, syntactic and semantic analysis, constitute the input for the TM process. This process can be represented as a mapping from a set of text documents to "knowledge". Using the case of stock trading, we demonstrate the formalization of the problem of making a trade decision based on a set of analytical recommendations. Examples of such mappings are methods of Information Retrieval (IR), text summarization, sentiment analysis, document classification and clustering, etc. The common point of all TM tasks and techniques is the selection of word forms and their derivatives used to recognize content in natural-language symbol sequences.
Considering IR as an example, we examine classic types of search, such as searching for word forms, phrases, patterns and concepts. Additionally, we consider the augmentation of patterns with syntactic and semantic information. Next, we provide a general description of all NLP instruments: morphological, syntactic, semantic and pragmatic analysis. Finally, we end the paper with a comparative analysis of modern TM tools which can be helpful for selecting a suitable TM platform based on the user’s needs and skills.
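The classic word-form search described above can be sketched as tokenization plus TF-IDF ranking. The toy documents and the smoothed IDF formula below are illustrative choices, not a specific platform's implementation.

```python
import math
from collections import Counter

# Minimal IR sketch: normalize word forms, then rank documents by the
# summed TF-IDF weight of the query terms.

def tokenize(text):
    return [w.strip(".,!?«»\"'").lower() for w in text.split()]

def tfidf_rank(docs, query):
    """Return document indices sorted from most to least relevant."""
    tokenized = [tokenize(d) for d in docs]
    n = len(docs)
    def idf(term):
        df = sum(1 for toks in tokenized if term in toks)
        return math.log((n + 1) / (df + 1)) + 1.0   # smoothed idf
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        scores.append(sum(tf[t] / len(toks) * idf(t) for t in tokenize(query)))
    return sorted(range(n), key=lambda i: -scores[i])
```

Searching for phrases, patterns or concepts extends this scheme with positional indexes, syntactic annotations or thesauri, as the survey discusses.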
-
Stochastic optimization in digital pre-distortion of the signal
Computer Research and Modeling, 2022, v. 14, no. 2, pp. 399-416
In this paper, we test the performance of some modern stochastic optimization methods and practices on the digital pre-distortion (DPD) problem, a valuable part of signal processing on base stations providing wireless communication. In the first part of the study, we focus on the search for the best-performing method and its proper modifications. In the second part, we propose a new, quasi-online testing framework that allows us to fit our modeling results to the behavior of a real-life DPD prototype, retest some of the practices considered in the first part and confirm the advantages of the method that performs best under real-life conditions. For the model used, the maximum achieved improvement in depth is 7% in the standard regime and 5% in the online regime (the metric itself is on a logarithmic scale). We also halve the working time while preserving a 3% and 6% improvement in depth for the standard and online regimes, respectively. All comparisons are made with the Adam method, which was highlighted as the best stochastic method for the DPD problem in [Pasechnyuk et al., 2021], and with the Adamax method, which is the best in the proposed online regime.
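The Adam baseline referred to above can be sketched from scratch. The update rule below is the standard bias-corrected Adam; it is applied here to a toy quadratic rather than a real DPD model, and the hyperparameter values are the usual defaults, not the paper's tuned settings.

```python
import math

# From-scratch Adam: exponential moving averages of the gradient (m) and of
# its square (v), with bias correction, drive the parameter update.

def adam_minimize(grad, x0, lr=0.1, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=1000):
    x = list(x0)
    m = [0.0] * len(x)
    v = [0.0] * len(x)
    for t in range(1, steps + 1):
        g = grad(x)
        for i in range(len(x)):
            m[i] = beta1 * m[i] + (1 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1 - beta2) * g[i] ** 2
            m_hat = m[i] / (1 - beta1 ** t)       # bias-corrected first moment
            v_hat = v[i] / (1 - beta2 ** t)       # bias-corrected second moment
            x[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Toy problem: minimize (x - 3)^2 + (y + 1)^2
grad = lambda p: [2 * (p[0] - 3), 2 * (p[1] + 1)]
```

In the DPD setting the gradient would come from mini-batches of the pre-distortion residual, and the quasi-online framework replays such updates against streaming signal data.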
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index
International Interdisciplinary Conference "Mathematics. Computing. Education"