Search results for 'systematization':
Articles found: 14
  1. Musaev A.A., Grigoriev D.A.
    Extracting knowledge from text messages: overview and state-of-the-art
    Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1291-1315

    In general, solving the information explosion problem can be delegated to systems for the automatic processing of digital data. These systems are intended for recognizing, sorting, meaningfully processing, and presenting data in formats readable and interpretable by humans. The creation of intelligent knowledge extraction systems that handle unstructured data would be a natural solution in this area. At the same time, the evident progress in these tasks for structured data contrasts with the limited success of unstructured data processing and, in particular, document processing. Currently, this research area is undergoing active development. The present paper is a systematic survey of both Russian and international publications dedicated to the leading trend in automatic text data processing: Text Mining (TM). We cover the main tasks and notions of TM, as well as its place in the current AI landscape. Furthermore, we analyze the complications that arise when processing texts written in natural language (NL), which are weakly structured and often carry ambiguous linguistic information. We describe the stages of text data preparation, cleaning, and feature selection which, alongside the data obtained via morphological, syntactic, and semantic analysis, constitute the input for the TM process. This process can be represented as a mapping from a set of text documents to "knowledge". Using the case of stock trading, we demonstrate the formalization of the problem of making a trade decision based on a set of analytical recommendations. Examples of such mappings are methods of Information Retrieval (IR), text summarization, sentiment analysis, document classification and clustering, etc. The common point of all TM tasks and techniques is the selection of word forms and their derivatives used to recognize content in NL symbol sequences. Considering IR as an example, we examine classic types of search, such as searching for word forms, phrases, patterns, and concepts, and we additionally consider the augmentation of patterns with syntactic and semantic information. Next, we provide a general description of the natural language processing (NLP) toolbox: morphological, syntactic, semantic, and pragmatic analysis. Finally, we end the paper with a comparative analysis of modern TM tools, which can be helpful for selecting a suitable TM platform based on the user's needs and skills.
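
    As an illustration of the pipeline sketched in this abstract (preparation and cleaning, feature selection, then a mapping used for retrieval), the following is a minimal hypothetical Python sketch using TF-IDF features and cosine similarity as one possible realization of the "documents to knowledge" mapping. The corpus, query, and cleaning rule are invented for illustration and are not taken from the paper.

```python
# A minimal sketch of the TM stages described above
# (cleaning -> feature selection -> retrieval); the corpus and
# query are hypothetical, and scikit-learn's TF-IDF is one
# possible realization of the "documents -> features" mapping.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Stock prices rallied after the analyst upgrade.",
    "The trading desk issued a sell recommendation.",
    "Sentiment in the equity market turned negative.",
]

def clean(text: str) -> str:
    """Basic preparation: lowercase and drop non-letter symbols."""
    return re.sub(r"[^a-z\s]", " ", text.lower())

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(clean(d) for d in corpus)

# Information Retrieval as a mapping: rank documents by cosine
# similarity between the query vector and each document vector.
query_vec = vectorizer.transform([clean("analyst trade recommendation")])
scores = cosine_similarity(query_vec, doc_matrix).ravel()
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.3f}  {corpus[idx]}")
```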

  2. Rusyak I.G., Tenenev V.A.
    Modeling of ballistics of an artillery shot taking into account the spatial distribution of parameters and backpressure
    Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1123-1147

    The paper provides a comparative analysis of the results obtained by various approaches to modeling the process of an artillery shot. To this end, the main problem of internal ballistics and its particular case, the Lagrange problem, are formulated in averaged parameters, where, within the assumptions of the thermodynamic approach, the distribution of pressure and gas velocity over the space behind the projectile is taken into account for the first time for a channel of variable cross section. The Lagrange problem is also stated within the gas-dynamic approach, taking into account the spatial (one-dimensional and two-dimensional axisymmetric) changes in the characteristics of the ballistic process. The control volume method is used to numerically solve the system of Euler gas-dynamic equations. Gas parameters at the boundaries of control volumes are determined using a self-similar solution to the Riemann problem. Based on the Godunov method, a modification of the Osher scheme is proposed that makes it possible to implement a numerical algorithm of second-order accuracy in space and time. The solutions obtained within the thermodynamic and gas-dynamic approaches are compared for various loading parameters. The effect of projectile mass and chamber broadening on the distribution of the ballistic parameters of the shot and on the dynamics of projectile motion is studied. It is shown that the thermodynamic approach, in comparison with the gas-dynamic one, leads to a systematic overestimation of the estimated muzzle velocity of the projectile over the entire range of parameters studied, with the difference in muzzle velocity reaching 35%. At the same time, the discrepancy between the results of the one-dimensional and two-dimensional gas-dynamic models of the shot over the same range of parameters does not exceed 1.3%.
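
    For readers unfamiliar with the control-volume machinery mentioned here, the following is a minimal first-order sketch of a conservative update for the 1D Euler equations. The simple Rusanov flux below is a stand-in for the self-similar Riemann solution and the second-order Osher modification used in the paper; the ratio of specific heats and all grid parameters are illustrative assumptions.

```python
# First-order control-volume sketch for the 1D Euler equations;
# the Rusanov flux is a simple stand-in for the exact Riemann
# solver and second-order Osher scheme used in the paper.
import numpy as np

GAMMA = 1.4  # ratio of specific heats (assumed value)

def euler_flux(U):
    """Physical flux F(U) for a cell state U = (rho, rho*u, E)."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, (E + p) * u])

def rusanov_flux(UL, UR):
    """Approximate interface flux from the left/right cell states."""
    def max_speed(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
        return abs(u) + np.sqrt(GAMMA * p / rho)
    s = max(max_speed(UL), max_speed(UR))
    return 0.5 * (euler_flux(UL) + euler_flux(UR)) - 0.5 * s * (UR - UL)

def step(U, dx, dt):
    """One conservative update: U_i -= dt/dx * (F_{i+1/2} - F_{i-1/2})."""
    F = np.array([rusanov_flux(U[i], U[i + 1]) for i in range(len(U) - 1)])
    U[1:-1] -= dt / dx * (F[1:] - F[:-1])  # interior cells only
    return U
```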

    A spatial gas-dynamic formulation of the backpressure problem is given, which describes the change in pressure in front of the accelerating projectile as it moves along the barrel channel. It is shown that accounting for the projectile's front, as done in the two-dimensional axisymmetric formulation of the problem, leads to a significant difference in the pressure fields behind the front of the shock wave, compared with the solution in the one-dimensional formulation, where the projectile's front cannot be taken into account. It is concluded that this can significantly affect the results of modeling the ballistics of a shot at high firing velocities.

  3. Tishkin V.F., Trapeznikova M.A., Chechina A.A., Churbanova N.G.
    Simulation of traffic flows based on the quasi-gasdynamic approach and the cellular automata theory using supercomputers
    Computer Research and Modeling, 2024, v. 16, no. 1, pp. 175-194

    The purpose of the study is to simulate the dynamics of traffic flows on city road networks and to systematize the current state of affairs in this area. The introduction states that the development of intelligent transportation systems as an integral part of modern transportation technologies is coming to the fore. At the core of these systems are adequate mathematical models that make it possible to simulate traffic as close to reality as possible. The necessity of using supercomputers due to the large amount of computation is also noted, and hence special parallel algorithms need to be created. The beginning of the article is devoted to an up-to-date classification of traffic flow models and a characterization of each class, including their distinctive features and relevant examples with references. The main focus of the article then shifts to the macroscopic and microscopic models developed by the authors and to the place of these models in the aforementioned classification. The macroscopic model is based on the continuum approach and uses the ideology of quasi-gasdynamic systems of equations. Its advantages over existing models of this class are indicated. The model is presented in both one-dimensional and two-dimensional versions, and both versions support the study of multi-lane traffic. In the two-dimensional version this is made possible by introducing the concept of "lateral" velocity, i.e., the speed of changing lanes. The latter version allows calculations to be carried out in a computational domain that corresponds to the actual geometry of the road. The section also presents test results of modeling vehicle dynamics on a road fragment with a local widening and on a road fragment with traffic lights, including several variants of traffic light regimes. In the first case, the calculations make it possible to draw interesting conclusions about the impact of a road widening on the capacity of the road as a whole, and in the second case, to select the optimal regime configuration to obtain the "green wave" effect. The microscopic model is based on cellular automata theory and the single-lane Nagel–Schreckenberg model, generalized to the multi-lane case by the authors of the article. The model implements various behavioral strategies of drivers. Test computations for a real transport network section in Moscow city center are presented. To adequately represent vehicles moving through the network according to road traffic regulations, the authors implemented special algorithms adapted for parallel computing. Test calculations were performed on the K-100 supercomputer installed at the Centre of Collective Usage of KIAM RAS.
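
    As a reference point for the microscopic model, the following is a minimal sketch of the classic single-lane Nagel–Schreckenberg update rules (acceleration, gap keeping, random braking, movement) on a circular road. The road length, car count, and braking probability are illustrative, and the authors' multi-lane generalization, behavioral strategies, and parallel algorithms are not reproduced here.

```python
# Minimal single-lane Nagel-Schreckenberg cellular automaton;
# all parameters below are illustrative assumptions.
import random

V_MAX = 5      # speed limit in cells per step
P_SLOW = 0.3   # random-braking probability
ROAD_LEN = 100

def nasch_step(positions, speeds):
    """One parallel update of all cars on a circular road."""
    order = sorted(range(len(positions)), key=lambda i: positions[i])
    new_pos = positions[:]
    for k, i in enumerate(order):
        ahead = order[(k + 1) % len(order)]
        gap = (positions[ahead] - positions[i] - 1) % ROAD_LEN
        v = min(speeds[i] + 1, V_MAX)          # 1. accelerate
        v = min(v, gap)                        # 2. keep a safe gap
        if v > 0 and random.random() < P_SLOW:
            v -= 1                             # 3. random braking
        speeds[i] = v
        new_pos[i] = (positions[i] + v) % ROAD_LEN  # 4. move
    return new_pos, speeds

# Example: 20 cars at random distinct cells, initially at rest.
pos = random.sample(range(ROAD_LEN), 20)
vel = [0] * 20
for _ in range(100):
    pos, vel = nasch_step(pos, vel)
```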

  4. The article discusses the influence of research goals on the structure of a multivariate regression model (in particular, on the procedure for reducing the dimension of the model). It is shown how bringing the specification of the multiple regression model in line with the research objectives affects the choice of modeling methods. Two schemes for constructing a model are compared: the first does not take into account the typology of the primary predictors and the nature of their influence on the resultant characteristics; the second involves a stage of preliminary division of the initial predictors into groups in accordance with the objectives of the study. Using the example of analyzing the causes of burnout among creative workers, the importance of the stage of qualitative analysis and systematization of a priori selected factors is shown; this stage is implemented not by computational means but by drawing on the knowledge and experience of specialists in the subject area under study. The presented example of determining the specification of the regression model combines formalized mathematical and statistical procedures with a preceding stage of classifying the primary factors. This stage makes it possible to explain the scheme of managing (corrective) actions: softening the leadership style and increasing approval lead to a decrease in the manifestations of anxiety and stress, which, in turn, reduces the severity of the emotional exhaustion of team members. Preliminary classification also avoids combining controllable and uncontrollable, regulating and regulated feature factors in a single principal component, which could worsen the interpretability of the synthesized predictors. Using a specific problem as an example, it is shown that the selection of regressor factors is a process that requires an individual solution. In the case under consideration, the following were applied in sequence: systematization of features, correlation analysis, principal component analysis, and regression analysis. The first three methods made it possible to significantly reduce the dimension of the problem without compromising the goal for which the task was posed: meaningful measures of controlling influence on the team were identified, making it possible to reduce the degree of emotional burnout of its members.
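
    The following is a minimal sketch, on synthetic data, of the reduction chain listed in this abstract (correlation analysis, then principal component analysis, then regression on the synthesized predictors). The correlation threshold, variance target, and generated data are illustrative assumptions, not values from the article, and the preceding qualitative grouping of predictors is assumed to have been done by subject-matter experts.

```python
# Sketch of the factor-reduction chain: correlation screening ->
# principal components -> regression; all data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))          # a priori selected factors
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=120)

# Correlation analysis: keep factors noticeably related to the response
# (the 0.1 cutoff is an illustrative choice).
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
X_kept = X[:, corr > 0.1]

# Principal component analysis on standardized factors reduces the
# dimension while retaining 90% of the variance (assumed target).
Z = StandardScaler().fit_transform(X_kept)
pca = PCA(n_components=0.9)
components = pca.fit_transform(Z)

# Regression on the synthesized predictors (principal components).
model = LinearRegression().fit(components, y)
print("components kept:", pca.n_components_)
print("R^2:", model.score(components, y))
```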
