-
Optimization of the brain command dictionary based on the statistical proximity criterion in silent speech recognition task
Computer Research and Modeling, 2023, v. 15, no. 3, pp. 675-690
In our research, we focus on the problem of classification for silent speech recognition to develop a brain–computer interface (BCI) based on electroencephalographic (EEG) data, which will be capable of assisting people with mental and physical disabilities and expanding human capabilities in everyday life. Our previous research has shown that the silent pronouncing of some words results in almost identical distributions of electroencephalographic signal data. Such a phenomenon has a suppressive impact on the performance of neural network models. This paper proposes a data processing technique that distinguishes between statistically remote and inseparable classes in the dataset. Applying the proposed approach helps us reach the goal of maximizing the semantic load of the dictionary used in the BCI.
Furthermore, we propose the existence of a statistical predictive criterion for the accuracy of binary classification of the words in a dictionary. Such a criterion aims to estimate the lower and upper bounds of classifier performance solely by measuring quantitative statistical properties of the data (in particular, using the Kolmogorov – Smirnov method). We show that higher levels of classification accuracy can be achieved by applying the proposed predictive criterion, making it possible to form a dictionary optimized in terms of semantic load for EEG-based BCIs. Moreover, using such a dictionary as a training dataset for classification problems guarantees the statistical remoteness of the classes by taking into account the semantic and phonetic properties of the corresponding words, and improves the classification performance of silent speech recognition models.
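A statistical remoteness criterion of the kind described above can be sketched with the two-sample Kolmogorov – Smirnov test. The sketch below ranks word pairs by the KS statistic of their per-trial feature distributions; the toy Gaussian "EEG feature" samples and the one-feature-per-trial setup are illustrative assumptions, not the authors' pipeline.

```python
# Rank word pairs by statistical remoteness of their (toy) EEG feature
# distributions using the two-sample Kolmogorov-Smirnov statistic.
# The data below are synthetic stand-ins, not real EEG recordings.
import numpy as np
from itertools import combinations
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
# One 1-D feature value per silent-pronunciation trial, per word.
words = {
    "yes":  rng.normal(0.0, 1.0, 200),
    "no":   rng.normal(0.8, 1.0, 200),   # well separated from "yes"
    "stop": rng.normal(0.1, 1.0, 200),   # nearly inseparable from "yes"
}

# KS statistic per word pair: larger values suggest more separable classes,
# so such pairs are better candidates for the optimized dictionary.
scores = {
    (a, b): ks_2samp(words[a], words[b]).statistic
    for a, b in combinations(words, 2)
}
for pair, d in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(pair, round(d, 3))
```

Pairs with a low KS statistic (like "yes"/"stop" here) would be pruned from the dictionary as statistically inseparable.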
-
Evolutionary effects of non-selective sustainable harvesting in a genetically heterogeneous population
Computer Research and Modeling, 2025, v. 17, no. 4, pp. 717-735
The problem of harvest optimization remains a central challenge in mathematical biology. The concept of Maximum Sustainable Yield (MSY), widely used in optimal exploitation theory, proposes maintaining target populations at levels ensuring maximum reproduction, theoretically balancing economic benefits with resource conservation. While MSY-based management promotes population stability and system resilience, it faces significant limitations due to complex intrapopulation structures and nonlinear dynamics in exploited species. Of particular concern are the evolutionary consequences of harvesting, as artificial selection may drive changes divergent from natural selection pressures. Empirical evidence confirms that selective harvesting alters behavioral traits, reduces offspring quality, and modifies population gene pools. In contrast, the genetic impacts of non-selective harvesting remain poorly understood and require further investigation.
This study examines how non-selective harvesting with constant removal rates affects evolution in genetically heterogeneous populations. We model genetic diversity controlled by a single diallelic locus, where different genotypes dominate at high or low densities: r-strategists (high fecundity) versus K-strategists (resource-limited resilience). The classical ecological-genetic model with discrete time is considered. The model assumes that the fitness of each genotype depends linearly on the population size. By including the harvesting withdrawal coefficient, the model links the problem of optimizing harvest with that of predicting genotype selection.
Analytical results demonstrate that under MSY harvesting the equilibrium genetic composition remains unchanged while population size halves. The type of genetic equilibrium may shift, as optimal harvest rates differ between equilibria. Natural K-strategist dominance may reverse toward r-strategists, whose high reproduction compensates for harvest losses. Critical harvesting thresholds triggering strategy shifts were identified.
These findings explain why exploited populations show slow recovery after harvesting cessation: exploitation reinforces adaptations beneficial under removal pressure but maladaptive in natural conditions. For instance, captive arctic foxes select for high-productivity genotypes, whereas wild populations favor lower-fecundity/higher-survival phenotypes. This underscores the necessity of incorporating genetic dynamics into sustainable harvesting management strategies, as MSY policies may inadvertently alter evolutionary trajectories through density-dependent selection processes. Recovery periods must account for genetic adaptation timescales in management frameworks.
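The reversal from K- toward r-strategists under constant removal can be illustrated with a minimal discrete-time caricature of such a model. The sketch below is a simplified haploid version (state = frequency of the r-type) with density-dependent linear fitness and a constant harvest fraction h; all functional forms and parameter values are illustrative assumptions, not the paper's calibrated model.

```python
# Minimal discrete-time ecological-genetic sketch: two strategies whose
# fitness declines linearly with population size N, plus a constant
# harvesting fraction h. Haploid simplification; parameters are illustrative.

def step(N, p, w_r=(2.0, 0.003), w_k=(1.5, 0.001), h=0.0):
    """One generation. p is the frequency of the r-strategist type.
    r-strategists: high fecundity, strong density sensitivity;
    K-strategists: lower fecundity, weak density sensitivity."""
    fr = max(w_r[0] - w_r[1] * N, 0.0)
    fk = max(w_k[0] - w_k[1] * N, 0.0)
    w_bar = p * fr + (1.0 - p) * fk          # mean fitness
    p_next = p * fr / w_bar if w_bar > 0 else p
    N_next = (1.0 - h) * w_bar * N           # reproduction, then harvest
    return N_next, p_next

# Without harvest, K-strategists win at the high natural equilibrium;
# a constant removal rate keeps the population sparse, reversing
# selection toward high-fecundity r-strategists.
for h in (0.0, 0.3):
    N, p = 100.0, 0.5
    for _ in range(500):
        N, p = step(N, p, h=h)
    print(f"h={h}: N={N:.1f}, p(r-type)={p:.3f}")
```

With these parameters the unharvested run fixes the K-type at its carrying equilibrium, while h = 0.3 fixes the r-type at a halved-scale exploited equilibrium, mirroring the strategy-shift thresholds discussed above.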
-
Prediction of embryo implantation potential by morphology assessment
Computer Research and Modeling, 2010, v. 2, no. 1, pp. 111-116
Early embryos developing in vitro to the blastocyst stage have low implantation potential. In the current work, microinjection was used to identify the most viable blastocysts with high implantation ability on the basis of morphological changes. The recovery rate of the embryo volume allows assessing the functional activity of the trophoblast cells involved in implantation. A predictive model is suggested to forecast the development effectiveness of blastocysts in vitro. It is shown that the recovery rate of the blastocyst volume after microinjection is the most important feature of the implantation potential of early embryos. The maximal recovery rate of blastocyst volume (35.7 % of initial volume per 1 h) correlates with the embryos' ability to generate colonies 72 h after microinjection. Based on the area under the receiver operating characteristic curve (AUC), it was shown that the combination of characteristics such as blastocyst stage (middle and late) and recovery rate after microinjection makes it possible to predict blastocyst development.
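The combination-of-features AUC evaluation described above can be sketched as follows. The example combines a stage feature and a volume-recovery-rate feature through logistic regression and scores the combined predictor by AUC; the synthetic data and coefficients are assumptions for illustration, not the study's measurements.

```python
# Combine two morphology features into one predictor and score it by the
# area under the ROC curve. All data below are synthetic, not the study's.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 120
stage = rng.integers(0, 3, n)             # 0=early, 1=middle, 2=late
recovery = rng.uniform(0.0, 40.0, n)      # % of initial volume per hour
# Synthetic outcome: later-stage, faster-recovering embryos are assumed
# more likely to form colonies (implantation-competent).
logit = -4.0 + 1.0 * stage + 0.12 * recovery
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([stage, recovery])
model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"AUC of combined stage + recovery-rate predictor: {auc:.2f}")
```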
-
The integrated model of eco-economic system on the example of the Republic of Armenia
Computer Research and Modeling, 2014, v. 6, no. 4, pp. 621-631
This article presents an integrated dynamic model of the eco-economic system of the Republic of Armenia (RA). The model is constructed using system dynamics methods, which allow us to consider the major feedbacks related to key characteristics of the eco-economic system. The model is formulated as a two-objective optimization problem in which the level of air pollution and the gross profit of the national economy serve as the target functions. Air pollution is minimized through the modernization of stationary and mobile sources of pollution, while the gross profit of the national economy is simultaneously maximized. At the same time, the considered eco-economic system is characterized by internal constraints that must be taken into account when making strategic decisions. As a result, we propose a systematic approach that allows forming sustainable solutions for the development of the production sector of RA while minimizing the impact on the environment. With the proposed approach, in particular, we can form a plan for optimal enterprise modernization and predict the long-term dynamics of harmful emissions into the atmosphere.
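The two-objective trade-off described above (minimize pollution, maximize profit via modernization) can be sketched with a weighted-sum scalarization. The functional forms, coefficients, and the single decision variable (a modernization share x) below are illustrative assumptions, not the calibrated RA model.

```python
# Weighted-sum scalarization of a toy two-objective problem: choose a
# modernization share x in [0, 1] of pollution sources to trade emissions
# against gross profit. Coefficients are illustrative assumptions.

def emissions(x):
    # Modernization cuts emissions from stationary and mobile sources.
    return 100.0 * (1.0 - 0.7 * x)

def profit(x):
    # Modernization costs reduce profit; efficiency gains partly offset them.
    return 50.0 + 20.0 * x - 35.0 * x**2

def best_for_weight(w, grid=201):
    """Maximize w*profit - (1-w)*emissions over a grid of shares."""
    candidates = [i / (grid - 1) for i in range(grid)]
    return max(candidates, key=lambda x: w * profit(x) - (1 - w) * emissions(x))

# Sweeping the weight traces an approximate Pareto front of decisions.
for w in (0.2, 0.5, 0.8):
    x = best_for_weight(w)
    print(f"w={w}: modernize {x:.3f}, profit={profit(x):.1f}, "
          f"emissions={emissions(x):.1f}")
```

Weighting emissions more heavily pushes the optimal plan toward full modernization, while a profit-heavy weight settles on a partial modernization share.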
-
Simulation of forming of UFG Ti-6-4 alloy at low temperature of superplasticity
Computer Research and Modeling, 2017, v. 9, no. 1, pp. 127-133
Superplastic forming of Ni- and Ti-based alloys is widely used in the aerospace industry. The main advantage of using the effect of superplasticity in sheet metal forming processes is the feasibility of forming materials to a high amount of plastic strain under conditions of prevailing tensile stresses. This article studies the application of the commercial FEM software SFTC DEFORM for predicting thickness deviation during low-temperature superplastic forming of UFG Ti-6-4 alloy. Experimentally, thickness deviation during superplastic forming is observed in the local area of plastic deformation; this process is aggravated by local softening of the metal caused by microstructure coarsening. A theoretical model was prepared to analyze the experimentally observed metal flow. Two approaches have been used for that. The first is the use of the creep rheology model integrated in DEFORM. As the superplastic effect is observed only in materials with fine and ultrafine grain sizes, the second approach relies on our own rheology model based on microstructure evolution equations. These equations have been implemented into DEFORM via Fortran user solver subroutines. Using FEM simulation for this type of forming allows tracking the strain rate in different parts of a workpiece during the process, which is crucial for maintaining superplastic conditions. Comparison of these approaches allows us to draw conclusions about the effect of microstructure evolution on metal flow during superplastic deformation. The results of the FEM analysis and the theoretical conclusions have been confirmed by the results of the conducted Erichsen test.
The main findings of this study are as follows: a) the DEFORM software allows an engineer to predict the formation of the metal shape under the condition of low-temperature superplasticity; b) in order to improve the accuracy of the prediction of local deformations, the effect of the microstructure state of an alloy having a submicrocrystalline structure should be taken into account in the course of calculations in the DEFORM software.
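The idea of a microstructure-dependent rheology, as in the second approach above, can be sketched with a generic grain-size-sensitive flow-stress law coupled to a coarsening equation. The constants, the power-law form, and the static-coarsening law below are illustrative assumptions, not the authors' DEFORM user subroutine.

```python
# Hedged sketch: superplastic flow stress with a grain-size term, coupled
# to grain coarsening over time. Constants are illustrative assumptions.

def flow_stress(strain_rate, d, K=500.0, m=0.5, p=1.0):
    """sigma = K * d^p * eps_dot^m (MPa): strain-rate hardening with a
    grain-size term; coarser grains raise the flow stress and degrade
    superplastic behavior (grain-boundary sliding weakens)."""
    return K * d**p * strain_rate**m

def grain_growth(d, dt, k=1e-4, n=3):
    """Static coarsening law d^n - d0^n = k*t, integrated over one step."""
    return (d**n + k * dt) ** (1.0 / n)

# Coarsening during a slow forming cycle raises the flow stress at a fixed
# strain rate, so thinning localizes in the softer, finer-grained zones.
d, eps_dot, dt = 0.5, 1e-3, 10.0   # grain size (um), rate (1/s), step (s)
for i in range(5):
    sigma = flow_stress(eps_dot, d)
    print(f"t={i*dt:4.0f} s  d={d:.3f} um  sigma={sigma:.2f} MPa")
    d = grain_growth(d, dt)
```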
-
Method for prediction of aerodynamic characteristics of helicopter rotors based on edge-based schemes in code NOISEtte
Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1097-1122
The paper gives a detailed description of the developed methods for simulating the turbulent flow around a helicopter rotor and calculating its aerodynamic characteristics. The system of Reynolds-averaged Navier – Stokes equations for a viscous compressible gas closed by the Spalart – Allmaras turbulence model is used as the basic mathematical model. The model is formulated in a non-inertial rotating coordinate system associated with the rotor. To set the boundary conditions on the surface of the rotor, wall functions are used.
The numerical solution of the resulting system of differential equations is carried out on mixed-element unstructured grids including prismatic layers near the surface of the streamlined body. The numerical method is based on the original vertex-centered finite-volume EBR schemes. A feature of these schemes is their higher accuracy, which is achieved through the use of edge-based reconstruction of variables on extended quasi-one-dimensional stencils, and their moderate computational cost, which makes even serial computations feasible. The methods of Roe and Lax – Friedrichs are used as approximate Riemann solvers. The Roe method is corrected in the case of low-Mach flows. When dealing with discontinuities or solutions with large gradients, a quasi-one-dimensional WENO scheme or local switching to a quasi-one-dimensional TVD-type reconstruction is used. The time integration is carried out according to an implicit three-layer second-order scheme with Newton linearization of the system of difference equations. To solve the system of linear equations, the stabilized conjugate gradient method is used.
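The Lax – Friedrichs-type flux mentioned above can be illustrated in a drastically reduced setting: a first-order finite-volume scheme for the 1-D scalar Burgers equation rather than the full Navier – Stokes system solved in NOISEtte. The grid, CFL number, and initial profile are illustrative assumptions.

```python
# Local Lax-Friedrichs (Rusanov) flux for 1-D Burgers' equation on a
# periodic grid: central flux plus dissipation scaled by the largest
# local wave speed. A reduced illustration, not the NOISEtte solver.
import numpy as np

def f(u):                      # Burgers flux f(u) = u^2 / 2
    return 0.5 * u * u

def llf_flux(uL, uR):
    """Numerical flux at a face between states uL and uR."""
    a = np.maximum(np.abs(uL), np.abs(uR))   # local wave speed |f'(u)| = |u|
    return 0.5 * (f(uL) + f(uR)) - 0.5 * a * (uR - uL)

N, cfl = 200, 0.4
x = np.linspace(0.0, 1.0, N, endpoint=False)
u = np.sin(2 * np.pi * x) + 0.5
dx = 1.0 / N
t, t_end = 0.0, 0.3
while t < t_end:
    dt = min(cfl * dx / np.max(np.abs(u)), t_end - t)
    F = llf_flux(u, np.roll(u, -1))          # flux at right face of each cell
    u = u - dt / dx * (F - np.roll(F, 1))    # conservative update
    t += dt
print("mean of u (conserved quantity):", np.mean(u))
```

The conservative update keeps the cell average exactly constant on the periodic grid, while the added dissipation lets the scheme run stably through the shock that forms.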
The numerical methods are implemented as a part of the in-house code NOISEtte according to the two-level MPI–OpenMP parallel model, which allows high-performance computations on meshes consisting of hundreds of millions of nodes, while involving hundreds of thousands of CPU cores of modern supercomputers.
Based on the results of the numerical simulation, the aerodynamic characteristics of the helicopter rotor are calculated, namely, thrust, torque and their dimensionless coefficients.
The developed technique is validated by simulating the turbulent flow around the Caradonna – Tung two-blade rotor and the KNRTU-KAI four-blade model rotor in hover mode, a tail rotor in a duct, and a rigid main rotor in oblique flow. The numerical results are compared with the available experimental data.
-
Development of and research on an algorithm for distinguishing features in Twitter publications for a classification problem with known markup
Computer Research and Modeling, 2023, v. 15, no. 1, pp. 171-183
Social media posts play an important role in reflecting the state of financial markets, and their analysis is a powerful tool for trading. The article describes the results of a study of the impact of social media activity on the movement of financial markets. The top authoritative influencers are selected, and their Twitter posts are used as data. Such texts usually include slang and abbreviations, so methods for preparing primary text data, including Stanza and regular expressions, are presented. Two approaches to representing a point in time in the format of text data are considered, and the difference between the influence of a single tweet and that of a whole package of tweets collected over a certain period of time is investigated. A statistical approach in the form of frequency analysis is also considered, and metrics defined by the significance of a particular word in identifying the relationship between price changes and Twitter posts are introduced. Frequency analysis involves studying the occurrence distributions of various words and bigrams in the text for positive, negative or general trends. To build the markup, changes in the market are processed into a binary vector using various parameters, thus setting a binary classification task. The parameters of Binance candlesticks are tuned to better describe the movement of the cryptocurrency market, and their variability is also explored in this article. Sentiment is studied using Stanford CoreNLP. The results of the statistical analysis are relevant to feature selection for further binary or multiclass classification tasks. The presented methods of text analysis help increase the accuracy of models designed to solve natural language processing problems by selecting words and improving the quality of vectorization.
Such algorithms are often used in automated trading strategies to predict the price of an asset, the trend of its movement.
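The preprocessing and frequency-analysis steps described above can be sketched as regular-expression cleaning followed by word/bigram counts split by the binary market label. The tweets, labels, and cleaning rules below are toy assumptions, not the study's dataset or exact pipeline.

```python
# Regex cleaning of tweets, then word/bigram frequency counts split by a
# binary market-movement label. Toy data; illustrative cleaning rules.
import re
from collections import Counter

def clean(tweet):
    tweet = tweet.lower()
    tweet = re.sub(r"https?://\S+", " ", tweet)   # drop links
    tweet = re.sub(r"[@#](\w+)", r"\1", tweet)    # strip @/#, keep the word
    tweet = re.sub(r"[^a-z\s]", " ", tweet)       # drop punctuation/digits
    return tweet.split()

tweets = [
    ("BTC to the moon!!! #bullish https://t.co/x", 1),
    ("selling everything, market looks weak", 0),
    ("bullish on crypto, moon soon", 1),
    ("weak volume, expecting a dump", 0),
]

up, down = Counter(), Counter()
for text, label in tweets:
    tokens = clean(text)
    bigrams = list(zip(tokens, tokens[1:]))
    (up if label == 1 else down).update(tokens + bigrams)

# Terms frequent under one label but rare under the other are candidate
# features for the binary classifier.
print("up-only terms:", [t for t in up if t not in down])
```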
-
Generating database schema from requirement specification based on natural language processing and large language model
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1703-1713
A Large Language Model (LLM) is an advanced artificial intelligence algorithm that utilizes deep learning methodologies and extensive datasets to process, understand, and generate human-like text. These models are capable of performing various tasks, such as summarization, content creation, translation, and predictive text generation, making them highly versatile in applications involving natural language understanding. Generative AI, often associated with LLMs, specifically focuses on creating new content, particularly text, by leveraging the capabilities of these models. Developers can harness LLMs to automate complex processes, such as extracting relevant information from system requirement documents and translating it into a structured database schema. This capability has the potential to streamline the database design phase, saving significant time and effort while ensuring that the resulting schema aligns closely with the given requirements. By integrating LLM technology with Natural Language Processing (NLP) techniques, the efficiency and accuracy of generating database schemas from textual requirement specifications can be significantly enhanced. The proposed tool will utilize these capabilities to read system requirement specifications, which may be provided as text descriptions or as Entity-Relationship Diagrams (ERDs). It will then analyze the input and automatically generate a relational database schema in the form of SQL commands. This innovation eliminates much of the manual effort involved in database design, reduces human errors, and accelerates development timelines. The aim of this work is to provide a tool that can be invaluable for software developers, database architects, and organizations aiming to optimize their workflow and align technical deliverables with business requirements seamlessly.
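The final generation step described above (structured entities in, SQL commands out) can be sketched as follows. In the proposed tool the entity/attribute structure would come from NLP/LLM analysis of the requirement text; here it is hard-coded, and the entity names and column types are illustrative assumptions.

```python
# Turn an extracted entity/attribute structure into SQL CREATE TABLE
# statements, adding a junction table for each many-to-many relation.
# The entities below are hard-coded stand-ins for NLP/LLM output.

entities = {
    "student": {"id": "INTEGER PRIMARY KEY", "name": "TEXT", "email": "TEXT"},
    "course": {"id": "INTEGER PRIMARY KEY", "title": "TEXT"},
}
relations = [("student", "course")]   # many-to-many: enrollment

def to_sql(entities, relations):
    stmts = []
    for name, cols in entities.items():
        body = ",\n  ".join(f"{c} {t}" for c, t in cols.items())
        stmts.append(f"CREATE TABLE {name} (\n  {body}\n);")
    for a, b in relations:            # junction table per M:N relation
        stmts.append(
            f"CREATE TABLE {a}_{b} (\n"
            f"  {a}_id INTEGER REFERENCES {a}(id),\n"
            f"  {b}_id INTEGER REFERENCES {b}(id)\n);"
        )
    return "\n".join(stmts)

print(to_sql(entities, relations))
```

The generated script is plain SQL, so it can be validated directly against a database engine (e.g. executed in an in-memory SQLite database).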
-
Random forest of risk factors as a predictive tool for adverse events in clinical medicine
Computer Research and Modeling, 2025, v. 17, no. 5, pp. 987-1004
The aim of this study was to develop an ensemble machine learning method for constructing interpretable predictive models and to validate it using the example of predicting in-hospital mortality (IHM) in patients with ST-segment elevation myocardial infarction (STEMI).
A retrospective cohort study was conducted using data from 5446 electronic medical records of STEMI patients who underwent percutaneous coronary intervention (PCI). Patients were divided into two groups: 335 (6.2%) patients who died during hospitalization and 5111 (93.8%) patients with a favourable in-hospital outcome. A pool of potential predictors was formed using statistical methods. Through multimetric categorization (minimizing p-values, maximizing the area under the ROC curve (AUC), and SHAP value analysis), decision trees, and multivariable logistic regression (MLR), predictors were transformed into risk factors for IHM. Predictive models for IHM were developed using MLR, Random Forest of Risk Factors (RandFRF), Stochastic Gradient Boosting (XGBoost), Random Forest (RF), Adaptive Boosting, Gradient Boosting, Light Gradient-Boosting Machine, Categorical Boosting (CatBoost), Explainable Boosting Machine and Stacking methods.
The authors developed the RandFRF method, which integrates the predictive outcomes of modified decision trees, identifies risk factors and ranks them by their contribution to the risk of adverse outcomes. RandFRF enables the development of predictive models with high discriminative performance (AUC 0.908), comparable to models based on CatBoost and Stacking (AUC 0.904 and 0.908, respectively). In turn, the risk factors provide clinicians with information on the patient's risk group classification and the extent of their impact on the probability of IHM. The risk factors identified by RandFRF can serve not only as a rationale for the prediction results but also as a basis for developing more accurate models.
-
Numerical simulation of sportsman's external flow
Computer Research and Modeling, 2017, v. 9, no. 2, pp. 331-344
Numerical simulation of the external flow around a moving sportsman is presented. A unique method is developed for obtaining integral aerodynamic characteristics as functions of the flow regime (i.e. angle of attack, flow speed) and body position. Individual anthropometric characteristics and the moving boundaries of the sportsman (or sports equipment) during the race are taken into consideration.
Numerical simulation is realized using FlowVision CFD. The software is based on the finite volume method, high-performance numerical methods and reliable mathematical models of physical processes. FlowVision uses a Cartesian computational grid, and grid generation is a completely automated process. Local grid adaptation is used to resolve high pressure gradients and the complex shape of the object. The flow is simulated by solving the system of equations describing the motion of fluid and/or gas in the computational domain, including the mass, momentum and energy conservation equations, the equations of state, and the turbulence model equations. FlowVision permits flow simulation near moving bodies by transforming the computational domain according to the changes of the athlete's shape during the motion. Ski jumper aerodynamic characteristics are studied during all phases: take-off, in-run and flight. The investigation defined a simulation method that includes: an inverted statement of the external flow problem (the air flow velocity is set equal to the velocity of the motion, while the object is immobile); a technique for defining the changing boundary of the body; and multiple calculations based on the data of national team members. The research identified the main factors affecting jumping performance: aerodynamic forces, rotating moments, etc. The developed method was tested with active sportsmen; ski jumpers used it during preparations for the Sochi Olympic Games in 2014. A comparison of the predicted characteristics and experimental data shows good agreement. The versatility of the method is underlined by flow simulations performed for a swimmer and a skater. The designed technology is applicable to various natural and technical objects.
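The link between the simulated flow field and the integral aerodynamic characteristics mentioned above can be illustrated with the standard drag formula F_d = ½ρv²C_dA. The drag-coefficient and frontal-area values below are illustrative assumptions for two postures, not measured athlete data.

```python
# Drag force from the standard aerodynamic formula; posture values are
# illustrative assumptions, not measured athlete data.

def drag_force(v, Cd, A, rho=1.225):
    """Aerodynamic drag in newtons; v in m/s, A in m^2, rho in kg/m^3."""
    return 0.5 * rho * v**2 * Cd * A

# A more aerodynamic in-run posture (lower Cd*A) cuts drag substantially
# at typical take-off speeds.
for Cd, A, label in [(0.8, 0.45, "crouched"), (1.2, 0.70, "upright")]:
    print(f"{label}: {drag_force(25.0, Cd, A):.0f} N at 25 m/s")
```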
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index