All issues
- 2026 Vol. 18
- 2025 Vol. 17
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
- Conversion of the initial indices of the technological process of the smelting of steel for the subsequent simulation
Computer Research and Modeling, 2017, v. 9, no. 2, pp. 187-199. Views (last year): 6. Citations: 1 (RSCI). Production efficiency depends directly on the quality of process control, which in turn relies on the accuracy and efficiency of processing control and measurement information. Developing mathematical methods for studying the system relationships and regularities of operation, building mathematical models that take into account the structural features of the object under study, and writing software that implements these methods is therefore a pressing task. Practice has shown that the list of parameters involved in the study of a complex object of modern production ranges from a few dozen to several hundred items, and the degree of influence of each factor is not clear at the outset. Proceeding directly to identifying the model under these conditions is impossible: the amount of required information may be too great, and most of the work on collecting it would be wasted, since the degree of influence of most factors in the original list on the optimization would prove negligible. A necessary step in identifying a model of a complex object is therefore to reduce the dimension of the factor space. Most industrial plants are hierarchical group processes of mass and large-scale production characterized by hundreds of factors. (Data from the Moldova Steel Works were taken as the basis for implementing the mathematical methods and testing the constructed models.) To investigate the system relationships and regularities of operation of such complex objects, several informative parameters are usually chosen and sampled. This article describes the sequence for converting the initial indices of the steel smelting process into a form suitable for building a mathematical model for prediction purposes; the implementation also lays the groundwork for an automated product quality management system. The work comprises the following stages: collection and analysis of the basic data, construction of a table of correlated parameters, and reduction of the factor space by means of correlation pleiades and the method of weight coefficients. The results obtained make it possible to streamline the construction of a model of a multifactor process.
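The correlation-pleiades step this abstract describes can be sketched in a few lines: features whose pairwise correlation exceeds a threshold are grouped into a pleiad, and one representative per group is kept. The threshold, the greedy grouping, and the choice of representative below are illustrative assumptions, not the paper's exact procedure (which also applies weight coefficients):

    import numpy as np
    import pandas as pd

    def correlation_pleiades(df: pd.DataFrame, threshold: float = 0.8):
        """Greedily group columns with |Pearson r| >= threshold; keep one per group."""
        corr = df.corr().abs()
        remaining = list(df.columns)
        groups = []
        while remaining:
            seed = remaining.pop(0)
            group = [seed] + [c for c in remaining if corr.loc[seed, c] >= threshold]
            remaining = [c for c in remaining if c not in group]
            groups.append(group)
        representatives = [g[0] for g in groups]  # keep the seed of each pleiad
        return groups, df[representatives]

    # Usage: synthetic process indices with one redundant measurement
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    data = pd.DataFrame({"temp": x,
                         "temp_dup": x + 0.05 * rng.normal(size=200),
                         "pressure": rng.normal(size=200)})
    groups, reduced = correlation_pleiades(data)
    print(groups)  # [['temp', 'temp_dup'], ['pressure']] -- reduced factor space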
- Hybrid neural network for predicting coating characteristics in flame spraying
Computer Research and Modeling, 2026, v. 18, no. 1, pp. 101-116. The paper presents a hybrid artificial neural network model based on an architecture that incorporates a convolutional image encoder (CNN) and an attention module (Attention-based Multiple Instance Learning, Attention MIL). This module aggregates informative features from a sequence of frames capturing the flame spraying process. Additional technological parameters (air pressure, propane pressure, and standoff distance) are integrated into the model via a tabular channel, enabling it to account for the relationship between visual data and numerical process regime characteristics. The software implementation was developed using the Streamlit platform and the PyTorch library. It features an interactive interface for model training and result visualization, analysis of attention weights across frames, and a prediction mode for output characteristics: surface roughness ($R_a$) and the mass of the deposited coating ($m$). Experimental studies were conducted on data from real-world technological processes, and a comparative analysis of the accuracy of various model configurations was performed. The results demonstrate that the hybrid neural network, which combines visual and tabular features, achieves higher prediction accuracy compared to models using only a single modality. Furthermore, when comparing different implementations of the hybrid network, it was established that using the attention mechanism to process the series of flame spray images provides a significant increase in accuracy over simple averaging of features without attention. The application includes an attention visualization module that creates a montage of the most significant frames and displays their attention weights, allowing users to identify which frames had the greatest influence on the prediction. The model's capability for export to the ONNX format for integration into process control systems is also demonstrated. The proposed approach showcases the effectiveness of fusing visual and tabular information for manufacturing process monitoring tasks. The model can serve as a foundation for developing a decision support system or an automated quality control system for coatings produced by flame spraying. The limitations of the implemented model and prospects for its further development are also considered.
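A minimal PyTorch sketch of the architecture described above (per-frame CNN encoder, attention pooling over the frame sequence, fusion with a tabular channel). The layer sizes, the toy encoder, and the two-output head for $R_a$ and $m$ are illustrative assumptions, not the paper's configuration:

    import torch
    import torch.nn as nn

    class HybridAttentionMIL(nn.Module):
        """CNN frame encoder + attention MIL pooling + tabular fusion (sketch)."""
        def __init__(self, feat_dim=64, tab_dim=3):
            super().__init__()
            self.encoder = nn.Sequential(  # toy per-frame encoder
                nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, feat_dim))
            self.attn = nn.Sequential(nn.Linear(feat_dim, 32), nn.Tanh(), nn.Linear(32, 1))
            self.head = nn.Linear(feat_dim + tab_dim, 2)  # outputs: R_a and mass m

        def forward(self, frames, tab):
            # frames: (B, T, 3, H, W); tab: (B, tab_dim) = pressures, standoff distance
            B, T = frames.shape[:2]
            h = self.encoder(frames.flatten(0, 1)).view(B, T, -1)  # (B, T, feat_dim)
            w = torch.softmax(self.attn(h), dim=1)                 # attention over frames
            bag = (w * h).sum(dim=1)                               # weighted aggregation
            return self.head(torch.cat([bag, tab], dim=1)), w.squeeze(-1)

    model = HybridAttentionMIL()
    y, attn = model(torch.randn(2, 8, 3, 64, 64), torch.randn(2, 3))
    print(y.shape, attn.shape)  # (2, 2) predictions, (2, 8) frame attention weights

The returned per-frame attention weights are what an application like the one described can render as a montage of the most influential frames.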
- About applying Rayleigh formula based on the Kirchhoff integral equations for the seismic exploration problems
Computer Research and Modeling, 2017, v. 9, no. 5, pp. 761-771. Views (last year): 11. In this paper we present Rayleigh formulas obtained from Kirchhoff integral formulas, which can later be used to obtain migration images. The relevance of this study stems from the widespread use of migration in oil and gas seismic exploration. A distinctive feature of the work is the use of the elastic approximation to describe the dynamic behaviour of the geological medium, in contrast to the widespread acoustic approximation. The proposed approach should significantly improve the quality of seismic exploration in complex settings such as permafrost and the shelf zones of the southern and northern seas. The difficulty of applying the system of equations describing a linear-elastic medium to obtain Rayleigh formulas and algorithms based on them lies in a significant increase in the number of computations and in the mathematical and analytical complexity of the resulting algorithms compared with the acoustic case. For this reason, migration algorithms for elastic waves are not currently used in industrial seismic surveys, which creates certain difficulties, since the acoustic approximation describes only longitudinal seismic waves in geological media. This article presents the final analytical expressions that can be used to develop software systems based on the description of elastic seismic waves, longitudinal and transverse, thereby covering the entire range of seismic waves: longitudinal reflected PP- and SP-waves and transverse reflected PS- and SS-waves. Also presented is a comparison of numerical solutions obtained from the Rayleigh formulas with numerical solutions obtained by the grid-characteristic method. The value of this comparison lies in the fact that the method based on Rayleigh integrals rests on analytical expressions, while the grid-characteristic method numerically integrates the solution on a computational grid. In the comparison, different types of sources were considered: a point source model widely used in marine and land seismic surveying, and a plane-wave model, which is also sometimes used in field studies.
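For reference, in the acoustic approximation that the paper contrasts with, the classical Rayleigh formulas follow from the Kirchhoff (Helmholtz) integral by choosing a half-space Green's function via the method of images; up to sign conventions tied to the choice of normal and time factor they read

    p(\mathbf{r}) = \frac{1}{2\pi} \int_S p(\mathbf{r}_s)\, \frac{\partial}{\partial n}\!\left(\frac{e^{ikR}}{R}\right) dS,
    \qquad
    p(\mathbf{r}) = -\frac{1}{2\pi} \int_S \frac{\partial p(\mathbf{r}_s)}{\partial n}\, \frac{e^{ikR}}{R}\, dS,
    \qquad R = |\mathbf{r} - \mathbf{r}_s|.

These are the standard scalar formulas, not the paper's elastic expressions: the elastic case generalizes them to vector displacements with both P- and S-wave contributions, which is the source of the extra computational and analytical complexity noted above.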
- Principles of sustainable scientific software: lessons from developing a data processing program for small-angle neutron scattering
Computer Research and Modeling, 2026, v. 18, no. 2, pp. 335-358. The SAS program is the primary data processing tool for the YuMO small-angle neutron scattering spectrometer. The paper presents a retrospective analysis of its two-decade evolution, from a Fortran prototype to a modern software system. The analysis focuses on the architectural decisions that have ensured the program’s long-term viability and its ability to adapt to instrument upgrades.
The core solution was a modular architecture that abstracts the detector system. This enabled the seamless integration of data from two scattering detectors and, later, from a position-sensitive detector. A strict processing pipeline and a unified internal data representation formed the basis for physically grounded algorithms, including weighted merging of spectra, resolution-aware smoothing, and built-in statistical quality control. The program’s interfaces—a command line for batch processing and a graphical user interface for interactive work—are built upon a single computational core, ensuring result consistency and flexibility in use.
Long-term operation has confirmed that the underlying architectural principles naturally align with the key characteristics of international software quality standards, particularly those critical for long-term sustainability. Therefore, the development and evolution of SAS demonstrates a universal set of architectural principles that can serve as a foundation for building sustainable scientific software in related fields of experimental physics.
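Of the algorithms listed above, the weighted merging of spectra is the easiest to illustrate: where two detectors cover the same q-range, an inverse-variance weighted mean combines their intensities. This generic sketch assumes the spectra are already interpolated onto a shared q-grid; it is not the actual SAS merging code:

    import numpy as np

    def merge_spectra(i1, s1, i2, s2):
        """Inverse-variance weighted merge of two overlapping spectra.
        i1, i2: intensities on a shared q-grid; s1, s2: their standard errors."""
        w1, w2 = 1.0 / s1**2, 1.0 / s2**2
        merged = (w1 * i1 + w2 * i2) / (w1 + w2)
        err = np.sqrt(1.0 / (w1 + w2))  # error of the weighted mean
        return merged, err

    q = np.linspace(0.1, 1.0, 10)  # shared grid (illustrative units)
    m, e = merge_spectra(np.ones(10), 0.1 * np.ones(10),
                         1.2 * np.ones(10), 0.2 * np.ones(10))
    print(m[0], e[0])  # merged point is pulled toward the better-measured spectrum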
- Technology for collecting initial data for constructing models for assessing the functional state of a human by pupil's response to illumination changes in the solution of some problems of transport safety
Computer Research and Modeling, 2021, v. 13, no. 2, pp. 417-427. This article solves the problem of developing a technology for collecting the initial data used to build models for assessing the functional state of a person. This state is assessed from the person's pupillary response to a change in illumination using the pupillometry method. The method involves the collection and analysis of initial data (pupillograms) presented as time series that characterize the dynamics of the response of the human pupils to a light impulse. The drawbacks of the traditional approach to collecting initial data using computer vision methods and time-series smoothing are analyzed. Attention is focused on the importance of the quality of the initial data for the construction of adequate mathematical models, and the need for manual marking of the iris and pupil circles is substantiated as a way to improve the accuracy and quality of the initial data. The stages of the proposed technology for collecting initial data are described. An example of an obtained pupillogram is given; it has a smooth shape and contains no outliers, noise, anomalies, or missing values. Based on the presented technology, a hardware-software complex has been developed, consisting of special software with two main modules and hardware implemented on a Raspberry Pi 4 Model B microcomputer with peripheral equipment that provides the specified functionality. To evaluate the effectiveness of the developed technology, models of a single-layer perceptron and an ensemble of neural networks were used, built on initial data on the functional state of intoxication of a person. The studies have shown that the use of manual marking of the initial data (in comparison with automatic computer vision methods) reduces the number of errors of the first and second kind and, accordingly, increases the accuracy of assessing the functional state of a person. Thus, the presented technology for collecting initial data can be effectively used to build adequate models for assessing the functional state of a person by the pupillary response to changes in illumination. Such models are relevant to individual problems of ensuring transport security, in particular monitoring the functional state of drivers.
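The abstract stresses that pupillograms must be free of outliers, noise, and missing values before modelling. A minimal sketch of such a quality gate on a pupil-diameter series follows; the jump threshold is an illustrative assumption, not a value from the paper:

    import numpy as np

    def validate_pupillogram(d, max_jump=0.5):
        """Reject a pupil-diameter series (mm) with missing values or
        implausible sample-to-sample jumps (crude spike/outlier check)."""
        d = np.asarray(d, dtype=float)
        if np.isnan(d).any():
            return False, "missing values"
        if np.abs(np.diff(d)).max() > max_jump:
            return False, "spike or outlier detected"
        return True, "ok"

    # A smooth constriction-and-recovery response to a light impulse
    print(validate_pupillogram([3.2, 3.1, 2.8, 2.5, 2.4, 2.6]))  # (True, 'ok')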
- Modeling system of extrusion and forming polymeric materials for blown film quality control
Computer Research and Modeling, 2014, v. 6, no. 1, pp. 137-158. Views (last year): 7. Citations: 3 (RSCI). Flexible software for modeling polymeric film production by blown extrusion has been developed. It consists of a library of mathematical models for the extrusion and forming of blown film, a subsystem for changeover to a new type of film, and a subsystem for investigating extrusion and forming for film quality control during production. The changeover subsystem makes it possible to choose the equipment of the extrusion line by technical and economic indices, to synthesize a 3D model of the line, and to generate regulation ranges of regime parameters for a given type of film. The investigation subsystem makes it possible to calculate the temperature profiles of heating and cooling of the material and the geometrical and optical characteristics of the film depending on the regime parameters at the extrusion and forming stages, and to evaluate regime parameters that ensure the given quality of polymeric film.
- Detection of influence of upper working roll's vibration on thickness of sheet at cold rolling with the help of DEFORM-3D software
Computer Research and Modeling, 2017, v. 9, no. 1, pp. 111-116. Views (last year): 12. Citations: 1 (RSCI). Current trends in technical diagnostics are connected with the application of FEM computer simulation, which allows, to some extent, replacing real experiments, reducing research costs, and minimizing risks. Already at the research and development stage, computer simulation makes it possible to diagnose equipment and determine the permissible fluctuations of its operating parameters. A peculiarity of diagnosing rolling equipment is that its functioning is directly tied to manufacturing a product of the required quality, including dimensional accuracy, which makes the design of techniques for technical diagnosis and diagnostic modelling especially important. Computer simulation of the cold rolling of a strip was carried out in which the upper working roll vibrated in the horizontal direction in accordance with published experimental data from a continuous 1700 rolling mill. The vibration of the working roll in the stand arose from the gap between the roll chock and the stand guide and led to periodic fluctuations of the strip thickness. The computer simulation performed with the DEFORM software produced a strip with longitudinal and transverse thickness variation. The visualization of the strip's geometrical parameters obtained from the simulation corresponded to the type of surface inhomogeneity observed on strips rolled in practice. A further analysis of the thickness variation was performed in order to identify, on the basis of the simulation, the sources of periodic components of the strip thickness caused by equipment malfunctions. The advantage of computer simulation in searching for the sources of thickness variation is that different hypotheses about thickness formation can be tested without real experiments, reducing costs of various kinds; moreover, in a simulation the initial strip thickness has no fluctuations, as opposed to industrial or laboratory experiments. On the basis of spectral analysis of the random process, it was established that the frequency of the variation of the strip thickness after rolling in one stand coincides with the frequency of the working roll's vibration. The results of the computer simulation agree with the results of the studies on the 1700 mill. Thus, the possibility of applying computer simulation to find the causes of strip thickness variation on an industrial rolling mill has been demonstrated.
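The spectral analysis step is straightforward to reproduce: take the FFT of the simulated thickness signal and check that its dominant frequency matches the roll vibration frequency. All numbers below are illustrative, not data from the 1700 mill:

    import numpy as np

    fs = 100.0                    # sampling frequency of the thickness signal, Hz
    t = np.arange(0, 10, 1 / fs)
    f_roll = 7.0                  # assumed roll vibration frequency, Hz
    thickness = (2.0 + 0.01 * np.sin(2 * np.pi * f_roll * t)
                 + 0.002 * np.random.randn(t.size))  # mean 2 mm + ripple + noise

    spectrum = np.abs(np.fft.rfft(thickness - thickness.mean()))
    freqs = np.fft.rfftfreq(t.size, d=1 / fs)
    print(freqs[spectrum.argmax()])  # ~7.0 Hz: coincides with the roll vibration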
- Simulation of laser polishing for fused quartz
Computer Research and Modeling, 2026, v. 18, no. 2, pp. 399-421. Laser polishing is a promising technology for the finishing of fused quartz (fused silica or quartz glass) products, enabling the removal of subsurface defects induced by mechanical processing. However, the complexity and nonlinearity of the physical processes occurring during laser irradiation complicate the selection of optimal technological parameters. The present paper aims to develop, comparatively analyze, and apply high-precision predictive models for forecasting and optimizing the key performance indicators of the laser polishing process for quartz glass. A verified finite element model implemented in the ANSYS software environment produced a dataset of temperature and stress fields for various combinations of process parameters. This dataset was used to develop and validate four types of predictive models: Polynomial Regression, a Fuzzy Logic System, an Adaptive Neuro-Fuzzy Inference System (ANFIS), and a Multilayer Perceptron (MLP) neural network. The models’ quality was evaluated on a test set using the statistical metrics MAE, RMSE, MAPE, $R^2$, and $R^2_{Adj}$. A comparative analysis of the models revealed the significant superiority of the MLP neural network, which demonstrated the highest prediction accuracy for all output parameters, achieving adjusted $R^2$ ($R^2_{Adj}$) values above 0.97 and a Mean Absolute Percentage Error (MAPE) in the range of 0.7-2.8%. This model was effectively utilized as a surrogate function in combination with a genetic algorithm to successfully identify the optimal process parameters. The constructed MLP neural network model functions as a reliable and high-precision tool, facilitating both prediction and the optimization of fused quartz polishing outcomes using a CO$_2$ laser. This approach effectively approximates the complex nonlinear dependencies inherent in the process and can serve as a foundation for developing intelligent control and optimization systems for this technology.
Keywords: laser polishing, ANSYS, modeling, regression, fuzzy logic system, ANFIS, neural network model, optimization.
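The surrogate-plus-genetic-algorithm loop the abstract describes can be sketched compactly: fit an MLP to simulation samples, then evolve a population of parameter vectors against the surrogate. The parameter names, bounds, synthetic objective, and GA settings below are illustrative assumptions, not the paper's setup:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    lo, hi = np.array([10.0, 1.0]), np.array([100.0, 20.0])  # power [W], speed [mm/s]
    X = rng.uniform(lo, hi, size=(200, 2))
    y = (X[:, 0] - 60) ** 2 / 500 + (X[:, 1] - 8) ** 2 / 20  # synthetic roughness
    mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                       random_state=0).fit(X, y)             # surrogate model

    pop = rng.uniform(lo, hi, size=(50, 2))
    for _ in range(40):                                      # select, crossover, mutate
        parents = pop[np.argsort(mlp.predict(pop))[:25]]     # truncation selection
        children = (parents[rng.integers(0, 25, 25)]
                    + parents[rng.integers(0, 25, 25)]) / 2  # arithmetic crossover
        children += rng.normal(0.0, [2.0, 0.5], size=children.shape)  # Gaussian mutation
        pop = np.clip(np.vstack([parents, children]), lo, hi)
    best = pop[np.argmin(mlp.predict(pop))]
    print(best)  # approaches the synthetic optimum near (60, 8)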
- A survey on the application of large language models in software engineering
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1715-1726. Large Language Models (LLMs) are transforming software engineering by bridging the gap between natural language and programming languages. These models have revolutionized communication within development teams and the Software Development Life Cycle (SDLC) by enabling developers to interact with code using natural language, thereby improving workflow efficiency. This survey examines the impact of LLMs across various stages of the SDLC, including requirement gathering, system design, coding, debugging, testing, and documentation. LLMs have proven to be particularly useful in automating repetitive tasks such as code generation, refactoring, and bug detection, thus reducing manual effort and accelerating the development process. The integration of LLMs into the development process offers several advantages, including the automation of error correction, enhanced collaboration, and the ability to generate high-quality, functional code based on natural language input. Additionally, LLMs assist developers in understanding and implementing complex software requirements and design patterns. This paper also discusses the evolution of LLMs from simple code completion tools to sophisticated models capable of performing high-level software engineering tasks. However, despite their benefits, there are challenges associated with LLM adoption, such as issues related to model accuracy, interpretability, and potential biases. These limitations must be addressed to ensure the reliable deployment of LLMs in production environments. The paper concludes by identifying key areas for future research, including improving the adaptability of LLMs to specific software domains, enhancing their contextual understanding, and refining their capabilities to generate semantically accurate and efficient code. This survey provides valuable insights into the evolving role of LLMs in software engineering, offering a foundation for further exploration and practical implementation.
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index (RSCI)