All issues
- 2026 Vol. 18
- 2025 Vol. 17
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
Experimental identification of the organization of mental calculations of the person on the basis of algebras of different associativity
Computer Research and Modeling, 2019, v. 11, no. 2, pp. 311-327
The work continues research into the human ability to improve the productivity of information processing by working in parallel or by raising the speed of the analyzers. A subject receives a series of tasks whose solution requires processing a certain amount of information; the time and correctness of each solution are recorded, and the dependence of the average solution time on the amount of information in the task is determined over the correctly solved tasks. In accordance with the proposed method, the tasks involve evaluating expressions in two algebras, one associative and the other non-associative. To facilitate the subjects' work, figurative graphic images of the algebra elements were used in the experiment. Non-associative computation was implemented as the game "rock-paper-scissors": the subject had to determine the winning symbol in a long line of these figures, assuming they appear sequentially from left to right and each plays against the previous winner. Associative computation was based on the recognition of pictures from a finite set of simple images: the subject had to determine which picture from this set was missing from the line, or state that all pictures were present; in each task at most one picture was missing. Computation in an associative algebra admits parallel counting, while in the absence of associativity only sequential computation is possible. Analysis of the time needed to solve a series of tasks therefore distinguishes uniform sequential, accelerated sequential, and parallel computing strategies. The experiments showed that all subjects used a uniform sequential strategy for the non-associative tasks. For the associative task all subjects used parallel computing, and some accelerated the parallel computation as the complexity of the task grew. A small fraction of the subjects, judging by the evolution of the solution time at high complexity, supplemented the parallel counting with a sequential stage of computation (possibly to verify the solution). We developed a special method for assessing the rate at which a person processes input information; it allowed us to estimate the level of parallelism of the computation in the associative task, and parallelism of level two to three was registered. The characteristic speed of information processing in the sequential case (about one and a half symbols per second) is half the typical speed of human image recognition; apparently the difference reflects the time actually spent on the computation itself. For the associative task with the minimum amount of information, the solution time is close to that of the non-associative case, or smaller by at most a factor of two. This is probably because, for a small number of symbols, recognition nearly exhausts the computation in the non-associative task used.
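To make the associativity distinction concrete, here is a minimal Python sketch (not the authors' experimental software) of the two task algebras; the tie rule in the rock-paper-scissors product and the picture alphabet are assumptions for illustration.

```python
# The rock-paper-scissors "product": the next symbol plays the current winner.
from functools import reduce

BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def rps(winner, challenger):
    """Non-associative product; a tie leaves the current winner in place."""
    return challenger if BEATS[challenger] == winner else winner

line = ["rock", "paper", "paper", "scissors", "rock"]
print(reduce(rps, line))  # must be folded strictly left to right -> "rock"

# Non-associativity: regrouping changes the result, so no parallel splitting.
print(rps(rps("rock", "paper"), "scissors"))  # -> "scissors"
print(rps("rock", rps("paper", "scissors")))  # -> "rock"

# The associative task: name the one missing picture in a line, or report none.
# Set union/difference is associative, so chunks of the line can be processed
# in parallel and the partial results combined in any order.
ALPHABET = {"sun", "tree", "house", "fish", "star"}

def missing(line):
    rest = ALPHABET - set(line)
    return rest.pop() if rest else None

print(missing(["sun", "tree", "house", "fish"]))  # -> "star"
```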
-
Using Docker service containers to build browser-based clinical decision support systems (CDSS)
Computer Research and Modeling, 2026, v. 18, no. 1, pp. 133-147
The article presents a technology for building clinical decision support systems (CDSS) based on service containers using Docker and a web interface that runs directly in the browser, without installing specialized software on the clinician's workstation. A modular architecture is proposed in which each application module is packaged as an independent service container combining a lightweight web server, a user interface, and computational components for medical image processing. Communication between the browser and the server side is implemented via a persistent bidirectional WebSocket connection with binary message serialization (MessagePack), which provides low latency and efficient transfer of large data volumes. For local storage of images and analysis results, browser facilities (IndexedDB with the Dexie.js wrapper) are used to speed up repeated data access. Three-dimensional visualization and basic operations with DICOM data are implemented with Three.js and AMI.js: this toolchain supports the integration of interactive elements arising from the task context (annotations, landmarks, markers, 3D models) into volumetric medical images.
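The transport pattern described above can be illustrated with a short server-side sketch using the real `websockets` and `msgpack` Python packages (single-argument handler, websockets 11+); the message fields `op` and `payload` are assumptions for illustration, not the article's protocol.

```python
# Persistent bidirectional WebSocket with binary MessagePack frames.
import asyncio
import msgpack
import websockets

async def handler(ws):
    async for frame in ws:                       # binary frames from the browser
        msg = msgpack.unpackb(frame)             # decode MessagePack
        if msg.get("op") == "echo":              # placeholder for a compute module
            reply = {"op": "result", "payload": msg["payload"]}
            await ws.send(msgpack.packb(reply))  # compact binary reply

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()                   # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```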
Server components and functional modules are assembled as a set of interacting containers managed by Docker. The paper discusses the choice of base images, approaches to minimizing containers down to runtime-only executables without external utilities, and the organization of multi-stage builds with a dedicated build container. It describes a hub service that launches application containers on user request, performs request proxying, manages sessions, and switches a container from shared to exclusive mode at the start of computations. Examples of application modules are provided (fractional flow reserve estimation, quantitative flow ratio computation, aortic valve closure modeling), along with the integration of a React-based interface with a three-dimensional scene, a versioning policy, automated reproducibility checks, and the deployment procedure on the target platform.
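The hub's container-launching step can be sketched with the official Docker SDK for Python (`pip install docker`); the image name, port, and label below are hypothetical, and the proxying and session management of the real hub are omitted.

```python
# Launch an application container on user request and discover its host port.
import docker

client = docker.from_env()

def launch_module(image: str = "cdss/ffr-module:1.0"):
    container = client.containers.run(
        image,
        detach=True,
        ports={"8080/tcp": None},     # let Docker assign a free host port
        labels={"cdss.session": "demo"},
    )
    container.reload()                # refresh attrs to read the assigned port
    port = container.attrs["NetworkSettings"]["Ports"]["8080/tcp"][0]["HostPort"]
    return container.id, port
```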
It is demonstrated that containerization ensures portability and reproducibility of the software environment, dependency isolation and scalability, while the browser-based interface provides accessibility, reduced infrastructure requirements, and interactive real-time visualization of medical data. Technical limitations are noted (dependence on versions of visualization libraries and data formats) together with practical mitigation measures.
-
Signal and noise calculation at Rician data analysis by means of combining maximum likelihood technique and method of moments
Computer Research and Modeling, 2018, v. 10, no. 4, pp. 511-523
The paper develops a new mathematical method for jointly calculating the signal and noise parameters of the Rice statistical distribution, based on combining the maximum likelihood method and the method of moments. The sought-for values of signal and noise are calculated by processing sampled measurements of the analyzed Rician signal's amplitude. An explicit system of equations for the required signal and noise parameters has been obtained, and the results of its numerical solution are provided, confirming the efficiency of the proposed technique. It is shown that solving the two-parameter task by the proposed technique does not increase the required computational resources compared with solving the task in the one-parameter approximation. An analytical solution has been obtained for the particular case of a small signal-to-noise ratio. The paper investigates the dependence of the estimation accuracy and dispersion of the sought-for parameters on the number of measurements in the experimental sample. According to the results of numerical experiments, the dispersions of the estimated signal and noise parameters calculated by the proposed technique decrease in inverse proportion to the number of measurements in a sample. The accuracy of estimating the Rician parameters by the proposed technique is compared with that of an earlier developed version of the method of moments. The problem considered in the paper is meaningful for Rician data processing, in particular in magnetic resonance imaging systems, ultrasonic imaging devices, the analysis of optical signals in range-measuring systems, radar signal analysis, and many other scientific and applied tasks adequately described by the Rice statistical model.
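For readers unfamiliar with moment-based Rician estimation, the sketch below shows the classical two-moment estimator (not the paper's combined maximum-likelihood/moments algorithm). It uses the identities E[A^2] = nu^2 + 2*sigma^2 and E[A^4] = nu^4 + 8*nu^2*sigma^2 + 8*sigma^4, which give nu^4 = 2*m2^2 - m4.

```python
# Method-of-moments estimation of the Rice parameters from amplitude samples.
import numpy as np

rng = np.random.default_rng(0)
nu, sigma, n = 3.0, 1.0, 100_000

# A Rician amplitude is the modulus of a complex Gaussian with mean nu.
a = np.abs(nu + sigma * rng.standard_normal(n)
           + 1j * sigma * rng.standard_normal(n))

m2, m4 = np.mean(a**2), np.mean(a**4)
nu_hat = max(2 * m2**2 - m4, 0.0) ** 0.25             # clip: noise may push it < 0
sigma_hat = np.sqrt(max(m2 - nu_hat**2, 0.0) / 2)
print(f"nu ~ {nu_hat:.3f}, sigma ~ {sigma_hat:.3f}")  # close to 3.0 and 1.0
```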
-
Approach to Estimating the Dynamics of the Industry Consolidation Level
Computer Research and Modeling, 2023, v. 15, no. 1, pp. 129-140
In this article we propose a new approach to the analysis of econometric industry parameters for estimating the industry consolidation level. The research is based on a simple automatic control model of the industry. The state of the industry is measured by econometric parameters obtained quarterly from each of the industry's companies and provided by the tax control regulator. We propose an approach to the analysis of the industry that does not track the economy of each company but explores the parameters of the set of all companies as a whole. The quarterly obtained econometric parameters of each company are income, number of employees, taxes, and income from software licenses. The ABC analysis method was modified into ABCD analysis (D denoting companies with zero-level impact on industry metrics) and used to make the results obtained for different indicators comparable. Pareto charts were formed for the set of econometric indicators.
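A sketch of the modified classification is shown below; the 80/15/5 cumulative-share cutoffs are the conventional ABC thresholds and are an assumption here, since the article does not state its own.

```python
# ABCD analysis: rank companies by contribution, label by cumulative share,
# and mark zero-impact companies as class D.
import numpy as np

def abcd_classes(values, a=0.80, b=0.95):
    v = np.asarray(values, dtype=float)
    order = np.argsort(v)[::-1]                  # largest contributors first
    cum = np.cumsum(v[order]) / v.sum()
    labels = np.empty(len(v), dtype="<U1")
    labels[order] = np.where(cum <= a, "A", np.where(cum <= b, "B", "C"))
    labels[v == 0] = "D"                         # zero-level impact
    return labels

incomes = [120.0, 90.0, 40.0, 10.0, 5.0, 0.0, 0.0]
print(abcd_classes(incomes))   # -> ['A' 'A' 'B' 'C' 'C' 'D' 'D']
```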
To estimate industry monopolization, the Herfindahl–Hirschman index (HHI) was calculated for the most sensitive company metrics. Using the HHI approach, it was shown that COVID-19 did not lead to changes in the monopolization of the Russian IT industry.
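The index itself is simple: with market shares in percent, HHI is the sum of squared shares, ranging up to 10000 for a pure monopoly. A minimal sketch with illustrative figures (not the article's data):

```python
# Herfindahl–Hirschman index from raw company metrics.
def hhi(values):
    total = sum(values)
    return sum((100.0 * v / total) ** 2 for v in values)

# ~3449.5: concentrated by the usual 2500 threshold
print(round(hhi([120, 90, 40, 10, 5]), 1))
```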
As the most visually obvious approach to industry visualization, scatter diagrams in combination with the Pareto chart colors were proposed. The effect of the accreditation procedure is clearly observed in the scatter diagrams, with red and black dots for accredited and non-accredited companies respectively.
The last reported result is the proposal to use end-to-end product identification of licenses as a market structure control instrument: it provides the basis for avoiding multiple accounting of license resales within the software distribution chain.
The results of the research could serve as a basis for future analysis and agent-based simulation of the IT industry.
-
Pareto optimal analysis of global warming prevention by geoengineering methods
Computer Research and Modeling, 2015, v. 7, no. 5, pp. 1097-1108
The study is based on a three-dimensional hydrodynamic coupled global climate model, including an ocean model with realistic depths and continent configuration, a sea ice evolution model, and an energy and moisture balance atmosphere model. The aerosol concentration from the year 2010 to 2100 is calculated as a controlling parameter to stabilize the mean annual surface air temperature. It is shown that in this way it is impossible to achieve a spatially and seasonally uniform approximation to the existing climate, although the greenhouse warming effect can be significantly reduced. The climate will be colder by 0.1–0.2 degrees in the low and middle latitudes and warmer by 0.2–1.2 degrees at high latitudes. The Pareto frontier is investigated and visualized for two criteria: the mean square deviation of atmospheric temperature for the winter and for the summer season. The Pareto optimal amount of sulfur emissions would be between 23.5 and 26.5 TgS/year.
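The selection step behind such a diagram can be sketched generically: given candidate model runs scored by the two criteria (both minimized), the Pareto frontier keeps the non-dominated runs. The points below are illustrative, not the model's output.

```python
# Non-dominated filtering for two minimized criteria
# (e.g., winter and summer temperature RMS deviations).
def pareto_front(points):
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return sorted(front)

runs = [(0.9, 0.4), (0.7, 0.7), (0.5, 1.1), (0.8, 0.6), (1.0, 1.0)]
print(pareto_front(runs))  # -> [(0.5, 1.1), (0.7, 0.7), (0.8, 0.6), (0.9, 0.4)]
```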
-
This work is devoted to the creation of a static atomic model of two surfaces in contact during electric diamond grinding: a single-point diamond and the material it grinds. At the heart of the work are issues of computer visualization of these surfaces at the molecular level, since the traditional mathematical description is not visual enough to demonstrate some aspects of the atomic tribology of metal cutting, where processes of different physical nature occur simultaneously. In electric diamond grinding several processes blend their effects at once: mechanical, electrical, and electrochemical. The modeling technique proposed by the authors is therefore still the only way to see what happens at the atomic level when material is cut by a single-point diamond.
-
Detection of influence of upper working roll's vibration on thickness of sheet at cold rolling with the help of DEFORM-3D software
Computer Research and Modeling, 2017, v. 9, no. 1, pp. 111-116
Current trends in technical diagnosis involve FEM computer simulation, which allows, to some extent, replacing real experiments, reducing research costs, and minimizing risks. Already at the research and development stage, computer simulation makes it possible to diagnose equipment and detect permissible fluctuations of its operating parameters. A peculiarity of diagnosing rolling equipment is that its functioning is directly tied to manufacturing a product of the required quality, including accuracy, so the design of techniques for technical diagnosis and diagnostic modelling is very important. Computer simulation of cold rolling of a strip was carried out, in which the upper working roll vibrated in the horizontal direction in accordance with published experimental data for the continuous 1700 rolling mill. The vibration of the working roll in the stand arose from the gap between the roll's chock and the stand's guide and led to periodic fluctuations of the strip's thickness. The simulation with the DEFORM software produced a strip with longitudinal and transversal thickness variation, and the visualization of the strip's geometrical parameters corresponded to the type of surface inhomogeneity of strips rolled in practice. Further analysis of the thickness variation was performed to identify, on the basis of the simulation, the sources of periodic components of the strip's thickness caused by equipment malfunctions. The advantage of computer simulation in searching for the sources of thickness variation is that different hypotheses about thickness formation can be tested without real experiments, reducing costs of various kinds; moreover, in simulation the initial strip thickness has no fluctuations, in contrast to industrial or laboratory experiments. On the basis of spectral analysis of the random process, it was established that the frequency of the thickness variation of the strip after rolling in one stand coincides with the frequency of the working roll's vibration, and the simulation results correlate with the research results for the 1700 mill. This demonstrates the opportunity to apply computer simulation to find the causes of strip thickness variation on an industrial rolling mill.
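The spectral check mentioned above amounts to locating the dominant peak in the thickness signal's spectrum and comparing it with the roll's vibration frequency. A generic sketch on a synthetic signal (sampling rate and frequencies are illustrative):

```python
# Find the dominant frequency of a strip-thickness signal via the FFT.
import numpy as np

fs = 100.0                              # samples per second (assumed)
t = np.arange(0, 20, 1 / fs)
roll_freq = 4.0                         # roll vibration frequency, Hz (assumed)
thickness = (2.0 + 0.01 * np.sin(2 * np.pi * roll_freq * t)
             + 0.002 * np.random.default_rng(1).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(thickness - thickness.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(f"dominant component: {freqs[spectrum.argmax()]:.2f} Hz")  # ~4.00 Hz
```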
-
Investigation of the averaged model of coked catalyst oxidative regeneration
Computer Research and Modeling, 2021, v. 13, no. 1, pp. 149-161
The article is devoted to the construction and investigation of an averaged mathematical model of the oxidative regeneration of an aluminum-cobalt-molybdenum hydrocracking catalyst. Oxidative regeneration is an effective means of restoring the activity of the catalyst when its granules become coated with coke deposits.
The mathematical model of this process is a nonlinear system of ordinary differential equations, which includes kinetic equations for the reagents' concentrations and equations for the changes in the temperature of the catalyst granule and of the reaction mixture resulting from exothermic reactions and heat transfer between the gas and the catalyst layer. Due to the heterogeneity of the oxidative regeneration process, some of the equations differ from standard kinetic ones and are based on empirical data. The article discusses the scheme of chemical interaction in the regeneration process, on the basis of which the material balance equations are compiled. It reflects the direct interaction of coke and oxygen, taking into account the degree of coverage of the coke granule with carbon-hydrogen and carbon-oxygen complexes, the release of carbon monoxide and carbon dioxide during combustion, as well as the release of oxygen and hydrogen inside the catalyst granule. The change in the radius and, consequently, in the surface area of the coke pellets is taken into account. The adequacy of the developed averaged model is confirmed by an analysis of the dynamics of the concentrations of substances and of the temperature.
The article presents a numerical experiment with the mathematical model of oxidative regeneration of the aluminum-cobalt-molybdenum hydrocracking catalyst. The experiment was carried out using the Kutta–Merson method. This method belongs to the Runge–Kutta family and, thanks to its embedded error estimate and automatic step-size control, is suited to the stiff systems of ordinary differential equations arising here. The results of the computational experiment are visualized.
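For reference, one adaptive step of the classical Kutta–Merson scheme with its embedded error estimate can be sketched as follows; this illustrates the method itself, not the authors' code, and f stands for a generic right-hand side of the kinetic system.

```python
# One Kutta–Merson step: 4th-order update plus an error estimate for step control.
import numpy as np

def merson_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 3, y + h * k1 / 3)
    k3 = f(t + h / 3, y + h * (k1 + k2) / 6)
    k4 = f(t + h / 2, y + h * (k1 + 3 * k3) / 8)
    k5 = f(t + h, y + h * (k1 - 3 * k3 + 4 * k4) / 2)
    y_new = y + h * (k1 + 4 * k4 + k5) / 6               # 4th-order solution
    err = np.max(np.abs(h * (2 * k1 - 9 * k3 + 8 * k4 - k5) / 30))
    return y_new, err                                    # err drives h selection

# Toy usage: exponential decay y' = -y standing in for a kinetic equation.
y_next, err = merson_step(lambda t, y: -y, 0.0, np.array([1.0]), 0.1)
print(y_next, err)   # y_next ~ exp(-0.1) = 0.9048...
```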
The paper presents the dynamics of the concentrations of the substances involved in the oxidative regeneration process. A conclusion on the adequacy of the constructed mathematical model is drawn from the correspondence of the obtained results to physicochemical laws. The heating of the catalyst granule and the release of carbon monoxide under a changing granule radius are analyzed for various degrees of initial coking, and the results are described.
In conclusion, the main results are summarized and examples of problems that can be solved using the developed mathematical model are given.
-
Computer simulation of temperature field of blast furnace’s air tuyere
Computer Research and Modeling, 2017, v. 9, no. 1, pp. 117-125
Studying the operation of heating equipment is a relevant problem, because it allows determining the optimal regimes that reach the highest efficiency. Computer simulation is very helpful here: it predicts how different heating modes influence the effectiveness of the heating process and the wear of the equipment, provides results whose accuracy has been confirmed by many studies, and requires less cost and time than real experiments. In the present research, the heating of a blast furnace air tuyere was simulated with FEM software. Background studies revealed the possibility of treating it as a flat axisymmetric problem, so the DEFORM-2D software was used. The geometry necessary for the simulation was designed in SolidWorks, saved in .dxf format, exported into the DEFORM-2D pre-processor, and positioned, after which the initial and boundary conditions were set up. Several operating modes were analyzed; to demonstrate the influence of each mode and for better visualization, the point-tracking option of the DEFORM-2D post-processor was applied. The influence on the tuyere's temperature field of a thermal insulation box inserted into the blow channel, with and without an air gap, and of a thermal coating was investigated. The simulation data demonstrated a significant effect of the thermal insulation box on the tuyere's temperature field. The designed model also made it possible to simulate the tuyere's burnout as a result of interaction with liquid iron. The conducted research demonstrated the effectiveness of DEFORM-2D for simulating heat transfer and heating processes; it will be used in further studies dedicated to more complex processes connected with the temperature field of the blast furnace's air tuyere.
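As a generic illustration of the kind of computation behind such simulations (unrelated to DEFORM-2D's actual solver), here is an explicit finite-difference sketch of axisymmetric heat conduction through a tuyere wall; all material constants and geometry are invented.

```python
# Explicit finite differences for the radial heat equation T_t = alpha*(T_rr + T_r/r).
import numpy as np

alpha = 1.2e-5                           # thermal diffusivity, m^2/s (assumed)
r = np.linspace(0.05, 0.08, 61)          # inner to outer wall radius, m
dr = r[1] - r[0]
dt = 0.2 * dr**2 / alpha                 # below the explicit stability limit
T = np.full(r.size, 300.0)               # initial temperature, K

for _ in range(20_000):
    lap = ((T[2:] - 2 * T[1:-1] + T[:-2]) / dr**2
           + (T[2:] - T[:-2]) / (2 * dr * r[1:-1]))
    T[1:-1] += alpha * dt * lap
    T[0], T[-1] = 1400.0, 400.0          # hot blast side / cooled outer side

print(f"mid-wall temperature: {T[r.size // 2]:.0f} K")
```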
-
Biomathematical system for the description of nucleic acids
Computer Research and Modeling, 2020, v. 12, no. 2, pp. 417-434
The article is devoted to the application of various methods of mathematical analysis for searching for patterns and studying the nucleotide composition of DNA sequences at the genomic level. New methods of mathematical biology that make it possible to detect and visualize the hidden ordering of genetic nucleotide sequences located in the chromosomes of the cells of living organisms are described. The research is based on the work on algebraic biology of S. V. Petukhov, doctor of physical and mathematical sciences, who first introduced and justified new algebras and hypercomplex numerical systems describing genetic phenomena. This paper describes a new phase in the development of matrix methods in genetics for studying the properties of nucleotide sequences (and their physicochemical parameters), built on the principles of finite geometry. The aim of the study is to demonstrate the capabilities of the new algorithms and to discuss the discovered properties of genetic DNA and RNA molecules. The study includes three stages: parameterization, scaling, and visualization. Parameterization is the selection of the parameters taken into account, which are based on the structural and physicochemical properties of nucleotides as elementary components of the genome. Scaling plays the role of "focusing" and allows one to explore genetic structures at various scales. Visualization includes the choice of the axes of the coordinate system and the method of visual display. The algorithms presented in this work are put forward as a new toolkit for the development of research software for the analysis of long nucleotide sequences, with the ability to display genomes in parametric spaces of various dimensions. One of the significant results of the study is that new criteria were obtained for classifying the genomes of various living organisms and identifying interspecific relationships. The new concept allows one to assess, visually and numerically, the variability of the physicochemical parameters of nucleotide sequences. It also substantiates the relationship between the parameters of DNA and RNA molecules and fractal geometric mosaics, and reveals the ordering and symmetry of polynucleotides as well as their noise immunity. The results obtained justified the introduction of new terms: "genometry" as a methodology of computational strategies and "genometrica" as the specific parameters of a particular genome or nucleotide sequence. In connection with the results obtained, questions of biosemiotics and the hierarchical levels of organization of living matter are raised.
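As one concrete example of mapping a nucleotide sequence into a parametric space, here is the classical Chaos Game Representation (CGR), a standard technique related to, though not identical with, the authors' genometric algorithms; genome-specific fractal mosaics emerge when the points are plotted.

```python
# Chaos Game Representation: each nucleotide moves the point halfway toward
# "its" corner of the unit square, turning a sequence into a 2D point cloud.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(seq):
    x, y = 0.5, 0.5
    points = []
    for base in seq:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2, (y + cy) / 2
        points.append((x, y))
    return points

print(cgr_points("ACGT"))
# -> [(0.25, 0.25), (0.125, 0.625), (0.5625, 0.8125), (0.78125, 0.40625)]
```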