-
Comparative analysis of human adaptation to the growth of visual information in the tasks of recognizing formal symbols and meaningful images
Computer Research and Modeling, 2021, v. 13, no. 3, pp. 571-586
We describe an engineering-psychological experiment that continues the study of how a person adapts to logical problems of increasing complexity, where the complexity is determined by the volume of initial data. The tasks require calculations in an associative or non-associative system of operations. From the way the solution time changes with the number of required operations, we can conclude whether the subject solves the problem purely sequentially or engages additional brain resources to work in parallel. In a previously published experiment, the subject solved an associative problem by recognizing color images of meaningful objects. In the new study, a similar problem is solved for abstract monochrome geometric shapes. Analysis of the results showed that in the second case the probability of the subject switching to a parallel method of processing visual information is significantly reduced. The research method is based on presenting a person with two types of tasks. One type contains associative calculations and allows a parallel solution algorithm. The other type is a control, containing problems in which the calculations are not associative and parallel algorithms are ineffective. The task of recognizing and searching for a given object is associative, and a parallel strategy significantly speeds up the solution at relatively small additional cost. As a control series (to separate parallel work from mere acceleration of a sequential algorithm), we use, as in the previous experiment, a non-associative comparison problem in cyclic arithmetic, presented visually as the game “rock, paper, scissors”. In this problem a parallel algorithm requires a large number of processors with a small efficiency coefficient, so a person's transition to a parallel algorithm is almost impossible, and the input information can be processed faster only by increasing the speed of sequential processing. Comparing the dependence of solution time on the volume of source data for the two types of problems allows us to identify four strategies of adaptation to increasing problem complexity: uniform sequential, accelerated sequential, parallel computing (where possible), or undefined (for this method). The reduction in the number of subjects who switch to a parallel strategy when the input information is encoded with formal images demonstrates the effectiveness of codes that evoke associations in the subject: such codes increase the speed of human perception and processing of information. The article contains a preliminary mathematical model that explains this phenomenon, based on the appearance of a second set of initial data that arises when a person recognizes the depicted objects.
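As a rough illustration of how the four strategies could be separated from measured data, the sketch below fits a power law T(n) ~ a*n^b to solution times and compares the growth exponents for the associative and control tasks. The power-law form and the 0.8-1.2 "roughly linear" band are our illustrative assumptions, not the authors' protocol.

```python
import numpy as np

def growth_exponent(sizes, times):
    """Fit T(n) = a * n^b in log-log space and return the exponent b."""
    b, _ = np.polyfit(np.log(sizes), np.log(times), 1)
    return b

def classify(sizes, assoc_times, control_times):
    b_a = growth_exponent(sizes, assoc_times)
    b_c = growth_exponent(sizes, control_times)
    lin = lambda b: 0.8 <= b <= 1.2   # "roughly linear" band (illustrative)
    if lin(b_a) and lin(b_c):
        return "uniform sequential"
    if b_a < 0.8 and lin(b_c):
        return "parallel (associative task only)"
    if b_a < 0.8 and b_c < 0.8:
        return "accelerated sequential"   # both tasks sped up uniformly
    return "undefined"

# Toy usage with synthetic times: sublinear associative, linear control.
n = np.array([4, 8, 16, 32])
print(classify(n, 1.5 * n ** 0.5, 0.9 * n))   # -> parallel
```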
-
Image classification based on deep learning with automatic relevance determination and structured Bayesian pruning
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 927-938
Deep learning’s power stems from complex architectures; however, these can lead to overfitting, where models memorize training data and fail to generalize to unseen examples. This paper proposes a novel probabilistic approach to mitigate this issue. We introduce two key elements: a truncated log-uniform prior paired with a truncated log-normal variational approximation, and automatic relevance determination (ARD) applied to Bayesian deep neural networks (BDNNs). Within the probabilistic framework, we employ a specially designed truncated log-uniform prior for noise. This prior acts as a regularizer, guiding the learning process toward simpler solutions and reducing overfitting. The truncated log-normal variational approximation, in turn, allows efficient handling of the complex probability distributions inherent in deep learning models. ARD automatically identifies and removes irrelevant features or weights within a model. By integrating ARD with BDNNs, where weights have a probability distribution, we obtain a variational bound similar to the popular variational dropout technique. Dropout randomly drops neurons during training, encouraging the model not to rely heavily on any single feature; our ARD-based approach achieves similar benefits without the randomness of dropout, potentially leading to more stable training.
To evaluate our approach, we tested the model on two datasets: the Canadian Institute For Advanced Research (CIFAR-10) dataset for image classification and a dataset of macroscopic images of wood, compiled from several macroscopic wood-image datasets. Our method is applied to established architectures such as the Visual Geometry Group (VGG) network and the Residual Network (ResNet). The results demonstrate significant improvements: the model reduced overfitting while maintaining, or even improving, prediction accuracy on classification tasks. This validates the effectiveness of our approach in enhancing the performance and generalization capabilities of deep learning models.
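To make the mechanism concrete, here is a minimal PyTorch sketch of multiplicative log-normal noise with an ARD-style pruning rule. It is a simplification, not the paper's method: truncation of the prior and posterior is omitted, the KL term is written only up to an additive constant, and the pruning threshold is illustrative.

```python
import torch
import torch.nn as nn

class LogNormalNoise(nn.Module):
    """Per-feature multiplicative noise theta ~ LogNormal(mu, sigma^2)."""
    def __init__(self, n_features):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(n_features))
        self.log_sigma = nn.Parameter(torch.full((n_features,), -3.0))

    def forward(self, x):
        if self.training:
            eps = torch.randn_like(self.mu)
            theta = torch.exp(self.mu + torch.exp(self.log_sigma) * eps)
        else:
            theta = torch.exp(self.mu)   # posterior median at test time
        return x * theta

    def kl(self):
        # KL(LogNormal(mu, sigma) || log-uniform) = -log(sigma) + const;
        # the constant does not affect optimization and is dropped here.
        return -self.log_sigma.sum()

    def keep_mask(self, threshold=-5.0):
        # ARD-style rule (illustrative): drop features whose noise scale
        # is confidently pushed toward zero relative to its uncertainty.
        return self.mu / torch.exp(self.log_sigma) > threshold

# Usage: add the KL term to the task loss, as in variational dropout.
layer = LogNormalNoise(300)
h = layer(torch.randn(32, 300))
loss = h.pow(2).mean() + 1e-4 * layer.kl()   # toy loss + weighted KL
loss.backward()
```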
-
Computer-aided analysis of medical image recognition using the example of scintigraphy
Computer Research and Modeling, 2016, v. 8, no. 3, pp. 541-548
The practical application of nuclear medicine reveals a persistent information deficiency in the algorithms and programs that provide visualization and analysis of medical images. The aim of the study was to determine principles for optimizing the processing of planar osteoscintigraphy using computer-aided diagnosis (CAD) to analyze the textural descriptions of metastatic zones on planar skeletal scintigrams. A CAD system for the analysis of skeletal metastases based on planar scintigraphy data has been developed. The system includes skeleton image segmentation, calculation of textural, histogram and morphometric parameters, and the creation of a training set. To study the textural characteristics of metastatic images on planar skeletal scintigrams, a computer program for the automatic analysis of skeletal metastases from planar scintigraphy data was developed. Expert evaluation was used to distinguish ‘pathological’ (metastatic) from ‘physiological’ (non-metastatic) zones of radiopharmaceutical hyperfixation, for which Haralick’s textural features were determined: autocorrelation, contrast, the ‘fourth moment’ and heterogeneity. On planar skeletal scintigrams of patients with metastatic breast cancer, foci of radiopharmaceutical hyperfixation were identified, and histogram parameters were calculated: brightness, smoothness, the third moment of brightness, brightness uniformity and brightness entropy. It has been established that in most areas of the skeleton the histogram parameter values in zones of pathological radiopharmaceutical hyperfixation exceed the corresponding values in physiological zones. Most often, pathological hyperfixation on both anterior and posterior scintigrams shows greater image brightness and brightness smoothness than physiological hyperfixation. Individual histogram-analysis measures can be used to refine the diagnosis of metastases in the mathematical modeling and interpretation of bone scintigraphy.
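The histogram parameters named above follow the standard statistical texture definitions. A minimal numpy sketch (an illustration, not the authors' program) computing them for a grayscale region of interest:

```python
import numpy as np

def histogram_descriptors(roi, levels=256):
    """Standard histogram texture descriptors for a uint8 grayscale ROI."""
    p = np.bincount(roi.ravel(), minlength=levels).astype(float)
    p /= p.sum()
    z = np.arange(levels) / (levels - 1)          # intensities scaled to [0, 1]
    mean = (z * p).sum()
    var = ((z - mean) ** 2 * p).sum()
    return {
        "brightness": mean,
        "smoothness": 1.0 - 1.0 / (1.0 + var),    # R = 1 - 1/(1 + sigma^2)
        "third_moment": ((z - mean) ** 3 * p).sum(),
        "uniformity": (p ** 2).sum(),
        "entropy": -(p[p > 0] * np.log2(p[p > 0])).sum(),
    }

# Toy usage: a flat patch has low entropy, a noisy patch high entropy.
rng = np.random.default_rng(0)
flat = np.full((64, 64), 128, dtype=np.uint8)
noisy = rng.integers(0, 256, (64, 64), dtype=np.uint8)
print(histogram_descriptors(flat)["entropy"], histogram_descriptors(noisy)["entropy"])
```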
-
Signal and noise calculation in Rician data analysis by combining the maximum likelihood technique and the method of moments
Computer Research and Modeling, 2018, v. 10, no. 4, pp. 511-523
The paper develops a new mathematical method for the joint calculation of signal and noise under the Rice statistical distribution, based on combining the maximum likelihood method with the method of moments. The sought-for values of signal and noise are calculated by processing sampled measurements of the analyzed Rician signal’s amplitude. An explicit system of equations for the required signal and noise parameters has been obtained, and the results of its numerical solution confirm the efficiency of the proposed technique. It is shown that solving the two-parameter task by the proposed technique does not increase the required computational resources compared with solving the task in the one-parameter approximation. An analytical solution has been obtained for the particular case of a small signal-to-noise ratio. The paper investigates how the estimation accuracy and variance of the sought-for parameters depend on the number of measurements in the experimental sample. According to the results of numerical experiments, the variances of the signal and noise estimates calculated by the proposed technique decrease in inverse proportion to the sample size. The accuracy of estimating the Rician parameters by the proposed technique is compared with that of an earlier developed version of the method of moments. The problem considered in the paper is meaningful for Rician data processing, in particular in magnetic resonance imaging systems, in ultrasonic visualization devices, in the analysis of optical signals in range-measuring systems and of radar signals, as well as in many other scientific and applied tasks adequately described by the Rice statistical model.
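For orientation, here is a minimal sketch of a pure method-of-moments estimator for the Rice distribution, built from the exact even moments E[x^2] = nu^2 + 2*sigma^2 and E[x^4] = nu^4 + 8*sigma^2*nu^2 + 8*sigma^4. It illustrates only the moments half of the combined technique; the maximum likelihood part of the paper is not reproduced here.

```python
import numpy as np

def rice_moments_estimate(x):
    """Return (nu, sigma) estimates from samples of a Rician amplitude."""
    m2 = np.mean(x ** 2)
    m4 = np.mean(x ** 4)
    # Eliminating nu^2 = m2 - 2*sigma^2 from the two moment equations gives
    # 4*sigma^4 - 4*m2*sigma^2 + (m4 - m2^2) = 0, hence nu^2 = sqrt(2*m2^2 - m4).
    nu2 = np.sqrt(max(2.0 * m2 ** 2 - m4, 0.0))   # guard against sampling noise
    sigma2 = 0.5 * (m2 - nu2)
    return np.sqrt(nu2), np.sqrt(max(sigma2, 0.0))

# Toy check on synthetic Rician data with nu = 3, sigma = 1:
rng = np.random.default_rng(1)
x = np.abs(3 + rng.normal(0, 1, 50_000) + 1j * rng.normal(0, 1, 50_000))
print(rice_moments_estimate(x))   # expect roughly (3.0, 1.0)
```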
-
Detection of the influence of the upper working roll’s vibration on sheet thickness in cold rolling with the help of DEFORM-3D software
Computer Research and Modeling, 2017, v. 9, no. 1, pp. 111-116
Current trends in technical diagnostics involve FEM computer simulation, which allows one, to some extent, to replace real experiments, reduce research costs and minimize risks. Already at the research and development stage, computer simulation makes it possible to diagnose equipment and determine the permissible fluctuations of its operating parameters. A peculiarity of diagnosing rolling equipment is that its functioning is directly tied to manufacturing a product of the required quality, including accuracy, so the design of technical-diagnostics techniques and diagnostic modelling is very important. Computer simulation of the cold rolling of a strip was carried out, in which the upper working roll vibrated in the horizontal direction in accordance with published experimental data from the continuous 1700 rolling mill. The vibration of the working roll in the stand arises from the gap between the roll chock and the stand guide and leads to periodic fluctuations of the strip’s thickness. Computer simulation in the DEFORM software produced a strip with longitudinal and transverse thickness variation. The visualization of the strip’s geometry from the simulation data corresponded to the type of surface inhomogeneity of strips rolled in practice. Further analysis of the thickness variation was carried out to identify, on the basis of the simulation, the sources of the periodic components of strip thickness caused by equipment malfunctions. The advantage of computer simulation in searching for the sources of thickness variation is that different hypotheses about thickness formation can be tested without real experiments, reducing costs of various kinds; moreover, in simulation the initial strip thickness has no fluctuations, unlike in industrial or laboratory experiments. On the basis of spectral analysis of the random process, it was established that the frequency of the strip’s thickness variation after rolling in one stand coincides with the frequency of the working roll’s vibration. The simulation results correlate with the research results for the 1700 mill, demonstrating the possibility of applying computer simulation to find the causes of strip thickness variation on an industrial rolling mill.
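The spectral check described above amounts to comparing the dominant frequency of the thickness signal with the roll's vibration frequency. A minimal sketch with synthetic, purely illustrative numbers:

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) of the largest non-DC spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Thickness sampled along the strip at 100 Hz with a 7 Hz ripple (toy data):
t = np.arange(0, 10, 0.01)
thickness = 2.0 + 0.01 * np.sin(2 * np.pi * 7.0 * t) + 0.002 * np.random.randn(t.size)
print(dominant_frequency(thickness, 100.0))   # ~7 Hz: matches the assumed roll vibration
```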
-
Investigation of the averaged model of coked catalyst oxidative regeneration
Computer Research and Modeling, 2021, v. 13, no. 1, pp. 149-161
The article is devoted to the construction and investigation of an averaged mathematical model of the oxidative regeneration of an aluminum-cobalt-molybdenum hydrocracking catalyst. Oxidative regeneration is an effective means of restoring the activity of the catalyst when its granules become coated with coke deposits.
The mathematical model of this process is a nonlinear system of ordinary differential equations, which includes kinetic equations for the reagents’ concentrations and equations for the changes in the temperature of the catalyst granule and of the reaction mixture caused by exothermic reactions and by heat transfer between the gas and the catalyst layer. Due to the heterogeneity of the oxidative regeneration process, some of the equations differ from standard kinetic ones and are based on empirical data. The article discusses the scheme of chemical interaction in the regeneration process, on the basis of which the material balance equations are compiled. It reflects the direct interaction of coke and oxygen, taking into account the degree of coverage of the coke granule by carbon-hydrogen and carbon-oxygen complexes, the release of carbon monoxide and carbon dioxide during combustion, and the release of oxygen and hydrogen inside the catalyst granule. The change in the radius and, consequently, the surface area of the coke granules is taken into account. The adequacy of the developed averaged model is confirmed by an analysis of the dynamics of the concentrations of substances and of the temperature.
The article presents a numerical experiment with the mathematical model of the oxidative regeneration of the aluminum-cobalt-molybdenum hydrocracking catalyst. The experiment was carried out using the Kutta–Merson method, a member of the Runge–Kutta family that provides an embedded local error estimate for automatic step-size control when integrating systems of ordinary differential equations. The results of the computational experiment are visualized.
The paper presents the dynamics of the concentrations of the substances involved in the oxidative regeneration process. The adequacy of the constructed mathematical model is concluded from the correspondence of the obtained results to physicochemical laws. The heating of the catalyst granule and the release of carbon monoxide as the granule radius changes are analyzed and described for various degrees of initial coking.
In conclusion, the main results are summarized and examples of problems that can be solved using the developed mathematical model are given.
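For reference, one Kutta–Merson step with its classical embedded error estimate looks as follows; this sketch shows the integrator only, not the paper's regeneration model, and omits the step-size control loop.

```python
import numpy as np

def merson_step(f, t, y, h):
    """One Runge-Kutta-Merson step for y' = f(t, y); returns (y_new, error)."""
    k1 = f(t, y)
    k2 = f(t + h / 3, y + h * k1 / 3)
    k3 = f(t + h / 3, y + h * (k1 + k2) / 6)
    k4 = f(t + h / 2, y + h * (k1 + 3 * k3) / 8)
    k5 = f(t + h, y + h * (k1 - 3 * k3 + 4 * k4) / 2)
    y_new = y + h * (k1 + 4 * k4 + k5) / 6
    err = h * (2 * k1 - 9 * k3 + 8 * k4 - k5) / 30   # local error estimate
    return y_new, err

# Toy usage on y' = -y, whose exact solution is exp(-t):
y, h = np.array([1.0]), 0.1
y_new, err = merson_step(lambda t, y: -y, 0.0, y, h)
print(y_new, np.exp(-h), np.abs(err))
```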
-
Computer simulation of the temperature field of a blast furnace air tuyere
Computer Research and Modeling, 2017, v. 9, no. 1, pp. 117-125
Studying the operation of heating equipment is a topical issue, since it allows optimal regimes to be determined for reaching the highest efficiency. Computer simulation is very helpful here: it predicts how different heating modes affect the effectiveness of the heating process and the wear of the equipment, yields results whose accuracy has been confirmed by many studies, and requires less cost and time than real experiments. In the present research, the heating of a blast furnace air tuyere was simulated using FEM software. Background studies showed that the problem can be treated as a plane axisymmetric one, so the DEFORM-2D software was used. The geometry required for the simulation was designed in SolidWorks, saved in .dxf format, exported to the DEFORM-2D pre-processor and positioned, after which the initial and boundary conditions were specified. Several operating regimes were analyzed. To demonstrate the influence of each mode and for better visualization, the point-tracking option of the DEFORM-2D post-processor was applied. The influence on the tuyere’s temperature field of a thermal insulation box inserted into the blow channel (with and without an air gap) and of a thermal coating was investigated. The simulation data demonstrated a significant effect of the thermal insulation box on the tuyere’s temperature field. The developed model also made it possible to simulate tuyere burnout resulting from interaction with liquid iron. The research demonstrated the effectiveness of DEFORM-2D for simulating heat transfer and heating processes; DEFORM-2D will be used in further studies of more complex processes connected with the temperature field of the blast furnace air tuyere.
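As a back-of-the-envelope illustration (not the FEM model above) of why an air gap in the insulation box changes the tuyere's temperature field, the sketch below evaluates steady one-dimensional conduction through layers in series, q = dT / sum(L_i / k_i). The layer thicknesses, conductivities and temperature drop are illustrative assumptions, not values from the study.

```python
# (name, thickness in m, thermal conductivity in W/m/K) -- illustrative only
layers = [
    ("copper wall", 0.010, 380.0),
    ("insulation box", 0.005, 1.5),
]
air_gap = ("air gap", 0.002, 0.03)   # still air conducts very poorly

def heat_flux(layers, dT):
    """Heat flux (W/m^2) through plane layers in series for a dT (K) drop."""
    resistance = sum(L / k for _, L, k in layers)
    return dT / resistance

dT = 1000.0  # hot blast vs. cooled wall, K (illustrative)
print(heat_flux(layers, dT))              # without the air gap
print(heat_flux(layers + [air_gap], dT))  # with the air gap: much lower flux
```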
-
A biomathematical system for describing nucleic acids
Computer Research and Modeling, 2020, v. 12, no. 2, pp. 417-434The article is devoted to the application of various methods of mathematical analysis, search for patterns and studying the composition of nucleotides in DNA sequences at the genomic level. New methods of mathematical biology that made it possible to detect and visualize the hidden ordering of genetic nucleotide sequences located in the chromosomes of cells of living organisms described. The research was based on the work on algebraic biology of the doctor of physical and mathematical sciences S. V. Petukhov, who first introduced and justified new algebras and hypercomplex numerical systems describing genetic phenomena. This paper describes a new phase in the development of matrix methods in genetics for studying the properties of nucleotide sequences (and their physicochemical parameters), built on the principles of finite geometry. The aim of the study is to demonstrate the capabilities of new algorithms and discuss the discovered properties of genetic DNA and RNA molecules. The study includes three stages: parameterization, scaling, and visualization. Parametrization is the determination of the parameters taken into account, which are based on the structural and physicochemical properties of nucleotides as elementary components of the genome. Scaling plays the role of “focusing” and allows you to explore genetic structures at various scales. Visualization includes the selection of the axes of the coordinate system and the method of visual display. The algorithms presented in this work are put forward as a new toolkit for the development of research software for the analysis of long nucleotide sequences with the ability to display genomes in parametric spaces of various dimensions. One of the significant results of the study is that new criteria were obtained for the classification of the genomes of various living organisms to identify interspecific relationships. The new concept allows visually and numerically assessing the variability of the physicochemical parameters of nucleotide sequences. This concept also allows one to substantiate the relationship between the parameters of DNA and RNA molecules with fractal geometric mosaics, reveals the ordering and symmetry of polynucleotides, as well as their noise immunity. The results obtained justified the introduction of new terms: “genometry” as a methodology of computational strategies and “genometrica” as specific parameters of a particular genome or nucleotide sequence. In connection with the results obtained, biosemiotics and hierarchical levels of organization of living matter are raised.
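To illustrate the parameterization/visualization idea in its simplest generic form, the sketch below maps each nucleotide to two binary physicochemical traits (purine/pyrimidine and strong/weak hydrogen bonding) and accumulates a 2D "DNA walk" whose shape can then be inspected at different scales. This is a well-known illustrative construction, not the specific genometry algorithms of the article.

```python
PURINE = {"A": 1, "G": 1, "C": -1, "T": -1}   # purine vs. pyrimidine: step along x
STRONG = {"G": 1, "C": 1, "A": -1, "T": -1}   # 3 vs. 2 hydrogen bonds: step along y

def dna_walk(seq):
    """Return the cumulative 2D path of physicochemical steps along seq."""
    x = y = 0
    path = [(0, 0)]
    for base in seq.upper():
        if base in PURINE:                    # skip ambiguous symbols such as N
            x += PURINE[base]
            y += STRONG[base]
            path.append((x, y))
    return path

print(dna_walk("ATGCGGAT")[-1])   # net displacement of a toy sequence
```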
-
A framework for medical image segmentation based on measuring the diversity of pixel intensities using an interval approach
Computer Research and Modeling, 2021, v. 13, no. 5, pp. 1059-1066
Segmentation is one of the most challenging tasks in medical image analysis. It separates the pixels of organs or lesions from the background of medical images such as MRI or CT scans, providing critical information about the volumes and shapes of human organs. In scientific imaging, medical imaging is considered one of the most important topics due to the rapid and continuing progress in computerized medical image visualization and advances in analysis approaches and computer-aided diagnosis. Digital image processing is becoming ever more important in healthcare due to the growing use of direct digital imaging systems for medical diagnostics, and various transformations are generally needed to extract image data. A digital image can be considered an approximation of a real situation and carries some uncertainty arising from the constraints of the vision process; information about the level of uncertainty will influence an expert’s judgment. To address this challenge, we propose a novel framework built on the interval concept, a good tool for dealing with uncertainty. In the proposed approach, a medical image is transformed into an interval-valued representation, and entropies are defined for the image object and background. We then determine a threshold for the lower-bound image and a threshold for the upper-bound image, and take the mean of the two for the final output. To demonstrate the effectiveness of the proposed framework, we evaluate it using a synthetic image and its ground truth. Experimental results show how the performance of entropy-based threshold segmentation can be enhanced by the proposed approach to overcome ambiguity.
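A minimal sketch of the interval scheme as described: lower- and upper-bound images are built here with local min/max filters (our assumption; the paper's construction may differ), an entropy-based (Kapur) threshold is computed for each bound, and the two thresholds are averaged.

```python
import numpy as np
from scipy.ndimage import minimum_filter, maximum_filter

def kapur_threshold(img, levels=256):
    """Kapur's maximum-entropy threshold for a uint8 image."""
    p = np.bincount(img.ravel(), minlength=levels).astype(float)
    p /= p.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, levels - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1          # class-conditional histograms
        h = -(p0[p0 > 0] * np.log(p0[p0 > 0])).sum() \
            - (p1[p1 > 0] * np.log(p1[p1 > 0])).sum()
        if h > best_h:
            best_t, best_h = t, h
    return best_t

def interval_threshold(img, window=3):
    lower = minimum_filter(img, size=window)     # lower-bound image
    upper = maximum_filter(img, size=window)     # upper-bound image
    return 0.5 * (kapur_threshold(lower) + kapur_threshold(upper))

# Toy usage on a synthetic uint8 image:
rng = np.random.default_rng(2)
img = np.clip(rng.normal(100, 30, (64, 64)), 0, 255).astype(np.uint8)
print(interval_threshold(img))
```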
-
Numerical simulation of fluid flow in a blood pump in the FlowVision software package
Computer Research and Modeling, 2023, v. 15, no. 4, pp. 1025-1038
A numerical simulation of fluid flow in a blood pump was performed using the FlowVision software package. This test problem, provided by the Center for Devices and Radiological Health of the U.S. Food and Drug Administration, involves fluid flow in several design modes, each with a prescribed liquid flow rate and rotor speed. The data necessary for the calculations, in the form of exact geometry, flow conditions and fluid characteristics, were provided to all research participants, who used different software packages for modeling. Numerical simulations were performed in FlowVision for six calculation modes with a Newtonian fluid and the standard $k-\varepsilon$ turbulence model; in addition, the fifth mode was also computed with the $k-\omega$ SST turbulence model and with the Carreau rheological fluid model. In the first stage of the numerical simulation, mesh convergence was investigated, on the basis of which a final mesh with about 6 million cells was chosen. Owing to the large number of cells, part of the calculations was performed on the Lomonosov-2 cluster to speed up the study. As a result of the numerical simulation, we obtained and analyzed the pressure difference between the pump inlet and outlet and the velocities between the rotor blades and in the diffuser area, and visualized the velocity distribution in selected cross-sections. For all design modes, the numerically obtained pressure difference was compared with the experimental data, and for the fifth calculation mode the velocity distributions between the rotor blades and in the diffuser area were also compared with experiment. Data analysis has shown good agreement between the FlowVision results and both the experimental results and numerical simulations in other software packages. The results obtained in FlowVision for the US FDA test suggest that the FlowVision software package can be used to solve a wide range of hemodynamic problems.
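The mesh-convergence study mentioned above implies a standard check that can be sketched independently of any CFD package: estimate the observed order of accuracy and a Richardson-extrapolated value from results on three systematically refined meshes. The refinement ratio and the pressure values below are illustrative, not taken from the study.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order p from solutions on three meshes refined by ratio r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson(f_medium, f_fine, r, p):
    """Extrapolated grid-independent estimate from the two finest solutions."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1)

# Toy pressure-difference values (Pa) on coarse/medium/fine meshes, r = 2:
p = observed_order(172.0, 170.0, 169.2, 2.0)
print(p, richardson(170.0, 169.2, 2.0, p))
```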