- Grid based high performance computing in satellite imagery. Case study — Perona–Malik filter
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 399-406
This paper discusses an approach to efficient satellite image processing that involves two steps. The first step distributes the steadily increasing volume of satellite-collected data across a Grid infrastructure. The second step accelerates the solution of individual image-processing tasks by implementing execution codes that make heavy use of spatial and temporal parallelism. An example of such an execution code is image processing with the iterative Perona–Malik filter on an application-specific FPGA hardware architecture.
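The iterative Perona–Malik filter named in the case study is a standard anisotropic-diffusion scheme. The following Python/NumPy fragment is a minimal illustrative sketch of that scheme, not the FPGA implementation discussed in the paper; the step count, time step and conductance parameter are arbitrary example values, and boundaries are handled periodically via np.roll for brevity.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=30.0, dt=0.2):
    """Minimal Perona-Malik anisotropic diffusion (illustrative parameter values)."""
    u = img.astype(np.float64).copy()
    for _ in range(n_iter):
        # finite differences to the four nearest neighbours
        dn = np.roll(u, -1, axis=0) - u   # north
        ds = np.roll(u,  1, axis=0) - u   # south
        de = np.roll(u, -1, axis=1) - u   # east
        dw = np.roll(u,  1, axis=1) - u   # west
        # edge-stopping conductance g(|grad u|) = exp(-(|grad u|/kappa)^2)
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        # explicit update: smooths inside homogeneous regions, preserves edges
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```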
- Bayesian localization for autonomous vehicle using sensor fusion and traffic signs
Computer Research and Modeling, 2018, v. 10, no. 3, pp. 295-303
Vehicle localization is an important task in the field of intelligent transportation systems. It is well known that sensor fusion helps create more robust and accurate systems for autonomous vehicles. Standard approaches such as the extended Kalman filter or the particle filter are inefficient with highly non-linear data or carry a high computational cost, which complicates their use in embedded systems. A significant increase in precision, especially when GPS (Global Positioning System) is unavailable, may be achieved by using landmarks with known locations, such as traffic signs, traffic lights, or SLAM (Simultaneous Localization and Mapping) features. However, this approach may be inapplicable if the a priori locations are unknown or not accurate enough. We suggest a new approach for refining the coordinates of a vehicle by using landmarks such as traffic signs. The core of the suggested system is a Bayesian framework that refines the vehicle location using external data about previous traffic sign detections, collected through crowdsourcing. This paper presents an approach that combines trajectories built from global GPS coordinates and relative coordinates from an Inertial Measurement Unit (IMU) to produce a vehicle's trajectory in an unknown environment. In addition, we collected a new dataset that includes smartphone GPS and IMU measurements and video from a windshield camera, recorded during four car rides on the same route. We also collected precise location data from a Real Time Kinematic Global Navigation Satellite System (RTK-GNSS) device, which can be used for validation; the same RTK-GNSS system was used to collect precise locations of the traffic signs on the route. The results show that the Bayesian approach helps correct the trajectory and gives better estimates as the amount of prior information grows. The suggested method is efficient and requires, apart from the GPS/IMU measurements, only information about the vehicle locations during previous traffic sign detections.
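The key refinement step — fusing a dead-reckoned position with a detection of a traffic sign whose location is known from earlier crowdsourced observations — can be illustrated with a one-dimensional Gaussian Bayesian update. The sketch below uses a scalar state and hypothetical numbers; it only illustrates the fusion idea and is not the authors' framework.

```python
def gaussian_update(mu_prior, var_prior, z, var_meas):
    """Fuse a Gaussian prior with a Gaussian measurement (1-D Bayes rule)."""
    k = var_prior / (var_prior + var_meas)      # Kalman-like gain
    mu_post = mu_prior + k * (z - mu_prior)
    var_post = (1.0 - k) * var_prior
    return mu_post, var_post

# prior position along the route from GPS/IMU dead reckoning (metres)
mu, var = 120.0, 25.0
# a traffic sign is detected next to the car; its crowdsourced position
# along the route is 128 m with 4 m^2 uncertainty (hypothetical values)
mu, var = gaussian_update(mu, var, z=128.0, var_meas=4.0)
print(mu, var)  # the estimate moves towards the sign and its variance shrinks
```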
- Conversion of the initial indices of the technological process of the smelting of steel for the subsequent simulation
Computer Research and Modeling, 2017, v. 9, no. 2, pp. 187-199
Production efficiency directly depends on the quality of process control, which in turn relies on accurate and efficient processing of control and measurement information. Developing mathematical methods for studying system relationships and regularities of operation, building mathematical models that account for the structural features of the object under study, and writing software that implements these methods are therefore relevant tasks. Practice shows that the list of parameters involved in the study of a complex modern production facility ranges from a few dozen to several hundred items, and the degree of influence of each factor is initially unclear. Determining the model directly under these circumstances is impossible: the amount of required information may be too large, and most of the effort spent collecting it would be wasted, because the influence of most factors in the original list on the optimization would turn out to be negligible. Therefore, a necessary step in determining a model of a complex object is reducing the dimension of the factor space. Most industrial plants involve hierarchical groups of processes and high-volume production characterized by hundreds of factors. (Data from the Moldavian steel works were used to implement the mathematical methods and test the constructed models.) To investigate the systemic linkages and patterns of functioning of such complex objects, several informative parameters are usually chosen and sampled. This article describes the sequence of converting the initial indices of the steel-smelting process into a form suitable for building a predictive mathematical model; this work also lays a basis for developing an automated product quality management system. The following stages are distinguished: collection and analysis of the source data, construction of a table of correlated parameters, and reduction of the factor space by means of correlation pleiades and the method of weight factors. The results obtained make it possible to optimize the construction of a model of the multifactor process.
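The reduction of the factor space via a correlation table can be sketched as follows: compute pairwise correlations between the candidate factors and keep only one representative out of any group of strongly correlated ones (the idea behind correlation pleiades). The Python fragment below is a simplified greedy sketch with an arbitrary threshold and hypothetical factor names, not the procedure actually applied to the plant data.

```python
import numpy as np

def reduce_factors(X, names, threshold=0.8):
    """Greedy reduction: drop factors strongly correlated with an already kept one."""
    r = np.corrcoef(X, rowvar=False)   # correlation table of the factors (columns)
    keep = []
    for j in range(X.shape[1]):
        # keep factor j only if it is not strongly tied to an already kept factor
        if all(abs(r[j, k]) < threshold for k in keep):
            keep.append(j)
    return [names[j] for j in keep], X[:, keep]

# toy example: three factors, the second nearly duplicates the first
rng = np.random.default_rng(0)
a = rng.normal(size=200)
X = np.column_stack([a, a + 0.05 * rng.normal(size=200), rng.normal(size=200)])
kept, X_reduced = reduce_factors(X, ["temperature", "power", "scrap_mass"])
print(kept)  # the near-duplicate factor is dropped
```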
- Numerical simulation of ice accretion in FlowVision software
Computer Research and Modeling, 2020, v. 12, no. 1, pp. 83-96
Certifying a transport airplane for flights under icing conditions requires calculations that define the dimensions and shapes of the ice bodies formed on the airplane surfaces. To date, there is no software developed in Russia for simulating ice accretion that has been authorized by the Russian certifying supervisory authority. This paper describes the IceVision methodology, recently developed in Russia on the basis of the FlowVision software, for calculating ice accretion on airplane surfaces.
The main difference between the IceVision methodology and the other approaches known from the literature is the use of the Volume Of Fluid (VOF) technology for tracking the surface of the growing ice body. The methodology assumes solving a time-dependent problem of continuous ice growth in the Euler formulation. The ice is explicitly present in the computational domain, and the energy equation is integrated inside the ice body. In the other approaches, the changing ice shape is taken into account by modifying the aerodynamic surface and using a Lagrangian mesh, while heat transfer into the ice is accounted for with an empirical model.
The implemented mathematical model provides the capability to simulate the formation of rime (dry) and glaze (wet) ice and automatically identifies the corresponding zones. In a rime (dry) ice zone, the temperature of the contact surface between air and ice is calculated with account of ice sublimation and heat conduction inside the ice. In a glaze (wet) ice zone, the flow of the water film over the ice surface is taken into account; the film freezes due to evaporation and heat transfer into the air and the ice. The IceVision methodology also accounts for film separation. For simulation of the two-phase flow of air and droplets, a multi-velocity model within the Euler approach is used, and the droplet size distribution is taken into account. The computational algorithm accounts for the essentially different time scales of the physical processes proceeding in the course of ice accretion, viz., the air–droplet flow, the water flow, and the ice growth. Numerical solutions of validation test problems demonstrate the efficiency of the IceVision methodology and the reliability of FlowVision results.
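Most ice-accretion models rest on a Messinger-type mass and energy balance on the icing surface; a generic mass balance for a surface control cell, written here in a standard textbook form and not necessarily the exact IceVision formulation, reads

$$\dot m_{\mathrm{ice}} = \dot m_{\mathrm{imp}} + \dot m_{\mathrm{in}} - \dot m_{\mathrm{evap}} - \dot m_{\mathrm{out}},$$

where $\dot m_{\mathrm{imp}}$ is the impinging droplet mass flux, $\dot m_{\mathrm{in}}$ and $\dot m_{\mathrm{out}}$ are the water-film fluxes entering and leaving the cell, and $\dot m_{\mathrm{evap}}$ is the evaporation (or sublimation) flux. In a rime zone all incoming water freezes, so $\dot m_{\mathrm{out}} = 0$; in a glaze zone the frozen fraction is determined by the accompanying energy balance.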
- Methodology and program for the storage and statistical analysis of the results of computer experiment
Computer Research and Modeling, 2013, v. 5, no. 4, pp. 589-595
The problem of accumulating and statistically analyzing the results of a computer experiment is solved. The main experiment program is treated as the data source. The results of the main experiment are collected on a specially prepared Excel sheet with a pre-organized structure for accumulating, statistically processing, and visualizing the data. The developed method and program are used in studying the efficiency of the scientific research carried out by the authors.
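The accumulate-and-summarize workflow described here — collect the outputs of repeated runs into a structured table and then process them statistically — can be sketched in a few lines of Python with pandas instead of the authors' Excel sheet; the run data and column names below are hypothetical.

```python
import pandas as pd

# accumulated results of repeated computer-experiment runs
# (hypothetical columns: run id, varied parameter, measured responses)
results = pd.DataFrame(
    [
        {"run": 1, "grid_size": 100, "runtime_s": 12.4, "error": 0.031},
        {"run": 2, "grid_size": 100, "runtime_s": 12.9, "error": 0.029},
        {"run": 3, "grid_size": 200, "runtime_s": 51.7, "error": 0.012},
        {"run": 4, "grid_size": 200, "runtime_s": 50.2, "error": 0.013},
    ]
)

# basic statistical processing: mean and spread per parameter value
summary = results.groupby("grid_size")[["runtime_s", "error"]].agg(["mean", "std"])
print(summary)
```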