All issues
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
The modeling of dense materials with spherepolyhedra packing method
Computer Research and Modeling, 2012, v. 4, no. 4, pp. 757-766
The paper presents a new method for modeling dense materials based on a spherepolyhedra packing algorithm, describes the mathematical model of spherepolyhedra, and discusses the results of computational experiments on different spherepolyhedra packings. The experiments show convergence of the proposed method and include investigations of packings with different particle shapes, as well as polydisperse and oriented structures. The presented method can be applied to the virtual design of dense materials composed of non-spherical particles.
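A minimal flavor of the packing idea can be sketched as random sequential addition of non-overlapping spheres in a box. This is a toy illustration only; the paper packs composite spherepolyhedra, not single spheres, and its algorithm is not reproduced here.

```python
# Toy random sequential addition (RSA) packing of equal spheres in a unit cube.
# Illustrates the packing idea only; the paper's spherepolyhedra are composite
# particles built from spheres, packed by a different algorithm.
import random

def pack_spheres(r, attempts, rng):
    """Try to place spheres of radius r; keep those that overlap no others."""
    centers = []
    for _ in range(attempts):
        c = tuple(rng.uniform(r, 1 - r) for _ in range(3))
        if all(sum((a - b) ** 2 for a, b in zip(c, o)) >= (2 * r) ** 2
               for o in centers):
            centers.append(c)
    return centers

rng = random.Random(0)
packed = pack_spheres(0.1, 2000, rng)
print(len(packed), "spheres placed")
```

Dense-packing methods refine such naive placement, e.g. by letting particles move and re-orient to close the gaps.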
-
The use of GIS INTEGRO in oil and gas prospecting tasks
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 439-444
GIS INTEGRO is a geo-information software system that forms the basis for the integrated interpretation of geophysical data in studies of the deep structure of the Earth. GIS INTEGRO combines a variety of computational and analytical applications for solving geological and geophysical problems. It includes interfaces for changing the representation of data (raster, vector, regular and irregular observation networks), a map-projection conversion unit, and application blocks, including a block for integrated data analysis and the solution of prognostic and diagnostic tasks.
The methodological approach is based on the integration and joint analysis of geophysical data along regional profiles, geophysical potential fields, and additional geological information on the study area. The analytical support includes packages for transformations, filtering, statistical processing, calculations, lineament detection, solving direct and inverse problems, and integration of geographic information.
The technology and its software and analytical support were tested in solving tectonic zoning problems at scales 1:200000 and 1:1000000 in Yakutia, Kazakhstan and the Rostov region, in studying the deep structure along the regional profiles 1:S, 1-SC, 2-SAT, 3-SAT and 2-DV, and in oil and gas forecasting in regions of Eastern Siberia and Brazil.
The article describes two possible approaches to parallel computation for processing 2D and 3D grids in geophysical research. As an example, a GRID-environment implementation of the application software ZondGeoStat (statistical sensing) is presented, which creates a 3D grid model on the basis of 2D grid data. The experience has demonstrated the high efficiency of GRID computations in geophysical research.
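The data-parallel character of such grid processing can be sketched as follows. This is a hypothetical illustration, not the ZondGeoStat code: in a real GRID environment each block of rows would go to a separate node, and the smoothing filter here is only a stand-in transform.

```python
# Sketch: data-parallel processing of a 2D geophysical grid, row by row.
# Hypothetical illustration of the parallelization idea, not the ZondGeoStat code.
from concurrent.futures import ThreadPoolExecutor

def smooth_row(row):
    """3-point moving average along one grid row (stand-in transform)."""
    n = len(row)
    return [sum(row[max(0, j - 1):min(n, j + 2)]) / (min(n, j + 2) - max(0, j - 1))
            for j in range(n)]

def smooth_grid(grid, workers=4):
    """Rows are independent, so they can be processed in parallel."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(smooth_row, grid))

grid = [[float(i + j) for j in range(8)] for i in range(6)]
print(smooth_grid(grid)[0])
```

The same decomposition carries over to 3D grids by distributing horizontal slices instead of rows.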
-
A.S. Komarov’s publications about cellular automata modelling of the population-ontogenetic development in plants: a review
Computer Research and Modeling, 2016, v. 8, no. 2, pp. 285-295
The possibilities of cellular automata simulation applied to herbs and dwarf shrubs are described. The basic principles of the discrete description of plant ontogenesis, on which the mathematical modeling is based, are presented. The review discusses the main research results obtained with models that reveal the patterns of functioning of populations and communities. The CAMPUS model and the results of a computer experiment studying the growth of two lingonberry clones with different shoot geometry are described. The paper is dedicated to the works of the founder of this research direction, Prof. A. S. Komarov, and a list of his major publications on the subject is given.
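The cellular-automaton approach to clonal plant spread can be sketched roughly as follows. This is a generic illustration, not the CAMPUS model: each lattice cell is either empty or occupied, and occupied cells colonize vacant neighbors, whereas the real models track a richer set of ontogenetic states.

```python
# Sketch of a cellular automaton for clonal plant spread on a square lattice.
# Generic illustration only; CAMPUS-style models use ontogenetic states,
# not just an occupied/empty flag.
import random

EMPTY, PLANT = 0, 1

def step(grid, p_spread=0.3, rng=random):
    """One generation: each plant may colonize a vacant 4-neighbor cell."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j] == PLANT:
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] == EMPTY:
                        if rng.random() < p_spread:
                            new[ni][nj] = PLANT
    return new

grid = [[EMPTY] * 9 for _ in range(9)]
grid[4][4] = PLANT  # a single initial ramet in the center
for _ in range(5):
    grid = step(grid)
print(sum(map(sum, grid)), "occupied cells after 5 steps")
```

Shoot geometry enters such models through the neighborhood shape and spread probabilities, which is how the two lingonberry clones mentioned above can be contrasted.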
Keywords: computer models, individual-based approach.
-
Layered Bénard–Marangoni convection during heat transfer according to Newton's law of cooling
Computer Research and Modeling, 2016, v. 8, no. 6, pp. 927-940
The paper considers mathematical modeling of layered Benard–Marangoni convection of a viscous incompressible fluid. The fluid moves in an infinitely extended layer. The Oberbeck–Boussinesq system describing layered Benard–Marangoni convection is overdetermined, since the vertical velocity is identically zero: there are five equations for two components of the velocity vector, the temperature and the pressure (three equations of momentum conservation, the incompressibility equation and the heat equation). A class of exact solutions is proposed to make the Oberbeck–Boussinesq system solvable. The structure of the proposed solution is such that the incompressibility equation is satisfied identically, which eliminates the «extra» equation. The emphasis is on the study of heat exchange on the free layer boundary, which is considered rigid. In the description of the thermocapillary convective motion, heat exchange is prescribed according to Newton's law of cooling, which leads to an initial-boundary value problem of the third kind. It is shown that within the presented class of exact solutions to the Oberbeck–Boussinesq equations the overdetermined initial-boundary value problem reduces to the Sturm–Liouville problem, so the hydrodynamic fields are expressed using trigonometric functions (the Fourier basis). A transcendental equation is obtained for the eigenvalues of the problem and is solved numerically. A numerical analysis of the solutions of the system of evolutionary and gradient equations describing the fluid flow is performed, and the hydrodynamic fields are analyzed by a computational experiment. The study of the boundary value problem shows the existence of counterflows in the fluid layer.
The existence of counterflows is equivalent to the presence of stagnation points in the fluid, which testifies to the existence of a local extremum of the kinetic energy of the fluid. It is established that each velocity component cannot have more than one zero value, so the fluid flow separates into two zones in which the tangential stresses have different signs. Moreover, there is a fluid-layer thickness at which the tangential stresses on the lower boundary of the layer equal zero. This physical effect is possible only for Newtonian fluids. The temperature and pressure fields have the same properties as the velocities. All nonstationary solutions approach the steady state in this case.
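The abstract mentions a transcendental eigenvalue equation solved numerically. A generic sketch of such a root search by bisection is shown below; the paper's actual equation is not reproduced in the abstract, so the classical tan(x) = x serves as a stand-in.

```python
# Bisection for a transcendental eigenvalue-type equation, here f(x) = tan(x) - x.
# Stand-in example; the paper's actual equation is not given in the abstract.
import math

def bisect(f, a, b, tol=1e-12, max_iter=200):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "root not bracketed"
    for _ in range(max_iter):
        m = 0.5 * (a + b)
        fm = f(m)
        if fm == 0 or b - a < tol:
            return m
        if fa * fm < 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

# The first positive root of tan(x) = x lies in (pi, 3*pi/2).
root = bisect(lambda x: math.tan(x) - x, math.pi + 0.01, 1.5 * math.pi - 0.01)
print(round(root, 6))  # approximately 4.493409
```

Successive eigenvalues are found the same way by bracketing each branch of the transcendental function separately.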
-
Analysis of point model of fibrin polymerization
Computer Research and Modeling, 2017, v. 9, no. 2, pp. 247-258
Functional modeling of blood clotting and fibrin-polymer mesh formation is of significant value for medical and biophysical applications. Despite some discrepancies present in simplified functional models, their results are of great interest to experimental science as a handy analysis tool for research planning, data processing and verification. Given good correspondence to experiment, functional models can be used as an element of medical treatment methods and biophysical technologies. The aim of the paper is to model a point system of fibrin-polymer formation as a multistage polymerization process with a sol-gel transition at the final stage. The complex-valued Rosenbrock method of second order (CROS) is used for the computational experiments, whose results are presented and discussed. It is shown that in the physiological range of the model coefficients there is a lag period of approximately 20 seconds between initiation of the reaction and fibrin gel appearance, which fits well with experimental observations of fibrin polymerization dynamics. The possibility of a number of consecutive $(n = 1–3)$ sol-gel transitions is demonstrated as well. Such specific behavior is a consequence of the multistage nature of the fibrin polymerization process. At the final stage the solution of fibrin oligomers of length 10 can reach a semidilute state, leading to extremely fast gel formation controlled by the oligomers' rotational diffusion. Otherwise, if the semidilute state is not reached, gel formation is controlled by the significantly slower process of translational diffusion. This duality in the sol-gel transition led the authors to introduce a switch function in the equation for fibrin-polymer formation kinetics.
Consecutive polymerization events can correspond to experimental systems where the fibrin mesh formed is withdrawn from the volume by some physical process such as precipitation. A sensitivity analysis of the presented system shows that the dependence on the first-stage polymerization reaction constant is non-trivial.
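The CROS scheme mentioned above is, in its simplest one-stage form, a Rosenbrock method with a complex coefficient. A scalar sketch is given below; this is an illustration of the scheme itself, not the paper's polymerization system.

```python
# One-stage complex Rosenbrock scheme (CROS) sketch for a scalar ODE y' = f(y):
# solve (1 - a*h*f'(y)) * k = f(y) with a = (1 + i)/2, then y <- y + h*Re(k).
# Illustrative only; the paper applies the scheme to the full kinetic system.
import math

def cros_step(f, dfdy, y, h):
    a = (1 + 1j) / 2
    k = f(y) / (1 - a * h * dfdy(y))
    return y + h * k.real

def integrate(f, dfdy, y0, h, n):
    y = y0
    for _ in range(n):
        y = cros_step(f, dfdy, y, h)
    return y

# Test problem: y' = -y, y(0) = 1; exact solution exp(-t).
y1 = integrate(lambda y: -y, lambda y: -1.0, 1.0, 0.01, 100)
print(y1, "vs exact", math.exp(-1.0))
```

The complex coefficient gives second-order accuracy and L-stability with a single linear solve per step, which is why the scheme suits stiff kinetic systems such as polymerization cascades.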
-
Stochastic formalization of the gas dynamic hierarchy
Computer Research and Modeling, 2022, v. 14, no. 4, pp. 767-779
Mathematical models of gas dynamics and the computational industry built on them are, in our opinion, far from perfect. We look at this problem from the point of view of a transparent probabilistic micro-model of a gas of hard spheres, relying both on the theory of random processes and on classical kinetic theory in terms of densities of distribution functions in phase space: we first construct a system of nonlinear stochastic differential equations (SDEs), and then a generalized random and non-random integro-differential Boltzmann equation taking correlations and fluctuations into account. The key feature of the initial model is the random nature of the intensity of the jump measure and its dependence on the process itself.
We briefly recall the transition to increasingly coarse meso- and macro-approximations as the dimensionless parameter, the Knudsen number, decreases. We obtain stochastic and non-random equations, first in phase space (a meso-model in terms of SDEs with respect to the Wiener measure and the Kolmogorov–Fokker–Planck equations), and then in coordinate space (macro-equations that differ from the Navier–Stokes system and quasi-gas-dynamic systems). The main difference of this derivation is a more accurate averaging over velocity, due to the analytical solution of the stochastic differential equations with respect to the Wiener measure, in whose form the intermediate meso-model in phase space is presented. This approach differs significantly from the traditional one, which uses not the random process itself but its distribution function. The emphasis is placed on the transparency of the assumptions during the transition from one level of detail to another, rather than on numerical experiments, which contain additional approximation errors.
The theoretical power of the microscopic representation of macroscopic phenomena is also important as an ideological support for particle methods, which are an alternative to difference and finite-element methods.
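The numerical side of SDE-based modeling can be illustrated by the Euler–Maruyama scheme on the simplest relative of the velocity equations discussed above, a Langevin-type SDE. This is a generic sketch; the paper's model additionally carries a random jump measure for collisions, which is not reproduced here.

```python
# Euler-Maruyama sketch for a Langevin-type SDE dv = -gamma*v dt + sigma dW.
# Illustration only; the paper's SDEs also include a random jump measure
# modeling hard-sphere collisions.
import math
import random

def euler_maruyama(v0, gamma, sigma, h, n, rng):
    """Integrate one trajectory over n steps of size h."""
    v = v0
    for _ in range(n):
        v += -gamma * v * h + sigma * math.sqrt(h) * rng.gauss(0.0, 1.0)
    return v

rng = random.Random(1)
samples = [euler_maruyama(0.0, 1.0, 1.0, 0.01, 1000, rng) for _ in range(2000)]
var = sum(s * s for s in samples) / len(samples)
print(round(var, 3))  # the stationary variance is sigma^2 / (2 * gamma) = 0.5
```

Averaging an ensemble of such trajectories is the simulation-side counterpart of the analytical velocity averaging emphasized in the paper.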
-
Tasks and algorithms for optimal clustering of multidimensional objects by a variety of heterogeneous indicators and their applications in medicine
Computer Research and Modeling, 2024, v. 16, no. 3, pp. 673-693
The work is devoted to the author's formal statements of the clustering problem for a given number of clusters, algorithms for their solution, and the results of using this toolkit in medicine.
Because the formulated problems belong to the NP class, exact algorithms cannot solve even instances of relatively low dimension to proven optimality in acceptable time.
To address this, we propose a hybrid algorithm that combines the accuracy of exact methods based on clustering in pairwise distances at the initial stage with the speed of methods for solving simplified problems of partitioning by cluster centers at the final stage. Developing this direction further, a sequential hybrid clustering algorithm using random search in the swarm-intelligence paradigm has been constructed. The article describes it and presents the results of calculations on applied clustering problems.
To assess the effectiveness of the developed tools for optimal clustering of multidimensional objects by a set of heterogeneous indicators, a number of computational experiments were performed on data sets including socio-demographic, clinical-anamnestic, electroencephalographic and psychometric data on the cognitive status of patients of a cardiology clinic. These experiments confirm the effectiveness of local-search algorithms in the swarm-intelligence paradigm within the hybrid algorithm for solving optimal clustering problems.
The results of the calculations indicate that the main obstacle to using the discrete-optimization apparatus, the limit on the feasible dimensions of problem instances, has effectively been removed: this limitation is eliminated while the clustering results remain acceptably close to the optimal ones. The applied significance of the obtained results also stems from the fact that the optimal-clustering toolkit is supplemented by an assessment of the stability of the formed clusters. This makes it possible, given known factors (the presence of stenosis or older age), to additionally identify those patients whose cognitive resources are insufficient to overcome the influence of surgical anesthesia, resulting in a unidirectional postoperative deterioration of complex visual-motor reaction, attention and memory. This effect indicates the possibility of differentiating the classification of patients using the proposed tools.
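The two-stage idea, seeding from pairwise distances and then refining by cluster centers, can be sketched as follows. This is a generic illustration, not the authors' algorithm: a farthest-point heuristic stands in for the exact pairwise-distance stage, and plain k-means for the center-based stage.

```python
# Sketch of a two-stage hybrid clustering: stage 1 seeds centers from pairwise
# distances (farthest-point heuristic), stage 2 refines by center-based
# reassignment (k-means style). Generic illustration, not the authors' method.

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def farthest_point_seeds(points, k):
    """Pick k well-separated seeds using pairwise distances."""
    seeds = [points[0]]
    while len(seeds) < k:
        far = max(points, key=lambda p: min(dist2(p, s) for s in seeds))
        seeds.append(far)
    return seeds

def kmeans(points, centers, iters=20):
    """Refine centers by alternating assignment and recentering."""
    k = len(centers)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        centers = [
            tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

pts = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9), (0.2, 0.1)]
centers, groups = kmeans(pts, farthest_point_seeds(pts, 2))
print([len(g) for g in groups])
```

In the authors' setting the refinement stage is driven by swarm-intelligence random search rather than plain alternating minimization, which is what lets it escape the local optima k-means gets stuck in.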
-
CUDA and OpenCL implementations of Conway’s Game of Life cellular automata
Computer Research and Modeling, 2010, v. 2, no. 3, pp. 323-326
This article analyzes the experience of teaching the “CUDA and OpenCL programming” course at the MIPT-2010 summer school on high-performance computing. The content of the lectures and practical tasks, as well as the manner of presenting the material, are considered. Performance issues of the different algorithms implemented by students at the practical training sessions are discussed.
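The update rule the students implemented can be written down in a few lines of plain Python; the course versions were CUDA/OpenCL kernels, and this serial sketch only shows the per-cell logic that becomes the kernel body.

```python
# Reference (serial) implementation of one Conway's Game of Life step on a
# toroidal grid; on the GPU, the body of the inner loop becomes the per-cell
# kernel, with (i, j) taken from the thread index.

def life_step(grid):
    n, m = len(grid), len(grid[0])
    new = [[0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            alive = sum(
                grid[(i + di) % n][(j + dj) % m]
                for di in (-1, 0, 1) for dj in (-1, 0, 1)
                if (di, dj) != (0, 0)
            )
            new[i][j] = 1 if alive == 3 or (grid[i][j] and alive == 2) else 0
    return new

# A blinker oscillates with period 2.
blinker = [[0, 0, 0, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 1, 0, 0],
           [0, 0, 0, 0, 0]]
print(life_step(life_step(blinker)) == blinker)  # True
```

The performance differences discussed in the course typically come from how this neighborhood read is mapped to GPU memory (global vs. shared-memory tiling), not from the rule itself.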
-
Methodology and program for the storage and statistical analysis of the results of computer experiment
Computer Research and Modeling, 2013, v. 5, no. 4, pp. 589-595
The problem of accumulating and statistically analyzing computer-experiment results is solved. The main experiment program is treated as the data source. Its results are collected on a specially prepared Excel sheet with a pre-organized structure for accumulation, statistical processing and visualization of the data. The method and program created are used in studying the efficiency of the scientific research carried out by the authors.
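The accumulate-then-summarize workflow described above can be sketched in a few lines, using a CSV buffer in place of the Excel sheet. The one-row-per-run layout here is a hypothetical stand-in for the authors' sheet structure.

```python
# Sketch of the accumulate-then-summarize workflow: each experiment run appends
# one row, then summary statistics are computed over the accumulated column.
# A CSV buffer stands in for the pre-structured Excel sheet (hypothetical layout).
import csv
import io
import statistics

def append_run(buf, run_id, value):
    """Append one experiment result as a (run_id, value) row."""
    csv.writer(buf).writerow([run_id, value])

def summarize(buf):
    """Compute summary statistics over all accumulated values."""
    buf.seek(0)
    values = [float(v) for _, v in csv.reader(buf)]
    return {"n": len(values),
            "mean": statistics.mean(values),
            "stdev": statistics.stdev(values)}

sheet = io.StringIO()  # stands in for the results sheet
for i, v in enumerate([1.9, 2.1, 2.0, 2.2]):
    append_run(sheet, i, v)
print(summarize(sheet))
```

In the Excel setting the same separation holds: the experiment program only appends rows, while the processing and chart formulas live on the pre-organized sheet.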
-
A study of human capacity for parallel information processing in series of tasks of increasing complexity
Computer Research and Modeling, 2013, v. 5, no. 5, pp. 845-861
We describe a computer technology for presenting engineering-psychology tests that reveal subjects able to speed up the solution of a logic task by executing several standard logic operations simultaneously. The tests are based on a theory distinguishing two kinds of logic tasks: for the first kind parallel logic is effective, while for the second kind it is not. The experiment performed confirms the capability of parallel logic in an appreciable fraction of the subjects, although a substantial speedup of logic operations under simultaneous execution is very uncommon. The efficacy of the methodology is confirmed.
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index
International Interdisciplinary Conference "Mathematics. Computing. Education"