Search results for 'modeling methods':
Articles found: 420
  1. The paper presents the results of applying a scheme of very high accuracy and resolution to obtain numerical solutions of the compressible Navier–Stokes equations describing the onset and development of instability of a two-dimensional laminar boundary layer on a flat plate. A distinctive feature of the study is that the direct numerical simulation uses none of the artificial instability exciters commonly employed. The multioperator scheme used, presumably owing to its small approximation errors, made it possible to observe the subtle effects of the birth of unstable modes and the complex nature of their development. A brief description of the scheme design and its main properties is given. The formulation of the problem and the method of obtaining initial data are described, which makes it possible to reach the established non-stationary regime fairly quickly. A technique is given that allows detecting flow fluctuations with amplitudes many orders of magnitude smaller than the mean flow values. A time-dependent picture of the appearance of packets of Tollmien–Schlichting waves of varying intensity in the vicinity of the leading edge of the plate and of their downstream propagation is presented. The presented amplitude spectra, with peak values broadening in the downstream regions, indicate the excitation of new unstable modes other than those arising in the vicinity of the leading edge. The analysis of the evolution of the instability waves in time and space showed agreement with the main conclusions of the linear theory. The numerical solutions obtained seem to describe for the first time the complete scenario of the possible development of Tollmien–Schlichting instability, which often plays an essential role at the initial stage of the laminar-turbulent transition. They open up the possibility of full-scale numerical modeling of this process, which is extremely important in practice, via a similar study of the spatial boundary layer.
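
    For reference, the linear stability theory that the abstract compares its results against represents a Tollmien–Schlichting disturbance as a normal mode (a standard textbook form, not a formula quoted from the paper):

    \[
      u'(x, y, t) = \operatorname{Re}\!\left[ \hat{u}(y)\, e^{\, i(\alpha x - \omega t)} \right],
      \qquad \alpha = \alpha_r + i\alpha_i ,
    \]

    so that, in the spatial formulation, a mode with \(\alpha_i < 0\) grows downstream as \(|u'| \propto e^{-\alpha_i x}\); this is the kind of amplification against which the amplitude spectra and the downstream evolution of the wave packets are checked.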

  2. Timiryanova V.M., Lakman I.A., Larkin M.M.
    Retail forecasting on high-frequency depersonalized data
    Computer Research and Modeling, 2023, v. 15, no. 6, pp. 1713-1734

    Technological development leads to the emergence of data that are highly detailed in time and space, which expands the possibilities of analysis, allowing consumer decisions and the competitive behavior of enterprises to be considered in all their diversity, taking into account the context of the territory and the characteristics of the time period. Despite their promise, such studies are still scarce in the scientific literature, owing to a range of problems whose solution is considered in this paper. The article draws attention to the complexity of analyzing depersonalized high-frequency data and to the possibility of modeling consumption changes in time and space on their basis. The features of this new type of data are examined using real depersonalized data received from the fiscal data operator “First OFD” (JSC “Energy Systems and Communications”). It is shown that, along with the spectrum of problems inherent in high-frequency data, there are shortcomings associated with how the data are generated on the sellers' side, which requires wider use of data mining tools. A series of statistical tests was carried out on the data under consideration, including a unit-root test, a test for unobserved individual effects, and tests for serial correlation and for cross-sectional dependence in panels, among others. The presence of spatial autocorrelation was tested using modified Lagrange multiplier tests. The tests showed the presence of serial correlation and spatial dependence in the data, which makes it expedient to apply panel and spatial analysis methods to the high-frequency data accumulated by fiscal operators. The constructed models made it possible to substantiate the spatial relationship of sales growth and its dependence on the day of the week. The limitation on increasing the predictive ability of the constructed models, and on their further elaboration through the inclusion of explanatory factors, was the lack of open-access statistics grouped with the required detail in time and space, which makes the formation of high-frequency, geographically structured databases relevant.
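
    As an illustration of the kind of checks listed above, the sketch below runs an augmented Dickey–Fuller unit-root test on a synthetic daily sales series with statsmodels; the series, its length, and its parameters are stand-ins for the fiscal-operator data, which are not reproduced here.

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

# Hypothetical daily sales series standing in for the depersonalized
# fiscal-operator data analyzed in the paper.
rng = np.random.default_rng(0)
sales = pd.Series(100 + np.cumsum(rng.normal(size=365)),
                  index=pd.date_range("2021-01-01", periods=365, freq="D"))

# Augmented Dickey-Fuller unit-root test (one of the tests named above).
adf_stat, p_value, used_lag, n_obs, crit_values, _ = adfuller(sales.values)
print(f"ADF statistic = {adf_stat:.3f}, p-value = {p_value:.3f}")
print("critical values:", crit_values)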

  3. Pogorelova E.A., Lobanov A.I.
    High Performance Computing for Blood Modeling
    Computer Research and Modeling, 2012, v. 4, no. 4, pp. 917-941

    Methods for modeling blood flow and its rheological properties are reviewed. Blood is considered as a particle suspension. The methods reviewed are the boundary integral equation method (BIEM), the lattice Boltzmann method (LBM), finite elements on a dynamic mesh, dissipative particle dynamics (DPD), and agent-based modeling. An analysis of these methods' applications on high-performance systems with various architectures is presented.

    Views (last year): 2. Citations: 3 (RSCI).
  4. Aronov I.Z., Maksimova O.V.
    Theoretical modeling consensus building in the work of standardization technical committees in coalitions based on regular Markov chains
    Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1247-1256

    Decisions in social groups are often made by consensus. This applies, for example, to the expert examination in a technical committee for standardization (TC) before a national standard is approved by Rosstandart. The standard is approved if and only if consensus is secured in the TC. The same approach to standards development has been adopted in almost all countries and at the regional and international levels. The authors' previously published works were dedicated to constructing a mathematical model of the time needed to reach consensus in technical committees for standardization as a function of the number of TC members and their level of authoritarianism. The present study continues these works for the case of coalitions, which often form during consideration of a draft standard in a TC. In the article, a mathematical model of reaching consensus in the work of technical standardization committees is constructed for the case of coalitions. Within the framework of the model it is shown that, in the presence of coalitions, consensus is not achievable. In practice, however, coalitions are as a rule overcome during the negotiation process; otherwise the number of adopted standards would be extremely small. The paper analyzes the factors that influence overcoming coalitions: the size of the concession and an index of the coalition's influence. On the basis of statistical modeling of regular Markov chains, their effect on the time needed to reach consensus in the technical committee is investigated. It is proved that the time to reach consensus depends significantly on the size of a coalition's unilateral concession and only weakly on the sizes of the coalitions. A regression model of the dependence of the average number of approval rounds on the size of the concession is built. It is found that even a small concession leads to the onset of consensus, and that increasing the size of the concession leads (other factors being equal) to a sharp decline in the time to consensus. It is also shown that a concession made by the larger coalition to the smaller coalitions takes, on average, more time before consensus is reached. The result has practical value for all organizational structures in which the emergence of coalitions makes consensus decision-making impossible and requires consideration of various methods for reaching a consensus decision.
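
    The authors' Markov-chain model itself is not given in the abstract; the sketch below is only an illustrative DeGroot-style simulation in which opinions are repeatedly averaged by a regular (row-stochastic) matrix, with a hypothetical "concession" weight that each member gives to the opposing coalition, to show qualitatively why even a small concession yields consensus and a larger one shortens the time to reach it.

import numpy as np

def steps_to_consensus(W, x0, tol=1e-3, max_steps=100_000):
    """Iterate x_{k+1} = W x_k (averaging driven by a regular,
    row-stochastic matrix) until the spread of opinions drops below tol."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_steps):
        if x.max() - x.min() < tol:
            return k
        x = W @ x
    return max_steps

def weight_matrix(concession):
    """Hypothetical 4-member committee split into two coalitions
    (members 0-1 vs 2-3); 'concession' is the total weight a member
    places on the opposing coalition."""
    c = concession
    return np.array([[0.5 - c/2, 0.5 - c/2, c/2, c/2],
                     [0.5 - c/2, 0.5 - c/2, c/2, c/2],
                     [c/2, c/2, 0.5 - c/2, 0.5 - c/2],
                     [c/2, c/2, 0.5 - c/2, 0.5 - c/2]])

x0 = [0.0, 0.0, 1.0, 1.0]                 # polarized initial positions
for c in (0.01, 0.05, 0.2):
    print(f"concession {c}: {steps_to_consensus(weight_matrix(c), x0)} steps")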

  5. Guzev M.A., Nikitina E.Yu.
    Rank analysis of the criminal codes of the Russian Federation, the Federal Republic of Germany and the People’s Republic of China
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 969-981

    When making decisions in various fields of human activity, it is often necessary to create text documents. Traditionally, the study of texts is the province of linguistics, which in a broad sense can be understood as a part of semiotics, the science of signs and sign systems, whose objects are of various types. The method of rank distributions is widely used for the quantitative study of sign systems. A rank distribution is a set of item names sorted in descending order by frequency of occurrence. For frequency-rank distributions, researchers often use the term «power-law distributions».

    In this paper, the rank distribution method is used to analyze the criminal codes of various countries. The general idea of the approach is to consider a code as a text document in which the sign is the measure of punishment for a given crime. The document is represented as a list of occurrences of a specific word (sign) and its derivatives (word forms). The combination of all these signs forms a punishment dictionary, for which the frequency of occurrence of each punishment in the code text is calculated. This allows us to transform the constructed dictionary into a frequency dictionary of punishments and to study it further using the approach of V. P. Maslov, proposed for analyzing problems in linguistics. This approach introduces the concept of the virtual frequency of crime occurrence, which serves as a measure of the real harm to society and of the consequences of the committed crime in various spheres of human life. Along these lines, the paper proposes a parametrization of the rank distribution to analyze the punishment dictionary of the Special Part of the Criminal Code of the Russian Federation concerning punishments for economic crimes. Various versions of the code are considered, and the constructed model is shown to objectively reflect the changes for the better made by legislators over time. For the criminal codes in force in the Federal Republic of Germany and the People's Republic of China, the texts covering similar offenses, analogous to the Russian section of the Special Part, were studied. The rank distributions obtained in the article for the corresponding frequency dictionaries of the codes agree with V. P. Maslov's law, which essentially refines Zipf's law. This allows us to conclude both that the texts are well organized and that the punishments selected for the crimes are adequate.
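
    A minimal sketch of the rank-distribution machinery described above (the token list is invented for illustration, and only a classical Zipf-type power-law fit is shown, not Maslov's refinement used in the paper):

import numpy as np
from collections import Counter

# Hypothetical punishment tokens extracted from a code text; in the paper
# these are sanctions such as fines or terms of imprisonment.
tokens = (["fine"] * 120 + ["imprisonment"] * 60 + ["restraint_of_liberty"] * 30
          + ["correctional_labour"] * 15 + ["arrest"] * 8)

freq = Counter(tokens)                        # frequency dictionary of punishments
ranked = sorted(freq.values(), reverse=True)  # frequencies in rank order
ranks = np.arange(1, len(ranked) + 1)

# Fit f(r) ~ C * r^(-s) in log-log coordinates (classical Zipf-type fit).
slope, intercept = np.polyfit(np.log(ranks), np.log(ranked), 1)
print(f"estimated exponent s = {-slope:.2f}")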

  6. Shatrov A.V., Okhapkin V.P.
    Optimal control of bank investment as a factor of economic stability
    Computer Research and Modeling, 2012, v. 4, no. 4, pp. 959-967

    This paper presents a model of replenishing bank liquidity from the banks' additional income. A methodological basis is given for the necessity of bank stabilization funds to cover losses during an economic crisis. An econometric derivation of the equations describing the bank's financial and operating activity is performed. In accordance with the purpose of creating a stabilization fund, an optimality criterion for the controls used is introduced. Based on the equations of the bank's behavior, a vector of optimal controls is derived by the method of dynamic programming.
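
    The abstract only names dynamic programming; as a generic illustration of how a vector of optimal controls is obtained by that method (not the paper's bank model), the sketch below runs the backward Bellman/Riccati recursion for a discrete-time linear-quadratic problem with hypothetical matrices.

import numpy as np

def lqr_gains(A, B, Q, R, Qf, T):
    """Backward dynamic programming (Riccati recursion) for the problem
    x_{k+1} = A x_k + B u_k with stage cost x'Qx + u'Ru and terminal
    cost x_T' Qf x_T; returns feedback gains K_0 ... K_{T-1}."""
    P = Qf
    gains = []
    for _ in range(T):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # Bellman step
        P = Q + A.T @ P @ (A - B @ K)
        gains.append(K)
    return gains[::-1]          # optimal control: u_k = -K_k x_k

# Toy two-state system with a single control (all matrices hypothetical).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.0], [0.1]])
Q, R, Qf = np.eye(2), np.array([[1.0]]), 10.0 * np.eye(2)
print("first-period gain:", lqr_gains(A, B, Q, R, Qf, T=50)[0])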

    Views (last year): 5.
  7. Marosi A.C., Lovas R.
    Defining volunteer computing: a formal approach
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 565-571

    Volunteer computing resembles private desktop grids, whereas desktop grids are not fully equivalent to volunteer computing. There have been several attempts to distinguish and categorize them using informal and formal methods. However, most formal approaches model a particular middleware and do not focus on the general notion of volunteer or desktop grid computing. This work makes an attempt to formalize their characteristics and relationship. To this end, formal modeling is applied that tries to grasp the semantics of their functionalities, as opposed to comparisons based on properties, features, etc. We apply this modeling method to formalize the Berkeley Open Infrastructure for Network Computing (BOINC) [Anderson D. P., 2004] volunteer computing system.

  8. Podryga V.O., Polyakov S.V.
    3D molecular dynamic simulation of thermodynamic equilibrium problem for heated nickel
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 573-579

    This work is devoted to molecular dynamics modeling of thermal impact processes on a metal sample consisting of nickel atoms. For the solution of this problem, a continuous mathematical model based on the equations of classical Newtonian mechanics has been used; a numerical method based on the Verlet scheme has been chosen; a parallel algorithm has been proposed, and its implementation within the MPI and OpenMP technologies has been carried out. By means of the developed parallel program, the thermodynamic equilibrium of the system of nickel atoms under heating of the sample to a desired temperature has been investigated. In the numerical experiments, both the optimal parameters of the computational procedure and the physical parameters of the analyzed process have been determined. The obtained numerical results correspond well to known theoretical and experimental data.
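
    The abstract names the Verlet scheme as the time integrator; a minimal velocity Verlet sketch is given below on a toy harmonic potential (the interatomic potential for nickel, the heating protocol, and the MPI/OpenMP parallelization used in the paper are not reproduced).

import numpy as np

def velocity_verlet(pos, vel, force, mass, dt, n_steps):
    """Integrate Newton's equations with the velocity Verlet scheme;
    force(pos) must return the force on every particle."""
    f = force(pos)
    for _ in range(n_steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt**2
        f_new = force(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# Toy example: one particle in a harmonic well (not the nickel potential).
k = 1.0
harmonic = lambda x: -k * x
x, v = velocity_verlet(np.array([1.0]), np.array([0.0]), harmonic, 1.0, 1e-3, 10_000)
print("position:", x, "velocity:", v)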

    Views (last year): 2.
  9. Lotarev D.T.
    Allocation of Steiner points in Euclidean Steiner tree problem by means of MatLab package
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 707-713

    The problem of allocating Steiner points in a Euclidean Steiner tree is considered. The cost of the network is the sum of construction costs and the cost of information transport. The Euclidean Steiner tree problem, in the form of topological network design, is a good model of this problem.

    The MatLab package provides a way to solve the second part of this problem: allocating the Steiner points under the condition that the adjacency matrix is given. A method for obtaining the solution has been worked out. The Steiner tree is formed by solving a sequence of "three points" Steiner problems.
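
    As an illustration of the "three points" subproblem mentioned above (a generic sketch in Python, not the MatLab routine from the paper), the single Steiner point of three terminals can be located by Weiszfeld iteration for the point minimizing the total distance to the terminals:

import numpy as np

def three_point_steiner(a, b, c, tol=1e-9, max_iter=1000):
    """Weiszfeld iteration for the point minimizing the sum of distances
    to three terminals; if one terminal is itself optimal (an angle of
    120 degrees or more at that terminal), that terminal is returned."""
    pts = np.array([a, b, c], dtype=float)
    x = pts.mean(axis=0)                      # start at the centroid
    for _ in range(max_iter):
        d = np.linalg.norm(pts - x, axis=1)
        if np.any(d < tol):                   # optimum sits on a terminal
            return pts[np.argmin(d)]
        w = 1.0 / d
        x_new = (pts * w[:, None]).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(three_point_steiner((0.0, 0.0), (1.0, 0.0), (0.5, 0.9)))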

    Views (last year): 4.
  10. Bogdanov A.V., Mareev V.V., Stepanov E.A., Panchenko M.V.
    Modeling of behavior of the option. The formulation of the problem
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 759-766

    Object of research: the creation of an algorithm for mass computation of option prices for the formation of a riskless portfolio. The method is based on a generalization of the Black–Scholes method. The task is to model the behavior of all options and of the instruments used to insure them. This task is characterized by a large volume of complex real-time computations that should be executed concurrently. The problem of the research: depending on the conditions, different approaches to the solution are required. There are three methods that can be used under different conditions: the finite difference method, the path-integral approach, and methods that work under conditions of a trading halt. Distributed computing is organized differently in these three cases, and different approaches have to be involved. In addition to the computational complexity, the mathematical formulation of the problem in the literature is not quite correct: there is no complete description of the boundary and initial conditions, and several hypotheses of the model do not correspond to the real market. It is necessary to give a mathematically correct formulation of the task and to neutralize the differences between the hypotheses of the model and their prototypes in the market. For this purpose, the standard formulation must be extended by additional methods, and implementation methods must be developed for each of the solution branches.
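
    Of the three approaches listed, the finite difference method is the easiest to sketch; below is a minimal explicit finite-difference solver for the classical (non-generalized) Black–Scholes equation for a European call, with all parameters chosen purely for illustration.

import numpy as np

# Explicit finite differences for the classical Black-Scholes PDE
# (European call); the generalized model, realistic boundary data and
# the real-time distributed setting of the paper are not reproduced here.
K, r, sigma, T = 100.0, 0.05, 0.2, 1.0
S_max, M, N = 200.0, 200, 2000            # price grid size and time steps
dS, dt = S_max / M, T / N                 # dt chosen within the explicit stability bound

S = np.linspace(0.0, S_max, M + 1)
V = np.maximum(S - K, 0.0)                # payoff at maturity (tau = 0)

i = np.arange(1, M)
for n in range(1, N + 1):
    tau = n * dt
    V_new = V.copy()
    V_new[i] = V[i] + dt * (
        0.5 * sigma**2 * S[i]**2 * (V[i + 1] - 2.0 * V[i] + V[i - 1]) / dS**2
        + r * S[i] * (V[i + 1] - V[i - 1]) / (2.0 * dS)
        - r * V[i])
    V_new[0] = 0.0                                   # call is worthless at S = 0
    V_new[M] = S_max - K * np.exp(-r * tau)          # deep in-the-money boundary
    V = V_new

print("price at S = 100:", np.interp(100.0, S, V))   # approx. 10.45 analytically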

    Views (last year): 2. Citations: 1 (RSCI).