Search results for 'models':
Articles found: 882
  1. Kireenkov A.A., Zhavoronok S.I., Nushtaev D.V.
    On tire models accounting for both deformed state and coupled dry friction in a contact spot
    Computer Research and Modeling, 2021, v. 13, no. 1, pp. 163-173

    A proposed approximate model of the rolling of a deformable wheel with a pneumatic tire allows one to account both for the forces in the tire and for the effect of dry friction on rolling stability in shimmy prediction. The model is based on the theory of dry friction with combined kinematics of the relative motion of interacting bodies, i. e. under simultaneous rolling, sliding, and spinning, accounting for the real shape of the contact spot and the contact pressure distribution. The resultant force vector and couple generated by the contact interaction with dry friction are defined by integration over the contact area, whereas the static contact pressure under vanishing sliding velocity and angular spinning velocity is computed from the finite-element solution of the static contact of a pneumatic tire with a rigid road, accounting for the real internal structure and properties of the tire. A solid finite-element model of a typical tire with a longitudinal tread is used below as a background. Given constant boost pressure, vertical load, and a static friction factor of 0.5, the numerical solution is constructed, as well as the appropriate solutions for lateral and torsional kinematic loading. It is shown that the contact interaction of a pneumatic tire and an absolutely rigid road can be represented, without crucial loss of accuracy, by two typical stages, adhesion and slip; the contact area shape nevertheless remains close to a circle. Approximate diagrams are constructed for both the lateral force and the friction torque; on the initial stage the diagrams are linear, which corresponds to the elastic deformation of the tire, while on the second stage both the force and torque values are constant and correspond to the dry friction force and torque.
For the latter stage, approximate formulae for the longitudinal and lateral friction forces and the friction torque are constructed on the basis of the theory of dry friction with combined kinematics. The obtained model can be treated as a combination of the Keldysh model of an elastic wheel with no slip or spin and the Klimov model of a rigid wheel interacting with a road through dry friction forces.
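The core construction of the combined-kinematics friction theory described above, a Coulomb stress field integrated over the contact spot, can be sketched numerically. The snippet below is an illustrative simplification, not the paper's model: it assumes a circular spot with uniform pressure (the paper uses an FE-computed pressure over a near-circular spot), but it reproduces the two limiting regimes: pure sliding yields a force of magnitude μN and zero torque, pure spinning yields a resisting torque and zero net force.

```python
import math

def contact_friction(mu, p0, radius, v_slide, omega, n=200):
    """Integrate Coulomb friction stresses over a circular contact spot
    for combined sliding (v_slide along x) and spinning (omega about the
    spot centre), assuming a uniform pressure p0.  Returns (Fx, Fy, Mz)."""
    fx = fy = mz = 0.0
    dr = radius / n
    dphi = 2.0 * math.pi / n
    for i in range(n):
        r = (i + 0.5) * dr
        for j in range(n):
            phi = (j + 0.5) * dphi
            x, y = r * math.cos(phi), r * math.sin(phi)
            # local slip velocity = translation + rotation about the centre
            ux = v_slide - omega * y
            uy = omega * x
            u = math.hypot(ux, uy)
            if u < 1e-12:
                continue
            dA = r * dr * dphi
            # Coulomb stress opposes the local slip direction
            tx, ty = -mu * p0 * ux / u, -mu * p0 * uy / u
            fx += tx * dA
            fy += ty * dA
            mz += (x * ty - y * tx) * dA
    return fx, fy, mz
```

For a unit spot with μ = 0.5 and unit pressure, pure sliding gives Fx = −μ·p0·πR² and negligible torque, while pure spinning gives zero net force and a resisting torque of −μ·p0·2πR³/3, matching the two stages (elastic adhesion aside) seen in the diagrams.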

  2. Dementiev V.E.
    The model of interference of long waves of economic development
    Computer Research and Modeling, 2021, v. 13, no. 3, pp. 649-663

    The article substantiates the need to develop and analyze mathematical models that take into account the mutual influence of long (Kondratiev) waves of economic development. The analysis of the available publications shows that at the model level, the direct and inverse relationships between intersecting long waves are still insufficiently studied. As practice shows, the production of the current long wave can receive an additional impetus for growth from the technologies of the next long wave. The technologies of the next industrial revolution often serve as improving innovations for the industries born of the previous industrial revolution. As a result, the new long wave increases the amplitude of the oscillations of the trajectory of the previous long wave. Such results of the interaction of long waves in the economy are similar to the effects of interference of physical waves. The mutual influence of the recessions and booms of the economies of different countries gives even more grounds for comparing the consequences of this mutual influence with the interference of physical waves. The article presents a model for the development of the technological base of production, taking into account the possibilities of combining old and new technologies. The model consists of several sub-models. The use of a different mathematical description for the individual stages of updating the technological base of production allows us to take into account the significant differences between the successive phases of the life cycle of general purpose technologies, considered in modern literature as the technological basis of industrial revolutions. One of these phases is the period of formation of the appropriate infrastructure necessary for the intensive diffusion of new general purpose technology, for the rapid development of industries using this technology. 
The model is used for illustrative calculations with values of exogenous parameters corresponding to the logic of changing long waves. Despite the conditional nature of the illustrative calculations, the configuration of the curve representing the change in the return on capital in the simulated period is close to that of the real trajectory of the return on private fixed assets of the US economy over 1982-2019. Factors that remained outside the scope of the presented model, but that should be taken into account when describing the interference of long waves of economic development, are indicated.

  3. Safiullina L.F., Gubaydullin I.M.
    Analysis of the identifiability of the mathematical model of propane pyrolysis
    Computer Research and Modeling, 2021, v. 13, no. 5, pp. 1045-1057

    The article presents the numerical modeling and study of the kinetic model of propane pyrolysis. The study of the reaction kinetics is a necessary stage in modeling the dynamics of the gas flow in the reactor.

    The kinetic model of propane pyrolysis is a nonlinear system of first-order ordinary differential equations with parameters, the role of which is played by the reaction rate constants. Mathematical modeling of the processes is based on the mass conservation law. To solve the initial (forward) problem, implicit methods for stiff systems of ordinary differential equations are used. The model contains 60 input kinetic parameters and 17 output parameters corresponding to the reaction substances, of which only 9 are observable. In solving the problem of estimating the parameters (the inverse problem), the question arises of the non-uniqueness of the set of parameters that satisfy the experimental data. Therefore, before solving the inverse problem, the possibility of determining the parameters of the model is analyzed (identifiability analysis).

    To analyze identifiability, we use the orthogonal method, which has proven itself well for analyzing models with a large number of parameters. The algorithm is based on the analysis of the sensitivity matrix by the methods of differential and linear algebra, which shows the degree of dependence of the unknown parameters of the models on the given measurements. The analysis of sensitivity and identifiability showed that the parameters of the model are stably determined from a given set of experimental data. The article presents a list of model parameters from most to least identifiable. Taking into account the analysis of the identifiability of the mathematical model, restrictions were introduced on the search for less identifiable parameters when solving the inverse problem.

    The inverse problem of estimating the parameters was solved using a genetic algorithm. The article presents the found optimal values of the kinetic parameters. A comparison of the experimental and calculated dependences of the concentrations of propane, main and by-products of the reaction on temperature for different flow rates of the mixture is presented. The conclusion about the adequacy of the constructed mathematical model is made on the basis of the correspondence of the results obtained to physicochemical laws and experimental data.
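The forward problem described above, a stiff kinetic ODE system solved with implicit methods, can be illustrated on a toy mechanism. The sketch below applies backward Euler to a linear chain A → B → C; this is an illustrative stand-in for the idea, not the paper's 60-parameter propane mechanism. For a linear system each implicit step has a closed-form solution, and the scheme stays stable and mass-conserving even when h·k is large.

```python
def backward_euler_chain(k1, k2, y0, h, steps):
    """Implicit (backward) Euler for the toy first-order chain
    A -> B -> C with rate constants k1 and k2.  Each implicit step
    (I - hJ) y_{n+1} = y_n is solved species-by-species in closed form,
    which keeps the scheme stable in the stiff regime (large h*k)."""
    a, b, c = y0
    history = [(a, b, c)]
    for _ in range(steps):
        a = a / (1.0 + h * k1)
        b = (b + h * k1 * a) / (1.0 + h * k2)
        c = c + h * k2 * b
        history.append((a, b, c))
    return history
```

Note that the updated value of A is used when updating B, exactly as the implicit formulation requires; the total mass a + b + c is conserved to machine precision at every step.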

  4. Ansori Moch.F., Sumarti N.N., Sidarto K.A., Gunadi I.I.
    An Algorithm for Simulating the Banking Network System and Its Application for Analyzing Macroprudential Policy
    Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1275-1289

    Modeling banking systems using a network approach has received growing attention in recent years. One of the notable models is that developed by Iori et al., who proposed a banking system model for analyzing systemic risk in interbank networks. The model is built on the simple dynamics of several bank balance-sheet variables, such as deposits, equity, loans, liquid assets, and interbank lending (or borrowing), in the form of difference equations. Each bank faces random shocks in deposits and loans. The balance sheet is updated at the beginning or end of each period. In the model, banks are grouped into potential lenders and potential borrowers. The potential borrowers are those that lack liquidity, and the potential lenders are those that have excess liquidity after dividend payment and channeling new investment. The borrowers and the lenders are connected through the interbank market. Each borrower is linked to some percentage of randomly chosen potential lenders, from which it borrows funds to maintain its liquidity safety net. If the demand for borrowed funds can be met by the supply of excess liquidity, the borrowing bank survives; if not, it is deemed to be in default and is removed from the banking system. However, in their paper, most of the interbank borrowing-lending mechanism is described qualitatively rather than by detailed mathematical or computational analysis. Therefore, in this paper, we elaborate the mathematical description of borrowing and lending in the interbank market and present an algorithm for simulating the model. We also perform simulations to analyze the effects of the model’s parameters on banking stability, using the number of surviving banks as the measure. We apply this technique to analyze the effect of a macroprudential policy, the loan-to-deposit-ratio-based reserve requirement, on banking stability.
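The borrowing-lending round described in the abstract can be sketched in a few lines. Everything below is a heavily simplified illustration of the matching idea, with made-up parameter names and defaults, not the algorithm presented in the paper: banks with a liquidity deficit draw on a random subset of surplus banks, and a borrower whose demand cannot be met is removed as defaulted.

```python
import random

def simulate_round(liquidity, link_fraction=0.5, rng=None):
    """One simplified interbank round: banks with negative net liquidity
    borrow from a random subset of surplus banks; a borrower whose demand
    cannot be met defaults and is removed.  `liquidity` maps bank id ->
    net liquid position.  Returns the surviving banks."""
    rng = rng or random.Random(0)
    lenders = {i: v for i, v in liquidity.items() if v > 0}
    survivors = dict(liquidity)
    for bank, v in liquidity.items():
        if v >= 0:
            continue                          # surplus and balanced banks survive
        need = -v
        if lenders:
            k = max(1, int(link_fraction * len(lenders)))
            pool = rng.sample(sorted(lenders), k)
        else:
            pool = []
        for lender in pool:                   # draw on linked lenders' surpluses
            take = min(need, lenders[lender])
            lenders[lender] -= take
            need -= take
            if need <= 0:
                break
        if need > 0:                          # unmet demand -> default, removal
            del survivors[bank]
        else:
            survivors[bank] = 0.0
    return survivors
```

Iterating such rounds and counting the surviving banks gives exactly the kind of stability measure the authors use to compare parameter settings.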

  5. Korepanov V.O., Chkhartishvili A.G., Shumov V.V.
    Game-theoretic and reflexive combat models
    Computer Research and Modeling, 2022, v. 14, no. 1, pp. 179-203

    Modeling combat operations is an urgent scientific and practical task aimed at providing commanders and staffs with quantitative grounds for decision-making. The authors propose a victory function for combat and military operations based on G. Tullock's conflict function and taking into account the scale of combat (military) operations. On a sufficient volume of military statistics, the scale parameter was estimated and its values were found for the tactical, operational, and strategic levels. Game-theoretic «offensive – defense» models, in which the sides solve immediate and subsequent tasks with troops formed in one or several echelons, are investigated. At the first stage of modeling, the solution of the immediate task is found: the breakthrough (holding) of defense points; at the second, the solution of the subsequent task: the defeat of the enemy in the depth of the defense (counterattack and restoration of the defense). For the tactical level, using the Nash equilibrium, solutions of the immediate problem (distribution of the sides' forces over defense points) were found in an antagonistic game according to three criteria: a) breakthrough of the weakest point, b) breakthrough of at least one point, and c) the weighted average probability. It is shown that it is advisable for the attacking side to use the criterion of «breaking through at least one point», which, all other things being equal, ensures the maximum probability of breaking through the defense points.
At the second stage of modeling, for a particular case (both sides are guided by the criterion of breaking through the weakest point when breaking through and holding defense points), the problem of distributing forces and means between tactical tasks (echelons) was solved according to two criteria: a) maximizing both the probability of breaking through the defense point and the probability of defeating the enemy in the depth of the defense, and b) maximizing the minimum of these probabilities (the guaranteed-result criterion). Awareness is an important aspect of combat operations. Several examples of reflexive games (games characterized by complex mutual awareness) and of information control are considered. It is shown under what conditions information control increases the player's payoff, and the optimal information control is found.
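The Tullock contest function the victory function builds on has a compact standard form. The sketch below uses the common parametrisation with a decisiveness (scale) exponent; the exact form and calibration used in the paper are not reproduced here, so treat this as an assumption-laden illustration.

```python
def victory_probability(x, y, mu):
    """Tullock-type contest success function: the probability that a side
    committing resources x defeats a side committing y, with a scale
    (decisiveness) exponent mu.  Larger mu makes the outcome more
    sensitive to the resource ratio; mu is the kind of parameter the
    authors estimate separately for tactical, operational and strategic
    levels (this parametrisation is an assumption, not the paper's)."""
    return x**mu / (x**mu + y**mu)
```

Equal forces give a 50% chance regardless of mu, and raising mu sharpens a given superiority: for a 2:1 ratio, mu = 1 gives 2/3 while mu = 3 gives 8/9.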

  6. Syzranova N.G., Andruschenko V.A.
    Numerical modeling of physical processes leading to the destruction of meteoroids in the Earth’s atmosphere
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 835-851

    Within the framework of the topical problem of comet-asteroid hazard, the physical processes causing the destruction and fragmentation of meteor bodies in the Earth’s atmosphere are numerically investigated. Based on the developed physical-mathematical models that determine the motion of space objects of natural origin in the atmosphere and their interaction with it, the falls of three bolides, among the largest and in some respects unusual in the history of meteoritics, are considered: Tunguska, Vitim and Chelyabinsk. Their singularity lies in the absence of any material meteorite remains and craters in the area of the presumed crash site for the first two bodies, and in the non-detection, as is assumed, of the main parent body for the third (the mass of the fallen fragments is too small compared with the estimated mass). The effects of aerodynamic loads and heat flows on these bodies are studied; they lead to intensive surface mass loss and possible mechanical destruction. The velocities of the studied celestial bodies and the change in their masses are determined from a modernized system of equations of the theory of meteor physics. An important factor taken into account here is the variability of the mass-loss parameter under the action of heat fluxes (radiative and convective) along the flight path. The fragmentation of meteoroids is considered within a progressive crushing model based on the statistical theory of strength, taking into account the influence of the scale factor on the ultimate strength of the objects. The phenomena and effects arising at various kinematic and physical parameters of each of these bodies are revealed, in particular the change in the ballistics of their flight in the denser layers of the atmosphere, consisting in the transition from a descent mode to an ascent mode.
At the same time, the following scenarios can be realized: 1) the return of the body to outer space if its residual velocity is greater than the second cosmic velocity; 2) the transition of the body to an Earth-satellite orbit at a residual velocity greater than the first cosmic velocity; 3) at lower values of the residual velocity, the body's return after some time to the descent mode and its fall at a considerable distance from the presumed crash site. The implementation of one of these three scenarios explains, for example, the absence of material traces, including craters, in the vicinity of the forest collapse in the case of the Tunguska bolide. Assumptions about the possibility of such scenarios have been made earlier by other authors; in this paper, their realization is confirmed by the results of numerical calculations.
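The system of equations of meteor physics referred to above has a well-known single-body core: drag deceleration plus heat-driven mass loss (ablation) in an exponential atmosphere. The sketch below integrates that core explicitly; all coefficient values, the flat-atmosphere geometry, and the fixed entry angle are illustrative assumptions, not the modernized system or the values used in the paper.

```python
import math

def meteor_trajectory(m0, v0, h0, dt=0.001, steps=5000):
    """Explicit integration of the classical single-body meteor equations

        m dv/dt = -(1/2) c_d rho_a S v^2      (drag deceleration)
        dm/dt   = -c_h rho_a S v^3 / (2 Q)    (ablation)

    in a flat exponential atmosphere, at a fixed entry angle.
    Returns a list of (mass, velocity, height) states."""
    cd, ch, Q = 1.0, 0.1, 8e6        # drag coeff., heat-transfer coeff., heat of ablation
    rho_m = 3000.0                   # meteoroid density, kg/m^3
    rho0, H = 1.29, 7000.0           # sea-level air density, scale height
    theta = math.radians(45.0)       # entry angle
    m, v, h = m0, v0, h0
    out = [(m, v, h)]
    for _ in range(steps):
        rho_a = rho0 * math.exp(-h / H)
        # midsection area of a sphere of mass m and density rho_m
        S = math.pi * (3.0 * m / (4.0 * math.pi * rho_m)) ** (2.0 / 3.0)
        dv = -0.5 * cd * rho_a * S * v * v / m
        dm = -ch * rho_a * S * v ** 3 / (2.0 * Q)
        m = max(m + dm * dt, 1e-6)
        v = max(v + dv * dt, 0.0)
        h = h - v * math.sin(theta) * dt
        out.append((m, v, h))
    return out
```

Even this crude version exhibits the qualitative behaviour discussed in the abstract: mass loss and deceleration are negligible at high altitude and grow sharply in the denser layers.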

  7. Belotelov N.V., Loginov F.V.
    The agent model of intercultural interactions: the emergence of cultural uncertainties
    Computer Research and Modeling, 2022, v. 14, no. 5, pp. 1143-1162

    The article describes a simulation agent-based model of intercultural interactions in a country whose population belongs to different cultures. It is assumed that the space of cultures can be represented as a Hilbert space in which certain subspaces correspond to different cultures; in the model, a culture is understood as a structured subspace of the Hilbert space. This makes it possible to describe the state of an agent by a vector in the Hilbert space. Each agent is described by its belonging to a certain «culture». The number of agents belonging to particular cultures is determined by the demographic processes of those cultures, the depth and integrity of the educational process, and the intensity of intercultural contacts. Interaction between agents occurs within clusters into which the entire set of agents is divided according to certain criteria. When agents interact according to a certain algorithm, the length and angle characterizing the state of the agent change. In the course of simulation, depending on the number of agents belonging to different cultures, the intensity of demographic and educational processes, and the intensity of intercultural contacts, aggregates of agents (clusters) are formed whose agents belong to different cultures. Such intercultural clusters do not belong entirely to any of the cultures initially considered in the model and thereby create uncertainties in cultural dynamics. The paper presents the results of simulation experiments that illustrate the influence of demographic and educational processes on the dynamics of intercultural clusters. Directions for developing the proposed approach to the study of transitional states in the development of cultures are discussed.

  8. Suzdaltsev V.A., Suzdaltsev I.V., Tarhavova E.G.
    Fuzzy knowledge extraction in the development of expert predictive diagnostic systems
    Computer Research and Modeling, 2022, v. 14, no. 6, pp. 1395-1408

    Expert systems imitate the professional experience and thinking processes of a specialist in order to solve problems in various subject areas. An example of a problem that is expedient to solve with an expert system is forming a diagnosis, a task that arises in technology, medicine, and other fields. When solving a diagnostic problem, it is necessary to anticipate the occurrence of critical or emergency situations, which require timely intervention by specialists to prevent severe consequences. Fuzzy set theory provides one approach to solving ill-structured problems, to which diagnostic problems belong. The theory provides means for forming linguistic variables, which help describe the modeled process. Linguistic variables are elements of fuzzy logical rules that simulate the reasoning of professionals in the subject area. To develop fuzzy rules, it is necessary to survey experts. Knowledge engineers use the experts’ opinions to evaluate the correspondence between a typical current situation and the risk of an emergency in the future. The result of knowledge extraction is a description of linguistic variables that includes a combination of signs. Experts are involved in the survey to create descriptions of linguistic variables and to present a set of simulated situations. When building such systems, the main difficulty of the survey is the laboriousness of the interaction between knowledge engineers and experts, caused chiefly by the multiplicity of questions the expert must answer. The paper presents the rationale for a method that allows the knowledge engineer to reduce the number of questions posed to the expert, and describes the experiments carried out to test the applicability of the proposed method.
An expert system for predicting risk groups for neonatal pathologies and pregnancy pathologies, built with the proposed knowledge extraction method, confirms the feasibility of the proposed approach.
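Linguistic variables of the kind extracted from experts are typically encoded as membership functions over a measured sign, with fuzzy AND taken as a minimum when rules fire. The sketch below shows the standard triangular form; the shapes, parameters and variable names are generic illustrations, not the ones elicited in the paper.

```python
def triangular(a, b, c):
    """Build a triangular membership function with support [a, c] and
    peak at b: one term of a linguistic variable in a fuzzy rule base."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Hypothetical terms of two linguistic variables.
fever_high = triangular(37.0, 39.0, 41.0)
pulse_fast = triangular(90.0, 120.0, 150.0)

def rule_risk(temp, pulse):
    """Degree of firing of the rule 'IF fever is high AND pulse is fast
    THEN risk is elevated', with min() as the fuzzy AND."""
    return min(fever_high(temp), pulse_fast(pulse))
```

Each expert answer about a typical situation fixes one point of such a membership function, which is why reducing the number of questions matters so much.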

  9. Makarov I.S., Bagantsova E.R., Iashin P.A., Kovaleva M.D., Zakharova E.M.
    Development of and research into a rigid algorithm for analyzing Twitter publications and its influence on the movements of the cryptocurrency market
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 157-170

    Social media is a crucial indicator of the position of assets in the financial market. The paper describes a rigid solution to the classification problem of determining the influence of social media activity on financial market movements. Reputable crypto-trading influencers are selected, and packages of their Twitter posts are used as data. Preprocessing of the texts, which are characterized by heavy use of slang words and abbreviations, consists of lemmatization with Stanza and the use of regular expressions. In solving the binary classification problem, a word is considered an element of the vector representing a data unit. The best markup parameters for processing Binance candles are searched for. Feature selection, which is necessary for a precise description of the text data and the subsequent process of establishing dependence, is performed by machine learning and statistical analysis. The first approach is feature selection based on an information criterion; it is implemented in a random forest model and is relevant to selecting features for splitting nodes in a decision tree. The second is based on the rigid compilation of a binary vector by checking the presence or absence of each word in the package and counting the sum of the elements of this vector; a decision is then made depending on whether this sum exceeds a threshold value predetermined by analyzing the frequency distribution of word mentions. The algorithm used to solve the problem, named the benchmark, is analyzed as a tool; similar algorithms are often used in automated trading strategies. The study also describes observations of the influence of frequently occurring words used as a basis of dimension 2 and 3 in vectorization.
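The second, "rigid" approach described above, a binary presence vector summed against a predetermined threshold, fits in a few lines. The keyword list and threshold below are placeholders; in the paper both come from analyzing the frequency distribution of word mentions.

```python
def benchmark_signal(posts, keywords, threshold):
    """Rigid benchmark in sketch form: build a binary presence vector
    over a keyword list for a package of posts, sum its elements, and
    fire a signal when the sum exceeds the threshold."""
    tokens = set()
    for post in posts:
        tokens.update(post.lower().split())
    presence = [1 if word in tokens else 0 for word in keywords]
    return sum(presence) > threshold
```

A fuller version would lemmatize the tokens first (the paper uses Stanza), so that inflected slang forms map onto the same dictionary entry.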

  10. Bernadotte A., Mazurin A.D.
    Optimization of the brain command dictionary based on the statistical proximity criterion in silent speech recognition task
    Computer Research and Modeling, 2023, v. 15, no. 3, pp. 675-690

    In our research, we focus on the classification problem in silent speech recognition, with the aim of developing a brain–computer interface (BCI) based on electroencephalographic (EEG) data that will be capable of assisting people with mental and physical disabilities and expanding human capabilities in everyday life. Our previous research has shown that the silent pronunciation of some words results in almost identical distributions of electroencephalographic signal data. This phenomenon suppresses the quality of neural network model behavior. This paper proposes a data processing technique that distinguishes between statistically remote and inseparable classes in the dataset. Applying the proposed approach helps us reach the goal of maximizing the semantic load of the dictionary used in the BCI.

    Furthermore, we propose the existence of a statistical predictive criterion for the accuracy of binary classification of the words in a dictionary. Such a criterion aims to estimate the lower and the upper bounds of classifiers’ behavior only by measuring quantitative statistical properties of the data (in particular, using the Kolmogorov – Smirnov method). We show that higher levels of classification accuracy can be achieved by means of applying the proposed predictive criterion, making it possible to form an optimized dictionary in terms of semantic load for the EEG-based BCIs. Furthermore, using such a dictionary as a training dataset for classification problems grants the statistical remoteness of the classes by taking into account the semantic and phonetic properties of the corresponding words and improves the classification behavior of silent speech recognition models.
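The Kolmogorov–Smirnov method mentioned above reduces, in the two-sample case, to the maximum distance between empirical CDFs. The sketch below computes that statistic from scratch as a screening quantity: a value near zero flags a statistically "inseparable" word pair. How the statistic is aggregated over channels and mapped to accuracy bounds in the paper is not reproduced here.

```python
import bisect

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    distance between the two empirical CDFs, evaluated at every
    observed value."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for x in sorted(set(a) | set(b)):
        fa = bisect.bisect_right(a, x) / len(a)
        fb = bisect.bisect_right(b, x) / len(b)
        d = max(d, abs(fa - fb))
    return d
```

Ranking candidate word pairs by this statistic and keeping only well-separated ones is the kind of dictionary optimization the abstract describes.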


Indexed in Scopus

Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU

The journal is included in the Russian Science Citation Index

