Search results for 'risk':
Articles found: 26
  1. Malinetsky G.G.
    Image of the teacher. Ten years afterward
    Computer Research and Modeling, 2015, v. 7, no. 4, pp. 789-811

    The work outlines the key ideas of Kurdyumov S.P., an outstanding specialist in applied mathematics, self-organization theory, and transdisciplinary research. It considers the development of his scientific ideas over the last decade and formulates a set of open problems in synergetics which will probably stimulate the development of this approach. The article is an expanded version of a report made at the 10th Kurdyumov Readings held at Tver State University in 2015.

    Views (last year): 4.
  2. Dushkin R.V.
    Review of Modern State of Quantum Technologies
    Computer Research and Modeling, 2018, v. 10, no. 2, pp. 165-179

    At present, quantum technologies are entering a new round of development, which will certainly make it possible to solve numerous problems that previously could not be solved within the framework of “traditional” paradigms and computational models. All mankind stands at the threshold of the so-called “second quantum revolution”, whose short-term and long-term consequences will affect virtually all spheres of life of a global society. Such directions and branches of science and technology as materials science, nanotechnology, pharmacology and biochemistry in general, and the modeling of chaotic dynamic processes (nuclear explosions, turbulent flows, weather and long-term climatic phenomena) will benefit directly, as will the solution of any problems that reduce to the multiplication of matrices of large dimensions (in particular, the modeling of quantum systems). However, along with extraordinary opportunities, quantum technologies carry certain risks and threats, in particular the breakdown of all information systems based on modern achievements in cryptography, which would entail an almost complete loss of secrecy, a global financial crisis caused by the destruction of the banking sector, and the compromise of all communication channels. Even though methods of so-called “post-quantum” cryptography are already being developed today, some risks may still materialize, since not all long-term consequences can be calculated. At the same time, one should prepare for all of the above, including by training specialists who work in the field of quantum technologies and understand all their aspects, new opportunities, risks and threats. In this connection, this article briefly describes the current state of quantum technologies, namely quantum sensing, information transfer using quantum protocols, the universal quantum computer (hardware), and quantum computations based on quantum algorithms (software). For each of these areas, forecasts are given of their development and of their impact on various spheres of human civilization.

    Views (last year): 56.
  3. Alkousa M.S., Gasnikov A.V., Dvurechensky P.E., Sadiev A.A., Razouk L.Ya.
    An approach for the nonconvex uniformly concave structured saddle point problem
    Computer Research and Modeling, 2022, v. 14, no. 2, pp. 225-237

    Recently, saddle point problems have received much attention due to their powerful modeling capability for many problems from diverse domains. Applications occur in many applied areas, such as robust optimization, distributed optimization, game theory, and many applications in machine learning such as empirical risk minimization and the training of generative adversarial networks. Therefore, many researchers have actively worked on developing numerical methods for solving saddle point problems in many different settings. This paper is devoted to developing a numerical method for solving saddle point problems in the nonconvex uniformly-concave setting. We study a general class of saddle point problems with composite structure and Hölder-continuous higher-order derivatives. To solve the problem under consideration, we propose an approach in which we reduce the problem to a combination of two auxiliary optimization problems, one for each group of variables: the outer minimization problem w.r.t. the primal variables, and the inner maximization problem w.r.t. the dual variables. For solving the outer minimization problem, we use the Adaptive Gradient Method, which is applicable to nonconvex problems and also works with an inexact oracle that is generated by approximately solving the inner problem. For solving the inner maximization problem, we use the Restarted Unified Acceleration Framework, which unifies the high-order acceleration methods for minimizing a convex function with Hölder-continuous higher-order derivatives. Separate complexity bounds are provided for the number of calls to the first-order oracles for the outer minimization problem and to the higher-order oracles for the inner maximization problem. Moreover, the complexity of the whole proposed approach is then estimated.
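
    As an illustration of the outer/inner reduction described above (not the authors' algorithm), the sketch below treats a toy composite saddle point problem $\min_x \max_y g(x) + \langle Ax, y\rangle - h(y)$, where $g$ may be nonconvex and $h$ is strongly convex, so the inner problem is uniformly concave. The inner maximization is solved approximately by gradient ascent, which yields an inexact gradient oracle for the outer minimization; a crude AdaGrad-style step stands in for the Adaptive Gradient Method, and all functions and parameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 20, 15
        A = rng.standard_normal((m, n)) / np.sqrt(n)

        def g(x):                      # nonconvex smooth term (illustrative choice)
            return np.sum(x**2) + 0.1 * np.sum(np.cos(3 * x))

        def grad_g(x):
            return 2 * x - 0.3 * np.sin(3 * x)

        def inner_max(x, iters=200, step=0.5):
            """Approximate argmax_y <A x, y> - 0.5*||y||^2 by gradient ascent."""
            y = np.zeros(m)
            for _ in range(iters):
                y += step * (A @ x - y)    # gradient of the inner objective in y
            return y

        def outer_grad(x):
            """Inexact gradient of phi(x) = max_y f(x, y) (Danskin-type rule)."""
            return grad_g(x) + A.T @ inner_max(x)

        # Outer loop: gradient descent with a crude AdaGrad-style step size,
        # standing in for the Adaptive Gradient Method mentioned in the abstract.
        x = rng.standard_normal(n)
        accum = 1e-8
        for k in range(300):
            gk = outer_grad(x)
            accum += np.sum(gk**2)
            x -= 0.5 / np.sqrt(accum) * gk

        print("final norm of the inexact outer gradient:", np.linalg.norm(outer_grad(x)))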

  4. Soukhovolsky V.G., Kovalev A.V., Palnikova E.N., Tarasova O.V.
    Modelling the risk of insect impacts on forest stands after possible climate changes
    Computer Research and Modeling, 2016, v. 8, no. 2, pp. 241-253

    A model of forest insect population dynamics is used to simulate “forest-insect” interactions and to estimate the possible damage to forest stands caused by pests. The model represents a population as a control system in which the input variables characterize the influence of modifying (climatic) factors and the feedback loop describes the effect of regulatory factors (parasites, predators and population interactions). A stress-testing technique based on the population dynamics model is proposed for assessing the risks of forest stand damage and destruction after insect impact. The dangerous forest pest pine looper Bupalus piniarius L. is considered as the object of analysis. Computer experiments were conducted to assess outbreak risks under possible climate change in the territory of Central Siberia. The model experiments have shown that the risk of insect impact on the forest does not increase significantly under sufficiently moderate warming (not more than 4 °C in the summer period). However, stronger warming in the territory of Central Siberia, combined with dry summer conditions, could cause a significant increase in the risk of pine looper outbreaks.
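
    A minimal illustrative sketch (not the authors' model) of this kind of stress testing is given below: a Ricker-type map stands in for the “population as a control system” description, with the climate modifier shifting the growth rate (the input) and density dependence acting as the regulatory feedback; the outbreak risk is estimated as the fraction of Monte Carlo runs in which the density exceeds a threshold. All parameter values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate(warming_C, years=100, r0=1.2, K=1.0, climate_gain=0.15):
            """Population trajectory for a given summer warming (deg C), Ricker map."""
            x = 0.1
            traj = np.empty(years)
            for t in range(years):
                # Modifier (climatic) factor shifts the growth rate; noise adds weather.
                r = r0 + climate_gain * warming_C + 0.1 * rng.standard_normal()
                # Density-dependent regulation plays the role of the feedback loop.
                x = x * np.exp(r * (1.0 - x / K))
                traj[t] = x
            return traj

        def outbreak_risk(warming_C, runs=500, threshold=1.2):
            """Fraction of Monte Carlo runs in which the density exceeds the threshold."""
            return sum(simulate(warming_C).max() > threshold for _ in range(runs)) / runs

        for dT in (0.0, 2.0, 4.0, 6.0):
            print(f"summer warming +{dT:.0f} C: outbreak risk ~ {outbreak_risk(dT):.2f}")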

    Views (last year): 3. Citations: 1 (RSCI).
  5. Grachev V.A., Nayshtut Yu.S.
    Buckling problems of thin elastic shells
    Computer Research and Modeling, 2018, v. 10, no. 6, pp. 775-787

    The article covers several mathematical problems relating to the elastic stability of thin shells in view of inconsistencies that have recently been identified between experimental data and predictions based on the shallow-shell theory. It is highlighted that the contradictions were caused by new algorithms that enabled updating the values of the so-called “low critical stresses” calculated in the 20th century and adopted as a buckling criterion for thin shallow shells by technical standards. The new calculations often find the low critical stress close to zero. Therefore, the low critical stress cannot be used as a safety factor for the buckling analysis of a thin-walled structure, and the equations of the shallow-shell theory need to be replaced with other differential equations. The new theory also requires a buckling criterion ensuring the match between calculations and experimental data.

    The article demonstrates that the contradiction with the new experiments can be resolved within the dynamic nonlinear three-dimensional theory of elasticity. The stress at which bifurcation of dynamic modes occurs should be taken as the buckling criterion. The nonlinear form of the original equations gives rise to solitary (solitonic) waves that match non-smooth displacements (patterns, dents) of the shells. It is essential that the solitons make an impact at all stages of loading and grow significantly closer to bifurcation. The solitonic solutions are illustrated for a thin cylindrical momentless shell whose three-dimensional volume is simulated with a two-dimensional surface of the set thickness. It is noted that the pattern-generating waves can be detected (and their amplitudes can be identified) with acoustic or electromagnetic devices.

    Thus, it is technically possible to reduce the risk of failure of thin shells by monitoring the shape of the surface with acoustic devices. The article concludes by setting out the mathematical problems that must be solved for a reliable numerical assessment of the buckling criterion for thin elastic shells.

    Views (last year): 23.
  6. Dzhinchvelashvili G.A., Dzerzhinsky R.I., Denisenkova N.N.
    Quantitative assessment of seismic risk and energy concepts of earthquake engineering
    Computer Research and Modeling, 2018, v. 10, no. 1, pp. 61-76

    Currently, the earthquake-resistant design of buildings is based on a force calculation in which the effect of an earthquake is represented by static equivalent forces calculated using elastic response spectra (the linear-spectral method), which relate the law of motion of the soil to the absolute acceleration of a nonlinear oscillator model.

    This approach does not directly take into account either the influence of the duration of strong motion or the plastic behavior of the structure. The frequency content and duration of ground vibrations directly affect the energy received by the building, causing damage to its elements. Unlike in force or kinematic calculations, the seismic effect on a structure can be interpreted without considering forces and displacements separately and can instead be represented as the product of both variables, i.e., the work or input energy (the maximum energy that the building can receive from the earthquake).

    With the energy approach to seismic design, it is necessary to evaluate the seismic energy input into the structure and its distribution among the various structural components.

    The article substantiates the energy approach to the design of earthquake-resistant buildings and structures as an alternative to the currently used method based on a force calculation in which the earthquake effect is represented by static equivalent forces calculated from response spectra.

    It is noted that interest in the use of energy concepts in earthquake-resistant design began with the works of Housner, who represented the seismic effect in the form of the input seismic energy using the velocity spectrum and suggested that the damage in an elastic-plastic system and in an elastic system is caused by one and the same input seismic energy.
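
    For reference, Housner's proposal mentioned above is usually written through the pseudo-velocity spectrum, and the subsequent energy-based literature works with an energy balance of the following form (standard textbook notation, not quoted from the article):

        % Housner's estimate of the seismic input energy through the
        % pseudo-velocity spectrum S_v (for a structure of mass m):
        E_I \approx \tfrac{1}{2}\, m\, S_v^{2},
        % and the energy balance accumulated over the ground motion:
        E_I = E_K + E_{\xi} + E_S + E_H,
        % where E_K is the kinetic energy, E_{\xi} the energy dissipated by
        % viscous damping, E_S the recoverable elastic strain energy, and E_H
        % the energy dissipated by hysteretic (plastic) deformation.  The
        % equal-energy assumption cited in the abstract states that E_I is
        % roughly the same for the elastic and the elastic-plastic system.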

    Indices for determining the input energy of an earthquake proposed by various authors are given in the paper. It is shown that modern approaches to ensuring the seismic stability of structures, based on representing the earthquake effect as a static equivalent force, do not adequately describe the behavior of the system during an earthquake.

    In this paper, on the basis of quantitative estimates of seismic risk, the organization standard (STO) “Seismic resistance of structures. The main design provisions”, developed at NRU MSUCE, is analyzed. The developed document is a step forward with respect to the optimal design of earthquake-resistant structures.

    The proposed concept makes use of the achievements of modern methods for the calculation of buildings and structures under seismic effects, which are harmonized with the Eurocodes and do not contradict the system of national regulations.

    Views (last year): 21.
  7. Dvinskikh D.M., Pirau V.V., Gasnikov A.V.
    On the relations of stochastic convex optimization problems with empirical risk minimization problems on $p$-norm balls
    Computer Research and Modeling, 2022, v. 14, no. 2, pp. 309-319

    In this paper, we consider convex stochastic optimization problems arising in machine learning applications (e.g., risk minimization) and mathematical statistics (e.g., maximum likelihood estimation). There are two main approaches to solving such problems, namely the Stochastic Approximation approach (online approach) and the Sample Average Approximation approach, also known as the Monte Carlo approach (offline approach). In the offline approach, the problem is replaced by its empirical counterpart (the empirical risk minimization problem). The natural question is how to define the problem sample size, i.e., how many realizations should be sampled so that a sufficiently accurate solution of the empirical problem is also a solution of the original problem with the desired precision. This issue is one of the main issues in modern machine learning and optimization. In the last decade, a lot of significant advances were made in these areas to solve convex stochastic optimization problems on Euclidean balls (or the whole space). In this work, we build on these advances and study the case of arbitrary balls in the $p$-norms. We also explore the question of how the parameter $p$ affects the estimates of the required number of terms in the empirical risk.
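
    A minimal sketch contrasting the two approaches on a toy stochastic least-squares problem $\min_x \mathbb{E}[(a^\top x - b)^2]$ is given below (synthetic data; the distributions, step sizes and sample sizes are hypothetical illustrations, not taken from the paper).

        import numpy as np

        rng = np.random.default_rng(2)
        d = 10
        x_true = rng.standard_normal(d)

        def sample(n):
            """Draw n realizations (a, b) of the random data."""
            A = rng.standard_normal((n, d))
            b = A @ x_true + 0.1 * rng.standard_normal(n)
            return A, b

        # Online approach (Stochastic Approximation): one pass of SGD over a stream.
        x_sa = np.zeros(d)
        for t in range(1, 5001):
            a, b = sample(1)
            grad = 2 * a[0] * (a[0] @ x_sa - b[0])
            x_sa -= grad / (0.1 * t + 100.0)      # diminishing step size

        # Offline approach (Sample Average Approximation / empirical risk minimization):
        # draw N samples once and solve the empirical problem to high accuracy.
        N = 5000
        A, b = sample(N)
        x_saa, *_ = np.linalg.lstsq(A, b, rcond=None)

        print("SA  error:", np.linalg.norm(x_sa - x_true))
        print("SAA error:", np.linalg.norm(x_saa - x_true))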

    In this paper, both convex and saddle point optimization problems are considered. For strongly convex problems, the existing results on the same sample sizes in both approaches (online and offline) were generalized to arbitrary norms. Moreover, it was shown that the strong convexity condition can be weakened: the obtained results are valid for functions satisfying the quadratic growth condition. In the case when this condition is not met, it is proposed to use the regularization of the original problem in an arbitrary norm. In contradistinction to convex problems, saddle point problems are much less studied. For saddle point problems, the sample size was obtained under the condition of $\gamma$-growth of the objective function. When $\gamma = 1$, this condition is the condition of sharp minimum in convex problems. In this article, it was shown that the sample size in the case of a sharp minimum is almost independent of the desired accuracy of the solution of the original problem.

  8. Lubashevsky I.A., Lubashevskiy V.I.
    Dynamical trap model for stimulus – response dynamics of human control
    Computer Research and Modeling, 2024, v. 16, no. 1, pp. 79-87

    We present a novel model for the dynamical trap of the stimulus – response type that mimics human control over dynamic systems when the bounded capacity of human cognition is a crucial factor. Our focus lies on scenarios where the subject modulates a control variable in response to a certain stimulus. In this context, the bounded capacity of human cognition manifests in the uncertainty of stimulus perception and the subsequent actions of the subject. The model suggests that when the stimulus intensity falls below the (blurred) threshold of stimulus perception, the subject suspends the control and maintains the control variable near zero with accuracy determined by the control uncertainty. As the stimulus intensity grows above the perception uncertainty and becomes accessible to human cognition, the subject activates control. Consequently, the system dynamics can be conceptualized as an alternating sequence of passive and active modes of control with probabilistic transitions between them. Moreover, these transitions are expected to display hysteresis due to decision-making inertia.

    Generally, the passive and active modes of human control are governed by different mechanisms, posing challenges in developing efficient algorithms for their description and numerical simulation. The proposed model overcomes this problem by introducing the dynamical trap of the stimulus-response type, which has a complex structure. The dynamical trap region includes two subregions: the stagnation region and the hysteresis region. The model is based on the formalism of stochastic differential equations, capturing both probabilistic transitions between control suspension and activation as well as the internal dynamics of these modes within a unified framework. It reproduces the expected properties in control suspension and activation, probabilistic transitions between them, and hysteresis near the perception threshold. Additionally, in a limiting case, the model demonstrates the capability of mimicking a similar subject’s behavior when (1) the active mode represents an open-loop implementation of locally planned actions and (2) the control activation occurs only when the stimulus intensity grows substantially and the risk of the subject losing the control over the system dynamics becomes essential.
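
    The sketch below is a minimal numerical illustration in the spirit of the model (not the paper's equations): the control tracks the stimulus only when a smoothed activation factor, centered at a blurred perception threshold, is close to one, and otherwise relaxes toward zero; the stochastic differential equations are integrated with the Euler–Maruyama scheme. The sigmoidal activation and all parameter values are hypothetical choices.

        import numpy as np

        rng = np.random.default_rng(3)
        dt, T = 1e-3, 50.0
        steps = int(T / dt)

        theta, width = 0.5, 0.1      # perception threshold and its "blur"
        sigma_x, sigma_u = 0.3, 0.1  # noise intensities of stimulus and control

        def omega(s):
            """Smoothed on/off factor for control activation near the threshold."""
            return 1.0 / (1.0 + np.exp(-(abs(s) - theta) / width))

        x, u = 0.0, 0.0              # stimulus (deviation from a goal) and control
        xs = np.empty(steps)
        for k in range(steps):
            # Stimulus dynamics: driven by noise and counteracted by the control.
            dx = (-u - 0.2 * x) * dt + sigma_x * np.sqrt(dt) * rng.standard_normal()
            # Control dynamics: active (tracks the stimulus) only when omega(x) ~ 1;
            # otherwise it relaxes toward zero, i.e. control is effectively suspended.
            du = (omega(x) * (x - u) - (1 - omega(x)) * u) * dt \
                 + sigma_u * np.sqrt(dt) * rng.standard_normal()
            x, u = x + dx, u + du
            xs[k] = x

        print("stimulus standard deviation over the run:", xs.std())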

  9. Stepin Y.P., Leonov D.G., Papilina T.M., Stepankina O.A.
    System modeling, risks evaluation and optimization of a distributed computer system
    Computer Research and Modeling, 2020, v. 12, no. 6, pp. 1349-1359

    The article deals with the problem of the operational reliability of a distributed system. The system core is an open integration platform that provides interaction between various software packages for modeling gas transportation. Some of them provide access through thin clients using the “software as a service” cloud technology. Mathematical models of operation, data transmission and computing are intended to ensure the operation of an automated dispatching system for oil and gas transportation. The paper presents a system solution based on the theory of Markov random processes and considers the stage of stable operation. The stationary operation mode of the Markov chain with continuous time and discrete states is described by a system of Chapman–Kolmogorov equations with respect to the average numbers (mathematical expectations) of objects in certain states. The objects of research are both system elements that are present in large numbers – thin clients and computing modules – and individual ones – a server and a network manager (message broker). Together, they form interacting Markov random processes. The interaction is determined by the fact that the transition probabilities in one group of elements depend on the average numbers of elements in other groups.

    The authors propose a multi-criteria dispersion model of risk assessment for such systems (both in the broad and in the narrow sense, in accordance with the IEC standard). The risk is the standard deviation of the estimated object parameter from its average value. The dispersion risk model makes it possible to define optimality criteria and the risks of the functioning of the whole system. In particular, for a thin client the following are calculated: the risk of lost profit, the total risk of losses due to non-productive states of the element, and the total risk of losses over all system states.

    Finally the paper proposes compromise schemes for solving the multi-criteria problem of choosing the optimal operation strategy based on the selected set of compromise criteria.
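
    As an illustration of the two ingredients described above (not the authors' concrete system), the sketch below builds a small continuous-time Markov chain for a single element with hypothetical states, solves the stationary balance (Chapman–Kolmogorov) equations, and computes a dispersion-type risk as the standard deviation of a per-state loss around its mean; the transition-rate matrix and loss values are invented for the example.

        import numpy as np

        # Hypothetical states of a single "thin client":
        # 0 = idle, 1 = computing, 2 = waiting for the server, 3 = failed.
        # Generator (transition-rate) matrix Q: off-diagonal entries are rates,
        # rows sum to zero.
        Q = np.array([
            [-1.0,  0.8,  0.1,  0.1],
            [ 2.0, -2.5,  0.4,  0.1],
            [ 1.5,  0.5, -2.2,  0.2],
            [ 0.5,  0.0,  0.0, -0.5],
        ])

        # Stationary distribution pi solves pi Q = 0 together with sum(pi) = 1.
        A = np.vstack([Q.T, np.ones(len(Q))])
        b = np.concatenate([np.zeros(len(Q)), [1.0]])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)

        # Dispersion risk: standard deviation of a per-state loss around its mean
        # (hypothetical loss of productive time per state).
        loss = np.array([0.2, 0.0, 0.5, 1.0])
        mean_loss = pi @ loss
        risk = np.sqrt(pi @ (loss - mean_loss) ** 2)

        print("stationary distribution:", np.round(pi, 3))
        print("expected loss:", round(mean_loss, 3), " dispersion risk:", round(risk, 3))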

  10. Kirilyuk I.L., Volynsky A.I., Kruglova M.S., Kuznetsova A.V., Rubinstein A.A., Sen'ko O.V.
    Empirical testing of institutional matrices theory by data mining
    Computer Research and Modeling, 2015, v. 7, no. 4, pp. 923-939

    The paper aims to identify a set of parameters of the environment and infrastructure with the most significant impact on the institutional matrices that dominate in different countries. The parameters of environmental conditions include raw statistical indices derived directly from open-access databases, as well as complex integral indicators obtained by the method of principal components. The efficiency of the discussed parameters in the task of recognizing the dominant institutional matrix type (X or Y) was evaluated by a number of methods based on machine learning. It was revealed that the greatest informational content is associated with parameters characterizing the risk of natural disasters, the level of urbanization and the development of transport infrastructure, and the monthly averages and seasonal variations of temperature and precipitation.
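
    A schematic version of such a pipeline on synthetic data (the paper's datasets, indicators and exact methods are not reproduced here) is shown below: country-level indicators are standardized, compressed by principal components, and passed to a classifier of the dominant matrix type (X vs. Y) evaluated by cross-validation.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        n_countries, n_indicators = 120, 12   # hypothetical sizes

        # Synthetic "environment and infrastructure" indicators and X/Y labels whose
        # dependence on the first few indicators is built in for illustration only.
        X = rng.standard_normal((n_countries, n_indicators))
        signal = X[:, 0] + 0.5 * X[:, 1] - 0.5 * X[:, 2]
        y = (signal + 0.3 * rng.standard_normal(n_countries) > 0).astype(int)

        model = make_pipeline(StandardScaler(), PCA(n_components=5),
                              LogisticRegression(max_iter=1000))
        scores = cross_val_score(model, X, y, cv=5)
        print("cross-validated accuracy:", scores.round(2), "mean =", round(scores.mean(), 2))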

    Views (last year): 7. Citations: 13 (RSCI).