Search results for 'distributions':
Articles found: 250
  1. Fialko N.S., Olshevets M.M., Lakhno V.D.
    Numerical study of the Holstein model in different thermostats
    Computer Research and Modeling, 2024, v. 16, no. 2, pp. 489-502

    Based on the Holstein Hamiltonian, the dynamics of a charge introduced into a molecular chain of sites was modeled at different temperatures. In the calculations, the temperature of the chain is set by the initial data: random Gaussian distributions of velocities and site displacements. Various options for the initial charge density distribution are considered. Long-term calculations show that the system moves to fluctuations near a new equilibrium state. For the same initial velocities and displacements, the average kinetic energy, and accordingly the temperature T of the chain, varies depending on the initial distribution of the charge density: it decreases when a polaron is introduced into the chain, or increases if at the initial moment the electronic part of the energy is maximal. A comparison is made with results obtained previously in the model with a Langevin thermostat. In both cases, the existence of a polaron is determined by the thermal energy of the entire chain.

    According to the simulation results, the transition from the polaron mode to the delocalized state occurs, for both thermostat options, in the same range of values of the thermal energy $\sim NT$ of a chain of $N$ sites, with one adjustment: for the Hamiltonian system the temperature does not correspond to the initially set one but is determined after long-term calculations from the average kinetic energy of the chain.

    In the polaron region, the use of different methods for simulating temperature leads to a number of significant differences in the dynamics of the system. In the region of the delocalized charge state, at high temperatures, the results averaged over a set of trajectories in a system with a random force and the results averaged over time for a Hamiltonian system are close, which does not contradict the ergodic hypothesis. From a practical point of view, at temperatures around T ≈ 300 K, either option for setting the thermostat can be used when simulating charge transfer in homogeneous chains.
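The procedure of reading the chain temperature off the average kinetic energy can be illustrated with a minimal sketch, assuming units with k_B = 1 and unit site masses (the function name is ours, not the authors'):

```python
import random
import statistics

def chain_temperature(velocities, mass=1.0, k_b=1.0):
    # Equipartition: <m v^2 / 2> = k_B T / 2 per site, so T = m <v^2> / k_B.
    return mass * statistics.fmean(v * v for v in velocities) / k_b

# Set a chain of N sites to a target temperature T by drawing Gaussian
# velocities with variance k_B T / m, as in the initial data described above.
random.seed(0)
N, T = 10_000, 0.5
v = [random.gauss(0.0, T ** 0.5) for _ in range(N)]
print(round(chain_temperature(v), 2))  # close to 0.5 for large N
```

For the Hamiltonian system, this estimate is taken after long-term evolution rather than from the initial data, which is why the measured temperature drifts away from the initially set one.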

  2. Kazaryan A.M., Shapoval A.B.
    Time clustering of stock indices’ big falls
    Computer Research and Modeling, 2012, v. 4, no. 3, pp. 631-638

    The paper estimates the recurrence rate of the stock indices S&P100, CAC40, DAX, FTSE, AMEX, ATX, NASDAQ, and BEL20. The introduced quantitative measure of the recurrence rate is based on type I and type II errors. We show that more than three quarters of the indices’ falls occur on average during the first quarter of the time between them. This result extends from sufficiently large falls, which are observed on average twice a year, to smaller falls, which occur approximately once every 1.5–2 months.
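One simple way such a clustering measure can be sketched is via inter-fall intervals; the specific definition of `clustering_fraction` below is our illustrative assumption, not the authors' estimator:

```python
def clustering_fraction(fall_times):
    # Fraction of inter-fall intervals shorter than a quarter of the mean
    # interval; values well above the memoryless baseline indicate clustering.
    gaps = [b - a for a, b in zip(fall_times, fall_times[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return sum(g < mean_gap / 4 for g in gaps) / len(gaps)

# Clustered falls: bursts of daily falls separated by long quiet stretches.
clustered = [0, 1, 2, 3, 100, 101, 102, 103, 200]
evenly_spaced = [0, 25, 50, 75, 100]
print(clustering_fraction(clustered), clustering_fraction(evenly_spaced))  # 0.75 0.0
```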

  3. Nikulin V.N., Odintsova A.S.
    Statistically fair price for the European call options according to the discrete mean/variance model
    Computer Research and Modeling, 2014, v. 6, no. 5, pp. 861-874

    We consider a portfolio with a call option and the corresponding underlying asset under the standard assumption that the stock-market price is a random variable with a lognormal distribution. Minimizing the variance (hedging risk) of the portfolio on the maturity date of the call option, we find the fraction of the asset per unit call option. As a direct consequence we derive the statistically fair lookback call option price in explicit form. In contrast to the famous Black–Scholes theory, no portfolio can be regarded as risk-free, because no additional transactions are supposed to be conducted over the life of the contract; however, a sequence of independent portfolios reduces risk to zero asymptotically. This property is illustrated in the experimental section using a dataset of daily stock prices of 37 leading US-based companies for the period from April 2006 to January 2013.
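The variance-minimizing asset fraction can be sketched by Monte Carlo under the lognormal assumption. This is a simplified illustration (the paper derives the answer in explicit form; the drift-free parameterization here is our assumption):

```python
import math
import random

def min_variance_hedge(strike, s0=100.0, sigma=0.2, n=100_000, seed=1):
    # Monte-Carlo estimate of the variance-minimizing asset fraction per
    # call option: h = Cov(payoff, S_T) / Var(S_T) for lognormal S_T.
    rng = random.Random(seed)
    s_t = [s0 * math.exp(sigma * rng.gauss(0.0, 1.0) - sigma ** 2 / 2)
           for _ in range(n)]
    payoff = [max(s - strike, 0.0) for s in s_t]
    mean_s = sum(s_t) / n
    mean_p = sum(payoff) / n
    cov = sum((p - mean_p) * (s - mean_s) for p, s in zip(payoff, s_t))
    var = sum((s - mean_s) ** 2 for s in s_t)
    return cov / var

h_atm = min_variance_hedge(strike=100.0)  # at the money: a partial hedge
h_itm = min_variance_hedge(strike=1.0)    # deep in the money: a full hedge
print(round(h_atm, 2), round(h_itm, 2))
```

The deep in-the-money case recovers a hedge ratio of 1, since the payoff is then an affine function of the asset price.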

  4. Zenkov A.V.
    Deviation from Benford’s law and identification of author peculiarities in texts
    Computer Research and Modeling, 2015, v. 7, no. 1, pp. 197-201

    The distribution of the first significant digit in the numerals of connected texts is considered. Benford's law is found to hold approximately for them. Deviations from Benford's law are statistically significant author peculiarities that make it possible, under certain conditions, to distinguish between parts of a text with different authorship.
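First-digit statistics against Benford's law can be sketched as follows; the squared-deviation measure is our illustrative choice, not the paper's test statistic:

```python
import math
from collections import Counter

def first_digit(x):
    # First significant digit of a positive number, e.g. 0.002 -> 2.
    return int(10 ** (math.log10(x) % 1))

def benford_deviation(numbers):
    # Sum of squared differences between the empirical first-digit
    # frequencies and Benford's law P(d) = log10(1 + 1/d).
    counts = Counter(first_digit(x) for x in numbers if x > 0)
    n = sum(counts.values())
    return sum((counts[d] / n - math.log10(1 + 1 / d)) ** 2 for d in range(1, 10))

# Powers of 2 are a textbook Benford-conforming sequence.
powers = [2.0 ** k for k in range(1, 1000)]
print(round(benford_deviation(powers), 5))  # close to 0
```

An author-attribution use would compare such deviation scores between candidate text parts rather than against zero.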

  5. Shpitonkov M.I.
    Application of correlation adaptometry technique to sports and biomedical research
    Computer Research and Modeling, 2017, v. 9, no. 2, pp. 345-354

    The paper outlines approaches to the mathematical modeling of the correlation adaptometry technique, widely used in biology and medicine. The analysis is based on models employed in descriptions of structured biological systems. It is assumed that the distribution density of the biological population numbers satisfies the Kolmogorov–Fokker–Planck equation. The technique was used to evaluate the effectiveness of treating patients with obesity. Depending on the degree of obesity and the nature of comorbidity, the patients were divided into three groups. A decrease in the weight of the correlation graph, computed from the indicators measured in the patients, characterizes the effectiveness of the treatment for all studied groups. The technique was also used to assess the intensity of training loads in academic rowing for three age groups; it was shown that the athletes of the youth group worked under the highest strain. Finally, correlation adaptometry was used to evaluate the effectiveness of hormone replacement therapy in women. Depending on the assigned drug, the patients were divided into four groups. A standard analysis of the dynamics of the mean values of the indicators showed that the averages normalized over the course of the treatment for all groups of patients. However, correlation adaptometry revealed that during the first six months the weight of the correlation graph decreased, while during the second six months it increased for all studied groups. This indicates that the annual course of hormone replacement therapy is excessively long and that a transition to a semiannual course is practical.
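A common formalization of correlation adaptometry measures the weight of the correlation graph as a sum of sufficiently strong pairwise correlations between indicators; in the sketch below, the threshold and the weight definition are illustrative assumptions:

```python
from itertools import combinations

def pearson(a, b):
    # Pearson correlation coefficient of two equal-length samples.
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((p - ma) * (q - mb) for p, q in zip(a, b))
    va = sum((p - ma) ** 2 for p in a)
    vb = sum((q - mb) ** 2 for q in b)
    return cov / (va * vb) ** 0.5

def correlation_graph_weight(indicators, threshold=0.5):
    # Weight of the correlation graph: the sum of |r| over all pairs of
    # indicators whose correlation magnitude exceeds the threshold.
    return sum(abs(pearson(a, b))
               for a, b in combinations(indicators, 2)
               if abs(pearson(a, b)) > threshold)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
tight = [x, [2.0 * v for v in x], [v + 1.0 for v in x]]        # strongly coupled
loose = [x, [3.0, 1.0, 4.0, 1.0, 5.0], [2.0, 7.0, 1.0, 8.0, 2.0]]
print(correlation_graph_weight(tight) > correlation_graph_weight(loose))  # True
```

A falling graph weight over a course of treatment then corresponds to weakening coupling between the indicators, the pattern interpreted above as effective treatment.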

  6. Mitin N.A., Orlov Y.N.
    Statistical analysis of bigrams of specialized texts
    Computer Research and Modeling, 2020, v. 12, no. 1, pp. 243-254

    The method of stochastic matrix spectrum analysis is used to build an indicator that determines the subject of scientific texts without using keywords. The matrix in question is the matrix of conditional probabilities of bigrams, built from the statistics of alphabet characters in the text with spaces, numbers and punctuation marks removed. Scientific texts are classified according to the mutual arrangement of invariant subspaces of the matrix of conditional probabilities of pairs of letter combinations. The separation indicator is the cosine of the angle between the right and left eigenvectors corresponding to the maximum and minimum eigenvalues. The computational algorithm uses a special representation of the dichotomy parameter: the integral of the squared norm of the resolvent of the stochastic bigram matrix along a circle of given radius in the complex plane. The tendency of the integral to infinity indicates that the integration contour approaches an eigenvalue of the matrix. The paper presents the typical distribution of the specialty identification indicator. For the statistical analysis, dissertations in 19 main specialties were examined, 20 texts per specialty, without taking into account the classification within a specialty. It was found that the empirical distributions of the cosine of the angle for the mathematical and humanities specialties do not have a common domain, so they can be formally separated by the value of this indicator without errors. Although the corpus of texts was not particularly large, an identification error at the level of 2% for an arbitrary selection of dissertations is a very good result compared to methods based on semantic analysis. It was also found that a text pattern can be built for each specialty in the form of a reference bigram matrix, in whose vicinity (in the norm of summable functions) the theme of a written scientific work can be accurately identified without using keywords. The proposed method can be used as a comparative indicator of the greater or lesser rigor of a scientific text, or as an indicator of the text's compliance with a certain scientific level.
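A stripped-down sketch of the bigram matrix construction follows. For the eigenvalue λ = 1 of a row-stochastic matrix, the right eigenvector is the all-ones vector and the left one is the stationary distribution; the paper's indicator involves the maximum and minimum eigenvalues, which this simplified sketch does not reproduce:

```python
from collections import Counter, defaultdict
from math import sqrt

def bigram_matrix(text):
    # Row-stochastic matrix of conditional bigram probabilities P(b | a),
    # built from letters only: spaces, digits and punctuation are dropped.
    letters = [c for c in text.lower() if c.isalpha()]
    pairs = Counter(zip(letters, letters[1:]))
    alphabet = sorted(set(letters))
    outgoing = defaultdict(int)
    for (a, _), n in pairs.items():
        outgoing[a] += n
    uniform = 1.0 / len(alphabet)
    return alphabet, [[pairs[(a, b)] / outgoing[a] if outgoing[a] else uniform
                       for b in alphabet] for a in alphabet]

def stationary(p, iters=500):
    # Left eigenvector for eigenvalue 1 (stationary distribution), by power iteration.
    n = len(p)
    v = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(v[i] * p[i][j] for i in range(n)) for j in range(n)]
    return v

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

alphabet, p = bigram_matrix("statistical analysis of bigrams of specialized texts")
pi = stationary(p)
ones = [1.0] * len(alphabet)  # right eigenvector of a stochastic matrix for eigenvalue 1
print(round(cosine(pi, ones), 3))
```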

  7. Lobanov A.I., Mirov F.Kh.
    On the use of difference schemes for the transport equation with a drain in grid modeling
    Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1149-1164

    Modern power transportation systems are complex engineering systems. They include both point facilities (power producers, consumers, transformer substations, etc.) and distributed elements (e.g., power lines). When creating mathematical models, such structures are represented as graphs with different types of nodes. To study dynamic effects in such systems, it is necessary to solve a system of partial differential equations of hyperbolic type.

    The work uses an approach similar to one applied earlier in modeling related problems. A new variant of the splitting method proposed by the authors was used. Unlike in most known works, the splitting is not carried out according to physical processes (energy transport without dissipation, separately dissipative processes). Instead, we split into transport equations with a drain and the exchange between Riemann invariants. This splitting makes it possible to construct hybrid schemes for the Riemann invariants with a high order of approximation and minimal dissipation error. An example of constructing such a hybrid difference scheme is described for a single-phase power line. The proposed difference scheme is based on an analysis of the properties of schemes in the space of undetermined coefficients.

    Examples of numerical solutions of a model problem using the proposed splitting and difference scheme are given. The results of the numerical calculations show that the difference scheme reproduces the arising regions of large gradients. It is also shown that the difference schemes allow detecting resonances in such systems.
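For the scalar transport equation with a drain, u_t + c u_x = −k u, the equation class named in the title can be illustrated with a first-order upwind scheme. This is a textbook scheme under our own parameter choices, not the authors' hybrid scheme:

```python
def upwind_step(u, c, k, dx, dt):
    # One step of the first-order upwind scheme for u_t + c u_x = -k u
    # with c > 0 and periodic boundaries (u[-1] wraps around).
    nu = c * dt / dx
    return [u[j] - nu * (u[j] - u[j - 1]) - k * dt * u[j]
            for j in range(len(u))]

n, c, k, dx = 100, 1.0, 0.1, 1.0
dt = 0.5 * dx / c                 # CFL number 0.5: monotone with the drain term
u = [1.0 if 10 <= j < 20 else 0.0 for j in range(n)]
mass0 = sum(u)
for _ in range(50):
    u = upwind_step(u, c, k, dx, dt)
print(round(sum(u) / mass0, 5))   # mass drained by the factor (1 - k*dt)**50
```

The scheme stays monotone while 1 − c·dt/dx − k·dt ≥ 0, and the total mass decays by exactly (1 − k·dt) per step, mirroring the drain term.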

  8. Shinyaeva T.S.
    Activity dynamics in virtual networks: an epidemic model vs an excitable medium model
    Computer Research and Modeling, 2020, v. 12, no. 6, pp. 1485-1499

    Epidemic models are widely used to mimic social activity, such as the spreading of rumors or panic. At the same time, models of excitable media are traditionally used to simulate the propagation of activity. The spreading of activity in a virtual community was simulated within two models: the SIRS epidemic model and the Wiener–Rosenblut model of excitable media. We used network versions of these models. The network was assumed to be heterogeneous: each element of the network has an individual set of characteristics, which corresponds to different psychological types of community members. The structure of the virtual network relies on an appropriate scale-free network. Modeling was carried out on scale-free networks with various values of the average degree of vertices. Additionally, a special case was considered: a complete graph, corresponding to a close professional group in which each member interacts with every other member. Participants in a virtual community can be in one of three states: 1) potential readiness to accept certain information; 2) active interest in this information; 3) complete indifference to this information. These states correspond to those usually used in epidemic models: 1) susceptible to infection; 2) infected; 3) refractory (immune or dead due to disease). A comparison of the two models showed their similarity both at the level of basic assumptions and at the level of possible modes. The spread of activity over the network is similar to the spread of infectious diseases. It is shown that activity in virtual networks may exhibit oscillations or decay.
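A network SIRS update of the kind described here can be sketched as follows; the parameter values and the synchronous-update rule are illustrative assumptions:

```python
import random

# States: 0 = susceptible, 1 = infected (active), 2 = refractory.
def sirs_step(state, neighbors, p_infect, p_recover, p_wane, rng):
    # One synchronous SIRS update: each susceptible node is infected
    # independently through each of its infected neighbors with p_infect.
    new = state[:]
    for v, s in enumerate(state):
        if s == 0:
            k = sum(state[u] == 1 for u in neighbors[v])
            if k and rng.random() < 1.0 - (1.0 - p_infect) ** k:
                new[v] = 1
        elif s == 1 and rng.random() < p_recover:
            new[v] = 2
        elif s == 2 and rng.random() < p_wane:
            new[v] = 0
    return new

# Complete graph of 200 nodes: the 'close professional group' special case.
rng = random.Random(42)
n = 200
neighbors = [[u for u in range(n) if u != v] for v in range(n)]
state = [1 if v < 5 else 0 for v in range(n)]
for _ in range(30):
    state = sirs_step(state, neighbors, 0.02, 0.3, 0.05, rng)
print(state.count(0), state.count(1), state.count(2))
```

On a scale-free network, `neighbors` would instead be built from a preferential-attachment graph; the update rule is unchanged.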

  9. Voronina M.Y., Orlov Y.N.
    Identification of the author of the text by segmentation method
    Computer Research and Modeling, 2022, v. 14, no. 5, pp. 1199-1210

    The paper describes a method for recognizing the authors of literary texts by the proximity of the fragments into which a text is divided to the author's standard. The standard is the empirical frequency distribution of letter combinations, built on a training sample that included expertly selected, reliably attributed works of the author. A set of standards of different authors forms a library, within which the problem of identifying the author of an unknown text is solved. The proximity between texts is understood in the sense of the L1 norm for the frequency vector of letter combinations, which is constructed for each fragment and for the text as a whole. An unknown text is attributed to the author whose standard is most often chosen as the closest for the set of fragments into which the text is divided. The fragment length is optimized based on the principle of the maximum difference in distances from fragments to standards in the «friend-or-foe» recognition problem. The method was tested on a corpus of domestic and foreign (translated) authors: 1783 texts of 100 authors with a total volume of about 700 million characters were collected. In order to exclude bias in the selection of authors, authors whose surnames begin with the same letter were considered. In particular, for the letter L, the identification error was 12%. Along with fairly high accuracy, the method has another important property: it allows one to estimate the probability that the standard of the author of the text in question is missing from the library. This probability can be estimated from the statistics of the nearest standards for small fragments of text. The paper also examines statistical digital portraits of writers: joint empirical distributions of the probability that a certain proportion of the text is identified at a given level of trust. The practical importance of these statistics is that the supports of the corresponding distributions hardly overlap for one's own and others' standards, which makes it possible to recognize the reference distribution of letter combinations at a high level of confidence.
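The fragment-to-standard matching in the L1 norm can be sketched as follows; the texts and helper names are toy assumptions, and real standards would be bigram distributions over multi-megabyte training samples:

```python
from collections import Counter

def bigram_freq(text):
    # Normalized letter-bigram frequency vector of a text.
    letters = [c for c in text.lower() if c.isalpha()]
    pairs = Counter(zip(letters, letters[1:]))
    total = sum(pairs.values())
    return {bg: n / total for bg, n in pairs.items()}

def l1_distance(f, g):
    # L1 norm between two bigram frequency vectors.
    return sum(abs(f.get(k, 0.0) - g.get(k, 0.0)) for k in set(f) | set(g))

def identify(fragment, standards):
    # The closest author standard, in the L1 sense, for a single fragment.
    freq = bigram_freq(fragment)
    return min(standards, key=lambda name: l1_distance(freq, standards[name]))

standards = {"A": bigram_freq("the quick brown fox jumps over the lazy dog " * 20),
             "B": bigram_freq("pack my box with five dozen liquor jugs " * 20)}
print(identify("the quick brown fox", standards))  # → A
```

The full method would run `identify` over all fragments of a text and attribute the text to the author chosen most often.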

  10. Chen J., Lobanov A.V., Rogozin A.V.
    Nonsmooth Distributed Min-Max Optimization Using the Smoothing Technique
    Computer Research and Modeling, 2023, v. 15, no. 2, pp. 469-480

    Distributed saddle point problems (SPPs) have numerous applications in optimization, matrix games and machine learning. For example, the training of generative adversarial networks is represented as a min-max optimization problem, and training regularized linear models can be reformulated as an SPP as well. This paper studies distributed nonsmooth SPPs with Lipschitz-continuous objective functions. The objective function is represented as a sum of several components that are distributed between groups of computational nodes. The nodes, or agents, exchange information through some communication network that may be centralized or decentralized. A centralized network has a universal information aggregator (a server, or master node) that directly communicates with each of the agents and therefore can coordinate the optimization process. In a decentralized network, all the nodes are equal, the server node is not present, and each agent only communicates with its immediate neighbors.

    We assume that each of the nodes locally holds its objective and can compute its value at given points, i.e., has access to a zero-order oracle. Zero-order information is used when the gradient of the function is costly or impossible to compute, or when the function is not differentiable. For example, in reinforcement learning one needs to generate a trajectory to evaluate the current policy. This policy evaluation process can be interpreted as the computation of the function value. We propose an approach that uses a smoothing technique, i.e., applies a first-order method to a smoothed version of the initial function. It can be shown that the stochastic gradient of the smoothed function can be viewed as a random two-point gradient approximation of the initial function. Smoothing approaches have been studied for distributed zero-order minimization, and our paper generalizes the smoothing technique to SPPs.
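The two-point gradient approximation of a smoothed function can be sketched for a single nonsmooth objective; the stepsizes and the test function are illustrative assumptions, and the distributed min-max setting of the paper is omitted:

```python
import math
import random

def two_point_grad(f, x, tau, rng):
    # Random two-point gradient approximation at x with smoothing radius tau:
    # a stochastic gradient of the smoothed function f_tau(x) = E[f(x + tau*e)]
    # over random directions e on the unit sphere.
    d = len(x)
    e = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(c * c for c in e))
    e = [c / norm for c in e]
    scale = d * (f([a + tau * b for a, b in zip(x, e)])
                 - f([a - tau * b for a, b in zip(x, e)])) / (2 * tau)
    return [scale * c for c in e]

# Zero-order subgradient descent on the nonsmooth function f(x) = ||x||_1.
f = lambda x: sum(abs(c) for c in x)
rng = random.Random(0)
x = [1.0, -2.0, 3.0]
for _ in range(2000):
    g = two_point_grad(f, x, tau=1e-3, rng=rng)
    x = [a - 0.01 * b for a, b in zip(x, g)]
print(round(f(x), 2))  # far below the starting value f = 6.0
```

Only function values are queried, which is exactly the zero-order oracle model described above; the estimator is an unbiased gradient of the smoothed surrogate, not of f itself.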


Indexed in Scopus

Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU

The journal is included in the Russian Science Citation Index

