Search results for 'stochastic data processing':
Articles found: 5
  1. The paper studies the properties of the Rice statistical distribution that make it efficient for high-precision phase measurements in optics. A rigorous mathematical proof of the stability of the Rician distribution is given using the example of a differential signal: it is proved that the sum or the difference of two Rician signals also obeys the Rice distribution, and formulas are obtained for the parameters of the Rice distribution of the resulting sum or difference signal. Based on the proven stability of the Rice distribution, a new technique for high-precision measurement of the phase shift between two quasi-harmonic signals is developed. The technique rests on statistical analysis of sampled amplitude data for both signals and for a third signal equal to the difference of the two signals being compared in phase. The sought phase shift is then calculated from geometric considerations as an angle of a triangle whose sides equal the three signal amplitudes reconstructed against the noise background (see the sketch below). Because the proposed differential-signal technique relies on amplitude measurements only, it substantially relaxes the requirements on the equipment and simplifies practical implementation. The paper provides both a rigorous mathematical substantiation of the new phase-shift measurement technique and the results of its numerical testing. The elaborated method of high-precision phase measurement can be applied to a wide range of problems in science and technology, in particular in distance measurement, communication systems, navigation, etc.
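    A rough illustration of the amplitude-only idea described in this abstract, not code from the paper: the sketch below generates Rician amplitude samples for two noisy quasi-harmonic signals and their difference, reconstructs the underlying amplitudes by fitting a Rice distribution (scipy.stats.rice is an assumption here, not the authors' estimator), and recovers the phase shift as the triangle angle given by the law of cosines, A_d^2 = A_1^2 + A_2^2 - 2*A_1*A_2*cos(dphi).

```python
# Hedged sketch of phase-shift recovery from amplitude measurements only.
# All signal parameters are illustrative; the Rice fit stands in for whatever
# amplitude-reconstruction procedure the paper actually uses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
A1, A2, dphi_true, sigma, n = 1.0, 0.8, 0.6, 0.05, 10_000

# Complex phasors with additive Gaussian noise; their moduli are Rician.
z1 = A1 + sigma * (rng.normal(size=n) + 1j * rng.normal(size=n))
z2 = A2 * np.exp(1j * dphi_true) + sigma * (rng.normal(size=n) + 1j * rng.normal(size=n))
samples = {"s1": np.abs(z1), "s2": np.abs(z2), "diff": np.abs(z1 - z2)}

# Reconstruct each noise-free amplitude as the Rice location parameter nu = b * scale.
amp = {}
for name, a in samples.items():
    b, loc, scale = stats.rice.fit(a, floc=0)
    amp[name] = b * scale

# Law of cosines: the sought phase shift is the angle opposite the "difference" side.
cos_dphi = (amp["s1"] ** 2 + amp["s2"] ** 2 - amp["diff"] ** 2) / (2 * amp["s1"] * amp["s2"])
dphi_est = np.arccos(np.clip(cos_dphi, -1.0, 1.0))
print(f"true phase shift: {dphi_true:.3f} rad, estimated: {dphi_est:.3f} rad")
```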

  2. Krat Y.G., Potapov I.I.
    Bottom stability in closed conduits
    Computer Research and Modeling, 2015, v. 7, no. 5, pp. 1061-1068

    In this paper, on the basis of the previously proposed riverbed model, the one-dimensional stability problem for a closed flow channel with a sandy bed is solved. A feature of the problem is the use of an original equation of riverbed deformations, which accounts for the mechanical and granulometric characteristics of the bed material and for the bed slope in the riverbed analysis. Another feature is that, along with the shear stress, the influence of the normal stress is taken into account when investigating the riverbed instability. From the solution of the sandy-bed stability problem for a closed flow channel, an analytical dependence is obtained that determines the wavelength of the fastest-growing bed perturbations. An analysis of this dependence shows that it generalizes a number of well-known empirical formulas: those of Coleman, Shulyak, and Bagnold. The structure of the dependence indicates the existence of two hydrodynamic regimes, characterized by the Froude number, in which the growth of bed perturbations depends either strongly or weakly on the Froude number. Given the natural stochasticity of the bed-wave motion and the existence of a solution domain with weak dependence on the Froude number, it can be concluded that experimental observation of the development of bed-wave motion should yield data with significant scatter, and this is what occurs in reality.

    Views (last year): 1. Citations: 2 (RSCI).
  3. Gorshenin A.K., Korolev V.Y., Malakhov D.V., Skvortsova N.N.
    On the investigation of plasma turbulence by the analysis of the spectra
    Computer Research and Modeling, 2012, v. 4, no. 4, pp. 793-802

    The article presents examples of analyzing the spectra of experimental data in order to identify the typical structures of the processes that form plasma turbulence. The method is based on an original algorithm close to the one-sample bootstrap. The base model describing the fine structure of the stochastic processes is a finite local-scale normal mixture. The statistical (maximum likelihood) estimates are found with the well-known EM algorithm. The efficiency of the proposed technique is demonstrated on a number of spectra obtained in different regimes of low-frequency plasma turbulence (a generic illustration of the mixture-fitting step is sketched after this entry).

    Views (last year): 2. Citations: 4 (RSCI).
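    A minimal sketch of the mixture-fitting step only, under strong assumptions: a generic finite normal mixture is fitted by the EM algorithm (scikit-learn's GaussianMixture) to synthetic data; the authors' specific local-scale mixture model and bootstrap-like resampling procedure are not reproduced.

```python
# Hedged illustration: EM fitting of a finite normal mixture to toy "spectral" data,
# with the number of components chosen by BIC. Data and model choices are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Toy data: two latent components with different scales, mimicking a fine structure.
data = np.concatenate([rng.normal(0.0, 0.3, 4000), rng.normal(0.0, 1.5, 1000)])[:, None]

best = None
for k in range(1, 5):                       # pick the number of components by BIC
    gm = GaussianMixture(n_components=k, random_state=0).fit(data)
    if best is None or gm.bic(data) < best.bic(data):
        best = gm

print("components:", best.n_components)
print("weights:", np.round(best.weights_, 3))
print("means:  ", np.round(best.means_.ravel(), 3))
print("sigmas: ", np.round(np.sqrt(best.covariances_).ravel(), 3))
```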
  4. Orlova E.V.
    Model for operational optimal control of financial resources distribution in a company
    Computer Research and Modeling, 2019, v. 11, no. 2, pp. 343-358

    The article presents a critical analysis of existing approaches, methods, and models for the operational management of financial resources. A number of significant shortcomings of these models are identified that limit the scope of their effective use: the models are static, the probabilistic nature of financial flows is not taken into account, and the daily amounts of receivables and payables, which significantly affect the solvency and liquidity of a company, are not identified. This necessitates the development of a new model that reflects the essential properties of the financial flow planning system: stochasticity, dynamism, and non-stationarity.

    A model for the distribution of financial flows has been developed. It is based on the principles of optimal dynamic control and provides financial-resource planning that ensures an adequate level of liquidity and solvency of a company while accounting for the uncertainty of the initial data. An algorithm for designing the target cash balance is proposed, based on the principle of ensuring a company's financial stability under changing financial constraints.

    A characteristic feature of the proposed model is the representation of the cash distribution process as a discrete dynamic process for which a financial-resource allocation plan is determined that delivers the extremum of an optimality criterion. The plan is designed by coordinating payments (cash expenses) with cash receipts. This approach makes it possible to synthesize different plans that differ in combinations of financial outflows and then to select the best one according to a given criterion. The optimality criterion is the minimum total cost associated with fines for untimely financing of expenses. The model's constraints are the requirement to maintain the minimum allowable cash balance in each subperiod of the planning period and the obligation to make payments within the planning period with due account of their maturity dates; a simplified formulation is sketched after this entry. The suggested model efficiently solves the problem of distributing financial resources under uncertainty in the timing and amounts of receipts, coordinating cash inflows and outflows. The practical significance of the research lies in applying the developed model to improve the quality of financial planning and to increase the management and operational efficiency of a company.

    Views (last year): 33.
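    The sketch below poses a toy version of the payment-scheduling problem from this abstract as a linear program rather than the authors' dynamic-control formulation: payments may be split across subperiods, the objective is the total fine for paying after the due date, and the cash balance must stay above a minimum in every subperiod. All numbers and the LP formulation itself are illustrative assumptions.

```python
# Hedged sketch: distributing payments over subperiods to minimize total lateness fines
# subject to a minimum cash balance, solved as a small linear program with scipy.
import numpy as np
from scipy.optimize import linprog

T = 5                                            # planning subperiods
receipts = np.array([40.0, 10.0, 15.0, 10.0, 45.0])
payments = np.array([30.0, 25.0, 40.0])          # amounts to be paid
due      = np.array([0, 2, 3])                   # due subperiod of each payment
fine     = np.array([0.05, 0.03, 0.04])          # fine per unit per subperiod of delay
B0, B_min = 15.0, 5.0                            # opening balance, minimum balance

J = len(payments)
nvar = J * T                                     # x[j, t] = share of payment j paid in t
idx = lambda j, t: j * T + t

# Objective: total fines for late payments.
c = np.zeros(nvar)
for j in range(J):
    for t in range(T):
        c[idx(j, t)] = fine[j] * payments[j] * max(0, t - due[j])

# Equality constraints: each payment is fully paid within the horizon.
A_eq = np.zeros((J, nvar)); b_eq = np.ones(J)
for j in range(J):
    A_eq[j, j * T:(j + 1) * T] = 1.0

# Inequality constraints: running balance stays above B_min in every subperiod.
A_ub = np.zeros((T, nvar)); b_ub = np.zeros(T)
for t in range(T):
    for j in range(J):
        for s in range(t + 1):
            A_ub[t, idx(j, s)] = payments[j]     # cumulative outflows up to subperiod t
    b_ub[t] = B0 + receipts[:t + 1].sum() - B_min

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
print("total fines:", round(res.fun, 3))
print("payment plan (rows: payments, cols: subperiods):")
print(np.round(res.x.reshape(J, T), 2))
```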
  5. Zavodskikh R.K., Efanov N.N.
    Performance prediction for chosen types of loops over one-dimensional arrays with embedding-driven intermediate representations analysis
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 211-224

    The paper considers a method for mapping a set of intermediate representations (IR) of C and C++ programs into a vector embedding space in order to build an empirical framework for static performance prediction on top of the LLVM compiler infrastructure. Using embeddings makes programs easier to compare because it avoids direct comparison of control flow graphs (CFG) and data flow graphs (DFG). The method is based on a series of transformations of the initial IR: instrumentation, i.e., injection of artificial instructions in an instrumentation compiler pass depending on the load-offset delta between the current instruction and the previous one; mapping of the instrumented IR into a multidimensional vector with IR2Vec; and dimensionality reduction with the t-SNE (t-distributed stochastic neighbor embedding) method. The D1 cache miss ratio measured with the perf stat tool is used as the performance metric. A heuristic criterion for deciding which programs have a higher or lower cache miss ratio is given; it is based on the 2D embeddings of the programs. The instrumentation compiler pass developed in this work is described: how it generates and injects artificial instructions into the IR within the memory model used. The software pipeline implementing performance estimation on the basis of the LLVM compiler infrastructure is presented. Computational experiments are performed on synthetic tests, i.e., sets of programs with the same CFGs but with different sequences of offsets used when accessing a one-dimensional array of a given size. The correlation coefficient between the performance metric and the distance to the worst program's embedding is measured and shown to be negative regardless of the t-SNE initialization, which confirms the heuristic criterion. The process of generating such synthetic tests is also considered. In addition, the spread of the performance metric over the programs of such a test is proposed as a quantity to be improved by exploring more test generators.
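    A hedged sketch of the downstream statistical step described above: given per-program embedding vectors and measured cache miss ratios (both synthetic stand-ins here, not data from the paper), reduce the embeddings to 2D with t-SNE and correlate the distance to the worst program's point with the miss ratio. The IR2Vec and perf-based measurement stages are not reproduced.

```python
# Hedged illustration of the embedding-space analysis: t-SNE projection followed by a
# correlation between distance-to-worst-program and cache miss ratio. Synthetic data.
import numpy as np
from sklearn.manifold import TSNE
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n_programs, dim = 60, 300
miss_ratio = rng.uniform(0.01, 0.35, n_programs)           # stand-in for perf stat data
# Toy embeddings loosely driven by the miss ratio plus noise, to give t-SNE structure.
embeddings = (np.outer(miss_ratio, rng.normal(size=dim))
              + 0.1 * rng.normal(size=(n_programs, dim)))

points = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(embeddings)

worst = points[np.argmax(miss_ratio)]                       # embedding of the worst program
dist = np.linalg.norm(points - worst, axis=1)
r, p = pearsonr(dist, miss_ratio)
print(f"Pearson correlation (distance vs miss ratio): r = {r:.2f}, p = {p:.1e}")
```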
