Search results for 'correlation':
Articles found: 38
  1. Orel V.R., Tambovtseva R.V., Firsova E.A.
    Effects of the heart contractility and its vascular load on the heart rate in athletes
    Computer Research and Modeling, 2017, v. 9, no. 2, pp. 323-329

    Heart rate (HR) is the most readily measurable cardiovascular indicator. To control the individual response to physical exercise of different load types, heart rate is measured while athletes perform various kinds of muscular work (strength machines, different training and competitive exercises). The magnitude of the heart rate and its dynamics during muscular work and recovery allow an objective judgment of the functional state of an athlete's cardiovascular system, the level of individual physical performance, and the adaptive response to a particular exercise. However, heart rate is not an independent determinant of an athlete's physical condition: HR is formed by the interaction of the basic physiological mechanisms underlying the hemodynamic ejection mode of the heart. Heart rate depends, on the one hand, on the contractility of the heart, the venous return, and the volumes of the atria and ventricles and, on the other hand, on the vascular load of the heart, whose main components are the elastic and peripheral resistances of the arterial system. The values of these arterial resistances depend on the power of muscular work and its duration. HR sensitivity to changes in vascular load and cardiac contractility was determined in athletes by pair regression analysis of simultaneously recorded heart rate data, the peripheral $(R)$ and elastic $(E_a)$ resistances (the vascular load of the heart), and the power $(W)$ of heartbeats (cardiac contractility). The coefficients of sensitivity and pair correlation between heart rate and the indicators of vascular load and left-ventricular contractility were determined in athletes at rest and during muscular work on a cycle ergometer. It is shown that an increase in both ergometer load power and heart rate is accompanied by an increase in the correlation coefficients and in the coefficients of heart rate sensitivity to $R$, $E_a$ and $W$.

    Views (last year): 5. Citations: 1 (RSCI).
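
    The pair regression analysis described in entry 1 reduces, for each load indicator, to a univariate fit of HR against that indicator: the slope is the sensitivity coefficient and the correlation coefficient is the pair correlation. A minimal sketch in Python, with synthetic placeholder arrays standing in for the simultaneously recorded HR, $R$, $E_a$ and $W$ series (names and value ranges are assumptions, not the paper's data):

```python
# Pair regression of HR against each load/contractility indicator.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hr = 60 + 30 * rng.random(200)            # heart rate, bpm (synthetic)
predictors = {
    "R":  1200 + 200 * rng.random(200),   # peripheral resistance (synthetic)
    "Ea": 1.5 + 0.5 * rng.random(200),    # elastic resistance (synthetic)
    "W":  2.0 + 1.0 * rng.random(200),    # heartbeat power (synthetic)
}

for name, x in predictors.items():
    res = stats.linregress(x, hr)
    # res.slope is the sensitivity coefficient dHR/dx,
    # res.rvalue is the pair correlation coefficient
    print(f"HR vs {name}: sensitivity = {res.slope:.4f}, r = {res.rvalue:.3f}")
```
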
  2. Kochetkov A.V., Chvanov A.V.
    Digital modeling geometrical and macrorough parameters of a highway
    Computer Research and Modeling, 2012, v. 4, no. 4, pp. 837-844

    An original representation of a statistical digital model of macroroughness measurement on a local section (up to 15) is offered; the model consists of a deterministic component (bias), correlated components (a standard periodic component and periodic deviations from flatness), and a purely random component (the macroroughness values).

    Views (last year): 1. Citations: 1 (RSCI).
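
    The three-component model of entry 2 (deterministic bias, correlated periodic component, random remainder) can be illustrated by a simple decomposition of a synthetic profile. The trend-plus-dominant-harmonic scheme below is a generic stand-in, not the paper's actual estimator; all numbers are placeholders:

```python
# Decompose a synthetic road profile into trend + periodic + random parts.
import numpy as np

x = np.linspace(0.0, 15.0, 600)                      # section coordinate (units assumed)
profile = 0.02 * x + 0.005 * np.sin(2 * np.pi * x / 3.0) \
          + 0.002 * np.random.default_rng(1).standard_normal(x.size)

# 1) deterministic component: least-squares linear trend (bias)
coeffs = np.polyfit(x, profile, 1)
trend = np.polyval(coeffs, x)

# 2) correlated component: dominant harmonic of the detrended profile
detrended = profile - trend
spectrum = np.fft.rfft(detrended)
k = np.argmax(np.abs(spectrum[1:])) + 1              # skip the zero frequency
periodic = np.fft.irfft(np.where(np.arange(spectrum.size) == k,
                                 spectrum, 0), n=x.size)

# 3) random component: what is left after removing trend and harmonic
residual = detrended - periodic
print(f"std of random component: {residual.std():.5f}")
```
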
  3. Shpitonkov M.I.
    Application of correlation adaptometry technique to sports and biomedical research
    Computer Research and Modeling, 2017, v. 9, no. 2, pp. 345-354

    The paper outlines approaches to mathematical modeling of the correlation adaptometry technique, which is widely used in biology and medicine. The analysis is based on models employed to describe structured biological systems: it is assumed that the distribution density of the biological population numbers satisfies a Kolmogorov-Fokker-Planck equation. Using this technique, the effectiveness of treatment of patients with obesity was evaluated. All patients were divided into three groups depending on the degree of obesity and the nature of comorbidity. A decrease in the weight of the correlation graph, computed from indicators measured in the patients, is shown for all studied groups, which characterizes the effectiveness of the treatment. The technique was also used to assess the intensity of training loads in rowing for three age groups; it was shown that the athletes of the youth group trained under the highest strain. Finally, correlation adaptometry was used to evaluate the effectiveness of hormone replacement therapy in women. All patients were divided into four groups depending on the prescribed drug. A standard analysis of the dynamics of the mean values of the indicators showed that the averages normalized over the course of treatment in all groups. However, correlation adaptometry revealed that the weight of the correlation graph decreased during the first six months of treatment and increased during the second six months in all study groups. This indicates that the annual course of hormone replacement therapy is excessively long and that a transition to a six-month course is practical.

    Views (last year): 10.
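
    Correlation adaptometry summarizes the coupling between indicators by the weight of the correlation graph, i.e. the sum of the pair correlations that exceed a threshold. A minimal sketch on synthetic data; the 0.5 threshold and group sizes are assumptions, since the abstract of entry 3 does not state them:

```python
# Weight of the correlation graph as used in correlation adaptometry.
import numpy as np

def correlation_graph_weight(data: np.ndarray, threshold: float = 0.5) -> float:
    """data: (n_patients, n_indicators) matrix of measured indicators."""
    r = np.corrcoef(data, rowvar=False)
    iu = np.triu_indices_from(r, k=1)          # each pair counted once
    strong = np.abs(r[iu]) >= threshold
    return float(np.abs(r[iu])[strong].sum())

# Synthetic example: a shared factor couples the indicators "before
# treatment"; the coupling (and hence the graph weight) is lower "after".
rng = np.random.default_rng(2)
before = rng.standard_normal((40, 6)) + 2.0 * rng.standard_normal((40, 1))
after = rng.standard_normal((40, 6))
print(correlation_graph_weight(before), correlation_graph_weight(after))
```
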
  4. Aksenov A.A., Kalugina M.D., Lobanov A.I., Kashirin V.S.
    Numerical simulation of fluid flow in a blood pump in the FlowVision software package
    Computer Research and Modeling, 2023, v. 15, no. 4, pp. 1025-1038

    A numerical simulation of fluid flow in a blood pump was performed using the FlowVision software package. This test problem, provided by the Center for Devices and Radiological Health of the U.S. Food and Drug Administration, involved computing the flow for several design modes, each defined by a prescribed liquid flow rate and rotor speed. The necessary data, in the form of the exact geometry, flow conditions, and fluid characteristics, were provided to all research participants, who used different software packages for the modeling. Numerical simulations were performed in FlowVision for six calculation modes with a Newtonian fluid and the standard $k-\varepsilon$ turbulence model; in addition, the fifth mode was computed with the $k-\omega$ SST turbulence model and with the Carreau rheological fluid model. In the first stage of the numerical simulation, mesh convergence was investigated, on the basis of which a final mesh with about 6 million cells was chosen. Because of the large number of cells, part of the calculations was performed on the Lomonosov-2 cluster to accelerate the study. As a result of the numerical simulation, the pressure difference between the pump inlet and outlet and the velocities between the rotor blades and in the diffuser region were obtained and analyzed, and the velocity distribution in selected cross-sections was visualized. For all design modes the numerically obtained pressure difference was compared with the experimental data, and for the fifth calculation mode the velocity distributions between the rotor blades and in the diffuser region were also compared with experiment. The data analysis showed good agreement of the FlowVision results with the experimental results and with numerical simulations in other software packages. The results obtained in FlowVision for the US FDA test suggest that the FlowVision software package can be used to solve a wide range of hemodynamic problems.
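
    The abstract of entry 4 reports a mesh-convergence study but not its criterion. One common way to quantify such convergence is Roache's grid convergence index (GCI), sketched below on placeholder pressure-rise values; none of the numbers come from the paper:

```python
# Grid convergence index (Roache) from three successively refined meshes.
import math

def gci(f_coarse: float, f_fine: float, r: float, p: float, fs: float = 1.25) -> float:
    """Relative GCI on the fine mesh for refinement ratio r and observed order p."""
    return fs * abs((f_coarse - f_fine) / f_fine) / (r**p - 1.0)

dp = [3650.0, 3590.0, 3570.0]      # pump pressure rise, Pa, coarse/medium/fine (placeholders)
r = 2.0 ** (1.0 / 3.0)             # refinement ratio for ~2x cell count per level (assumption)
# observed order of convergence from the three solutions
p = math.log(abs(dp[0] - dp[1]) / abs(dp[1] - dp[2])) / math.log(r)
print(f"observed order p = {p:.2f}, GCI_fine = {gci(dp[1], dp[2], r, p):.2%}")
```
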

  5. Zavodskikh R.K., Efanov N.N.
    Performance prediction for chosen types of loops over one-dimensional arrays with embedding-driven intermediate representations analysis
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 211-224

    A method for mapping a set of intermediate representations (IR) of C and C++ programs to a vector embedding space is considered, in order to create an empirical framework for static performance prediction on top of the LLVM compiler infrastructure. Using embeddings makes programs easier to compare, since direct comparison of Control Flow Graphs (CFG) and Data Flow Graphs (DFG) is avoided. The method is based on a series of transformations of the initial IR: instrumentation, i.e. injection of artificial instructions in an instrumentation compiler pass depending on the load offset delta between the current instruction and the previous one; mapping of the instrumented IR into a multidimensional vector with IR2Vec; and dimension reduction with the t-SNE (t-distributed stochastic neighbor embedding) method. The D1 cache miss ratio measured with the perf stat tool is taken as the performance metric. A heuristic criterion for deciding which programs have a higher or lower cache miss ratio is given; the criterion is based on the 2D embeddings of the programs. The instrumentation compiler pass developed in this work is described: how it generates and injects artificial instructions into the IR within the memory model used. The software pipeline that implements the performance estimation on top of the LLVM compiler infrastructure is presented. Computational experiments are performed on synthetic tests, which are sets of programs with the same CFGs but different sequences of offsets used when accessing a one-dimensional array of a given size. The correlation coefficient between the performance metric and the distance to the worst program's embedding is measured and shown to be negative regardless of the t-SNE initialization, which confirms the heuristic criterion. The process of generating such synthetic tests is also considered. Moreover, the spread of the performance metric over the programs of a test is proposed as a quality measure to be improved by exploring further test generators.
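
    The final step of the pipeline in entry 5, correlating the distance to the worst program's embedding with the D1 miss ratio, can be sketched as follows. The vectors below stand in for IR2Vec output and the miss ratios for perf measurements; neither the instrumentation pass nor perf is run here:

```python
# Correlate distance-to-worst-embedding with a performance metric.
import numpy as np
from scipy.stats import pearsonr
from sklearn.manifold import TSNE

rng = np.random.default_rng(3)
ir_vectors = rng.standard_normal((50, 300))     # stand-in for IR2Vec program vectors
miss_ratio = rng.uniform(0.0, 0.3, size=50)     # stand-in for D1 miss ratios from perf stat

emb = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(ir_vectors)
worst = emb[np.argmax(miss_ratio)]              # embedding of the worst program
dist = np.linalg.norm(emb - worst, axis=1)      # Euclidean distance in 2D embedding space

r, pval = pearsonr(dist, miss_ratio)
print(f"corr(distance to worst, miss ratio) = {r:.3f} (p = {pval:.3g})")
# The paper reports this correlation to be negative regardless of t-SNE
# initialization; with random stand-in data the value here is near zero.
```
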

  6. Shovin V.A.
    Confirmatory factor model of hypertension
    Computer Research and Modeling, 2012, v. 4, no. 4, pp. 885-894

    A new method of constructing an orthogonal factor model, based on the method of correlation pleiades and confirmatory factor analysis, is proposed, together with a new algorithm for confirmatory factor analysis. Using this original method, a factor model of stage I hypertension is built. The correlations and indices of arterial hypertension are analyzed.

    Views (last year): 2. Citations: 7 (RSCI).
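
    The correlation-pleiades step of entry 6 groups indicators into connected components of a thresholded correlation graph, which then serve as candidates for orthogonal factors. A minimal sketch; the 0.7 threshold and the synthetic two-factor data are assumptions:

```python
# Correlation pleiades: connected components of a thresholded |r| graph.
import numpy as np

def correlation_pleiades(data: np.ndarray, threshold: float = 0.7):
    r = np.abs(np.corrcoef(data, rowvar=False))
    n = r.shape[0]
    adj = (r >= threshold) & ~np.eye(n, dtype=bool)
    seen, groups = set(), []
    for start in range(n):
        if start in seen:
            continue
        stack, comp = [start], []
        while stack:                      # depth-first walk of one component
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            comp.append(int(v))
            stack.extend(np.flatnonzero(adj[v]))
        groups.append(sorted(comp))
    return groups

rng = np.random.default_rng(4)
base = rng.standard_normal((100, 2))
# six indicators: columns 0-2 driven by one factor, columns 3-5 by another
data = np.column_stack([base[:, [0]] + 0.3 * rng.standard_normal((100, 3)),
                        base[:, [1]] + 0.3 * rng.standard_normal((100, 3))])
print(correlation_pleiades(data))        # expect [[0, 1, 2], [3, 4, 5]]
```
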
  7. The article discusses the influence of research goals on the structure of a multivariate regression model (in particular, on the procedure for reducing the model's dimension). It is shown how bringing the specification of the multiple regression model in line with the research objectives affects the choice of modeling methods. Two schemes for constructing a model are compared: the first does not take into account the typology of the primary predictors and the nature of their influence on the response characteristics; the second includes a stage of preliminary division of the initial predictors into groups, in accordance with the objectives of the study. Using the problem of analyzing the causes of burnout among creative workers as an example, the importance of the stage of qualitative analysis and systematization of the a priori selected factors is shown; this stage is implemented not by computational means but by drawing on the knowledge and experience of specialists in the subject area. The presented implementation of this approach to determining the regression model specification combines formalized mathematical-statistical procedures with a preceding stage of classifying the primary factors. This stage makes it possible to explain the scheme of managing (corrective) actions: softening the leadership style and increasing approval lead to a decrease in anxiety and stress, which, in turn, reduces the severity of the emotional exhaustion of the team members. Preliminary classification also avoids combining controlled and uncontrolled, regulatory and response factors in one principal component, which could worsen the interpretability of the synthesized predictors. Using a specific problem as an example, it is shown that the selection of regressor factors is a process requiring an individual solution. In the case under consideration, the following were applied in sequence: systematization of features, correlation analysis, principal component analysis, and regression analysis. The first three methods made it possible to significantly reduce the dimension of the problem without affecting the goal for which the task was posed: meaningful measures of managerial influence on the team were identified, allowing the degree of emotional burnout of its participants to be reduced.
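
    The key idea of the second scheme in entry 7 can be illustrated directly: compress each a-priori group of predictors with PCA separately, so that controlled and uncontrolled factors never mix within one component, and only then fit the regression. A sketch on synthetic data; the group contents, sizes and coefficients are assumptions:

```python
# Per-group PCA followed by regression, keeping factor groups separate.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
controllable = rng.standard_normal((120, 5))    # e.g. leadership style, approval
uncontrollable = rng.standard_normal((120, 4))  # e.g. factors fixed by circumstances
burnout = (controllable[:, 0] - 0.5 * controllable[:, 1]
           + 0.3 * uncontrollable[:, 2] + 0.2 * rng.standard_normal(120))

# PCA within each a-priori group (one component per group for brevity),
# so a synthesized predictor never blends factors of different nature
z_ctrl = PCA(n_components=1).fit_transform(controllable)
z_unctrl = PCA(n_components=1).fit_transform(uncontrollable)
X = np.column_stack([z_ctrl, z_unctrl])

model = LinearRegression().fit(X, burnout)
print("R^2 =", round(model.score(X, burnout), 3), "coefs =", model.coef_.round(3))
```
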

  8. Timiryanova V.M., Lakman I.A., Larkin M.M.
    Retail forecasting on high-frequency depersonalized data
    Computer Research and Modeling, 2023, v. 15, no. 6, pp. 1713-1734

    Technological development leads to the emergence of data highly detailed in time and space, which expands the possibilities of analysis, allowing consumer decisions and the competitive behavior of enterprises to be considered in all their diversity, taking into account the context of the territory and the characteristics of the time periods. Despite the promise of such studies, they are still rare in the scientific literature, owing to a range of problems whose solution is considered in this paper. The article draws attention to the complexity of analyzing depersonalized high-frequency data and to the possibility of modeling consumption changes in time and space on their basis. The features of this new type of data are examined using real depersonalized data received from the fiscal data operator “First OFD” (JSC “Energy Systems and Communications”). It is shown that, along with the spectrum of problems inherent in high-frequency data, there are shortcomings associated with how the data are generated on the sellers' side, which requires wider use of data mining tools. A series of statistical tests was carried out on the data, including a unit-root test, a test for unobserved individual effects, and tests for serial correlation and cross-sectional dependence in panels. The presence of spatial autocorrelation was tested using modified Lagrange multiplier tests. The tests showed the presence of serial correlation and spatial dependence in the data, which makes it expedient to apply panel and spatial analysis methods to the high-frequency data accumulated by fiscal operators. The constructed models made it possible to substantiate the spatial relationship of sales growth and its dependence on the day of the week. The factor limiting the predictive ability of the models and their further elaboration through additional explanatory factors was the lack of openly available statistics grouped with the required detail in time and space, which makes the formation of high-frequency geographically structured databases relevant.
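
    The spatial-dependence testing in entry 8 can be illustrated with the simplest such statistic, Moran's I (the paper itself uses modified Lagrange-multiplier tests). The ring geometry, the weights matrix and the sales-growth series below are all synthetic:

```python
# Moran's I for spatial autocorrelation of a variable over territories.
import numpy as np

def morans_i(y: np.ndarray, w: np.ndarray) -> float:
    """Moran's I; w is a row-standardized spatial weights matrix."""
    z = y - y.mean()
    return (y.size / w.sum()) * (z @ w @ z) / (z @ z)

rng = np.random.default_rng(6)
n = 30                                   # territories arranged on a ring (toy geometry)
w = np.zeros((n, n))
for i in range(n):                       # each territory borders its two neighbours
    w[i, (i - 1) % n] = w[i, (i + 1) % n] = 0.5
sales_growth = np.sin(np.linspace(0, 2 * np.pi, n)) + 0.3 * rng.standard_normal(n)
print(f"Moran's I = {morans_i(sales_growth, w):.3f}")   # near +1 for a smooth spatial pattern
```
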
