Search results for 'factors':
Articles found: 130
  1. Shovin V.A.
    Confirmatory factor model of hypertension
    Computer Research and Modeling, 2012, v. 4, no. 4, pp. 885-894

    A new method of constructing an orthogonal factor model based on the method of correlation pleiades and confirmatory factor analysis is proposed, together with a new algorithm for confirmatory factor analysis. Using the original method, a first-stage factor model of hypertension is built. Correlations and indices of arterial hypertension are analyzed.
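    For reference, confirmatory factor analysis of this kind fits the standard orthogonal factor model (a textbook formulation; the paper's exact notation may differ):

        x = \Lambda f + \varepsilon, \qquad \operatorname{Cov}(x) = \Lambda\Lambda^{\top} + \Psi, \qquad \operatorname{Cov}(f) = I,

    where x is the vector of observed indicators, \Lambda is the loading matrix, f is the vector of orthogonal common factors, and \Psi is the diagonal matrix of unique variances of \varepsilon.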

    Views (last year): 2. Citations: 7 (RSCI).
  2. Gankevich I.G., Degtyarev A.B.
    Efficient processing and classification of wave energy spectrum data with a distributed pipeline
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 517-520

    Processing of large amounts of data often consists of several steps, e.g. pre- and post-processing stages, which are executed sequentially with data written to disk after each step. However, when the pre-processing stage differs from task to task, a more efficient way of processing data is to construct a pipeline which streams data from one stage to another. In a more general case some processing stages can be factored into several parallel subordinate stages, thus forming a distributed pipeline where each stage can have multiple inputs and multiple outputs. Such a processing pattern emerges in the problem of classifying wave energy spectra based on analytic approximations which can extract different wave systems and their parameters (e.g. wave system type, mean wave direction) from a spectrum. The distributed pipeline approach achieves good performance compared to conventional “sequential-stage” processing.
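    As an illustration of the pattern described above (a minimal sketch, not the authors' implementation; the stage functions and toy “spectra” are hypothetical), such a pipeline can be built from standard-library queues and threads:

      import queue
      import threading

      def stage(inp, outputs, work):
          # Consume items from the input queue, process them, and fan the
          # results out to every output queue (a stage may have several).
          while True:
              item = inp.get()
              if item is None:              # end-of-stream marker
                  for q in outputs:
                      q.put(None)           # propagate shutdown downstream
                  return
              result = work(item)
              for q in outputs:
                  q.put(result)

      # hypothetical stand-ins for pre-processing and spectrum classification
      def preprocess(spectrum):
          return [2.0 * x for x in spectrum]

      def classify(spectrum):
          return "wind sea" if sum(spectrum) > 10 else "swell"

      q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
      threading.Thread(target=stage, args=(q_in, [q_mid], preprocess)).start()
      threading.Thread(target=stage, args=(q_mid, [q_out], classify)).start()

      for spectrum in ([1.0, 2.0, 3.0], [0.1, 0.2, 0.3]):
          q_in.put(spectrum)                # stream data in, no disk writes
      q_in.put(None)

      while (label := q_out.get()) is not None:
          print(label)                      # wind sea, swell

    Streaming through queues avoids writing intermediate results to disk between stages; in a real distributed setting the queues would be replaced by network channels.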

    Views (last year): 3. Citations: 2 (RSCI).
  3. Andreeva A.A., Anand M., Lobanov A.I., Nikolaev A.V., Panteleev M.A.
    Using extended ODE systems to investigate the mathematical model of the blood coagulation
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 931-951

    Many properties of the solutions of systems of ordinary differential equations are determined by the properties of the equations in variations. An ODE system which includes both the original nonlinear system and the equations in variations will further be called an extended system. When studying the properties of the Cauchy problem for systems of ordinary differential equations, the transition to extended systems allows one to study many subtle properties of solutions. For example, it allows one to increase the order of approximation of numerical methods, provides approaches to constructing a sensitivity function without numerical differentiation procedures, and allows one to use methods of increased convergence order for solving the inverse problem. The authors used the Broyden method, which belongs to the class of quasi-Newtonian methods. The Rosenbrock method with complex coefficients was used to solve stiff systems of ordinary differential equations; in our case it is equivalent to a second-order approximation method for the extended system.
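    A minimal sketch of an extended system (a hypothetical scalar example, not the coagulation model itself): the original ODE is integrated together with its equation in variations for the sensitivity s = dx/dk with respect to a parameter k, so no numerical differentiation is required:

      from scipy.integrate import solve_ivp

      k = 1.5  # hypothetical reaction-rate constant

      def extended_rhs(t, y):
          # Original equation x' = -k*x**2 together with its equation in
          # variations: s' = (df/dx)*s + df/dk = -2*k*x*s - x**2.
          x, s = y
          return [-k * x**2, -2.0 * k * x * s - x**2]

      # initial state x(0) = 1 and zero initial sensitivity s(0) = 0
      sol = solve_ivp(extended_rhs, (0.0, 5.0), [1.0, 0.0], rtol=1e-8)
      x_end, s_end = sol.y[:, -1]
      print(f"x(5) = {x_end:.6f}, dx/dk(5) = {s_end:.6f}")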

    As an example of the proposed approach, several related mathematical models of the blood coagulation process were considered. Based on the analysis of the results of numerical calculations, it was concluded that a description of the factor XI positive feedback loop must be included in the system of model equations. Estimates of some reaction constants based on the numerical solution of the inverse problem were given.

    The effect of factor V release on platelet activation was considered. The modification of the mathematical model made it possible to achieve quantitative agreement between the dynamics of thrombin production and experimental data for an artificial system. Based on the sensitivity analysis, the hypothesis was tested that the lipid membrane composition (the number of sites for the various factors of the clotting system, except for thrombin sites) does not influence the dynamics of the process.

  4. Dubinina M.G.
    Spatio-temporal models of ICT diffusion
    Computer Research and Modeling, 2023, v. 15, no. 6, pp. 1695-1712

    The article proposes a space-time approach to modeling the diffusion of information and communication technologies based on the Fisher-Kolmogorov-Petrovsky-Piskunov equation, in which the diffusion kinetics is described by the Bass model, widely used to model the diffusion of innovations in the market. For this equation, its equilibrium positions are studied, and, based on singular perturbation theory, an approximate solution is obtained in the form of a traveling wave, i.e. a solution that propagates at a constant speed while maintaining its shape in space. The wave speed shows how much the “spatial” characteristic, which determines the given level of technology dissemination, changes in a single time interval. This speed is significantly higher than the speed of propagation due to diffusion. By constructing such an autowave solution, it becomes possible to estimate the time required for the subject of research to reach the current indicator of the leader.
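    Schematically (assuming the standard notation for both ingredients rather than the paper's own), the model couples Fisher-Kolmogorov-Petrovsky-Piskunov diffusion with Bass adoption kinetics:

        \frac{\partial F}{\partial t} = D\,\frac{\partial^2 F}{\partial x^2} + (p + qF)(1 - F),

    where F(x, t) is the share of adopters, D is the diffusion coefficient, p and q are the Bass coefficients of innovation and imitation, and a traveling-wave solution has the form F(x, t) = \varphi(x - vt) with constant speed v.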

    The obtained approximate solution was then applied to assess the factors affecting the rate of dissemination of information and communication technologies in the federal districts of the Russian Federation. Various socio-economic indicators were considered as “spatial” variables for the diffusion of mobile communications among the population. Growth poles in which innovation occurs are usually characterized by the highest values of “spatial” variables. For Russia, Moscow is such a growth pole; therefore, indicators of the federal districts relative to Moscow's indicators were considered as factor indicators. The best approximation to the initial data was obtained for the ratio of the share of R&D costs in GRP to the Moscow indicator, averaged over the period 2000–2009. It was found that at the initial stage of the spread of mobile communications the lag behind the capital was less than one year for the Ural Federal District; 1.4 years for the Central and Northwestern Federal Districts; less than two years for the Volga, Siberian, Southern and Far Eastern Federal Districts; and a little more than 2 years for the North Caucasian Federal District. In addition, estimates were obtained of the delay of the spread of digital technologies (intranet, extranet, etc.) used by organizations of the federal districts of the Russian Federation relative to Moscow's indicators.

  5. Gorbachev O.G.
    Probabilistic-statistical model of insurance capital
    Computer Research and Modeling, 2012, v. 4, no. 1, pp. 231-235

    The article argues for the introduction of a new economic category, “insurance capital”. Insurance activity generates a specific kind of capital (as a production factor): the guarantee fund, which is called “primary insurance monetary capital”. The article establishes that, due to its probabilistic and statistical nature, insurance capital has a number of specific features in addition to the conventional characteristics of capital as a production factor. Based on a probabilistic-statistical model, the author investigates the role of insurance capital in the formation of prices for insurance services. In particular, the author shows that the law of diminishing returns is not universal when applied to insurance capital.

    Views (last year): 1. Citations: 2 (RSCI).
  6. The article discusses the problem of the influence of the research goals on the structure of a multivariate regression model (in particular, on the procedure for reducing the dimension of the model). It is shown how bringing the specification of the multiple regression model in line with the research objectives affects the choice of modeling methods. Two schemes for constructing a model are compared: the first does not take into account the typology of the primary predictors and the nature of their influence on the performance characteristics; the second involves a stage of preliminary division of the initial predictors into groups, in accordance with the objectives of the study.

    Using the example of analyzing the causes of burnout among creative workers, the importance of the stage of qualitative analysis and systematization of a priori selected factors is shown; this stage is implemented not by computational means but by drawing on the knowledge and experience of specialists in the subject area. The presented example of determining the specification of the regression model combines formalized mathematical and statistical procedures with a preceding stage of classification of the primary factors. This stage makes it possible to explain the scheme of managing (corrective) actions (softening the leadership style and increasing approval lead to a decrease in the manifestations of anxiety and stress, which, in turn, reduces the severity of the emotional exhaustion of team members). Pre-classification also avoids combining controlled and uncontrolled, regulating and regulated factors in one principal component, which could worsen the interpretability of the synthesized predictors.

    Using a specific problem as an example, it is shown that the selection of regressor factors is a process that requires an individual solution. In the case under consideration, the following were used in sequence: systematization of features, correlation analysis, principal component analysis, regression analysis. The first three methods made it possible to significantly reduce the dimension of the problem without affecting the goal for which the task was posed: significant measures of controlling influence on the team were identified, allowing the degree of emotional burnout of its participants to be reduced.
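    A schematic of the second scheme (a sketch with hypothetical data and group names; the actual study relied on expert pre-classification of the burnout factors):

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      n = 200

      # hypothetical pre-classified groups of primary factors
      managerial = rng.normal(size=(n, 4))  # controllable (leadership style etc.)
      personal = rng.normal(size=(n, 3))    # uncontrollable individual traits
      burnout = managerial @ np.array([0.5, -0.3, 0.2, 0.1]) \
          + rng.normal(scale=0.5, size=n)

      # reduce dimension within each group separately, so that controllable
      # and uncontrollable factors never mix in one principal component
      pc_man = PCA(n_components=2).fit_transform(
          StandardScaler().fit_transform(managerial))
      pc_per = PCA(n_components=1).fit_transform(
          StandardScaler().fit_transform(personal))

      X = np.hstack([pc_man, pc_per])
      model = LinearRegression().fit(X, burnout)
      print("R^2 =", model.score(X, burnout))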

  7. Timiryanova V.M., Lakman I.A., Larkin M.M.
    Retail forecasting on high-frequency depersonalized data
    Computer Research and Modeling, 2023, v. 15, no. 6, pp. 1713-1734

    Technological development gives rise to data that are highly detailed in time and space, which expands the possibilities of analysis, allowing consumer decisions and the competitive behavior of enterprises to be considered in all their diversity, taking into account the context of the territory and the characteristics of time periods. Despite the promise of such studies, they are currently scarce in the scientific literature. This is due to a range of problems whose solution is considered in this paper. The article draws attention to the complexity of analyzing depersonalized high-frequency data and the possibility of modeling changes in consumption in time and space on their basis. The features of the new type of data are examined on the example of real depersonalized data received from the fiscal data operator “First OFD” (JSC “Energy Systems and Communications”). It is shown that, along with the spectrum of problems inherent in high-frequency data, there are disadvantages associated with the process of data generation on the sellers' side, which requires wider use of data mining tools. A series of statistical tests was carried out on the data under consideration, including a unit-root test, a test for unobserved individual effects, and tests for serial correlation and cross-sectional dependence in panels. The presence of spatial autocorrelation of the data was tested using modified Lagrange multiplier tests. The tests showed a consistent correlation and spatial dependence of the data, which makes it expedient to apply panel and spatial analysis methods to high-frequency data accumulated by fiscal operators. The constructed models made it possible to substantiate the spatial relationship of sales growth and its dependence on the day of the week. The limitation on increasing the predictive ability of the constructed models and on their subsequent complication through the inclusion of explanatory factors was the lack of openly accessible statistics grouped with the required detail in time and space, which makes the formation of high-frequency geographically structured databases relevant.
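    Two diagnostics of the kind listed can be sketched on synthetic data with statsmodels (the paper's actual tests were run on the fiscal-operator panel; the Ljung-Box test below stands in for a generic serial-correlation check):

      import numpy as np
      from statsmodels.stats.diagnostic import acorr_ljungbox
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(1)
      sales = np.cumsum(rng.normal(size=500))  # synthetic daily sales series

      # unit-root test (augmented Dickey-Fuller): H0 = series has a unit root
      adf_stat, pvalue, *_ = adfuller(sales)
      print(f"ADF statistic = {adf_stat:.3f}, p-value = {pvalue:.3f}")

      # serial-correlation (Ljung-Box) test on the differenced series,
      # with a weekly lag since sales depend on the day of the week
      growth = np.diff(sales)
      print(acorr_ljungbox(growth, lags=[7]))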

  8. Aronov I.Z., Maksimova O.V.
    Theoretical modeling consensus building in the work of standardization technical committees in coalitions based on regular Markov chains
    Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1247-1256

    Decisions in social groups are often made by consensus. This applies, for example, to the examination in a technical committee for standardization (TC) before the approval of a national standard by Rosstandart: the standard is approved if and only if consensus is secured in the TC. The same approach to standards development has been adopted in almost all countries and at the regional and international levels. The authors' previously published works were dedicated to constructing a mathematical model of the time to reach consensus in technical committees for standardization under variation in the number of TC members and their level of authoritarianism. The present study continues these works for the case of the formation of coalitions, which often arise during the consideration of a draft standard in the TC. In the article a mathematical model of consensus building in the work of technical committees for standardization in the presence of coalitions is constructed. In the framework of the model it is shown that in the presence of coalitions consensus is not achievable. However, coalitions are, as a rule, overcome during the negotiation process, otherwise the number of adopted standards would be extremely small. This paper analyzes the factors that influence the overcoming of coalitions: the value of the concession and an index of the coalition's influence. Their effect on the time to reach consensus in the technical committee is investigated on the basis of statistical modeling of regular Markov chains. It is shown that the time to reach consensus depends significantly on the value of unilateral concessions by a coalition and weakly on the size of the coalitions. A regression model of the dependence of the average number of approvals on the value of the concession is built. It was revealed that even a small concession leads to the onset of consensus, and increasing the size of the concession leads (other factors being equal) to a sharp decline in the time to consensus. It is shown that a concession by the larger coalition to small coalitions is associated, on average, with a longer time to consensus. The result has practical value for all organizational structures where the emergence of coalitions makes decision-making by consensus impossible and requires the consideration of various methods for reaching a consensus decision.
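    A toy illustration of the simulation approach (hypothetical dynamics and probabilities, not the authors' calibrated model): opinions evolve as a regular Markov chain in which a randomly chosen member concedes to the majority with some probability, and even a small concession probability sharply shortens the time to consensus:

      import numpy as np

      rng = np.random.default_rng(42)

      def time_to_consensus(n_members, p_concede, max_steps=10_000):
          # Steps until every member holds the same opinion (0 or 1). At each
          # step one random member reconsiders: with probability p_concede
          # they adopt the current majority opinion, otherwise they keep
          # their own. This defines a regular Markov chain on opinion vectors.
          opinions = rng.integers(0, 2, size=n_members)
          for step in range(max_steps):
              if opinions.min() == opinions.max():   # consensus reached
                  return step
              i = rng.integers(n_members)
              if rng.random() < p_concede:
                  opinions[i] = int(2 * opinions.sum() >= n_members)
          return max_steps

      for p in (0.05, 0.2, 0.5):
          runs = [time_to_consensus(9, p) for _ in range(200)]
          print(f"p_concede = {p}: mean steps to consensus = {np.mean(runs):.0f}")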

  9. Tkachenko I.A.
    Experience of Puppet usage for management of the Tier-1 GRID cluster at NRC “Kurchatov Institute”
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 735-740

    This article describes the organization of cluster management using Puppet. It covers: safety of use, in terms of preventing the mass application of a wrong configuration to a computing cluster (due to human error); collaborative work, enabling each cluster administrator to write and debug their own scripts independently of the others before including them in the overall cluster management system; and writing scripts that can both fully configure nodes and update the configuration of individual system parts without affecting the other node components, regardless of the current state of a computing cluster node.

    The article compares different methods of creating a hierarchy of Puppet scenarios, describes the problems associated with using “include” to organize the hierarchy, and describes the transition to a system of sequential class calls through a shell script.

  10. Degtyarev A.B., Myo Min S., Wunna K.
    Cloud computing for virtual testbed
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 753-758

    Nowadays cloud computing is an important topic in the field of information technology and computer systems. Several companies and educational institutions have deployed cloud infrastructures to gain benefits such as easy data access, software updates at minimal cost, large or unlimited storage, cost efficiency, and backup storage and disaster recovery, among other advantages over traditional network infrastructures. The paper presents a study of cloud computing technology for marine environmental data and their processing. Cloud computing of marine environment information is proposed for the integration and sharing of marine information resources. It is highly desirable to perform empirical studies requiring numerous interactions with web servers and transfers of very large archival data files without affecting the operational information system infrastructure. In this paper, we consider cloud computing for a virtual testbed in order to minimize the cost related to real-time infrastructure.

    Views (last year): 7.