Search results for 'factor analysis':
Articles found: 62
  1. Usanov M.S., Kulberg N.S., Yakovleva T.V., Morozov S.P.
    Determination of CT dose by means of noise analysis
    Computer Research and Modeling, 2018, v. 10, no. 4, pp. 525-533

    The article deals with the development of an effective algorithm for determining the number of quanta emitted by an X-ray tube in computed tomography (CT) studies. An analysis of domestic and foreign literature showed that most work in the field of radiometry and radiography relies on tabulated values of X-ray absorption coefficients, while individual dose factors are not taken into account at all, since many studies lack a Dose Report; instead, an average value is used to simplify the statistics. In this regard, it was decided to develop a method for detecting the number of ionizing quanta by analyzing the noise of CT data. The algorithm is based on a mathematical model of our own design built on Poisson and Gaussian distributions of the logarithmic signal. The resulting model was tested on CT data of a calibration phantom consisting of three plastic cylinders filled with water, whose X-ray absorption coefficient is known from tabulated values. The data were obtained from several CT scanners of different manufacturers (Siemens, Toshiba, GE, Philips). The developed algorithm made it possible to calculate the number of emitted X-ray quanta per unit time. Taking into account the noise level and the radii of the cylinders, these data were converted into X-ray absorption values, which were then compared with the tabulated ones. Applying the algorithm to CT data of various configurations yielded experimental results consistent with the theoretical analysis and the mathematical model. The results showed good accuracy of the algorithm and the mathematical apparatus, which demonstrates the reliability of the obtained data. The mathematical model is already used in our own CT noise reduction software, where it serves as a method for setting a dynamic noise reduction threshold. At the moment, the algorithm is being adapted to work with real patient CT data.

    Views (last year): 23. Citations: 1 (RSCI).
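
    As a rough illustration of the idea behind such noise-based dose estimation, the sketch below recovers the photon count from the variance of log-transformed Poisson data; the parameter values and the simplified variance relation var(ln N) ≈ 1/E[N] are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

# Illustrative assumption: N0 quanta emitted per ray, attenuated by exp(-mu*d),
# detected counts are Poisson-distributed.
rng = np.random.default_rng(0)
N0 = 5.0e4            # assumed emitted quanta per detector reading
mu_d = 1.2            # assumed total attenuation (mu * path length) of the water phantom
expected = N0 * np.exp(-mu_d)

counts = rng.poisson(expected, size=100_000)   # simulated detector readings
log_signal = -np.log(counts / N0)              # CT projections are log-transformed

# Delta method: var(ln N) ~ 1/E[N] for Poisson N, so the noise variance of the
# log signal encodes the number of detected quanta.
detected_est = 1.0 / np.var(log_signal)
emitted_est = detected_est * np.exp(mu_d)      # undo the known attenuation

print(f"true emitted quanta: {N0:.0f}, estimated: {emitted_est:.0f}")
```
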
  2. Varshavsky L.E.
    Studying indicators of development of oligopolistic markets on the basis of operational calculus
    Computer Research and Modeling, 2019, v. 11, no. 5, pp. 949-963

    The traditional approach to computing optimal game strategies of firms in oligopolistic markets and indicators of such markets consists in studying linear dynamic games with quadratic criteria and solving generalized matrix Riccati equations.

    The alternative approach proposed by the author is based on methods of operational calculus (in particular, the Z-transform). This approach makes it possible to obtain economically meaningful solutions over a wider range of parameter values. It is characterized by simplicity of computation and by the visibility necessary for economic analysis. One of its advantages is that, in many cases important for economic practice, it, in contrast to the traditional approach, allows calculations to be performed in widespread spreadsheets, which makes the study of the prospects for the development of oligopolistic markets accessible to a wide range of professionals and consumers.

    The article deals with the practical aspects of determining the optimal Nash–Cournot strategies of participants in oligopolistic markets on the basis of operational calculus, in particular with the technique of computing the optimal Nash–Cournot strategies in Excel. As an illustration of the possibilities of the proposed methods of calculation, examples close to the practical problems of forecasting indicators of markets of high-tech products are studied.

    The results of calculations obtained by the author for numerous examples and real economic systems, both using the derived relations in spreadsheets and using extended Riccati equations, are very close. In most of the considered practical problems, the deviation between the indicators calculated by the two approaches, as a rule, does not exceed 1.5–2%. The largest relative deviations (up to 3–5%) are observed at the beginning of the forecasting period. In typical cases, the period of relatively noticeable deviations lasts 3–5 time steps. After this transition period, the values of the required indicators obtained by the two approaches agree almost completely.
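
    For orientation on the quantities involved, here is a minimal sketch of the static Cournot–Nash equilibrium of a linear duopoly; the demand and cost numbers are purely illustrative, and the dynamic Z-transform machinery of the article is not reproduced.

```python
# Static Cournot-Nash equilibrium for a linear duopoly:
# inverse demand p = a - b*(q1 + q2), constant marginal costs c1, c2.
# Best responses q_i = (a - c_i - b*q_j) / (2*b) intersect at the point below.

def cournot_equilibrium(a: float, b: float, c1: float, c2: float):
    q1 = (a - 2.0 * c1 + c2) / (3.0 * b)
    q2 = (a - 2.0 * c2 + c1) / (3.0 * b)
    price = a - b * (q1 + q2)
    return q1, q2, price

q1, q2, p = cournot_equilibrium(a=100.0, b=1.0, c1=10.0, c2=20.0)
print(f"q1* = {q1:.2f}, q2* = {q2:.2f}, price = {p:.2f}")
```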

  3. Revutskaya O.L., Neverova G.P., Frisman E.Y.
    A minimal model of density-dependent population dynamics incorporating sex structure: simulation and application
    Computer Research and Modeling, 2025, v. 17, no. 5, pp. 941-961

    This study proposes and analyzes a discrete-time mathematical model of population dynamics with seasonal reproduction, taking into account the density-dependent regulation and sex structure. In the model, population birth rate depends on the number of females, while density is regulated through juvenile survival, which decreases exponentially with increasing total population size. Analytical and numerical investigations of the model demonstrate that when more than half of both females and males survive, the population exhibits stable dynamics even at relatively high birth rates. Oscillations arise when the limitation of female survival exceeds that of male survival. Increasing the intensity of male survival limitation can stabilize population dynamics, an effect particularly evident when the proportion of female offspring is low. Depending on parameter values, the model exhibits stable, periodic, or irregular dynamics, including multistability, where changes in current population size driven by external factors can shift the system between coexisting dynamic modes. To apply the model to real populations, we propose an approach for estimating demographic parameters based on total abundance data. The key idea is to reduce the two-component discrete model with sex structure to a delay equation dependent only on total population size. In this formulation, the initial sex structure is expressed through total abundance and depends on demographic parameters. The resulting one-dimensional equation was applied to describe and estimate demographic characteristics of ungulate populations in the Jewish Autonomous Region. The delay equation provides a good fit to the observed dynamics of ungulate populations, capturing long-term trends in abundance. Point estimates of parameters fall within biologically meaningful ranges and produce population dynamics consistent with field observations. For moose, roe deer, and musk deer, the model suggests predominantly stable dynamics, while annual fluctuations are primarily driven by external factors and represent deviations from equilibrium. Overall, these estimates enable the analysis of structured population dynamics alongside short-term forecasting based on total abundance data.
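
    A minimal sketch of a two-sex discrete-time model with Ricker-type (exponential) density regulation of juvenile survival is shown below; the equations and parameter values are illustrative stand-ins, not the authors' exact formulation.

```python
import numpy as np

# Illustrative two-sex model with seasonal reproduction:
# births depend on the number of females, juvenile survival decays
# exponentially with total population size (Ricker-type regulation).
def step(F, M, a=6.0, delta=0.5, beta=0.001, sf=0.6, sm=0.55):
    """One annual step. a: birth rate per female, delta: share of female
    offspring, beta: density-regulation strength, sf/sm: adult survival."""
    N = F + M
    juveniles = a * F * np.exp(-beta * N)   # density-dependent juvenile survival
    F_next = delta * juveniles + sf * F
    M_next = (1.0 - delta) * juveniles + sm * M
    return F_next, M_next

F, M = 100.0, 100.0
for year in range(50):
    F, M = step(F, M)
print(f"after 50 seasons: females = {F:.1f}, males = {M:.1f}")
```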

  4. Giricheva E.E., Abakumov A.I.
    Spatiotemporal dynamics and the principle of competitive exclusion in community
    Computer Research and Modeling, 2017, v. 9, no. 5, pp. 815-824

    Fulfillment or violation of the principle of competitive exclusion in communities is the subject of many studies. The principle of competitive exclusion states that the coexistence of species in a community is impossible if the number of species exceeds the number of controlling mutually independent factors. At the same time, there are many examples of violations of this principle in natural systems. The explanations for this paradox range from inexact identification of the set of factors to various types of spatial and temporal heterogeneity. One of the factors breaking the principle of competitive exclusion is intraspecific competition. This study considers a model of a community with two species and one influencing factor, with density-dependent mortality and spatial heterogeneity. For such models, the existence of a stable equilibrium has been proved in the case of spatial homogeneity and a negative effect of the species on the factor. Our purpose is to analyze the possible dynamics of the spatially heterogeneous system under various directions of the species' effect on the influencing factor. Numerical analysis showed that the species coexist stably, with spatially homogeneous distributions, when their effects on the influencing factor are negative. Density-dependent mortality and spatial heterogeneity lead to violation of the principle of competitive exclusion when the equilibria are Turing unstable. In this case, stable spatially heterogeneous patterns can arise. It is shown that Turing instability is possible if at least one of the species' effects is positive. Model nonlinearity and spatial heterogeneity cause violation of the principle of competitive exclusion both through stable spatially homogeneous states and through quasi-stable spatially heterogeneous patterns.

    Views (last year): 11.
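
    The Turing-instability test mentioned above can be illustrated with a short numerical check: a homogeneous equilibrium that is stable without diffusion becomes unstable once some eigenvalue of J − k²D acquires a positive real part. The Jacobian and diffusion coefficients below are placeholders for a two-component activator–inhibitor caricature, not the three-component model of the article.

```python
import numpy as np

# Turing (diffusion-driven) instability test: perturbations ~ exp(i*k*x) of a
# homogeneous equilibrium grow iff some eigenvalue of J - k^2 * D has a
# positive real part for some wavenumber k.
J = np.array([[ 1.0, -1.0],     # placeholder Jacobian of a two-component
              [ 2.0, -1.5]])    # activator-inhibitor caricature
D = np.diag([0.01, 1.0])        # placeholder diffusion coefficients

wavenumbers = np.linspace(0.0, 12.0, 600)
growth = np.array([np.max(np.linalg.eigvals(J - k**2 * D).real) for k in wavenumbers])

stable_without_diffusion = growth[0] < 0.0
turing_unstable = stable_without_diffusion and growth.max() > 0.0
print(f"stable without diffusion: {stable_without_diffusion}")
print(f"Turing unstable (pattern-forming): {turing_unstable}")
```
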
  5. Karpaev A.A., Aliev R.R.
    Application of simplified implicit Euler method for electrophysiological models
    Computer Research and Modeling, 2020, v. 12, no. 4, pp. 845-864

    A simplified implicit Euler method was analyzed as an alternative to the explicit Euler method, which is commonly used in numerical modeling in electrophysiology. The majority of electrophysiological models are quite stiff, since the dynamics they describe span a wide range of time scales: a fast depolarization, lasting milliseconds, precedes a considerably slower repolarization, both being phases of the action potential observed in excitable cells. In this work we estimate stiffness by a formula that does not require calculating the eigenvalues of the Jacobian matrix of the studied ODEs. The efficiency of the numerical methods was compared on typical representatives of detailed and conceptual models of excitable cells: the Hodgkin–Huxley model of a neuron and the Aliev–Panfilov model of a cardiomyocyte. The comparison was carried out using norms widely employed in biomedical applications. The impact of the stiffness ratio on the speedup achieved by the simplified implicit method was studied: a real gain in speed was obtained for the Hodgkin–Huxley model. The benefits of using simple and high-order methods for electrophysiological models are discussed, along with the stability issues of one of the methods. The reasons for using simplified rather than high-order methods in practical simulations are discussed in the corresponding section. We calculated higher-order derivatives of the solutions of the Hodgkin–Huxley model for various stiffness ratios; their maximum absolute values turned out to be quite large. These values enter the formula for a numerical method's approximation constant and overwhelm the other term (a small factor that depends on the order of approximation), which leads to a large global error. We also carried out a qualitative stability analysis of the explicit Euler method and estimated the influence of the model's parameters on the boundary of the region of absolute stability. The latter is used for setting the timestep of simulations a priori.
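
    For reference, the sketch below integrates the Aliev–Panfilov cardiomyocyte model with the explicit Euler method discussed in the abstract; the parameter values are the commonly cited dimensionless ones and should be treated as assumptions, and the simplified implicit scheme itself is not reproduced here.

```python
import numpy as np

# Aliev-Panfilov model of a cardiomyocyte (dimensionless form), integrated with
# the explicit Euler method. Parameter values below are the commonly cited ones
# and are an assumption of this sketch.
k, a, eps0, mu1, mu2 = 8.0, 0.15, 0.002, 0.2, 0.3

def rhs(u, v):
    du = k * u * (u - a) * (1.0 - u) - u * v
    eps = eps0 + mu1 * v / (u + mu2)
    dv = eps * (-v - k * u * (u - a - 1.0))
    return du, dv

dt, t_end = 0.01, 60.0
u, v = 0.2, 0.0                       # suprathreshold stimulus (threshold is a = 0.15)
trace = []
for _ in range(int(t_end / dt)):
    du, dv = rhs(u, v)
    u, v = u + dt * du, v + dt * dv   # explicit Euler step
    trace.append(u)

print(f"peak u = {max(trace):.3f}, final u = {trace[-1]:.4f}")
```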

  6. Dementiev V.E.
    The model of interference of long waves of economic development
    Computer Research and Modeling, 2021, v. 13, no. 3, pp. 649-663

    The article substantiates the need to develop and analyze mathematical models that take into account the mutual influence of long (Kondratiev) waves of economic development. An analysis of the available publications shows that, at the model level, the direct and inverse relationships between overlapping long waves are still insufficiently studied. As practice shows, the production of the current long wave can receive an additional impetus for growth from the technologies of the next long wave. The technologies of the next industrial revolution often serve as improving innovations for the industries born of the previous industrial revolution. As a result, the new long wave increases the amplitude of the oscillations of the trajectory of the previous long wave. Such results of the interaction of long waves in the economy are similar to the effects of interference of physical waves. The mutual influence of recessions and booms in the economies of different countries gives even more grounds for comparing the consequences of this mutual influence with the interference of physical waves. The article presents a model of the development of the technological base of production that takes into account the possibilities of combining old and new technologies. The model consists of several sub-models. Using a different mathematical description for the individual stages of updating the technological base of production allows us to take into account the significant differences between successive phases of the life cycle of general purpose technologies, considered in the modern literature as the technological basis of industrial revolutions. One of these phases is the period of formation of the infrastructure necessary for the intensive diffusion of a new general purpose technology and for the rapid development of industries using this technology. The model is used for illustrative calculations with values of exogenous parameters corresponding to the logic of changing long waves. Despite the conventional nature of the illustrative calculations, the shape of the curve representing the change in the return on capital over the simulated period is close to the shape of the real trajectory of the return on private fixed assets of the US economy in 1982–2019. The article also indicates factors that remained outside the scope of the presented model but that should be taken into account when describing the interference of long waves of economic development.

  7. Vasiliev I.A., Dubinya N.V., Tikhotskiy S.A., Nachev V.A., Alexeev D.A.
    Numerical model of jack-up rig’s mechanical behavior under seismic loading
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 853-871

    The paper presents results of numerical modeling of the stress-strain state of jack-up rigs used for the exploitation of shelf hydrocarbon reservoirs. The work studied the equilibrium stress state of a jack-up rig standing on the seafloor and the mechanical behavior of the rig under seismic loading. A surface elastic wave caused by a distant earthquake acts as the source of the loading. The stability of the jack-up rig is the main topic of the research, as stability can be lost due to the redistribution of stresses and strains in the elements of the rig under seismic loading. Modeling results revealed that seismic loading can indeed lead to intermittent growth of stresses in particular elements of the rig's support legs, resulting in loss of stability. These results were obtained using a finite element-based numerical scheme. The paper contains a proof of convergence of the modeling results obtained from the analysis of a benchmark problem: the distribution of stresses and strains in the contact problem of a rigid cylinder indenting an elastic half-space. The comparison between numerical and analytical solutions confirmed the correctness of the numerical scheme, as the obtained results converged. The paper presents an analysis of the different factors influencing the mechanical behavior of the studied system. These factors include the degree of seismic loading, the mechanical properties of seafloor sediments, and the depth of support leg penetration. The results obtained from numerical modeling made it possible to formulate preliminary conclusions regarding the need to take site-specific conditions into account when planning the use of jack-up rigs, especially in regions with seismic activity. The approach presented in the paper can be used to evaluate risks related to offshore hydrocarbon reservoir exploitation and development, while the reported numerical scheme can be used to solve contact problems of the theory of elasticity that require the analysis of dynamic processes.
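
    As background for the verification problem, the sketch below evaluates the classical Hertz line-contact solution for a rigid cylinder pressed against an elastic half-plane; the load, geometry, and material values are illustrative assumptions, and the article's own benchmark setup may differ.

```python
import numpy as np

# Classical Hertz line-contact solution: rigid cylinder of radius R pressed
# against an elastic half-plane by a load P per unit length (plane strain).
# All numbers below are illustrative assumptions.
E, nu = 50.0e6, 0.3          # half-space (soil-like) Young's modulus [Pa], Poisson ratio
R = 1.5                      # cylinder radius [m]
P = 2.0e5                    # load per unit length [N/m]

E_star = E / (1.0 - nu**2)                      # effective modulus (cylinder is rigid)
a = np.sqrt(4.0 * P * R / (np.pi * E_star))     # contact half-width
p0 = 2.0 * P / (np.pi * a)                      # peak contact pressure

x = np.linspace(0.0, a, 5)
pressure = p0 * np.sqrt(1.0 - (x / a) ** 2)     # elliptical Hertz pressure profile

print(f"contact half-width a = {a*1000:.2f} mm, peak pressure p0 = {p0/1e6:.2f} MPa")
print("p(x)/p0 across the half-contact:", np.round(pressure / p0, 3))
```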

  8. Ansori Moch.F., Al Jasir H., Sihombing A.H., Putra S.M., Nurfaizah D.A., Nurulita E.
    Assessing the impact of deposit benchmark interest rate on banking loan dynamics
    Computer Research and Modeling, 2024, v. 16, no. 4, pp. 1023-1032

    The deposit benchmark interest rate is a policy implemented by banking regulators to set the interest rates offered to depositors, maintaining equitable and competitive rates within the financial industry. It functions as a benchmark for pricing various banking products, expenses, and financial decisions. The benchmark rate has a direct impact on the amount of money deposited, which in turn determines the amount of money available for lending. We are therefore motivated to analyze the influence of the deposit benchmark interest rate on the dynamics of banking loans. This study examines the issue using a difference equation for banking loans, in which the decision on the loan volume in the next period is influenced by both the present loan volume and information on its marginal profit. The loan equilibrium point and its stability are analyzed. We also analyze the bifurcations that arise in the model. To ensure stable banking loans, it is necessary to set the benchmark rate higher than the flip bifurcation value and lower than the transcritical bifurcation value. This result is confirmed by the bifurcation diagram and the associated Lyapunov exponent. An insufficient deposit benchmark interest rate might lead to chaotic dynamics in banking lending. Additionally, a bifurcation diagram in two parameters is presented. We perform a numerical sensitivity analysis by examining contour plots of the stability conditions, which vary with the deposit benchmark interest rate and other parameters. In addition, we examine a nonstandard difference scheme for the model, assess its stability, and compare it with the standard model. The outcome of our study can provide valuable insights to banking regulators in making informed decisions regarding the deposit benchmark interest rate, taking into account several other banking factors.
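
    To make the bifurcation machinery concrete, here is a minimal sketch of a marginal-profit adjustment map for the loan volume together with a numerical estimate of its Lyapunov exponent; the map and its parameters are illustrative stand-ins rather than the article's exact difference equation.

```python
import numpy as np

# Illustrative marginal-profit adjustment map for the loan volume L_t:
# the bank raises next period's loans in proportion to current marginal profit,
# which falls linearly as lending grows. Parameters are placeholder values.
alpha = 35.0            # adjustment speed
margin = 0.08           # loan rate minus funding cost set by the benchmark rate
b = 8.0e-5              # sensitivity of marginal profit to outstanding loans

def f(L):
    return L + alpha * L * (margin - b * L)

def f_prime(L):
    return 1.0 + alpha * margin - 2.0 * alpha * b * L

L = 500.0
lyapunov_sum, T = 0.0, 20_000
for t in range(T):
    L = f(L)
    if t >= 1000:                       # discard transient before averaging
        lyapunov_sum += np.log(abs(f_prime(L)))
lyapunov = lyapunov_sum / (T - 1000)

# A positive Lyapunov exponent signals chaotic loan dynamics for these parameters.
print(f"equilibrium L* = {margin / b:.0f}, Lyapunov exponent = {lyapunov:.3f}")
```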

  9. Shakhgeldyan K.I., Kuksin N.S., Domzhalov I.G., Pak R.L., Geltser B.I.
    Random forest of risk factors as a predictive tool for adverse events in clinical medicine
    Computer Research and Modeling, 2025, v. 17, no. 5, pp. 987-1004

    The aim of this study was to develop an ensemble machine learning method for constructing interpretable predictive models and to validate it using the example of predicting in-hospital mortality (IHM) in patients with ST-segment elevation myocardial infarction (STEMI).

    A retrospective cohort study was conducted using data from 5446 electronic medical records of STEMI patients who underwent percutaneous coronary intervention (PCI). Patients were divided into two groups: 335 (6.2%) patients who died during hospitalization and 5111 (93.8%) patients with a favourable in-hospital outcome. A pool of potential predictors was formed using statistical methods. Through multimetric categorization (minimizing p-values, maximizing the area under the ROC curve (AUC), and SHAP value analysis), decision trees, and multivariable logistic regression (MLR), predictors were transformed into risk factors for IHM. Predictive models for IHM were developed using MLR, Random Forest Risk Factors (RandFRF), Stochastic Gradient Boosting (XGBoost), Random Forest (RF), Adaptive Boosting, Gradient Boosting, Light Gradient-Boosting Machine, Categorical Boosting (CatBoost), Explainable Boosting Machine and Stacking methods.

    The authors developed the RandFRF method, which integrates the predictive outcomes of modified decision trees, identifies risk factors, and ranks them by their contribution to the risk of adverse outcomes. RandFRF enables the development of predictive models with high discriminative performance (AUC 0.908), comparable to models based on CatBoost and Stacking (AUC 0.904 and 0.908, respectively). In turn, the risk factors provide clinicians with information on the patient's risk group and the extent of each factor's impact on the probability of IHM. The risk factors identified by RandFRF can serve not only as a rationale for the prediction results but also as a basis for developing more accurate models.
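
    RandFRF itself is not reproduced here; the sketch below only shows the kind of baseline it is compared against: a standard random forest evaluated by AUC on synthetic tabular data with a roughly 6% positive class (all names and numbers are illustrative assumptions using the scikit-learn API).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a clinical dataset: ~6% positive class mimics the
# in-hospital mortality rate reported in the abstract.
X, y = make_classification(n_samples=5446, n_features=20, n_informative=8,
                           weights=[0.938, 0.062], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

model = RandomForestClassifier(n_estimators=300, class_weight="balanced",
                               random_state=42)
model.fit(X_train, y_train)

proba = model.predict_proba(X_test)[:, 1]
print(f"baseline random forest AUC = {roc_auc_score(y_test, proba):.3f}")
```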

  10. Rukhlenko A.S., Zlobina K.E., Guria G.T.
    Hydrodynamical activation of blood coagulation in stenosed vessels. Theoretical analysis
    Computer Research and Modeling, 2012, v. 4, no. 1, pp. 155-183

    The mechanisms of hydrodynamical activation of the blood coagulation system are investigated in stenosed vessels for a wide range of Reynolds number values (from 10 up to 500). It is assumed that the permeability of the vessel wall to procoagulant factors rapidly increases when the wall shear stress exceeds a specific threshold value. A number of patterns of the development of blood coagulation processes are described. The influence of changes in the blood flow topology on the activation of blood coagulation is explored. It is established that not only a decrease but also an increase in blood flow may promote the activation of blood coagulation. It was found that the dependence of the thrombogenic danger of a stenosis on the vessel lumen blockage ratio is non-monotonic. The relevance of the obtained theoretical results for clinical practice is discussed.

    Views (last year): 2. Citations: 5 (RSCI).
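
    A back-of-the-envelope view of how wall shear stress grows with lumen blockage is the Poiseuille estimate tau_w = 4*mu*Q/(pi*r^3), compared against a permeability-activation threshold; the flow rate, viscosity, geometry, and threshold below are illustrative assumptions, not values from the article.

```python
import numpy as np

# Poiseuille estimate of wall shear stress in a stenosed segment:
# tau_w = 4 * mu * Q / (pi * r^3). All numbers are illustrative assumptions.
mu = 3.5e-3                  # blood viscosity [Pa*s]
Q = 5.0e-6                   # volumetric flow rate [m^3/s]
r0 = 2.0e-3                  # radius of the healthy vessel [m]
tau_threshold = 5.0          # assumed activation threshold for wall permeability [Pa]

blockage = np.linspace(0.0, 0.8, 9)          # fraction of lumen radius lost
radius = r0 * (1.0 - blockage)
tau_wall = 4.0 * mu * Q / (np.pi * radius**3)

for b, tau in zip(blockage, tau_wall):
    flag = "exceeds threshold" if tau > tau_threshold else "below threshold"
    print(f"blockage {b:.1f}: tau_w = {tau:6.2f} Pa ({flag})")
```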