Search results for 'assessment':
Articles found: 104
  1. Sadovykh A., Ivanov V.
    Enhancing DevSecOps with continuous security requirements analysis and testing
    Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1687-1702

    The fast-paced environment of DevSecOps requires integrating security at every stage of software development to ensure secure, compliant applications. Traditional methods of security testing, often performed late in the development cycle, are insufficient to address the unique challenges of continuous integration and continuous deployment (CI/CD) pipelines, particularly in complex, high-stakes sectors such as industrial automation. In this paper, we propose an approach that automates the analysis and testing of security requirements by embedding requirements verification into the CI/CD pipeline. Our method employs the ARQAN tool to map high-level security requirements to Security Technical Implementation Guides (STIGs) using semantic search, and RQCODE to formalize these requirements as code, providing testable and enforceable security guidelines. We implemented ARQAN and RQCODE within a CI/CD framework, integrating them with GitHub Actions for real-time security checks and automated compliance verification. Our approach supports established security standards such as IEC 62443 and automates security assessment starting from the planning phase, enhancing the traceability and consistency of security practices throughout the pipeline. Evaluation of this approach in collaboration with an industrial automation company shows that it effectively covers critical security requirements, achieving automated compliance for 66.15% of STIG guidelines relevant to the Windows 10 platform. Feedback from industry practitioners further underscores its practicality: 85% of security requirements were mapped to concrete STIG recommendations, and 62% of these requirements have matching testable implementations in RQCODE. This evaluation highlights the approach's potential to shift security validation earlier in the development process, contributing to a more resilient and secure DevSecOps lifecycle.
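
    The abstract does not describe ARQAN's internals, so the sketch below is only a rough illustration of the general idea of matching a high-level requirement to STIG entries by cosine similarity of embedding vectors. The embedding dimensionality, the random vectors, and the STIG identifiers are placeholders, not the tool's actual data.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def map_requirement_to_stigs(req_vec, stig_vecs, stig_ids, top_k=3):
    """Rank STIG entries by semantic similarity to a requirement embedding."""
    scores = [(sid, cosine_similarity(req_vec, v)) for sid, v in zip(stig_ids, stig_vecs)]
    return sorted(scores, key=lambda x: x[1], reverse=True)[:top_k]

# Hypothetical pre-computed embeddings (in practice produced by a sentence encoder).
rng = np.random.default_rng(0)
requirement_embedding = rng.normal(size=384)
stig_embeddings = [rng.normal(size=384) for _ in range(5)]
stig_identifiers = [f"V-2205{i}" for i in range(5)]

print(map_requirement_to_stigs(requirement_embedding, stig_embeddings, stig_identifiers))
```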

  2. Vavilova D.D., Ketova K.V., Zerari R.
    Computer modeling of the gross regional product dynamics: a comparative analysis of neural network models
    Computer Research and Modeling, 2025, v. 17, no. 6, pp. 1219-1236

    Analysis of regional economic indicators plays a crucial role in management and development planning, with Gross Regional Product (GRP) serving as one of the key indicators of economic activity. The application of artificial intelligence, including neural network technologies, enables significant improvements in the accuracy and reliability of forecasts of economic processes. This study compares three neural network models for predicting the GRP of the Udmurt Republic, a typical region of the Russian Federation, based on time series data from 2000 to 2023. The selected models are an LSTM neural network optimized with the Bat Algorithm (BA-LSTM), a backpropagation neural network optimized with a Genetic Algorithm (GA-BPNN), and an Elman neural network optimized using the Particle Swarm Optimization algorithm (PSO-Elman). The research involved the standard stages of neural network modeling: data preprocessing, model training, and comparative analysis based on accuracy and forecast quality metrics. This approach allows for evaluating the advantages and limitations of each model in the context of GRP forecasting, as well as identifying the most promising directions for further research. The use of modern neural network methods opens new opportunities for automating regional economic analysis and improving the quality of forecast assessments, which is especially relevant when data are limited and rapid decision-making is required. The study uses the amount of production capital, the average annual number of labor resources, the share of high-tech and knowledge-intensive industries in GRP, and an inflation indicator as input factors for predicting GRP. The high accuracy of the predictions achieved by including these factors in the neural network models confirms their strong correlation with GRP. The results demonstrate the exceptional accuracy of the BA-LSTM model on validation data: the coefficient of determination was 0.82 and the mean absolute percentage error was 4.19%. The high performance and reliability of this model confirm its capacity to effectively predict the dynamics of GRP. Over the forecast period up to 2030, the Udmurt Republic is expected to experience an annual increase in GRP of 4.6% in current prices, or 2.5% in comparable 2023 prices. By 2030, GRP is projected to reach 1264.5 billion rubles.
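
    For reference, the two validation metrics reported above (the coefficient of determination and the mean absolute percentage error) can be computed as in the minimal sketch below; the GRP values in the example are invented and serve only to show the calculation.

```python
import numpy as np

def r_squared(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Coefficient of determination R^2."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def mape(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute percentage error, in percent."""
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

# Hypothetical GRP values on a validation split (billions of rubles).
grp_actual = np.array([652.1, 701.4, 748.9, 804.3])
grp_forecast = np.array([640.0, 712.5, 733.0, 820.1])

print(f"R^2  = {r_squared(grp_actual, grp_forecast):.2f}")
print(f"MAPE = {mape(grp_actual, grp_forecast):.2f}%")
```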

  3. Khramtsova E.A., Kapralova I.V., Mezhevikina L.M.
    Prediction of embryo implantation potential by morphology assessment
    Computer Research and Modeling, 2010, v. 2, no. 1, pp. 111-116

    Early embryos developing in vitro to the blastocyst stage have low implantation potential. In this work, microinjection was used to identify the most viable blastocysts with high implantation ability on the basis of morphological changes. The recovery rate of the embryo volume allows assessment of the functional activity of the trophoblast cells involved in implantation. A predictive model is suggested to forecast the developmental effectiveness of blastocysts in vitro. It is shown that the recovery rate of the blastocyst volume after microinjection is the most important indicator of the implantation potential of early embryos. The maximal recovery rate of blastocyst volume (35.7% of the initial volume per 1 h) correlates with the embryos' ability to form colonies 72 h after microinjection. Using the area under the receiver operating characteristic curve (AUC), it was shown that the combination of blastocyst stage (middle or late) and recovery rate after microinjection allows prediction of blastocyst development.
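
    As a rough illustration of the reported evaluation, the sketch below scores a combination of two features (blastocyst stage and volume recovery rate) by the area under the ROC curve using scikit-learn; the toy data are invented and do not reproduce the study's measurements.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical toy data: stage (0 = middle, 1 = late) and volume recovery rate
# (% of initial volume per hour); the label marks whether the blastocyst developed further.
X = np.array([[0, 12.0], [0, 30.5], [1, 35.7], [1, 8.2],
              [0, 25.1], [1, 33.0], [0, 10.4], [1, 28.9]])
y = np.array([0, 1, 1, 0, 1, 1, 0, 1])

# Combine the two features in a simple classifier and score it by AUC.
clf = LogisticRegression().fit(X, y)
scores = clf.predict_proba(X)[:, 1]
print(f"AUC = {roc_auc_score(y, scores):.2f}")
```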

  4. Shumov V.V.
    Reconstruction of the security of the Roman Empire
    Computer Research and Modeling, 2016, v. 8, no. 1, pp. 169-200

    The paper considers a model of national security that reflects the dichotomy between the values of development and conservation, with its parameters estimated using the examples of Russia (the USSR), the United States, Germany, and Ukraine. Calculations are then performed to assess the security of the Roman Empire. It is shown that by 160 AD the value of the conservation function had reached critically low levels, which served as the impetus for modernization and reform.

  5. Levashova N.T., Muhartova Ju.V., Olchev A.V.
    Three-dimensional modelling of turbulent transfer in the atmospheric surface layer using the theory of contrast structures
    Computer Research and Modeling, 2016, v. 8, no. 2, pp. 355-367

    A three-dimensional (3D) hydrodynamic model describing the spatial patterns of wind and turbulence characteristics in the atmospheric surface layer over inhomogeneous vegetation cover is presented. The theory of contrast structures is used to describe the interaction of the air flow with vegetation. Numerical experiments performed with the developed model to assess the impact of a small clear-cut on the wind and turbulence regime in the atmospheric surface layer showed a significant influence of heterogeneous vegetation on the wind field and on the turbulent exchange processes between the land surface and the atmosphere. The obtained results are in reasonable agreement with field experimental data and with the results of numerical experiments performed using alternative models.

  6. Svetlov K.V., Ivanov S.A.
    Stochastic model of voter dynamics in online media
    Computer Research and Modeling, 2019, v. 11, no. 5, pp. 979-997

    In the present article we explore how the level of approval of a political leader changes under the influence of processes taking place on online platforms (social networks, forums, etc.). The driver of these changes is the interaction of users, through which they can exchange opinions with each other and formulate their position in relation to the political leader. In addition to interpersonal interaction, we consider such factors as information impact, expressed as an information flow with a given power and polarity (positive or negative with respect to the image of the political leader), as well as the presence of a group of agents (opinion leaders) supporting the leader or, conversely, negatively affecting the leader's representation in the media space.

    The mathematical basis of the presented research is the Kirman model, which has its roots in biology and initially found application in economics. Within the framework of this model, each user is assumed to be in one of two possible states, and a Markov jump process describing transitions between these states is specified. For the problem under consideration, these states are 0 or 1, depending on whether a particular agent is a supporter of the political leader or not. We then derive its diffusion approximation, known as the Jacobi process. Using the spectral decomposition of the infinitesimal operator of this process, we obtain an analytical representation of the transition probability density.

    Analyzing the probabilities obtained in this way, we can assess the influence of the individual factors of the model: the power and direction of the information flow available to online users and relevant to rating formation, as well as the number of supporters or opponents of the politician. Next, using the eigenfunctions and eigenvalues found, we derive expressions for the conditional mathematical expectation of the politician's rating, which can serve as a basis for building forecasts important for forming a strategy of representing the political leader in the online environment.
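
    The abstract does not give the model's specific rates, so the following is only a minimal agent-based sketch of the standard Kirman recruitment dynamics (spontaneous switching plus imitation of a randomly met agent). The agent count, the probabilities eps and mu, and the supporter/opponent labels are illustrative assumptions; the paper itself works with the continuous-time process and its Jacobi diffusion limit rather than with a discrete simulation.

```python
import numpy as np

def simulate_kirman(n_agents=200, steps=10_000, eps=0.01, mu=0.3, seed=1):
    """Discrete-time approximation of Kirman-type herding dynamics.

    At each step one random agent either flips spontaneously (prob. eps)
    or copies the state of another randomly chosen agent (prob. mu).
    Returns the share of agents in state 1 over time.
    """
    rng = np.random.default_rng(seed)
    state = rng.integers(0, 2, size=n_agents)   # 0 = opponent, 1 = supporter
    share = np.empty(steps)
    for t in range(steps):
        i = rng.integers(n_agents)
        if rng.random() < eps:                  # idiosyncratic switch
            state[i] = 1 - state[i]
        elif rng.random() < mu:                 # imitation of a random peer
            j = rng.integers(n_agents)
            state[i] = state[j]
        share[t] = state.mean()
    return share

trajectory = simulate_kirman()
print(f"final share of supporters: {trajectory[-1]:.2f}")
```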

  7. Kovalenko I.B., Dreval V.D., Fedorov V.A., Kholina E.G., Gudimchuk N.B.
    Microtubule protofilament bending characterization
    Computer Research and Modeling, 2020, v. 12, no. 2, pp. 435-443

    This work is devoted to the analysis of conformational changes in tubulin dimers and tetramers, in particular, the assessment of the bending of microtubule protofilaments. Three recently used approaches for estimating the bend of tubulin protofilaments are reviewed: (1) measurement of the angle between the vector passing through the H7 helices in $\alpha$ and $\beta$ tubulin monomers in the straight structure and the same vector in the curved structure of tubulin; (2) measurement of the angle between the vector connecting the centers of mass of the subunit and the associated GTP nucleotide, and the vector connecting the centers of mass of the same nucleotide and the adjacent tubulin subunit; (3) measurement of the three rotation angles of the bent tubulin subunit relative to the straight subunit. Quantitative estimates of the angles at the intra- and inter-dimer interfaces of tubulin in published crystal structures, calculated in accordance with the three metrics, are presented. Intra-dimer angles measured by method (3), both within a single structure and across different structures, were more similar to each other, which indicates a lower sensitivity of the method to local changes in tubulin conformation and characterizes it as more robust. Measuring the angle of curvature between H7 helices (method 1) produces somewhat underestimated values of the curvature per dimer. Method (2), while at first glance producing bending angle values consistent with the estimates for curved protofilaments from cryo-electron microscopy, significantly overestimates the angles in straight structures. For the structures of tubulin tetramers in complex with the stathmin protein, the bending angles calculated with all three metrics varied quite significantly between the first and second dimers (by 20% or more), which indicates the sensitivity of all metrics to slight variations in the conformation of tubulin dimers within these complexes. A detailed description of the procedures for measuring the bending of tubulin protofilaments, together with an identification of the advantages and disadvantages of the various metrics, will increase the reproducibility and clarity of the analysis of tubulin structures in the future and will hopefully make it easier to compare results obtained by different scientific groups.
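
    Method (1) above reduces to computing the angle between a reference direction in the straight structure and the corresponding direction in the curved structure. A minimal sketch, with invented coordinates standing in for the H7-helix vectors extracted from atomic structures:

```python
import numpy as np

def bending_angle(v_straight: np.ndarray, v_curved: np.ndarray) -> float:
    """Angle (degrees) between two direction vectors, e.g. vectors through
    the H7 helices of a straight and a curved tubulin structure."""
    cos_a = np.dot(v_straight, v_curved) / (np.linalg.norm(v_straight) * np.linalg.norm(v_curved))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Hypothetical H7-axis vectors extracted from atomic coordinates.
v_straight = np.array([0.0, 0.0, 1.0])
v_curved = np.array([0.0, 0.2, 0.98])
print(f"bending angle: {bending_angle(v_straight, v_curved):.1f} degrees")
```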

  8. Ansori Moch.F., Al Jasir H., Sihombing A.H., Putra S.M., Nurfaizah D.A., Nurulita E.
    Assessing the impact of deposit benchmark interest rate on banking loan dynamics
    Computer Research and Modeling, 2024, v. 16, no. 4, pp. 1023-1032

    Deposit benchmark interest rates are a policy implemented by banking regulators to calculate the interest rates offered to depositors, maintaining equitable and competitive rates within the financial industry. They function as a benchmark for pricing different banking products, expenses, and financial choices. The benchmark rate has a direct impact on the amount of money deposited, which in turn determines the amount of money available for lending. We are therefore motivated to analyze the influence of deposit benchmark interest rates on the dynamics of banking loans. This study examines the issue using a difference equation for banking loans, in which the decision on the loan amount in the next period is influenced by both the present loan volume and information on its marginal profit. We analyze the loan equilibrium point and its stability, as well as the bifurcations that arise in the model. To ensure a stable banking loan, the benchmark rate must be set higher than the flip bifurcation value and lower than the transcritical bifurcation value. This result is confirmed by the bifurcation diagram and the associated Lyapunov exponent. An insufficient deposit benchmark interest rate may lead to chaotic dynamics in banking lending. A bifurcation diagram in two parameters is also shown. We perform a numerical sensitivity analysis by examining contour plots of the stability conditions as they vary with the deposit benchmark interest rate and other parameters. In addition, we examine a nonstandard difference scheme for the model, assess its stability, and compare it with the standard model. The outcome of our study can provide valuable insights to the banking regulator in making informed decisions regarding deposit benchmark interest rates, taking into account several other banking factors.
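
    The loan difference equation itself is not reproduced in the abstract, so the sketch below only illustrates how the largest Lyapunov exponent of a one-dimensional map is estimated numerically; the logistic map is used purely as a stand-in for the banking-loan map, with its growth parameter playing the role the deposit benchmark rate plays in the paper.

```python
import numpy as np

def lyapunov_exponent(f, df, x0, n_iter=5_000, n_discard=500):
    """Estimate the largest Lyapunov exponent of a 1-D map x_{t+1} = f(x_t)
    as the long-run average of log|f'(x_t)|."""
    x = x0
    total = 0.0
    for t in range(n_iter):
        x = f(x)
        if t >= n_discard:                      # skip the transient
            total += np.log(abs(df(x)))
    return total / (n_iter - n_discard)

# Stand-in map (logistic); in the paper the map is the banking-loan
# difference equation, with the deposit benchmark rate as the control parameter.
r = 3.9
f = lambda x: r * x * (1.0 - x)
df = lambda x: r * (1.0 - 2.0 * x)

print(f"estimated Lyapunov exponent: {lyapunov_exponent(f, df, 0.4):.3f}")
```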

  9. Dhivyadharshini B., Senthamarai R.
    Modeling the indirect impact of rhinoceros beetle control on red palm weevils in coconut plantations
    Computer Research and Modeling, 2025, v. 17, no. 4, pp. 737-752

    In this paper, a mathematical model is developed and analyzed to assess the indirect impact of controlling rhinoceros beetles on red palm weevil populations in coconut plantations. The model consists of a system of six non-linear ordinary differential equations (ODEs), capturing the interactions among healthy and infected coconut trees, rhinoceros beetles, red palm weevils, and the Oryctes virus. The model ensures biological feasibility through positivity and boundedness analysis. The basic reproduction number $R_0$ is derived using the next-generation matrix method. Both local and global stability of the equilibrium points are analyzed to determine conditions for pest persistence or eradication. Sensitivity analysis identifies the most influential parameters for pest management. Numerical simulations reveal that by effectively controlling the rhinoceros beetle population, particularly through infection with the Oryctes virus, the spread of the red palm weevil can also be suppressed. This indirect control mechanism helps to protect the coconut tree population more efficiently and supports sustainable pest management in coconut plantations.
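
    The abstract names the next-generation matrix method; in its general form, $R_0$ is the spectral radius of $F V^{-1}$. A minimal numerical sketch with small illustrative matrices (not the paper's six-equation model) is:

```python
import numpy as np

def basic_reproduction_number(F: np.ndarray, V: np.ndarray) -> float:
    """R0 as the spectral radius of the next-generation matrix F V^{-1},
    where F holds new-infection terms and V holds transition terms
    linearized at the disease-free (pest-free) equilibrium."""
    ngm = F @ np.linalg.inv(V)
    return float(max(abs(np.linalg.eigvals(ngm))))

# Illustrative 2x2 matrices (not taken from the paper).
F = np.array([[0.4, 0.1],
              [0.0, 0.3]])
V = np.array([[0.5, 0.0],
              [-0.2, 0.6]])

print(f"R0 = {basic_reproduction_number(F, V):.2f}")
```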

  10. Belyaeva A.V.
    Comparing the effectiveness of computer mass appraisal methods
    Computer Research and Modeling, 2015, v. 7, no. 1, pp. 185-196

    Location-based models are one of the directions in building CAMA (computer-assisted mass appraisal) systems. When the location of an object is taken into account using spatial autoregressive models, the structure of the model (the type of spatial autocorrelation, the choice of “nearest neighbors”) cannot always be determined before the model is built. Moreover, in practice there are situations where methods that apply different rates depending on the object type and its location are more efficient. In this regard, two important issues arise for spatial methods:

    – the domains in which the methods are effective;

    – the sensitivity of the methods to the choice of the spatial model type and to the selected number of nearest neighbors.

    This article presents a methodology for assessing the effectiveness of computer-assisted appraisal of real estate objects. Results of testing the methodology on methods based on object location information are presented.
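
    As a rough illustration of the "nearest neighbors" ingredient discussed above, the sketch below builds a row-standardized k-nearest-neighbor spatial weights matrix from object coordinates; the coordinates and the value k = 2 are invented, and the paper's actual model specification is not reproduced here.

```python
import numpy as np

def knn_weights(coords: np.ndarray, k: int) -> np.ndarray:
    """Row-standardized k-nearest-neighbor spatial weights matrix W,
    a basic building block of spatial autoregressive models."""
    n = coords.shape[0]
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)              # an object is not its own neighbor
    W = np.zeros((n, n))
    for i in range(n):
        neighbors = np.argsort(dist[i])[:k]
        W[i, neighbors] = 1.0 / k               # equal, row-standardized weights
    return W

# Hypothetical property coordinates (e.g. projected x, y in kilometers).
coords = np.array([[0.0, 0.0], [1.0, 0.2], [0.5, 2.0], [3.0, 3.1], [2.8, 2.9]])
print(knn_weights(coords, k=2))
```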
