-
Physical analysis and mathematical modeling of the parameters of explosion region produced in a rarefied ionosphere
Computer Research and Modeling, 2022, v. 14, no. 4, pp. 817-833

The paper presents a physical and numerical analysis of the dynamics and radiation of explosion products formed during the Russian-American ionospheric experiment, which used an explosive generator based on hexogen (RDX) and trinitrotoluene (TNT). The main attention is paid to the radiation of the perturbed region and the dynamics of the products of explosion (PE). The detailed chemical composition of the explosion products is analyzed, the initial concentrations of the most important molecules capable of emitting in the infrared range of the spectrum are determined, and their radiative constants are given. The initial temperature of the explosion products and the adiabatic exponent are determined. The nature of the interpenetration of atoms and molecules of the highly rarefied ionosphere into the spherically expanding cloud of products is analyzed. An approximate mathematical model of the dynamics of explosion products mixing with rarefied ionospheric air has been developed, and the main thermodynamic characteristics of the system have been calculated. It is shown that over 0.3–3 s the temperature of the scattering mixture increases significantly as a result of its deceleration. In the problem under consideration the explosion products and the background gas are separated by a contact boundary. To solve this two-region gas-dynamic problem, a numerical algorithm based on the Lagrangian approach was developed, with special conditions imposed at the contact boundary as it moves through the stationary gas. Certain difficulties arise in describing the parameters of the explosion products near the contact boundary; they are associated with the large difference in the size of the mass cells of the explosion products and the background, caused by a density difference of 13 orders of magnitude.
To reduce the calculation time, an irregular computational grid was used in the region of the explosion products. Calculations were performed with different adiabatic exponents. The most important result is the temperature, which is in good agreement with the results obtained by the method that approximately accounts for interpenetration. The time behavior of the IR emission coefficients of the active molecules over a wide spectral range is obtained; it is qualitatively consistent with experimental observations of the IR glow of the flying explosion products.
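The irregular-grid idea can be illustrated with a minimal sketch (cell count and growth ratio are hypothetical, not the paper's values): grading the Lagrangian mass cells geometrically keeps the cells adjacent to the contact boundary comparable in mass to the light background cells while covering the dense interior with far fewer cells.

```python
import numpy as np

def graded_mass_cells(total_mass, n_cells, ratio=1.3):
    """Irregular Lagrangian grid: cell masses form a geometric progression,
    smallest at the contact boundary (last index), largest in the interior,
    so the cell-mass jump across the boundary stays moderate."""
    w = ratio ** np.arange(n_cells)[::-1]   # weights decrease toward the boundary
    return total_mass * w / w.sum()

# 40 cells spanning more than 4 orders of magnitude in mass
cells = graded_mass_cells(total_mass=1.0, n_cells=40, ratio=1.3)
```

With a ratio of 1.3 the innermost cell is about $1.3^{39}$ times heavier than the boundary cell, which mitigates (though does not eliminate) the 13-orders-of-magnitude density contrast discussed above.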
-
NLP-based automated compliance checking of data processing agreements against General Data Protection Regulation
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1667-1685

In the contemporary world, compliance with data protection regulations such as the GDPR is central to organizations. Compliance is hampered by the complexity of legal documents and by regulations that change constantly. This paper describes how NLP can ease GDPR compliance through automated compliance scanning, evaluation of privacy policies, and increased transparency. The work not only explores the application of NLP to privacy policies and to a better understanding of third-party data sharing, but also reports preliminary studies comparing several NLP models. The models are implemented and executed to identify the one that best automates compliance verification and privacy-policy analysis in terms of efficiency and speed. The research also discusses the use of automated tools and data analysis for the GDPR, for instance the generation of machine-readable models that assist in compliance evaluation. Among the evaluated models, SBERT performed best at the policy level, with an accuracy of 0.57, precision of 0.78, recall of 0.83, and F1-score of 0.80. BERT showed the highest performance at the sentence level, achieving an accuracy of 0.63, precision of 0.70, recall of 0.50, and F1-score of 0.55. The paper thus emphasizes the importance of NLP in helping organizations overcome the difficulties of GDPR compliance and in creating a roadmap toward a more client-oriented data protection regime.
By comparing the preliminary studies and demonstrating the performance of the better model, the work helps strengthen compliance measures and fosters the defense of individual rights in cyberspace.
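As a quick consistency check, the F1-score implied by the reported precision and recall can be recomputed from its definition (the reported sentence-level F1 need not match exactly if it was macro-averaged over classes rather than derived from the aggregate precision and recall):

```python
def f1(precision, recall):
    """F1 as the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported metrics: SBERT at the policy level, BERT at the sentence level
sbert_f1 = f1(0.78, 0.83)   # reported F1: 0.80
bert_f1 = f1(0.70, 0.50)    # reported F1: 0.55
```

The SBERT figure reproduces the reported 0.80 almost exactly; the BERT figure comes out slightly higher than 0.55, consistent with per-class averaging in the paper.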
-
Numerical simulation of corium cooling driven by natural convection in case of in-vessel retention and time-dependent heat generation
Computer Research and Modeling, 2021, v. 13, no. 4, pp. 807-822

The study considers numerical simulation of corium cooling driven by natural convection within a horizontal hemicylindrical cavity whose boundaries are assumed isothermal. Corium is a melt of the ceramic fuel of a nuclear reactor and oxides of construction materials.
Corium cooling is a process occurring during a severe accident associated with core melt. According to the in-vessel retention concept, the accident may be restrained and localized if the corium is contained within the vessel, provided the vessel is cooled externally. This concept has a clear advantage over the melt trap: it can be implemented at already operating nuclear power plants. Proper numerical analysis of corium cooling has therefore become a relevant area of study.
In the research, we assume the corium is contained within a horizontal semitube and initially has the temperature of the walls. Despite reactor shutdown, the corium still generates heat owing to radioactive decay, and the amount of heat released decreases with time according to the Way–Wigner formula. The system of equations in the Boussinesq approximation, comprising the momentum, continuity and energy equations, describes natural convection within the cavity. Convective flows are taken to be laminar and two-dimensional.
The boundary-value problem of mathematical physics is formulated using the non-dimensional non-primitive variables "stream function – vorticity". The resulting differential equations are solved numerically using the finite difference method and the locally one-dimensional Samarskii scheme for the equations of parabolic type.
As a result of the present research, we have obtained the time behavior of the mean Nusselt number at the top and bottom walls for Rayleigh numbers ranging from $10^3$ to $10^6$. These dependences have been analyzed for various dimensionless operation periods before the accident. The flow and heat transfer have been examined using streamlines and isotherms as well as time dependences of the convective flow and heat transfer rates.
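The decay-heat law named above can be sketched as follows. The $6.22\times10^{-3}$ coefficient is the commonly quoted form of the Way–Wigner correlation; the specific times below are illustrative, not the paper's parameters:

```python
def way_wigner(t, t_op):
    """Relative decay-heat power P(t)/P0 after shutdown by the Way-Wigner
    correlation: t is the time since shutdown, t_op the reactor operation
    period before shutdown (both in seconds)."""
    return 6.22e-3 * (t ** -0.2 - (t + t_op) ** -0.2)

# Longer operation before the accident leaves more decay heat at a given t
p_short_op = way_wigner(3600.0, t_op=30 * 86400.0)    # 30-day operation
p_long_op = way_wigner(3600.0, t_op=365 * 86400.0)    # 1-year operation
```

This captures the two features the abstract relies on: the heat release decays monotonically with time, and its level depends on the operation period before the accident.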
-
An Algorithm for Simulating the Banking Network System and Its Application for Analyzing Macroprudential Policy
Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1275-1289

Modeling banking systems using a network approach has received growing attention in recent years. One notable model is that of Iori et al., who proposed a banking system model for analyzing systemic risks in interbank networks. The model is built on the simple dynamics of several bank balance sheet variables, such as deposits, equity, loans, liquid assets, and interbank lending (or borrowing), in the form of difference equations. Each bank faces random shocks in deposits and loans. The balance sheet is updated at the beginning or end of each period. In the model, banks are grouped into potential lenders and potential borrowers. The potential borrowers are those that lack liquidity, and the potential lenders are those with excess liquidity after dividend payment and channeling new investment. The borrowers and lenders are connected through the interbank market. Each borrower is linked to some percentage of randomly chosen potential lenders, from which it borrows funds to maintain its liquidity safety net. If the demand for borrowed funds can be met by the supply of excess liquidity, the borrower bank survives. If not, it is deemed to be in default and is removed from the banking system. However, in the original paper most of the interbank borrowing-lending mechanism is described qualitatively rather than by detailed mathematical or computational analysis. Therefore, in this paper we elaborate the mathematical parts of borrowing and lending in the interbank market and present an algorithm for simulating the model. We also perform simulations to analyze the effects of the model's parameters on banking stability, using the number of surviving banks as the measure. We apply this technique to analyze the effects of a macroprudential policy, the loan-to-deposit-ratio-based reserve requirement, on banking stability.
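The clearing logic described above can be sketched in a few lines (a simplified stand-in for the paper's algorithm; bank positions, the linkage fraction, and the matching order are hypothetical):

```python
import random

def interbank_round(liquidity, link_frac=0.5, seed=0):
    """One clearing round of a simplified interbank market: banks with a
    liquidity shortfall try to borrow it from a random subset of surplus
    banks; a bank with an unmet shortfall defaults and is removed.
    `liquidity` maps bank id -> liquid position after the shock."""
    rng = random.Random(seed)
    surplus = {i: q for i, q in liquidity.items() if q > 0}
    survivors = {i for i, q in liquidity.items() if q >= 0}
    for i, q in liquidity.items():
        if q >= 0:
            continue
        need = -q
        k = max(1, int(link_frac * len(surplus))) if surplus else 0
        for j in rng.sample(sorted(surplus), k):  # random lender linkage
            take = min(need, surplus[j])
            surplus[j] -= take
            need -= take
        if need == 0:
            survivors.add(i)   # shortfall fully refinanced, bank survives
    return survivors

# Two surplus banks, one small and one hopeless shortfall
alive = interbank_round({0: 5.0, 1: -3.0, 2: 4.0, 3: -100.0}, link_frac=1.0)
```

The number of surviving banks returned by repeated rounds is exactly the stability measure the abstract uses.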
-
Mathematical features of individual dosimetric planning of radioiodotherapy based on pharmacokinetic modeling
Computer Research and Modeling, 2024, v. 16, no. 3, pp. 773-784

When determining therapeutic absorbed doses in radioiodine therapy, the method of individual dosimetric planning is increasingly used in Russian medicine. Its successful implementation, however, requires software that can model the pharmacokinetics of radioiodine in the patient's body and calculate the therapeutic activity of the radiopharmaceutical drug needed to achieve the planned therapeutic absorbed dose in the thyroid gland.
Purpose of the work: development of a software package for pharmacokinetic modeling and calculation of individual absorbed doses in radioiodine therapy, based on a five-chamber model of radioiodine kinetics and two mathematical optimization methods. The work rests on the principles and methods of radiopharmaceutical pharmacokinetics (compartmental modeling). To find the minimum of the residual functional when identifying the values of the transport constants of the model, the Hooke–Jeeves method and the simulated annealing method were used. The calculation of dosimetric characteristics and administered therapeutic activity is based on computing absorbed doses from the radioiodine activity functions in the chambers found during modeling. To identify the model parameters, radiometry results for the thyroid gland and urine of patients after radioiodine administration were used.
A software package for modeling the kinetics of orally administered radioiodine has been developed. For patients with diffuse toxic goiter, the transport constants of the model were identified and individual pharmacokinetic and dosimetric characteristics (elimination half-lives, maximum thyroid activity and the time to reach it, absorbed doses to critical organs and tissues, administered therapeutic activity) were calculated. The activity–time relationships for all chambers of the model were obtained and analyzed. A comparative analysis of the pharmacokinetic and dosimetric characteristics calculated with the two optimization methods was performed, and the stunning effect and its contribution to the errors in calculating absorbed doses were evaluated. The comparison shows that using the more complex simulated annealing method in the software package does not lead to significant changes in the characteristic values compared to the simple Hooke–Jeeves method. Errors in calculating absorbed doses within these optimization methods do not exceed the spread of absorbed dose values caused by the stunning effect.
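A minimal sketch of the Hooke–Jeeves pattern search mentioned above, applied to a toy quadratic objective (the real residual functional compares model activity curves with radiometry data, which is not reproduced here):

```python
def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6):
    """Hooke-Jeeves pattern search: exploratory moves along each
    coordinate, a pattern move after a successful sweep, and step
    halving when no move improves the objective."""
    x = list(x0)
    fx = f(x)
    while step > tol:
        base, fbase = list(x), fx
        for i in range(len(x)):          # exploratory moves
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        if fx < fbase:                    # pattern move along the success direction
            pattern = [2 * a - b for a, b in zip(x, base)]
            fp = f(pattern)
            if fp < fx:
                x, fx = pattern, fp
        else:
            step *= shrink                # no improvement: refine the step
    return x, fx

# Toy residual with minimum at (2, -1)
x_min, f_min = hooke_jeeves(lambda v: (v[0] - 2) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0])
```

Being derivative-free, the method suits residual functionals built from tabulated radiometry data, which is presumably why it was chosen alongside simulated annealing.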
-
Analysis of the rate of electron transport through photosynthetic cytochrome $b_6 f$ complex
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 997-1022

We consider an approach based on linear algebra methods to analyze the rate of electron transport through the cytochrome $b_6 f$ complex. In the proposed approach, the dependence of the quasi-stationary electron flux through the complex on the degree of reduction of pools of mobile electron carriers is considered a response function characterizing this process. We have developed software in the Python programming language that allows us to construct the master equation for the complex according to the scheme of elementary reactions and calculate quasi-stationary electron transport rates through the complex and the dynamics of their changes during the transition process. The calculations are performed in multithreaded mode, which makes it possible to efficiently use the resources of modern computing systems and to obtain data on the functioning of the complex in a wide range of parameters in a relatively short time. The proposed approach can be easily adapted for the analysis of electron transport in other components of the photosynthetic and respiratory electron-transport chain, as well as other processes in multienzyme complexes containing several reaction centers. Cryo-electron microscopy and redox titration data were used to parameterize the model of the cytochrome $b_6 f$ complex. We obtained dependences of the quasi-stationary rate of plastocyanin reduction and plastoquinone oxidation on the degree of reduction of pools of mobile electron carriers and analyzed the dynamics of rate changes in response to changes in the redox state of the plastoquinone pool. The modeling results are in good agreement with the available experimental data.
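The linear-algebra core of such an approach can be illustrated on a toy three-state cycle (the rate constants and the scheme are hypothetical; the real model encodes the $b_6 f$ reaction scheme with many more states): the quasi-stationary distribution is the null vector of the rate matrix, normalized to unit total probability.

```python
import numpy as np

# Toy master equation dp/dt = K p for a cyclic scheme 1 -> 2 -> 3 -> 1
k12, k23, k31 = 2.0, 1.0, 0.5            # hypothetical rate constants
K = np.array([[-k12, 0.0,   k31],
              [ k12, -k23,  0.0],
              [ 0.0,  k23, -k31]])       # columns sum to zero (conservation)

# Replace one balance equation by the normalisation constraint sum(p) = 1
A = K.copy()
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])
p = np.linalg.solve(A, b)                # quasi-stationary state populations

flux = k23 * p[1]                        # steady flux through the 2 -> 3 step
```

At steady state every step carries the same flux, so the flux through any single transition serves as the response function analogous to the electron transport rate studied in the paper.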
-
Enhancing DevSecOps with continuous security requirements analysis and testing
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1687-1702

The fast-paced environment of DevSecOps requires integrating security at every stage of software development to ensure secure, compliant applications. Traditional methods of security testing, often performed late in the development cycle, are insufficient to address the unique challenges of continuous integration and continuous deployment (CI/CD) pipelines, particularly in complex, high-stakes sectors such as industrial automation. In this paper, we propose an approach that automates the analysis and testing of security requirements by embedding requirements verification into the CI/CD pipeline. Our method employs the ARQAN tool to map high-level security requirements to Security Technical Implementation Guides (STIGs) using semantic search, and RQCODE to formalize these requirements as code, providing testable and enforceable security guidelines. We implemented ARQAN and RQCODE within a CI/CD framework, integrating them with GitHub Actions for real-time security checks and automated compliance verification. Our approach supports established security standards such as IEC 62443 and automates security assessment starting from the planning phase, enhancing the traceability and consistency of security practices throughout the pipeline. Evaluation of this approach in collaboration with an industrial automation company shows that it effectively covers critical security requirements, achieving automated compliance for 66.15% of the STIG guidelines relevant to the Windows 10 platform. Feedback from industry practitioners further underscores its practicality: 85% of the security requirements mapped to concrete STIG recommendations, and 62% of these had matching testable implementations in RQCODE. This evaluation highlights the approach's potential to shift security validation earlier in the development process, contributing to a more resilient and secure DevSecOps lifecycle.
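The requirement-to-STIG mapping step can be sketched with a lexical similarity stand-in (ARQAN itself uses embedding-based semantic search; the requirement text and STIG titles below are invented for illustration):

```python
from collections import Counter
from math import sqrt

def cosine(a, b):
    """Cosine similarity of two texts as bag-of-words vectors."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = sqrt(sum(v * v for v in ca.values()))
    nb = sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical high-level requirement and candidate STIG titles
requirement = "account lockout after failed login attempts"
stigs = {
    "V-1": "the system must lock the account after failed login attempts",
    "V-2": "audit logs must be forwarded to a central server",
}
best = max(stigs, key=lambda k: cosine(requirement, stigs[k]))
```

In the real pipeline the best-matching STIG would then be tied to an RQCODE test class, making the requirement executable inside GitHub Actions.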
-
Performance of the OpenMP and MPI implementations on UltraSPARC system
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 485-491

This paper targets programmers and developers interested in utilizing parallel programming techniques to enhance application performance. The Oracle Solaris Studio software provides state-of-the-art optimizing and parallelizing compilers for C, C++ and Fortran, an advanced debugger, and optimized mathematical and performance libraries. Also included are an extremely powerful performance analysis tool for profiling serial and parallel applications, a thread analysis tool to detect data races and deadlock in shared-memory parallel programs, and an Integrated Development Environment (IDE). The Oracle Message Passing Toolkit software provides the high-performance MPI libraries and associated run-time environment needed for message passing applications that can run on a single system or across multiple compute systems connected with high performance networking, including Gigabit Ethernet, 10 Gigabit Ethernet, InfiniBand and Myrinet. Examples of OpenMP and MPI are provided throughout the paper, including their usage via the Oracle Solaris Studio and Oracle Message Passing Toolkit products for development and deployment of both serial and parallel applications on SPARC and x86/x64 based systems. Throughout this paper it is demonstrated how to develop and deploy an application parallelized with OpenMP and/or MPI.
-
Effects of the heart contractility and its vascular load on the heart rate in athletes
Computer Research and Modeling, 2017, v. 9, no. 2, pp. 323-329

Heart rate (HR) is the most accessible indicator to measure. To control an individual's response to physical exercise of different load types, heart rate is measured while athletes perform different kinds of muscular work (strength machines, various training and competitive exercises). From the magnitude of heart rate and its dynamics during muscular work and recovery, one can objectively judge the functional status of an athlete's cardiovascular system, the athlete's individual physical performance, and the adaptive response to a particular exercise. However, heart rate is not an independent determinant of an athlete's physical condition. HR is shaped by the interaction of the basic physiological mechanisms underlying the hemodynamic ejection mode of the heart. It depends, on the one hand, on the contractility of the heart, the venous return, and the volumes of the atria and ventricles and, on the other hand, on the vascular load of the heart, whose main components are the elastic and peripheral resistances of the arterial system. The values of the vascular resistances of the arterial system depend on the power of muscular work and its duration. HR sensitivity to changes in cardiac contractility and vascular load was determined in athletes by pair regression analysis of simultaneously recorded heart rate data, the peripheral $(R)$ and elastic $(E_a)$ resistances (heart vascular load), and the power $(W)$ of heartbeats (cardiac contractility). The coefficients of sensitivity and pair correlation between heart rate and the vascular load and contractility of the left ventricle were determined in athletes at rest and during muscular work on a cycle ergometer.
It is shown that an increase in both the ergometer power load and the heart rate is accompanied by an increase in the correlation coefficients and the coefficients of heart rate sensitivity to $R$, $E_a$ and $W$.
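The sensitivity and correlation coefficients of a pair regression can be computed as below (the HR and heartbeat-power readings are invented for illustration, not the study's data):

```python
def slope_and_corr(x, y):
    """Least-squares slope (the sensitivity coefficient) and Pearson
    correlation of paired observations, as in pair regression analysis."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sxx, sxy / (sxx * syy) ** 0.5

# Hypothetical paired readings: heartbeat power W vs heart rate
W = [50, 100, 150, 200]
HR = [90, 110, 135, 150]
sensitivity, correlation = slope_and_corr(W, HR)
```

The slope answers "how many beats per minute does HR change per unit of $W$", which is exactly the sensitivity coefficient reported in the study for $R$, $E_a$ and $W$.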
-
A framework for medical image segmentation based on measuring diversity of pixel’s intensity utilizing interval approach
Computer Research and Modeling, 2021, v. 13, no. 5, pp. 1059-1066

Segmentation is one of the most challenging tasks in the analysis of medical images. It classifies the pixels of organs or lesions against the background of medical images such as MRI or CT scans, providing critical information about the volumes and shapes of human organs. In scientific imaging, medical imaging is considered one of the most important topics due to the rapid and continuing progress in computerized medical image visualization, advances in analysis approaches, and computer-aided diagnosis. Digital image processing is becoming more important in healthcare due to the growing use of direct digital imaging systems for medical diagnostics, and image processing approaches are now widely applicable in medicine. Generally, various transformations are needed to extract image data. A digital image can also be considered an approximation of a real scene and includes some uncertainty derived from the constraints of the imaging process, and information on the level of uncertainty influences an expert's judgment. To address this challenge, we propose a novel framework based on the interval concept, which is a good tool for dealing with uncertainty. In the proposed approach, the medical images are transformed into an interval-valued representation, and entropies are defined for the image object and background. We then determine a threshold for the lower-bound image and for the upper-bound image and take the mean value for the final output. To demonstrate the effectiveness of the proposed framework, we evaluate it using a synthetic image and its ground truth. Experimental results show how the performance of entropy-based threshold segmentation can be enhanced by the proposed approach to overcome ambiguity.
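The interval scheme described above can be sketched as follows, using a Kapur-style maximum-entropy threshold as the per-bound thresholder (the interval half-width `delta` and the entropy criterion are assumptions for illustration; the paper defines its own entropies for object and background):

```python
import numpy as np

def max_entropy_threshold(img):
    """Kapur-style maximum-entropy threshold on an 8-bit image: pick the
    gray level that maximizes the summed entropies of the two classes."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    cum = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        p0, p1 = cum[t], 1.0 - cum[t]
        if p0 == 0 or p1 == 0:
            continue
        q0 = p[: t + 1] / p0            # within-class distributions
        q1 = p[t + 1 :] / p1
        h = -sum(q * np.log(q) for q in q0 if q > 0) \
            - sum(q * np.log(q) for q in q1 if q > 0)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

def interval_threshold(img, delta=10):
    """Interval-valued variant (sketch): threshold the lower- and
    upper-bound images separately, then average the two thresholds."""
    lo = np.clip(img.astype(int) - delta, 0, 255).astype(np.uint8)
    hi = np.clip(img.astype(int) + delta, 0, 255).astype(np.uint8)
    return (max_entropy_threshold(lo) + max_entropy_threshold(hi)) / 2

# Synthetic bimodal image: two flat intensity clusters
img = np.concatenate([np.arange(30, 51), np.arange(190, 211)]).astype(np.uint8)
t_interval = interval_threshold(img, delta=10)
```

Averaging the lower- and upper-bound thresholds is the step that folds the pixel-intensity uncertainty back into a single segmentation decision.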
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index