All issues
- 2026 Vol. 18
- 2025 Vol. 17
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
Automating high-quality concept banks: leveraging LLMs and multimodal evaluation metrics
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1555-1567
Interpretability of recent deep learning models has become a focal point of research, particularly in sensitive domains such as healthcare and finance. Concept bottleneck models have emerged as a promising approach to transparency and interpretability: they use a set of human-understandable concepts as an intermediate representation before the prediction layer. However, manual concept annotation is impractical because of the time and effort involved. Our work explores the potential of large language models (LLMs) for generating high-quality concept banks and proposes a multimodal evaluation metric to assess the quality of generated concepts. We investigate three key research questions: whether LLMs can generate concept banks comparable to existing knowledge bases such as ConceptNet; whether unimodal text-based semantic similarity suffices for evaluating concept-class label associations; and whether multimodal information quantifies concept generation quality more effectively than unimodal concept-label semantic similarity. Our findings reveal that multimodal models outperform unimodal approaches in capturing concept-class label similarity. Furthermore, our generated concepts for the CIFAR-10 and CIFAR-100 datasets surpass those obtained from ConceptNet and the baseline, demonstrating the standalone capability of LLMs to generate high-quality concepts. Automatic generation and evaluation of high-quality concepts will let researchers adapt to a new dataset quickly, with little to no effort, before feeding it into concept bottleneck models.
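As a toy illustration of the similarity scoring described above (all embeddings and names below are placeholders; a real pipeline would take vectors from a text or multimodal encoder such as SBERT or CLIP rather than hand-written arrays), candidate concepts can be ranked by cosine similarity to a class-label embedding:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def score_concepts(concept_embs, label_emb):
    """Rank candidate concepts by similarity to a class-label embedding."""
    scores = {c: cosine_sim(e, label_emb) for c, e in concept_embs.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Toy 3-d embeddings standing in for real encoder outputs (illustrative only).
label = np.array([1.0, 0.0, 0.0])          # e.g. class "bird"
concepts = {
    "has wings": np.array([0.9, 0.1, 0.0]),
    "made of metal": np.array([0.0, 0.2, 1.0]),
}
ranked = score_concepts(concepts, label)    # best-matching concept first
```

Swapping the text-embedding source for an image-aware one changes only where the vectors come from, which is what makes the unimodal-vs-multimodal comparison in the abstract a drop-in experiment.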
-
Explainable artificial intelligence: principles, methods and applications
Computer Research and Modeling, 2026, v. 18, no. 2, pp. 211-241
Explainable Artificial Intelligence (XAI) is a field of artificial intelligence aimed at creating methods and tools for generating interpretable and human-understandable explanations of AI decisions. The relevance of model explainability increases with the deployment of artificial intelligence in critical domains (healthcare, finance, law), where algorithmic opacity can lead to serious consequences for users and society. This work presents an analytical review of the current state of the XAI field, covering theoretical foundations, methodology, and practical applications.
The examined explainable AI methods were selected and systematized based on a multi-level classification of XAI methods by problem formulation (goal, target audience, data type), methodology (application stage, model-specificity, methods, scale), and result form (representation, presentation, evaluation metrics).
A comparative analysis of explainable AI methods for various application domains is conducted. For classical machine learning, SHAP and LIME are examined in detail, revealing their theoretical foundations, computational characteristics, and limitations. For computer vision, gradient-based methods (SmoothGrad, Integrated Gradients), activation visualization methods (Grad-CAM, Grad-CAM++), perturbation-based methods (RISE, Occlusion), and conceptual explanations (TCAV, Network Dissection) are systematized. Special attention is paid to the specifics of applying XAI to natural language processing and large language models, including analysis of the faithfulness of Chain-of-Thought reasoning, natural language explanations, and attribution graph methods. Fundamental limitations of existing approaches to LLM explainability are identified and directions for future research are defined.
The review results demonstrate that XAI methods have reached significant maturity in classical machine learning and computer vision; however, their application to large language models remains an open research problem requiring the development of new explanation paradigms.
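The perturbation-based family mentioned in the review (Occlusion, RISE) can be sketched in a few lines. This is a toy illustration, not any surveyed method's implementation: the "model" is a stand-in for a real classifier's class score, and importance is measured as the score drop when each patch is zeroed out:

```python
import numpy as np

def occlusion_map(model, image, patch=2):
    """Score drop when each patch is zeroed: larger drop = more important region."""
    base = model(image)
    h, w = image.shape
    heat = np.zeros_like(image, dtype=float)
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            x = image.copy()
            x[i:i+patch, j:j+patch] = 0.0           # occlude one patch
            heat[i:i+patch, j:j+patch] = base - model(x)
    return heat

# Toy "model": responds only to the top-left quadrant of a 4x4 "image".
weights = np.zeros((4, 4)); weights[:2, :2] = 1.0
model = lambda img: float((img * weights).sum())
heat = occlusion_map(model, np.ones((4, 4)))
```

The heat map correctly highlights the top-left quadrant, which is the only region the toy model looks at; a method like RISE replaces the sliding patch with randomized masks and a weighted average of the same score drops.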
-
Stochastic formalization of the gas dynamic hierarchy
Computer Research and Modeling, 2022, v. 14, no. 4, pp. 767-779
Mathematical models of gas dynamics and the computational industry built on them are, in our opinion, far from perfect. We look at this problem from the point of view of a transparent probabilistic micro-model of a gas of hard spheres, relying both on the theory of random processes and on classical kinetic theory in terms of distribution function densities in phase space: we first construct a system of nonlinear stochastic differential equations (SDEs), and then a generalized, random and nonrandom, integro-differential Boltzmann equation that accounts for correlations and fluctuations. The key feature of the initial model is the random nature of the intensity of the jump measure and its dependence on the process itself.
We briefly recall the transition to increasingly coarse meso- and macro-approximations as the dimensionless parameter, the Knudsen number, decreases. We obtain stochastic and nonrandom equations, first in phase space (a meso-model in terms of SDEs with respect to the Wiener measure and the Kolmogorov-Fokker-Planck equations), and then in coordinate space (macro-equations that differ from the Navier-Stokes system and quasi-gas-dynamic systems). The main difference of this derivation is a more accurate velocity averaging, due to the analytical solution of the stochastic differential equations with respect to the Wiener measure, in whose form the intermediate meso-model in phase space is presented. This approach differs significantly from the traditional one, which uses not the random process itself but its distribution function. The emphasis is placed on the transparency of the assumptions made in the transition from one level of detail to another, rather than on numerical experiments, which carry additional approximation errors.
The theoretical power of the microscopic representation of macroscopic phenomena is also important as ideological support for particle methods, an alternative to difference and finite element methods.
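The meso-level link in this hierarchy can be illustrated by a standard textbook relation (shown here for a scalar process; this is not a reproduction of the paper's derivation): for an Itô SDE with drift $b$ and diffusion $\sigma$, the density of the process satisfies the Kolmogorov-Fokker-Planck (forward) equation.

```latex
% Meso-level: Ito SDE with respect to the Wiener measure W_t
dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t .
% The density p(x,t) of X_t then satisfies
\frac{\partial p}{\partial t}
  = -\frac{\partial}{\partial x}\bigl(b(x)\,p\bigr)
  + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl(\sigma^{2}(x)\,p\bigr).
```

Averaging such a phase-space density over velocities is what produces the coordinate-space macro-equations mentioned in the abstract.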
-
A simple numerical splitting method for solving the linear Boltzmann kinetic equation with intense scattering
Computer Research and Modeling, 2026, v. 18, no. 2, pp. 315-333
This paper analyzes some issues in developing numerical methods for problems governed by a Boltzmann-type linear kinetic transport equation. Existing applications of this type of equation are listed. The focus is on the problem of radiative transfer in a flat layer, which is important for experimental research practice. Key definitions and traditional constraints applied to radiative transfer problems are presented. Some features of formulating radiative transfer problems for flat layers of irregular heterogeneous composite materials that are partially transparent to electromagnetic radiation are considered. The main approaches to the numerical and numerical-analytical solution of the linear kinetic transport equation are outlined.
Some variants of the simplest grid numerical methods for solving nonstationary kinetic transport problems in a flat layer of a medium with strong attenuation are considered. One- and two-step variants of these iterative methods are analyzed, and for some of them the causes of instability and lack of convergence are investigated and established. It is shown that in the explicit conservative one-step method for a layer of a homogeneous absorbing, but neither radiating nor scattering, medium, unstable modes always exist in the spectrum of harmonic solutions. These modes arise in the region of radiation propagating almost parallel to the layer boundaries; their instability grows as attenuation effects increase and is caused by the small coefficient in front of the spatial derivative in the transport equation. To limit the undesirable influence of this component, various variants of splitting the equation into two and three fractional steps are considered.
It is shown that the most preferable options are those with explicitly organized fractional steps, for which a proof of stability and convergence based on Lax's equivalence theorem is presented. It is demonstrated that a correct construction of the fractional-step sequence in explicit schemes for the numerical solution of nonstationary linear kinetic transport problems can provide additional stabilization, with the scattering integral playing an important role. Thus, when solving kinetic transport problems in media with high scattering albedo, the explicit grid relaxation method with iterations split into three fractional steps based on the physical processes proved to be the simplest and most effective. The method is implemented as Matlab code, which performs quality control as the numerical solution is generated. The most significant modeling results are presented, confirming that the three-step method imposes relatively moderate requirements on resources and numerical integration accuracy and ensures conditional convergence of iterations. Its mathematical correctness is confirmed by the behavior of the equation residuals and by direct control of the convergence of numerical solutions. Its physical correctness is confirmed by the fact that, for ergodic systems, the solutions converge to an invariant steady state independent of the initial conditions. Some discovered and possible limitations of the method are listed.
The work will be useful to specialists in mathematical modeling, numerical methods, kinetic theory, and combined heat and mass transfer who deal with the interpretation of experimental data, as well as to graduate students and senior students specializing in these areas.
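The fractional-step idea can be sketched on a simpler 1D model equation, du/dt + mu du/dx = -kappa u (an illustrative toy, not the paper's Matlab code): an explicit upwind transport step, followed by an absorption step integrated exactly and hence unconditionally stable, so the stiff attenuation term no longer restricts the time step:

```python
import numpy as np

def step_split(u, mu, kappa, dx, dt):
    """One time step split into two fractional steps:
    (1) explicit upwind advection with speed mu > 0 (stable for mu*dt/dx <= 1),
    (2) absorption -kappa*u integrated exactly via an exponential factor."""
    u_adv = u.copy()
    u_adv[1:] = u[1:] - mu * dt / dx * (u[1:] - u[:-1])   # fractional step 1
    return u_adv * np.exp(-kappa * dt)                    # fractional step 2

nx, dx, dt = 50, 0.02, 0.01          # CFL number mu*dt/dx = 0.5
u = np.zeros(nx); u[10] = 1.0        # initial pulse
for _ in range(20):
    u = step_split(u, mu=1.0, kappa=5.0, dx=dx, dt=dt)
```

Because upwind transport is monotone at this CFL number and the decay step is exact, the split solution stays non-negative and its total mass decays by exactly exp(-kappa*t), which mirrors the stabilizing role the abstract attributes to handling the attenuation terms in their own fractional step.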
-
Investigation of individual-based mechanisms of single-species population dynamics by logical deterministic cellular automata
Computer Research and Modeling, 2015, v. 7, no. 6, pp. 1279-1293
Investigation of logical deterministic cellular automata models of population dynamics makes it possible to reveal detailed individual-based mechanisms. The search for such mechanisms is important in connection with ecological problems caused by overexploitation of natural resources, environmental pollution, and climate change. Classical models of population dynamics are phenomenological in nature: they are "black boxes". Phenomenological models fundamentally complicate the study of detailed mechanisms of ecosystem functioning. We have investigated the role of fecundity and of the duration of resource regeneration in mechanisms of population growth using four models of a single-species ecosystem. These models are logical deterministic cellular automata based on the physical axiomatics of an excitable medium with regeneration. We have modeled a catastrophic population die-off arising from an increase in the duration of resource regeneration. It has been shown that greater fecundity accelerates population extinction. The investigated mechanisms are important for understanding ecosystem sustainability and biodiversity conservation. Prospects of the presented modeling approach as a method of transparent multilevel modeling of complex systems are discussed.
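A minimal sketch of a logical deterministic cellular automaton of this excitable-medium type (states and rules here are illustrative assumptions, not the paper's four models): a site is free (0), occupied by an individual (1), or regenerating its resource for tau steps (states 2..tau+1) before becoming free again:

```python
import numpy as np

def step(grid, tau):
    """One synchronous update of the logical deterministic CA:
    - a free site (0) becomes occupied if any 4-neighbour is occupied;
    - an occupied site (1) dies and enters regeneration (state 2);
    - regenerating sites advance 2 -> 3 -> ... -> tau+1 -> 0 (free)."""
    new = grid.copy()
    occ = grid == 1
    nb = np.zeros_like(grid, dtype=bool)          # any occupied 4-neighbour?
    nb[1:, :] |= occ[:-1, :]; nb[:-1, :] |= occ[1:, :]
    nb[:, 1:] |= occ[:, :-1]; nb[:, :-1] |= occ[:, 1:]
    new[(grid == 0) & nb] = 1                     # birth into free sites
    new[occ] = 2                                  # death + start of regeneration
    reg = grid >= 2
    new[reg] = np.where(grid[reg] < tau + 1, grid[reg] + 1, 0)
    return new

g = np.zeros((5, 5), dtype=int); g[2, 2] = 1      # one individual in the centre
for _ in range(3):
    g = step(g, tau=2)
pop = int((g == 1).sum())                         # ring of 8 after 3 steps
```

The population propagates as an expanding wave over regenerating sites, and raising tau delays re-colonization behind the wave front, which is the kind of individual-based mechanism the abstract describes behind catastrophic die-offs.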
-
Biohydrochemical portrait of the White Sea
Computer Research and Modeling, 2018, v. 10, no. 1, pp. 125-160
The biohydrochemical portrait of the White Sea is constructed from CNPSi-model calculations based on long-term mean annual observations (average monthly hydrometeorological, hydrochemical, and hydrobiological parameters of the marine environment) as well as on updated information on the nutrient input to the sea with the runoff of the main river tributaries (Niva, Onega, Northern Dvina, Mezen, Kem, Keret). The marine environment parameters are temperature, light, transparency, and nutrient load. The ecological characteristics of the sea "portrait" were calculated for nine marine areas (Kandalaksha, Onega, Dvina, and Mezen Bays, the Solovetsky Islands, the Basin, the Gorlo, the Voronka, and Chupa Bay): the concentration changes of organic and mineral compounds of the nutrient elements (C, N, P, Si); the biomass of organisms of the lower trophic level (heterotrophic bacteria, diatom phytoplankton, herbivorous and predatory zooplankton); and other characteristics (rates of change of substance concentrations and organism biomass, internal and external substance flows, and balances of individual substances and nutrients as a whole). Parameters of the marine environment state (water temperature, the ratio of mineral fractions N < P) and of the diatom phytoplankton dominant in the sea (abundance, production, biomass, chlorophyll a content) were calculated and compared with the results of individual surveys (for 1972–1991 and 2007–2012) of White Sea water areas. The methods for estimating these parameters from observations and from calculations differ; nevertheless, the calculated values of the phytoplankton state are comparable with the measurements and similar to the data reported in the literature.
Thus, according to literature data, the annual production of diatoms in the White Sea is estimated at 1.5–3 million tons C (for a vegetation period of 180 days), while according to the calculations it is ~2 and ~3.5 million tons C for vegetation periods of 150 and 180 days, respectively.
Keywords: White Sea ecosystem, nutrients, heterotrophic bacterioplankton, diatom phytoplankton, herbivorous and predatory zooplankton, detritus, trophic chain, CNPSi-model of nutrient biotransformation, ecological portrait of the White Sea, comparison of observed and calculated parameters of diatoms (abundance, production, biomass, chlorophyll a).
-
On the permissible intensity of laser radiation in the optical system and on the technology for measuring the absorption coefficient of its power
Computer Research and Modeling, 2021, v. 13, no. 5, pp. 1025-1044
Laser damage to transparent solids is a major factor limiting the output power of laser systems. For laser rangefinders, the most likely cause of destruction of the elements of the optical system (lenses, mirrors), which in practice are usually somewhat dusty, is not optical breakdown by avalanche, but a thermal effect on a dust speck deposited on an element of the optical system (EOS) that leads to its ignition. It is the ignition of a speck of dust that initiates the damage to the EOS.
The corresponding model of the process leading to the ignition of a dust speck takes into account the nonlinear Stefan-Boltzmann law of thermal radiation and the unbounded-in-time thermal effect of periodic radiation on the EOS and the dust speck. The model is described by a nonlinear system of differential equations for two functions: the EOS temperature and the dust speck temperature. It is proved that, owing to the accumulating effect of the periodic thermal action, the dust speck reaches its ignition temperature for almost any a priori possible changes, during this process, of the thermophysical parameters of the EOS and the dust speck, as well as of the heat exchange coefficients between them and the surrounding air. Averaging these parameters over the variables related to both the volume and the surfaces of the dust speck and the EOS is correct under the natural constraints specified in the paper. The entire practically significant range of thermophysical parameters is covered thanks to the use of dimensionless variables in the problem (including in the numerical results).
A thorough mathematical study of the corresponding nonlinear system of differential equations made it possible, for the first time in the general case of thermophysical parameters and characteristics of the thermal effect of periodic laser radiation, to find a formula for the permissible radiation intensity that does not lead to destruction of the EOS through the ignition of a dust speck deposited on it. In the special case of the data from the Grasse laser ranging station (southern France), the theoretical value of the permissible intensity found in the general case almost matches the experimentally observed one.
In parallel with the solution of the main problem, we derive a formula for the power absorption coefficient of laser radiation by an EOS expressed in terms of four dimensionless parameters: the relative intensity of laser radiation, the relative illumination of the EOS, the relative heat transfer coefficient from the EOS to the surrounding air, and the relative steady-state temperature of the EOS.
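A schematic form of such a two-temperature system (the symbols below are illustrative assumptions, not the paper's notation): with $T_d$ and $T_e$ the dust speck and EOS temperatures, $I(t)$ the periodic laser intensity, $\alpha$ absorption coefficients, $h$ heat exchange coefficients, $\varepsilon$ emissivities, and $\sigma$ the Stefan-Boltzmann constant, one may write

```latex
c_d \frac{dT_d}{dt} = \alpha_d I(t) - h_{da}\,(T_d - T_{\mathrm{air}})
                      - h_{de}\,(T_d - T_e) - \varepsilon_d \sigma T_d^{4},
\\
c_e \frac{dT_e}{dt} = \alpha_e I(t) - h_{ea}\,(T_e - T_{\mathrm{air}})
                      + h_{de}\,(T_d - T_e) - \varepsilon_e \sigma T_e^{4},
```

with ignition declared when $T_d$ reaches the ignition temperature of the dust; the quartic radiation terms are what makes the system nonlinear and the periodic forcing what produces the accumulating heating described above.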
-
NLP-based automated compliance checking of data processing agreements against General Data Protection Regulation
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1667-1685
In the contemporary world, compliance with data protection regulations such as the GDPR is central to organizations. Compliance is hampered by the complexity of legal documents and by regulations that are constantly changing. This paper describes how NLP can ease GDPR compliance through automated compliance scanning, evaluation of privacy policies, and increased transparency. The work not only explores the application of NLP to privacy policies and to a better understanding of third-party data sharing, but also carries out preliminary studies comparing several NLP models. The models are implemented and executed to identify the one that best automates compliance verification and privacy-policy analysis in terms of efficiency and speed. The research also discusses the possibility of applying automatic tools and data analysis to the GDPR, for instance, generating machine-readable models that assist in compliance evaluation. Among the evaluated models, SBERT performed best at the policy level, with an accuracy of 0.57, precision of 0.78, recall of 0.83, and F1-score of 0.80. BERT showed the highest performance at the sentence level, achieving an accuracy of 0.63, precision of 0.70, recall of 0.50, and F1-score of 0.55. This paper thus emphasizes the importance of NLP in helping organizations overcome the difficulties of GDPR compliance and in creating a roadmap toward a more client-oriented data protection regime.
By comparing the preliminary studies and demonstrating the performance of the better model, the work helps strengthen compliance measures and fosters the defense of individual rights in cyberspace.
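A minimal sketch of similarity-based matching of agreement sentences to GDPR provisions (a toy bag-of-words stand-in for the sentence encoders compared in the paper; the example provisions are paraphrases, not legal text):

```python
import numpy as np
from collections import Counter

def embed(text, vocab):
    """Toy bag-of-words vector; a real pipeline would use a sentence
    encoder such as SBERT instead of word counts."""
    counts = Counter(text.lower().split())
    return np.array([counts[w] for w in vocab], dtype=float)

def best_match(sentence, provisions, vocab):
    """Return the provision with the highest cosine similarity to the sentence."""
    s = embed(sentence, vocab)
    def cos(a, b):
        n = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / n) if n else 0.0
    return max(provisions, key=lambda p: cos(s, embed(p, vocab)))

provisions = [  # hypothetical paraphrases of processor obligations
    "the processor shall notify the controller of a personal data breach",
    "the processor shall delete personal data after the end of services",
]
vocab = sorted({w for p in provisions for w in p.split()})
m = best_match("processor must notify controller about any data breach",
               provisions, vocab)
```

Replacing `embed` with a trained sentence encoder turns this sketch into the sentence-level matching setting in which BERT performed best in the paper's evaluation.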
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index
International Interdisciplinary Conference "Mathematics. Computing. Education"




