-
Application of the streamline method for nonlinear filtration problems acceleration
Computer Research and Modeling, 2018, v. 10, no. 5, pp. 709-728
The paper presents a numerical simulation of nonisothermal nonlinear flow in a porous medium. A two-dimensional unsteady problem of heavy oil, water, and steam flow is considered. The oil phase consists of two pseudocomponents, light and heavy fractions, which, like the water component, can vaporize. The oil exhibits viscoplastic rheology, so its filtration does not obey Darcy's classical linear law. The simulation accounts not only for the dependence of fluid density and viscosity on temperature, but also for the improvement of the oil's rheological properties as the temperature increases.
To solve this problem numerically, we use the streamline method with splitting by physical processes, which separates the convective heat transfer directed along the filtration flow from thermal conductivity and gravity. The article proposes a new approach to applying streamline methods that allows nonlinear flow problems with temperature-dependent rheology to be simulated correctly. The core of the algorithm is to treat the integration process as a sequence of quasi-equilibrium states obtained by solving the system on a global grid; between these states, the system is solved on a streamline grid. Using the streamline method not only accelerates calculations but also yields a physically reliable solution, since the integration takes place on a grid aligned with the direction of fluid flow.
In addition to the streamline method, the paper presents an algorithm for handling the nonsmooth coefficients that arise in simulations of viscoplastic oil flow. Applying this algorithm makes it possible to keep sufficiently large time steps without changing the physical structure of the solution.
The obtained results are compared with known analytical solutions, as well as with the results of a commercial simulation package. Convergence tests with respect to the number of streamlines and to different streamline grids justify the applicability of the proposed algorithm. In addition, the reduction of computation time compared with traditional methods demonstrates the practical significance of the approach.
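The splitting described in this abstract can be illustrated on a deliberately simplified toy problem. The sketch below is not the paper's streamline solver; it is a minimal one-dimensional Python example with made-up grid parameters that only shows the general structure of integrating convective transport along the flow separately from thermal conduction within each time step.
```python
import numpy as np

def advect_upwind(T, u, dx, dt):
    """Convective substep: explicit upwind transport along the flow direction (u > 0)."""
    T_new = T.copy()
    T_new[1:] -= u * dt / dx * (T[1:] - T[:-1])
    return T_new

def conduct(T, alpha, dx, dt):
    """Conductive substep: explicit central-difference heat diffusion."""
    T_new = T.copy()
    T_new[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T_new

# One split time step applies convection first (the streamline-like substep),
# then conduction as a separate correction on the same grid.
nx, dx, dt = 100, 1.0, 0.2
u, alpha = 1.0, 0.5
T = np.zeros(nx)
T[0] = 1.0                      # hot fluid continuously injected at the inlet
for _ in range(200):
    T = advect_upwind(T, u, dx, dt)
    T = conduct(T, alpha, dx, dt)
print(f"peak T = {T.max():.2f}, thermal front at ~cell {int(np.argmax(T < 0.5))}")
```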
-
CFD analysis of hemodynamics in idealized abdominal aorta-renal artery junction: preliminary study to locate atherosclerotic plaque
Computer Research and Modeling, 2019, v. 11, no. 4, pp. 695-706
Atherosclerotic diseases such as carotid artery diseases (CAD) and chronic kidney diseases (CKD) are major causes of death worldwide. The onset of these atherosclerotic diseases in the arteries is governed by complex blood flow dynamics and hemodynamic parameters. Atherosclerosis in the renal arteries reduces arterial efficiency, which ultimately leads to renovascular hypertension. This work attempts to identify the localization of atherosclerotic plaque in the human abdominal aorta-renal artery junction using computational fluid dynamics (CFD).
The atherosclerosis-prone regions in an idealized human abdominal aorta-renal artery junction are identified by calculating relevant hemodynamic indicators from computational simulations using the rheologically accurate shear-thinning Yeleswarapu model for human blood. Blood flow is numerically simulated in a 3-D model of the artery junction using ANSYS FLUENT v18.2.
The hemodynamic indicators calculated are the average wall shear stress (AWSS), the oscillatory shear index (OSI), and the relative residence time (RRT). Simulations of pulsatile flow (f = 1.25 Hz, Re = 1000) show that low AWSS and high OSI manifest in the regions of the renal artery downstream of the junction and on the infrarenal section of the abdominal aorta lateral to the junction. High RRT, a relative index that depends on AWSS and OSI, is found to overlap with low AWSS and high OSI at the cranial surface of the renal artery proximal to the junction and on the surface of the abdominal aorta lateral to the bifurcation, indicating that these regions of the junction are prone to atherosclerosis. The results match qualitatively with findings reported in the literature and serve as an initial step in illustrating the utility of CFD for locating atherosclerotic plaque.
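The abstract does not spell out the indicator definitions; the sketch below uses the commonly adopted ones (an assumption on our part, the paper may use slightly different conventions) to compute AWSS, OSI, and RRT from a wall shear stress (WSS) vector history at a single wall node.
```python
import numpy as np

def hemodynamic_indicators(tau):
    """tau: array of shape (n_samples, 3), WSS vector sampled over one cardiac cycle."""
    mag = np.linalg.norm(tau, axis=1)                  # |tau_w|(t)
    awss = mag.mean()                                  # time-averaged WSS magnitude
    mean_vec = np.linalg.norm(tau.mean(axis=0))        # magnitude of the time-averaged vector
    osi = 0.5 * (1.0 - mean_vec / awss)                # oscillatory shear index, in [0, 0.5]
    rrt = 1.0 / ((1.0 - 2.0 * osi) * awss)             # relative residence time ~ 1 / |mean vector|
    return awss, osi, rrt

# Example with a synthetic pulsatile WSS signal at f = 1.25 Hz (as in the abstract)
t = np.linspace(0.0, 0.8, 400, endpoint=False)
tau = np.stack([1.5 + np.sin(2 * np.pi * 1.25 * t),
                0.3 * np.cos(2 * np.pi * 1.25 * t),
                np.zeros_like(t)], axis=1)
print(hemodynamic_indicators(tau))
```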
-
The stoichiometry of metabolic pathways in the dynamics of cellular populations
Computer Research and Modeling, 2011, v. 3, no. 4, pp. 455-475
The problem of the extent to which kinetic models of cellular metabolism fit the matter they describe has been considered. The foundations of the stoichiometry of the whole metabolism and of its large regions are stated. A bioenergetic representation of stoichiometry based on a universal unit of chemical compound reductivity, the redoxon, is described. Equations of mass-energy balance (the bioenergetic variant of stoichiometry) are derived for metabolic flows, including those of protons possessing a high electrochemical potential μH+ and of high-energy compounds. Interrelations are obtained that determine the biomass yield, the rate of uptake of the energy source for cell growth, and other important physiological quantities as functions of the biochemical characteristics of cellular energetics. The maximal biomass energy yield values have been calculated for different energy sources utilized by cells; these values coincide with those measured experimentally.
-
Searching for realizable energy-efficient gaits of planar five-link biped with a point contact
Computer Research and Modeling, 2020, v. 12, no. 1, pp. 155-170
In this paper, we discuss the procedure for finding nominal trajectories of a planar five-link bipedal robot with point contact. To this end we use the virtual constraints method, which transforms the robot's dynamics to a low-dimensional zero manifold, and nonlinear optimization algorithms to find virtual constraint parameters that minimize the robot's cost of transportation. We analyzed the effect of the degree of the Bezier polynomials that approximate the virtual constraints, and of torque continuity, on the cost of transportation. Based on numerical results, we found that it is sufficient to consider polynomials of degree five or six, since a further increase in the polynomial degree increases the computation time without guaranteeing a reduction in the cost of transportation. Moreover, it was shown that introducing torque continuity constraints does not lead to a significant increase in the objective function and makes the gait easier to implement on a real robot.
We propose a two-step procedure for finding the minimum of the considered optimization problem, whose objective function is the cost of transportation and which has a large number of constraints. In the first step we solve a feasibility problem: the cost function is removed (set to zero) and a feasible solution is sought in the parameter space. In the second step we introduce the objective function and use the solution found in the first step as the initial guess. For the first step we put forward an algorithm for finding an initial guess that considerably reduces the optimization time of that step (down to 3-4 seconds) compared to random initialization. A comparison of the objective function values of the solutions found in the first and second steps showed that, on average, the second step reduced the objective function twofold, even though the overall computation time increased significantly.
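The two-step scheme can be written down generically. The sketch below is a toy illustration, not the robot model and not the authors' code: the objective, constraints, and parameter vector are placeholders, and only the structure of a feasibility solve followed by a warm-started optimization follows the abstract.
```python
import numpy as np
from scipy.optimize import minimize

def cost_of_transport(p):
    """Stand-in objective (the paper minimizes the gait's cost of transportation)."""
    return np.sum(p**2) + np.sum(np.sin(p))

constraints = [  # stand-in smooth constraints g(p) >= 0 replacing the gait constraints
    {"type": "ineq", "fun": lambda p: 1.0 - np.sum(p**2)},   # stay inside the unit ball
    {"type": "ineq", "fun": lambda p: np.sum(p) - 0.5},      # some coupling between parameters
]

rng = np.random.default_rng(42)
p0 = rng.uniform(-2.0, 2.0, size=6)   # e.g. Bezier coefficients of the virtual constraints

# Step 1: feasibility problem, i.e. the same constraints with a zero objective.
feasible = minimize(lambda p: 0.0, p0, method="SLSQP", constraints=constraints)

# Step 2: the real objective, warm-started from the feasible point.
optimal = minimize(cost_of_transport, feasible.x, method="SLSQP", constraints=constraints)
print(optimal.success, optimal.fun)
```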
-
The effect of cell metabolism on biomass yield during the growth on various substrates
Computer Research and Modeling, 2017, v. 9, no. 6, pp. 993-1014
Bioenergetic regularities determining the maximal biomass yield in aerobic microbial growth on various substrates are considered. The approach is based on the method of mass-energy balance and on the GenMetPath program package. An equation system describing the balances of 1) metabolite reductivity and 2) high-energy bonds formed and expended is formulated. To formulate the system, the whole metabolism is subdivided into constructive and energetic partial metabolisms. The constructive metabolism is, in turn, subdivided into two parts, forward and standard, based on the choice of nodal metabolites. The forward constructive metabolism depends substantially on the growth substrate: it converts the substrate into the standard set of nodal metabolites. This set is then converted into biomass macromolecules by the standard constructive metabolism, which is the same on various substrates. Variations of the flows via nodal metabolites are shown to exert only minor effects on the standard constructive metabolism. Growth on substrates requiring the participation of oxygenases and/or oxidase is considered as a separate case. The bioenergetic characteristics of the standard constructive metabolism are found from a large amount of data on the growth of various organisms on glucose. The described approach can be used to predict the biomass growth yield on substrates whose primary metabolization reactions are known. As an example, the growth of a yeast culture on ethanol is considered. The maximal growth yield predicted by the method shows very good consistency with the experimentally measured value.
-
Determination of CT dose by means of noise analysis
Computer Research and Modeling, 2018, v. 10, no. 4, pp. 525-533
The article deals with the development of an effective algorithm for determining the number of quanta emitted by an X-ray tube in computed tomography (CT) studies. An analysis of domestic and foreign literature showed that most work in radiometry and radiography takes the tabulated values of X-ray absorption coefficients into account, while individual dose factors are not considered at all, since many studies lack the Dose Report; instead, an average value is used to simplify the statistics. We therefore decided to develop a method for determining the number of ionizing quanta by analyzing the noise of CT data. The basis of the algorithm is a mathematical model of our own design built on the Poisson and Gauss distributions of the logarithmic signal value. The resulting mathematical model was tested on CT data of a calibration phantom consisting of three plastic cylinders filled with water, whose X-ray absorption coefficient is known from tabulated values. The data were obtained from several CT devices from different manufacturers (Siemens, Toshiba, GE, Philips). The developed algorithm made it possible to calculate the number of emitted X-ray quanta per unit time. Taking into account the noise level and the radii of the cylinders, these data were converted to X-ray absorption values and then compared with the tabulated values. Applying the algorithm to CT data of various configurations yielded experimental results consistent with the theory and with the mathematical model. The results demonstrate good accuracy of the algorithm and of the mathematical apparatus, which indicates the reliability of the obtained data. This mathematical model is already used in our CT noise-reduction program, where it serves as a method for setting a dynamic noise-reduction threshold. At the moment, the algorithm is being adapted to work with real CT data of patients.
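The abstract does not give the estimator explicitly; one common way such a noise-based estimate can work (our assumption, not necessarily the paper's exact algorithm) follows from the delta method: for a Poisson photon count with mean λ, the variance of its logarithm is approximately 1/λ, so the noise variance of the log-transformed signal in a uniform region directly estimates the number of detected quanta.
```python
# Schematic reconstruction of the idea, not the paper's algorithm: Var(ln N) ~ 1/N
# for Poisson counts N, so 1 / variance of the log signal estimates the quanta count.
import numpy as np

rng = np.random.default_rng(0)
true_quanta = 5.0e4                          # hypothetical photons per detector reading
counts = rng.poisson(true_quanta, size=100_000)

log_signal = np.log(counts)                  # CT projections are log-attenuation values
estimated_quanta = 1.0 / np.var(log_signal)  # delta-method estimate of the mean count

print(f"true: {true_quanta:.0f}, estimated from noise: {estimated_quanta:.0f}")
```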
-
Estimation of maximal values of biomass growth yield based on the mass-energy balance of cell metabolism
Computer Research and Modeling, 2019, v. 11, no. 4, pp. 723-750
The biomass growth yield is the ratio of the newly synthesized substance of growing cells to the amount of the consumed substrate, the source of matter and energy for cell growth. The yield characterizes the efficiency of substrate conversion into cell biomass. The conversion is carried out by the cell metabolism, the complete aggregate of biochemical reactions occurring in the cells.
This work revisits the problem of predicting the maximal cell growth yield based on the balances of the whole living-cell metabolism and of its fragments, called partial metabolisms (PMs). The following PMs are used here. For growth on any substrate we consider i) the standard constructive metabolism (SCM), which consists of identical pathways during the growth of various organisms on any substrate; SCM starts from several standard compounds (nodal metabolites): glucose, acetyl-CoA, 2-oxoglutarate, erythrose-4-phosphate, oxaloacetate, ribose-5-phosphate, 3-phosphoglycerate, phosphoenolpyruvate, and pyruvate; and ii) the full forward metabolism (FM), the remaining part of the whole metabolism. The first consumes high-energy bonds (HEB) formed by the second. In this work we examine a generalized variant of the FM that takes into account the possible presence of extracellular products as well as both aerobic and anaerobic growth. Instead of separate balances for the formation of each nodal metabolite, as in our previous work, this work deals with the whole aggregate of these metabolites at once. This makes the solution more compact and requires fewer biochemical quantities and substantially less computational time. An equation expressing the maximal biomass yield via the specific amounts of HEB formed and consumed by the partial metabolisms has been derived. It includes the specific HEB consumption by the SCM, which is a universal biochemical parameter applicable to a wide range of organisms and growth substrates. To determine this parameter correctly, the full constructive metabolism and its forward part are considered for the growth of cells on glucose as the most thoroughly studied substrate. We used the previously found properties of the elemental composition of the lipid and lipid-free fractions of cell biomass. A numerical study of the effect of various interrelations between the flows via different nodal metabolites showed that the requirements of the SCM in high-energy bonds and NAD(P)H are practically constant. The found HEB-to-formed-biomass coefficient is an efficient tool for estimating the maximal biomass yield from substrates for which the primary metabolism is known. The ATP-to-substrate ratio necessary for the yield estimation was calculated using the GenMetPath program package.
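The abstract does not reproduce the derived equation. Schematically, in our shorthand rather than the paper's notation, the balance it describes reads: if the forward metabolism forms $a_{\mathrm{FM}}$ high-energy bonds per unit of consumed substrate and the standard constructive metabolism requires $b_{\mathrm{SCM}}$ high-energy bonds per unit of formed biomass, then the maximal yield is limited by $Y_{\max} = a_{\mathrm{FM}} / b_{\mathrm{SCM}}$, with $b_{\mathrm{SCM}}$ playing the role of the universal HEB-to-formed-biomass coefficient mentioned above.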
-
Deep learning analysis of intracranial EEG for recognizing drug effects and mechanisms of action
Computer Research and Modeling, 2024, v. 16, no. 3, pp. 755-772
Predicting novel drug properties is fundamental to polypharmacology, repositioning, and the study of biologically active substances during the preclinical phase. The use of machine learning, including deep learning methods, for the identification of drug – target interactions has gained increasing popularity in recent years.
The objective of this study was to develop a method for recognizing psychotropic effects and drug mechanisms of action (drug – target interactions) based on an analysis of the bioelectrical activity of the brain using artificial intelligence technologies.
Intracranial electroencephalographic (EEG) signals from rats were recorded (4 channels at a sampling frequency of 500 Hz) after the administration of psychotropic drugs (gabapentin, diazepam, carbamazepine, pregabalin, eslicarbazepine, phenazepam, arecoline, pentylenetetrazole, picrotoxin, pilocarpine, chloral hydrate). The signals were divided into 2-second epochs, then converted into $2000\times 4$ images and fed into an autoencoder. The output of the bottleneck layer was subjected to classification and clustering using t-SNE, and then the distances between the resulting clusters were calculated. As an alternative, an approach based on feature extraction with dimensionality reduction using principal component analysis and kernel support vector machine (kSVM) classification was used. Models were validated using 5-fold cross-validation.
The classification accuracy obtained for 11 drugs during cross-validation was $0.580 \pm 0.021$, which is significantly higher than the accuracy of the random classifier $(0.091 \pm 0.045, p < 0.0001)$ and the kSVM $(0.441 \pm 0.035, p < 0.05)$. t-SNE maps were generated from the bottleneck parameters of intracranial EEG signals. The relative proximity of the signal clusters in the parametric space was assessed.
The present study introduces an original method for biopotential-mediated prediction of effects and mechanism of action (drug – target interaction). This method employs convolutional neural networks in conjunction with a modified selective parameter reduction algorithm. Post-treatment EEGs were compressed into a unified parameter space. Using a neural network classifier and clustering, we were able to recognize the patterns of neuronal response to the administration of various psychotropic drugs.
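The pipeline can be sketched as follows. The code below is a schematic reconstruction with assumed layer sizes and a random placeholder for the EEG epochs, not the authors' architecture: 4-channel, 2-second epochs pass through a convolutional autoencoder, and the bottleneck vectors are then mapped with t-SNE.
```python
import torch
import torch.nn as nn
from sklearn.manifold import TSNE

class EEGAutoencoder(nn.Module):
    def __init__(self, bottleneck=32):
        super().__init__()
        self.encoder = nn.Sequential(              # input: (batch, 4 channels, 2000 samples)
            nn.Conv1d(4, 16, kernel_size=7, stride=4, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, stride=5, padding=3), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 100, bottleneck),       # compressed representation of one epoch
        )
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck, 32 * 100), nn.Unflatten(1, (32, 100)),
            nn.ConvTranspose1d(32, 16, kernel_size=7, stride=5, padding=3, output_padding=4), nn.ReLU(),
            nn.ConvTranspose1d(16, 4, kernel_size=7, stride=4, padding=3, output_padding=3),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

epochs = torch.randn(64, 4, 2000)                  # placeholder for real 500 Hz intracranial EEG epochs
model = EEGAutoencoder()
recon, z = model(epochs)                           # train with nn.MSELoss()(recon, epochs) in practice
emb = TSNE(n_components=2, perplexity=10).fit_transform(z.detach().numpy())
print(recon.shape, z.shape, emb.shape)
```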
-
Analysis of the rate of electron transport through photosynthetic cytochrome $b_6 f$ complex
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 997-1022
We consider an approach based on linear algebra methods to analyze the rate of electron transport through the cytochrome $b_6 f$ complex. In the proposed approach, the dependence of the quasi-stationary electron flux through the complex on the degree of reduction of pools of mobile electron carriers is considered a response function characterizing this process. We have developed software in the Python programming language that allows us to construct the master equation for the complex according to the scheme of elementary reactions and calculate quasi-stationary electron transport rates through the complex and the dynamics of their changes during the transition process. The calculations are performed in multithreaded mode, which makes it possible to efficiently use the resources of modern computing systems and to obtain data on the functioning of the complex in a wide range of parameters in a relatively short time. The proposed approach can be easily adapted for the analysis of electron transport in other components of the photosynthetic and respiratory electron-transport chain, as well as other processes in multienzyme complexes containing several reaction centers. Cryo-electron microscopy and redox titration data were used to parameterize the model of the cytochrome $b_6 f$ complex. We obtained dependences of the quasi-stationary rate of plastocyanin reduction and plastoquinone oxidation on the degree of reduction of pools of mobile electron carriers and analyzed the dynamics of rate changes in response to changes in the redox state of the plastoquinone pool. The modeling results are in good agreement with the available experimental data.
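The core linear-algebra step, assembling the master-equation rate matrix from an elementary reaction scheme and solving for the quasi-stationary state, can be illustrated on a toy example. The sketch below uses a made-up three-state cycle with arbitrary rate constants rather than the actual $b_6 f$ scheme.
```python
import numpy as np

# k[i, j] = rate constant of the elementary transition from state i to state j
k = np.array([[  0.0, 50.0,  0.0],
              [  5.0,  0.0, 80.0],
              [120.0,  0.0,  0.0]])

K = k.T - np.diag(k.sum(axis=1))        # master-equation generator: dp/dt = K p

# Quasi-stationary state: solve K p = 0 together with the normalization sum(p) = 1
A = np.vstack([K, np.ones(K.shape[1])])
b = np.zeros(K.shape[0] + 1); b[-1] = 1.0
p, *_ = np.linalg.lstsq(A, b, rcond=None)

flux_12 = k[0, 1] * p[0] - k[1, 0] * p[1]   # net quasi-stationary flux through edge 1 -> 2
print(p, flux_12)
```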
-
Semantic structuring of text documents based on patterns of natural language entities
Computer Research and Modeling, 2022, v. 14, no. 5, pp. 1185-1197
The technology of creating patterns from natural-language words (concepts) based on text data in the bag-of-words model is considered. Patterns are used to reduce the dimension of the original space describing the documents and to search for semantically related words by topic. Dimensionality reduction is implemented through the formation of patterns of latent features. The variety of structures of document relations is investigated in order to divide the documents into topics in the latent space.
It is assumed that a given set of documents (objects) is divided into two non-overlapping classes, whose analysis requires a common dictionary. The membership of words in this common vocabulary is initially unknown. The objects of the two classes are treated as being in opposition to each other. Quantitative parameters of this oppositionality are determined through the stability values of each feature and through generalized assessments of the objects over non-overlapping sets of features.
To calculate the stability, the feature values are divided into non-intersecting intervals whose optimal boundaries are determined by a special criterion. Maximum stability is achieved when the boundaries of each interval enclose values of only one of the two classes.
The composition of the features in the sets (word patterns) is formed from a sequence ordered by stability values. Patterns, and the latent features based on them, are formed according to the rules of hierarchical agglomerative grouping.
The set of latent features is used for cluster analysis of the documents with metric grouping algorithms. The analysis applies a coefficient of content authenticity based on the known membership of documents in classes. This coefficient is a numerical characteristic of the dominance of class representatives in the groups.
To divide the documents into topics, it is proposed to use the union of groups with respect to their centers. As patterns for each topic, a sequence of words from the common dictionary ordered by frequency of occurrence is considered.
The results of a computational experiment on collections of abstracts of scientific dissertations are presented. Word sequences from the general dictionary are formed for four topics.
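A schematic reading of this procedure (our interpretation for illustration, not the authors' code) is sketched below: each bag-of-words feature is scored by how well class-pure intervals cover its values, the most stable features are kept, and they are merged into word patterns by agglomerative grouping.
```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

def interval_stability(values, labels, n_intervals=10):
    """Fraction of documents falling into value intervals dominated by a single class."""
    lo, hi = values.min(), values.max()
    width = (hi - lo) / n_intervals or 1.0
    bins = np.minimum(((values - lo) / width).astype(int), n_intervals - 1)
    pure = 0
    for b in range(n_intervals):
        mask = bins == b
        if mask.any():
            pure += max(np.sum(labels[mask] == 0), np.sum(labels[mask] == 1))
    return pure / len(values)

rng = np.random.default_rng(1)
X = rng.poisson(1.0, size=(200, 50)).astype(float)   # toy document-term counts
y = rng.integers(0, 2, size=200)                     # two document classes
X[y == 1, :5] += 3                                   # make the first five words class-dependent

stability = np.array([interval_stability(X[:, j], y) for j in range(X.shape[1])])
top = np.argsort(stability)[::-1][:10]               # the most stable words

# Group the selected words into patterns by correlation-based agglomerative clustering.
corr_dist = 1.0 - np.abs(np.corrcoef(X[:, top].T))
patterns = AgglomerativeClustering(n_clusters=3, metric="precomputed",
                                   linkage="average").fit_predict(corr_dist)
print(dict(zip(top.tolist(), patterns.tolist())))
```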