-
Quantitative analysis of “structure – anticancer activity” relationships and rational molecular design of bifunctional VEGFR-2/HDAC inhibitors
Computer Research and Modeling, 2019, v. 11, no. 5, pp. 911-930
Inhibitors of histone deacetylases (HDACi) have been considered a promising class of drugs for the treatment of cancers because of their effects on cell growth, differentiation, and apoptosis. Angiogenesis plays an important role in the growth of most solid tumors and the progression of metastasis. Vascular endothelial growth factor (VEGF), a key angiogenic agent secreted by malignant tumors, induces the proliferation and migration of vascular endothelial cells. Currently, one of the most promising strategies in the fight against cancer is the creation of hybrid drugs that simultaneously act on several physiological targets. In this work, a series of hybrids bearing N-phenylquinazolin-4-amine and hydroxamic acid moieties was studied as dual VEGFR-2/HDAC inhibitors using a simplex representation of the molecular structure and Support Vector Machine (SVM). The total sample of 42 compounds was divided into training and test sets, and five-fold cross-validation was used for internal validation. Satisfactory quantitative structure-activity relationship (QSAR) models were constructed ($R^2_{test}$ = 0.64–0.87) for inhibitors of HDAC, VEGFR-2, and the human breast cancer cell line MCF-7. The obtained QSAR models were interpreted, and the coordinated effect of different molecular fragments on the increase of antitumor activity of the studied compounds was estimated. Among the substituents of the N-phenyl fragment, the positive contribution of para-bromine can be distinguished for all three types of activity. The results of the interpretation were used for molecular design of potential dual VEGFR-2/HDAC inhibitors. For comparative QSAR research we used physicochemical descriptors calculated by the program HYBOT, the Random Forest (RF) method, and the online version of the expert system OCHEM (https://ochem.eu). For the OCHEM modeling, PyDescriptor descriptors and extreme gradient boosting were chosen. In addition, the models obtained with the expert system OCHEM were used for virtual screening of 300 compounds to select promising VEGFR-2/HDAC inhibitors for further synthesis and testing.
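A minimal sketch of the modeling setup described above: SVM regression with five-fold cross-validation in scikit-learn. The descriptor matrix, activity values, and hyperparameters are placeholders, not the paper's simplex descriptors or fitted models.

```python
# Hedged sketch: 5-fold cross-validated SVM regression for a QSAR-style model.
# X and y are random placeholders standing in for descriptors and activities.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(42, 50))   # 42 compounds x 50 hypothetical descriptors
y = rng.normal(size=42)         # e.g., pIC50 against HDAC, VEGFR-2 or MCF-7

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))

scores = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model.fit(X[train_idx], y[train_idx])
    scores.append(r2_score(y[test_idx], model.predict(X[test_idx])))
print(f"cross-validated R^2: {np.mean(scores):.2f}")
```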
-
Biomechanics of DNA: rotational oscillations of bases
Computer Research and Modeling, 2011, v. 3, no. 3, pp. 319-328
In this paper we study the rotational oscillations of the nitrogenous bases forming a central pair in a short DNA fragment consisting of three base pairs. A simple mechanical analogue of the fragment has been constructed, in which the bases are imitated by pendulums and the interactions between the pendulums by springs. We derived the Lagrangian of the model system and the nonlinear equations of motion. We found solutions in the homogeneous case, when the fragment considered consists of identical base pairs: Adenine-Thymine (AT pair) or Guanine-Cytosine (GC pair). The trajectories of the model system in the configuration space were also constructed.
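A minimal sketch of such a pendulum-and-spring model: three torsional pendulums coupled by elastic springs, a discrete sine-Gordon-type system. The parameter values and initial deflection are illustrative assumptions, not the paper's fitted DNA constants.

```python
# Hedged sketch: three torsional pendulums coupled by springs as a mechanical
# analogue of base rotations in a three-base-pair fragment. Parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

I, K, V = 1.0, 0.5, 1.0   # moment of inertia, spring coupling, on-site barrier

def rhs(t, s):
    theta, omega = s[:3], s[3:]
    coupling = np.zeros(3)
    coupling[0] = theta[1] - theta[0]                  # end pendulums couple to the center
    coupling[1] = theta[0] - 2 * theta[1] + theta[2]   # central base couples to both neighbors
    coupling[2] = theta[1] - theta[2]
    return np.concatenate([omega, (K * coupling - V * np.sin(theta)) / I])

# Initial state: central base deflected, all angular velocities zero.
sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.5, 0.0, 0.0, 0.0, 0.0], max_step=0.05)
print(sol.y[:3, -1])   # final angular deflections of the three bases
```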
-
Investigation of the time to reach consensus in the work of technical committees on standardization based on regular Markov chains
Computer Research and Modeling, 2015, v. 7, no. 4, pp. 941-950
In this paper we construct a mathematical model of consensus in technical committees for standardization (TC), based on the consensus model proposed by DeGroot. The basic problems of achieving consensus in the development of consensus standards are discussed in terms of the proposed model. The results of statistical modeling characterizing the dependence of the time to reach consensus on the number of TC members and their authoritarianism are presented. It has been shown that increasing the number of TC experts and their authoritarianism negatively affects the time to reach consensus and increases fragmentation of the TC.
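A minimal sketch of the underlying DeGroot iteration, assuming a row-stochastic trust matrix (a regular Markov chain): opinions are averaged repeatedly until they agree to within a tolerance. The matrix, self-weights, and tolerance are illustrative assumptions, not the paper's experimental setup.

```python
# Hedged sketch: DeGroot consensus iteration x(t+1) = W x(t) with a row-stochastic
# trust matrix W; counts the steps until opinions agree to a tolerance.
import numpy as np

def steps_to_consensus(W, x0, tol=1e-3, max_steps=10_000):
    x = np.asarray(x0, dtype=float)
    for step in range(max_steps):
        if x.max() - x.min() < tol:
            return step
        x = W @ x
    return max_steps

rng = np.random.default_rng(1)
n = 5                                 # number of TC members
W = rng.random((n, n))
np.fill_diagonal(W, 5.0)              # larger self-weight ~ more "authoritarian" experts
W /= W.sum(axis=1, keepdims=True)     # make rows stochastic
print(steps_to_consensus(W, rng.random(n)))
```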
-
Estimation of maximal values of biomass growth yield based on the mass-energy balance of cell metabolism
Computer Research and Modeling, 2019, v. 11, no. 4, pp. 723-750
The biomass growth yield is the ratio of the newly synthesized substance of growing cells to the amount of the consumed substrate, the source of matter and energy for cell growth. The yield is a characteristic of the efficiency of substrate conversion to cell biomass. The conversion is carried out by the cell metabolism, which is the complete aggregate of biochemical reactions occurring in the cells.
This work newly considers the problem of predicting the maximal cell growth yield based on balances of the whole living cell metabolism and of its fragments called partial metabolisms (PM). The following PMs are used here. During growth on any substrate we consider (i) the standard constructive metabolism (SCM), which consists of pathways identical for the growth of various organisms on any substrate; SCM starts from several standard compounds (nodal metabolites): glucose, acetyl-CoA, 2-oxoglutarate, erythrose-4-phosphate, oxaloacetate, ribose-5-phosphate, 3-phosphoglycerate, phosphoenolpyruvate, and pyruvate; and (ii) the full forward metabolism (FM), the remaining part of the whole metabolism. The first consumes high-energy bonds (HEB) formed by the second. In this work we examine a generalized variant of the FM, in which the possible presence of extracellular products, as well as the possibilities of both aerobic and anaerobic growth, are taken into account. Instead of separate balances of the formation of each nodal metabolite, as was done in our previous work, this work deals at once with the whole aggregate of these metabolites. This makes the solution more compact and requires a smaller number of biochemical quantities and substantially less computational time. An equation expressing the maximal biomass yield via the specific amounts of HEB formed and consumed by the partial metabolisms has been derived. It includes the specific HEB consumption by SCM, which is a universal biochemical parameter applicable to a wide range of organisms and growth substrates. To determine this parameter correctly, the full constructive metabolism and its forward part are considered for the growth of cells on glucose as the most studied substrate. We used here the previously found properties of the elemental composition of the lipid and lipid-free fractions of cell biomass. A numerical study of the effect of various interrelations between flows through different nodal metabolites showed that the SCM requirements for high-energy bonds and NAD(P)H are practically constant. The found HEB-to-formed-biomass coefficient is an efficient tool for estimating the maximal biomass yield from substrates for which the primary metabolism is known. The ATP-to-substrate ratio necessary for the yield estimation was calculated using the special computer program package GenMetPath.
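A back-of-the-envelope sketch of the balance idea: if the forward metabolism yields a known amount of HEB (ATP equivalents) per mole of substrate, and the constructive metabolism spends a roughly constant amount of HEB per unit of biomass, their ratio bounds the yield. This is my reading of the balance, and all numbers below are illustrative placeholders, not the paper's derived values.

```python
# Hedged sketch of the HEB balance behind a maximal-yield estimate.
# All parameter values are illustrative placeholders, not the paper's results.
atp_per_mol_substrate = 26.0   # mol HEB formed by the forward metabolism per mol glucose
biomass_per_mol_atp = 3.5      # g biomass formed per mol HEB spent by constructive metabolism
substrate_molar_mass = 180.0   # g/mol (glucose)

# Maximal yield: grams of biomass per gram of substrate when HEB supply is limiting.
y_max = biomass_per_mol_atp * atp_per_mol_substrate / substrate_molar_mass
print(f"estimated maximal biomass yield: {y_max:.2f} g biomass per g substrate")  # ~0.51
```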
-
An efficient algorithm for comparing ${\mathrm{\LaTeX}}$ documents
Computer Research and Modeling, 2015, v. 7, no. 2, pp. 329-345
The problem is to construct the differences that arise when ${\mathrm{\LaTeX}}$ documents are edited. Each document is represented as a parse tree whose nodes are called tokens. The smallest possible text representation of the document that does not change the syntax tree is constructed. All of the text is split into fragments whose boundaries correspond to tokens. A map of the initial text fragment sequence onto the similar sequence of the edited document, corresponding to the minimum distance, is built with the Hirschberg algorithm. A map of text characters corresponding to the map of the text fragment sequences is then constructed. Tokens whose characters are all deleted, all inserted, or all unchanged are selected in the parse trees. The map for the trees formed by the other tokens is built using the Zhang-Shasha algorithm.
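A minimal sketch of the fragment-mapping step. The paper uses Hirschberg's linear-space alignment; here difflib's SequenceMatcher (a standard-library stand-in, not the authors' implementation) shows how matched, deleted, and inserted fragment runs are recovered before the tree-edit stage.

```python
# Hedged sketch: align two sequences of text fragments and classify the runs.
from difflib import SequenceMatcher

old_fragments = ["\\section{Intro}", "Text A.", "Text B.", "\\end{document}"]
new_fragments = ["\\section{Intro}", "Text A, revised.", "Text B.", "\\end{document}"]

for op, i1, i2, j1, j2 in SequenceMatcher(a=old_fragments, b=new_fragments).get_opcodes():
    print(op, old_fragments[i1:i2], "->", new_fragments[j1:j2])
# 'equal' runs map unchanged tokens; 'replace'/'delete'/'insert' runs mark the
# tokens handed on to the Zhang-Shasha tree-edit stage.
```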
Keywords: automation, editing distance, text analysis, lexeme, machine learning, metric, parse tree, syntax tree, token, ${\mathrm{\LaTeX}}$.
-
Numerical modeling of physical processes leading to the destruction of meteoroids in the Earth’s atmosphere
Computer Research and Modeling, 2022, v. 14, no. 4, pp. 835-851
Within the framework of the actual problem of comet-asteroid danger, the physical processes causing the destruction and fragmentation of meteor bodies in the Earth's atmosphere are numerically investigated. Based on the developed physical-mathematical models that determine the motion of space objects of natural origin in the atmosphere and their interaction with it, the falls of three of the largest and, by some parameters, unusual bolides in the history of meteoritics are considered: Tunguska, Vitim, and Chelyabinsk. Their singularity lies in the absence of any material meteorite remains and craters in the area of the presumed crash site for the first two bodies, and in the non-detection, as is assumed, of the main parent body for the third (the mass of the fallen fragments is too small compared to the estimated mass). The effects of aerodynamic loads and heat fluxes on these bodies are studied, which lead to intensive surface mass loss and possible mechanical destruction. The velocities of the studied celestial bodies and the change in their masses are determined from a modernized system of equations of the theory of meteor physics. An important factor taken into account here is the variability of the mass-loss parameter under the action of heat fluxes (radiative and convective) along the flight path. The fragmentation of meteoroids is considered within the framework of a progressive crushing model based on the statistical theory of strength, taking into account the influence of the scale factor on the ultimate strength of the objects. The phenomena and effects arising at various kinematic and physical parameters of each of these bodies are revealed, in particular, the change in the ballistics of their flight in the denser layers of the atmosphere, consisting in the transition from descent to ascent. The following scenarios of the event can then be realized: 1) the body returns to outer space if its residual velocity exceeds the second cosmic (escape) velocity; 2) the body transitions to the orbit of an Earth satellite if the residual velocity exceeds the first cosmic (orbital) velocity; 3) at lower values of the residual velocity, the body returns after some time to descent and falls out at a considerable distance from the presumed crash site. It is the realization of one of these three scenarios that explains, for example, the absence of material traces, including craters, in the vicinity of the forest collapse in the case of the Tunguska bolide. Assumptions about the possibility of such scenarios have been made earlier by other authors, and in this paper their realization is confirmed by the results of numerical calculations.
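For orientation, a minimal sketch of the classical single-body meteor-physics equations (drag, ablation, and descent) that the paper's modernized system extends. The variable mass-loss parameter and the progressive crushing model are not reproduced, and all parameter values are simplified illustrative assumptions.

```python
# Hedged sketch: classical single-body meteor equations for steep atmospheric entry.
import numpy as np
from scipy.integrate import solve_ivp

cd, ch, Q = 1.0, 0.1, 8e6               # drag coeff., heat-transfer coeff., heat of ablation (J/kg)
rho0, H = 1.29, 7000.0                  # sea-level air density (kg/m^3), scale height (m)
rho_b, gamma = 3300.0, np.radians(18)   # body density (kg/m^3), entry angle below horizon

def rhs(t, s):
    v, m, h = s
    S = 1.21 * (m / rho_b) ** (2 / 3)   # midsection area of a sphere-like body
    rho_a = rho0 * np.exp(-h / H)       # isothermal atmosphere
    dv = -cd * rho_a * S * v**2 / (2 * m) + 9.81 * np.sin(gamma)
    dm = -ch * rho_a * S * v**3 / (2 * Q)   # ablation by heat flux
    dh = -v * np.sin(gamma)
    return [dv, dm, dh]

def hit_ground(t, s):            # stop at the surface
    return s[2]
hit_ground.terminal = True

def burned_up(t, s):             # stop if the body has essentially ablated away
    return s[1] - 1.0
burned_up.terminal = True

sol = solve_ivp(rhs, (0, 120), [19e3, 1.2e7, 100e3],
                events=[hit_ground, burned_up], max_step=0.1)
v, m, h = sol.y[:, -1]
print(f"final: v = {v/1e3:.1f} km/s, m = {m:.2e} kg, h = {h/1e3:.1f} km")
```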
-
Bibliographic link prediction using contrast resampling technique
Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1317-1336
The paper studies the problem of searching for fragments with missing bibliographic links in a scientific article using automatic binary classification. To train the model, we propose a new contrast resampling technique, whose innovation is the consideration of the context of the link with allowance for the boundaries of the fragment, which most strongly affects the probability that a bibliographic link is present in it. The training set was formed of automatically labeled samples: fragments of three sentences with the class labels «without link» and «with link» that satisfy the requirement of contrast, i.e., samples of different classes are distanced in the source text. The feature space was built automatically from term occurrence statistics and was expanded with additional features: entities (names, numbers, quotes, and abbreviations) recognized in the text.
A series of experiments was carried out on the archives of the scientific journals «Law enforcement review» (273 articles) and «Journal Infectology» (684 articles). The classification was carried out by the Nearest Neighbors, RBF SVM, Random Forest, and Multilayer Perceptron models, with the selection of optimal hyperparameters for each classifier.
The experiments confirmed the hypothesis put forward. The highest accuracy was reached by the neural network classifier (95%), which is, however, not as fast as the linear one, which also showed high accuracy with contrast resampling (91–94%). These values are superior to those reported for NER and Sentiment Analysis on comparable data. The high computational efficiency of the proposed method makes it possible to integrate it into applied systems and to process documents online.
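A minimal sketch of the contrast resampling idea: build three-sentence fragments labeled «with link» / «without link», keeping negative samples distanced from any link in the source text. The sentence list, link pattern, and gap threshold are simplified placeholders, not the paper's exact procedure.

```python
# Hedged sketch: contrast sampling of three-sentence fragments for link prediction.
import re

LINK = re.compile(r"\[\d+\]")   # placeholder pattern for a bracketed reference, e.g. [12]
MIN_GAP = 5                     # min distance (in sentences) from a link for the negative class

def contrast_samples(sentences, window=3):
    positives = [i for i, s in enumerate(sentences) if LINK.search(s)]
    samples = []
    for i in range(len(sentences) - window + 1):
        fragment = " ".join(sentences[i:i + window])
        if any(i <= p < i + window for p in positives):
            samples.append((fragment, "with link"))
        elif all(abs(i - p) >= MIN_GAP for p in positives):   # contrast requirement
            samples.append((fragment, "without link"))
    return samples

sents = ["A claim [1].", "Detail.", "More detail.", "Unrelated.", "Other.",
         "Far away.", "Still far.", "No citation here.", "Plain text.", "End."]
for frag, label in contrast_samples(sents):
    print(label, "|", frag[:40])
```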
-
Simulation of traffic flows based on the quasi-gasdynamic approach and the cellular automata theory using supercomputers
Computer Research and Modeling, 2024, v. 16, no. 1, pp. 175-194
The purpose of the study is to simulate the dynamics of traffic flows on city road networks and to systematize the current state of affairs in this area. The introduction states that the development of intelligent transportation systems as an integral part of modern transportation technologies is coming to the fore. The core of these systems contains adequate mathematical models that allow traffic to be simulated as close to reality as possible. The necessity of using supercomputers due to the large amount of calculations is also noted; therefore, special parallel algorithms are needed. The beginning of the article is devoted to an up-to-date classification of traffic flow models and a characterization of each class, including distinctive features and relevant examples with references. Further, the main focus of the article shifts to the macroscopic and microscopic models created by the authors and to the place of these models in the aforementioned classification. The macroscopic model is based on the continuum approach and uses the ideology of quasi-gasdynamic systems of equations. Its advantages over existing models of this class are indicated. The model is presented in both one-dimensional and two-dimensional versions, and both versions feature the ability to study multi-lane traffic. In the two-dimensional version this is made possible by introducing the concept of “lateral” velocity, i.e., the speed of changing lanes; this version also allows calculations in a computational domain that corresponds to the actual geometry of the road. The section also presents test results of modeling vehicle dynamics on a road fragment with a local widening and on a road fragment with traffic lights, including several variants of traffic light regimes. In the first case, the calculations allow interesting conclusions to be drawn about the impact of a road widening on the road capacity as a whole, and in the second case, the optimal regime configuration can be selected to obtain the “green wave” effect. The microscopic model is based on the cellular automata theory and the single-lane Nagel – Schreckenberg model, generalized by the authors of the article for the multi-lane case. The model implements various behavioral strategies of drivers. Test computations for a real transport network section in the Moscow city center are presented. To achieve an adequate representation of vehicles moving through the network according to road traffic regulations, the authors implemented special algorithms adapted for parallel computing. Test calculations were performed on the K-100 supercomputer installed in the Centre of Collective Usage of KIAM RAS.
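A minimal sketch of one update step of the classical single-lane Nagel – Schreckenberg automaton on a ring road, the base model the authors generalize; the multi-lane rules, behavioral strategies, and parallelization are not reproduced, and all parameters are illustrative.

```python
# Hedged sketch: the four Nagel-Schreckenberg rules on a single-lane ring road.
import numpy as np

V_MAX, P_SLOW, ROAD_LEN, N_CARS = 5, 0.3, 100, 20
rng = np.random.default_rng(2)
pos = np.sort(rng.choice(ROAD_LEN, N_CARS, replace=False))
vel = np.zeros(N_CARS, dtype=int)

def step(pos, vel):
    gap = (np.roll(pos, -1) - pos - 1) % ROAD_LEN   # empty cells to the car ahead (ring)
    vel = np.minimum(vel + 1, V_MAX)                # 1. acceleration
    vel = np.minimum(vel, gap)                      # 2. braking to avoid collision
    slow = rng.random(N_CARS) < P_SLOW              # 3. random slowdown
    vel = np.maximum(vel - slow, 0)
    pos = (pos + vel) % ROAD_LEN                    # 4. movement
    return pos, vel

for _ in range(100):
    pos, vel = step(pos, vel)
print(f"mean speed after 100 steps: {vel.mean():.2f} cells/step")
```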
-
Identification of the author of a text by the segmentation method
Computer Research and Modeling, 2022, v. 14, no. 5, pp. 1199-1210
The paper describes a method for recognizing the authors of literary texts by the proximity of the fragments, into which a text is divided, to the author's standard. The standard is the empirical frequency distribution of letter combinations, built on a training sample that includes expertly selected, reliably attributed works of the author. A set of standards of different authors forms a library, within which the problem of identifying the author of an unknown text is solved. The proximity between texts is understood in the sense of the $L_1$ norm of the letter-combination frequency vector, which is constructed for each fragment and for the text as a whole. An unknown text is attributed to the author whose standard is most often chosen as the closest for the set of fragments into which the text is divided. The fragment length is optimized based on the principle of the maximum difference of the distances from fragments to standards in the «friend-or-foe» recognition problem. The method was tested on a corpus of domestic and foreign (translated) authors: 1783 texts by 100 authors with a total volume of about 700 million characters were collected. To exclude bias in the selection of authors, authors whose surnames begin with the same letter were considered; in particular, for the letter L the identification error was 12%. Along with fairly high accuracy, the method has another important property: it allows one to estimate the probability that the standard of the author of the text in question is missing from the library. This probability can be estimated from the statistics of the nearest standards for small fragments of the text. The paper also examines statistical digital portraits of writers: joint empirical distributions of the probability that a certain proportion of the text is identified at a given level of trust. The practical importance of these statistics is that the supports of the corresponding distributions for one's own and others' standards practically do not overlap, which makes it possible to recognize the reference distribution of letter combinations at a high level of confidence.
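A minimal sketch of the attribution rule: compare bigram (letter-pair) frequency vectors of text fragments with per-author standards by the $L_1$ norm and take a majority vote. The texts are toy strings, and the paper's fragment-length optimization is omitted.

```python
# Hedged sketch: nearest-standard attribution by L1 distance over bigram frequencies.
from collections import Counter

def bigram_freq(text):
    pairs = Counter(text[i:i + 2] for i in range(len(text) - 1))
    total = sum(pairs.values())
    return {p: c / total for p, c in pairs.items()}

def l1(f, g):
    return sum(abs(f.get(k, 0.0) - g.get(k, 0.0)) for k in f.keys() | g.keys())

standards = {"author_a": bigram_freq("the quick brown fox " * 50),
             "author_b": bigram_freq("lorem ipsum dolor sit amet " * 50)}

def attribute(text, fragment_len=40):
    fragments = [text[i:i + fragment_len] for i in range(0, len(text), fragment_len)]
    fragments = [f for f in fragments if len(f) > 1]
    votes = Counter(min(standards, key=lambda a: l1(bigram_freq(fr), standards[a]))
                    for fr in fragments)
    return votes.most_common(1)[0][0]   # the author most often closest to a fragment

print(attribute("the quick brown fox jumps over the lazy dog " * 10))
```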
-
Molecular dynamics of tubulin protofilaments and the effect of taxol on their bending deformation
Computer Research and Modeling, 2024, v. 16, no. 2, pp. 503-512
Despite the widespread use of cancer chemotherapy drugs, the molecular mechanisms of action of many of them remain unclear. Some of these drugs, such as taxol, are known to affect the dynamics of microtubule assembly and to stop the process of cell division in prophase-prometaphase. Recently, new spatial structures of microtubules and of individual tubulin oligomers associated with various regulatory proteins and cancer chemotherapy drugs have emerged. However, knowledge of the spatial structure in itself does not provide information about the mechanism of action of drugs.
In this work, we applied the molecular dynamics method to study the behavior of taxol-bound tubulin oligomers and used our previously developed method for analyzing the conformation of tubulin protofilaments, based on the calculation of modified Euler angles. Recent structures of microtubule fragments have demonstrated that tubulin protofilaments bend not in the radial direction, as many researchers assume, but at an angle of approximately 45° from the radial direction; in the presence of taxol, however, the bending direction shifts closer to the radial direction. There was no significant difference between the mean bending and torsion angles of the studied tubulin structures when bound to the different natural regulatory ligands, guanosine triphosphate and guanosine diphosphate. The intra-dimer bending angle was found to be greater than the inter-dimer bending angle in all analyzed trajectories, which indicates that the bulk of the deformation energy is stored within the dimeric tubulin subunits and not between them. Analysis of the latest-generation tubulin structures indicated that the presence of taxol in the tubulin beta-subunit pocket allosterically reduces the torsional rigidity of the tubulin oligomer, which could explain the underlying mechanism of taxol's effect on microtubule dynamics. Indeed, a decrease in torsional rigidity makes it possible to maintain lateral connections between protofilaments and therefore should lead to the stabilization of microtubules, which is what is observed in experiments. The results of the work shed light on the phenomenon of dynamic instability of microtubules and allow us to come closer to understanding the molecular mechanisms of cell division.
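A minimal sketch of the angle-extraction idea: given the orientation frames of two consecutive monomers (e.g., fitted along an MD trajectory), the relative rotation is decomposed into Euler angles. The paper's modified Euler angles are not reproduced; a standard 'zyz' decomposition via SciPy stands in, and the frames below are illustrative.

```python
# Hedged sketch: bending/torsion angles between two monomer orientation frames.
import numpy as np
from scipy.spatial.transform import Rotation as R

def relative_angles(frame_a, frame_b):
    """frame_a, frame_b: 3x3 rotation matrices of two monomers."""
    rel = R.from_matrix(frame_a.T @ frame_b)
    alpha, beta, gamma = rel.as_euler("zyz")   # beta ~ bending, alpha + gamma ~ torsion
    return np.degrees(beta), np.degrees(alpha + gamma)

a = R.identity().as_matrix()
b = R.from_euler("zyz", [0.2, np.radians(12), -0.1]).as_matrix()  # illustrative frames
bend, twist = relative_angles(a, b)
print(f"bending: {bend:.1f} deg, torsion: {twist:.1f} deg")
```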