Search results for 'exploration of graphs':
Articles found: 6
  1. Stepkin A.V.
    Using collective of agents for exploration of graph
    Computer Research and Modeling, 2013, v. 5, no. 4, pp. 525-532

    The problem of exploring finite undirected graphs by a collective of agents is considered in this work. Two agent-researchers simultaneously move on the graph; they read and change marks on graph elements and transfer information to the agent-experimenter, which builds a representation of the explored graph. An algorithm is constructed with time complexity linear in the number of the graph's nodes, quadratic space complexity, and communication complexity equal to O(n²·log(n)). Each of the two agents that move on the graph needs two colors (three colors in total) for graph exploration. The algorithm is based on the depth-first traversal method.

    Views (last year): 4. Citations: 2 (RSCI).
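
    The abstract above describes the algorithm only in prose. A minimal single-agent sketch of the underlying idea, a depth-first traversal in which the moving agent marks visited nodes and reports discovered edges to a passive experimenter that assembles the graph representation, might look as follows; the class and function names are illustrative assumptions, and the two-agent coordination and coloring scheme of the paper are omitted.

    class Experimenter:
        """Non-moving agent: accumulates the nodes and edges reported by the explorer."""
        def __init__(self):
            self.nodes = set()
            self.edges = set()

        def report(self, u, v):
            self.nodes.update((u, v))
            self.edges.add(frozenset((u, v)))

    def explore(v, neighbors, experimenter, visited=None):
        """Depth-first exploration: the agent at node v sees only its neighbors,
        marks the nodes it has visited, and reports every incident edge."""
        if visited is None:
            visited = set()
        visited.add(v)                        # the agent's mark ("color") on the node
        for w in neighbors(v):
            experimenter.report(v, w)         # message to the experimenter; duplicates are merged
            if w not in visited:
                explore(w, neighbors, experimenter, visited)

    # Usage on a small example graph given as an adjacency dict.
    graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
    exp = Experimenter()
    explore(0, lambda v: graph[v], exp)
    print(sorted(tuple(sorted(e)) for e in exp.edges))   # [(0, 1), (0, 2), (1, 2), (2, 3)]
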
  2. Yevin I.A., Komarov V.V., Popova M.S., Marchenko D.K., Samsonova A.J.
    Cities road networks
    Computer Research and Modeling, 2016, v. 8, no. 5, pp. 775-786

    Road network infrastructure is the basis of any urban area. This article compares the structural characteristics (meshedness coefficient, clustering coefficient) of the road networks of the Moscow center (Old Moscow), formed as a result of self-organization, and of the roads near Leninsky Prospekt (postwar Moscow), which resulted from centralized planning. Data for constructing the road networks as graphs were taken from the Internet resource OpenStreetMap, which allows the coordinates of intersections to be identified accurately. Based on the calculated characteristics of the Moscow road network areas, cities with road networks structurally similar to the two Moscow areas were found in foreign publications. Using the dual representation of the road networks of the centers of Moscow and St. Petersburg, the information and cognitive features of navigation in these tourist areas of the two capitals were studied. When constructing the dual graphs of the studied areas, the different types of roads (one-way or two-way traffic, etc.) were not taken into account, so the constructed dual graphs are undirected. Since road networks in the dual representation are described by a power-law distribution of vertices over the number of edges (scale-free networks), the exponents of these distributions were calculated. It is shown that the information complexity of the dual graph of the center of Moscow exceeds the cognitive threshold of 8.1 bits, while the same quantity for the center of St. Petersburg is below this threshold, because the road network of the center of St. Petersburg was created on the basis of planning and is therefore easier to navigate. In conclusion, using the methods of statistical mechanics (calculation of partition functions), the Gibbs entropy was computed for the road networks of several Russian cities. It was found that as the road network size increases, the entropy decreases. We discuss the problem of studying the evolution of urban infrastructure networks of different nature (public transport, supply, communication networks, etc.), which would allow the fundamental laws of urbanization to be explored and understood more deeply.

    Views (last year): 3.
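
    The abstract does not specify how the information complexity of a dual graph or the power-law exponent is computed; the sketch below assumes simple proxies, the Shannon entropy (in bits) of the degree distribution and a rough log-log least-squares fit of the exponent, and uses networkx with a synthetic scale-free graph standing in for the dual graph of a road network.

    import math
    from collections import Counter

    import networkx as nx

    def degree_entropy_bits(graph: nx.Graph) -> float:
        """Shannon entropy of the degree distribution, in bits (assumed proxy for information complexity)."""
        counts = Counter(d for _, d in graph.degree())
        n = graph.number_of_nodes()
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def power_law_exponent(graph: nx.Graph) -> float:
        """Crude estimate: minus the slope of log P(k) versus log k."""
        counts = Counter(d for _, d in graph.degree() if d > 0)
        n = sum(counts.values())
        xs = [math.log(k) for k in counts]
        ys = [math.log(c / n) for c in counts.values()]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
        return -slope

    # Synthetic scale-free graph standing in for the dual graph of a road network.
    g = nx.barabasi_albert_graph(1000, 2, seed=1)
    print(f"entropy = {degree_entropy_bits(g):.2f} bits, exponent ~ {power_law_exponent(g):.2f}")
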
  3. Stepkin A.V., Stepkina A.S.
    Algorithm of simple graph exploration by a collective of agents
    Computer Research and Modeling, 2021, v. 13, no. 1, pp. 33-45

    The study presented in the paper is devoted to the problem of finite graph exploration by a collective of agents. Finite undirected graphs without loops and multiple edges are considered. The collective consists of two agent-researchers, which have a finite memory independent of the number of nodes of the graph they study and each use two colors (three colors in total), and one agent-experimenter, which has a finite, unboundedly growing internal memory. The agent-researchers can simultaneously traverse the graph, read and change labels of graph elements, and transmit the necessary information to the third agent, the agent-experimenter. The agent-experimenter is a non-moving agent in whose memory the result of the functioning of the agent-researchers at each step is recorded and in which a representation of the investigated graph (initially unknown to the agents) is gradually built up as a list of edges and a list of nodes.

    The work includes a detailed description of the operating modes of the agent-researchers with an indication of the priority of their activation. The commands exchanged between the agent-researchers and the agent-experimenter during the execution of the procedures are considered. Problematic situations arising in the work of the agent-researchers are also studied in detail, for example, coloring a white vertex when two agents simultaneously arrive at the same node, or marking and examining an isthmus (an edge connecting subgraphs examined by different agent-researchers), etc. The full algorithm of the agent-experimenter is presented with a detailed description of the processing of messages received from the agent-researchers, on the basis of which a representation of the studied graph is built. In addition, a complete analysis of the time, space, and communication complexity of the constructed algorithm is performed.

    The presented graph exploration algorithm has quadratic (with respect to the number of nodes of the studied graph) time complexity, quadratic space complexity, and quadratic communication complexity. It is based on the depth-first traversal method.
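
    As a rough illustration of the experimenter's side of such an algorithm, the hypothetical sketch below consumes interleaved messages from two researcher agents and incrementally builds the node and edge lists; the message vocabulary ("node", "edge", "isthmus") is an assumption based on the abstract, not the authors' actual protocol.

    from dataclasses import dataclass, field

    @dataclass
    class Experimenter:
        nodes: set = field(default_factory=set)
        edges: set = field(default_factory=set)

        def handle(self, agent_id: int, message: tuple) -> None:
            kind, *payload = message
            if kind == "node":                     # a researcher discovered a new node
                self.nodes.add(payload[0])
            elif kind in ("edge", "isthmus"):      # isthmus: edge joining the two agents' subgraphs
                u, v = payload
                self.nodes.update((u, v))
                self.edges.add(frozenset((u, v)))
            else:
                raise ValueError(f"unknown message {kind!r} from agent {agent_id}")

    # Interleaved messages from researchers 1 and 2, as they might arrive step by step.
    exp = Experimenter()
    for agent, msg in [(1, ("node", "a")), (2, ("node", "d")),
                       (1, ("edge", "a", "b")), (2, ("edge", "d", "c")),
                       (1, ("isthmus", "b", "c"))]:
        exp.handle(agent, msg)
    print(sorted(exp.nodes), sorted(tuple(sorted(e)) for e in exp.edges))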

  4. Kovalenko S.Yu., Yusubalieva G.M.
    Survival task for the mathematical model of glioma therapy with blood-brain barrier
    Computer Research and Modeling, 2018, v. 10, no. 1, pp. 113-123

    The paper proposes a mathematical model for the therapy of glioma, taking into account the blood-brain barrier (BBB), radiotherapy, and antibody therapy. The parameters were estimated from experimental data, and the effect of parameter values on the effectiveness of treatment and the prognosis of the disease was evaluated. Possible variants of the sequential use of radiotherapy and antibody therapy were explored. The combined use of radiotherapy with intravenous administration of mAb Cx43 leads to a potentiation of the therapeutic effect in glioma.

    Radiotherapy must precede chemotherapy, as radiation exposure reduces the barrier function of endothelial cells. Endothelial cells of the brain vessels fit tightly to each other; between their walls, so-called tight junctions are formed, whose role in providing the BBB is to prevent the penetration of various undesirable substances from the bloodstream into the brain tissue. These tight junctions between endothelial cells block intercellular passive transport.

    The mathematical model consists of a continuous part and a discrete one. Experimental data on the volume of glioma show the following interesting dynamics: after the cessation of radiation exposure, tumor growth does not resume immediately; there is some time interval during which the glioma does not grow. Glioma cells are divided into two groups: the first group consists of living cells that divide as fast as possible, the second group consists of cells affected by radiation. As a measure of the health of the blood-brain barrier, the ratio of the number of BBB cells at the current moment to their number at rest, that is, in an average healthy state, is chosen.

    The continuous part of the model includes a description of the division of both types of glioma cells, the recovery of BBB cells, and the dynamics of the drug. Reducing the number of well-functioning BBB cells facilitates the penetration of the drug to the brain cells, that is, enhances the action of the drug. At the same time, the rate of division of glioma cells does not increase, since it is limited not by a deficiency of nutrients available to the cells but by the internal mechanisms of the cell. The discrete part of the mathematical model includes the radiation exposure operator, which is applied to the BBB indicator and to the glial cells.

    Within the framework of the mathematical model of treatment of a cancerous tumor (glioma), the problem of optimal control with phase constraints is solved. The patient’s condition is described by two variables: the volume of the tumor and the condition of the BBB. The phase constraints delineate a certain area in the space of these indicators, which we call the survival area. Our task is to find treatment strategies that minimize the time of treatment, maximize the patient’s rest time, and at the same time keep the state indicators within the permitted limits. Since the task of survival is to maximize the patient’s lifespan, it is precisely such treatment strategies that return the indicators to their original position (and we see periodic trajectories on the graphs). Periodic trajectories indicate that the deadly disease is turned into a chronic one.

    Views (last year): 14.
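
    The toy sketch below mimics only the continuous-plus-discrete structure described in the abstract (ODE dynamics interrupted by a discrete radiation operator); the right-hand sides, parameter values, and variable names are invented for illustration and are not the authors' equations.

    from scipy.integrate import solve_ivp

    GROWTH, DECAY, RECOVERY, CLEAR, KILL = 0.05, 0.02, 0.10, 0.30, 0.40   # assumed values

    def rhs(t, y):
        """Continuous part: x1 dividing glioma cells, x2 radiation-damaged cells,
        b BBB indicator (fraction of healthy barrier cells), c drug concentration."""
        x1, x2, b, c = y
        drug_effect = KILL * c * (1.0 - b)        # a weaker barrier strengthens the drug action
        return [GROWTH * x1 - drug_effect * x1,   # division of living cells minus drug kill
                -DECAY * x2,                      # slow death of irradiated cells
                RECOVERY * (1.0 - b),             # BBB recovery towards the healthy state
                -CLEAR * c]                       # drug clearance

    def radiation(y, dose=0.5):
        """Discrete part: moves cells into the damaged pool and weakens the BBB."""
        x1, x2, b, c = y
        return [x1 * (1 - dose), x2 + x1 * dose, b * (1 - 0.3 * dose), c]

    # Alternate one week of continuous dynamics with a single irradiation event.
    y = [1.0, 0.0, 1.0, 0.2]
    for week in range(4):
        sol = solve_ivp(rhs, (0, 7), y)
        y = radiation(sol.y[:, -1])
        print(f"week {week + 1}: tumor = {y[0] + y[1]:.3f}, BBB = {y[2]:.3f}")
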
  5. Lukianchenko P.P., Danilov A.M., Bugaev A.S., Gorbunov E.I., Pashkov R.A., Ilyina P.G., Gadzhimirzayev Sh.M.
    Approach to Estimating the Dynamics of the Industry Consolidation Level
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 129-140

    In this article we propose a new approach to the analysis of econometric industry parameters for estimating the industry consolidation level. The research is based on a simple automatic control model of the industry. The state of the industry is measured by econometric parameters obtained quarterly from each company in the industry and provided by the tax control regulator. An approach to industry analysis is proposed that does not track the economy of each company but explores the parameters of the set of all companies as a whole. The quarterly obtained econometric parameters for each company are Income, Number of Employees, Taxes, and Income from Software Licenses. The ABC analysis method was extended to ABCD analysis (D being companies with zero impact on industry metrics) and used to make the results obtained for different indicators comparable. Pareto charts were formed for the set of econometric indicators.

    To estimate the industry monopolization, the Herfindahl-Hirschman index (HHI) was calculated for the most sensitive company metrics. Using the HHI approach, it was shown that COVID-19 did not lead to changes in the monopolization of the Russian IT industry.

    As the most visually obvious approach to industry visualization, scatter diagrams combined with the Pareto chart colors are proposed. The effect of the accreditation procedure is clearly observed on a scatter diagram with red/black dots for accredited and non-accredited companies, respectively.

    The last reported result is the proposal to use end-to-end product identification of licenses as a market structure control instrument. It provides a basis for avoiding multiple accounting of license reselling within the software distribution chain.

    The results of the research could form the basis for future IT industry analysis and agent-based simulation.
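
    A minimal sketch of the two measures mentioned in the abstract, the ABCD classification and the Herfindahl-Hirschman index, applied to a made-up list of quarterly company incomes; the thresholds and data are illustrative assumptions.

    def hhi(values):
        """Herfindahl-Hirschman index: sum of squared market shares in percent (maximum 10000)."""
        total = sum(values)
        return sum((100.0 * v / total) ** 2 for v in values)

    def abcd_classes(values, thresholds=(0.80, 0.95)):
        """ABC analysis extended with a D class for companies with zero impact."""
        total = float(sum(values))
        order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
        labels, cum = {}, 0.0
        for i in order:
            if values[i] == 0:
                labels[i] = "D"                   # zero-level impact on the industry metrics
                continue
            cum += values[i] / total
            labels[i] = "A" if cum <= thresholds[0] else "B" if cum <= thresholds[1] else "C"
        return labels

    incomes = [50, 30, 10, 5, 5, 0]               # hypothetical quarterly incomes per company
    print(abcd_classes(incomes))                  # {0: 'A', 1: 'A', 2: 'B', 3: 'B', 4: 'C', 5: 'D'}
    print(f"HHI = {hhi(incomes):.0f}")            # 3550: a highly concentrated market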

  6. Zavodskikh R.K., Efanov N.N.
    Performance prediction for chosen types of loops over one-dimensional arrays with embedding-driven intermediate representations analysis
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 211-224

    A method for mapping a set of intermediate representations (IR) of C and C++ programs to a vector embedding space is considered in order to create an empirical framework for static performance prediction using the LLVM compiler infrastructure. The use of embeddings makes programs easier to compare by avoiding direct comparison of Control Flow Graphs (CFG) and Data Flow Graphs (DFG). The method is based on a series of transformations of the initial IR: instrumentation (injection of artificial instructions in an instrumentation compiler pass depending on the load offset delta of the current instruction relative to the previous one), mapping of the instrumented IR into a multidimensional vector with IR2Vec, and dimension reduction with the t-SNE (t-distributed stochastic neighbor embedding) method. The D1 cache miss ratio measured with the perf stat tool is used as the performance metric. A heuristic criterion for deciding which programs have a higher or lower cache miss ratio is given; it is based on the embeddings of the programs in 2D space. The instrumentation compiler pass developed in this work is described: how it generates and injects artificial instructions into the IR within the memory model used. The software pipeline that implements the performance estimation based on the LLVM compiler infrastructure is presented. Computational experiments are performed on synthetic tests, which are sets of programs with the same CFGs but with different sequences of offsets used when accessing a one-dimensional array of a given size. The correlation coefficient between the performance metric and the distance to the worst program’s embedding is measured and shown to be negative regardless of the t-SNE initialization, which supports the heuristic criterion. The process of generating such synthetic tests is also considered. Moreover, the spread of the performance metric over the programs in such a test is proposed as a metric to be improved by exploring more test generators.
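
    The sketch below reproduces only the post-embedding analysis step described in the abstract: reduce program embeddings to 2D with t-SNE and correlate the performance metric with the distance to the worst program's embedding. Random vectors stand in for IR2Vec output and random values for the measured D1 cache miss ratios.

    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(60, 300))       # stand-in for IR2Vec program vectors
    miss_ratio = rng.uniform(0.0, 0.3, size=60)   # stand-in for perf stat D1 miss ratios

    points = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(embeddings)
    worst = points[np.argmax(miss_ratio)]         # 2D embedding of the worst program
    dist = np.linalg.norm(points - worst, axis=1)

    r, p = pearsonr(miss_ratio, dist)             # the paper reports a negative correlation here
    print(f"Pearson r = {r:.2f} (p = {p:.3f})")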
