Search results for 'distributed data processing':
Articles found: 52
  1. Kuznetsov M.B., Kolobov A.V.
    Mathematical investigation of antiangiogenic monotherapy effect on heterogeneous tumor progression
    Computer Research and Modeling, 2017, v. 9, no. 3, pp. 487-501

    In the last decade, along with classical cytotoxic agents, antiangiogenic drugs have been actively used in cancer chemotherapy. They are aimed not at killing malignant cells, but at blocking the process of angiogenesis, i.e., the growth of new vessels in the tumor and its surrounding tissues. Agents that stimulate angiogenesis, in particular vascular endothelial growth factor, are actively produced by tumor cells in a state of metabolic stress. It is believed that blocking tumor neovascularization should lead to a shortage of nutrient inflow to the tumor and thus can stop, or at least significantly slow down, its growth. Clinical practice with the first antiangiogenic drug, bevacizumab, has shown that in some cases such therapy does not influence the growth rate of the tumor, whereas for other types of malignant neoplasms antiangiogenic therapy has a high antitumor effect. However, it has been shown that, along with successful slowing of tumor growth, therapy with bevacizumab can induce directed tumor progression to a more invasive, and therefore more lethal, type. These data call for theoretical analysis and a rationale for the evolutionary factors that lead to the observed epithelial-mesenchymal transition. For this purpose we have developed a spatially distributed mathematical model of the growth and antiangiogenic therapy of a heterogeneous tumor consisting of two subpopulations of malignant cells. One of the subpopulations possesses the inherent characteristics of the epithelial phenotype, i.e., low motility and a high proliferation rate; the other corresponds to the mesenchymal phenotype, having high motility and a low proliferation rate. We have investigated the competition between these subpopulations of a heterogeneous tumor in the cases of tumor growth without therapy and under bevacizumab monotherapy. It is shown that constant use of the antiangiogenic drug enlarges the region in parameter space where the mesenchymal phenotype dominates, i.e., within a certain range of parameters the epithelial phenotype is dominant in the absence of therapy, but during bevacizumab administration the mesenchymal phenotype begins to dominate. This result provides a theoretical basis for the clinically observed directed tumor progression to a more invasive type under antiangiogenic therapy.

    Views (last year): 10. Citations: 2 (RSCI).
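
    For orientation only, a minimal sketch of the kind of two-subpopulation reaction-diffusion competition model the abstract describes; the notation ($n_e$, $n_m$ for epithelial and mesenchymal cell densities, $s$ for nutrient, $B$, $D$, $s^*$ for rates) is an assumption and is not taken from the paper:

```latex
\frac{\partial n_{e}}{\partial t} = B_{e}\,\frac{s}{s+s^{*}}\,n_{e} + D_{e}\nabla^{2} n_{e}, \qquad
\frac{\partial n_{m}}{\partial t} = B_{m}\,\frac{s}{s+s^{*}}\,n_{m} + D_{m}\nabla^{2} n_{m}, \qquad
B_{e} > B_{m},\; D_{e} \ll D_{m},
```

    where both phenotypes compete for the shared nutrient $s$, whose supply depends on the vessel density reduced by the antiangiogenic drug.
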
  2. The paper develops a new mathematical method for the joint calculation of signal and noise under the Rice statistical distribution, based on combining the maximum likelihood method and the method of moments. The sought-for values of signal and noise are calculated by processing sampled measurements of the analyzed Rician signal’s amplitude. An explicit system of equations for the required signal and noise parameters has been obtained, and the results of its numerical solution are provided, confirming the efficiency of the proposed technique. It has been shown that solving the two-parameter task by means of the proposed technique does not increase the demand for computational resources compared with solving the task in the one-parameter approximation. An analytical solution has been obtained for the particular case of a small signal-to-noise ratio. The paper investigates how the accuracy and variance of the sought-for parameter estimates depend on the number of measurements in the experimental sample. According to the results of numerical experiments, the variances of the signal and noise parameter estimates obtained by the proposed technique decrease in inverse proportion to the number of measurements in a sample. The accuracy of estimating the Rician parameters by the proposed technique has been compared with that of an earlier developed version of the method of moments. The problem considered in the paper is meaningful for Rician data processing, in particular in magnetic resonance imaging systems, in ultrasonic imaging devices, in the analysis of optical signals in range-measuring systems and of radar signals, as well as in many other scientific and applied tasks that are adequately described by the Rice statistical model.

    Views (last year): 11.
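
    For reference, the Rice probability density and its second and fourth raw moments, on which moment-based estimators of the signal amplitude $\nu$ and noise parameter $\sigma$ are commonly built; this is general background, not the specific combined ML/moments system derived in the paper:

```latex
p(x \mid \nu, \sigma) = \frac{x}{\sigma^{2}}\exp\!\left(-\frac{x^{2}+\nu^{2}}{2\sigma^{2}}\right) I_{0}\!\left(\frac{x\nu}{\sigma^{2}}\right), \qquad
\langle x^{2}\rangle = \nu^{2} + 2\sigma^{2}, \qquad
\langle x^{4}\rangle = \nu^{4} + 8\sigma^{2}\nu^{2} + 8\sigma^{4}.
```
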
  3. Suvorov N.V., Shleymovich M.P.
    Mathematical model of the biometric iris recognition system
    Computer Research and Modeling, 2020, v. 12, no. 3, pp. 629-639

    Automatic recognition of personal identity by biometric features is based on unique peculiarities or characteristics of people. The biometric identification process consists in creating reference templates and comparing them with new input data. In practice, iris pattern recognition algorithms show high accuracy and a low percentage of identification errors. The advantages of the iris pattern over other biometric features are determined by its high number of degrees of freedom (about 249), the high density of unique features and its constancy. A high level of recognition reliability is very important because it enables search in large databases: it allows working in one-to-many identification mode, unlike the one-to-one check mode, which is applicable only to a small number of comparisons. Every biometric identification system is probabilistic, and its qualitative characteristics are described by such parameters as recognition accuracy, false acceptance rate and false rejection rate. These characteristics make it possible to compare identity recognition methods and to assess system performance under any circumstances. This article presents a mathematical model of biometric identification by iris pattern and its characteristics. In addition, the results of comparing the model with the real recognition process are analyzed. For this analysis, a review of existing iris pattern recognition methods based on different vectors of unique features was carried out. A Python-based software package is described which builds probabilistic distributions and generates large test data sets. Such data sets can also be used to train a neural network that makes the identification decision. Furthermore, an algorithm combining several iris pattern identification methods is suggested to improve the qualitative characteristics of the system in comparison with the use of each method separately.
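
    The paper's probabilistic model is not reproduced here; the sketch below only illustrates how false acceptance and false rejection rates are estimated from genuine and impostor score distributions, with made-up normal distributions standing in for generated test data:

```python
import numpy as np

def far_frr(genuine_scores, impostor_scores, threshold):
    """Estimate the false rejection rate (FRR) and false acceptance rate (FAR)
    of a distance-based matcher: a comparison is accepted when the
    (e.g. Hamming) distance falls below the threshold."""
    genuine = np.asarray(genuine_scores)
    impostor = np.asarray(impostor_scores)
    frr = np.mean(genuine >= threshold)   # genuine pairs wrongly rejected
    far = np.mean(impostor < threshold)   # impostor pairs wrongly accepted
    return far, frr

# Toy data: genuine and impostor distances drawn from two normal
# distributions with purely illustrative parameters.
rng = np.random.default_rng(0)
genuine = rng.normal(0.20, 0.06, 100_000)
impostor = rng.normal(0.45, 0.04, 100_000)
for t in (0.30, 0.33, 0.36):
    far, frr = far_frr(genuine, impostor, t)
    print(f"threshold={t:.2f}  FAR={far:.1e}  FRR={frr:.1e}")
```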

  4. Belov S.D., Deng Z., Li W., Lin T., Pelevanyuk I., Trofimov V.V., Uzhinskiy A.V., Yan T., Yan X., Zhang G., Zhao X., Zhang X., Zhemchugov A.S.
    BES-III distributed computing status
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 469-473

    The BES-III experiment at IHEP CAS, Beijing, is running at the high-luminosity e+e- collider BEPC-II to study the physics of charm quarks and tau leptons. The world's largest samples of J/psi and psi' events have already been collected, and a number of unique data samples in the energy range 2.5–4.6 GeV have been taken. The data volume is expected to increase by an order of magnitude in the coming years. This requires moving from a centralized computing system to a distributed computing environment, thus allowing the use of computing resources from remote sites that are members of the BES-III Collaboration. In this report the general information, latest results and development plans of the BES-III distributed computing system are presented.

    Views (last year): 3.
  5. Usanov M.S., Kulberg N.S., Yakovleva T.V., Morozov S.P.
    Determination of CT dose by means of noise analysis
    Computer Research and Modeling, 2018, v. 10, no. 4, pp. 525-533

    The article deals with the process of creating an effective algorithm for determining the number of quanta emitted by an X-ray tube in computed tomography (CT) studies. An analysis of domestic and foreign literature showed that most work in the field of radiometry and radiography relies on tabulated values of X-ray absorption coefficients, while individual dose factors are not taken into account at all, since many studies lack the Dose Report; instead, an average value is used to simplify the calculation of statistics. In this regard, it was decided to develop a method for estimating the number of ionizing quanta by analyzing the noise of CT data. As the basis of the algorithm, we used a mathematical model of our own design built on the Poisson and Gauss distributions of the logarithmized signal. The resulting mathematical model was tested on CT data of a calibration phantom consisting of three plastic cylinders filled with water, whose X-ray absorption coefficient is known from tabulated values. The data were obtained from several CT devices from different manufacturers (Siemens, Toshiba, GE, Phillips). The developed algorithm made it possible to calculate the number of emitted X-ray quanta per unit time. Taking into account the noise level and the radii of the cylinders, these data were converted to X-ray absorption values, which were then compared with the tabulated values. The algorithm was applied to CT data of various configurations, and the experimental results obtained were consistent with the theoretical part and the mathematical model. The results showed good accuracy of the algorithm and of the mathematical apparatus, which demonstrates the reliability of the obtained data. This mathematical model is already used in a noise reduction program for CT of our own design, where it serves as a method for setting a dynamic noise-reduction threshold. At the moment, the algorithm is being adapted to work with real CT data of patients.

    Views (last year): 23. Citations: 1 (RSCI).
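
    As an illustration of the underlying idea only (not the authors' full Poisson-Gaussian model): for a Poisson photon count $N$ with mean $\lambda$, the delta method gives $\mathrm{Var}[\ln N] \approx 1/\lambda$, so the noise of log-attenuation data yields a rough estimate of the number of detected quanta. A minimal sketch, with simulated data in place of real CT measurements:

```python
import numpy as np

def quanta_from_log_noise(log_samples):
    """Rough estimate of the mean number of detected X-ray quanta from the
    noise of logarithmized data: Var[ln N] ~ 1/lambda for Poisson counts,
    hence lambda ~ 1 / Var.  Illustrative only."""
    v = np.var(np.asarray(log_samples), ddof=1)
    return 1.0 / v

# Toy check against simulated Poisson counts with a known mean.
rng = np.random.default_rng(1)
lam_true = 5_000.0
counts = rng.poisson(lam_true, size=100_000)
print(quanta_from_log_noise(np.log(counts)))   # close to lam_true
```
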
  6. Orlova E.V.
    Model for operational optimal control of financial resources distribution in a company
    Computer Research and Modeling, 2019, v. 11, no. 2, pp. 343-358

    The article carries out a critical analysis of existing approaches, methods and models for solving the problem of operational management of financial resources. A number of significant shortcomings of the presented models that limit the scope of their effective use were identified: the models are static, the probabilistic nature of financial flows is not taken into account, and the daily amounts of receivables and payables that significantly affect the solvency and liquidity of the company are not identified. This necessitates the development of a new model that reflects the essential properties of the system of planned financial flows: stochasticity, dynamism and non-stationarity.

    A model for the distribution of financial flows has been developed. It is based on the principles of optimal dynamic control and provides financial resource planning that ensures an adequate level of liquidity and solvency of a company while taking the uncertainty of the initial data into account. An algorithm for designing the target cash balance is proposed, based on the principle of ensuring a company's financial stability under changing financial constraints.

    A characteristic feature of the proposed model is the presentation of the cash distribution process as a discrete dynamic process, for which a plan of financial resource allocation is determined that ensures the extremum of an optimality criterion. The design of such a plan is based on coordinating payments (cash expenses) with cash receipts. This approach makes it possible to synthesize different plans that differ in combinations of financial outflows and then to select the best one according to a given criterion. The minimum total cost associated with the payment of fines for untimely financing of expenses was taken as the optimality criterion. The constraints of the model are the requirement to maintain the minimum allowable cash balances in the subperiods of the planning period, as well as the obligation to make payments during the planning period, taking into account their maturity. The suggested model makes it possible to solve, with a high degree of efficiency, the problem of distributing financial resources under uncertainty in the timing and amounts of receipts, and of coordinating cash inflows and outflows. The practical significance of the research lies in the application of the developed model, which allows improving the quality of financial planning and increasing the management and operational efficiency of a company.

    Views (last year): 33.
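
    To make the structure of such a problem concrete, here is a small linear-programming sketch of a cash-allocation task of the kind the abstract describes: payments are scheduled over subperiods so as to minimize fines for late financing while keeping the cash balance above a minimum. All numbers, symbols and the exact formulation are assumptions for illustration, not the paper's model:

```python
import numpy as np
from scipy.optimize import linprog

T = 5                                          # planning subperiods
inflow = np.array([40., 10., 30., 20., 25.])   # expected cash receipts per subperiod
b0, b_min = 15.0, 5.0                          # opening and minimum allowed cash balance
amounts = np.array([30., 45., 20.])            # payment amounts
due = np.array([1, 2, 3])                      # due subperiod of each payment
fine = np.array([0.02, 0.05, 0.01])            # fine per unit per period of delay

J = len(amounts)
n = J * T                                      # decision x[j, t]: part of payment j paid in subperiod t
cost = np.array([fine[j] * max(0, t - due[j]) for j in range(J) for t in range(T)])

# Equality constraints: every payment is made in full within the horizon.
A_eq = np.zeros((J, n))
for j in range(J):
    A_eq[j, j * T:(j + 1) * T] = 1.0
b_eq = amounts

# Inequality constraints: cumulative outflow never drives the balance below b_min.
A_ub = np.zeros((T, n))
for t in range(T):
    for j in range(J):
        A_ub[t, j * T:j * T + t + 1] = 1.0
b_ub = b0 + np.cumsum(inflow) - b_min

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print(res.x.reshape(J, T).round(1))            # payment schedule
print(round(res.fun, 3))                       # total fines under the optimal plan
```
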
  7. Lukyantsev D.S., Afanasiev N.T., Tanaev A.B., Chudaev S.O.
    Numerical-analytical modeling of gravitational lensing of the electromagnetic waves in random-inhomogeneous space plasma
    Computer Research and Modeling, 2024, v. 16, no. 2, pp. 433-443

    A numerical-analytical tool for modeling the propagation characteristics of electromagnetic waves in chaotic space plasma, taking gravitational effects into account, is developed for interpreting measurement data from new-generation high-precision astrophysical instruments. The problem of wave propagation in curved (Riemannian) space is solved in Euclidean space by introducing an effective index of refraction of vacuum. The gravitational potential can be calculated for various models of the mass distribution of astrophysical objects by solving Poisson's equation, after which the effective index of refraction of vacuum can be evaluated. An approximate model of the effective index of refraction is suggested under the assumption that the various objects contribute additively to the total gravitational field. The characteristics of electromagnetic waves in the gravitational field of astrophysical objects are calculated in the geometrical-optics approximation, provided that the spatial scales of the index of refraction are much larger than the wavelength. Ray differential equations in Euler's form constitute the basis of the numerical-analytical tool for modeling the trajectory characteristics of the waves. Chaotic inhomogeneities of space plasma are introduced through a model of the spatial correlation function of the index of refraction. Refractive scattering of the waves is also calculated in the geometrical-optics approximation. Integral equations for the statistical moments of the lateral deviations of rays in the observer's picture plane are obtained. Using analytical transformations, the integrals for the moments are reduced to a system of first-order ordinary differential equations for the joint numerical calculation of the mean and mean-square deviations of the rays. Results of numerical-analytical modeling of the trajectory picture of electromagnetic wave propagation in interstellar space are shown, taking into account the impact of the gravitational fields of space objects and the refractive scattering of waves on inhomogeneities of the index of refraction of the surrounding plasma. Based on the modeling results, a quantitative estimate is given of the conditions for stochastic blurring of the gravitational lensing effect in various frequency ranges. It is shown that operating frequencies in the meter wavelength range represent a conditional low-frequency limit for observing the gravitational lensing effect in stochastic space plasma. The proposed numerical-analytical tool can be used to analyze the structure of quasar electromagnetic radiation propagating through a group of galaxies.
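
    As general background rather than the paper's own equations: in the weak-field approximation the gravitational potential obtained from Poisson's equation defines an effective vacuum index of refraction, and rays are then traced with the ray equation in Euler's form,

```latex
\nabla^{2}\Phi = 4\pi G \rho, \qquad
n_{\mathrm{eff}}(\mathbf{r}) \approx 1 - \frac{2\Phi(\mathbf{r})}{c^{2}}, \qquad
\frac{d}{ds}\!\left(n\,\frac{d\mathbf{r}}{ds}\right) = \nabla n,
```

    where $\Phi < 0$ near massive bodies, so $n_{\mathrm{eff}} > 1$ and rays bend toward the mass.
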

  8. Makarov I.S., Bagantsova E.R., Iashin P.A., Kovaleva M.D., Zakharova E.M.
    Development of and research into a rigid algorithm for analyzing Twitter publications and their influence on the movements of the cryptocurrency market
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 157-170

    Social media is a crucial indicator of the position of assets in the financial market. The paper describes a rigid solution of the classification problem of determining the influence of social media activity on financial market movements. Reputable crypto trader influencers are selected, and packages of their Twitter posts are used as data. Preprocessing of the texts, which are characterized by frequent use of slang words and abbreviations, consists in lemmatization with Stanza and the use of regular expressions. In solving the binary classification problem, a word is treated as an element of the feature vector of a data unit. The best markup parameters for processing Binance candles are searched for. The methods of feature selection, which is necessary for a precise description of the text data and the subsequent process of establishing dependence, are represented by machine learning and statistical analysis. The first is feature selection based on an information criterion; this approach is implemented in a random forest model and is relevant to the task of selecting features for splitting nodes in a decision tree. The second is based on the rigid compilation of a binary vector by a rough check of the presence or absence of each word in the package and counting the sum of the elements of this vector; a decision is then made depending on whether this sum exceeds a threshold value determined in advance by analyzing the frequency distribution of mentions of the word. The algorithm used to solve the problem was named the benchmark and analyzed as a tool. Similar algorithms are often used in automated trading strategies. The study also describes observations of the influence of frequently occurring words, which are used as a basis of dimension 2 and 3 in the vectorization.
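
    A minimal sketch of the threshold rule the abstract calls the benchmark: mark which keywords occur in a package of posts as a binary vector and fire a signal when the sum of that vector reaches a threshold. The keyword list and threshold below are illustrative assumptions; the paper derives them from the frequency distribution of word mentions:

```python
import re

def benchmark_signal(posts, keywords, threshold):
    """Binary-vector benchmark rule: 1 if a keyword appears anywhere in the
    package of posts, 0 otherwise; signal fires when the sum of the vector
    reaches the threshold."""
    tokens = set(re.findall(r"[a-z#@']+", " ".join(posts).lower()))
    binary = [1 if w in tokens else 0 for w in keywords]
    return sum(binary) >= threshold, binary

posts = ["BTC breaking out, looking bullish", "whales are accumulating #bitcoin"]
keywords = ["bullish", "bearish", "pump", "dump", "accumulating"]
print(benchmark_signal(posts, keywords, threshold=2))   # (True, [1, 0, 0, 0, 1])
```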

  9. Bernadotte A., Mazurin A.D.
    Optimization of the brain command dictionary based on the statistical proximity criterion in silent speech recognition task
    Computer Research and Modeling, 2023, v. 15, no. 3, pp. 675-690

    In our research, we focus on the problem of classification for silent speech recognition to develop a brain-computer interface (BCI) based on electroencephalographic (EEG) data, which will be capable of assisting people with mental and physical disabilities and expanding human capabilities in everyday life. Our previous research has shown that the silent pronouncing of some words results in almost identical distributions of electroencephalographic signal data. Such a phenomenon has a suppressive impact on the quality of neural network model behavior. This paper proposes a data processing technique that distinguishes between statistically remote and inseparable classes in the dataset. Applying the proposed approach helps us reach the goal of maximizing the semantic load of the dictionary used in BCI.

    Furthermore, we propose the existence of a statistical predictive criterion for the accuracy of binary classification of the words in a dictionary. Such a criterion aims to estimate the lower and the upper bounds of classifiers’ behavior only by measuring quantitative statistical properties of the data (in particular, using the Kolmogorov – Smirnov method). We show that higher levels of classification accuracy can be achieved by means of applying the proposed predictive criterion, making it possible to form an optimized dictionary in terms of semantic load for the EEG-based BCIs. Furthermore, using such a dictionary as a training dataset for classification problems grants the statistical remoteness of the classes by taking into account the semantic and phonetic properties of the corresponding words and improves the classification behavior of silent speech recognition models.
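
    The sketch below only illustrates a Kolmogorov-Smirnov screening step of the kind the abstract describes, keeping word pairs whose feature distributions are statistically distinguishable; the feature representation, significance level and toy data are assumptions, not the paper's criterion:

```python
import itertools
import numpy as np
from scipy.stats import ks_2samp

def separable_pairs(features_by_word, alpha=0.01):
    """Keep only word pairs whose 1-D feature samples differ significantly
    according to the two-sample Kolmogorov-Smirnov test."""
    kept = []
    for w1, w2 in itertools.combinations(features_by_word, 2):
        stat, p = ks_2samp(features_by_word[w1], features_by_word[w2])
        if p < alpha:                      # distributions differ -> classes are statistically remote
            kept.append((w1, w2, round(stat, 3)))
    return kept

# Toy data: feature samples per "word" (illustrative only).
rng = np.random.default_rng(2)
feats = {"yes":  rng.normal(0.00, 1.0, 500),
         "no":   rng.normal(0.05, 1.0, 500),   # nearly inseparable from "yes"
         "stop": rng.normal(1.50, 1.0, 500)}
print(separable_pairs(feats))
```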

  10. Sofronova E.A., Diveev A.I., Kazaryan D.E., Konstantinov S.V., Daryina A.N., Seliverstov Y.A., Baskin L.A.
    Utilizing multi-source real data for traffic flow optimization in CTraf
    Computer Research and Modeling, 2024, v. 16, no. 1, pp. 147-159

    The problem of optimal control of traffic flow in an urban road network is considered. The control is carried out by varying the duration of the working phases of traffic lights at controlled intersections. A description of the developed control system is given. The control system enables the use of three types of control: open-loop, feedback and manual. In feedback control, road infrastructure detectors, video cameras, inductive loop and radar detectors are used to determine the quantitative characteristics of the current traffic flow state. The quantitative characteristics of the traffic flows are fed into a mathematical model of the traffic flow, implemented in the computer environment of an automatic traffic flow control system, in order to determine the moments for switching the working phases of the traffic lights. The model is a system of finite-difference recurrent equations and describes the change in traffic flow on each road section at each time step, based on retrieved data on traffic flow characteristics in the network, the capacity of maneuvers and the flow distribution over alternative maneuvers at intersections. The model has scaling and aggregation properties. The structure of the model depends on the structure of the graph of the controlled road network. The number of nodes in the graph is equal to the number of road sections in the considered network. The simulation of traffic flow changes in real time makes it possible to optimally determine the duration of traffic light operating phases and to provide traffic flow control with feedback based on the current state of the flow. The system of automatic collection and processing of input data for the model is presented. In order to model the states of traffic flow in the network and to solve the problem of optimal traffic flow control, the CTraf software package has been developed, a brief description of which is given in the paper. An example of the solution of the optimal traffic flow control problem on the basis of real data from the road network of Moscow is given.
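
    A schematic form, with assumed notation, of the kind of finite-difference recurrent balance equation such a model is built on (the paper's actual equations are not reproduced here):

```latex
x_{i}(k+1) = x_{i}(k) + \Delta t\left(\sum_{j \in \mathrm{In}(i)} \alpha_{ji}(k)\,u_{j}(k) \;-\; u_{i}(k)\right),
```

    where $x_{i}(k)$ is the number of vehicles on road section $i$ at step $k$, $u_{i}(k)$ is the outflow from section $i$ limited by the capacities of the available maneuvers and the current traffic-light phase, and $\alpha_{ji}(k)$ are the coefficients of flow distribution over alternative maneuvers at the intersections.
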
