Search results for 'data analysis':
Articles found: 121
  1. Andreeva A.A., Anand M., Lobanov A.I., Nikolaev A.V., Panteleev M.A.
    Using extended ODE systems to investigate the mathematical model of the blood coagulation
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 931-951

    Many properties of the solutions of systems of ordinary differential equations (ODEs) are determined by the properties of the equations in variations. An ODE system that includes both the original nonlinear system and the equations in variations will be referred to below as an extended system. When studying the properties of the Cauchy problem for systems of ODEs, the transition to extended systems allows one to study many subtle properties of solutions. For example, the transition to the extended system makes it possible to increase the order of approximation of numerical methods, provides approaches to constructing the sensitivity function without numerical differentiation procedures, and allows one to use methods of higher convergence order for solving the inverse problem. The authors used the Broyden method, which belongs to the class of quasi-Newton methods. The Rosenbrock method with complex coefficients was used to solve the stiff systems of ordinary differential equations; in our case, it is equivalent to a second-order approximation method for the extended system.

    As an example of the proposed approach, several related mathematical models of the blood coagulation process were considered. Based on the analysis of the numerical results, it was concluded that a description of the factor XI positive feedback loop must be included in the model's system of equations. Estimates of some reaction constants based on the numerical solution of the inverse problem were given.

    The effect of factor V release on platelet activation was considered. The modification of the mathematical model made it possible to achieve quantitative agreement between the thrombin production dynamics and experimental data for an artificial system. Based on the sensitivity analysis, the hypothesis was tested that the lipid membrane composition (the number of sites for the various clotting factors, except for thrombin sites) has no influence on the dynamics of the process.
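
    To illustrate the extended-system idea outside the paper's setting, the sketch below (not the authors' code; the model, solver choice and parameter names are illustrative) integrates a toy one-parameter ODE together with its equation in variations, so the sensitivity of the solution to the parameter is obtained without numerical differentiation. SciPy's implicit Radau solver stands in for the Rosenbrock scheme used in the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy model x' = -k*x (a stand-in for one coagulation kinetics equation).
        # Equation in variations for s = dx/dk:  s' = -k*s - x.
        def extended_rhs(t, y, k):
            x, s = y
            return [-k * x, -k * s - x]

        k = 1.5
        sol = solve_ivp(extended_rhs, (0.0, 5.0), [1.0, 0.0], args=(k,),
                        method="Radau", rtol=1e-8, atol=1e-10)  # stiff-capable solver

        x_T, s_T = sol.y[:, -1]
        print(f"x(T) = {x_T:.6f}, sensitivity dx/dk(T) = {s_T:.6f}")
        # s(T) is exactly what a quasi-Newton (e.g. Broyden) update needs
        # when fitting k to measured data, with no numerical differentiation.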

  2. Moiseev N.A., Nazarova D.I., Semina N.S., Maksimov D.A.
    Changepoint detection on financial data using deep learning approach
    Computer Research and Modeling, 2024, v. 16, no. 2, pp. 555-575

    The purpose of this study is to develop a methodology for detecting change points in time series, including financial data. The theoretical basis of the study is the body of research devoted to the analysis of structural changes in financial markets, descriptions of proposed changepoint detection algorithms, and the peculiarities of building classical and deep machine learning models for this type of problem. The development of such tools is of interest to investors and other stakeholders, providing them with additional approaches to the effective analysis of financial markets and the interpretation of available data.

    To address the research objective, a neural network was trained. In the course of the study, several ways of forming the training sample were considered, differing in the nature of their statistical parameters. To improve the quality of training and obtain more accurate results, a feature-generation methodology was developed to produce the inputs for the neural network. These features, in turn, were derived from the mathematical expectations and standard deviations of the time series over specific intervals. The potential for combining these features to achieve more stable results was also investigated.
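
    A minimal sketch of this kind of feature generation, assuming the features are rolling means and standard deviations over several interval lengths (the window sizes and layout are illustrative, not the authors' configuration):

        import numpy as np

        def interval_features(series, windows=(10, 20, 50)):
            """Rolling means and standard deviations over several intervals,
            stacked into a feature matrix for a changepoint classifier."""
            n = len(series)
            feats = []
            for w in windows:
                mean = np.full(n, np.nan)
                std = np.full(n, np.nan)
                for i in range(w, n):
                    chunk = series[i - w:i]   # the last w observations
                    mean[i] = chunk.mean()
                    std[i] = chunk.std()
                feats += [mean, std]
            return np.column_stack(feats)     # shape (n, 2 * len(windows))

        X = interval_features(np.random.default_rng(0).normal(size=500))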

    The results of model experiments were analyzed to compare the effectiveness of the proposed model with other changepoint detection algorithms that are widely used in practice. A specially generated dataset, developed using proprietary methods, was utilized as both training and testing data. Furthermore, the model, trained on various features, was tested on daily data from the S&P 500 index to assess its effectiveness in a real financial context.

    Alongside the description of the model's operating principles, possibilities for its further improvement are considered, including modernizing the proposed model's structure, optimizing training data generation, and refining feature formation. Additionally, the authors set themselves the task of advancing existing concepts for real-time changepoint detection.

  3. Guskov V.P., Gushchanskiy D.E., Kulabukhova N.V., Abrahamyan S.A., Balyan S.G., Degtyarev A.B., Bogdanov A.V.
    An interactive tool for developing distributed telemedicine systems
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 521-527

    Getting a qualified medical examination can be difficult for people in remote areas, because the available medical staff may be inaccessible or may lack expert knowledge at the proper level. Telemedicine technologies can help in such situations. On the one hand, such technologies allow highly qualified doctors to consult remotely, thereby increasing the quality of diagnosis and treatment planning. On the other hand, computer-aided analysis of examination results, anamnesis and information on similar cases assists medical staff in their routine activities and decision-making.

    Creating a telemedicine system for a particular domain is a laborious process. It is not sufficient to select the proper medical experts and fill the knowledge base of the analytical module; it is also necessary to organize the entire infrastructure of the system to meet requirements for reliability, fault tolerance, protection of personal data and so on. Tools with reusable infrastructure elements, which are common to such systems, can decrease the amount of work needed to develop telemedicine systems.

    The article describes an interactive tool for creating distributed telemedicine systems. A list of requirements for such systems is presented, and structural solutions for meeting them are suggested. A composition of such elements applicable to distributed systems is described. A cardiac telemedicine system is described as the foundation of the tool.

    Views (last year): 3. Citations: 4 (RSCI).
  4. Kamenev G.K., Kamenev I.G.
    Multicriterial metric data analysis in human capital modelling
    Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1223-1245

    The article describes a model of a human in the information economy and demonstrates a multicriteria optimization approach to the metric analysis of model-generated data. The traditional approach involves identifying the model from time series and then using it for prediction. However, this is not possible when some variables are not explicitly observed and only certain typical bounds or population features are known, which is often the case in the social sciences and makes some models purely theoretical. To avoid this problem, we propose a method of metric data analysis (MMDA) for the identification and study of such models, based on the construction and analysis of Kolmogorov – Shannon metric nets of the general population in a multidimensional space of social characteristics. Using this method, the coefficients of the model are identified and the features of its phase trajectories are studied.

    In this paper, we describe a human according to his role in information processing, considering his awareness and cognitive abilities. We construct two lifetime indices of human capital: creative (generalizing cognitive abilities) and productive (generalizing the amount of information mastered by a person), and formulate the problem of their multicriteria (two-criteria) optimization taking life expectancy into account. This approach allows us to identify and economically justify new requirements for the education system and the information environment of human existence.

    It is shown that a Pareto frontier exists in the optimization problem, and its type depends on the mortality rates: at high life expectancy there is one dominant solution, while at lower life expectancy there are different types of Pareto frontier. In particular, the Pareto principle applies to Russia: a significant increase in an individual's creative human capital (generalizing cognitive abilities) is possible at the cost of a small decrease in productive human capital (generalizing awareness). It is shown that an increase in life expectancy makes the competence approach (focused on the development of cognitive abilities) optimal, while for low life expectancy the knowledge approach is preferable.
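
    As a hedged illustration of the two-criteria setup, the sketch below extracts the Pareto frontier from a cloud of model-generated (creative, productive) index pairs; the data and the maximize-both convention are assumptions for illustration, not the paper's model output.

        import numpy as np

        def pareto_frontier(points):
            """Points not dominated when both criteria are maximized."""
            order = np.argsort(-points[:, 0])      # first index, descending
            frontier, best_second = [], -np.inf
            for p in points[order]:
                if p[1] > best_second:             # strictly improves index 2
                    frontier.append(p)
                    best_second = p[1]
            return np.array(frontier)

        # Illustrative cloud of (creative index, productive index) pairs.
        pts = np.random.default_rng(1).uniform(size=(1000, 2))
        print(len(pareto_frontier(pts)), "non-dominated points")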

  5. The paper presents the results of applying a scheme of very high accuracy and resolution to obtain numerical solutions of the Navier – Stokes equations of a compressible gas describing the onset and development of instability of a two-dimensional laminar boundary layer on a flat plate. A peculiarity of the conducted studies is the absence of the commonly used artificial exciters of instability in the direct numerical simulation. The multioperator scheme used made it possible to observe the subtle effects of the birth of unstable modes and the complex nature of their development, presumably owing to its small approximation errors. A brief description of the scheme's design and its main properties is given.

    The formulation of the problem and the method of obtaining initial data are described, which makes it possible to reach the established non-stationary regime fairly quickly. A technique is presented that allows detecting flow fluctuations with amplitudes many orders of magnitude smaller than the mean values. A time-dependent picture of the appearance of packets of Tollmien – Schlichting waves of varying intensity in the vicinity of the leading edge of the plate, and of their downstream propagation, is presented. The amplitude spectra, with peak values broadening in the downstream regions, indicate the excitation of new unstable modes other than those occurring near the leading edge. The analysis of the evolution of the instability waves in time and space showed agreement with the main conclusions of the linear theory.

    The numerical solutions obtained appear to describe for the first time the complete scenario of the possible development of the Tollmien – Schlichting instability, which often plays an essential role at the initial stage of the laminar-turbulent transition. They open up the possibility of full-scale numerical modeling of this process, which is extremely important in practice, with a similar study of the spatial boundary layer.
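
    A minimal sketch of the kind of post-processing such a study relies on, assuming one isolates fluctuations far below the mean flow and inspects their amplitude spectrum (the probe signal here is synthetic, not data from the paper):

        import numpy as np

        # Synthetic probe signal: unit mean flow plus a wave packet
        # nine orders of magnitude smaller.
        t = np.linspace(0.0, 1.0, 4096, endpoint=False)
        packet = np.sin(2 * np.pi * 120 * t) * np.exp(-((t - 0.5) / 0.1) ** 2)
        signal = 1.0 + 1e-9 * packet

        fluct = signal - signal.mean()                    # strip the mean value
        spectrum = np.abs(np.fft.rfft(fluct)) / len(fluct)
        freqs = np.fft.rfftfreq(len(fluct), d=t[1] - t[0])
        print(f"dominant fluctuation at {freqs[spectrum.argmax()]:.0f} (expect 120)")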

  6. Timiryanova V.M., Lakman I.A., Larkin M.M.
    Retail forecasting on high-frequency depersonalized data
    Computer Research and Modeling, 2023, v. 15, no. 6, pp. 1713-1734

    Technological development is producing data that is highly detailed in time and space, which expands the possibilities of analysis, allowing consumer decisions and the competitive behavior of enterprises to be considered in all their diversity, taking into account the context of the territory and the characteristics of time periods. Despite the promise of such studies, they are currently scarce in the scientific literature, owing to a range of problems whose solutions are considered in this paper. The article draws attention to the complexity of analyzing depersonalized high-frequency data and to the possibility of modeling changes in consumption in time and space on their basis.

    The features of this new type of data are considered using real depersonalized data received from the fiscal data operator “First OFD” (JSC “Energy Systems and Communications”). It is shown that, along with the spectrum of problems inherent in high-frequency data, there are disadvantages associated with how the data are generated on the sellers' side, which requires wider use of data mining tools. A series of statistical tests was carried out on the data, including a unit-root test, a test for unobserved individual effects, and tests for serial correlation and for cross-sectional dependence in panels. The presence of spatial autocorrelation was tested using modified Lagrange multiplier tests. The tests showed consistent serial correlation and spatial dependence in the data, which justify applying panel and spatial analysis methods to the high-frequency data accumulated by fiscal operators.

    The constructed models made it possible to substantiate the spatial relationship of sales growth and its dependence on the day of the week. The limiting factor for increasing the predictive ability of the constructed models, and for their subsequent extension with explanatory factors, was the lack of openly accessible statistics grouped at the required level of detail in time and space, which underlines the relevance of building high-frequency, geographically structured databases.
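
    A sketch of a few diagnostics of this kind on an illustrative daily sales series: an augmented Dickey – Fuller unit-root test and a Ljung – Box test for serial correlation from statsmodels, plus a hand-rolled Moran's I for spatial autocorrelation (a simpler statistic than the modified Lagrange multiplier tests used in the paper; the data and the weight matrix are invented, not the “First OFD” data):

        import numpy as np
        from statsmodels.tsa.stattools import adfuller
        from statsmodels.stats.diagnostic import acorr_ljungbox

        rng = np.random.default_rng(2)
        sales = 100.0 + rng.normal(size=365).cumsum()   # illustrative daily series

        adf_stat, adf_p, *_ = adfuller(sales)           # H0: unit root present
        lb = acorr_ljungbox(np.diff(sales), lags=[7])   # weekly serial correlation
        print(f"ADF p-value: {adf_p:.3f}")
        print(lb)

        def morans_i(x, W):
            """Moran's I for values x and a spatial weight matrix W,
            e.g. sales growth by district with a contiguity matrix."""
            z = x - x.mean()
            return len(x) / W.sum() * (z @ W @ z) / (z @ z)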

  7. Fedorov A.A., Soshilov I.V., Loginov V.N.
    Augmented data routing algorithms for satellite delay-tolerant networks. Development and validation
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 983-993

    The problem of centralized planning of data transmission routes in delay-tolerant networks is considered. The original problem is extended with additional requirements on node storage and the communication process. First, it is assumed that connections between the nodes of the graph are established using antennas. Second, it is assumed that each node has a storage of finite capacity. Existing works do not consider these requirements. It is assumed that the following information is available in advance: the messages to be processed, the network configuration at specified time points taken at certain intervals, the time delays for orienting the antennas for data transmission, and the restrictions on the amount of data storage on each satellite of the constellation.

    Two well-known algorithms, CGR and Earliest Delivery with All Queues, are improved to satisfy the extended requirements; the resulting algorithms are named RDTNAS-CG and RDTNAS-AQ, respectively. The original algorithms have been significantly extended and an augmented implementation has been developed. The obtained algorithms solve the optimal message routing problem separately for each message. The problem of validating the algorithms under a lack of test data is considered as well: possible approaches to validation based on qualitative conjectures are proposed and tested, and the experimental results are described. Validation experiments were carried out to check the minimum “quality” requirements for the correctness of the algorithms. A comparative performance analysis of the two algorithms showed that RDTNAS-AQ is several orders of magnitude faster than RDTNAS-CG.
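
    For intuition, here is a hedged sketch of the earliest-delivery principle underlying both algorithms: a Dijkstra-style search over a contact plan. It omits the antenna-orientation delays, storage limits and transmission times that the RDTNAS variants handle, and the contact plan and names are invented for illustration.

        import heapq

        # Invented contact plan: (sender, receiver, start_time, end_time).
        contacts = [("A", "B", 0, 10), ("B", "C", 5, 15), ("A", "C", 20, 25)]

        def earliest_delivery(src, dst, t0):
            """Dijkstra over contacts: earliest possible arrival at dst."""
            best = {src: t0}
            heap = [(t0, src)]
            while heap:
                t, node = heapq.heappop(heap)
                if node == dst:
                    return t
                for s, r, a, b in contacts:
                    if s == node and t <= b:       # contact not yet over
                        arrival = max(t, a)        # wait for the window to open
                        if arrival < best.get(r, float("inf")):
                            best[r] = arrival
                            heapq.heappush(heap, (arrival, r))
            return None

        print(earliest_delivery("A", "C", 0))  # 5: A->B at t=0, then B->C at t=5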

  8. Kholodkov K.I., Aleshin I.M.
    Exact calculation of a posteriori probability distribution with distributed computing systems
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 539-542

    We present a specific grid infrastructure and the development and deployment of a web application. Their purpose is to solve particular geophysical problems that require heavy computational resources. Here we cover a technology overview and the connector framework internals. The connector framework links problem-specific routines with the middleware in such a way that the application developer does not have to be aware of any particular grid software. That is, a web application built with this framework acts as an interface between the user's web browser and the Grid's own, often quite specific, middleware.

    Our distributed computing system is built around the GridWay metascheduler. The metascheduler is connected to the TORQUE resource managers of virtual compute nodes that run atop a compute cluster using virtualization technology. This approach offers several notable features that are unavailable to bare-metal compute clusters.

    The first application we integrated with our framework is the determination of seismic anisotropy parameters by inversion of SKS and converted phases. We used a probabilistic approach to the inverse problem based on the a posteriori probability distribution function (APDF) formalism. To get the exact solution of the problem, the values of a multidimensional function must be computed. In our implementation we used brute-force APDF calculation on a rectangular grid across the parameter space.
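
    A minimal sketch of brute-force APDF evaluation on a rectangular grid over a two-parameter space (fast-axis azimuth and delay time, as in shear-wave splitting); the misfit function is an invented stand-in for the real inversion kernel. Each grid point is independent, which is what makes this calculation easy to distribute across grid jobs.

        import numpy as np

        def misfit(phi, dt):
            """Invented stand-in for the data misfit of the anisotropy inversion."""
            return (phi - 0.8) ** 2 / 0.01 + (dt - 1.2) ** 2 / 0.04

        phi_grid = np.linspace(0.0, np.pi, 180)   # fast-axis azimuth, rad
        dt_grid = np.linspace(0.0, 3.0, 300)      # delay time, s
        P, D = np.meshgrid(phi_grid, dt_grid, indexing="ij")

        log_post = -0.5 * misfit(P, D)            # log a posteriori density
        post = np.exp(log_post - log_post.max())  # subtract max for stability
        post /= post.sum()

        # The paper's analysis tools in miniature: a 1D marginal and the maximum.
        marginal_dt = post.sum(axis=0)
        i, j = np.unravel_index(post.argmax(), post.shape)
        print(f"MAP: phi = {phi_grid[i]:.3f}, dt = {dt_grid[j]:.3f}")
        print(f"marginal peak: dt = {dt_grid[marginal_dt.argmax()]:.3f}")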

    The results of the computation are stored in a relational DBMS and then presented in a familiar human-readable form. The application provides several instruments for analyzing the function's shape from the computational results: the maximum value distribution, 2D cross-sections of the APDF, 2D marginals and a few other tools. During the tests we ran the application against both synthetic and observed data.

    Views (last year): 3.
  9. Bobkov S.A., Teslyuk A.B., Gorobtsov O.Yu., Yefanov O.M., Kurta R.P., Ilyin V.A., Golosova M.V., Vartanyants I.A.
    XFEL diffraction patterns representation method for classification, indexing and search
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 631-639

    The paper presents the results of applying machine learning methods, namely principal component analysis and support vector machines, to the classification of diffraction images produced in experiments at free-electron lasers. The high efficiency of this approach is demonstrated by applying it to simulated data for the adenovirus capsid and the bluetongue virus core. These datasets were simulated taking into account the real conditions of free-electron laser experiments, such as noise and the characteristics of the detectors used.
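
    A hedged sketch of such a PCA + SVM pipeline using scikit-learn; the "diffraction patterns" here are random stand-ins rather than simulated virus data, and the dimensions are illustrative.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X = rng.normal(size=(200, 64 * 64))   # flattened "diffraction patterns"
        y = rng.integers(0, 2, size=200)      # two particle classes

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
        clf.fit(X_tr, y_tr)
        print("test accuracy:", clf.score(X_te, y_te))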

    Views (last year): 6.
  10. Dobrynin V.N., Filozova I.A.
    Cataloging technology of information fund
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 661-673

    The article discusses an approach to improving information processing technology on the basis of the logical-semantic network (LSN) Question–Answer–Reaction, aimed at the formation and support of a catalog service providing efficient search for answers to questions.

    The basis of such a catalog service is semantic links reflecting the logic of the presentation of the author's thoughts within a given publication, theme, and subject area. Structuring and maintaining these links makes it possible to work with a field of meanings, providing new opportunities for studying the document corpus of digital libraries. Cataloging of the information fund includes: forming a lexical dictionary; forming the classification tree over several bases; classifying the information fund by question–answer topics; forming search queries adequate to the question–answer classification trees; automated search queries on thematic search engines; analysis of the responses to queries; and LSN catalog support during the operational phase (updating and refinement of the catalog). The technology is considered for two situations: 1) the information fund has already been formed; 2) the information fund is missing and must be created.

    Views (last year): 3.