Search results for 'data processing':
Articles found: 157
  1. Gankevich I.G., Degtyarev A.B.
    Efficient processing and classification of wave energy spectrum data with a distributed pipeline
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 517-520

    Processing of large amounts of data often consists of several steps, e.g. pre- and post-processing stages, which are executed sequentially with data written to disk after each step. However, when the pre-processing stage differs from task to task, a more efficient way of processing data is to construct a pipeline which streams data from one stage to another. In a more general case some processing stages can be factored into several parallel subordinate stages, thus forming a distributed pipeline where each stage can have multiple inputs and multiple outputs. Such a processing pattern emerges in the problem of classifying wave energy spectra based on analytic approximations, which can extract different wave systems and their parameters (e.g. wave system type, mean wave direction) from the spectrum. The distributed pipeline approach achieves good performance compared to conventional “sequential-stage” processing.
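    The stage-pipeline idea described above can be illustrated with a minimal Python sketch in which each stage consumes items from an input queue and streams results to the next stage instead of writing them to disk. The stage names and data are illustrative placeholders only, not the authors' code.

        from concurrent.futures import ThreadPoolExecutor
        from queue import Queue

        SENTINEL = object()  # marks the end of the stream

        def preprocess(spectrum):            # task-specific pre-processing stage
            return [x * 2 for x in spectrum]

        def classify(spectrum):              # hypothetical classification stage
            return ("swell" if sum(spectrum) > 10 else "wind sea", spectrum)

        def stage(func, inbox, outbox):
            while True:
                item = inbox.get()
                if item is SENTINEL:
                    outbox.put(SENTINEL)
                    break
                outbox.put(func(item))       # stream the result to the next stage

        if __name__ == "__main__":
            q_in, q_mid, q_out = Queue(), Queue(), Queue()
            with ThreadPoolExecutor(max_workers=2) as pool:
                pool.submit(stage, preprocess, q_in, q_mid)
                pool.submit(stage, classify, q_mid, q_out)
                for spectrum in ([1, 2, 3], [5, 6, 7]):
                    q_in.put(spectrum)
                q_in.put(SENTINEL)
                while (result := q_out.get()) is not SENTINEL:
                    print(result)

    In a distributed setting each stage could itself be split into several parallel workers fed from the same queue, which is the "multiple inputs and multiple outputs" pattern mentioned in the abstract.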

    Views (last year): 3. Citations: 2 (RSCI).
  2. Andreeva A.A., Anand M., Lobanov A.I., Nikolaev A.V., Panteleev M.A.
    Using extended ODE systems to investigate the mathematical model of the blood coagulation
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 931-951

    Many properties of the solutions of systems of ordinary differential equations are determined by the properties of the equations in variations. An ODE system which includes both the original nonlinear system and the equations in variations will be called an extended system below. When studying the properties of the Cauchy problem for systems of ordinary differential equations, the transition to extended systems allows one to study many subtle properties of solutions. For example, the transition to the extended system allows one to increase the order of approximation of numerical methods, gives approaches to constructing a sensitivity function without using numerical differentiation procedures, and allows one to use methods of increased convergence order for solving the inverse problem. The authors used the Broyden method, which belongs to the class of quasi-Newtonian methods. The Rosenbrock method with complex coefficients was used to solve the stiff systems of ordinary differential equations. In our case, it is equivalent to a second-order approximation method for the extended system.
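    For reference, the equations in variations mentioned above take the following textbook form (this formulation is standard and is not copied from the paper): for the original system
        \dot{u} = f(u, t), \qquad u(0) = u_0,
    the variations v = \partial u / \partial u_0 satisfy the linear system
        \dot{v} = \frac{\partial f}{\partial u}(u, t)\, v, \qquad v(0) = I,
    and the extended system integrates u and v together, which is what yields sensitivity functions without numerical differentiation and supports higher-order methods for the inverse problem.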

    As an example of the proposed approach, several related mathematical models of the blood coagulation process were considered. Based on the analysis of the numerical results, it was concluded that it is necessary to include a description of the factor XI positive feedback loop in the system of model equations. Estimates of some reaction constants based on the numerical solution of the inverse problem were given.

    The effect of factor V release on platelet activation was considered. The modification of the mathematical model made it possible to achieve quantitative agreement between the dynamics of thrombin production and experimental data for an artificial system. Based on the sensitivity analysis, the hypothesis was tested that the lipid membrane composition (the number of sites for the various clotting factors, except for thrombin sites) has no influence on the dynamics of the process.

  3. Guskov V.P., Gushchanskiy D.E., Kulabukhova N.V., Abrahamyan S.A., Balyan S.G., Degtyarev A.B., Bogdanov A.V.
    An interactive tool for developing distributed telemedicine systems
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 521-527

    Getting a qualified medical examination can be difficult for people in remote areas, because the available medical staff may be inaccessible or may lack expert knowledge at the proper level. Telemedicine technologies can help in such situations. On the one hand, such technologies allow highly qualified doctors to consult remotely, thereby improving the quality of diagnosis and treatment planning. On the other hand, computer-aided analysis of examination results, anamnesis and information on similar cases assists medical staff in their routine activities and decision-making.

    Creating a telemedicine system for a particular domain is a laborious process. It is not sufficient to select appropriate medical experts and fill the knowledge base of the analytical module. It is also necessary to organize the entire infrastructure of the system so that it meets the requirements for reliability, fault tolerance, protection of personal data and so on. Tools providing reusable infrastructure elements common to such systems can decrease the amount of work needed to develop telemedicine systems.

    An interactive tool for creating distributed telemedicine systems is described in the article. A list of requirements for such systems is presented, and structural solutions for meeting them are suggested. A composition of reusable elements applicable to distributed systems is also described. A cardiac telemedicine system is presented as the foundation of the tool.

    Views (last year): 3. Citations: 4 (RSCI).
  4. Kamenev G.K., Kamenev I.G.
    Multicriterial metric data analysis in human capital modelling
    Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1223-1245

    The article describes a model of a human in the informational economy and demonstrates a multicriteria optimization approach to the metric analysis of model-generated data. The traditional approach involves identifying the model from time series and then using it for prediction. However, this is not possible when some variables are not explicitly observed and only typical bounds or population features are known, which is often the case in the social sciences, making some models purely theoretical. To avoid this problem, we propose a method of metric data analysis (MMDA) for the identification and study of such models, based on the construction and analysis of Kolmogorov – Shannon metric nets of the general population in a multidimensional space of social characteristics. Using this method, the coefficients of the model are identified and the features of its phase trajectories are studied. In this paper, we describe a human according to his role in information processing, considering his awareness and cognitive abilities. We construct two lifetime indices of human capital: a creative one (generalizing cognitive abilities) and a productive one (generalizing the amount of information mastered by a person), and formulate the problem of their multi-criteria (two-criteria) optimization taking life expectancy into account. This approach allows us to identify and economically justify new requirements for the education system and the information environment of human existence. It is shown that a Pareto frontier exists in the optimization problem, and its type depends on the mortality rates: at high life expectancy there is one dominant solution, while for lower life expectancy there are different types of Pareto frontier. In particular, the Pareto principle applies to Russia: a significant increase in the creative human capital of an individual (summarizing his cognitive abilities) is possible at the cost of a small decrease in the productive human capital (summarizing awareness). It is shown that an increase in life expectancy makes the competence-based approach (focused on the development of cognitive abilities) optimal, while for low life expectancy the knowledge-based approach is preferable.
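    The two-criteria optimization mentioned above can be illustrated by a minimal Pareto-frontier filter; the sample points and index names below are invented placeholders, not the authors' data.

        # Minimal sketch: keep only the non-dominated points when both
        # criteria (e.g. a "creative" and a "productive" index) are maximized.
        def pareto_front(points):
            return [p for p in points
                    if not any(q[0] >= p[0] and q[1] >= p[1] and q != p
                               for q in points)]

        candidates = [(0.4, 0.9), (0.6, 0.7), (0.5, 0.5), (0.8, 0.3)]
        print(pareto_front(candidates))   # (0.5, 0.5) is dominated and dropped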

  5. The paper presents the results of applying a scheme of very high accuracy and resolution to obtain numerical solutions of the Navier – Stokes equations of a compressible gas describing the occurrence and development of instability of a two-dimensional laminar boundary layer on a flat plate. The peculiarity of the conducted studies is the absence of the commonly used artificial exciters of instability in the direct numerical simulation. The multioperator scheme used made it possible to observe the subtle effects of the birth of unstable modes and the complex nature of their development, presumably owing to its small approximation errors. A brief description of the scheme design and its main properties is given. The formulation of the problem and the method of obtaining initial data are described, which makes it possible to observe the established non-stationary regime fairly quickly. A technique is given that allows detecting flow fluctuations with amplitudes many orders of magnitude smaller than the mean flow values. A time-dependent picture of the appearance of packets of Tollmien – Schlichting waves of varying intensity in the vicinity of the leading edge of the plate and their downstream propagation is presented. The presented amplitude spectra, with peak values expanding in the downstream regions, indicate the excitation of new unstable modes other than those occurring in the vicinity of the leading edge. The analysis of the evolution of instability waves in time and space showed agreement with the main conclusions of the linear theory. The numerical solutions obtained seem to describe for the first time the complete scenario of the possible development of Tollmien – Schlichting instability, which often plays an essential role at the initial stage of the laminar-turbulent transition. They open up the possibility of full-scale numerical modeling of this process, which is extremely important for practice, including a similar study of the spatial boundary layer.
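    The detection of fluctuations many orders of magnitude smaller than the mean flow can be illustrated by a generic sketch: subtract the time mean and inspect the FFT amplitude spectrum. The signal below is synthetic, not data from the paper.

        import numpy as np

        dt = 1e-3
        t = np.arange(0.0, 1.0, dt)
        signal = 300.0 + 1e-6 * np.sin(2 * np.pi * 40.0 * t)   # tiny wave on a large mean

        fluct = signal - signal.mean()                    # isolate the fluctuation
        amp = np.abs(np.fft.rfft(fluct)) * 2 / len(fluct)
        freq = np.fft.rfftfreq(len(fluct), dt)
        print(freq[amp.argmax()], amp.max())              # ~40 Hz, amplitude ~1e-6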

  6. Timiryanova V.M., Lakman I.A., Larkin M.M.
    Retail forecasting on high-frequency depersonalized data
    Computer Research and Modeling, 2023, v. 15, no. 6, pp. 1713-1734

    Technological development determines the emergence of data highly detailed in time and space, which expands the possibilities of analysis, allowing us to consider consumer decisions and the competitive behavior of enterprises in all their diversity, taking into account the context of the territory and the characteristics of time periods. Despite the promise of such studies, they are still rare in the scientific literature, owing to a range of problems whose solution is considered in this paper. The article draws attention to the complexity of analyzing depersonalized high-frequency data and to the possibility of modeling consumption changes in time and space on their basis. The features of this new type of data are considered using real depersonalized data received from the fiscal data operator “First OFD” (JSC “Energy Systems and Communications”). It is shown that, along with the spectrum of problems inherent in high-frequency data, there are disadvantages associated with the process of generating data on the sellers’ side, which requires wider use of data mining tools. A series of statistical tests was carried out on the data under consideration, including a unit-root test, a test for unobserved individual effects, and tests for serial correlation and cross-sectional dependence in panels. The presence of spatial autocorrelation in the data was tested using modified Lagrange multiplier tests. The tests showed the presence of serial correlation and spatial dependence in the data, which makes it expedient to apply panel and spatial analysis methods to the high-frequency data accumulated by fiscal data operators. The constructed models made it possible to substantiate the spatial relationship of sales growth and its dependence on the day of the week. The limitation on increasing the predictive ability of the constructed models and on their further extension with explanatory factors was the lack of open-access statistics grouped with the required detail in time and space, which makes the formation of high-frequency, geographically structured databases relevant.
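    As an illustration of one test from the battery mentioned above, the following sketch runs an augmented Dickey–Fuller unit-root test (via statsmodels) on a synthetic sales series with a weekly pattern; the data are invented, and the paper's full set of panel and spatial tests is not reproduced here.

        import numpy as np
        from statsmodels.tsa.stattools import adfuller

        rng = np.random.default_rng(0)
        # synthetic daily sales with a weekly cycle and noise
        sales = 100 + 5 * np.sin(2 * np.pi * np.arange(365) / 7) + rng.normal(0, 2, 365)

        stat, pvalue, *_ = adfuller(sales)
        print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")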

  7. Fedorov A.A., Soshilov I.V., Loginov V.N.
    Augmented data routing algorithms for satellite delay-tolerant networks. Development and validation
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 983-993

    The problem of centralized planning of data transmission routes in delay-tolerant networks is considered. The original problem is extended with additional requirements on node storage and the communication process. First, it is assumed that the connection between the nodes of the graph is established using antennas. Second, it is assumed that each node has a storage of finite capacity. The existing works do not consider these requirements. It is assumed that the following information is available in advance: the messages to be processed, the network configuration at specified time points taken with a certain period, the time delays needed to orient the antennas for data transmission, and the restrictions on the amount of data storage on each satellite of the constellation. Two well-known algorithms — CGR and Earliest Delivery with All Queues — are improved to satisfy the extended requirements. The obtained algorithms solve the optimal message routing problem separately for each message. The problem of validating the algorithms under conditions of a lack of test data is considered as well. Possible approaches to validation based on qualitative conjectures are proposed and tested, and experiment results are described. A performance comparison of the two implementations of the problem-solving algorithms is made. Two algorithms named RDTNAS-CG and RDTNAS-AQ have been developed based on the CGR and Earliest Delivery with All Queues algorithms, respectively. The original algorithms have been significantly extended, and an augmented implementation has been developed. Validation experiments were carried out to check the minimum “quality” requirements for the correctness of the algorithms. A comparative analysis of the performance of the two algorithms showed that the RDTNAS-AQ algorithm is several orders of magnitude faster than RDTNAS-CG.
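    A much-simplified sketch of the earliest-delivery idea over a contact plan, with a storage check in the spirit of the extended requirements, is given below; it is not the RDTNAS-CG or RDTNAS-AQ implementation, and the contact plan is invented.

        import heapq

        def earliest_delivery(contacts, source, dest, t0, storage_free):
            """Contacts are (sender, receiver, start, end); returns the earliest
            arrival time at dest, skipping receivers with no free storage."""
            best = {source: t0}
            heap = [(t0, source)]
            while heap:
                t, node = heapq.heappop(heap)
                if node == dest:
                    return t
                for snd, rcv, start, end in contacts:
                    if snd != node or end < t:
                        continue                      # contact unusable from here
                    if storage_free.get(rcv, 0) <= 0:
                        continue                      # receiver storage exhausted
                    arrival = max(t, start)           # wait for the contact window
                    if arrival <= end and arrival < best.get(rcv, float("inf")):
                        best[rcv] = arrival
                        heapq.heappush(heap, (arrival, rcv))
            return None

        contacts = [("A", "B", 0, 10), ("B", "C", 5, 20), ("A", "C", 30, 40)]
        print(earliest_delivery(contacts, "A", "C", 0, {"B": 1, "C": 1}))   # -> 5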

  8. Kiryanov A.K.
    GridFTP frontend with redirection for DMlite
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 543-547

    One of the most widely used storage solutions in WLCG is the Disk Pool Manager (DPM) developed and supported by the SDC/ID group at CERN. Recently DPM went through a massive overhaul to address scalability and extensibility issues of the old code.

    The new system was called DMLite. Unlike the old DPM, which was based on daemons, DMLite is arranged as a library that can be loaded directly by an application. This approach greatly improves performance and transaction rate by avoiding unnecessary inter-process communication over the network as well as threading bottlenecks.

    DMLite has a modular architecture, with its core library providing only the very basic functionality. Backends (storage engines) and frontends (data access protocols) are implemented as plug-in modules. Undoubtedly, DMLite would not be able to completely replace DPM without GridFTP, as the latter is used for most of the data transfers in WLCG.

    In DPM, GridFTP support was implemented as a Data Storage Interface (DSI) module for Globus’ GridFTP server. In DMLite, an effort was made to rewrite the GridFTP module from scratch in order to take advantage of new DMLite features and also to implement new functionality. The most important improvement over the old version is the redirection capability.

    With the old GridFTP frontend, a client needed to contact SRM on the head node in order to obtain a transfer URL (TURL) before reading or writing a file. With the new GridFTP frontend this is no longer necessary: a client may connect directly to the GridFTP server on the head node and perform file I/O using only logical file names (LFNs). The data channel is then automatically redirected to the proper disk node.

    This renders the most often used part of SRM unnecessary, simplifies file access and improves performance. It also makes DMLite a more appealing choice for non-LHC VOs that were never much interested in SRM.

    With the new GridFTP frontend it is also possible to access data on various DMLite-supported backends such as HDFS, S3 and legacy DPM.
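    The difference between the two access flows can be sketched schematically; all functions below are stand-in stubs and are not the DMLite, DPM or Globus GridFTP API.

        def srm_get_turl(lfn):
            return "gsiftp://disk-node/" + lfn        # stub: SRM resolves an LFN to a TURL

        def gridftp_read(url_or_lfn):
            return f"<contents of {url_or_lfn}>"      # stub: data channel transfer

        def old_flow(lfn):
            turl = srm_get_turl(lfn)                  # extra SRM round trip first
            return gridftp_read(turl)

        def new_flow(lfn):
            # connect straight to the head-node GridFTP server with the LFN;
            # the server redirects the data channel to the proper disk node
            return gridftp_read(lfn)

        print(old_flow("/dteam/file1"), new_flow("/dteam/file1"))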

    Views (last year): 1.
  9. Podryga V.O., Polyakov S.V.
    3D molecular dynamic simulation of thermodynamic equilibrium problem for heated nickel
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 573-579

    This work is devoted to molecular dynamics modeling of thermal impact processes on a metal sample consisting of nickel atoms. For the solution of this problem, a continuous mathematical model based on the equations of classical Newtonian mechanics has been used; a numerical method based on the Verlet scheme has been chosen; a parallel algorithm has been proposed, and its implementation with the MPI and OpenMP technologies has been carried out. By means of the developed parallel program, the thermodynamic equilibrium of a system of nickel atoms under the conditions of heating the sample to a desired temperature has been investigated. In numerical experiments, both the optimal parameters of the calculation procedure and the physical parameters of the analyzed process have been determined. The obtained numerical results correspond well to known theoretical and experimental data.
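    For reference, the Verlet scheme mentioned here can be illustrated by a minimal one-dimensional velocity-Verlet sketch with a toy harmonic force; this is not the parallel MPI/OpenMP nickel simulation itself.

        def force(x):
            return -x                                # toy force, stands in for the interatomic potential

        def velocity_verlet(x, v, dt, steps):
            f = force(x)
            for _ in range(steps):
                x += v * dt + 0.5 * f * dt * dt      # position update
                f_new = force(x)
                v += 0.5 * (f + f_new) * dt          # velocity update with the averaged force
                f = f_new
            return x, v

        print(velocity_verlet(x=1.0, v=0.0, dt=0.001, steps=10000))   # ~ (cos 10, -sin 10)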

    Views (last year): 2.
  10. Reed R.G., Cox M.A., Wrigley T., Mellado B.
    A CPU benchmarking characterization of ARM based processors
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 581-586

    Big science projects are producing data at ever-increasing rates. Typical techniques involve storing the data to disk, after minor filtering, and then processing it in large computer farms. Data production has reached a point where on-line processing is required in order to filter the data down to manageable sizes. A potential solution involves using low-cost, low-power ARM processors in large arrays to provide massive parallelisation for data stream computing (DSC). The main advantage of using Systems on Chip (SoCs) is inherent in their design philosophy. SoCs are primarily used in mobile devices and hence consume less power while maintaining relatively good performance. A benchmarking characterisation of three different models of ARM processors will be presented.

    Views (last year): 1.