Search results for 'computer technology':
Articles found: 69
  1. Guskov V.P., Gushchanskiy D.E., Kulabukhova N.V., Abrahamyan S.A., Balyan S.G., Degtyarev A.B., Bogdanov A.V.
    An interactive tool for developing distributed telemedicine systems
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 521-527

    Getting a qualified medical examination can be difficult for people in remote areas, because the available medical staff may be inaccessible or may lack expert knowledge at the proper level. Telemedicine technologies can help in such situations. On the one hand, such technologies allow highly qualified doctors to consult remotely, thereby increasing the quality of diagnosis and treatment planning. On the other hand, computer-aided analysis of the research results, anamnesis and information on similar cases assists medical staff in their routine activities and decision-making.

    Creating a telemedicine system for a particular domain is a laborious process. It is not sufficient to select suitable medical experts and to fill the knowledge base of the analytical module; it is also necessary to organize the entire infrastructure of the system so that it meets requirements for reliability, fault tolerance, protection of personal data and so on. Tools with reusable infrastructure elements common to such systems can reduce the amount of work needed to develop telemedicine systems.

    The article describes an interactive tool for creating distributed telemedicine systems. A list of requirements for such systems is presented, and structural solutions for meeting these requirements are suggested. A composition of such elements applicable to distributed systems is described. A cardiac telemedicine system is described as a foundation of the tool.

    Views (last year): 3. Citations: 4 (RSCI).
  2. Kholodkov K.I., Aleshin I.M.
    Exact calculation of a posteriori probability distribution with distributed computing systems
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 539-542

    We present a specific grid infrastructure together with the development and deployment of a web application. The purpose of the infrastructure and the web application is to solve particular geophysical problems that require heavy computational resources. Here we cover a technology overview and the internals of the connector framework. The connector framework links problem-specific routines with middleware in such a way that the application developer does not have to be aware of any particular grid software. That is, a web application built with this framework acts as an interface between the user's web browser and the Grid's own middleware.

    Our distributed computing system is built around the Gridway metascheduler. The metascheduler is connected to TORQUE resource managers of virtual compute nodes that run atop a compute cluster using virtualization technology. This approach offers several notable features that are unavailable to bare-metal compute clusters.

    The first application we have integrated with our framework is the determination of seismic anisotropy parameters by inversion of SKS and converted phases. We used a probabilistic approach to the inverse problem based on the a posteriori probability distribution function (APDF) formalism. To obtain the exact solution of the problem we have to compute the values of a multidimensional function. In our implementation we used brute-force APDF calculation on a rectangular grid across the parameter space.

    The results of the computation are stored in a relational DBMS and then presented in a familiar human-readable form. The application provides several instruments for analyzing the shape of the function from the computational results: maximum value distribution, 2D cross-sections of the APDF, 2D marginals and a few other tools. During the tests we ran the application against both synthetic and observed data.
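
    A minimal sketch of the brute-force approach described above, under assumed details (a synthetic two-parameter forward model and a Gaussian misfit; the paper's actual SKS inversion is not reproduced): the APDF is evaluated at every node of a rectangular grid, normalized, and reduced to the MAP point and a 1-D marginal. Every grid node is independent, which is why the calculation maps naturally onto distributed grid jobs.

        import numpy as np

        # rectangular grid over a toy 2-D parameter space
        p1 = np.linspace(0.0, 1.0, 201)        # e.g. anisotropy strength
        p2 = np.linspace(0.0, 180.0, 181)      # e.g. fast-axis azimuth, degrees
        P1, P2 = np.meshgrid(p1, p2, indexing="ij")

        observed, sigma = 0.42, 0.05           # toy observation and its uncertainty

        def forward(a, phi):                   # placeholder forward model (an assumption)
            return a * np.cos(np.radians(phi)) ** 2

        # un-normalized posterior = Gaussian likelihood x uniform prior, at every node
        misfit = (forward(P1, P2) - observed) ** 2
        apdf = np.exp(-0.5 * misfit / sigma ** 2)
        apdf /= apdf.sum() * (p1[1] - p1[0]) * (p2[1] - p2[0])   # normalize on the grid

        # diagnostics of the kind mentioned above: maximum and a 1-D marginal
        i, j = np.unravel_index(np.argmax(apdf), apdf.shape)
        marg_p1 = apdf.sum(axis=1) * (p2[1] - p2[0])
        print("MAP estimate:", p1[i], p2[j])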

    Views (last year): 3.
  3. Minkin A.S., Knizhnik A.A., Potapkin B.V.
    OpenCL realization of some many-body potentials
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 549-558

    Modeling of carbon nanostructures by means of classical molecular dynamics requires a lot of computation. One way to improve the performance of the basic algorithms is to adapt them to SIMD-type computing systems, such as systems with dedicated GPUs. In this work we describe the development of algorithms for computing many-body interactions based on the Tersoff and embedded-atom potentials by means of OpenCL technology. The OpenCL standard provides universality and portability of the algorithms and can be successfully used to develop software for heterogeneous computing systems. The performance of the algorithms is evaluated on CPU and GPU hardware platforms. It is shown that concurrent memory writes are effective for the Tersoff bond-order potential, while the same approach for the embedded-atom potential is slower than the algorithm without concurrent memory access. Performance evaluation shows a significant GPU acceleration of the energy-force evaluation algorithms for many-body potentials in comparison with the corresponding serial implementations.
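
    A hedged illustration of the "without concurrent memory access" variant mentioned above: one OpenCL work-item per atom, each accumulating and writing only its own energy. The sketch uses pyopencl, and the pair term inside the kernel is a placeholder, not the actual Tersoff or embedded-atom expression.

        import numpy as np
        import pyopencl as cl

        kernel_src = """
        __kernel void atom_energy(__global const float4 *pos, __global float *energy, const int n)
        {
            int i = get_global_id(0);
            if (i >= n) return;
            float e = 0.0f;
            for (int j = 0; j < n; ++j) {          // loop over all other atoms
                if (j == i) continue;
                float4 d = pos[j] - pos[i];
                float r = sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
                e += exp(-r);                      // placeholder pair term, not Tersoff/EAM
            }
            energy[i] = 0.5f * e;                  // each work-item writes only its own slot
        }
        """

        ctx = cl.create_some_context()
        queue = cl.CommandQueue(ctx)
        prg = cl.Program(ctx, kernel_src).build()

        n = 1024
        pos = np.random.rand(n, 4).astype(np.float32)     # xyz coordinates, w unused
        energy = np.zeros(n, dtype=np.float32)

        mf = cl.mem_flags
        pos_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=pos)
        e_buf = cl.Buffer(ctx, mf.WRITE_ONLY, energy.nbytes)

        prg.atom_energy(queue, (n,), None, pos_buf, e_buf, np.int32(n))
        cl.enqueue_copy(queue, energy, e_buf)
        print("total energy:", energy.sum())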

    Views (last year): 4. Citations: 1 (RSCI).
  4. Bogdanov A.V., Thurein Kyaw L.
    Query optimization in relational database systems and cloud computing technology
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 649-655

    The optimizer is the heart of a relational database management system (DBMS). It analyzes SQL statements and determines the most efficient access plan to satisfy each query request. To do so, it examines the SQL statements and the tables and columns they reference, and then consults the information and statistical data stored in the system catalogue to determine the best way of performing the work required to satisfy the query.
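
    A toy sketch of the idea described above; the cost model, statistics and plan alternatives are invented for illustration and are not taken from any particular DBMS. The optimizer estimates the cost of candidate access plans from catalogue statistics and picks the cheapest one.

        from dataclasses import dataclass

        @dataclass
        class TableStats:
            rows: int        # number of rows in the table
            pages: int       # number of data pages on disk
            distinct: int    # distinct values in the filtered column

        def choose_access_plan(stats: TableStats, has_index: bool) -> str:
            selectivity = 1.0 / stats.distinct      # classic uniform-distribution estimate
            matching_rows = stats.rows * selectivity
            full_scan_cost = stats.pages            # read every data page once
            index_scan_cost = 2 + matching_rows     # descend the index, then fetch matching rows
            if has_index and index_scan_cost < full_scan_cost:
                return f"index scan (estimated cost {index_scan_cost:.1f})"
            return f"full table scan (estimated cost {full_scan_cost:.1f})"

        # e.g. SELECT * FROM orders WHERE customer_id = 42
        print(choose_access_plan(TableStats(rows=1_000_000, pages=20_000, distinct=50_000), True))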

    Views (last year): 1.
  5. Bondyakov A.S.
    Basic directions of information technology in National Academy of Sciences of Azerbaijan
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 657-660

    Grid is a new type of computing infrastructure that is being intensively developed in today's world of information technology. Grid provides global integration of information and computing resources. The essence of the GRID conception in Azerbaijan is to create a set of standardized services that provide reliable, compatible, inexpensive and secure access to geographically distributed high-tech information and computing resources: individual computers, clusters and supercomputing centers, information storage, networks, scientific instruments, etc.

    Views (last year): 6. Citations: 1 (RSCI).
  6. Ustimenko O.V.
    Features DIRAC data management
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 741-744

    The report presents an analysis of Big Data storage solutions in several directions. The purpose of this paper is to introduce Big Data storage technology and the prospects of storage technologies, using the DIRAC software as an example. DIRAC is a software framework for distributed computing.

    The report considers popular storage technologies and lists their limitations. The main problems are the storage of large volumes of data, insufficient processing quality, limited scalability, the lack of rapid availability and the lack of intelligent data retrieval.

    Experimental computing tasks impose a wide range of requirements in terms of CPU usage, data access and memory consumption, and have an unstable profile of resource use over time. The DIRAC Data Management System (DMS), together with the DIRAC Storage Management System (SMS), provides the necessary functionality to execute and control all data-related activities.

    Views (last year): 2.
  7. Degtyarev A.B., Myo Min S., Wunna K.
    Cloud computing for virtual testbed
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 753-758

    Nowadays cloud computing is an important topic in the field of information technology and computer systems. Several companies and educational institutions have deployed cloud infrastructures to address problems such as easy data access, software updates at minimal cost, large or unlimited storage, cost efficiency, backup storage and disaster recovery, among other benefits compared with traditional network infrastructures. The paper presents a study of cloud computing technology for marine environmental data and its processing. Cloud computing of marine environment information is proposed for the integration and sharing of marine information resources. It is highly desirable to perform empirical studies requiring numerous interactions with web servers and transfers of very large archival data files without affecting the operational information system infrastructure. In this paper, we consider cloud computing for a virtual testbed in order to minimize the cost of the associated real-time infrastructure.

    Views (last year): 7.
  8. Ershov N.M.
    Non-uniform cellular genetic algorithms
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 775-780

    In this paper, we introduce the concept of a non-uniform cellular genetic algorithm, in which a number of parameters that affect the operation of the genetic operators depend on the location of the cell in the given cellular space. The results of a numerical comparison of non-uniform cellular genetic algorithms with standard genetic algorithms are presented, showing the advantages of the proposed approach when minimizing multimodal functions with a large number of local extrema. A coarse-grained parallel implementation of the non-uniform algorithms using MPI technology is considered.
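
    A minimal sequential sketch of a non-uniform cellular genetic algorithm under assumed details (Rastrigin test function, mutation strength varying linearly across the lattice, von Neumann neighbourhood): each individual occupies a cell of a 2-D grid, mates only with its neighbours, and the mutation parameter depends on the cell's position.

        import numpy as np

        GRID, DIM, STEPS = 16, 10, 200
        rng = np.random.default_rng(0)

        def rastrigin(x):                      # multimodal test function with many local extrema
            return 10 * x.shape[-1] + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x), axis=-1)

        pop = rng.uniform(-5.12, 5.12, (GRID, GRID, DIM))
        # cell-dependent parameter: mutation scale grows from 0.01 to 0.5 across the lattice
        mut = np.linspace(0.01, 0.5, GRID)[:, None].repeat(GRID, axis=1)

        for _ in range(STEPS):
            fit = rastrigin(pop)
            new_pop = pop.copy()
            for i in range(GRID):
                for j in range(GRID):
                    # pick the best von Neumann neighbour (periodic boundary) as a mate
                    neigh = [((i - 1) % GRID, j), ((i + 1) % GRID, j),
                             (i, (j - 1) % GRID), (i, (j + 1) % GRID)]
                    mate = min(neigh, key=lambda c: fit[c])
                    # uniform crossover + position-dependent Gaussian mutation
                    mask = rng.random(DIM) < 0.5
                    child = np.where(mask, pop[i, j], pop[mate])
                    child += rng.normal(0.0, mut[i, j], DIM)
                    if rastrigin(child) < fit[i, j]:     # replace parent only if improved
                        new_pop[i, j] = child
            pop = new_pop

        best = np.unravel_index(np.argmin(rastrigin(pop)), (GRID, GRID))
        print("best fitness:", rastrigin(pop[best]))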

    Views (last year): 9. Citations: 3 (RSCI).
  9. Ershov N.M., Popova N.N.
    Natural models of parallel computations
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 781-785

    The course “Natural models of parallel computations”, given for senior students of the Faculty of Computational Mathematics and Cybernetics of Moscow State University, is devoted to the supercomputer implementation of natural computational models and is, in fact, an introduction to the theory of natural computing, a relatively new branch of science formed at the intersection of mathematics, computer science and the natural sciences (especially biology). The topics of natural computing include both already classic subjects, such as cellular automata, and relatively new ones introduced in the last 10–20 years, such as swarm intelligence. Despite their biological origin, all these models are widely applied in fields related to computer data processing. Research in the field of natural computing is closely related to the issues and technology of parallel computing. The presentation of the theoretical material is accompanied by a consideration of possible parallel computing schemes; in the practical part of the course, the students are expected to produce a software implementation using MPI technology and to carry out numerical experiments investigating the effectiveness of the chosen parallel computing schemes.
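
    As an example of the kind of MPI exercise mentioned above (the concrete task is an assumption, not taken from the course materials): a minimal mpi4py sketch of a 1-D elementary cellular automaton split into strips across ranks, with one ghost cell exchanged per side at each step.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        local_n = 1000                                       # cells owned by this rank
        cells = np.zeros(local_n + 2, dtype=np.uint8)        # +2 ghost cells at the ends
        cells[1:-1] = np.random.randint(0, 2, local_n).astype(np.uint8)

        left, right = (rank - 1) % size, (rank + 1) % size   # periodic global domain
        rule = np.unpackbits(np.array([110], dtype=np.uint8))[::-1]   # rule 110 lookup table

        for step in range(100):
            # exchange boundary cells with neighbouring ranks
            comm.Sendrecv(cells[1:2], dest=left, recvbuf=cells[-1:], source=right)
            comm.Sendrecv(cells[-2:-1], dest=right, recvbuf=cells[0:1], source=left)
            # local update of the interior cells
            l, c, r = cells[:-2], cells[1:-1], cells[2:]
            cells[1:-1] = rule[(l << 2) | (c << 1) | r]

        print(f"rank {rank}: {int(cells[1:-1].sum())} live cells")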

    Views (last year): 17. Citations: 2 (RSCI).