Storage is an essential and expensive part of cloud computation, both in terms of network requirements and data access organization, so the choice of storage architecture can be crucial for any application. In this article we look at the types of cloud architectures for data processing and data storage based on proven enterprise storage technology. The advantage of cloud computing is the ability to virtualize and share resources among different applications for better server utilization. We discuss and evaluate distributed data processing, database architectures for cloud computing, and database querying both on the local network and under real-time conditions.
-
Improvement of computational abilities in computing environments with virtualization technologies
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 499-504
In this paper we illustrate ways to improve the capabilities of computing environments by combining virtualization, single system image (SSI) and hypervisor technologies. Recently, cloud computing has become popular as a new service concept for providing various services to users, such as multimedia sharing, online office software, games and online storage. Cloud computing brings together multiple computers and servers in a single environment designed to address certain types of tasks, such as scientific problems or complex calculations. By using virtualization technologies, a cloud computing environment is able to virtualize and share resources among different applications with the objective of better server utilization, load balancing and effectiveness.
-
Efficient processing and classification of wave energy spectrum data with a distributed pipeline
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 517-520
Processing of large amounts of data often consists of several steps, e.g. pre- and post-processing stages, which are executed sequentially with data written to disk after each step. However, when the pre-processing stage differs for each task, a more efficient way of processing data is to construct a pipeline which streams data from one stage to another. In a more general case some processing stages can be factored into several parallel subordinate stages, thus forming a distributed pipeline where each stage can have multiple inputs and multiple outputs. Such a processing pattern emerges in the problem of classification of wave energy spectra based on analytic approximations which can extract different wave systems and their parameters (e.g. wave system type, mean wave direction) from a spectrum. The distributed pipeline approach achieves good performance compared to conventional "sequential-stage" processing.
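As a rough, hypothetical illustration of this pattern (not the authors' implementation), the Python sketch below streams items through a stage that has been factored into several parallel workers; the toy classification step and all names are our own assumptions.

    import threading, queue

    STOP = object()  # end-of-stream marker shared by all stages

    def stage(func, inbox, outbox):
        # One pipeline stage: consume items as they arrive, transform, pass on.
        for item in iter(inbox.get, STOP):
            outbox.put(func(item))

    def run_pipeline(items, transform, n_workers=4):
        src, dst = queue.Queue(), queue.Queue()
        workers = [threading.Thread(target=stage, args=(transform, src, dst))
                   for _ in range(n_workers)]
        for w in workers:
            w.start()
        for it in items:          # stream the data in, no intermediate files
            src.put(it)
        for _ in workers:         # one STOP marker per worker
            src.put(STOP)
        for w in workers:
            w.join()
        return [dst.get() for _ in range(dst.qsize())]

    # Toy usage: "classify" items in parallel while they stream through
    # (output order may vary).
    print(run_pipeline(range(8), lambda x: ("swell" if x % 2 else "wind", x)))

In a genuinely distributed setting the in-process queues would be replaced by network channels, but the stage/fan-out structure stays the same.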
-
An interactive tool for developing distributed telemedicine systems
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 521-527
Getting a qualified medical examination can be difficult for people in remote areas, where the available medical staff may be inaccessible or may lack expert knowledge at the proper level. Telemedicine technologies can help in such situations. On the one hand, such technologies allow highly qualified doctors to consult remotely, thereby increasing the quality of diagnosis and treatment planning. On the other hand, computer-aided analysis of examination results, anamnesis and information on similar cases assists medical staff in their routine activities and decision-making.
Creating a telemedicine system for a particular domain is a laborious process. It is not sufficient to select the proper medical experts and to fill the knowledge base of the analytical module; it is also necessary to organize the entire infrastructure of the system to meet requirements for reliability, fault tolerance, protection of personal data and so on. Tools with reusable infrastructure elements, which are common to such systems, can decrease the amount of work needed to develop telemedicine systems.
The article describes an interactive tool for creating distributed telemedicine systems. A list of requirements for such systems is presented, and structural solutions for meeting these requirements are suggested, along with a composition of such elements applicable to distributed systems. A cardiac telemedicine system, which serves as the foundation of the tool, is also described.
-
Exact calculation of a posteriori probability distribution with distributed computing systems
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 539-542
We present the development and deployment of a specific grid infrastructure and web application whose purpose is to solve particular geophysical problems requiring heavy computational resources. Here we give a technology overview and cover the connector framework internals. The connector framework links problem-specific routines with middleware in such a manner that the application developer does not have to be aware of any particular grid software. That is, a web application built with this framework acts as an interface between the user's web browser and the grid's own middleware.
Our distributed computing system is built around the GridWay metascheduler. The metascheduler is connected to TORQUE resource managers of virtual compute nodes that run atop a compute cluster using virtualization technology. Such an approach offers several notable features that are unavailable to bare-metal compute clusters.
The first application we integrated with our framework is the determination of seismic anisotropy parameters by inversion of SKS and converted phases. We used a probabilistic approach to the inverse problem solution based on the a posteriori probability distribution function (APDF) formalism. To get the exact solution of the problem we have to compute the values of a multidimensional function. In our implementation we used brute-force APDF calculation on a rectangular grid across the parameter space.
The result of the computation is stored in a relational DBMS and then presented in a familiar human-readable form. The application provides several instruments for analyzing the function's shape from the computational results: maximum value distribution, 2D cross-sections of the APDF, 2D marginals and a few other tools. During the tests we ran the application against both synthetic and observed data.
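As a rough illustration of the brute-force step, the sketch below evaluates a hypothetical a posteriori density on a rectangular grid over a two-dimensional parameter space. The misfit() function, the grid ranges and the parameter names (phi, dt) are our own assumptions, not the paper's implementation.

    import numpy as np

    def misfit(phi, dt):
        # Hypothetical quadratic misfit standing in for the real comparison
        # of observed and synthetic SKS / converted-phase data.
        return (phi - 0.6) ** 2 / 0.1 + (dt - 1.2) ** 2 / 0.2

    # Rectangular grid across a 2-D parameter space
    # (fast-axis angle phi, splitting delay time dt).
    phi = np.linspace(0.0, np.pi, 181)
    dt = np.linspace(0.0, 4.0, 201)
    PHI, DT = np.meshgrid(phi, dt, indexing="ij")

    # Brute-force evaluation of the unnormalized APDF ~ exp(-misfit / 2),
    # then normalization so the density integrates to 1 over the grid.
    apdf = np.exp(-0.5 * misfit(PHI, DT))
    apdf /= apdf.sum() * (phi[1] - phi[0]) * (dt[1] - dt[0])

    # Maximum a posteriori point and a 1-D marginal over delay time,
    # analogous to the analysis tools mentioned in the abstract.
    i, j = np.unravel_index(np.argmax(apdf), apdf.shape)
    print("MAP estimate:", phi[i], dt[j])
    marginal_dt = apdf.sum(axis=0) * (phi[1] - phi[0])

On a grid each point is independent of the others, which is what makes the brute-force calculation easy to distribute across compute nodes.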
-
Running Parameter Sweep applications on Everest cloud platform
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 601-606
Parameter sweep applications are an important class of applications, typically defined as a set of computational experiments over a set of input parameters, each experiment executed with its own parameter combination. Such computations arise in many scientific contexts. This article introduces the Parameter Sweep web service that runs such applications in a distributed computing environment. Also discussed is the Everest cloud platform, on which this service is built.
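For illustration only, the sketch below shows the general shape of a parameter sweep in Python; the experiment() function and its parameters are hypothetical, and the actual Everest service is driven through its own interface rather than code like this.

    from itertools import product
    from concurrent.futures import ProcessPoolExecutor

    def experiment(alpha, n):
        # Hypothetical computational experiment for one parameter combination.
        return alpha * sum(range(n))

    def sweep(alphas, sizes):
        # One independent run per point of the Cartesian parameter grid.
        combos = list(product(alphas, sizes))
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(experiment, *zip(*combos)))
        return dict(zip(combos, results))

    if __name__ == "__main__":
        for combo, value in sweep([0.1, 0.5, 1.0], [10, 100]).items():
            print(combo, "->", value)

Because the experiments are mutually independent, the same structure maps directly onto a pool of distributed workers instead of local processes.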
-
Cataloging technology for an information fund
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 661-673
The article discusses an approach to improving information processing technology on the basis of the Question–Answer–Reaction logical-semantic network (LSN), aimed at forming and supporting a catalog service that provides efficient search of answers to questions.
The basis of such a catalog service is semantic links reflecting the logic of the author's reasoning within a given publication, theme or subject area. Structuring and maintaining these links makes it possible to work with a field of meanings, providing new opportunities for studying the corpus of digital library documents. Cataloging of the information fund includes: formation of a lexical dictionary; formation of the classification tree over several bases; classification of the information fund by question–answer topics; formation of search queries adequate to the question–answer classification trees; automated execution of search queries on thematic search engines; analysis of the responses to queries; and LSN catalog support during the operational phase (updating and refinement of the catalog). The technology is considered for two situations: 1) the information fund has already been formed; 2) the information fund does not yet exist and must be created.
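As a small, hypothetical illustration of the first cataloging step only (formation of a lexical dictionary), the sketch below builds a document-frequency dictionary from plain-text documents; it does not attempt to model the LSN itself, and all names are our own.

    from collections import Counter
    import re

    def lexical_dictionary(documents):
        # Document frequency: in how many documents each lexeme occurs.
        # The token pattern covers Latin and Cyrillic words.
        df = Counter()
        for text in documents:
            tokens = set(re.findall(r"[a-zA-Z\u0400-\u04FF]+", text.lower()))
            df.update(tokens)
        return df

    docs = ["Cloud storage and data processing.",
            "Storage architectures for data access."]
    print(lexical_dictionary(docs).most_common(3))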
-
Choice of design of transcatheter aortic valve prosthesis frame based on finite element analysis
Computer Research and Modeling, 2015, v. 7, no. 4, pp. 909-922
This article presents an analysis of the impact of the transcatheter prosthesis frame design features on the results of its implantation in an aortic root model. We analyzed various approaches to the design of such structures, as well as modifications intended to improve their functional characteristics during implantation. The finite element method with a nonlinear material description was used as the general method for modeling the interaction of the objects and analyzing the main parameters: the stress-strain state, radial force and friction force.
-
Numerical simulation of the effect of adhesive technology in the tooth root canal on restoration properties
Computer Research and Modeling, 2015, v. 7, no. 5, pp. 1069-1079
The aim of the present study is to show how engineering approaches and ideas work in clinical restorative dentistry, in particular, how they affect the restoration design and the durability of restored endodontically treated teeth. For these purposes a 3D computational model of a first incisor, including hard tooth tissues, the periodontal ligament, surrounding bone structures and the restoration itself, has been constructed and numerically simulated for a variety of restoration designs under normal chewing loads. The effect of different adhesive technologies in the root canal on the functional characteristics of a restored tooth has been studied. The 3D model could be applied in preclinical diagnostics to determine the areas of possible fracture of a restored tooth and to prognosticate its longevity.
-
Neumann's method to solve boundary problems of elastic thin shells
Computer Research and Modeling, 2015, v. 7, no. 6, pp. 1143-1153
This paper studies the possibility of using Neumann's method to solve boundary problems of elastic thin shells. The variational statement of static problems for shells allows the problems to be examined in the space of distributions. Convergence of Neumann's method is proved for shells with holes when the boundary of the domain is not completely fixed. Numerical implementation of Neumann's method normally takes a long time before reliable results are achieved. This paper suggests a way to improve the convergence of the process that allows for parallel computation and verification during the calculations.
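For context, a minimal schematic of a Neumann-series iteration in our own notation (the paper's operator form is not reproduced here): if the variational problem is reduced to an operator equation u = Bu + g with a contraction B, the successive approximations converge geometrically:

    u_{n+1} = B u_n + g, \qquad u_0 = g,
    \qquad u_n \to u^{*} = \sum_{k=0}^{\infty} B^{k} g,
    \qquad \| u_n - u^{*} \| \le \frac{\|B\|^{n+1}}{1 - \|B\|}\, \|g\| .

Speeding up convergence then amounts to reducing the effective norm of B, and since each iteration consists of a single application of B, it is that application which offers the natural opportunity for parallel computation.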