Search results for 'experiment':
Articles found: 212
  1. Andreeva A.A., Nikolaev A.V., Lobanov A.I.
    Analysis of point model of fibrin polymerization
    Computer Research and Modeling, 2017, v. 9, no. 2, pp. 247-258

    Functional modeling of blood clotting and fibrin-polymer mesh formation is of significant value for medical and biophysical applications. Despite some discrepancies present in simplified functional models, their results are of great interest to experimental science as a handy analysis tool for research planning, data processing and verification. Given good correspondence with experiment, functional models can be used as an element of medical treatment methods and biophysical technologies. The aim of the paper is to model a point system of fibrin-polymer formation as a multistage polymerization process with a sol-gel transition at the final stage. The complex-valued Rosenbrock method of second order (CROS) was used for the computational experiments. The results of the computational experiments are presented and discussed. It was shown that in the physiological range of the model coefficients there is a lag period of approximately 20 seconds between initiation of the reaction and fibrin gel appearance, which fits experimental observations of fibrin polymerization dynamics well. The possibility of a number of consecutive $(n = 1–3)$ sol-gel transitions was demonstrated as well. This specific behavior is a consequence of the multistage nature of the fibrin polymerization process. At the final stage the solution of fibrin oligomers of length 10 can reach a semidilute state, leading to extremely fast gel formation controlled by the oligomers' rotational diffusion. Otherwise, if the semidilute state is not reached, gel formation is controlled by the significantly slower process of translational diffusion. This duality in the sol-gel transition led the authors to introduce a switch function in the equation for fibrin-polymer formation kinetics. Consecutive polymerization events can correspond to experimental systems in which the formed fibrin mesh is withdrawn from the volume by some physical process such as precipitation. The sensitivity analysis of the presented system shows that the dependence on the first-stage polymerization reaction constant is non-trivial.

    Views (last year): 8.
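The CROS scheme named in the abstract can be illustrated compactly. The sketch below is an assumption about the method family (the one-stage complex Rosenbrock scheme applied to a generic stiff test equation), not the authors' polymerization code; the function names and the test problem y' = -100y are hypothetical.

```python
import numpy as np

def cros_step(f, jac, y, h):
    """One step of the one-stage complex Rosenbrock (CROS) scheme:
    solve (I - (1+i)/2 * h * J) w = f(y), then y_next = y + h * Re(w).
    The complex coefficient strongly damps stiff components, so stiff
    problems tolerate steps far beyond explicit-method limits."""
    n = y.size
    A = np.eye(n, dtype=complex) - 0.5 * (1 + 1j) * h * jac(y)
    w = np.linalg.solve(A, f(y).astype(complex))
    return y + h * w.real

# Stiff linear test problem y' = -100*y, y(0) = 1.
f = lambda y: -100.0 * y
jac = lambda y: np.array([[-100.0]])
y, h, t = np.array([1.0]), 0.05, 0.0   # h*|lambda| = 5: explicit Euler would blow up
for _ in range(20):
    y = cros_step(f, jac, y, h)
    t += h
print(y[0])   # small and positive: monotone decay, matching the exact solution
```

For the polymerization system of the paper, `f` and `jac` would be the multistage reaction right-hand side and its Jacobian; only the linear solve grows with system size.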
  2. We construct new tests that make it possible to measure an increase in the human capacity for information processing through parallel execution of several logic operations of a prescribed type. To check the causes of the capacity increase, we develop control tests on the same class of logic operations, for which a parallel organization of the computation is ineffective. We use the apparatus of universal algebra and automata theory. This article extends a cycle of works investigating the human capacity for parallel computation; the general publications on this theme are given in the references. The tasks in the described tests can be defined as computing the result of a sequence of same-type operations from some algebra. If the operation is associative, then parallel computation is effective through suitable grouping of the process. In the theory of computation this corresponds to the simultaneous work of several processors. Each processor transforms per unit time a certain known number of elements of the input data or intermediate results (the processor productivity). It is not currently known what kind of data elements the brain uses for logical or mathematical computation, or how many elements it processes per unit time. Therefore the test contains a sequence of task presentations with different numbers of logic operations in a fixed alphabet; this number is the complexity measure of the task. Analysis of the dependence of the solution time on the complexity makes it possible to estimate the processor productivity and the form of the computation's organization. For sequential computation only one processor works, and the solution time is a linear function of complexity. If new processors begin to work in parallel as the complexity of the task increases, then the dependence of the solution time on complexity is represented by a downward-convex curve. To detect the situation in which a person increases the speed of a single processor as complexity grows, we use task series with similar operations but in a non-associative algebra. In such tasks parallel computation gains little efficiency from increasing the number of processors. This forms the control set of tests. In the article we also consider one more class of tests, based on computing the trajectory of a formal automaton's state for a given input sequence. We investigate a special class of automata (relays) for which the construction affects the effectiveness of parallel computation of the final automaton state. For all tests we estimate the effectiveness of parallel computation. This article does not contain experimental results.

    Views (last year): 14. Citations: 1 (RSCI).
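The associativity argument in the abstract, that an associative operation admits effective parallel grouping while a non-associative one does not, can be sketched as follows; the helper `tree_reduce` is a hypothetical illustration, not the authors' test software.

```python
from functools import reduce

def tree_reduce(op, xs):
    """Pairwise (tree-shaped) reduction. For an associative op this
    regrouping is legal and has depth O(log n), so each level's pairs
    could be handed to parallel 'processors' in the same time unit."""
    xs = list(xs)
    while len(xs) > 1:
        xs = [op(xs[i], xs[i + 1]) if i + 1 < len(xs) else xs[i]
              for i in range(0, len(xs), 2)]
    return xs[0]

data = list(range(1, 11))

add = lambda a, b: a + b    # associative: any grouping gives the same answer
print(tree_reduce(add, data), reduce(add, data))    # 55 55

sub = lambda a, b: a - b    # non-associative: regrouping changes the result,
print(tree_reduce(sub, data), reduce(sub, data))    # 1 -53, so the tree shortcut is unusable
```

This is exactly why the non-associative task series works as a control: the tree speedup is unavailable, so any observed acceleration must come from a faster single "processor".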
  3. Gasnikov A.V., Kubentayeva M.B.
    Searching stochastic equilibria in transport networks by universal primal-dual gradient method
    Computer Research and Modeling, 2018, v. 10, no. 3, pp. 335-345

    We consider one of the problems of transport modelling: searching for the equilibrium distribution of traffic flows in a network. We use the classic Beckmann model to describe time costs and flow distribution in a network represented by a directed graph. Meanwhile the agents' behavior is not completely rational, which is described by the introduction of Markov logit dynamics: every driver selects a route randomly according to a Gibbs distribution that takes into account the current time costs on the edges of the graph. Thus the problem is reduced to searching for the stationary distribution of this dynamics, which is a stochastic Nash–Wardrop equilibrium in the corresponding population congestion game on the transport network. Since the game is potential, this problem is equivalent to minimizing some functional over flow distributions. The stochasticity manifests itself in the appearance of an entropy regularization, in contrast to the non-stochastic case. The dual problem is constructed to obtain a solution of the optimization problem. The universal primal-dual gradient method is applied. A major specificity of this method is its adaptive adjustment to the local smoothness of the problem, which is most important when the objective function has a complex structure and an a priori smoothness bound cannot be obtained with acceptable accuracy. Such a situation occurs in the considered problem, since the properties of the function strongly depend on the transport graph, on which we do not impose strong restrictions. The article describes the algorithm, including the numerical differentiation used to calculate the objective function value and gradient. In addition, the paper presents a theoretical estimate of the time complexity of the algorithm and the results of numerical experiments conducted on the network of a small American town.

    Views (last year): 28.
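For a fixed set of route costs, the Gibbs (logit) route choice described in the abstract reduces to a softmax over negated costs; the sketch below, with its three hypothetical routes and inverse-temperature parameter `gamma`, is an illustrative assumption, not the paper's algorithm.

```python
import numpy as np

def logit_route_choice(costs, gamma=1.0):
    """Gibbs (logit) distribution over routes: P(r) ~ exp(-gamma * cost_r).
    Small gamma spreads drivers almost uniformly; gamma -> infinity
    recovers the deterministic shortest-route (Wardrop) choice."""
    z = -gamma * np.asarray(costs, dtype=float)
    z -= z.max()                    # shift the exponent for numerical stability
    p = np.exp(z)
    return p / p.sum()

costs = [10.0, 12.0, 15.0]          # hypothetical current travel times on three routes
p = logit_route_choice(costs, gamma=0.5)
print(p)                            # cheapest route is most probable, but not certain
```

In the full equilibrium problem the costs themselves depend on the resulting flows, which is what turns this one-shot choice into the fixed-point (stationary distribution) problem the paper solves.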
  4. Bogomolov S.V.
    Stochastic formalization of the gas dynamic hierarchy
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 767-779

    Mathematical models of gas dynamics and its computational industry, in our opinion, are far from perfect. We will look at this problem from the point of view of a clear probabilistic micro-model of a gas from hard spheres, relying on both the theory of random processes and the classical kinetic theory in terms of densities of distribution functions in phase space, namely, we will first construct a system of nonlinear stochastic differential equations (SDE), and then a generalized random and nonrandom integro-differential Boltzmann equation taking into account correlations and fluctuations. The key feature of the initial model is the random nature of the intensity of the jump measure and its dependence on the process itself.

    We briefly recall the transition to increasingly coarse meso- and macro-approximations in accordance with a decrease in the dimensionless parameter, the Knudsen number. We obtain stochastic and non-random equations, first in phase space (a meso-model in terms of SDEs with respect to the Wiener measure and the Kolmogorov–Fokker–Planck equations), and then in coordinate space (macro-equations that differ from the Navier–Stokes system of equations and from quasi-gas-dynamics systems). The main difference of this derivation is a more accurate averaging over velocity, thanks to the analytical solution of the stochastic differential equations with respect to the Wiener measure, in whose form the intermediate meso-model in phase space is presented. This approach differs significantly from the traditional one, which uses not the random process itself but its distribution function. The emphasis is placed on the transparency of the assumptions made during the transition from one level of detail to another, rather than on numerical experiments, which contain additional approximation errors.

    The theoretical power of the microscopic representation of macroscopic phenomena is also important as an ideological support for particle methods alternative to difference and finite element methods.
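As a minimal sketch of the SDE level of such a hierarchy (not the paper's hard-sphere model with its random jump-measure intensity), one can integrate an Ornstein–Uhlenbeck velocity process, whose density obeys a Kolmogorov–Fokker–Planck equation, by the Euler–Maruyama method; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama(v0, theta, sigma, h, n_steps, n_paths=10000):
    """Euler-Maruyama integration of dV = -theta*V dt + sigma dW over an
    ensemble of sample paths; the ensemble's density solves the matching
    Kolmogorov-Fokker-Planck equation."""
    v = np.full(n_paths, float(v0))
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h), size=v.size)
        v += -theta * v * h + sigma * dW
    return v

v = euler_maruyama(v0=1.0, theta=1.0, sigma=0.5, h=0.01, n_steps=500)
print(v.mean(), v.var())   # near the stationary values 0 and sigma**2/(2*theta) = 0.125
```

This ensemble-of-paths view is the particle-method perspective the paragraph above contrasts with difference and finite element methods.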

  5. Mezentsev Y.A., Razumnikova O.M., Estraykh I.V., Tarasova I.V., Trubnikova O.A.
    Tasks and algorithms for optimal clustering of multidimensional objects by a variety of heterogeneous indicators and their applications in medicine
    Computer Research and Modeling, 2024, v. 16, no. 3, pp. 673-693

    The work is devoted to the description of the author’s formal statements of the clustering problem for a given number of clusters, algorithms for their solution, as well as the results of using this toolkit in medicine.

    Because the formulated problems belong to the NP class, solving even relatively low-dimensional instances to proven optimality by exact algorithms is impossible in finite time.

    In this regard, we have proposed a hybrid algorithm that combines the advantages of exact methods based on clustering by pairwise distances at the initial stage with the speed of methods that solve the simplified problem of splitting by cluster centers at the final stage. Developing this direction further, a sequential hybrid clustering algorithm using random search in the swarm intelligence paradigm has been created. The article describes it and presents the results of calculations on applied clustering problems.

    To determine the effectiveness of the developed tools for optimal clustering of multidimensional objects according to a variety of heterogeneous indicators, a number of computational experiments were performed using data sets including socio-demographic, clinical anamnestic, electroencephalographic and psychometric data on the cognitive status of patients of the cardiology clinic. An experimental proof of the effectiveness of using local search algorithms in the paradigm of swarm intelligence within the framework of a hybrid algorithm for solving optimal clustering problems has been obtained.

    The results of the calculations indicate that the main obstacle to using the discrete optimization apparatus, the limit on the tractable dimensions of problem instances, has effectively been resolved. We have shown that this obstacle is eliminated while maintaining an acceptable proximity of the clustering results to the optimal ones. The applied significance of the obtained clustering results is also due to the fact that the developed optimal clustering toolkit is supplemented by an assessment of the stability of the formed clusters. Given known factors (the presence of stenosis or older age), this makes it possible to additionally identify those patients whose cognitive resources are insufficient to overcome the influence of surgical anesthesia, which results in a unidirectional effect of postoperative deterioration of complex visual-motor reaction, attention and memory. This effect indicates the possibility of differentiating the classification of patients using the proposed tools.
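A toy version of the final-stage idea, splitting by cluster centers refined with random search, can be sketched as below; this is a deliberately simplified stand-in under assumed data and parameters, not the authors' hybrid or swarm algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def wcss(X, centers):
    """Objective: within-cluster sum of squared distances to nearest center."""
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d.min(axis=1).sum()

def random_search_clustering(X, k, iters=500, step=0.5):
    """Center-based clustering by greedy random local search: perturb all
    centers and keep the move whenever the objective improves. A swarm
    variant would evolve several interacting candidate solutions instead."""
    centers = X[[0, len(X) // 2]].astype(float)   # simple seeding from two data points (illustrative)
    best = wcss(X, centers)
    for _ in range(iters):
        cand = centers + rng.normal(0.0, step, centers.shape)
        val = wcss(X, cand)
        if val < best:
            centers, best = cand, val
    return centers, best

# Two well-separated synthetic groups of 50 points each.
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
centers, best = random_search_clustering(X, k=2)
print(best)   # total squared scatter of the final split
```

The trade-off the paper exploits is visible even here: each center-based step is cheap (one distance matrix), whereas exact pairwise-distance formulations grow combinatorially with the number of objects.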

  6. Alekseenko A.E., Kazennov A.M.
    CUDA and OpenCL implementations of Conway’s Game of Life cellular automata
    Computer Research and Modeling, 2010, v. 2, no. 3, pp. 323-326

    In this article the experience of teaching the “CUDA and OpenCL programming” course at the high-performance computing summer school MIPT-2010 is analyzed. The content of the lectures and practical tasks, as well as the manner of presenting the material, are considered. Performance issues of the different algorithms implemented by students during the practical training sessions are discussed.

    Views (last year): 9. Citations: 3 (RSCI).
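The cellular automaton named in the title has a per-cell update rule that maps naturally onto one GPU thread per cell, which is what makes it a good CUDA/OpenCL teaching example; a minimal NumPy sketch of the rule itself (not the course's GPU code) follows.

```python
import numpy as np

def life_step(grid):
    """One synchronous Game of Life update on a toroidal grid: build each
    cell's 8-neighbour count from array shifts, then apply the rule
    'born with 3 neighbours, survive with 2 or 3'. On a GPU the same
    rule runs as one thread per cell."""
    n = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

# A 'blinker', the smallest oscillator: period 2.
g = np.zeros((5, 5), dtype=np.uint8)
g[2, 1:4] = 1                     # horizontal bar of three live cells
g1 = life_step(g)                 # -> vertical bar
g2 = life_step(g1)                # -> horizontal bar again
print(np.array_equal(g2, g))      # True
```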
  7. Bratsun D.A., Zakharov A.P.
    Modelling spatio-temporal dynamics of circadian rhythms in Neurospora crassa
    Computer Research and Modeling, 2011, v. 3, no. 2, pp. 191-213

    We derive a new model of circadian oscillations in Neurospora crassa which is suitable for analyzing both the temporal and the spatial dynamics of the proteins responsible for the rhythm mechanism. The model is based on the non-linear interplay between the proteins FRQ and WCC, products of transcription of the frequency and white collar genes, which form a feedback loop comprising both positive and negative elements. The main component of the oscillation mechanism is assumed to be the time delay in the biochemical reactions of transcription. We show that the model accounts for various features observed in Neurospora experiments, such as entrainment by light cycles, phase shift under a light pulse, robustness to the action of fluctuations, and so on. Wave patterns excited during the spatial development of the system are studied. It is shown that a wave of biorhythm synchronization arises under basal transcription factors.

    Views (last year): 6. Citations: 20 (RSCI).
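The role of the transcriptional time delay can be illustrated on a generic one-variable delayed negative-feedback loop (a standard caricature, not the paper's FRQ/WCC model; the parameter values below are hypothetical): for a large enough delay the steady state loses stability and sustained oscillations appear.

```python
import numpy as np

def delayed_feedback(tau=10.0, h=0.01, t_end=200.0, n=6.0, k=1.0):
    """Explicit Euler integration of dx/dt = 1/(1 + x(t-tau)**n) - k*x:
    production is repressed by the protein level a time tau ago (the
    transcriptional delay), and the protein decays at rate k."""
    steps, lag = int(t_end / h), int(tau / h)
    x = np.empty(steps + 1)
    x[0] = 0.5
    for i in range(steps):
        x_lag = x[i - lag] if i >= lag else x[0]   # constant pre-history
        x[i + 1] = x[i] + h * (1.0 / (1.0 + x_lag ** n) - k * x[i])
    return x

x = delayed_feedback()
tail = x[-5000:]                           # the last 50 time units
print(round(tail.max() - tail.min(), 2))   # an order-one swing: sustained oscillation
```

With `tau` reduced well below the Hopf threshold the same code relaxes to a steady state, which is the qualitative point about delay-driven rhythms.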
  8. Winn A.P., Kyaw H., Troyanovskyi V.M., Aung Y.L.
    Methodology and program for the storage and statistical analysis of the results of computer experiment
    Computer Research and Modeling, 2013, v. 5, no. 4, pp. 589-595

    The problem of accumulating and statistically analyzing computer experiment results is solved. The main experiment program is considered as the data source. The results of the main experiment are collected on a specially prepared Excel sheet with a pre-organized structure for accumulation, statistical processing and visualization of the data. The method and program created are used in efficiency studies of the scientific research carried out by the authors.

    Views (last year): 1. Citations: 5 (RSCI).
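The accumulate-then-summarize workflow the abstract describes (there implemented on a prepared Excel sheet) can be sketched with a plain CSV table and Python's standard library; the file layout and the sample result values are hypothetical.

```python
import csv
import io
import statistics

# Accumulate one row per run of the 'main experiment' program...
buf = io.StringIO()                # stands in for the results file
writer = csv.writer(buf)
writer.writerow(["run", "result"])
for run, result in enumerate([0.98, 1.02, 1.01, 0.97, 1.03], start=1):
    writer.writerow([run, result])

# ...then read the accumulated table back and do the statistical processing.
buf.seek(0)
values = [float(row["result"]) for row in csv.DictReader(buf)]
print(len(values), round(statistics.mean(values), 3), round(statistics.stdev(values), 3))
# -> 5 1.002 0.026
```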
  9. Koganov A.V., Zlobin A.I., Rakcheeva T.A.
    Research of the human ability for parallel information handling in task series of increasing complexity
    Computer Research and Modeling, 2013, v. 5, no. 5, pp. 845-861

    We outline computer technology for presenting engineering-psychology tests that reveal subjects who can speed up logic task solution by simultaneously executing several standard logic operations. These tests are based on a theory of two kinds of logic tasks: in the first kind parallel logic is effective, and in the second kind it is not. The experiment performed confirms the capability for parallel logic in an important fraction of people. A vital speedup in the execution of logic operations is very uncommon in simultaneous logic. The efficacy of the methodology is confirmed.

    Views (last year): 1. Citations: 4 (RSCI).
  10. Nikitin I.S., Filimonov A.V., Yakushev V.L.
    Propagation of Rayleigh waves from an oblique impact of a meteorite on the Earth's surface and their effects on buildings and structures
    Computer Research and Modeling, 2013, v. 5, no. 6, pp. 981-992

    In this paper the dynamic elasticity problem of a simultaneous normal and tangential impact on a half-space is solved. This problem simulates the oblique incidence of a meteorite on the Earth's surface. The surface Rayleigh wave is investigated. The resulting solution is used as an external load on a high-rise building located at some distance from the impact spot in order to assess the safety and stability of its structure. Numerical experiments were carried out with the finite element software package STARK ES. The upper-floor amplitudes of the selected object were calculated under such dynamic loads. A systematic comparison was also made with the results for foundation vibrations corresponding to standard 8-point earthquake accelerograms.

    Views (last year): 3. Citations: 2 (RSCI).
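For orientation on the surface wave studied in the paper: the Rayleigh wave speed is always slightly below the shear wave speed, and a standard closed-form approximation (Viktorov's) gives it from the shear speed and Poisson's ratio. The ground parameters below are hypothetical, not taken from the paper.

```python
def rayleigh_speed(c_s, nu):
    """Viktorov's approximation for the Rayleigh surface-wave speed:
    c_R = c_s * (0.862 + 1.14*nu) / (1 + nu), within about 0.5% of the
    exact root of the Rayleigh equation for common Poisson ratios."""
    return c_s * (0.862 + 1.14 * nu) / (1.0 + nu)

c_s, nu = 400.0, 0.3      # hypothetical near-surface shear speed (m/s) and Poisson ratio
c_r = rayleigh_speed(c_s, nu)
print(round(c_r, 1))      # -> 370.5, about 0.93 of the shear speed
```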

Indexed in Scopus

Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU

The journal is included in the Russian Science Citation Index

