Search results for 'standards':
Articles found: 75
  1. Tarasyuk I.A., Kravchuk A.S.
    Estimation of natural frequencies of pure bending vibrations of composite nonlinearly elastic beams and circular plates
    Computer Research and Modeling, 2017, v. 9, no. 6, pp. 945-953

    The paper presents a linearization method for the stress-strain curves of nonlinearly deformable beams and circular plates that generalizes the pure-bending vibration equations. Composite, on average isotropic prismatic beams of constant rectangular cross-section and circular plates of constant thickness, made of nonlinearly elastic materials, are considered. The technique consists in determining approximate Young's moduli from the initial stress-strain state of the beam or plate subjected to a bending moment.

    The paper proposes two linearization criteria: equality of the specific potential energy of deformation, and minimization of the standard deviation in the approximation of the state equation. The method yields, in closed form, estimates of the natural frequencies of layered and structurally heterogeneous, on average isotropic nonlinearly elastic beams and circular plates. This makes it possible to significantly reduce the computational resources required for vibration analysis and modeling of these structural elements. In addition, the paper shows that the two proposed linearization criteria estimate the natural frequencies with the same accuracy.

    Since in the general case even isotropic materials resist tension and compression differently, the stress-strain curves of the composite material components are taken as piecewise-linear Prandtl diagrams with proportionality limits and tangential Young's moduli that differ under tension and compression. As parameters of the stress-strain curve, the effective Voigt characteristics (under the hypothesis of strain homogeneity) are used for a longitudinally layered material structure, and the effective Reuss characteristics (under the hypothesis of stress homogeneity) for a transversely layered beam and an axially laminated plate. In addition, the effective Young's moduli and proportionality limits obtained by the author's homogenization method are given for a structurally heterogeneous, on average isotropic material. As an example, the natural frequencies of two-phase beams are calculated as functions of the component concentrations.

    Views (last year): 14.
  2. Zhluktov S.V., Aksenov A.A., Savitskiy D.V.
    High-Reynolds number calculations of turbulent heat transfer in FlowVision software
    Computer Research and Modeling, 2018, v. 10, no. 4, pp. 461-481

    This work presents the FlowVision model of heat wall functions (WFFV), which allows simulation of non-isothermal flows of fluid and gas near solid surfaces on relatively coarse grids together with turbulence models. The work continues the research on the development of wall functions applicable over a wide range of $y^+$ values. Model WFFV assumes smooth profiles of the tangential velocity component, turbulent viscosity, temperature, and turbulent heat conductivity near a solid surface. The possibility of using a simple algebraic model for calculating a variable turbulent Prandtl number is investigated in this study (the turbulent Prandtl number enters model WFFV as a parameter); the results are satisfactory. The details of the implementation of model WFFV in the FlowVision software are explained. In particular, the boundary condition for the energy equation used in high-Reynolds-number calculations of non-isothermal flows is considered. The boundary condition is deduced for the energy equation written via thermodynamic enthalpy and via full enthalpy. The capability of the model is demonstrated on two test problems: incompressible flow past a plate and supersonic gas flow past a plate (M = 3).

    Analysis of the literature shows essential ambiguity in the experimental data and, as a consequence, in the empirical correlations for the Stanton number (the dimensionless heat flux). The calculations suggest that the default values of the model parameters, automatically specified in the program, allow calculation of heat fluxes at extended solid surfaces with engineering accuracy. At the same time, it is obvious that no wall functions can be universal. For this reason, the controls of model WFFV are made accessible from the FlowVision interface; when necessary, a user can tune the model for simulation of the required type of flow.

    The proposed model of wall functions is compatible with all the turbulence models implemented in the FlowVision software: the algebraic model of Smagorinsky, the Spalart-Allmaras model, the SST $k-\omega$ model, the standard $k-\varepsilon$ model, the $k-\varepsilon$ model of Abe, Kondoh, and Nagano, the quadratic $k-\varepsilon$ model, and the FlowVision $k-\varepsilon$ model.

    Views (last year): 23.
  3. Aleshin I.M., Malygin I.V.
    Machine learning interpretation of inter-well radiowave survey data
    Computer Research and Modeling, 2019, v. 11, no. 4, pp. 675-684

    Traditional geological prospecting methods are becoming ineffective: the exploration depth of kimberlite bodies and ore deposits has increased significantly. The only direct exploration method is to drill a system of wells to depths that provide access to the enclosing rocks. Because drilling is expensive, the role of inter-well survey methods has grown: they allow increasing the mean well spacing without significantly raising the probability of missing a kimberlite or ore body. The inter-well radio wave survey method is effective in searching for objects of high conductivity contrast. The physics of the method is based on the dependence of electromagnetic wave propagation on the conductivity of the propagation medium. The source and the receiver of electromagnetic radiation are electric dipoles placed in adjacent wells, with a known distance between them, so the absorption coefficient of the medium can be estimated from the rate of decrease of the radio wave amplitude. Rocks of low electrical resistance correspond to high absorption of radio waves. The inter-well measurement data thus allow estimation of the effective electrical resistance (or conductivity) of the rock. Typically, the source and the receiver are immersed in adjacent wells synchronously. The electric field amplitude measured at the receiver site gives the average attenuation coefficient along the line connecting the source and the receiver. The measurements are taken during stops, approximately every 5 m, so the distance between stops is much less than the distance between adjacent wells. This leads to significant spatial anisotropy in the distribution of the measured data. The drill grid covers a large area, and the goal is to build a three-dimensional model of the distribution of the electrical properties of the inter-well space over the whole area. The anisotropy of the spatial distribution hampers the use of the standard geostatistical approach.
    To build a three-dimensional model of the attenuation coefficient, we used one of the methods of machine learning, the method of nearest neighbors. In this method, the value of the absorption coefficient at a given point is calculated from the $k$ nearest measurements; the number $k$ must be determined from additional considerations. The effect of spatial distribution anisotropy can be reduced by changing the spatial scale in the horizontal direction; the scale factor $\lambda$ is another external parameter of the problem. The values of the parameters $k$ and $\lambda$ were selected using the coefficient of determination. To demonstrate the construction of a three-dimensional image of the absorption coefficient, we applied the procedure to inter-well radio wave survey data obtained at one of the sites in Yakutia.
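    The nearest-neighbour estimator with horizontal rescaling described above can be sketched as follows (a minimal illustration: the function name, coordinate layout, and default values of k and λ are assumptions; the paper selects k and λ via the coefficient of determination):

```python
import numpy as np

def knn_attenuation(query_xyz, data_xyz, data_vals, k=5, lam=3.0):
    """Estimate the attenuation coefficient at a query point from the k
    nearest measurements. Horizontal coordinates are stretched by lam to
    compensate for the dense-along-well / sparse-between-wells anisotropy
    of the measured data distribution."""
    scale = np.array([lam, lam, 1.0])  # stretch x, y; keep depth z as-is
    d = np.linalg.norm((data_xyz - query_xyz) * scale, axis=1)
    nearest = np.argsort(d)[:k]        # indices of the k closest measurements
    return data_vals[nearest].mean()
```

    In a real workflow, k and λ would be chosen by cross-validating the coefficient of determination on held-out measurements.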

    Views (last year): 3.
  4. Tran T.T., Pham C.T.
    A hybrid regularizers approach based model for restoring image corrupted by Poisson noise
    Computer Research and Modeling, 2021, v. 13, no. 5, pp. 965-978

    Image denoising is one of the fundamental problems in digital image processing. It usually refers to the reconstruction of an image from an observation degraded by noise, which may be caused by many factors such as transceiver equipment or environmental influences. To obtain higher-quality images, many methods have been proposed for the image denoising problem. Most of them are based on total variation (TV) regularization, with efficient algorithms developed for solving the related optimization problem. TV-based models have become a standard technique in image restoration owing to their ability to preserve image sharpness.

    In this paper, we focus on Poisson noise, which usually appears in photon-counting devices. We propose an effective regularization model based on a combination of first-order and fractional-order total variation for reconstructing images corrupted by Poisson noise. The proposed model eliminates noise while preserving edges. An efficient alternating minimization algorithm is employed to solve the optimization problem. Finally, the numerical results provided show that the proposed model preserves more details and achieves higher visual image quality than recent state-of-the-art methods.
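    As a rough illustration of the kind of objective such models minimize, here is a generic TV-regularized Poisson denoising functional (Kullback-Leibler data fidelity plus first-order anisotropic TV only; the authors' hybrid model adds a fractional-order TV term, and the function name and weight λ here are illustrative):

```python
import numpy as np

def poisson_tv_objective(u, f, lam=0.1):
    """Generic objective for TV-regularized Poisson denoising:
    KL data fidelity  sum(u - f*log(u))  +  lam * anisotropic TV(u),
    where f is the noisy observation and u > 0 the candidate restoration."""
    fidelity = np.sum(u - f * np.log(u))
    tv = (np.sum(np.abs(np.diff(u, axis=0)))    # vertical differences
          + np.sum(np.abs(np.diff(u, axis=1))))  # horizontal differences
    return fidelity + lam * tv
```

    The alternating minimization scheme the paper employs would update u (and any auxiliary splitting variables) so that such a functional decreases at each iteration.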

  5. Fialko N.S.
    Mixed algorithm for modeling of charge transfer in DNA on long time intervals
    Computer Research and Modeling, 2010, v. 2, no. 1, pp. 63-72

    Charge transfer in DNA is simulated by a discrete Holstein model: quantum particle + classical site chain + interaction. The thermostat temperature is taken into account as a stochastic force acting on the classical sites (Langevin equation), so the dynamics of charge migration along the chain is described by an ODE system with a stochastic right-hand side. To integrate such a system numerically, algorithms of order 1 or 2 are usually applied. We developed a "mixed" algorithm of 4th-order accuracy for the fast "quantum" variables (note that in the quantum subsystem the sum of the probabilities of the charge being on a site must remain constant in time), and of 2nd order for the slow classical variables, which are affected by the stochastic force. The algorithm allows us to calculate trajectories on longer time intervals than standard algorithms. Model calculations of polaron disruption in a homogeneous chain caused by temperature fluctuations are given as an example.
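    The splitting idea — a high-order deterministic step for the fast subsystem combined with a lower-order stochastic step for the slow one — can be sketched generically as follows (this is not the authors' exact scheme; the step functions and the toy two-site quantum subsystem are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def rk4_step(f, y, t, h):
    """4th-order Runge-Kutta step for the fast deterministic (quantum) part."""
    k1 = f(t, y)
    k2 = f(t + h/2, y + h/2 * k1)
    k3 = f(t + h/2, y + h/2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h/6 * (k1 + 2*k2 + 2*k3 + k4)

def heun_step(f, y, t, h, sigma):
    """Stochastic Heun step (2nd order in the drift) for the slow
    classical variables driven by a Langevin force of strength sigma."""
    dW = rng.normal(0.0, np.sqrt(h), size=np.shape(y))
    y_pred = y + h * f(t, y) + sigma * dW
    return y + h/2 * (f(t, y) + f(t + h, y_pred)) + sigma * dW

# Toy fast subsystem: two-site Schroedinger equation db/dt = -i H b.
# RK4 keeps the total on-site probability |b0|^2 + |b1|^2 close to 1,
# mimicking the conservation requirement of the quantum subsystem.
H = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([1.0 + 0j, 0.0 + 0j])
h = 0.01
for step in range(1000):
    b = rk4_step(lambda t, y: -1j * (H @ y), b, step * h, h)
norm = np.vdot(b, b).real
```

    In the mixed algorithm both steps are advanced together over each time step, the fast quantum variables with the 4th-order update and the slow classical ones with the 2nd-order stochastic update.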

    Views (last year): 2. Citations: 2 (RSCI).
  6. Koganov A.V., Zlobin A.I., Rakcheeva T.A.
    Research of possibility for man the parallel information handling in task series with increase complexity
    Computer Research and Modeling, 2013, v. 5, no. 5, pp. 845-861

    We describe computer technology for presenting engineering-psychology tests that reveal subjects able to speed up the solution of logic tasks by executing several standard logic operations simultaneously. The tests are based on a theory of two kinds of logic tasks: in the first kind parallel logic is effective, while in the second kind it is not. The experiment performed confirms the capability for parallel logic in a notable part of the participants; a substantial speedup of logic operations under simultaneous logic is very uncommon. The efficacy of the methodology is confirmed.

    Views (last year): 1. Citations: 4 (RSCI).
  7. Nikitin I.S., Filimonov A.V., Yakushev V.L.
    Propagation of Rayleigh waves at oblique impact of the meteorite about the earth’s surface and their effects on buildings and structures
    Computer Research and Modeling, 2013, v. 5, no. 6, pp. 981-992

    In this paper, the dynamic elasticity problem of simultaneous normal and tangential impact on a half-space is solved. This problem simulates the oblique incidence of a meteorite on the Earth's surface. The surface Rayleigh wave is investigated. The resulting solution is used as an external load on a high-rise building located at some distance from the impact site, for the safety and stability assessment of its structure. Numerical experiments were performed with the finite element software package STARK ES. The upper-floor amplitudes of the selected object were calculated under such dynamic loads. A systematic comparison was also made with the results for foundation vibrations corresponding to standard 8-point earthquake accelerograms.

    Views (last year): 3. Citations: 2 (RSCI).
  8. Dimitrov V.
    Deriving semantics from WS-BPEL specifications of parallel business processes on an example
    Computer Research and Modeling, 2015, v. 7, no. 3, pp. 445-454

    WS-BPEL is a widely accepted standard for the specification of distributed and parallel business processes. The standard mixes algebraic and Petri-net paradigms, so it is easy to specify a WS-BPEL business process with unwanted features; that is why verification of WS-BPEL business processes is very important. The intent of this paper is to show some possibilities for converting WS-BPEL processes into more formal specifications that can be verified. CSP and Z notation are used as formal models. Z notation is useful for specifying abstract data types, and web services can be viewed as a kind of abstract data type.

    Views (last year): 6.
  9. Didych Y.O., Malinetsky G.G.
    The analysis of player’s behaviour in modified “Sea battle” game
    Computer Research and Modeling, 2016, v. 8, no. 5, pp. 817-827

    The well-known "Sea battle" game is the focus of this work. The main goal of the article is to provide a modified version of the "Sea battle" game and to find optimal players' strategies under the new rules. Changes were applied to the attacking strategies (a new option to attack four cells in one shot was added), to the size of the field (sizes 10 × 10, 20 × 20 and 30 × 30 were used), and to the rules of the disposal algorithms during the game (a new possibility to move a ship out of the attacked zone). The game was solved using game theory: payoff matrices were found for each version of the altered rules, and optimal pure and mixed strategies were determined for them. The payoff matrices were solved by an iterative method. The simulation applied five attacking algorithms and six disposal algorithms, with parameter variation, in games of the players against each other; the attacking algorithms were varied over 100 sets of parameters, the disposal algorithms over 150 sets. The major result is that with these algorithms the modified "Sea battle" game can be solved, implying the possibility of finding stable pure and mixed behaviour strategies that guarantee the sides optimal results in game-theoretic terms. Moreover, the influence of modifying the rules of the "Sea battle" game is estimated, and a comparison with the authors' prior results on this topic is made. By matching the payoff matrices with the statistical analysis completed earlier, it was found that the standard "Sea battle" game can be represented as a special case of the game modifications considered in this article. The work is important not only for military applications but for civil areas as well: its results could save resources in exploration, provide an advantage in military conflicts, and help protect devices from devastating impacts.
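    The iterative solution of a payoff matrix mentioned above can be illustrated with the classical Brown-Robinson (fictitious play) procedure for zero-sum matrix games (a generic sketch under the assumption of a standard iterative scheme; the article does not specify which iterative method it uses, and all names here are illustrative):

```python
import numpy as np

def brown_robinson(A, iters=5000):
    """Brown-Robinson iteration for the zero-sum matrix game A: each player
    repeatedly best-responds to the opponent's cumulative payoffs; the
    empirical frequencies of the chosen rows/columns converge to optimal
    mixed strategies."""
    m, n = A.shape
    row_counts = np.zeros(m)
    col_counts = np.zeros(n)
    row_payoff = np.zeros(m)   # cumulative payoff of each row vs column history
    col_payoff = np.zeros(n)   # cumulative payoff of each column vs row history
    i = 0                      # arbitrary starting row
    for _ in range(iters):
        row_counts[i] += 1
        col_payoff += A[i, :]
        j = int(np.argmin(col_payoff))   # column player minimizes
        col_counts[j] += 1
        row_payoff += A[:, j]
        i = int(np.argmax(row_payoff))   # row player maximizes
    return row_counts / iters, col_counts / iters
```

    On the matching-pennies matrix [[1, -1], [-1, 1]], for example, both empirical strategies approach the optimal mixture (1/2, 1/2).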

    Views (last year): 18.
  10. Kashchenko N.M., Ishanov S.A., Matsievsky S.V.
    Simulation equatorial plasma bubbles started from plasma clouds
    Computer Research and Modeling, 2019, v. 11, no. 3, pp. 463-476

    Experimental, theoretical, and numerical investigations of equatorial spread F, equatorial plasma bubbles (EPBs), plasma depletion shells, and plasma clouds are continued in a variety of new articles, which consider nonlinear growth, bifurcation, pinching, and atomic and molecular ion dynamics. However, the authors of this article believe that not all reported parameters of EPB development are correct; for example, EPB bifurcation is highly questionable.

    The maximum speed inside EPBs and the EPB development time are defined and studied. The EPBs start from one, two, or three zones of increased density (initial plasma clouds). The development mechanism of an EPB is the Rayleigh-Taylor instability (RTI). The initial stage of EPB development falls within the EPB-favorable time interval (when the linear growth increment is positive) and lasts 3000–7000 s for the Earth's equatorial ionosphere.

    Numerous computing experiments were conducted using the original two-dimensional mathematical and numerical model MI2, similar to the US standard model SAMI2. The MI2 model is described in detail. The results obtained can be used both in other theoretical works and for planning and carrying out natural experiments on the generation of spread F in the Earth's ionosphere.

    Numerical simulation was carried out for geophysical conditions favorable for EPB development. The numerical studies confirmed that the development time of EPBs from initial irregularities of increased density is significantly longer than the development time from zones of lowered density. It is shown that the developed irregularities interact with each other strongly and nonlinearly even when the initial plasma clouds are far removed from each other; moreover, this interaction is stronger than the interaction of EPBs starting from initial irregularities of decreased density. The numerical results showed good agreement of the developed EPB parameters with experimental data and with the theoretical studies of other authors.

    Views (last year): 14.
