Search results for 'forming':
Articles found: 246
  1. Tarasevich Y.Y., Zelepukhina V.A.
    Academic network as excitable medium
    Computer Research and Modeling, 2015, v. 7, no. 1, pp. 177-183

    The paper simulates the spread of ideas in a professional virtual group. We consider the propagation of excitation in an inhomogeneous excitable medium of high connectivity. It is assumed that the network elements form a complete graph. The parameters of the elements are normally distributed. The simulation showed that interest in an idea can fade or fluctuate, depending on the settings of the virtual group. The presence of a permanently excited element with relatively high activity leads to chaos: the fraction of community members actively interested in an idea varies irregularly.

    Views (last year): 6.
  2. Petrosyan A.Sh.
    The New Use of Network Element in ATLAS Workload Management System
    Computer Research and Modeling, 2015, v. 7, no. 6, pp. 1343-1349

    A crucial component of distributed computing systems is the network infrastructure. While networking forms the backbone of such systems, it is often the invisible partner to storage and computing resources. We propose to integrate Network Elements directly into distributed systems through the workload management layer. There are many reasons for this approach. As the complexity of and demand for distributed systems grow, it is important to use the existing infrastructure efficiently. For example, network performance measurements could be used in the decision-making mechanisms of workload management systems. New technologies such as Software-Defined Networking (SDN) allow one to define the network configuration programmatically. We describe how these methods are being used within the PanDA workload management system of the ATLAS collaboration.

    Views (last year): 2. Citations: 2 (RSCI).
  3. Lopatin N.V., Kydrjavtsev E.A., Panin P.V., Vidumkina S.V.
    Simulation of forming of UFG Ti-6-4 alloy at low temperature of superplasticity
    Computer Research and Modeling, 2017, v. 9, no. 1, pp. 127-133

    Superplastic forming of Ni- and Ti-based alloys is widely used in the aerospace industry. The main advantage of exploiting the superplasticity effect in sheet metal forming is the feasibility of forming materials to a high amount of plastic strain under prevailing tensile stresses. This article studies the application of the commercial FEM software SFTC DEFORM to predicting the thickness deviation during low-temperature superplastic forming of the UFG Ti-6-4 alloy. Experimentally, the thickness deviation during superplastic forming is observed in the local area of plastic deformation; this process is aggravated by local softening of the metal caused by microstructure coarsening. A theoretical model was prepared to analyze the experimentally observed metal flow, using two approaches. The first uses the creep rheology model integrated in DEFORM. Since the superplastic effect is observed only in materials with fine and ultrafine grain sizes, the second approach uses custom rheology-model procedures based on microstructure evolution equations. These equations have been implemented in DEFORM via Fortran user solver subroutines. FEM simulation of this type of forming allows tracking the strain rate in different parts of a workpiece during the process, which is crucial for maintaining superplastic conditions. Comparing the two approaches allows conclusions about the effect of microstructure evolution on metal flow during superplastic deformation. The results of the FEM analysis and the theoretical conclusions have been confirmed by the results of the Erichsen test.
The main conclusions of this study are as follows: a) the DEFORM software allows an engineer to predict the formation of the metal shape under the conditions of low-temperature superplasticity; b) to improve the accuracy of the prediction of local deformations, the effect of the microstructure state of an alloy with a sub-microcrystalline structure should be taken into account in the calculations in the DEFORM software.

    Views (last year): 10.
  4. Orel V.R., Tambovtseva R.V., Firsova E.A.
    Effects of the heart contractility and its vascular load on the heart rate in athletes
    Computer Research and Modeling, 2017, v. 9, no. 2, pp. 323-329

    Heart rate (HR) is the most readily measured cardiovascular indicator. To control the individual response to physical exercises of different load types, heart rate is measured while athletes perform different types of muscular work (strength machines, various training and competitive exercises). From the magnitude of the heart rate and its dynamics during muscular work and recovery, one can objectively judge the functional status of an athlete's cardiovascular system, the athlete's individual physical performance, and the adaptive response to a particular exercise. However, the heart rate is not an independent determinant of an athlete's physical condition. HR is shaped by the interaction of the basic physiological mechanisms underlying the hemodynamic ejection mode of the heart. Heart rate depends, on the one hand, on the contractility of the heart, the venous return, and the volumes of the atria and ventricles and, on the other hand, on the vascular load of the heart, the main components of which are the elastic and peripheral resistances of the arterial system. The values of the arterial vascular resistances depend on the power of muscular work and its duration. The sensitivity of HR to changes in vascular load and heart contractility was determined in athletes by pair regression analysis of simultaneously recorded heart rate data, peripheral $(R)$ and elastic $(E_a)$ resistance (the vascular load of the heart), and the power $(W)$ of heartbeats (cardiac contractility). The coefficients of sensitivity and pair correlation between heart rate and the vascular load and contractility of the left ventricle were determined in athletes at rest and during muscular work on a cycle ergometer. It is shown that an increase in both ergometer power load and heart rate is accompanied by an increase in the correlation coefficients and in the coefficients of heart rate sensitivity to $R$, $E_a$ and $W$.
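    The pair regression analysis described in the abstract can be sketched in a few lines: the sensitivity coefficient is the slope of a simple linear regression of HR on one load variable, computed together with the pair correlation coefficient. The numbers below are invented illustrative data, not the study's measurements.

```python
def pair_regression(x, y):
    """Return (slope, intercept, correlation) of y regressed on x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx                  # sensitivity of y to x
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5       # pair correlation coefficient
    return slope, intercept, r

# made-up peripheral resistance values and simultaneously recorded HR
R_vals = [1.10, 1.05, 0.98, 0.92, 0.85, 0.80]   # arbitrary units
HR_vals = [68.0, 71.0, 76.0, 80.0, 86.0, 90.0]  # beats per minute

slope, intercept, r = pair_regression(R_vals, HR_vals)
print(f"sensitivity dHR/dR = {slope:.1f}, correlation = {r:.2f}")
```

    With these synthetic values the slope is negative and the correlation is close to -1, i.e. HR rises as the peripheral resistance falls.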

    Views (last year): 5. Citations: 1 (RSCI).
  5. Goncharenko V.M., Shapoval A.B.
    Hypergeometric functions in model of General equilibrium of multisector economy with monopolistic competition
    Computer Research and Modeling, 2017, v. 9, no. 5, pp. 825-836

    We show that the basic properties of some models of monopolistic competition can be described using families of hypergeometric functions. The results are obtained by building a general equilibrium model of a multisector economy producing a differentiated good in $n$ high-tech sectors, in which single-product firms compete monopolistically using the same technology. The homogeneous (traditional) sector is characterized by perfect competition. Workers are motivated to find a job in the high-tech sectors because wages are higher there; however, they risk remaining unemployed. Unemployment persists in equilibrium due to labor market imperfections. Wages are set by firms in the high-tech sectors as a result of negotiations with employees. It is assumed that individuals are homogeneous consumers with identical preferences given by a separable utility function of general form. The paper establishes conditions under which the general equilibrium in the model exists and is unique. The conditions are formulated in terms of the elasticity of substitution $\mathfrak{S}$ between varieties of the differentiated good, averaged over all consumers. The equilibrium found is symmetric with respect to the varieties of the differentiated good. The equilibrium variables can be represented as implicit functions whose properties are related to the elasticity $\mathfrak{S}$ introduced by the authors. A complete analytical description of the equilibrium variables is possible for known special cases of the consumers' utility function, for example, power functions, which, however, describe the response of the economy to changes in market size incorrectly. To simplify the implicit functions, we introduce a utility function defined by two one-parameter families of hypergeometric functions. One family describes the pro-competitive, and the other the anti-competitive, response of prices to an increase in the size of the economy.
Varying the parameter of each family covers all possible values of the elasticity $\mathfrak{S}$; in this sense, the hypergeometric functions exhaust the natural utility functions. It is established that as the elasticity of substitution between the varieties of the differentiated good increases, the difference between the high-tech and homogeneous sectors is erased. It is shown that, when the economy is large, in equilibrium individuals consume a small amount of each product, as in the case of power preferences. This fact allows approximating the hypergeometric functions by a sum of power functions in a neighborhood of the equilibrium values of the argument. Thus, replacing power utility functions by hypergeometric ones approximated by the sum of two power functions, on the one hand, retains the full ability to tune the parameters and, on the other hand, makes it possible to describe the effects of changing the size of the sectors of the economy.

    Views (last year): 10.
  6. Ilyin O.V.
    Boundary conditions for lattice Boltzmann equations in applications to hemodynamics
    Computer Research and Modeling, 2020, v. 12, no. 4, pp. 865-882

    We consider a one-dimensional three-velocity kinetic lattice Boltzmann model, which represents a second-order difference scheme for the hydrodynamic equations. In the framework of kinetic theory this system describes the propagation and interaction of three types of particles. It has been shown previously that the lattice Boltzmann model with an external virtual force is equivalent, in the hydrodynamic limit, to the one-dimensional hemodynamic equations for elastic vessels; this equivalence can be established with the use of the Chapman–Enskog expansion. The external force in the model makes it possible to adjust the functional dependence between the lumen area of the vessel and the pressure applied to its wall; thus, the form of the external force allows modeling various elastic properties of the vessels. In the present paper the physiological boundary conditions at the inlets and outlets of the arterial network are formulated in terms of the lattice Boltzmann variables. We consider the following boundary conditions: conditions for pressure and blood flow at the inlet of the vascular network; conditions for pressure and blood flow at vessel bifurcations; wave reflection conditions (corresponding to complete occlusion of the vessel) and wave absorption conditions at the ends of the vessels (corresponding to the passage of the wave without distortion); and RCR-type conditions, which are analogous to electrical circuits and consist of two resistors (corresponding to the impedance of the vessel at whose end the boundary conditions are set and to the friction forces in the microcirculatory bed) and one capacitor (describing the elastic properties of the arterioles). Numerical simulations were performed: the propagation of blood in a network of three vessels was considered, with blood flow boundary conditions imposed at the entrance of the network and RCR boundary conditions at its ends.
The solutions of the lattice Boltzmann model are compared with benchmark solutions (based on numerical calculations with a second-order MacCormack difference scheme without viscous terms); it is shown that the two approaches give very similar results.
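    The paper's hemodynamic model is not reproduced here, but the basic mechanics of a three-velocity (D1Q3) lattice Boltzmann scheme that it builds on can be sketched generically: three populations per node, a BGK collision toward local equilibrium, streaming by one node, and a bounce-back condition at the walls that reflects the incoming wave (the occlusion-like reflection condition mentioned above). This is an illustrative diffusion-type toy model, not the authors' scheme; all parameter values are arbitrary.

```python
W = (1/6, 2/3, 1/6)   # D1Q3 weights for velocities c = -1, 0, +1
TAU = 0.8             # BGK relaxation time (arbitrary)

def lbm_step(fm, f0, fp):
    """One collision + streaming step over lists of equal length."""
    n = len(f0)
    rho = [fm[i] + f0[i] + fp[i] for i in range(n)]
    # BGK collision toward the zero-velocity equilibrium w_i * rho
    fm = [fm[i] + (W[0] * rho[i] - fm[i]) / TAU for i in range(n)]
    f0 = [f0[i] + (W[1] * rho[i] - f0[i]) / TAU for i in range(n)]
    fp = [fp[i] + (W[2] * rho[i] - fp[i]) / TAU for i in range(n)]
    # streaming: fp moves one node right, fm moves one node left
    new_fp = [0.0] + fp[:-1]
    new_fm = fm[1:] + [0.0]
    # bounce-back at both walls: the outgoing population returns
    # with reversed velocity (wave reflection / complete occlusion)
    new_fm[-1] = fp[-1]
    new_fp[0] = fm[0]
    return new_fm, f0, new_fp

N = 8
fm = [W[0]] * N
f0 = [W[1]] * N
fp = [W[2]] * N
fp[3] += 0.5                 # a localized pulse
mass0 = sum(fm) + sum(f0) + sum(fp)
for _ in range(20):
    fm, f0, fp = lbm_step(fm, f0, fp)
# collision and reflecting walls conserve the total mass
assert abs(sum(fm) + sum(f0) + sum(fp) - mass0) < 1e-9
```

    Absorbing (non-reflecting) and RCR-type conditions would replace the bounce-back lines with extrapolated or circuit-model populations at the boundary nodes.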

  7. Musaev A.A., Grigoriev D.A.
    Extracting knowledge from text messages: overview and state-of-the-art
    Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1291-1315

    In general, solving the information explosion problem can be delegated to systems for automatic processing of digital data. These systems are intended for recognizing, sorting, meaningfully processing and presenting data in formats readable and interpretable by humans. The creation of intelligent knowledge extraction systems that handle unstructured data would be a natural solution in this area. At the same time, the evident progress in these tasks for structured data contrasts with the limited success of unstructured data processing, and, in particular, document processing. Currently, this research area is undergoing active development and investigation. The present paper is a systematic survey of both Russian and international publications dedicated to the leading trend in automatic text data processing: Text Mining (TM). We cover the main tasks and notions of TM, as well as its place in the current AI landscape. Furthermore, we analyze the complications that arise during natural language processing (NLP) of texts, which are weakly structured and often provide ambiguous linguistic information. We describe the stages of text data preparation, cleaning, and feature selection which, alongside the data obtained via morphological, syntactic, and semantic analysis, constitute the input for the TM process. This process can be represented as mapping a set of text documents to "knowledge". Using the case of stock trading, we demonstrate the formalization of the problem of making a trade decision based on a set of analytical recommendations. Examples of such mappings are the methods of Information Retrieval (IR), text summarization, sentiment analysis, document classification and clustering, etc. The common point of all TM tasks and techniques is the selection of word forms and their derivatives used to recognize content in natural-language symbol sequences.
Considering IR as an example, we examine classic types of search, such as searching for word forms, phrases, patterns and concepts. Additionally, we consider the augmentation of patterns with syntactic and semantic information. Next, we provide a general description of all NLP instruments: morphological, syntactic, semantic and pragmatic analysis. Finally, we end the paper with a comparative analysis of modern TM tools which can be helpful for selecting a suitable TM platform based on the user’s needs and skills.
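    The classic search types listed in the abstract (word-form, phrase, and pattern search) are easy to illustrate with a toy corpus and regular expressions. This is a generic sketch, not code from the survey; the corpus sentences are invented.

```python
import re

corpus = [
    "Text mining extracts knowledge from text documents.",
    "Knowledge extraction from unstructured text is hard.",
    "Document classification and clustering are TM tasks.",
]

def word_form(term):
    """Indices of documents containing the exact word form."""
    rx = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    return [i for i, doc in enumerate(corpus) if rx.search(doc)]

def phrase(words):
    """Indices of documents containing the words as a contiguous phrase."""
    rx = re.compile(r"\b" + r"\s+".join(map(re.escape, words)) + r"\b",
                    re.IGNORECASE)
    return [i for i, doc in enumerate(corpus) if rx.search(doc)]

def pattern(rx_text):
    """Indices of documents matching an arbitrary pattern."""
    rx = re.compile(rx_text, re.IGNORECASE)
    return [i for i, doc in enumerate(corpus) if rx.search(doc)]

print(word_form("knowledge"))       # [0, 1]
print(phrase(["text", "mining"]))   # [0]
print(pattern(r"classif\w+"))       # [2]
```

    Concept search, the fourth type mentioned, would additionally map query terms to synonym or ontology classes before matching.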

  8. Makarov I.S., Bagantsova E.R., Iashin P.A., Kovaleva M.D., Gorbachev R.A.
    Development of and research on an algorithm for distinguishing features in Twitter publications for a classification problem with known markup
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 171-183

    Social media posts play an important role in reflecting the state of financial markets, and their analysis is a powerful tool for trading. The article describes the results of a study of the impact of social media activity on the movement of the financial market. The top authoritative influencers are selected, and their Twitter posts are used as data. Such texts usually include slang and abbreviations, so methods for preparing the primary text data, including Stanza and regular expressions, are presented. Two approaches to representing a point in time in the format of text data are considered, and the difference between the influence of a single tweet and of a whole package of tweets collected over a certain period of time is investigated. A statistical approach in the form of frequency analysis is also considered, and metrics are introduced that quantify the significance of a particular word for identifying the relationship between price changes and Twitter posts. The frequency analysis studies the distributions of occurrence of various words and bigrams in the text for positive, negative or general trends. To build the markup, changes in the market are processed into a binary vector using various parameters, thus posing a binary classification task. The parameters of Binance candlesticks are tuned to better describe the movement of the cryptocurrency market, and their variability is also explored in this article. Sentiment is studied using Stanford CoreNLP. The results of the statistical analysis are relevant to feature selection for further binary or multiclass classification tasks. The presented methods of text analysis increase the accuracy of models designed to solve natural language processing problems by selecting words and improving the quality of vectorization. Such algorithms are often used in automated trading strategies to predict the price of an asset and the trend of its movement.
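    The frequency analysis described above amounts to counting word and bigram occurrences separately for tweets preceding positive and negative price moves, then ranking words by the difference. The sketch below uses invented tweets and labels; it is not the authors' pipeline.

```python
from collections import Counter
import re

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

def bigrams(toks):
    return list(zip(toks, toks[1:]))

# invented (tweet, label) pairs; +1 / -1 mimics the binary price markup
data = [
    ("btc breaking out, very bullish", +1),
    ("bullish momentum on btc today", +1),
    ("btc dump incoming, bearish", -1),
    ("bearish divergence, expect a dump", -1),
]

pos_words, neg_words = Counter(), Counter()
pos_bi, neg_bi = Counter(), Counter()
for text, label in data:
    toks = tokens(text)
    (pos_words if label > 0 else neg_words).update(toks)
    (pos_bi if label > 0 else neg_bi).update(bigrams(toks))

# words whose counts differ most between the two classes are candidate
# features for the downstream binary classifier
diff = Counter(pos_words)
diff.subtract(neg_words)
print(diff.most_common(3))
```

    On this toy data "bullish" scores +2 and "bearish" scores -2, i.e. they separate the classes, while neutral words like "btc" score near zero.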

  9. Sukhov E.A., Chekina E.A.
    Software complex for numerical modeling of multibody system dynamics
    Computer Research and Modeling, 2024, v. 16, no. 1, pp. 161-174

    This work deals with numerical modeling of the motion of multibody systems consisting of rigid bodies with arbitrary masses and inertial properties. We consider both planar and spatial systems, which may contain kinematic loops.

    The numerical modeling is fully automatic, and its computational algorithm consists of three principal steps. In step one, a graph of the considered mechanical system is formed from the user-input data. This graph represents the hierarchical structure of the mechanical system. In step two, the differential-algebraic equations of motion of the system are derived using the so-called Joint Coordinate Method. This method minimizes the redundancy and lowers the number of the equations of motion, thus optimizing the calculations. In step three, the equations of motion are integrated numerically, and the resulting laws of motion are presented via the user interface or files.

    The aforementioned algorithm is implemented in the software complex that contains a computer algebra system, a graph library, a mechanical solver, a library of numerical methods and a user interface.
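    Step one of the algorithm, forming the system graph from user-input joint data and detecting kinematic loops, can be sketched as follows. The body and joint names are hypothetical; the actual software's input format is not described in the abstract.

```python
def build_graph(joints):
    """joints: iterable of (body_a, body_b) pairs -> adjacency dict."""
    adj = {}
    for a, b in joints:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

def has_kinematic_loop(adj):
    """A connected system graph contains a loop iff edges >= nodes."""
    edges = sum(len(nb) for nb in adj.values()) // 2
    return edges >= len(adj)

# an open chain (base - link1 - link2) and a four-bar linkage
chain = build_graph([("base", "link1"), ("link1", "link2")])
four_bar = build_graph([("base", "crank"), ("crank", "coupler"),
                        ("coupler", "rocker"), ("rocker", "base")])
print(has_kinematic_loop(chain))     # False: tree structure
print(has_kinematic_loop(four_bar))  # True: closed loop
```

    Loop detection matters here because the Joint Coordinate Method treats tree-structured systems directly, while closed loops require cut-joint constraints that make the equations differential-algebraic.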

  10. Garanina O.S., Romanovsky M.Y.
    Experimental investigation of Russian citizens expenses on new cars and a correspondence to their income
    Computer Research and Modeling, 2012, v. 4, no. 3, pp. 621-629

    The distribution of citizens' expenses in modern Russia is investigated experimentally. As in earlier work, new cars were chosen as a representative group of acquired goods. The results of the analysis of new-car sales for 2007–2009 are presented. The main "body" of the probability density of finding a certain number of cars as a function of their price, from some initial price up to ~$60k, is an exponential distribution. A feature of the distribution found here (unlike in 2003–2005) is the existence of a minimum price. For expensive cars (the "tail" of the distribution), the asymptotic form is a Pareto distribution with a hyperbolic exponent slightly greater than that measured earlier for 2003–2005. The results turned out to be similar to direct measurements of the distribution of tax declarations by size submitted in the USA in 2004, where an exponential distribution of citizens' income above some minimum, with a Pareto-type asymptotic, was also observed.
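    The two-part density described above (an exponential "body" above a minimum price, crossing over to a Pareto "tail") can be written down schematically. All parameter values here are invented for illustration and are not the paper's fitted values.

```python
import math

P_MIN = 8.0    # minimum price, k$ (invented)
P_TAIL = 60.0  # crossover to the Pareto tail, k$ (from the ~$60k figure)
T = 15.0       # exponential scale of the body, k$ (invented)
ALPHA = 2.5    # hyperbolic exponent of the Pareto tail (invented)

# match the two branches at P_TAIL so the density is continuous
C_EXP = 1.0
C_PAR = C_EXP * math.exp(-(P_TAIL - P_MIN) / T) * P_TAIL ** ALPHA

def density(p):
    """Unnormalized probability density of a car costing p (k$)."""
    if p < P_MIN:
        return 0.0
    if p <= P_TAIL:
        return C_EXP * math.exp(-(p - P_MIN) / T)
    return C_PAR * p ** (-ALPHA)

print(density(7.0))                     # 0.0: below the minimum price
print(density(30.0) > density(100.0))   # True: density decays with price
```

    Fitting such a model to sales data would estimate T from the log-linear body and ALPHA from the log-log slope of the tail.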

    Citations: 3 (RSCI).

Indexed in Scopus

Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU

The journal is included in the Russian Science Citation Index

