Search results for 'variable parameters':
Articles found: 67
  1. Stepanyan I.V.
    Biomathematical system of the nucleic acids description
    Computer Research and Modeling, 2020, v. 12, no. 2, pp. 417-434

    The article is devoted to the application of various methods of mathematical analysis to the search for patterns and the study of the nucleotide composition of DNA sequences at the genomic level. New methods of mathematical biology that make it possible to detect and visualize the hidden ordering of genetic nucleotide sequences located in the chromosomes of cells of living organisms are described. The research builds on the work in algebraic biology of S. V. Petukhov, Doctor of Physical and Mathematical Sciences, who first introduced and justified new algebras and hypercomplex numerical systems describing genetic phenomena. This paper describes a new phase in the development of matrix methods in genetics for studying the properties of nucleotide sequences (and their physicochemical parameters), built on the principles of finite geometry. The aim of the study is to demonstrate the capabilities of the new algorithms and to discuss the discovered properties of genetic DNA and RNA molecules. The study includes three stages: parameterization, scaling, and visualization. Parameterization is the determination of the parameters taken into account, which are based on the structural and physicochemical properties of nucleotides as elementary components of the genome. Scaling plays the role of “focusing” and allows one to explore genetic structures at various scales. Visualization includes the selection of the axes of the coordinate system and the method of visual display. The algorithms presented in this work are put forward as a new toolkit for the development of research software for the analysis of long nucleotide sequences, with the ability to display genomes in parametric spaces of various dimensions. One of the significant results of the study is that new criteria were obtained for classifying the genomes of various living organisms and identifying interspecific relationships. The new concept allows one to assess, visually and numerically, the variability of the physicochemical parameters of nucleotide sequences. It also allows one to substantiate the relationship between the parameters of DNA and RNA molecules and fractal geometric mosaics, and reveals the ordering and symmetry of polynucleotides as well as their noise immunity. The results obtained justify the introduction of new terms: “genometry” as a methodology of computational strategies and “genometrica” as the specific parameters of a particular genome or nucleotide sequence. In connection with the results obtained, questions of biosemiotics and of the hierarchical levels of organization of living matter are raised.
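
    The parameterization and visualization stages described above can be illustrated by a minimal sketch: each nucleotide is mapped to three standard binary physicochemical attributes (purine/pyrimidine, amino/keto, strong/weak hydrogen bonding) and the sequence is turned into a cumulative trajectory that can be inspected at different scales. This is an assumed illustration of the general idea only, not the algorithms of the paper.

```python
# Illustrative sketch of the "parameterization" and "visualization" stages:
# nucleotides -> binary physicochemical parameters -> cumulative trajectory.
# The mappings below are the standard binary sub-alphabets, not the paper's code.
import numpy as np

PURINE = {"A": 1, "G": 1, "C": 0, "T": 0}   # purine = 1, pyrimidine = 0
AMINO  = {"A": 1, "C": 1, "G": 0, "T": 0}   # amino = 1, keto = 0
STRONG = {"G": 1, "C": 1, "A": 0, "T": 0}   # 3 hydrogen bonds = 1, 2 bonds = 0

def parameterize(seq):
    """Return an (N, 3) array of binary physicochemical parameters."""
    seq = seq.upper()
    return np.array([[PURINE[b], AMINO[b], STRONG[b]] for b in seq if b in PURINE])

def cumulative_walk(params):
    """Cumulative sums of +1/-1 steps, one coordinate per parameter,
    giving a trajectory that can be plotted at various scales."""
    return np.cumsum(2 * params - 1, axis=0)

walk = cumulative_walk(parameterize("ATGCGCGTATTAGC"))
print(walk[-1])   # end point of the trajectory in the 3D parameter space
```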

  2. Ansori Moch.F., Sumarti N.N., Sidarto K.A., Gunadi I.I.
    An Algorithm for Simulating the Banking Network System and Its Application for Analyzing Macroprudential Policy
    Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1275-1289

    Modeling banking systems using a network approach has received growing attention in recent years. One of the notable models is that developed by Iori et al., who proposed a banking system model for analyzing systemic risk in interbank networks. The model is built on the simple dynamics of several bank balance sheet variables, such as deposits, equity, loans, liquid assets, and interbank lending (or borrowing), in the form of difference equations. Each bank faces random shocks in deposits and loans. The balance sheet is updated at the beginning or end of each period. In the model, banks are grouped into potential lenders and potential borrowers. The potential borrowers are those that lack liquidity, and the potential lenders are those with excess liquidity after paying dividends and channeling new investment. The borrowers and the lenders are connected through the interbank market. Each borrower is linked to a random fraction of the potential lenders, from which it borrows funds to maintain its liquidity safety net. If the demand for borrowed funds can be met by the supply of excess liquidity, the borrowing bank survives. If not, it is deemed to be in default and is removed from the banking system. However, in their paper, most of the interbank borrowing-lending mechanism is described qualitatively rather than by detailed mathematical or computational analysis. Therefore, in this paper, we elaborate the mathematical description of borrowing and lending in the interbank market and present an algorithm for simulating the model. We also perform simulations to analyze the effects of the model’s parameters on banking stability, using the number of surviving banks as the measure. We apply this technique to analyze the effects of a macroprudential policy called the loan-to-deposit-ratio-based reserve requirement on banking stability.
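
    The interbank matching mechanism summarized above can be sketched in a few lines of code. The sketch below is a schematic reading of that mechanism (shortfall banks borrow from a random subset of surplus banks; unmet demand means default), with an assumed linkage probability; it is not the algorithm published in the paper.

```python
# Schematic sketch of one interbank borrowing-lending round: banks with a
# liquidity shortfall try to borrow from a random subset of banks with excess
# liquidity; a borrower whose demand cannot be met defaults and is removed.
import random

def interbank_round(liquidity, linkage=0.3, rng=random):
    """liquidity: dict bank_id -> liquid position (negative = shortfall).
    Returns the set of surviving bank ids after the round."""
    lenders = {b: x for b, x in liquidity.items() if x > 0}
    borrowers = {b: -x for b, x in liquidity.items() if x < 0}
    surviving = set(liquidity)
    for b, need in borrowers.items():
        # each borrower contacts a random fraction of the potential lenders
        partners = [l for l in lenders if rng.random() < linkage]
        for l in partners:
            take = min(need, lenders[l])
            lenders[l] -= take
            need -= take
            if need <= 0:
                break
        if need > 0:                  # demand not met: the borrower defaults
            surviving.discard(b)
    return surviving

print(interbank_round({1: 5.0, 2: -3.0, 3: -8.0, 4: 2.0}))
```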

  3. Syzranova N.G., Andruschenko V.A.
    Numerical modeling of physical processes leading to the destruction of meteoroids in the Earth’s atmosphere
    Computer Research and Modeling, 2022, v. 14, no. 4, pp. 835-851

    Within the framework of the topical problem of comet and asteroid hazard, the physical processes causing the destruction and fragmentation of meteor bodies in the Earth’s atmosphere are numerically investigated. Based on the developed physical-mathematical models that describe the motion of space objects of natural origin in the atmosphere and their interaction with it, the falls of three bolides, among the largest and in some respects the most unusual in the history of meteoritics, are considered: Tunguska, Vitim, and Chelyabinsk. Their peculiarity lies in the absence of any meteorite remains or craters in the area of the presumed crash site for the first two bodies, and in the presumed non-detection of the main parent body for the third (the mass of the recovered fragments is too small compared to the estimated mass). The effects of aerodynamic loads and heat fluxes on these bodies are studied; these lead to intensive surface mass loss and possible mechanical destruction. The velocities of the studied celestial bodies and the change in their masses are determined from a modernized system of equations of meteor physics. An important factor taken into account here is the variability of the mass ablation parameter under the action of radiative and convective heat fluxes along the flight path. The fragmentation of meteoroids is considered within the framework of a progressive crushing model based on the statistical theory of strength, taking into account the influence of the scale factor on the ultimate strength of the objects. The phenomena and effects arising at various kinematic and physical parameters of each of these bodies are revealed, in particular, a change in the ballistics of their flight in the denser layers of the atmosphere, consisting in a transition from descent to ascent. In this case, the following scenarios of the event can be realized: 1) the return of the body to outer space if its residual velocity is greater than the second cosmic (escape) velocity; 2) the transition of the body to an Earth satellite orbit if the residual velocity is greater than the first cosmic (orbital) velocity; 3) at lower values of the residual velocity, the return of the body after some time to the descent mode and its landing at a considerable distance from the presumed crash site. It is the realization of one of these three scenarios that explains, for example, the absence of material traces, including craters, in the vicinity of the forest collapse in the case of the Tunguska bolide. Assumptions about the possibility of such scenarios have been made earlier by other authors, and in this paper their realization is confirmed by the results of numerical calculations.
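
    The core of the single-body stage of such modeling is the classical system of meteor-physics equations (drag deceleration, ablation-driven mass loss, descent along an inclined trajectory). The sketch below integrates this standard system with purely illustrative coefficients and a fixed entry angle; the paper’s model additionally varies the ablation parameter along the trajectory and includes fragmentation.

```python
# Sketch of the classical single-body meteor-physics equations with
# illustrative parameter values; not the paper's full model.
import numpy as np
from scipy.integrate import solve_ivp

Cd, Ch, Q = 1.0, 0.1, 8.0e6       # drag coeff., heat-transfer coeff., heat of ablation (J/kg)
rho_b = 3300.0                    # bulk density of the body, kg/m^3
g, H, rho0 = 9.81, 7000.0, 1.225  # gravity, scale height (m), sea-level air density (kg/m^3)
theta = np.radians(20.0)          # entry angle to the horizon (kept fixed here)

def rhs(t, y):
    v, m, h = y                               # velocity, mass, altitude
    rho_a = rho0 * np.exp(-h / H)             # isothermal atmosphere
    S = 1.21 * (m / rho_b) ** (2 / 3)         # midsection area ~ m^(2/3)
    dv = -Cd * rho_a * S * v**2 / (2 * m) + g * np.sin(theta)
    dm = -Ch * rho_a * S * v**3 / (2 * Q)     # ablation (mass entrainment)
    dh = -v * np.sin(theta)
    return [dv, dm, dh]

def hit_ground(t, y): return y[2]             # stop at the surface
def burned_up(t, y):  return y[1] - 1.0       # stop when ~1 kg is left
hit_ground.terminal = burned_up.terminal = True

sol = solve_ivp(rhs, (0, 60), [19e3, 1e7, 100e3],
                events=(hit_ground, burned_up), max_step=0.1)
print(f"final velocity {sol.y[0, -1]:.0f} m/s, "
      f"remaining mass fraction {sol.y[1, -1] / 1e7:.3f}")
```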

  4. Goncharenko V.M., Shapoval A.B.
    Hypergeometric functions in model of General equilibrium of multisector economy with monopolistic competition
    Computer Research and Modeling, 2017, v. 9, no. 5, pp. 825-836

    We show that the basic properties of some models of monopolistic competition are described using families of hypergeometric functions. The results are obtained by building a general equilibrium model of a multisector economy producing a differentiated good in $n$ high-tech sectors, in which single-product firms compete monopolistically using the same technology. The homogeneous (traditional) sector is characterized by perfect competition. Workers are motivated to find a job in the high-tech sectors because wages are higher there; however, they risk remaining unemployed. Unemployment persists in equilibrium because of labor market imperfections. Wages in the high-tech sectors are set by firms as a result of negotiations with employees. It is assumed that individuals are homogeneous consumers with identical preferences given by a separable utility function of general form. In the paper, conditions are found under which the general equilibrium in the model exists and is unique. The conditions are formulated in terms of the elasticity of substitution $\mathfrak{S}$ between varieties of the differentiated good, averaged over all consumers. The equilibrium found is symmetric with respect to the varieties of the differentiated good. The equilibrium variables can be represented as implicit functions whose properties are associated with the elasticity $\mathfrak{S}$ introduced by the authors. A complete analytical description of the equilibrium variables is possible for known special cases of the consumers' utility function, for example, power functions, which are however inadequate for describing the response of the economy to changes in market size. To simplify the implicit functions, we introduce a utility function defined by two one-parameter families of hypergeometric functions. One of the families describes a pro-competitive, and the other an anti-competitive, response of prices to an increase in the size of the economy. Varying the parameter of each family covers all possible values of the elasticity $\mathfrak{S}$. In this sense, the hypergeometric functions exhaust the natural utility functions. It is established that as the elasticity of substitution between the varieties of the differentiated good increases, the difference between the high-tech and homogeneous sectors is erased. It is shown that when the economy is large, individuals in equilibrium consume a small amount of each product, as in the case of power preferences. This fact allows the hypergeometric functions to be approximated by a sum of power functions in a neighborhood of the equilibrium values of the argument. Thus, replacing power utility functions with hypergeometric ones approximated by a sum of two power functions, on the one hand, retains the full ability to tune parameters and, on the other hand, makes it possible to describe the effects of changing the size of the sectors of the economy.
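
    As a purely illustrative aside, a one-parameter family of candidate utility functions can be built directly from the Gauss hypergeometric function and its elasticity behavior examined numerically, as in the sketch below. The specific families and the averaged elasticity $\mathfrak{S}$ used in the paper are defined there; the utility $u(x) = x\,{}_2F_1(a, b; c; -x)$ and the parameter values below are assumptions made only for the demonstration.

```python
# Illustrative sketch: a one-parameter family of utilities built from the
# Gauss hypergeometric function 2F1, with the "relative love for variety"
# measure r_u(x) = -x u''(x)/u'(x) evaluated by finite differences.
# Neither the family nor the measure is claimed to be the paper's construction.
from scipy.special import hyp2f1

def u(x, a, b=1.0, c=2.0):
    """Illustrative utility u(x) = x * 2F1(a, b; c; -x)."""
    return x * hyp2f1(a, b, c, -x)

def r_u(x, a, h=1e-5):
    """-x u''(x) / u'(x), computed with central differences."""
    up  = (u(x + h, a) - u(x - h, a)) / (2 * h)
    upp = (u(x + h, a) - 2 * u(x, a) + u(x - h, a)) / h**2
    return -x * upp / up

for a in (0.5, 1.0, 1.5):
    print(a, r_u(0.1, a))   # how the elasticity measure varies across the family
```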

  5. Makarov I.S., Bagantsova E.R., Iashin P.A., Kovaleva M.D., Gorbachev R.A.
    Development of and research on an algorithm for distinguishing features in Twitter publications for a classification problem with known markup
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 171-183

    Social media posts play an important role in reflecting the state of the financial market, and their analysis is a powerful tool for trading. The article describes the results of a study of the impact of social media activity on the movement of the financial market. The top authoritative influencers are selected, and their Twitter posts are used as data. Such texts usually include slang and abbreviations, so methods for preparing the primary text data, including Stanza and regular expressions, are presented. Two approaches to representing a point in time in the format of text data are considered, and the difference between the influence of a single tweet and of a whole package of tweets collected over a certain period of time is investigated. A statistical approach in the form of frequency analysis is also considered, and metrics are introduced that quantify the significance of a particular word when identifying the relationship between price changes and Twitter posts. Frequency analysis involves studying the occurrence distributions of various words and bigrams in the text for positive, negative, or general trends. To build the markup, changes in the market are processed into a binary vector using various parameters, thus setting up a binary classification task. The parameters of Binance candlesticks are tuned for a better description of the movement of the cryptocurrency market; their variability is also explored in this article. Sentiment is studied using Stanford Core NLP. The results of the statistical analysis are relevant to feature selection for further binary or multiclass classification tasks. The presented methods of text analysis help increase the accuracy of models designed to solve natural language processing problems by selecting words and improving the quality of vectorization. Such algorithms are often used in automated trading strategies to predict the price of an asset and the trend of its movement.
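
    A minimal sketch of the kind of preprocessing pipeline mentioned above (regular expressions followed by Stanza lemmatization) and of a simple word-frequency count is given below. The cleaning rules, the example tweets, and the counting are assumptions for illustration; the paper's exact rules and significance metrics are not reproduced.

```python
# Sketch of tweet cleaning with regular expressions, lemmatization with
# Stanza, and a frequency count of the resulting lemmas.
import re
from collections import Counter

import stanza

# stanza.download("en")  # run once to fetch the English models
nlp = stanza.Pipeline(lang="en", processors="tokenize,pos,lemma", verbose=False)

def clean(tweet):
    tweet = re.sub(r"http\S+|@\w+|#", " ", tweet)       # drop URLs, mentions, '#'
    tweet = re.sub(r"[^A-Za-z\s]", " ", tweet).lower()  # keep letters only
    return re.sub(r"\s+", " ", tweet).strip()

def lemmas(tweet):
    doc = nlp(clean(tweet))
    return [w.lemma for s in doc.sentences for w in s.words]

tweets = ["BTC to the moon!!! https://t.co/xyz", "#bitcoin dumping again @sometrader"]
freq = Counter(lemma for t in tweets for lemma in lemmas(t))
print(freq.most_common(5))
```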

  6. Tsybulin V.G., Khosaeva Z.K.
    Mathematical model of political differentiation under social tension
    Computer Research and Modeling, 2019, v. 11, no. 5, pp. 999-1012

    We consider a model of the dynamics of a political system with several parties, accompanied and controlled by the growth of social tension. A system of nonlinear ordinary differential equations is proposed for the party fractions and an additional scalar variable characterizing the magnitude of tension in society. The rate of change of each party's fraction is proportional to its current value multiplied by a coefficient that consists of an influx of newcomers, a flow from competing parties, and a loss due to the growth of social tension. The change in tension is made up of contributions from the parties and its own relaxation. The number of parties is fixed; the model contains no mechanisms for merging existing parties or creating new ones.

    To study the possible scenarios of the dynamic processes in the model, we develop an approach based on selecting conditions under which the problem belongs to the class of cosymmetric systems. The existence of cosymmetry for a system of differential equations is ensured by additional constraints on the parameters, and in this case the emergence of continuous families of stationary and nonstationary solutions is possible. To analyze the scenarios of cosymmetry breaking, an approach based on a selection function is applied. In the case of one political party there is no multistability: one stable solution corresponds to each set of parameters. For the case of two parties, it is shown that the system under consideration may have two families of equilibria as well as a family of limit cycles. The results of numerical experiments are presented, demonstrating the destruction of these families and the realization of various scenarios leading either to the stabilization of the political system with the coexistence of both parties or to the disappearance of one of the parties, when part of the population ceases to support it and becomes indifferent.

    This model can be used to predict the inter-party struggle during an election campaign. In this case, it is necessary to take into account the time dependence of the coefficients of the system.
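
    A schematic two-party version of such a system is sketched below: party fractions grow with an influx of newcomers, exchange members, and lose support as tension grows, while tension is fed by the parties and relaxes on its own. The functional form and all coefficient values are illustrative assumptions, not the equations or parameters of the paper.

```python
# Schematic two-party dynamics with a social-tension variable, integrated
# with scipy; coefficients are illustrative only.
from scipy.integrate import solve_ivp

a1, a2 = 0.30, 0.25      # influx of newcomers to each party
c12, c21 = 0.05, 0.04    # flows between the competing parties
b1, b2 = 0.50, 0.45      # losses due to the growth of social tension
d1, d2 = 0.40, 0.40      # contributions of the parties to the tension
relax = 0.50             # own relaxation rate of the tension

def rhs(t, y):
    p1, p2, T = y
    dp1 = p1 * (a1 + c12 * p2 - c21 * p1 - b1 * T)
    dp2 = p2 * (a2 + c21 * p1 - c12 * p2 - b2 * T)
    dT = d1 * p1 + d2 * p2 - relax * T
    return [dp1, dp2, dT]

sol = solve_ivp(rhs, (0, 200), [0.20, 0.15, 0.05], max_step=0.5)
print(sol.y[:, -1])   # long-run party fractions and tension level
```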

  7. Matveev A.V.
    Modeling the kinetics of radiopharmaceuticals with iodine isotopes in nuclear medicine problems
    Computer Research and Modeling, 2020, v. 12, no. 4, pp. 883-905

    Radiopharmaceuticals with iodine radioisotopes are now widely used in imaging and non-imaging methods of nuclear medicine. When evaluating the results of radionuclide studies of the structural and functional state of organs and tissues, parallel modeling of the kinetics of radiopharmaceuticals in the body plays an important role. The difficulty of such modeling lies between two opposite extremes. On the one hand, excessive simplification of the anatomical and physiological characteristics of the organism when splitting it into compartments may result in the loss or distortion of information important for clinical diagnosis; on the other hand, excessive detail, attempting to account for all possible interdependencies in the functioning of organs and systems, leads to an excess of data that are useless for clinical interpretation, or makes the mathematical model intractable. Our work develops a unified approach to constructing mathematical models of the kinetics of radiopharmaceuticals with iodine isotopes in the human body during diagnostic and therapeutic procedures of nuclear medicine. Based on this approach, three- and four-compartment pharmacokinetic models were developed, and corresponding calculation programs were created in the C++ programming language for processing and evaluating the results of radionuclide diagnostics and therapy. Various methods for identifying the model parameters from quantitative data of radionuclide studies of the functional state of vital organs are proposed. The results of pharmacokinetic modeling for radionuclide diagnostics of the liver, kidneys, and thyroid using iodine-containing radiopharmaceuticals are presented and analyzed. Using clinical and diagnostic data, individual pharmacokinetic parameters of the transport of different radiopharmaceuticals in the body (transport constants, half-life periods, maximum activity in the organ and the time of its attainment) were determined. It is shown that the pharmacokinetic characteristics are strictly individual for each patient and cannot be described by averaged kinetic parameters. Within the framework of the three pharmacokinetic models, “Activity–time” relationships were obtained and analyzed for different organs and tissues, including tissues in which the activity of a radiopharmaceutical is impossible or difficult to measure by clinical methods. The features and results of simulation and dosimetric planning of radioiodine therapy of the thyroid gland are also discussed. It is shown that the values of absorbed radiation doses are very sensitive to the kinetic parameters of the compartment model; therefore, special attention should be paid to obtaining accurate quantitative data from ultrasound and thyroid radiometry and to identifying the simulation parameters based on them. The work is based on the principles and methods of pharmacokinetics. For the numerical solution of the systems of differential equations of the pharmacokinetic models, we used Runge–Kutta methods and the Rosenbrock method. The Hooke–Jeeves method was used to minimize a function of several variables when identifying the modeling parameters.
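
    The compartment structure and the parameter-identification step described above can be illustrated with the following minimal sketch of a three-compartment chain (for instance blood, organ, excretion) with radioactive decay, whose two transport constants are fitted to a handful of measured relative-activity points. The compartment layout, the constants, the data, and the use of Nelder–Mead as a stand-in for Hooke–Jeeves are all illustrative assumptions; they are not the models, software (C++), or clinical data of the paper.

```python
# Three-compartment kinetic sketch with I-131 decay and identification of
# the transport constants from hypothetical "activity-time" measurements.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

lam = np.log(2) / (8.02 * 24 * 3600)    # physical decay constant of I-131, 1/s

def model(t, y, k12, k23):
    a1, a2, a3 = y                       # relative activity: blood, organ, excreted
    return [-(k12 + lam) * a1,
            k12 * a1 - (k23 + lam) * a2,
            k23 * a2 - lam * a3]

def organ_activity(k, t_points):
    k12, k23 = np.abs(k)                 # keep the rate constants positive
    sol = solve_ivp(model, (0.0, t_points[-1]), [1.0, 0.0, 0.0],
                    t_eval=t_points, args=(k12, k23), method="LSODA")
    return sol.y[1]

t_obs = np.array([600.0, 1800.0, 3600.0, 7200.0, 14400.0])   # s (hypothetical)
a_obs = np.array([0.25, 0.48, 0.55, 0.44, 0.25])             # relative organ activity

loss = lambda k: np.sum((organ_activity(k, t_obs) - a_obs) ** 2)
fit = minimize(loss, x0=[1e-3, 1e-4], method="Nelder-Mead")  # stand-in for Hooke-Jeeves
print(np.abs(fit.x))   # identified transport constants k12, k23 (1/s)
```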

  8. Rusyak I.G., Tenenev V.A.
    Modeling of ballistics of an artillery shot taking into account the spatial distribution of parameters and backpressure
    Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1123-1147

    The paper provides a comparative analysis of the results obtained by various approaches to modeling the process of an artillery shot. To this end, the main problem of internal ballistics and its particular case, the Lagrange problem, are formulated in averaged parameters, where, within the assumptions of the thermodynamic approach, the distribution of pressure and gas velocity over the space behind the projectile is taken into account for the first time for a channel of variable cross section. The Lagrange problem is also stated within the gas-dynamic approach, taking into account spatial (one-dimensional and two-dimensional axisymmetric) changes in the characteristics of the ballistic process. The control volume method is used to numerically solve the system of Euler gas-dynamic equations. Gas parameters at the boundaries of the control volumes are determined using a self-similar solution to the Riemann problem. Based on the Godunov method, a modification of the Osher scheme is proposed, which makes it possible to implement a numerical algorithm of second-order accuracy in space and time. The solutions obtained in the framework of the thermodynamic and gas-dynamic approaches are compared for various loading parameters. The effect of projectile mass and chamber broadening on the distribution of the ballistic parameters of the shot and the dynamics of the projectile motion is studied. It is shown that the thermodynamic approach, in comparison with the gas-dynamic approach, leads to a systematic overestimation of the estimated muzzle velocity of the projectile over the entire range of parameters studied, with the difference in muzzle velocity reaching 35%. At the same time, the discrepancy between the results obtained with the one-dimensional and two-dimensional gas-dynamic models of the shot in the same range of parameters does not exceed 1.3%.

    A spatial gas-dynamic formulation of the backpressure problem is given, which describes the change in pressure in front of an accelerating projectile as it moves along the barrel channel. It is shown that accounting for the shape of the projectile’s head, which is possible in the two-dimensional axisymmetric formulation of the problem, leads to a significant difference in the pressure field behind the shock-wave front compared with the solution of the one-dimensional formulation, in which the projectile’s head cannot be taken into account. It is concluded that this can significantly affect the results of modeling the ballistics of a shot at high projectile velocities.
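
    For orientation, the averaged (thermodynamic) statement of the Lagrange problem can be written as a compact sketch: instantaneous combustion, adiabatic expansion of the propellant gas, uniform pressure, the classical q + w/3 correction for the co-accelerated gas, and a constant backpressure ahead of the projectile. All numbers are illustrative assumptions; the paper’s formulations (variable cross section, full gas dynamics, a travelling backpressure wave) are far more detailed.

```python
# Compact sketch of the Lagrange problem in averaged (thermodynamic) variables
# with a constant backpressure ahead of the projectile; values are illustrative.
from scipy.integrate import solve_ivp

p0, gamma = 300e6, 1.25    # initial gas pressure (Pa), adiabatic exponent
S, L = 0.008, 4.0          # bore cross section (m^2), projectile travel (m)
W0 = 0.006                 # initial (chamber) gas volume, m^3
q, w = 15.0, 5.0           # projectile mass and propellant gas mass, kg
p_back = 1.0e5             # backpressure ahead of the projectile, Pa

def rhs(t, y):
    x, v = y
    p = p0 * (W0 / (W0 + S * x)) ** gamma     # adiabatic expansion of the gas
    dv = (p - p_back) * S / (q + w / 3.0)     # Lagrange correction q + w/3
    return [v, dv]

def muzzle(t, y):                 # stop when the projectile leaves the barrel
    return y[0] - L
muzzle.terminal = True

sol = solve_ivp(rhs, (0.0, 0.1), [0.0, 0.0], events=muzzle, max_step=1e-5)
print(f"muzzle velocity ~ {sol.y[1, -1]:.0f} m/s")
```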

  9. Elaraby A.E., Nechaevskiy A.V.
    An effective segmentation approach for liver computed tomography scans using fuzzy exponential entropy
    Computer Research and Modeling, 2021, v. 13, no. 1, pp. 195-202

    Accurate segmentation of the liver plays an important role in contouring during diagnosis and treatment planning. Image analysis and processing are widely used in medical diagnostics and therapeutic applications. Liver segmentation refers to the process of automatic or semi-automatic detection of liver boundaries in an image. A major difficulty in liver image segmentation is high variability, since human anatomy itself exhibits major modes of variation. In this paper, an approach to computed tomography (CT) liver segmentation is proposed that combines exponential entropy and fuzzy c-partition. The entropy concept has been utilized in various applications in the image computing domain. Entropy-based thresholding techniques have attracted considerable attention in the image analysis and processing literature in recent years and are among the most powerful techniques in image segmentation. In the proposed approach, the liver CT image is transformed into the fuzzy domain, and fuzzy entropies are defined for the image object and background. In the threshold selection procedure, the proposed approach considers not only the information of the image background and object but also the interactions between them, since the threshold is selected by finding a combination of membership-function parameters that maximizes the total fuzzy exponential entropy. The Differential Evolution (DE) algorithm is used to optimize the exponential entropy measure and obtain the image thresholds. Experiments on different liver CT scans were performed, and the results demonstrate the efficiency of the proposed approach. Based on the visual clarity of images segmented with various threshold values using the proposed approach, it was observed that the visual quality of the segmented liver images is better at higher threshold levels.
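
    The optimization step can be sketched with SciPy's differential evolution searching for the two parameters of a ramp-type membership function that maximize a fuzzy exponential entropy computed from the gray-level histogram; the threshold is then taken at the membership crossover point. The membership and entropy forms below are common textbook choices (exponential entropy of the fuzzy two-partition) and are assumptions; they need not coincide with the paper's exact definitions.

```python
# Sketch: Differential Evolution maximizes a fuzzy exponential entropy over
# the parameters (a, c) of a ramp membership; threshold = membership crossover.
import numpy as np
from scipy.optimize import differential_evolution

def fuzzy_exponential_entropy(params, hist):
    a, c = sorted(params)
    g = np.arange(256, dtype=float)
    mu_obj = np.clip((g - a) / (c - a + 1e-9), 0.0, 1.0)   # ramp membership to "object"
    p = hist / hist.sum()
    total = 0.0
    for mu in (mu_obj, 1.0 - mu_obj):                      # object and background
        P = (p * mu).sum() + 1e-12                         # fuzzy class probability
        q = p * mu / P                                     # within-class distribution
        total += (q * np.exp(1.0 - q)).sum()               # exponential entropy term
    return total

def fuzzy_threshold(image):
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    res = differential_evolution(lambda prm: -fuzzy_exponential_entropy(prm, hist),
                                 bounds=[(0, 255), (0, 255)], seed=0)
    a, c = sorted(res.x)
    return (a + c) / 2.0                                   # crossover point mu = 0.5

rng = np.random.default_rng(0)
fake_ct = np.clip(np.concatenate([rng.normal(70, 10, 5000),
                                  rng.normal(150, 15, 5000)]), 0, 255)
print(fuzzy_threshold(fake_ct))
```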

  10. Makarov I.S., Bagantsova E.R., Iashin P.A., Kovaleva M.D., Gorbachev R.A.
    Development of and research on machine learning algorithms for solving the classification problem in Twitter publications
    Computer Research and Modeling, 2023, v. 15, no. 1, pp. 185-195

    Posts on social networks can both predict the movement of the financial market and, in some cases, even determine its direction. The analysis of posts on Twitter contributes to the prediction of cryptocurrency prices. The specificity of the community is reflected in a special vocabulary: slang expressions and abbreviations are used in posts, and their presence makes it difficult to vectorize the text data; therefore, preprocessing methods such as Stanza lemmatization and the use of regular expressions are considered. This paper describes simple machine learning models that can work despite problems such as a lack of data and a short prediction timeframe. In solving the binary classification problem, a word is treated as an element of the binary vector representing a data unit. Basic words are determined by frequency analysis of word mentions. The markup is based on Binance candlesticks with variable parameters for a more accurate description of the trend of price changes. The paper introduces metrics that reflect the distribution of words depending on whether they belong to the positive or negative class. To solve the classification problem, we used a dense model with parameters selected by Keras Tuner, logistic regression, a random forest classifier, a naive Bayesian classifier capable of working with a small sample (which is very important for our task), and the k-nearest neighbors method. The constructed models were compared on the accuracy of the predicted labels. During the investigation we found that the best approach is to use models that predict the price movement of a single coin; our model deals with posts that mention the LUNA project, which no longer exists. This approach to binary classification of text data is widely used to predict the price of an asset and the trend of its movement, which is often exploited in automated trading.
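
    The comparison of classical classifiers described above can be sketched with scikit-learn stand-ins on a dummy binary bag-of-words matrix, as below; the Keras dense model and the Keras Tuner search are omitted, and the synthetic data are placeholders, not the paper's LUNA tweet set.

```python
# Sketch of comparing several classifiers on binary word-presence vectors
# using accuracy of the predicted labels as the metric.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import BernoulliNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(400, 50))        # binary word-presence vectors
y = (X[:, :5].sum(axis=1) > 2).astype(int)    # synthetic "price up/down" labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "logreg": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "naive_bayes": BernoulliNB(),             # works well with small samples
    "knn": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```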
