Search results for 'sequence analysis':
Articles found: 21
  1. Grachev V.A., Nayshtut Yu.S.
    Relaxation oscillations and buckling of thin shells
    Computer Research and Modeling, 2020, v. 12, no. 4, pp. 807-820

    The paper reviews possibilities to predict buckling of thin cylindrical shells with non-destructive techniques during operation. It studies shallow shells made of high-strength materials. Such structures are known for surface displacements exceeding the thickness of the elements. In the explored shells, relaxation oscillations of significant amplitude can be generated even under relatively low internal stresses. The oscillation of the cylindrical shell is modeled, mechanically and mathematically, in a simplified form by reduction to an ordinary differential equation. The model draws on the work of many authors who studied the geometry of the surface formed after buckling (postbuckling behavior). The nonlinear ordinary differential equation for the oscillating shell matches the well-known Duffing equation. It is important that a small parameter multiplies the second time derivative in this Duffing equation. The latter circumstance enables a detailed analysis of the obtained equation and a description of the physical phenomena — relaxation oscillations — that are unique to thin high-strength shells.
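
    The equation the abstract refers to can be written, in one common normalization, as a Duffing equation with a small parameter before the second time derivative (the symbols here are illustrative, not taken from the paper):

    ```latex
    \varepsilon\,\ddot{x} + \delta\,\dot{x} + \alpha x + \beta x^{3} = F\cos(\omega t),
    \qquad 0 < \varepsilon \ll 1 .
    ```

    As $\varepsilon \to 0$ the highest derivative is singularly perturbed, which is the standard mechanism producing fast–slow (relaxation) oscillations.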

    It is shown that harmonic oscillations of the shell around the equilibrium position and stable relaxation oscillations are defined by the bifurcation point of the solutions to the Duffing equation. This is the first point in the Feigenbaum sequence to convert the stable periodic motions into dynamic chaos. The amplitude and the period of relaxation oscillations are calculated based on the physical properties and the level of internal stresses within the shell. Two cases of loading are reviewed: compression along generating elements and external pressure.

    It is highlighted that if external forces vary in time according to the harmonic law, the periodic oscillation of the shell (nonlinear resonance) is a combination of slow and stick-slip movements. Since the amplitude and the frequency of the oscillations are known, this fact enables proposing an experimental facility for prediction of the shell buckling with non-destructive techniques. The following requirement is set as a safety factor: maximum load combinations must not cause displacements exceeding specified limits. Based on the results of the experimental measurements a formula is obtained to estimate safety against buckling (safety factor) of the structure.

  2. Lyubushin A.A., Rodionov E.A.
    Analysis of predictive properties of ground tremor using Huang decomposition
    Computer Research and Modeling, 2024, v. 16, no. 4, pp. 939-958

    A method is proposed for analyzing the tremor of the earth’s surface, measured by means of space geodesy, in order to highlight prognostic effects of seismicity activation. The method is illustrated by the example of a joint analysis of a set of synchronous time series of daily vertical displacements of the earth’s surface on the Japanese Islands for the time interval 2009–2023. The analysis is based on dividing the source data (1047 time series) into blocks (clusters of stations) and sequentially applying the principal component method. The station network is divided into clusters using the K-means method, with the number of clusters chosen by the maximum pseudo-F-statistic criterion; for Japan the optimal number of clusters was found to be 15. The Huang method of decomposition into a sequence of independent empirical oscillation modes (EMD — Empirical Mode Decomposition) is applied to the time series of principal components from the station blocks. To ensure the stability of the estimates of the EMD waveforms, averaging over 1000 independent additive realizations of white noise of limited amplitude was performed. Using the Cholesky decomposition of the covariance matrix of the waveforms of the first three EMD components in a sliding time window, indicators of anomalous tremor behavior were determined. By calculating the correlation function between the average indicators of anomalous behavior and the released seismic energy in the vicinity of the Japanese Islands, it was established that bursts in the measure of anomalous tremor behavior precede releases of seismic energy. The purpose of the article is to clarify the common hypothesis that movements of the earth’s crust recorded by space geodesy may contain predictive information. That displacements recorded by geodetic methods respond to the effects of earthquakes is widely known and has been demonstrated many times, but isolating geodetic effects that predict seismic events is much more challenging. In our paper, we propose one method for detecting predictive effects in space geodesy data.
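
    The cluster-count selection step described above can be sketched as follows. This is a generic illustration (plain K-means with deterministic farthest-first seeding and the Calinski–Harabasz pseudo-F statistic), not the authors' exact implementation:

    ```python
    import numpy as np

    def init_centers(X, k):
        """Deterministic farthest-first seeding (a simple k-means++ stand-in)."""
        centers = [X[0]]
        for _ in range(k - 1):
            dist = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
            centers.append(X[dist.argmax()])
        return np.array(centers)

    def kmeans(X, k, n_iter=100):
        """Plain Lloyd iterations; returns cluster labels and centers."""
        centers = init_centers(X, k)
        for _ in range(n_iter):
            d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = d.argmin(axis=1)
            new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])
            if np.allclose(new, centers):
                break
            centers = new
        return labels, centers

    def pseudo_f(X, labels, centers):
        """Calinski-Harabasz pseudo-F: between- over within-cluster dispersion."""
        n, k = len(X), len(centers)
        overall = X.mean(axis=0)
        bss = sum((labels == j).sum() * ((centers[j] - overall) ** 2).sum()
                  for j in range(k))
        wss = sum(((X[labels == j] - centers[j]) ** 2).sum() for j in range(k))
        return (bss / (k - 1)) / (wss / (n - k))

    def best_k(X, k_range):
        """Choose the number of clusters that maximizes the pseudo-F statistic."""
        scores = {k: pseudo_f(X, *kmeans(X, k)) for k in k_range}
        return max(scores, key=scores.get)
    ```

    On well-separated station coordinates, `best_k` recovers the visually obvious number of clusters; for real geodetic networks the pseudo-F curve is flatter and the maximum is less clear-cut.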

  3. Varshavskiy A.E.
    A model for analyzing income inequality based on a finite functional sequence (adequacy and application problems)
    Computer Research and Modeling, 2022, v. 14, no. 3, pp. 675-689

    The paper considers the adequacy of a model, developed earlier by the author for the analysis of income inequality, based on the empirically confirmed hypothesis that the incomes of 20% population groups, taken relative to the income of the richest group, can be represented as a finite functional sequence, each member of which depends on one parameter — a specially defined indicator of inequality. It is shown that, in addition to the existing methods of inequality analysis, the model makes it possible to estimate with analytical expressions the income shares of 20%, 10% and smaller groups of the population for different levels of inequality, to identify how these shares change as inequality grows, to estimate the level of inequality for known ratios between the incomes of different groups of the population, etc.
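
    The abstract does not reproduce the actual functional sequence, so the following is a hypothetical stand-in that only illustrates the idea of a one-parameter family of group shares: relative income of group i is taken as q**i, with q in (0, 1] playing the role of the inequality indicator (both the functional form and the symbol q are assumptions, not the author's model):

    ```python
    def quintile_shares(q):
        """Hypothetical one-parameter model (not the author's exact sequence):
        income of each 20% group relative to the richest group is q**i,
        i = 0 (richest) .. 4 (poorest); smaller q means higher inequality."""
        rel = [q ** i for i in range(5)]
        total = sum(rel)
        return [r / total for r in rel]  # shares of total income, richest first
    ```

    With q = 1 every quintile gets the equal share 0.2; as q decreases, the share of the richest group grows, which mimics the behavior the model is used to analyze.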

    The paper provides a more detailed confirmation of the proposed model adequacy in comparison with the previously obtained results of statistical analysis of empirical data on the distribution of income between the 20% and 10% population groups. It is based on the analysis of certain ratios between the values of quintiles and deciles according to the proposed model. The verification of these ratios was carried out using a set of data for a large number of countries and the estimates obtained confirm the sufficiently high accuracy of the model.

    Data are presented that confirm the possibility of using the model to analyze the dependence of income distribution by population groups on the level of inequality, as well as to estimate the inequality indicator from income ratios between different groups. The considered variants include the cases when the income of the richest 20% equals the income of the poorest 60%, of the 40% middle class, or of the remaining 80% of the population; when the income of the richest 10% equals the income of the poorest 40%, 50% or 60%, or of various middle class groups; when the distribution of income obeys harmonic proportions; and when the quintiles and deciles corresponding to the middle class reach a maximum. It is shown that the income shares of the richest middle class groups are relatively stable and reach a maximum at certain levels of inequality.

    The results obtained with the help of the model can be used to determine standards for developing a policy of gradually increasing progressive taxation in order to move to the level of inequality typical of countries with a socially oriented economy.

  4. Grinevich A.A., Ryasik A.A., Yakushevich L.V.
    The dynamics of polynucleotide chain consisting of two different homogeneous sequences, divided by interface
    Computer Research and Modeling, 2013, v. 5, no. 2, pp. 241-253

    To study the dynamics of an inhomogeneous polynucleotide DNA chain, the Y-model without a dissipation term was used. Based on this model, numerical calculations were carried out that show the behaviour of a nonlinear conformational excitation (kink) spreading along an inhomogeneous polynucleotide chain consisting of two different homogeneous nucleotide sequences. The numerical analysis shows three possible behaviours of the kink excitation spreading along the DNA chain. After reaching the interface between two homogeneous sequences consisting of different types of bases, the kink can a) reflect, b) pass the interface with acceleration (increasing its velocity), or c) pass the interface with deceleration (decreasing its velocity).
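
    The Yakushevich (Y) model is closely related to the sine-Gordon equation; for reference, the kink of the homogeneous sine-Gordon equation has the standard form below (the symbols are illustrative: c is the torsional wave speed, v the kink velocity, and they are not taken from the paper):

    ```latex
    \varphi_{tt} - c^{2}\varphi_{zz} + \omega^{2}\sin\varphi = 0,
    \qquad
    \varphi(z,t) = 4\arctan\exp\!\left[\frac{\omega}{c}\,
    \frac{z - vt}{\sqrt{1 - v^{2}/c^{2}}}\right].
    ```

    An inhomogeneity in the chain changes the effective parameters on either side of the interface, which is what makes reflection, acceleration or deceleration of the kink possible.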

    Views (last year): 1. Citations: 3 (RSCI).
  5. Chuvilin K.V.
    An efficient algorithm for ${\mathrm{\LaTeX}}$ documents comparing
    Computer Research and Modeling, 2015, v. 7, no. 2, pp. 329-345

    The problem considered is constructing the differences that arise when editing ${\mathrm{\LaTeX}}$ documents. Each document is represented as a parse tree whose nodes are called tokens. The smallest possible text representation of the document that does not change the syntax tree is constructed. All of the text is split into fragments whose boundaries correspond to tokens. A map of the initial text fragment sequence onto the similar sequence of the edited document corresponding to the minimum distance is built with the Hirschberg algorithm. A map of text characters corresponding to the map of text fragment sequences is constructed. Tokens whose characters are all deleted, all inserted, or all unchanged are selected in the parse trees. The map for the trees formed by the other tokens is built using the Zhang–Shasha algorithm.
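
    The minimum-distance map between fragment sequences rests on the classic edit-distance recurrence. The quadratic-space Wagner–Fischer version below shows the recurrence; Hirschberg's algorithm (used in the paper) computes the same optimal alignment in linear space via divide and conquer:

    ```python
    def edit_distance(a, b):
        """Minimal number of insertions, deletions and substitutions
        turning sequence a into sequence b (Wagner-Fischer DP)."""
        m, n = len(a), len(b)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            d[i][0] = i              # delete all of a[:i]
        for j in range(n + 1):
            d[0][j] = j              # insert all of b[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,        # delete a[i-1]
                              d[i][j - 1] + 1,        # insert b[j-1]
                              d[i - 1][j - 1] + cost) # match / substitute
        return d[m][n]
    ```

    Applied to sequences of text fragments rather than characters, the same recurrence yields the fragment-level map described in the abstract.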

    Views (last year): 2. Citations: 2 (RSCI).
  6. Stepanyan I.V.
    Biomathematical system of the nucleic acids description
    Computer Research and Modeling, 2020, v. 12, no. 2, pp. 417-434

    The article is devoted to the application of various methods of mathematical analysis, the search for patterns, and the study of the composition of nucleotides in DNA sequences at the genomic level. New methods of mathematical biology that made it possible to detect and visualize the hidden ordering of genetic nucleotide sequences located in the chromosomes of cells of living organisms are described. The research was based on the work on algebraic biology of S. V. Petukhov, doctor of physical and mathematical sciences, who first introduced and justified new algebras and hypercomplex numerical systems describing genetic phenomena. This paper describes a new phase in the development of matrix methods in genetics for studying the properties of nucleotide sequences (and their physicochemical parameters), built on the principles of finite geometry. The aim of the study is to demonstrate the capabilities of new algorithms and discuss the discovered properties of genetic DNA and RNA molecules. The study includes three stages: parameterization, scaling, and visualization. Parameterization is the determination of the parameters taken into account, which are based on the structural and physicochemical properties of nucleotides as elementary components of the genome. Scaling plays the role of “focusing” and allows genetic structures to be explored at various scales. Visualization includes the selection of the axes of the coordinate system and the method of visual display. The algorithms presented in this work are put forward as a new toolkit for the development of research software for the analysis of long nucleotide sequences, with the ability to display genomes in parametric spaces of various dimensions. One of the significant results of the study is that new criteria were obtained for classifying the genomes of various living organisms and identifying interspecific relationships.
    The new concept allows the variability of the physicochemical parameters of nucleotide sequences to be assessed visually and numerically. It also relates the parameters of DNA and RNA molecules to fractal geometric mosaics and reveals the ordering and symmetry of polynucleotides, as well as their noise immunity. The results obtained justified the introduction of new terms: “genometry” as a methodology of computational strategies and “genometrica” as the specific parameters of a particular genome or nucleotide sequence. In connection with the results obtained, questions of biosemiotics and of the hierarchical levels of organization of living matter are raised.
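
    The parameterization and scaling stages can be illustrated by a much simpler, classic construction, the purine/pyrimidine "DNA walk". This is only an illustrative stand-in, not the matrix or hypercomplex formalism discussed in the article:

    ```python
    def dna_walk(seq):
        """Parameterization: map a nucleotide string to a cumulative walk,
        purines (A, G) -> +1, pyrimidines (C, T/U) -> -1."""
        step = {'A': 1, 'G': 1, 'C': -1, 'T': -1, 'U': -1}
        walk, y = [], 0
        for base in seq.upper():
            y += step.get(base, 0)   # unknown symbols contribute 0
            walk.append(y)
        return walk

    def rescale(walk, window):
        """Scaling: average the walk over non-overlapping windows,
        'focusing' the view on coarser structure."""
        return [sum(walk[i:i + window]) / window
                for i in range(0, len(walk) - window + 1, window)]
    ```

    Plotting `rescale(dna_walk(genome), w)` for several window sizes w gives a multi-scale visual profile of a sequence, the kind of picture the three-stage pipeline generalizes.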

  7. Temlyakova E.A., Sorokin A.A.
    Detection of promoter and non-promoter E.coli sequences by analysis of their electrostatic profiles
    Computer Research and Modeling, 2015, v. 7, no. 2, pp. 347-359

    The article is devoted to the idea of using physical properties of DNA, instead of the sequence alone, for accurate search and annotation of various prokaryotic genomic regions. In particular, the possibility of using the electrostatic potential distribution around a DNA sequence as a classifier for identification of several functional DNA regions was demonstrated. A number of classification models were built, providing discrimination of promoters from non-promoter regions (random sequences, coding regions and promoter-like sequences) with an accuracy of about 83–85%. The regions most valuable for the discrimination were determined and are expected to play a certain role in the process of DNA recognition by RNA polymerase.

    Views (last year): 3.
  8. Musaev A.A., Grigoriev D.A.
    Extracting knowledge from text messages: overview and state-of-the-art
    Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1291-1315

    In general, solving the information explosion problem can be delegated to systems for automatic processing of digital data. These systems are intended for recognizing, sorting, meaningfully processing and presenting data in formats readable and interpretable by humans. The creation of intelligent knowledge extraction systems that handle unstructured data would be a natural solution in this area. At the same time, the evident progress in these tasks for structured data contrasts with the limited success of unstructured data processing, and, in particular, document processing. Currently, this research area is undergoing active development and investigation. The present paper is a systematic survey of both Russian and international publications that are dedicated to the leading trend in automatic text data processing: Text Mining (TM). We cover the main tasks and notions of TM, as well as its place in the current AI landscape. Furthermore, we analyze the complications that arise during natural language processing (NLP) of texts, which are weakly structured and often provide ambiguous linguistic information. We describe the stages of text data preparation, cleaning, and selecting features which, alongside the data obtained via morphological, syntactic, and semantic analysis, constitute the input for the TM process. This process can be represented as mapping a set of text documents to «knowledge». Using the case of stock trading, we demonstrate the formalization of the problem of making a trade decision based on a set of analytical recommendations. Examples of such mappings are methods of Information Retrieval (IR), text summarization, sentiment analysis, document classification and clustering, etc. The common point of all tasks and techniques of TM is the selection of word forms and their derivatives used to recognize content in NL symbol sequences.
Considering IR as an example, we examine classic types of search, such as searching for word forms, phrases, patterns and concepts. Additionally, we consider the augmentation of patterns with syntactic and semantic information. Next, we provide a general description of all NLP instruments: morphological, syntactic, semantic and pragmatic analysis. Finally, we end the paper with a comparative analysis of modern TM tools which can be helpful for selecting a suitable TM platform based on the user’s needs and skills.
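
    The feature-selection stage described above is commonly implemented with TF-IDF weighting of a bag-of-words representation; a minimal generic sketch (not tied to any specific TM platform from the survey):

    ```python
    import math
    from collections import Counter

    def tf_idf(docs):
        """Turn tokenized documents into TF-IDF feature dictionaries.
        `docs` is a list of token lists; returns one {term: weight}
        dict per document."""
        n = len(docs)
        df = Counter()                  # in how many documents each term occurs
        for doc in docs:
            df.update(set(doc))
        weights = []
        for doc in docs:
            tf = Counter(doc)
            weights.append({t: (c / len(doc)) * math.log(n / df[t])
                            for t, c in tf.items()})
        return weights
    ```

    Terms occurring in every document get zero weight, while rare, document-specific terms are emphasized, which is exactly the property that makes such vectors usable as input for classification, clustering and IR.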

  9. Ignatev N.A., Tuliev U.Y.
    Semantic structuring of text documents based on patterns of natural language entities
    Computer Research and Modeling, 2022, v. 14, no. 5, pp. 1185-1197

    The technology of creating patterns from natural language words (concepts) based on text data in the bag of words model is considered. Patterns are used to reduce the dimension of the original space in the description of documents and search for semantically related words by topic. The process of dimensionality reduction is implemented through the formation of patterns of latent features. The variety of structures of document relations is investigated in order to divide them into themes in the latent space.

    It is assumed that a given set of documents (objects) is divided into two non-overlapping classes, for the analysis of which a common dictionary must be used. The membership of words in the common vocabulary is initially unknown. The objects of the two classes are treated as opposed to each other. Quantitative parameters of this opposition are determined through the stability values of each feature and generalized assessments of objects over non-overlapping sets of features.

    To calculate the stability, the feature values are divided into non-intersecting intervals, the optimal boundaries of which are determined by a special criterion. The maximum stability is achieved under the condition that the boundaries of each interval contain values of one of the two classes.

    The composition of features in sets (patterns of words) is formed from a sequence ordered by stability values. The process of formation of patterns and latent features based on them is implemented according to the rules of hierarchical agglomerative grouping.

    A set of latent features is used for cluster analysis of documents using metric grouping algorithms. The analysis applies the coefficient of content authenticity based on the data on the belonging of documents to classes. The coefficient is a numerical characteristic of the dominance of class representatives in groups.
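
    The abstract defines the coefficient of content authenticity only verbally, as the dominance of class representatives in groups. One plausible reading of that definition (an assumption, the exact formula is not given) is the size-weighted average share of each group's dominant class:

    ```python
    from collections import Counter

    def authenticity(groups):
        """Hypothetical reading of the 'content authenticity' coefficient:
        for each group of class labels, count its dominant class, then
        divide the total dominant count by the total number of objects."""
        total = sum(len(g) for g in groups)
        return sum(Counter(g).most_common(1)[0][1] for g in groups) / total
    ```

    The value is 1.0 when every group is pure and drops toward 1/2 (for two classes) as groups become evenly mixed.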

    To divide documents into topics, it is proposed to use the union of groups in relation to their centers. As patterns for each topic, a sequence of words ordered by frequency of occurrence from a common dictionary is considered.

    The results of a computational experiment on collections of abstracts of scientific dissertations are presented. Sequences of words from the general dictionary on 4 topics are formed.

  10. Nikulin V.N., Odintsova A.S.
    Statistically fair price for the European call options according to the discrete mean/variance model
    Computer Research and Modeling, 2014, v. 6, no. 5, pp. 861-874

    We consider a portfolio with a call option and the corresponding underlying asset under the standard assumption that the stock-market price is a random variable with lognormal distribution. Minimizing the variance (hedging risk) of the portfolio on the maturity date of the call option, we find the fraction of the asset per unit call option. As a direct consequence we derive the statistically fair lookback call option price in explicit form. In contrast to the famous Black–Scholes theory, no portfolio can be regarded as risk-free, because no additional transactions are supposed to be conducted over the life of the contract; however, a sequence of independent portfolios reduces risk to zero asymptotically. This property is illustrated in the experimental section using a dataset of daily stock prices of 37 leading US-based companies for the period from April 2006 to January 2013.
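
    The variance-minimizing fraction can be sketched by Monte-Carlo simulation: for a portfolio long the call payoff and short h units of the asset, Var(C - h S_T) is minimized at h* = Cov(C, S_T) / Var(S_T). The simulation setup and symbols below are illustrative assumptions, not the paper's closed-form derivation:

    ```python
    import math
    import random

    def hedge_fraction(s0, mu, sigma, strike, t, n=100_000, seed=0):
        """Simulate lognormal terminal prices
        S_T = s0 * exp((mu - sigma^2/2) t + sigma sqrt(t) Z),
        take the call payoff C = max(S_T - K, 0), and return the
        regression slope Cov(C, S_T) / Var(S_T), i.e. the asset
        fraction per unit call that minimizes Var(C - h * S_T)."""
        rng = random.Random(seed)
        s, c = [], []
        for _ in range(n):
            st = s0 * math.exp((mu - sigma ** 2 / 2) * t
                               + sigma * math.sqrt(t) * rng.gauss(0, 1))
            s.append(st)
            c.append(max(st - strike, 0.0))
        ms, mc = sum(s) / n, sum(c) / n
        cov = sum((si - ms) * (ci - mc) for si, ci in zip(s, c)) / n
        var = sum((si - ms) ** 2 for si in s) / n
        return cov / var
    ```

    Since the payoff is an increasing function of S_T with slope between 0 and 1, the resulting fraction always lies in (0, 1), playing a role analogous to the option delta.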

    Views (last year): 1.

Indexed in Scopus

Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU

The journal is included in the Russian Science Citation Index

