Search results for 'information':
Articles found: 168
  1. Svetlov K.V., Ivanov S.A.
    Stochastic model of voter dynamics in online media
    Computer Research and Modeling, 2019, v. 11, no. 5, pp. 979-997

    In the present article we explore how the level of approval of a political leader changes under the influence of processes taking place on online platforms (social networks, forums, etc.). The driver of these changes is the interaction of users, through which they exchange opinions with each other and form their position with respect to the political leader. In addition to interpersonal interaction, we consider such factors as information impact, expressed as the creation of an information flow with a given power and polarity (positive or negative in the context of influencing the image of the political leader), as well as the presence of a group of agents (opinion leaders) who either support the leader or, conversely, harm his representation in the media space.

    The mathematical basis of the presented research is the Kirman model, which has its roots in biology and initially found its application in economics. Within this model, each user is in one of two possible states, and a Markov jump process describes transitions between these states. For the problem under consideration, these states are 0 or 1, depending on whether a particular agent is a supporter of the political leader or not. For further analysis, we derive its diffusion approximation, known as the Jacobi process. With the help of the spectral decomposition of the infinitesimal operator of this process, we obtain an analytical representation of the transition probability density.

    Analyzing the probabilities obtained in this way, we can assess the influence of the individual factors of the model: the power and direction of the information flow available to online users and relevant to rating formation, as well as the number of supporters or opponents of the politician. Next, using the eigenfunctions and eigenvalues obtained, we derive expressions for the conditional mathematical expectations of the politician’s rating, which can serve as a basis for forecasts that are important for shaping the strategy of representing a political leader in the online environment.
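    The two-state dynamics described above can be sketched in a few lines of stdlib Python. The switching probability below (an idiosyncratic term plus a herding term proportional to the opposite camp's size) follows the generic Kirman scheme; all parameter values are illustrative, not the article's calibration.

```python
import random

def simulate_kirman(n_agents=100, eps=0.02, herd=0.005, steps=2000, seed=1):
    # Each agent is in state 0 (opponent) or 1 (supporter).  At every step a
    # random agent may switch; the switch probability combines an
    # idiosyncratic part (eps) and a herding part proportional to the number
    # of agents currently in the opposite state.
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n_agents)]
    supporters = sum(state)
    shares = []
    for _ in range(steps):
        i = rng.randrange(n_agents)
        opposite = n_agents - supporters if state[i] == 1 else supporters
        if rng.random() < min(1.0, eps + herd * opposite):
            supporters += 1 - 2 * state[i]
            state[i] = 1 - state[i]
        shares.append(supporters / n_agents)
    return shares

shares = simulate_kirman()
```

The resulting trajectory of the supporter share is the discrete analogue of the process whose diffusion limit the article studies analytically.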

  2. Serkov L.A., Krasnykh S.S.
    Combining the agent approach and the general equilibrium approach to analyze the influence of the shadow sector on the Russian economy
    Computer Research and Modeling, 2020, v. 12, no. 3, pp. 669-684

    This article discusses the influence of the shadow, informal and household sectors on the dynamics of a stochastic model with heterogeneous agents. The study combines the general equilibrium approach, which explains the behavior of demand, supply and prices in an economy with several interacting markets, with a multi-agent approach. The analyzed model describes an economy with aggregate uncertainty and an infinite number of heterogeneous agents (households). The source of heterogeneity is the idiosyncratic income shocks of agents in the legal and shadow sectors of the economy. The analysis uses an algorithm that approximates the dynamics of the distribution function of individual agents’ capital stocks by the dynamics of its first and second moments. The synthesis of the agent approach and the general equilibrium approach is carried out through a computer implementation of the recursive feedback between microagents and the macroenvironment. The behavior of the impulse response functions of the main variables of the model confirms the positive influence of the shadow economy (below a certain limit) on minimizing the rate of decline in economic indicators during recessions, especially for developing economies. The scientific novelty of the study is the combination of a multi-agent approach and a general equilibrium approach for modeling macroeconomic processes at the regional and national levels. Further research prospects may be associated with the use of more detailed general equilibrium models, which make it possible, in particular, to describe the behavior of heterogeneous groups of agents in the entrepreneurial sector of the economy.
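    The moment-tracking idea can be illustrated with a toy simulation: evolve individual capital stocks under idiosyncratic shocks and record the first and second moments of the cross-sectional distribution. The saving rule, shock distribution and all parameter values below are assumptions for illustration, not the article's calibrated model.

```python
import random
import statistics

def capital_moments(n_agents=500, steps=100, save_rate=0.2,
                    depreciation=0.05, shock_sd=0.1, seed=0):
    # Each agent's capital depreciates and receives savings out of a noisy
    # idiosyncratic income; at every step we record the first and second
    # moments of the cross-sectional capital distribution.
    rng = random.Random(seed)
    k = [1.0] * n_agents
    moments = []
    for _ in range(steps):
        k = [max(0.0, (1 - depreciation) * ki
                 + save_rate * (1 + rng.gauss(0, shock_sd)))
             for ki in k]
        first = statistics.fmean(k)
        second = statistics.fmean(ki * ki for ki in k)
        moments.append((first, second))
    return moments

moments = capital_moments()
```

In the article's scheme, a law of motion for these two moments would replace the full distribution in the agents' forecasting rule, closing the micro-macro feedback loop.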

  3. Gesture recognition is an urgent challenge in developing systems of human-machine interfaces. We analyzed machine learning methods for gesture classification based on electromyographic muscle signals to identify the most effective one. Methods such as the naive Bayesian classifier (NBC), logistic regression, decision tree, random forest, gradient boosting, support vector machine (SVM), $k$-nearest neighbor algorithm, and ensembles (NBC and decision tree, NBC and gradient boosting, gradient boosting and decision tree) were considered. Electromyography (EMG) was chosen as the method of obtaining information about gestures. This solution does not require the hand to be in the field of view of a camera and can be used to recognize finger movements. To test the effectiveness of the selected gesture recognition methods, a device was developed for recording the EMG signal, which includes three electrodes and an EMG sensor connected to a microcontroller and a power supply. The following gestures were chosen: clenched fist, “thumb up”, “Victory”, squeezing an index finger and waving a hand from right to left. Accuracy, precision, recall and execution time were used to evaluate the effectiveness of the classifiers. These parameters were calculated for three options for the location of the EMG electrodes on the forearm. According to the test results, the most effective methods are the $k$-nearest neighbors algorithm, random forest and the ensemble of NBC and gradient boosting; the average accuracy of the ensemble for the three electrode positions was 81.55%. The electrode position was also determined at which the machine learning methods achieve the maximum accuracy. In this position, one of the differential electrodes is located at the intersection of the flexor digitorum profundus and flexor pollicis longus, the second — above the flexor digitorum superficialis.
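    As an illustration of one of the compared classifiers, here is a minimal $k$-nearest-neighbors vote on synthetic two-dimensional feature vectors; the cluster centres and gesture labels are invented stand-ins for real multi-channel EMG features.

```python
import math
import random
from collections import Counter

def knn_predict(train, labels, x, k=3):
    # Plain k-nearest-neighbours vote: find the k closest training points
    # (Euclidean distance) and return the most common label among them.
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# Synthetic stand-in for EMG features: two well-separated 2-D clusters,
# one per gesture (the study itself uses features of the recorded EMG signal).
rng = random.Random(42)
train, labels = [], []
for label, (cx, cy) in [("fist", (0.0, 0.0)), ("victory", (3.0, 3.0))]:
    for _ in range(20):
        train.append((cx + rng.gauss(0, 0.5), cy + rng.gauss(0, 0.5)))
        labels.append(label)

pred = knn_predict(train, labels, (2.9, 3.1))
```

Accuracy, precision and recall, as used in the study, would then be computed by running such predictions over a held-out test set.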

  4. Medical image segmentation is one of the most challenging tasks in medical image analysis. It classifies organ or lesion pixels against the background of medical images such as MRI or CT scans, providing critical information about the volumes and shapes of human organs. In scientific imaging, medical imaging is considered one of the most important topics due to the rapid and continuing progress in computerized medical image visualization, advances in analysis approaches, and computer-aided diagnosis. Digital image processing is becoming more important in healthcare due to the growing use of direct digital imaging systems for medical diagnostics, and thanks to medical imaging techniques, image processing approaches are now applicable in medicine. Generally, various transformations are needed to extract image data. Moreover, a digital image can be considered an approximation of a real scene that includes some uncertainty derived from the constraints of the imaging process, and information on the level of uncertainty will influence an expert’s judgment. To address this challenge, we propose a novel framework based on the interval concept, a good tool for dealing with uncertainty. In the proposed approach, medical images are transformed into an interval-valued representation, and entropies are defined for the image object and background. We then determine a threshold for the lower-bound image and for the upper-bound image, and take the mean value as the final output. To demonstrate the effectiveness of the proposed framework, we evaluate it on a synthetic image with ground truth. Experimental results show how the performance of entropy-based threshold segmentation can be enhanced by the proposed approach to overcome ambiguity.
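    A minimal sketch of the interval idea, assuming a Kapur-style entropy threshold and a fixed interval half-width `delta` (both are assumptions of this example; the article's exact entropy definition may differ): threshold the lower- and upper-bound images separately and average the two thresholds.

```python
import math

def kapur_threshold(pixels, levels=256):
    # Entropy-based (Kapur) threshold: choose t maximising the sum of the
    # entropies of the background (grey levels < t) and object (>= t).
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    prob = [h / len(pixels) for h in hist]
    best_t, best_h = 0, -1.0
    for t in range(1, levels):
        p_b = sum(prob[:t])
        p_o = 1.0 - p_b
        if p_b <= 0.0 or p_o <= 0.0:
            continue
        h_b = -sum(p / p_b * math.log(p / p_b) for p in prob[:t] if p > 0)
        h_o = -sum(p / p_o * math.log(p / p_o) for p in prob[t:] if p > 0)
        if h_b + h_o > best_h:
            best_h, best_t = h_b + h_o, t
    return best_t

def interval_threshold(pixels, delta=5, levels=256):
    # Interval-valued sketch: threshold the lower- and upper-bound images
    # separately, then take the mean of the two thresholds.
    lower = [max(0, p - delta) for p in pixels]
    upper = [min(levels - 1, p + delta) for p in pixels]
    return (kapur_threshold(lower, levels) + kapur_threshold(upper, levels)) / 2

# Demo on a synthetic bimodal image: a dark object on a bright background.
pixels = [30] * 50 + [200] * 50
threshold = interval_threshold(pixels)
```

For a clearly bimodal histogram the averaged threshold lands between the two modes, separating object from background as intended.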

  5. Musaev A.A., Grigoriev D.A.
    Extracting knowledge from text messages: overview and state-of-the-art
    Computer Research and Modeling, 2021, v. 13, no. 6, pp. 1291-1315

    In general, solving the information explosion problem can be delegated to systems for automatic processing of digital data. These systems are intended for recognizing, sorting, meaningfully processing and presenting data in formats readable and interpretable by humans. The creation of intelligent knowledge extraction systems that handle unstructured data would be a natural solution in this area. At the same time, the evident progress in these tasks for structured data contrasts with the limited success of unstructured data processing, and, in particular, document processing. Currently, this research area is undergoing active development and investigation. The present paper is a systematic survey of both Russian and international publications that are dedicated to the leading trend in automatic text data processing: Text Mining (TM). We cover the main tasks and notions of TM, as well as its place in the current AI landscape. Furthermore, we analyze the complications that arise during natural language processing (NLP) of texts, which are weakly structured and often carry ambiguous linguistic information. We describe the stages of text data preparation, cleaning, and feature selection which, alongside the data obtained via morphological, syntactic, and semantic analysis, constitute the input for the TM process. This process can be represented as mapping a set of text documents to “knowledge”. Using the case of stock trading, we demonstrate the formalization of the problem of making a trade decision based on a set of analytical recommendations. Examples of such mappings are methods of Information Retrieval (IR), text summarization, sentiment analysis, document classification and clustering, etc. The common point of all tasks and techniques of TM is the selection of word forms and their derivatives used to recognize content in NL symbol sequences.
Considering IR as an example, we examine classic types of search, such as searching for word forms, phrases, patterns and concepts. Additionally, we consider the augmentation of patterns with syntactic and semantic information. Next, we provide a general description of all NLP instruments: morphological, syntactic, semantic and pragmatic analysis. Finally, we end the paper with a comparative analysis of modern TM tools which can be helpful for selecting a suitable TM platform based on the user’s needs and skills.
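    One of the classic IR mappings discussed above, bag-of-words TF-IDF ranking, can be sketched as follows; the scoring formula is the textbook variant, not a method introduced by the survey itself, and the documents are invented examples.

```python
import math
from collections import Counter

def tfidf_rank(docs, query):
    # Score each document against a bag-of-words query with TF-IDF:
    # tf = term count in the document, idf = log(N / document frequency).
    tokenised = [d.lower().split() for d in docs]
    df = Counter()
    for toks in tokenised:
        df.update(set(toks))
    n = len(docs)
    scores = []
    for toks in tokenised:
        tf = Counter(toks)
        scores.append(sum(tf[w] * math.log(n / df[w])
                          for w in query.lower().split() if w in df))
    # Return document indices, best match first.
    return sorted(range(n), key=lambda i: -scores[i])

docs = ["stock prices rise on analyst recommendations",
        "the cat sat on the mat",
        "buy stock now the analyst says"]
ranking = tfidf_rank(docs, "stock analyst")
```

In a stock-trading pipeline of the kind the survey formalizes, such a ranking would select the analytical texts most relevant to a given instrument before deeper NLP stages run.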

  6. Stonyakin F.S., Savchuk O.S., Baran I.V., Alkousa M.S., Titov A.A.
    Analogues of the relative strong convexity condition for relatively smooth problems and adaptive gradient-type methods
    Computer Research and Modeling, 2023, v. 15, no. 2, pp. 413-432

    This paper is devoted to some variants of improving the convergence rate guarantees of gradient-type algorithms for relatively smooth and relatively Lipschitz-continuous problems in the case of additional information about some analogues of the strong convexity of the objective function. We consider two classes of problems, namely, convex problems with a relative functional growth condition, and problems (generally, non-convex) with an analogue of the Polyak–Łojasiewicz gradient dominance condition with respect to Bregman divergence. For the first type of problems, we propose two restart schemes for gradient-type methods and justify theoretical estimates of the convergence of two algorithms with adaptively chosen parameters corresponding to the relative smoothness or Lipschitz property of the objective function. The first of these algorithms is simpler in terms of the per-iteration stopping criterion, but for this algorithm, the near-optimal computational guarantees are justified only on the class of relatively Lipschitz-continuous problems. The restart procedure of the other algorithm, in turn, allowed us to obtain more universal theoretical results. We proved a near-optimal estimate of the complexity on the class of convex relatively Lipschitz-continuous problems with a functional growth condition. We also obtained linear convergence rate guarantees on the class of relatively smooth problems with a functional growth condition. For the class of problems with an analogue of the gradient dominance condition with respect to the Bregman divergence, estimates of the quality of the output solution were obtained using adaptively selected parameters. We also present the results of some computational experiments illustrating the performance of the methods for the second approach at the conclusion of the paper.
As examples, we considered a linear inverse Poisson problem (minimizing the Kullback – Leibler divergence), its regularized version which allows guaranteeing a relative strong convexity of the objective function, as well as an example of a relatively smooth and relatively strongly convex problem. In particular, calculations show that a relatively strongly convex function may not satisfy the relative variant of the gradient dominance condition.
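    The KL-divergence setting mentioned above invites a small illustration: mirror descent with the entropy Bregman divergence on the probability simplex reduces to a multiplicative update followed by renormalisation. The linear objective below is purely illustrative and is not one of the article's experiments.

```python
import math

def mirror_descent_kl(grad, x0, step, iters):
    # Mirror descent with the entropy Bregman divergence (KL) on the
    # probability simplex: the proximal step becomes a multiplicative
    # update x_i <- x_i * exp(-step * g_i), then renormalisation.
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi * math.exp(-step * gi) for xi, gi in zip(x, g)]
        s = sum(x)
        x = [xi / s for xi in x]
    return x

# Illustrative objective: f(x) = <c, x> on the simplex; the minimiser puts
# all mass on the coordinate with the smallest cost.
c = [3.0, 1.0, 2.0]
x = mirror_descent_kl(lambda x: c, [1 / 3] * 3, step=0.5, iters=300)
```

The adaptive step-size and restart machinery of the article sits on top of exactly this kind of Bregman proximal step.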

  7. Vorontsova D.V., Isaeva M.V., Menshikov I.A., Orlov K.Y., Bernadotte A.
    Frequency, time, and spatial electroencephalogram changes after COVID-19 during a simple speech task
    Computer Research and Modeling, 2023, v. 15, no. 3, pp. 691-701

    We found a predominance of α-rhythm patterns in the left hemisphere in healthy people compared to people with a history of COVID-19. Moreover, we observed a significant decrease in the left hemisphere contribution to the speech center area in people who have undergone COVID-19 when performing speech tasks.

    Our findings show that the signal in healthy subjects is more spatially localized and synchronized between hemispheres when performing tasks compared to people who recovered from COVID-19. We also observed a decrease in low frequencies in both hemispheres after COVID-19.

    EEG-patterns of COVID-19 are detectable in an unusual frequency domain. What is usually considered noise in electroencephalographic (EEG) data carries information that can be used to determine whether or not a person has had COVID-19. These patterns can be interpreted as signs of hemispheric desynchronization, premature brain ageing, and more significant brain strain when performing simple tasks compared to people who did not have COVID-19.

    In our work, we have shown the applicability of neural networks in helping to detect the long-term effects of COVID-19 in EEG data. Furthermore, our data, in line with other studies, support the hypothesis of the severity of the long-term effects of COVID-19 detected in the EEG data of EEG-based BCIs. The presented findings on the functional activity of the brain–computer interface make it possible to use machine learning methods on simple, non-invasive brain–computer interfaces to detect post-COVID syndrome and to advance neurorehabilitation.
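    The frequency-domain analysis underlying such findings can be illustrated with a band-power computation; the sketch below uses a plain DFT on a synthetic 10 Hz oscillation. The conventional 8–12 Hz alpha and 13–30 Hz beta bands are assumptions of this example, not quantities reported in the paper.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    # Power in a frequency band via a plain DFT (O(n^2), stdlib only):
    # sum |X_k|^2 / n over the bins whose frequency falls in [f_lo, f_hi].
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        if f_lo <= k * fs / n <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

# Synthetic 10 Hz oscillation, 1 second at 128 Hz: its power should sit in
# the 8-12 Hz alpha band and be absent from the 13-30 Hz beta band.
fs = 128
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
alpha = band_power(sig, fs, 8, 12)
beta = band_power(sig, fs, 13, 30)
```

Comparing such band powers per channel and hemisphere is the basic building block of the α-rhythm lateralization comparisons described above.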

  8. Nechaevskiy A.V., Streltsova O.I., Kulikov K.V., Bashashin M.V., Butenko Y.A., Zuev M.I.
    Development of a computational environment for mathematical modeling of superconducting nanostructures with a magnet
    Computer Research and Modeling, 2023, v. 15, no. 5, pp. 1349-1358

    Nowadays the main research activity in the field of nanotechnology is aimed at the creation, study and application of new materials and new structures. Recently, much attention has been attracted by the possibility of controlling magnetic properties using a superconducting current, as well as by the influence of magnetic dynamics on the current–voltage characteristics of hybrid superconductor/ferromagnet (S/F) nanostructures. In particular, such structures include the S/F/S Josephson junction or molecular nanomagnets coupled to Josephson junctions. Theoretical studies of the dynamics of such structures require solving a large number of coupled nonlinear equations. Numerical modeling of hybrid superconductor/magnet nanostructures implies the calculation of both the magnetic dynamics and the dynamics of the superconducting phase, which strongly increases its complexity and scale, so it is advisable to use heterogeneous computing systems.

    In the course of studying the physical properties of these objects, it becomes necessary to numerically solve complex systems of nonlinear differential equations, which requires significant time and computational resources.

    The currently existing micromagnetic algorithms and frameworks are based on the finite difference or finite element method and are extremely useful for modeling the dynamics of magnetization on a wide time scale. However, the functionality of existing packages does not make it possible to fully implement the desired computation scheme.

    The aim of the research is to develop a unified environment for modeling hybrid superconductor/magnet nanostructures, providing access to solvers and developed algorithms, and based on a heterogeneous computing paradigm that makes it possible to study superconducting elements in nanoscale structures with magnets and hybrid quantum materials. In this paper, we investigate resonant phenomena in a nanomagnet system coupled to a Josephson junction. Such a system has rich resonant physics. To study the possibility of magnetic reversal depending on the model parameters, it is necessary to solve numerically the Cauchy problem for a system of nonlinear equations. For the numerical simulation of hybrid superconductor/magnet nanostructures, a computing environment based on the heterogeneous HybriLIT computing platform is implemented. All reported computation times were averaged over three runs. The results obtained here are of great practical importance and provide the necessary information for evaluating the physical parameters in superconductor/magnet hybrid nanostructures.
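    Solving the Cauchy problem for a system of nonlinear equations, as described above, typically reduces to iterating a one-step integrator. The sketch below applies the classical Runge–Kutta method to a damped driven pendulum, the standard toy analogue of a Josephson phase equation; it is a stand-in for, not a reproduction of, the article's coupled superconductor/magnet model.

```python
import math

def rk4_step(f, t, y, h):
    # One classical fourth-order Runge-Kutta step for the Cauchy problem
    # y' = f(t, y), with y a list of state variables.
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def pendulum(t, y, damping=0.1, drive=0.5):
    # Damped driven pendulum: phi'' = drive - damping * phi' - sin(phi),
    # the textbook analogue of a Josephson junction phase equation.
    phi, v = y
    return [v, drive - damping * v - math.sin(phi)]

t, y, h = 0.0, [0.0, 0.0], 0.01
for _ in range(1000):
    y = rk4_step(pendulum, t, y, h)
    t += h
```

On a heterogeneous platform, many such trajectories (one per parameter point) would be integrated in parallel to map out the magnetic-reversal region.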

  9. Petrov A.P., Podlipskaia O.G., Podlipskii O.K.
    Modeling the dynamics of political positions: network density and the chances of minority
    Computer Research and Modeling, 2024, v. 16, no. 3, pp. 785-796

    In some cases, information warfare results in almost the whole population accepting one of two contesting points of view and rejecting the other. In other cases, however, the “majority party” gets only a small advantage over the “minority party”. The relevant question is which network characteristics of a population help the minority maintain significant numbers. Given that some societies are more connected than others, in the sense that they have a higher density of social ties, this question is specified as follows: how does the density of social ties affect the chances of a minority to maintain a significant number? Does a higher density contribute to a landslide victory of the majority, or to the resistance of the minority? To address this issue, we consider information warfare between two parties, called the Left and the Right, in a population represented as a network whose nodes are individuals and whose connections correspond to their acquaintance and describe mutual influence. At each discrete point in time, each individual decides which party to support based on their attitude, i.e. predisposition to the Left or Right party, taking into account the influence of their network ties. Influence means here that each tie sends a cue with a certain probability to the individual in question in favor of the party that they themselves currently support. If a tie switches their party affiliation, they begin to agitate the individual in question for their “new” party. Such processes create dynamics, i.e. the process of changing the partisanship of individuals. The duration of the warfare is exogenously set, with the final time point roughly associated with the election day. The described model is numerically implemented on a scale-free network. Numerical experiments have been carried out for various values of network density.
Because of the presence of stochastic elements in the model, 200 runs were conducted for each density value, for each of which the final number of supporters of each of the parties was calculated. It is found that with higher density, the chances increase that the winner will cover almost the entire population. Conversely, low network density contributes to the chances of a minority to maintain significant numbers.
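    The network experiment can be sketched with a minimal preferential-attachment (scale-free) graph and the cue-passing rule described above; the majority-of-cues update and all parameter values are simplifying assumptions of this illustration, not the article's exact specification.

```python
import random

def barabasi_albert(n, m, seed=0):
    # Minimal preferential-attachment graph: every new node attaches to m
    # targets chosen (approximately) proportionally to current degree.
    rng = random.Random(seed)
    targets = list(range(m))   # the first new node links to the m seed nodes
    repeated = []              # node list weighted by degree
    edges = set()
    for v in range(m, n):
        for u in set(targets):
            edges.add((min(u, v), max(u, v)))
            repeated.extend([u, v])
        targets = [rng.choice(repeated) for _ in range(m)]
    return edges

def run_dynamics(n=200, m=3, steps=30, cue_prob=0.3, seed=1):
    # Cue-passing sketch: each neighbour sends a cue with probability
    # cue_prob in favour of its own party; a node adopts the majority party
    # among received cues (keeping its party on a tie or no cues).
    rng = random.Random(seed)
    nbrs = {v: [] for v in range(n)}
    for a, b in barabasi_albert(n, m, seed):
        nbrs[a].append(b)
        nbrs[b].append(a)
    party = [rng.choice([-1, 1]) for _ in range(n)]  # -1 Left, +1 Right
    for _ in range(steps):
        cues = [sum(party[u] for u in nbrs[v] if rng.random() < cue_prob)
                for v in range(n)]
        party = [(1 if c > 0 else -1) if c != 0 else p
                 for p, c in zip(party, cues)]
    return party

party = run_dynamics()
```

Repeating such runs across values of `m` (which controls tie density) and tallying the final minority share mirrors the 200-run experiment described above.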

  10. Ansori Moch.F., Al Jasir H., Sihombing A.H., Putra S.M., Nurfaizah D.A., Nurulita E.
    Assessing the impact of deposit benchmark interest rate on banking loan dynamics
    Computer Research and Modeling, 2024, v. 16, no. 4, pp. 1023-1032

    Deposit benchmark interest rates are a policy implemented by banking regulators to calculate the interest rates offered to depositors, maintaining equitable and competitive rates within the financial industry. The benchmark functions as a reference for determining the pricing of different banking products, expenses, and financial choices. The benchmark rate has a direct impact on the amount of money deposited, which in turn determines the amount of money available for lending. We are motivated to analyze the influence of deposit benchmark interest rates on the dynamics of banking loans. This study examines the issue using a difference equation for banking loans. In this process, the decision on the loan amount in the next period is influenced by both the present loan volume and the information on its marginal profit. An analysis is made of the loan equilibrium point and its stability. We also analyze the bifurcations that arise in the model. To ensure a stable banking loan, it is necessary to set the benchmark rate higher than the flip bifurcation value and lower than the transcritical bifurcation value. This result is confirmed by the bifurcation diagram and the associated Lyapunov exponent. An insufficient deposit benchmark interest rate might lead to chaotic dynamics in banking lending. Additionally, a bifurcation diagram with two parameters is also shown. We perform a numerical sensitivity analysis by examining contour plots of the stability requirements, which vary with the deposit benchmark interest rate and other parameters. In addition, we examine a nonstandard difference approach for the previous model, assess its stability, and make a comparison with the standard model. The outcome of our study can provide valuable insights to the banking regulator in making informed decisions regarding deposit benchmark interest rates, taking into account several other banking factors.
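    The loan difference equation can be illustrated with a toy gradient-style adjustment rule: the next loan volume moves in the direction of marginal profit. The quadratic profit form and every parameter value below are assumptions of this sketch, chosen in the stable regime; they are not the article's model or calibration.

```python
import math

def loan_map(L, gamma=0.2, a=3.5, b=1.0, r_d=0.05):
    # One step of the loan adjustment: the bank moves the loan volume in
    # the direction of its marginal profit.  With quadratic profit
    # a*L - b*L^2 - r_d*L, marginal profit is linear in L.
    marginal_profit = a - 2.0 * b * L - r_d
    return L + gamma * L * marginal_profit

def lyapunov_exponent(gamma=0.2, a=3.5, b=1.0, r_d=0.05, L0=0.5,
                      n=2000, burn=200):
    # Numerical Lyapunov exponent of the map: time-average of log|f'(L_t)|.
    # Negative values indicate a stable loan equilibrium, positive ones chaos.
    L, total, count = L0, 0.0, 0
    for t in range(n):
        deriv = 1.0 + gamma * (a - r_d) - 4.0 * gamma * b * L
        if t >= burn:
            total += math.log(abs(deriv) + 1e-300)
            count += 1
        L = loan_map(L, gamma, a, b, r_d)
    return total / count

# Iterating the map from a small initial loan converges to the equilibrium
# L* = (a - r_d) / (2b) in this stable parameter regime.
L = 0.5
for _ in range(500):
    L = loan_map(L)
```

Increasing the adjustment speed `gamma` (or lowering the benchmark-linked term `r_d`) in this toy map pushes the derivative past -1, producing the flip bifurcation and eventually the chaotic regime the article diagnoses via the Lyapunov exponent.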


Indexed in Scopus

Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU

The journal is included in the Russian Science Citation Index

