Using extended ODE systems to investigate the mathematical model of blood coagulation
Computer Research and Modeling, 2022, v. 14, no. 4, pp. 931-951
Many properties of the solutions of systems of ordinary differential equations are determined by the properties of the equations in variations. An ODE system that includes both the original nonlinear system and the equations in variations will be called an extended system below. When studying the properties of the Cauchy problem for systems of ordinary differential equations, the transition to extended systems allows one to study many subtle properties of solutions. For example, it makes it possible to increase the order of approximation of numerical methods, provides approaches to constructing the sensitivity function without numerical differentiation procedures, and allows methods of higher convergence order to be used for solving the inverse problem. The authors used the Broyden method, which belongs to the class of quasi-Newton methods. The Rosenbrock method with complex coefficients was used to solve the stiff systems of ordinary differential equations; in our case, it is equivalent to a second-order approximation method for the extended system.
As an example of the proposed approach, several related mathematical models of the blood coagulation process were considered. Based on an analysis of the numerical results, it was concluded that a description of the factor XI positive feedback loop must be included in the model equations. Estimates of some reaction constants based on the numerical solution of the inverse problem are given.
The effect of factor V release on platelet activation was also considered. The modification of the mathematical model made it possible to achieve quantitative agreement between the dynamics of thrombin production and experimental data for an artificial system. Based on the sensitivity analysis, the hypothesis was tested that the lipid membrane composition (the number of sites for the various clotting factors, except for thrombin sites) has no influence on the dynamics of the process.
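The core device of the abstract, appending the equations in variations to the original ODE, can be illustrated on a toy scalar problem (not the coagulation model itself): for dy/dt = f(y, p), the sensitivity s = dy/dp obeys ds/dt = (df/dy) s + df/dp, and both equations are integrated together as one extended system.

```python
import numpy as np

def extended_rhs(z, p):
    """Extended system: the original ODE together with its equation in variations.
    Original ODE:  dy/dt = f(y, p) = -p * y
    Sensitivity s = dy/dp obeys  ds/dt = (df/dy) * s + df/dp = -p * s - y."""
    y, s = z
    return np.array([-p * y, -p * s - y])

def rk4_solve(p, t_end=2.0, n_steps=2000):
    """Integrate the extended system with the classical 4th-order Runge-Kutta
    scheme; returns [y(T), s(T)], so the sensitivity comes out of the same run."""
    h = t_end / n_steps
    z = np.array([1.0, 0.0])  # y(0) = 1, s(0) = 0 (initial state has no p-dependence)
    for _ in range(n_steps):
        k1 = extended_rhs(z, p)
        k2 = extended_rhs(z + 0.5 * h * k1, p)
        k3 = extended_rhs(z + 0.5 * h * k2, p)
        k4 = extended_rhs(z + h * k3, p)
        z = z + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return z
```

For this toy problem the analytic answers are y(T) = exp(-pT) and s(T) = -T exp(-pT), so the sensitivity function is recovered without any numerical differentiation.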
-
Connection between discrete financial models and continuous models with Wiener and Poisson processes
Computer Research and Modeling, 2023, v. 15, no. 3, pp. 781-795
The paper is devoted to the study of relationships between discrete and continuous models of financial processes and their probabilistic characteristics. First, a connection is established between the price processes of stocks, hedging portfolios and options in models driven by binomial perturbations and in their limit models driven by perturbations of Brownian-motion type. Second, analogies are established between the coefficients of stochastic equations with various random processes, continuous and jump-type, and the coefficients of the corresponding deterministic equations for their probabilistic characteristics. Presenting these connections and analogies required an adequate exposition of preliminary information and results from financial mathematics, as well as descriptions of the related objects of stochastic analysis. The paper therefore presents, in a form accessible to non-specialists in financial mathematics and stochastic analysis, partially new and known results that are important from the point of view of applications. Specifically, the following sections are presented.
• In one- and n-period binomial models, a unified approach is proposed to defining a risk-neutral measure on the probability space, under which the discounted option price becomes a martingale. The resulting martingale formula for the option price is suitable for numerical simulation. In the following sections, the risk-neutral measure approach is applied to study financial processes in continuous-time models.
• In continuous time, models of share prices, hedging portfolios and options are considered in the form of stochastic equations with the Ito integral with respect to Brownian motion and to a compensated Poisson process. The study of the properties of these processes in this section is based on one of the central objects of stochastic analysis, the Ito formula; special attention is given to the methods of its application.
• The famous Black – Scholes formula is presented, which gives a solution to the partial differential equation for the function $v(t, x)$; substituting $x = S(t)$, where $S(t)$ is the stock price at time $t$, gives the price of the option in the model with continuous perturbation by Brownian motion.
• An analogue of the Black – Scholes formula is suggested for the model with jump perturbation by a Poisson process. The derivation of this formula is based on the technique of risk-neutral measures and the independence lemma.
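The two pricing devices named in the bullets, the risk-neutral martingale formula in the binomial model and the Black – Scholes formula, admit a compact textbook sketch (standard constructions, not the authors' notation; u, d and r below are the per-period up factor, down factor and interest rate):

```python
import math

def binomial_call_price(S0, K, u, d, r, n_periods):
    """European call price in the n-period binomial model via the
    risk-neutral measure, under which the discounted price is a martingale."""
    assert d < 1 + r < u, "no-arbitrage condition"
    p = (1 + r - d) / (u - d)  # risk-neutral probability of an up-move
    price = 0.0
    for k in range(n_periods + 1):
        S_T = S0 * u**k * d**(n_periods - k)
        payoff = max(S_T - K, 0.0)
        price += math.comb(n_periods, k) * p**k * (1 - p)**(n_periods - k) * payoff
    return price / (1 + r)**n_periods

def black_scholes_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes price of a European call (continuous model)."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)
```

With Cox – Ross – Rubinstein parameters u = exp(sigma * sqrt(dt)), d = 1/u, the binomial price converges to the Black – Scholes value as the number of periods grows, which is the discrete-to-continuous connection the paper studies.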
-
Changepoint detection in financial data using a deep learning approach
Computer Research and Modeling, 2024, v. 16, no. 2, pp. 555-575
The purpose of this study is to develop a methodology for detecting change points in time series, including financial data. The theoretical basis of the study draws on research devoted to the analysis of structural changes in financial markets, descriptions of the proposed changepoint detection algorithms, and the peculiarities of building classical and deep machine learning models for this class of problems. The development of such tools is of interest to investors and other stakeholders, providing them with additional approaches to the effective analysis of financial markets and interpretation of available data.
To address the research objective, a neural network was trained. Several ways of forming the training sample were considered, differing in the nature of their statistical parameters. To improve training quality and obtain more accurate results, a feature-generation methodology was developed to produce the features that serve as input to the neural network. These features, in turn, were derived from the mathematical expectations and standard deviations of the time series over specific intervals. The potential for combining these features to achieve more stable results was also investigated.
The results of the model experiments were analyzed to compare the effectiveness of the proposed model with other existing changepoint detection algorithms in widespread practical use. A specially generated dataset, developed using proprietary methods, was used as both the training and the testing data. Furthermore, the model, trained on various features, was tested on daily data from the S&P 500 index to assess its effectiveness in a real financial context.
Alongside the description of the principles of the model's operation, possibilities for its further improvement are considered, including modernization of the proposed model's structure, optimization of training-data generation, and feature formation. The authors also aim to advance existing approaches to real-time changepoint detection.
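The feature-generation idea described above (window means and standard deviations over fixed intervals) can be sketched as follows; the window length and layout are assumptions in the spirit of the abstract, not the authors' exact recipe:

```python
import numpy as np

def window_features(series, window):
    """Mean and standard deviation over each sliding window of the series.
    Each row of the result is one feature vector for the neural network."""
    series = np.asarray(series, dtype=float)
    feats = [(series[i - window:i].mean(), series[i - window:i].std())
             for i in range(window, len(series) + 1)]
    return np.array(feats)
```

A sharp jump in the second column (rolling standard deviation) between adjacent windows is exactly the kind of signal a changepoint classifier can be trained on.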
-
On the question of choosing the structure of a multivariate regression model, using the analysis of burnout factors among artists as an example
Computer Research and Modeling, 2021, v. 13, no. 1, pp. 265-274
The article discusses how the goals of a study influence the structure of a multivariate regression model (in particular, the procedure for reducing the dimension of the model). It is shown how bringing the specification of the multiple regression model in line with the research objectives affects the choice of modeling methods. Two schemes for constructing a model are compared: the first does not take into account the typology of the primary predictors and the nature of their influence on the outcome characteristics; the second includes a stage of preliminary division of the initial predictors into groups in accordance with the objectives of the study. Using the problem of analyzing the causes of burnout among creative workers as an example, the importance of a stage of qualitative analysis and systematization of the a priori selected factors is shown; this stage is implemented not by computational means but by drawing on the knowledge and experience of specialists in the subject area. The presented approach to determining the specification of the regression model combines formalized mathematical and statistical procedures with a preceding stage of classification of the primary factors. This stage makes it possible to explain the scheme of corrective managerial actions (softening the leadership style and increasing approval reduce the manifestations of anxiety and stress, which in turn reduces the severity of emotional exhaustion among team members). Preliminary classification also avoids combining controllable and uncontrollable, regulating and regulated factors in a single principal component, which could worsen the interpretability of the synthesized predictors.
Using a specific problem as an example, it is shown that the selection of regressor factors is a process requiring an individual solution. In the case under consideration, the following were applied in sequence: systematization of features, correlation analysis, principal component analysis, and regression analysis. The first three methods made it possible to significantly reduce the dimension of the problem without compromising the goal for which the task was posed: meaningful measures of managerial influence on the team were identified that allow the degree of emotional burnout of its members to be reduced.
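The computational part of the pipeline, principal component analysis on a group of predictors followed by regression on the synthesized components, can be sketched as follows (a generic illustration, not the authors' data or code):

```python
import numpy as np

def pca_scores(X, n_components):
    """Project centered predictors onto their leading principal components
    (SVD-based PCA); the scores become the synthesized predictors."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def fit_regression(Z, y):
    """Ordinary least squares of the response on the component scores,
    with an intercept column prepended."""
    A = np.column_stack([np.ones(len(Z)), Z])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

Running PCA separately within each pre-classified group of factors, as the article advocates, keeps controllable and uncontrollable predictors from being mixed into one component.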
-
Theoretical modeling of consensus building in the work of technical committees for standardization with coalitions, based on regular Markov chains
Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1247-1256
Decisions in social groups are often made by consensus. This applies, for example, to the expert examination in a technical committee for standardization (TC) before a national standard is approved by Rosstandart: the standard is approved if and only if consensus is secured in the TC. The same approach to standards development has been adopted in almost all countries and at the regional and international levels. Previously published works of the authors were devoted to constructing a mathematical model of the time to reach consensus in technical committees for standardization as a function of the number of TC members and their level of authoritarianism. The present study continues these works for the case of coalitions, which often form during consideration of a draft standard in the TC. In this article a mathematical model of consensus building in the work of technical committees for standardization under coalitions is constructed. Within the model it is shown that in the presence of coalitions consensus is not achievable. In practice, however, coalitions are usually overcome during the negotiation process; otherwise the number of adopted standards would be extremely small. The paper analyzes the factors that influence the overcoming of coalitions: the size of the concession and the index of coalition influence. On the basis of statistical modeling of regular Markov chains, their effect on the time to reach consensus in the technical committee is investigated. It is shown that the time to reach consensus depends significantly on the size of a coalition's unilateral concession and weakly on the sizes of the coalitions. A regression model of the dependence of the average number of approvals on the size of the concession is built.
It was revealed that even a small concession leads to the onset of consensus, and that increasing the size of the concession leads (other factors being equal) to a sharp decline in the time to consensus. It is shown that a concession by a larger coalition to smaller coalitions takes on average more time to reach consensus. The result has practical value for all organizational structures in which the emergence of coalitions makes consensus decision-making impossible and requires consideration of various methods for reaching a consensus decision.
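A minimal sketch of consensus dynamics driven by a regular (primitive) stochastic matrix, in the spirit of the Markov-chain modeling described above, is the classical DeGroot iteration; the weight matrix below is illustrative, not the paper's model:

```python
import numpy as np

def degroot_consensus(W, x0, tol=1e-10, max_iter=10_000):
    """Opinion pooling x <- W @ x for a regular row-stochastic matrix W.
    For such W the iteration contracts the spread of opinions, so all
    components converge to a common consensus value.
    Returns (consensus value, steps until the spread drops below tol)."""
    x = np.asarray(x0, dtype=float)
    for t in range(1, max_iter + 1):
        x = W @ x
        if x.max() - x.min() < tol:
            return float(x.mean()), t
    return float(x.mean()), max_iter
```

The consensus value equals the stationary distribution of W applied to the initial opinions; a concession corresponds to a coalition shifting its weights toward the other side, which shortens the time to consensus.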
-
Estimation of a probabilistic model of the employee labor process
Computer Research and Modeling, 2012, v. 4, no. 4, pp. 969-975
The article presents a mathematical estimation model for the employee labor process, built on the basis of a Bayesian network. Particular attention is given to estimating the qualitative characteristics of the labor product. The described model is intended for use in companies with an employee workflow management system.
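A Bayesian network reduces such estimation to conditional-probability inference. The abstract gives no network structure, so the following two-node sketch (hidden skill level, observed product quality, made-up probabilities) only illustrates the mechanism:

```python
def posterior(prior, likelihood, observed):
    """Bayes' rule in a two-node network Cause -> Effect.
    prior:      {cause: P(cause)}
    likelihood: {(cause, effect): P(effect | cause)}
    Returns the posterior {cause: P(cause | observed effect)}.
    All probability values used with it here are illustrative."""
    joint = {c: prior[c] * likelihood[(c, observed)] for c in prior}
    z = sum(joint.values())  # marginal probability of the evidence
    return {c: v / z for c, v in joint.items()}
```

Observing the quality of the labor product updates the belief about the unobserved characteristics of the employee, which is the estimation step the model automates.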
-
3D molecular dynamics simulation of the thermodynamic equilibrium problem for heated nickel
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 573-579
This work is devoted to molecular dynamics modeling of thermal impact on a metal sample consisting of nickel atoms. To solve this problem, a continuous mathematical model based on the equations of classical Newtonian mechanics was used; a numerical method based on the Verlet scheme was chosen; a parallel algorithm was proposed and implemented with the MPI and OpenMP technologies. Using the developed parallel program, the thermodynamic equilibrium of the system of nickel atoms was investigated under heating of the sample to a desired temperature. In the numerical experiments both the optimal parameters of the calculation procedure and the physical parameters of the analyzed process were determined. The obtained numerical results correspond well to known theoretical and experimental data.
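The Verlet family of schemes mentioned in the abstract can be illustrated by the velocity Verlet step on a 1D toy force (not the nickel interatomic potential used by the authors):

```python
import math

def velocity_verlet(x, v, force, mass, dt, n_steps):
    """Velocity Verlet integration of Newton's equation m * a = F(x)
    (scalar 1D version; the authors solve the 3D many-body problem)."""
    a = force(x) / mass
    for _ in range(n_steps):
        x = x + v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / mass              # force at the new position
        v = v + 0.5 * (a + a_new) * dt       # velocity update
        a = a_new
    return x, v
```

For a harmonic force F = -kx the scheme tracks the exact oscillation with second-order accuracy and near-exact energy conservation, which is why Verlet-type schemes are the standard choice in molecular dynamics.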
-
An interactive graphical toolkit for global computer simulations in marine service operational forecasts
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 641-648
The efficiency and completeness of numerical simulation in oceanography and hydrometeorology are largely determined by the algorithmic features of constructing interactive computer simulations on the scale of the oceans, with adaptive coverage of closed seas and coastal waters by refined mathematical models and with the possibility of software-controlled parallelization of calculations near specific protected areas of the sea coast. An important component of the research is continuous graphical visualization in the course of the calculations, including visualization performed in parallel processes with shared memory or via checkpoints on external media. The results of the computational experiments are used to describe hydrodynamic processes near the coast, which is important for organizing marine control services and forecasting marine hazards.
-
Modeling the behavior of an option. Formulation of the problem
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 759-766
Object of research: the creation of an algorithm for mass computation of option prices for the formation of a riskless portfolio. The method is based on a generalization of the Black–Scholes method; the task is to model the behavior of all options and the instruments for their insurance. This task is characterized by a large volume of complex real-time computations that should be executed concurrently. The problem of the research: depending on the conditions, different approaches to the solution are needed. Three methods can be used under different conditions: the finite-difference method, the path-integral approach, and methods that work under conditions of a trading halt. Distributed computing is organized differently in these three cases, and various approaches must be involved. In addition to this complexity, the mathematical formulation of the problem in the literature is not quite correct: there is no complete description of the boundary and initial conditions, and several hypotheses of the model do not correspond to the real market. It is necessary to give a mathematically correct formulation of the task and to bridge the gap between the hypotheses of the model and their prototypes in the market. For this purpose the standard formulation must be extended by additional methods, and methods of realization must be developed for each solution branch.
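Of the three solution branches named (finite differences, path integrals, trading-halt methods), the first admits a compact textbook sketch: an explicit finite-difference scheme for the Black–Scholes PDE with terminal and boundary conditions spelled out. The grid parameters are illustrative, not the authors' formulation:

```python
import math

def bs_call_explicit_fd(S0, K, r, sigma, T, n_S=100, n_t=2000):
    """Explicit finite-difference solver for the Black-Scholes PDE
    (European call), stepping backward from the terminal payoff.
    The explicit scheme is only conditionally stable, so dt must
    satisfy roughly dt * sigma**2 * n_S**2 < 1."""
    S_max = 3.0 * max(S0, K)
    dS = S_max / n_S
    dt = T / n_t
    V = [max(i * dS - K, 0.0) for i in range(n_S + 1)]  # terminal condition
    for n in range(n_t):
        tau = (n + 1) * dt                  # time to maturity after this step
        new = V[:]
        for i in range(1, n_S):
            S = i * dS
            delta = (V[i + 1] - V[i - 1]) / (2 * dS)           # dV/dS
            gamma = (V[i + 1] - 2 * V[i] + V[i - 1]) / dS**2   # d2V/dS2
            new[i] = V[i] + dt * (0.5 * sigma**2 * S**2 * gamma
                                  + r * S * delta - r * V[i])
        new[0] = 0.0                                  # worthless at S = 0
        new[n_S] = S_max - K * math.exp(-r * tau)     # deep in-the-money asymptote
        V = new
    i = int(S0 / dS)                                  # linear interpolation at S0
    w = (S0 - i * dS) / dS
    return (1 - w) * V[i] + w * V[i + 1]
```

The explicit boundary conditions here (zero value at S = 0, the forward-minus-strike asymptote at large S) are exactly the kind of detail the abstract notes is often left out of published formulations.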
-
Algorithmic construction of explicit numerical schemes and visualization of objects and processes in computational experiments in fluid mechanics
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 767-774
The paper discusses the design and verification stages in the development of complex numerical algorithms for direct computational experiments in fluid mechanics. In modeling physical fields and nonstationary processes of continuum mechanics, it is desirable to rely on strict rules for constructing numerical objects and the related computational algorithms. The synthesis of adaptive numerical objects and effective arithmetic-logic operations can serve to optimize whole computing tasks, provided the original laws of fluid mechanics are strictly followed. The possibility of using ternary logic makes it possible to resolve some contradictions between functional and declarative programming in the implementation of purely applied problems of mechanics. Such design decisions lead to new numerical schemes of tensor mathematics that help optimize effectiveness and validate the correctness of simulation results. The most important consequence is the possibility of using interactive graphical techniques both for visualizing intermediate modeling results and for controlled influence on the course of the computing experiment under the supervision of aerohydrodynamics researchers and engineers.
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index
International Interdisciplinary Conference "Mathematics. Computing. Education"