Modeling of the macromolecular composition dynamics of microalgae batch culture
Computer Research and Modeling, 2023, v. 15, no. 3, pp. 739-756
The work focuses on mathematical modeling of the mechanisms by which light influences the macromolecular composition of a microalgae batch culture. It is shown that even with a single limiting factor, the growth of microalgae is accompanied by a significant change in the biochemical composition of the biomass in every part of the batch curve. The well-known qualitative models of microalgae are based on concepts of enzyme kinetics and do not take into account a possible change of the limiting factor during batch culture growth; such models cannot describe the dynamics of the relative content of the biochemical components of cells. We propose an alternative approach based on the generally accepted two-stage scheme of photoautotrophic growth of microalgae. Microalgae biomass can be considered as the sum of two macromolecular components: structural and reserve. At the first stage, a reserve part of the biomass is formed during photosynthesis; at the second stage, cell structures are synthesized from these reserves. The model also assumes proportionality of all structural components of the biomass, which greatly simplifies the mathematical calculations and the fitting of experimental data. The proposed mathematical model is a system of two differential equations describing the synthesis of reserve biomass compounds at the expense of light and the biosynthesis of structural components from reserve ones. The model takes into account that part of the reserve compounds is spent on replenishing the pool of macroergs. The rates of synthesis of the structural and reserve forms of biomass are given by linear splines. This approach allows us to describe mathematically the change of the limiting factor as the biomass of the batch culture of microalgae increases.
It is shown that under light-limited conditions the batch curve must be divided into several regions: unlimited growth, low cell concentration, and optically dense culture. Analytical solutions of the basic system of equations describing the dynamics of the macromolecular biomass composition made it possible to determine species-specific coefficients for various light conditions. The model was verified against experimental data on biomass growth and the dynamics of chlorophyll $a$ content in a batch culture of the red marine microalga Porphyridium purpureum.
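The two-pool structure described above can be sketched numerically. This is a minimal illustration, not the authors' fitted model: all coefficients and spline knots below are hypothetical, and only the general scheme (reserve pool R formed by light-driven photosynthesis, structural pool S built from reserves, rate laws given by linear splines, part of the reserves spent on macroergs) follows the abstract.

```python
# Illustrative sketch of a two-pool batch-culture model: biomass B = S + R,
# with structural (S) and reserve (R) components. All numbers are hypothetical.

def linear_spline(x, knots):
    """Piecewise-linear interpolation through sorted (x, y) knots."""
    x0, y0 = knots[0]
    if x <= x0:
        return y0
    for x1, y1 in knots[1:]:
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        x0, y0 = x1, y1
    return y0          # constant extrapolation beyond the last knot

def simulate(S=0.1, R=0.02, dt=0.01, t_end=10.0):
    # Reserve synthesis slows as the culture becomes optically dense;
    # structural synthesis rises with the reserve quota q = R / S.
    photo = lambda B: linear_spline(B, [(0.0, 1.0), (1.0, 1.0), (3.0, 0.2)])
    build = lambda q: linear_spline(q, [(0.0, 0.0), (0.5, 2.0)])
    yield_coef, maintenance = 0.6, 0.05    # part of R replenishes macroergs
    t = 0.0
    while t < t_end:                       # explicit Euler integration
        q = R / S
        dR = photo(S + R) * S - build(q) * S - maintenance * R
        dS = yield_coef * build(q) * S
        R, S, t = R + dR * dt, S + dS * dt, t + dt
    return S, R

S, R = simulate()
```

The spline-based rates let the limiting factor change as the culture grows: once total biomass crosses the optical-density knot, reserve synthesis, rather than structure building, becomes limiting.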
-
Modeling the dynamics of plankton community considering the trophic characteristics of zooplankton
Computer Research and Modeling, 2024, v. 16, no. 2, pp. 525-554
We propose a four-component model of a plankton community with discrete time. The model takes into account the competitive relationships between phytoplankton groups and the trophic characteristics of zooplankton: zooplankton is divided into predatory and non-predatory components, and the consumption of non-predatory zooplankton by predatory zooplankton is represented explicitly. Non-predatory zooplankton feeds on phytoplankton, which consists of two competing components: a toxic type and a non-toxic type, the latter being suitable as food for zooplankton. The interaction of the two phytoplankton groups is described by a model of two coupled Ricker equations, which focuses on the dynamics of a competitive community and implicitly accounts for the limitation of each competitor's biomass growth by the availability of external resources. Prey consumption by predators is described by a Holling type II trophic function, which takes predator saturation into account.
The analysis of scenarios for the transition from stationary dynamics to population oscillations showed that the community loses stability of the non-trivial equilibrium corresponding to the coexistence of the complete community both through a cascade of period-doubling bifurcations and through a Neimark – Sacker bifurcation leading to the emergence of quasi-periodic oscillations. Although quite simple, the proposed model demonstrates community dynamics similar to those observed in natural systems and experiments: predator oscillations lagging the prey by about a quarter of a period, long-period antiphase cycles of predator and prey, and hidden cycles in which the prey density remains almost constant while the predator density fluctuates, demonstrating the influence of fast evolution that masks the trophic interaction. At the same time, variation of the intra-population parameters of phytoplankton or zooplankton can lead to pronounced changes in the dynamic mode of the community: sharp transitions from regular to quasi-periodic dynamics and further to exact cycles with a small period, or even to stationary dynamics. Quasi-periodic dynamics can arise at rather small phytoplankton growth rates that otherwise correspond to stable or regular community dynamics. A change of dynamic mode in this region (a transition from stable dynamics to quasi-periodic dynamics and back) can occur through variation of the initial conditions or through an external influence that changes the current abundances of the components and shifts the system into the basin of attraction of another dynamic mode.
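The four-component structure can be sketched as a discrete-time map. This is a hedged illustration of the ingredients the abstract names (coupled Ricker competition, Holling type II consumption, predatory and non-predatory zooplankton), not the paper's equations: the way grazing and survival enter the map, and every parameter value, are assumptions chosen only to keep the iteration bounded.

```python
# Illustrative four-component plankton map: non-toxic phytoplankton x,
# toxic phytoplankton y, non-predatory zooplankton z, predatory zooplankton p.
# All parameter values are hypothetical.
import math

def holling2(prey, a=1.2, h=0.5):
    """Holling type II functional response: saturating consumption rate."""
    return a * prey / (1.0 + a * h * prey)

def step(x, y, z, p, r1=2.0, r2=1.8, a12=0.6, a21=0.5,
         kz=0.5, kp=0.5, sz=0.5, sp=0.5):
    """One time step of the community; negative abundances are clipped to 0."""
    gx = holling2(x)              # per-capita grazing of z on non-toxic x
    gz = holling2(z)              # per-capita predation of p on z
    x1 = x * math.exp(r1 - x - a12 * y) - gx * z   # Ricker growth - grazing
    y1 = y * math.exp(r2 - y - a21 * x)            # toxic type is not grazed
    z1 = sz * z + kz * gx * z - gz * p             # survival + intake - predation
    p1 = sp * p + kp * gz * p                      # survival + intake
    return (max(x1, 0.0), max(y1, 0.0), max(z1, 0.0), max(p1, 0.0))

state = (0.5, 0.5, 0.2, 0.1)
for _ in range(1000):
    state = step(*state)
```

Varying the growth rates r1, r2 in such maps is the usual way to move between stationary, periodic, and quasi-periodic regimes of the kind discussed above.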
-
Multicriterial metric data analysis in human capital modelling
Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1223-1245
The article describes a model of a human in the information economy and demonstrates a multi-criteria optimization approach to the metric analysis of model-generated data. The traditional approach involves identifying the model from time series and then using it for prediction. However, this is not possible when some variables are not observed explicitly and only some typical bounds or population features are known, which is often the case in the social sciences and renders some models purely theoretical. To avoid this problem, we propose a method of metric data analysis (MMDA) for the identification and study of such models, based on the construction and analysis of Kolmogorov – Shannon metric nets of the general population in a multidimensional space of social characteristics. Using this method, the coefficients of the model are identified and the features of its phase trajectories are studied. We describe a human according to their role in information processing, considering their awareness and cognitive abilities. We construct two lifetime indices of human capital: a creative index (generalizing cognitive abilities) and a productive index (generalizing the amount of information a person has mastered), and formulate the problem of their multi-criteria (two-criteria) optimization taking life expectancy into account. This approach allows us to identify and economically justify new requirements for the education system and for the information environment of human existence. It is shown that a Pareto frontier exists in the optimization problem, and its type depends on mortality rates: at high life expectancy there is one dominant solution, while at lower life expectancy there are different types of Pareto frontier.
In particular, the Pareto principle applies to Russia: a significant increase in an individual's creative human capital (generalizing cognitive abilities) is possible at the cost of a small decrease in productive human capital (generalizing awareness). It is shown that an increase in life expectancy makes the competence-based approach (focused on developing cognitive abilities) optimal, while at low life expectancy the knowledge-based approach is preferable.
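The two-criteria structure above reduces to a standard Pareto-frontier computation. The sketch below is generic, not the article's MMDA procedure: the candidate points are hypothetical (creative, productive) index pairs standing in for different education policies.

```python
# Illustrative Pareto-frontier extraction for a two-criteria maximization
# problem; candidate index pairs are hypothetical.

def pareto_front(points):
    """Return the points not dominated by any other point (maximize both axes)."""
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (creative, productive) human-capital index pairs.
candidates = [(0.2, 0.9), (0.5, 0.8), (0.7, 0.6), (0.9, 0.3),
              (0.4, 0.4), (0.6, 0.7), (0.8, 0.2)]
front = pareto_front(candidates)
```

A single dominant solution, as the abstract reports for high life expectancy, corresponds to this frontier degenerating to one point; here several incomparable points remain, as in the low-life-expectancy case.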
-
Reinforcement learning in optimisation of financial market trading strategy parameters
Computer Research and Modeling, 2024, v. 16, no. 7, pp. 1793-1812
High-frequency algorithmic trading is a subclass of trading focused on gaining basis-point-scale profits on sub-second time frames. Such trading strategies do not depend on most of the factors relevant to longer-term trading and require a specific approach. There have been many attempts to apply machine learning techniques to both high- and low-frequency trading; however, they still have limited application in real-world trading due to high exposure to overfitting, the need for rapid adaptation to new market regimes, and overall instability of the results. We conducted comprehensive research on combining known quantitative theory with reinforcement learning methods in order to derive a more effective and robust approach to constructing an automated trading system, as a support for known algorithmic trading techniques. Using classical price-behavior theories as well as modern applications in sub-millisecond trading, we employed reinforcement learning models to improve the quality of the algorithms. As a result, we derived a robust model that uses deep reinforcement learning to optimize the parameters of static market-making trading algorithms and is capable of online learning on live data. More specifically, we explored the system in the cryptocurrency derivatives market, which in the short term is mostly independent of external factors. Our research was implemented in a high-frequency environment, and the final models showed the capability to operate within accepted high-frequency trading time frames. We compared various combinations of deep reinforcement learning approaches with the classic algorithms and evaluated the robustness and effectiveness of the improvements for each combination.
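The idea of optimizing a static market-making parameter online can be illustrated in miniature. This sketch is not the paper's deep-reinforcement-learning model: it replaces it with a simple epsilon-greedy bandit over a discrete set of quote spreads, and the fill/profit simulator is a toy stand-in for live market feedback.

```python
# Illustrative sketch: online tuning of a market-making half-spread with an
# epsilon-greedy bandit. The reward simulator and all numbers are hypothetical.
import random

random.seed(7)
spreads = [0.01, 0.02, 0.05, 0.10]        # candidate half-spreads (arms)
q = [0.0] * len(spreads)                  # running value estimate per arm
n = [0] * len(spreads)                    # pull counts per arm

def simulated_pnl(spread):
    """Toy feedback: wider spreads earn more per fill but fill less often."""
    fill_prob = max(0.0, 1.0 - 8.0 * spread)
    if random.random() < fill_prob:
        return spread - 0.005              # spread earned minus adverse selection
    return 0.0                             # no fill, no profit

eps = 0.1                                  # exploration probability
for t in range(5000):
    if random.random() < eps:
        arm = random.randrange(len(spreads))           # explore
    else:
        arm = max(range(len(spreads)), key=lambda i: q[i])  # exploit
    reward = simulated_pnl(spreads[arm])
    n[arm] += 1
    q[arm] += (reward - q[arm]) / n[arm]   # incremental mean update

best = spreads[max(range(len(spreads)), key=lambda i: q[i])]
```

A deep RL agent, as in the paper, would additionally condition the choice on market state rather than learning a single context-free value per parameter.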
-
Visualization of work of a distributed application based on the mqcloud library
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 529-532
Complex distributed computer systems whose independent components communicate with each other under complex control scale poorly within the framework of existing communication middleware. Two major scaling problems of such systems can be identified: overloading of unequal nodes due to proportional redistribution of workload, and difficulties in implementing continuous communication between several nodes of the system. This paper focuses on a developed solution that enables visualization of the work of such a dynamical system.
-
Decomposition of the modeling task of some objects of archeological research for processing in a distributed computer system
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 533-537
Although each task of recreating artifacts is truly unique, the modeling process for façades, foundations and building elements can be parametrized. This paper focuses on a set of existing programming libraries and solutions that need to be united into a single computer system to solve such a task. An algorithm for generating the 3D filling of objects under reconstruction is presented. The solution architecture necessary for adapting the system to a cloud environment is studied.
-
Defining volunteer computing: a formal approach
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 565-571
Volunteer computing resembles private desktop grids, whereas desktop grids are not fully equivalent to volunteer computing. There have been several attempts to distinguish and categorize them using informal and formal methods. However, most formal approaches model a particular middleware and do not focus on the general notion of volunteer or desktop grid computing. This work makes an attempt to formalize their characteristics and relationship. To this end, formal modeling is applied that tries to capture the semantics of their functionalities, as opposed to comparisons based on properties, features, etc. We apply this modeling method to formalize the Berkeley Open Infrastructure for Network Computing (BOINC) [Anderson D. P., 2004] volunteer computing system.
-
Memory benchmarking characterisation of ARM-based SoCs
Computer Research and Modeling, 2015, v. 7, no. 3, pp. 607-613
Computational intensity is traditionally the focus of large-scale computing system designs, generally leaving such designs ill-equipped to handle throughput-oriented workloads efficiently. In addition, cost and energy consumption for large-scale computing systems remain a source of concern. A potential solution involves using low-cost, low-power ARM processors in large arrays in a manner that provides massive parallelisation and high rates of data throughput (relative to existing large-scale computing designs). Giving greater priority to both throughput rate and cost increases the relevance of primary memory performance and design optimisations to overall system performance. Using several primary memory performance benchmarks to evaluate various aspects of RAM and cache performance, we characterise the performance of four different models of ARM-based system-on-chip, namely the Cortex-A9, Cortex-A7, Cortex-A15 r3p2 and Cortex-A15 r3p3. We then discuss the relevance of these results to high-volume computing and the potential of ARM processors.
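The flavour of a primary-memory benchmark can be sketched portably. This is not one of the paper's benchmarks (those target ARM SoCs natively); it is a hypothetical toy that measures sustained memory-copy bandwidth, the kind of figure such characterisations report.

```python
# Illustrative memory-copy bandwidth microbenchmark (hypothetical, portable).
import time

def copy_bandwidth(size_mb=64, repeats=5):
    """Return the best observed copy bandwidth in MB/s over several passes."""
    src = bytearray(size_mb * 1024 * 1024)
    best = 0.0
    for _ in range(repeats):
        t0 = time.perf_counter()
        dst = bytes(src)                   # one full read + write pass
        dt = time.perf_counter() - t0
        best = max(best, size_mb / dt)     # MB copied per second
        del dst
    return best
```

Repeating the pass and keeping the best time reduces the influence of scheduling noise; buffer sizes chosen above and below cache capacities would expose the cache-versus-RAM distinction the paper studies.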