All issues
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
Computational design of closed-chain linkages: synthesis of ergonomic spine support module of exosuit
Computer Research and Modeling, 2022, v. 14, no. 6, pp. 1269-1280
The article focuses on the problem of co-designing mechanisms for robotic systems that perform adaptive physical interaction with an unstructured environment, including physical human-robot interaction. Co-design means the simultaneous optimization of the mechanics and the control system, ensuring optimal behavior and performance. Optimizing the mechanics refers to the search for the optimal structure, geometric parameters, mass distribution among the links, and their compliance; optimizing the control refers to the search for motion trajectories of the mechanism's joints. The paper presents a generalized method for the structural-parametric synthesis of underactuated mechanisms with closed kinematics for robotic systems of various purposes; for example, it was previously used to co-design finger mechanisms for an anthropomorphic gripper and leg mechanisms for galloping robots. The method implements the concept of morphological computation of control laws through the features of the mechanical design, minimizing the effort required from the algorithmic part of the control system, which lowers hardware requirements and energy consumption. In this paper, the proposed method is used to optimize the structure and geometric parameters of the passive back-support module of an industrial exosuit. Human movements are diverse and non-deterministic compared with the movements of autonomous robots, which complicates the design of wearable robotic devices. To reduce injuries and fatigue and to increase worker productivity, the synthesized industrial exosuit must not only compensate for loads but also avoid interfering with natural human motions. To test the developed exosuit, kinematic datasets from whole-body motion capture during industrial operations were used.
The proposed method of structural-parametric synthesis was used to improve the ergonomics of a wearable robotic device. The synthesized mechanism was verified in simulation: the passive back module is attached to two geometric primitives that move the chest and pelvis of the exosuit operator in accordance with the motion capture data. The ergonomics of the back module is quantified by the distance between the joints connecting the upper and lower parts of the exosuit; minimizing the deviation from its average value corresponds to less restriction of the operator's movement, i.e., better ergonomics. The article provides a detailed description of the method of structural-parametric synthesis, an example of synthesizing an exosuit module, and the simulation results.
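The ergonomics criterion described above (minimizing the deviation of the inter-joint distance from its average over a motion-capture trajectory) can be sketched as follows; the function name, array layout, and toy trajectory are illustrative assumptions, not the paper's code:

```python
import numpy as np

def ergonomics_cost(chest_joints, pelvis_joints):
    """RMS deviation from the mean of the distance between the joints
    connecting the upper (chest) and lower (pelvis) parts of the exosuit,
    sampled over a motion-capture trajectory.

    chest_joints, pelvis_joints: arrays of shape (T, 3) with 3-D joint
    positions at T motion-capture frames. Smaller values mean the module
    restricts the operator's movement less.
    """
    d = np.linalg.norm(chest_joints - pelvis_joints, axis=1)  # (T,)
    return float(np.sqrt(np.mean((d - d.mean()) ** 2)))

# Toy trajectory: a rigid 40 cm offset keeps the distance constant,
# so the deviation-based cost is ~0 (the ideal, unrestricting case).
t = np.linspace(0.0, 1.0, 100)
chest = np.stack([t, np.zeros_like(t), 0.4 + np.zeros_like(t)], axis=1)
pelvis = np.stack([t, np.zeros_like(t), np.zeros_like(t)], axis=1)
print(ergonomics_cost(chest, pelvis))
```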
-
Variational principle for shape memory solids under variable external forces and temperatures
Computer Research and Modeling, 2021, v. 13, no. 3, pp. 541-555
The quasistatic deformation problem for shape memory alloys is reviewed within the phenomenological mechanics of solids, without microphysical analysis. The phenomenological approach is based on comparing two deformation diagrams of the material. The first diagram corresponds to active proportional loading, when the alloy behaves as an ideal elastoplastic material; residual strain is observed after unloading. The second diagram describes the case when the deformed sample is heated to a temperature specific to each alloy. The initial shape is then restored: the reverse distortion matches the deformations of the first diagram, except for the sign. Because the first stage of deformation can be described by a variational principle, for which the existence of generalized solutions is proved under arbitrary loading, it becomes clear how to explain the reverse distortion within a slightly modified theory of plasticity. The simply connected loading surface needs to be replaced with a doubly connected one, and the variational principle needs to be supplemented with the two laws of thermodynamics and the principle of orthogonality for thermodynamic forces and fluxes. In this case it is not difficult to prove the existence of solutions either. The successful application of the theory of plasticity at constant temperature motivates obtaining a similar result for the more general case of variable external forces and temperatures. The paper studies the ideal elastoplastic von Mises model for linear strain rates. Taking hardening and an arbitrary loading surface into account causes no additional difficulties.
An extended variational principle of the Reissner type is defined. Together with the laws of thermoplasticity, it makes it possible to prove the existence of generalized solutions for three-dimensional bodies made of shape memory materials. The main issue is the choice of a functional space for the velocities and deformations of the continuum points. The space of bounded deformation, the main instrument of the mathematical theory of plasticity, serves this purpose in the paper. The proof shows that the choice of functional spaces used in the paper is not the only one possible. The study of other possible problem settings for the extended variational principle and the search for regularity of generalized solutions are interesting challenges for future research.
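The ideal elastoplastic von Mises setting referred to above can be written in its standard textbook form (the paper's exact notation may differ): the stress deviator is confined to a ball, and the plastic strain rate is orthogonal to the yield surface.

```latex
% Yield condition on the stress deviator s:
\[
  |s| \le \sqrt{2/3}\,\sigma_Y,
  \qquad
  s_{ij} = \sigma_{ij} - \tfrac{1}{3}\,\sigma_{kk}\,\delta_{ij},
\]
% additive split of the linear strain rates into elastic and plastic
% parts, with plastic flow orthogonal to the yield surface:
\[
  \dot\varepsilon_{ij} = \dot\varepsilon^{\,e}_{ij} + \dot\varepsilon^{\,p}_{ij},
  \qquad
  \dot\varepsilon^{\,p}_{ij} = \dot\lambda\, s_{ij},
  \quad \dot\lambda \ge 0 .
\]
```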
-
Searching for connections between biological and physico-chemical characteristics of Rybinsk reservoir ecosystem. Part 1. Criteria of connection nonrandomness
Computer Research and Modeling, 2013, v. 5, no. 1, pp. 83-105
Based on the contents of phytoplankton pigments, fluorescence samples, and some physico-chemical characteristics of the Rybinsk reservoir waters, a search for connections between biological and physico-chemical characteristics is carried out. Standard methods of statistical analysis (correlation, regression) and methods of describing the connection between qualitative classes of characteristics, based on the deviation of the observed distribution of the characteristics from an independent one, are studied. A method for finding the boundaries of quality classes by the criterion of the maximum connection coefficient is proposed.
-
Machine learning interpretation of inter-well radiowave survey data
Computer Research and Modeling, 2019, v. 11, no. 4, pp. 675-684
Traditional geological prospecting methods are becoming ineffective: the exploration depth of kimberlite bodies and ore deposits has increased significantly. The only direct exploration method is to drill a system of wells to depths that provide access to the enclosing rocks. Because of the high cost of drilling, the role of inter-well survey methods has grown: they allow increasing the mean well spacing without significantly increasing the probability of missing a kimberlite or ore body. The method of inter-well radio wave survey is effective in searching for high-contrast conductivity objects. The physics of the method is based on the dependence of electromagnetic wave propagation on the conductivity of the propagation medium. The source and the receiver of electromagnetic radiation are electric dipoles placed in adjacent wells. Since the distance between the source and the receiver is known, the absorption coefficient of the medium can be estimated from the rate of decrease of the radio wave amplitude. Rocks with low electrical resistance correspond to high absorption of radio waves. The inter-well measurement data thus allow estimating an effective electrical resistance (or conductivity) of the rock. Typically, the source and the receiver are immersed in adjacent wells synchronously. The electric field amplitude measured at the receiver gives an estimate of the average attenuation coefficient along the line connecting the source and the receiver. The measurements are taken during stops, approximately every 5 m. The distance between stops is much less than the distance between adjacent wells, which leads to significant spatial anisotropy in the distribution of the measured data. The drill grid covers a large area, and our goal is to build a three-dimensional model of the distribution of the electrical properties of the inter-well space over the whole area.
The anisotropy of the spatial distribution makes it difficult to use standard geostatistical approaches. To build a three-dimensional model of the attenuation coefficient, we used a method from machine learning, the method of $k$ nearest neighbors. In this method, the value of the absorption coefficient at a given point is calculated from the $k$ nearest measurements; the number $k$ has to be determined from additional considerations. The effect of the spatial anisotropy can be reduced by changing the spatial scale in the horizontal direction; the scale factor $\lambda$ is one more external parameter of the problem. To select the values of the parameters $k$ and $\lambda$, we used the coefficient of determination. To demonstrate the construction of a three-dimensional image of the absorption coefficient, we apply the procedure to inter-well radio wave survey data obtained at one of the sites in Yakutia.
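A minimal sketch of a nearest-neighbors estimate with a horizontal scale factor, on synthetic data (function and variable names are illustrative assumptions, not the authors' code):

```python
import numpy as np

def knn_attenuation(points, values, query, k=5, lam=1.0):
    """k-nearest-neighbors estimate of the attenuation coefficient.

    points: (N, 3) measurement coordinates (x, y, z); values: (N,).
    lam rescales the horizontal axes to compensate for the anisotropy
    of the survey grid (dense along wells, sparse between them).
    """
    scale = np.array([lam, lam, 1.0])
    d = np.linalg.norm((points - query) * scale, axis=1)
    idx = np.argsort(d)[:k]          # indices of the k nearest measurements
    return float(values[idx].mean())

# Synthetic survey: 200 measurement points in a 100 m cube, with a toy
# attenuation field that grows linearly with depth.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 100.0, size=(200, 3))
vals = 0.01 * pts[:, 2]
print(knn_attenuation(pts, vals, np.array([50.0, 50.0, 50.0]), k=7, lam=0.2))
```

In practice $k$ and $\lambda$ would be chosen by maximizing the coefficient of determination on held-out measurements, as the abstract describes.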
-
Synchronous components of financial time series
Computer Research and Modeling, 2017, v. 9, no. 4, pp. 639-655
The article proposes a method for the joint analysis of multidimensional financial time series based on evaluating a set of properties of stock quotes in a sliding time window and then averaging the property values over all analyzed companies. The main purpose of the analysis is to construct measures of the joint behavior of time series that react to the occurrence of a synchronous or coherent component. Coherence in the behavior of the characteristics of a complex system is an important feature that makes it possible to assess how close the system is to a sharp change of state. The search for precursors of sharp changes rests on the general idea that random fluctuations of system parameters become increasingly correlated as the system approaches a critical state. The increments of stock value time series are markedly chaotic and have a large amplitude of individual noise, against which a weak common signal can be detected only through its correlation across the different scalar components of the multidimensional time series. It is known that classical methods of analysis based on correlations between neighboring samples are ineffective for financial time series, since from the point of view of the correlation theory of random processes the increments of stock values formally have all the attributes of white noise (in particular, a "flat" spectrum and a "delta-shaped" autocorrelation function). It is therefore proposed to pass from analyzing the initial signals to examining sequences of their nonlinear properties computed on short time fragments. As such properties, the entropy of the wavelet coefficients in the decomposition over the Daubechies basis, multifractal parameters, and an autoregressive measure of signal nonstationarity are used.
Measures of synchronous behavior of the time series properties in a sliding time window are constructed using the principal component method, the moduli of all pairwise correlation coefficients, and a multiple spectral coherence measure that generalizes the quadratic coherence spectrum between two signals. The shares of 16 large Russian companies from the beginning of 2010 to the end of 2016 were studied. Using the proposed method, two synchronization intervals of the Russian stock market were identified: from mid-December 2013 to mid-March 2014 and from mid-October 2014 to mid-January 2016.
Keywords: financial time series, wavelets, entropy, multifractals, predictability, synchronization.
-
Comparative analysis of human adaptation to the growth of visual information in the tasks of recognizing formal symbols and meaningful images
Computer Research and Modeling, 2021, v. 13, no. 3, pp. 571-586
We describe an engineering-psychological experiment that continues the study of how a person adapts to the increasing complexity of logical problems, by presenting a series of problems of increasing complexity determined by the volume of the initial data. The tasks require calculations in an associative or a non-associative system of operations. From the way the solution time changes with the number of required operations, we can conclude whether the person solves the problems in a purely sequential manner or engages additional brain resources to work in parallel. In a previously published experimental work, a person solving an associative problem recognized color images of meaningful objects. In the new study, a similar problem is solved for abstract monochrome geometric shapes. Analysis of the results showed that in the second case the probability of a subject switching to parallel processing of visual information is significantly reduced. The research method is based on presenting a person with two types of tasks. One type involves associative calculations and admits a parallel solution algorithm. The other is a control type, in which the calculations are not associative and parallel algorithms are ineffective. The task of recognizing and searching for a given object is associative: a parallel strategy significantly speeds up the solution with relatively small additional resources. As the control series of problems (to separate parallel work from acceleration of a sequential algorithm), we use, as in the previous experiment, a non-associative comparison problem in cyclic arithmetic, presented in the visual form of the game "rock, paper, scissors". In this problem, a parallel algorithm requires a large number of processors with a small efficiency coefficient.
Therefore, the transition of a person to a parallel algorithm for this problem is almost impossible, and faster processing of the input information is possible only by increasing the sequential speed. Comparing the dependence of the solution time on the volume of source data for the two types of problems allows us to identify four types of strategy for adapting to increasing problem complexity: uniform sequential, accelerated sequential, parallel computing (where possible), or a strategy undefined for this method. The reduction in the number of subjects who switch to a parallel strategy when the input information is encoded with formal images demonstrates the effectiveness of codes that evoke associations in the subject: they increase the speed of human perception and processing of information. The article contains a preliminary mathematical model that explains this phenomenon. It is based on the appearance of a second set of initial data, which arises in a person as a result of recognizing the depicted objects.
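The distinction the experiment exploits, that associative operations admit tree-parallel evaluation while non-associative ones force sequential evaluation, can be illustrated with a short sketch (the encodings below are illustrative, not the experimental stimuli):

```python
from functools import reduce

# Associative: "is the target among the stimuli?" combines with logical
# OR, which gives the same answer under any bracketing, so the work can
# be split across parallel "processors" and merged in a tree.
def found_either(a, b):
    return a or b

stimuli = [False, False, True, False]
assert reduce(found_either, stimuli) == any(stimuli)

# Non-associative: pairwise comparison in the cyclic "rock, paper,
# scissors" arithmetic. winner(x, y) returns the winner of one round.
BEATS = {("rock", "scissors"), ("scissors", "paper"), ("paper", "rock")}
def winner(x, y):
    return x if (x, y) in BEATS or x == y else y

# Different bracketings of the same chain give different results, so a
# tree-parallel reduction is invalid and the chain must be evaluated
# sequentially, left to right.
left = winner(winner("rock", "paper"), "scissors")   # paper, then scissors
right = winner("rock", winner("paper", "scissors"))  # scissors, then rock
assert left != right
print(left, right)
```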
-
System to store DNA physical properties profiles with application to the promoters of Escherichia coli
Computer Research and Modeling, 2013, v. 5, no. 3, pp. 443-450
A database to store, search, and retrieve DNA physical property profiles has been developed, and its use for the analysis of E. coli promoters has been demonstrated. A unique feature of the database is its ability to handle a whole profile as a single internal object type, much as it handles integers or character strings. To demonstrate the utility of such a database, it was populated with data on 1227 known promoters: their nucleotide sequences, electrostatic potential profiles, and transcription factor binding sites. Each promoter is also linked to all genes whose transcription it controls. The content of the database is searchable via a web interface. The source code of the profile datatype and a library for working with it from R/Bioconductor are available on the internet; a dump of the database is available from the authors on request.
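What handling "a whole profile as a single object type" means can be sketched with a minimal in-memory analogue (illustrative Python; the authors' implementation is a DBMS-level datatype, not this code):

```python
import numpy as np

class Profile:
    """A whole per-base physical-property profile treated as one value,
    the way integers or strings are treated by a database."""

    def __init__(self, values):
        self.values = np.asarray(values, dtype=float)

    def window(self, start, stop):
        """Sub-profile over a coordinate range, itself a Profile."""
        return Profile(self.values[start:stop])

    def correlate(self, other):
        """Similarity between two equal-length profiles."""
        return float(np.corrcoef(self.values, other.values)[0, 1])

# Toy electrostatic-potential profile of a promoter and a query motif
# cut from it; correlating a window with itself gives ~1.0.
promoter = Profile(np.sin(np.linspace(0.0, 6.0, 60)))
query = promoter.window(10, 30)
print(promoter.window(10, 30).correlate(query))
```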
-
Investigation of individual-based mechanisms of single-species population dynamics by logical deterministic cellular automata
Computer Research and Modeling, 2015, v. 7, no. 6, pp. 1279-1293
Investigating logical deterministic cellular automata models of population dynamics makes it possible to reveal detailed individual-based mechanisms. The search for such mechanisms is important in connection with ecological problems caused by overexploitation of natural resources, environmental pollution, and climate change. Classical models of population dynamics are phenomenological in nature: they are "black boxes", which fundamentally complicates the study of detailed mechanisms of ecosystem functioning. We have investigated the role of fecundity and of the duration of resource regeneration in the mechanisms of population growth using four models of a single-species ecosystem. These models are logical deterministic cellular automata based on the physical axiomatics of an excitable medium with regeneration. We have modeled the catastrophic extinction of a population caused by an increase in the duration of resource regeneration, and it has been shown that greater fecundity accelerates population extinction. The investigated mechanisms are important for understanding the sustainability of ecosystems and the conservation of biodiversity. Prospects of the presented approach as a method of transparent multilevel modeling of complex systems are discussed.
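A toy automaton in the spirit of the models described, i.e. an excitable medium with resource regeneration, can be sketched as follows (the states and rules here are illustrative assumptions, not the paper's exact axiomatics):

```python
import numpy as np

# Cell states:
#    1 .. site occupied by an individual
#    0 .. free site with regenerated resource
#   -r .. site whose resource regenerates for r more steps
def step(grid, regen_time):
    new = grid.copy()
    occupied = grid == 1
    # An individual reproduces and its site starts regenerating.
    new[occupied] = -regen_time
    # Regenerating sites count down toward the free state 0.
    new[grid < 0] = grid[grid < 0] + 1
    # Free sites with an occupied 4-neighbour become occupied (offspring);
    # np.roll makes the lattice a torus.
    nbr = np.zeros_like(grid)
    for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
        nbr += np.roll(occupied.astype(int), shift, axis=axis)
    new[(grid == 0) & (nbr > 0)] = 1
    return new

grid = np.zeros((20, 20), dtype=int)
grid[10, 10] = 1                      # single founder individual
for _ in range(30):
    grid = step(grid, regen_time=3)
print(int((grid == 1).sum()))         # population size after 30 steps
```

Varying `regen_time` in such a model is the lever the abstract refers to: long enough regeneration starves the expanding wave of occupied sites and the population collapses.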
-
Proof of the connection between the Beckmann model with degenerate cost functions and the model of stable dynamics
Computer Research and Modeling, 2022, v. 14, no. 2, pp. 335-342
Since the 1950s, the field of city transport modelling has progressed rapidly, and the first equilibrium models of traffic flow distribution appeared. The most popular (and still widely used) model is the Beckmann model, based on the two Wardrop principles. The core of the model can be briefly described as the search for a Nash equilibrium in a population game of demand, in which the losses of agents (drivers) are calculated from the chosen path and its cost, with the correspondence matrix fixed. The cost of a path is calculated as the sum of the costs of the path segments (graph edges) it contains. The cost of an edge (the edge travel time) is determined by the amount of traffic on that edge: more traffic means a longer travel time. The flow on a graph edge is the sum of the flows over all paths passing through it. Thus, the cost of traveling along a path is determined not only by the choice of the path but also by the paths other drivers have chosen, which makes this a standard game-theoretic setting. The way the cost functions are constructed allows the search for equilibrium to be narrowed to solving an optimization problem (the game is potential in this case). If the cost functions are monotone and non-decreasing, the optimization problem is convex. Different assumptions about the cost functions form different models. The most popular model is based on the BPR cost function; such functions are widely used in computations for real cities. However, at the beginning of the XXI century, Yu. E. Nesterov and A. de Palma showed that Beckmann-type models have serious weak points, which could be fixed using what the authors called the stable dynamics model. The search for equilibrium there can also be reduced to an optimization problem, in fact a linear programming problem.
In 2013, A. V. Gasnikov discovered that the stable dynamics model can be obtained by a passage to the limit in the Beckmann model. However, this was done only for several practically important but still special cases; in general, the question of whether this passage to the limit is possible remained open. In this paper, we justify the possibility of the above-mentioned passage to the limit in the general case, when the cost function for traveling along an edge, as a function of the flow along the edge, degenerates into a function equal to a fixed cost until the capacity is reached and to plus infinity when the capacity is exceeded.
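The two edge-cost models being connected can be sketched numerically: the standard BPR function and the degenerate limit of the stable dynamics model. As the BPR exponent grows, the cost tends to the fixed free-flow cost below capacity and blows up above it (parameter names and constants here are the usual textbook conventions, not taken from the paper):

```python
def bpr_cost(f, t0, c, mu=0.25, p=4):
    """BPR edge travel time: free-flow time t0 inflated as the
    edge flow f approaches the edge capacity c."""
    return t0 * (1.0 + mu * (f / c) ** p)

def stable_dynamics_cost(f, t0, c):
    """Degenerate limit used in the stable dynamics model: fixed cost
    below capacity, plus infinity above it."""
    return t0 if f <= c else float("inf")

# Below capacity (f = 800 < c = 1000) the BPR cost tends to t0 as p grows;
# above capacity (f = 1200) it grows without bound.
for p in (4, 16, 64):
    print(p,
          bpr_cost(800.0, t0=10.0, c=1000.0, p=p),
          bpr_cost(1200.0, t0=10.0, c=1000.0, p=p))
```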
-
Searching for connections between biological and physico-chemical characteristics of Rybinsk reservoir ecosystem. Part 3. Calculation of the boundaries of water quality classes
Computer Research and Modeling, 2013, v. 5, no. 3, pp. 451-471
The calculation of the boundaries of water quality classes for the purposes of ecological diagnostics and standardization is tested on data from the Rybinsk reservoir. Phytoplankton fluorescence indicators and phytoplankton pigment contents are used for bioindication. Chesnokov's importance coefficient proved to be the preferred measure of connection for analyzing the effects of environmental factors on the indicators. The factors important for the environmental condition are identified. The boundaries between the "valid" and "invalid" classes of factor values are compared with the boundaries of the water quality classifications.
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index