All issues
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
-
Neural network model for determining the human intoxication functional state in some transport safety problems
Computer Research and Modeling, 2018, v. 10, no. 3, pp. 285-293
This article solves the problem of determining the intoxication functional state of vehicle drivers. Its solution is relevant in the transport security field during pre-trip medical examinations. The solution is based on the pupillometry method, which allows the driver's state to be evaluated by his pupillary reaction to a change in illumination. The problem is to determine the state of driver inebriation by analyzing the values of pupillogram parameters: a time series characterizing the change in pupil dimensions under exposure to a short light pulse. For the analysis of pupillograms it is proposed to use a neural network. A neural network model for determining the drivers' intoxication functional state is developed. For its training, specially prepared data samples are used: the values of the following pupillary reaction parameters, grouped into two classes of drivers' functional states: initial diameter, minimum diameter, half-constriction diameter, final diameter, constriction amplitude, constriction rate, dilation rate, latent reaction time, contraction time, dilation time, half-contraction time, and half-dilation time. An example of the initial data is given. Based on their analysis, a neural network model is constructed in the form of a perceptron consisting of twelve input neurons, twenty-five hidden-layer neurons, and one output neuron. To increase the model's adequacy, the optimal cut-off point between the solution classes at the output of the neural network is determined using ROC analysis.
A scheme for determining the driver's intoxication state is proposed, which includes the following steps: video registration of the pupillary reaction, pupillogram construction, calculation of parameter values, data analysis based on the neural network model, classification of the driver's condition as "norm" or "rejection of the norm", and making a decision on the person being examined. The medical worker conducting the driver's examination is presented with the neural network's assessment of his intoxication state. On the basis of this assessment, a conclusion on admitting the driver to drive the vehicle or removing him from driving is drawn. Thus, the neural network model increases the efficiency of pre-trip medical examination by increasing the reliability of the decisions made.
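The classification step of the scheme above can be sketched as follows. The architecture (twelve inputs, twenty-five hidden neurons, one output, an ROC-tuned cut-off) comes from the abstract; the weights and input vector are random placeholders, since the trained parameters and real pupillogram data are not given here:

```python
import numpy as np

# Placeholder weights for the 12 -> 25 -> 1 network described in the
# abstract; the paper trains them on labeled pupillograms
# ("norm" vs "rejection of the norm").
rng = np.random.default_rng(0)
W1 = rng.normal(size=(12, 25))
b1 = np.zeros(25)
W2 = rng.normal(size=(25, 1))
b2 = np.zeros(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classify(params, cutoff=0.5):
    """params: the 12 pupillogram parameters (initial diameter,
    constriction amplitude, latency, ...). cutoff: decision threshold;
    the paper tunes it via ROC analysis rather than using 0.5."""
    h = np.tanh(params @ W1 + b1)        # hidden layer
    score = sigmoid(h @ W2 + b2)[0]      # output score in (0, 1)
    return ("rejection of the norm" if score >= cutoff else "norm"), score

label, score = classify(rng.normal(size=12))
print(label, round(float(score), 3))
```

With a real training set, the cut-off would be chosen as the ROC operating point balancing false alarms against missed intoxication cases.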
-
Web-based interactive registry of geosensors
Computer Research and Modeling, 2016, v. 8, no. 4, pp. 621-632
Selecting and correctly applying a geosensor (an instrument of mineral geothermobarometry) is challenging because of the wide variety of existing geosensors on the one hand and the specific requirements for their use on the other. In this paper, organizing geosensors within a computer system called an interactive registry is proposed in order to reduce the labor intensity of geosensor usage and to provide information support for them. The article gives a formal description of the thermodynamic geosensor as a function of mineral compositions and independent parameters, as well as the basic steps of pressure and temperature estimation common to all geosensors: conversion to formula units, calculation of additional parameters, and calculation of the required values. Existing collections of geosensors, implemented as standalone applications or as spreadsheets, were examined for the advantages and disadvantages of these approaches. The additional information necessary to use a geosensor is described: paragenesis, accuracy and range of parameter values, references, and others. Implementation of the geosensor registry as a web-based application using wiki technology is proposed. Wiki technology makes it possible to effectively organize the not so well formalized additional information about a geosensor together with its algorithm, written in a programming language, into a single information system. Links, namespaces, and wiki markup are used for information organization. The article discusses the implementation of the application on top of the DokuWiki system with a specially designed RESTful server, allowing users to apply the geosensors from the registry to their own data. The R programming language is used as the geosensor description language, and an Rserve server is used for the calculations.
Unit tests for each geosensor allow the correctness of its implementation to be checked. The user interface of the application is developed as a DokuWiki plug-in. An example of usage is given. In conclusion, questions of application security, performance, and scaling are discussed.
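The three-step calculation scheme common to all geosensors (conversion to formula units, additional parameters, required values) can be sketched as below. The functional form and every numeric constant are illustrative placeholders, not the calibration of any real geosensor from the registry:

```python
import math

# Step 1: weight-percent oxide analysis -> cation formula units.
def formula_units(oxide_wt, molar_mass, cations_per_oxide,
                  oxygen_basis, total_oxygen_moles):
    """Cation formula units normalized to a fixed oxygen basis;
    total_oxygen_moles is the oxygen sum over the whole analysis."""
    return (oxide_wt / molar_mass) * cations_per_oxide * \
           (oxygen_basis / total_oxygen_moles)

# Step 2: an additional parameter, here an Fe-Mg distribution
# coefficient between coexisting minerals A and B.
def kd_fe_mg(fe_a, mg_a, fe_b, mg_b):
    return (fe_a / mg_a) / (fe_b / mg_b)

# Step 3: the required value, a temperature estimate (K).
def thermometer(kd, pressure_kbar, A=3500.0, B=15.0, C=1.5):
    """T = (A + B*P) / (C + ln Kd) mimics the typical form of an
    exchange thermometer; A, B, C are made-up constants."""
    return (A + B * pressure_kbar) / (C + math.log(kd))

kd = kd_fe_mg(fe_a=0.8, mg_a=1.2, fe_b=0.4, mg_b=1.6)
print(round(thermometer(kd, pressure_kbar=10.0), 1))
```

In the registry, each such chain would live behind the RESTful server as an R implementation, with the wiki page carrying the paragenesis, validity range, and reference metadata.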
-
Four-factor computational experiment for a random walk on a two-dimensional square field
Computer Research and Modeling, 2017, v. 9, no. 6, pp. 905-918
Nowadays random search has become a widespread and effective tool for solving various complex optimization and adaptation problems. In this work, the average duration of a random search for one object by another on a square field is considered, depending on various factors. The problem was solved by conducting a full factorial experiment with 4 factors and an orthogonal plan with 54 lines. Within each line, the initial conditions and the cellular automaton transition rules were simulated and the duration of the search for one object by another was measured. As a result, a regression model of the average duration of a random search for an object is constructed, depending on the four factors considered, which specify the initial positions of the two objects and the conditions of their movement and detection. The most significant of the considered factors determining the average search time are identified. The constructed model is interpreted in terms of the random-search problem. An important result of the work is that the qualitative and quantitative influence of the initial positions of the objects, the size of the lattice, and the transition rules on the average search duration is revealed by means of the obtained model. It is shown that initial proximity of the objects on the lattice does not guarantee a quick search if each of them moves. In addition, it is quantitatively estimated how many times the average search time can increase or decrease when the speed of the searching object increases by 1 unit, and also when the field size increases by 1 unit, for different initial positions of the two objects. The exponential growth of the number of search steps with increasing lattice size, with the other factors fixed, is revealed.
The conditions for the greatest increase in the average search duration are found: the maximum distance between the objects combined with the immobility of one of them. In this case, changing the field size by 1 unit (for example, from $4 \times 4$ to $5 \times 5$) can increase the average search duration by a factor of $e^{1.69} \approx 5.42$. The problem presented in the work may be relevant from an applied point of view both for ensuring state security and, for example, in queueing theory.
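One cell of such an experiment can be sketched as a simulation: two objects perform lazy random walks on an n-by-n lattice and we measure the steps until the searcher lands on the target's cell. The movement rule here (uniform choice among staying put and the four clipped neighbor moves) is a simplified stand-in for the transition rules varied in the paper's factorial design:

```python
import random

def step(pos, n):
    """One lazy random-walk step, clipped to the n-by-n field."""
    x, y = pos
    dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)])
    return (min(max(x + dx, 0), n - 1), min(max(y + dy, 0), n - 1))

def search_duration(n, searcher, target, target_moves=True, max_steps=100000):
    """Steps until the searcher occupies the target's cell."""
    steps = 0
    while searcher != target and steps < max_steps:
        searcher = step(searcher, n)
        if target_moves:
            target = step(target, n)
        steps += 1
    return steps

def mean_duration(n, trials=200, seed=1):
    """Average duration over repeated runs from opposite corners."""
    random.seed(seed)
    return sum(search_duration(n, (0, 0), (n - 1, n - 1))
               for _ in range(trials)) / trials

# Average duration grows quickly with field size, consistent with the
# exponential trend reported in the abstract.
print(mean_duration(4), mean_duration(6))
```

Replicating each factor combination many times and regressing the log of the mean duration on the factors would reproduce the kind of model the abstract describes.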
-
Technology for collecting initial data for constructing models that assess the functional state of a human by the pupil's response to illumination changes in solving some transport safety problems
Computer Research and Modeling, 2021, v. 13, no. 2, pp. 417-427
This article solves the problem of developing a technology for collecting initial data for building models that assess the functional state of a person. This state is assessed by the person's pupil response to a change in illumination, based on the pupillometry method. The method involves the collection and analysis of initial data (pupillograms) presented in the form of time series characterizing the dynamics of changes in the human pupil under a light impulse. The drawbacks of the traditional approach to collecting initial data using computer vision methods and time-series smoothing are analyzed. Attention is focused on the importance of the quality of the initial data for constructing adequate mathematical models. The need for manual marking of the iris and pupil circles is emphasized as a way to improve the accuracy and quality of the initial data. The stages of the proposed technology for collecting initial data are described. An example of an obtained pupillogram is given, which has a smooth shape and does not contain outliers, noise, anomalies, or missing values. Based on the presented technology, a software and hardware complex has been developed, consisting of special software with two main modules and hardware implemented on the basis of a Raspberry Pi 4 Model B microcomputer with peripheral equipment that provides the specified functionality. To evaluate the effectiveness of the developed technology, models of a single-layer perceptron and an ensemble of neural networks are used, constructed from initial data on the intoxication functional state of a person.
The studies have shown that the use of manual marking of the initial data (in comparison with automatic computer vision methods) leads to a decrease in the number of type I and type II errors and, accordingly, to an increase in the accuracy of assessing the functional state of a person. Thus, the presented technology for collecting initial data can be effectively used to build adequate models for assessing the functional state of a person by the pupillary response to changes in illumination. The use of such models is relevant in solving individual problems of ensuring transport security, in particular, monitoring the functional state of drivers.
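Extraction of the pupillogram parameters named in the companion article (initial and minimum diameter, constriction amplitude and rate, latent reaction time, contraction time) from a cleaned time series can be sketched as follows; the synthetic series below is illustrative, not real pupillometry data:

```python
def pupillogram_params(diam, fs, flash_idx=0):
    """diam: pupil diameter samples (mm); fs: sampling rate (Hz);
    flash_idx: sample index of the light impulse."""
    d0 = diam[flash_idx]                          # initial diameter
    i_min = min(range(len(diam)), key=lambda i: diam[i])
    d_min = diam[i_min]                           # minimum diameter
    amplitude = d0 - d_min                        # constriction amplitude
    # latent reaction time: first post-flash sample where constriction starts
    lat_idx = next(i for i in range(flash_idx + 1, len(diam))
                   if diam[i] < d0)
    latency = (lat_idx - flash_idx) / fs          # s
    t_contract = (i_min - lat_idx) / fs           # contraction time, s
    rate = amplitude / t_contract if t_contract > 0 else 0.0  # mm/s
    return {"d0": d0, "d_min": d_min, "amplitude": amplitude,
            "latency": latency, "contraction_time": t_contract,
            "constriction_rate": rate}

# Synthetic pupillogram: plateau, constriction, partial re-dilation.
series = [5.0] * 5 + [4.6, 4.1, 3.7, 3.4, 3.2, 3.1] + [3.3, 3.6, 3.9, 4.2]
p = pupillogram_params(series, fs=30.0)
print(round(p["amplitude"], 2), round(p["latency"], 3))
```

A smooth, outlier-free input of the kind the manual-marking technology produces is exactly what makes such simple extrema- and threshold-based definitions reliable.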
-
A model justifying the concentration of border security efforts at the state level
Computer Research and Modeling, 2019, v. 11, no. 1, pp. 187-196
The most important principle of military science and border security is the principle of concentrating the main efforts on the main directions and tasks. At the tactical level, there are many mathematical models for computing the optimal resource allocation by directions and objects, whereas at the state level there are no corresponding models. Using statistical data on the results of the protection of the US border, the parameter of an exponential-type border production function is calculated, reflecting the organizational and technological capabilities of the border guard. The production function determines the dependence of the probability of detaining offenders on the density of border guards per kilometer of the border. Financial indicators are not taken into account in the production function, since the border maintenance budget and border equipment correlate with the number of border agents. The objective function of the border guard is defined as the total prevented damage from detained violators, taking into account their expected danger to the state and society, which is to be maximized. Using Slater's condition, the solution of the problem was found: the optimal density of border guards was calculated for the regions of the state. With this resource allocation model, the reverse problem was also solved for three border regions of the United States: threats in the regions were assessed based on the known allocation of resources. The expected danger from an individual offender on the US-Canadian border is 2–5 times higher than from an offender on the US-Mexican border.
The results of the calculations are consistent with the views of US security experts: illegal migrants are mostly detained on the US-Mexican border, while potential terrorists prefer to use other channels of penetration into the US (including the US-Canadian border), where the risks of being detained are minimal. The results are also consistent with established border protection practice: by 2013 the number of border guards outside checkpoints on the US-Mexican border had doubled compared with 2001, while on the US-Canadian border it had quadrupled. The practice of border protection and the views of specialists provide grounds for considering the model verified.
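The allocation logic described above can be sketched as follows: with an exponential production function $p(x) = 1 - e^{-ax}$ for the detention probability at guard density $x$ and per-region threat weights, maximizing total prevented damage under a fixed total of guards gives, by the KKT conditions (Slater's condition holds), $x_i = \max(0, \ln(a\lambda_i/\mu)/a)$ with the multiplier $\mu$ chosen to exhaust the budget. All numbers here are illustrative, not the paper's US data:

```python
import math

def optimal_density(lam, a, total, iters=60):
    """lam: relative expected danger per region; a: production-function
    parameter; total: overall guard resource. Bisection on the
    Lagrange multiplier mu."""
    lo, hi = 1e-12, a * max(lam)
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        x = [max(0.0, math.log(a * l / mu) / a) for l in lam]
        if sum(x) > total:
            lo = mu      # allocation over budget -> raise mu
        else:
            hi = mu      # under budget -> lower mu
    return x

lam = [1.0, 5.0, 2.5]    # illustrative threat weights for 3 regions
x = optimal_density(lam, a=0.8, total=10.0)
print([round(v, 3) for v in x], round(sum(x), 3))
```

Note that more dangerous regions receive more guards, but only logarithmically so: a fivefold threat does not require five times the density, which is the qualitative effect behind the paper's inverse threat-assessment exercise.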
-
National security and geopotential of the State: mathematical modeling and forecasting
Computer Research and Modeling, 2015, v. 7, no. 4, pp. 951-969
Using mathematical modeling together with geopolitical, historical, and natural-science approaches, a model of national security is constructed. The security model reflects the dichotomy of the values of development and conservation, being the product of the corresponding functions. In this paper we evaluate the basic parameters of the model and discuss some of its applications in the field of geopolitics and national security.
-
Approaches to clustering social network groups
Computer Research and Modeling, 2015, v. 7, no. 5, pp. 1127-1139
The research is devoted to the problem of the use of social networks as a tool of illegal activity and as a source of information that could be dangerous to society. The article presents the structure of a multiagent system with which social network groups can be clustered according to criteria that uniquely define a group as destructive. The clustering algorithm of the system's agents is described.
-
Reconstruction of the security of the Roman Empire
Computer Research and Modeling, 2016, v. 8, no. 1, pp. 169-200
The paper considers the model of national security, which reflects the dichotomy of the values of development and conservation, and evaluates its parameters using the examples of Russia (the USSR), the United States, Germany, and Ukraine. Calculations are carried out to assess the security of the Roman Empire. It is shown that by 160 AD the conservation value function had reached a critically low value, which served as the impetus for modernization and reform.
-
Forecasting the labor force dynamics in a multisectoral labor market
Computer Research and Modeling, 2021, v. 13, no. 1, pp. 235-250The article considers the problem of forecasting the number of employed and unemployed persons in a multisectoral labor market using a balance mathematical model of labor force intersectoral dynamics.
The balance mathematical model makes it possible to calculate the values of intersectoral dynamics indicators using only the statistical data on sectoral employment and unemployment provided by the Federal State Statistics Service. Intersectoral labor force dynamics indicators calculated for several years in a row are used to build a trend for each indicator. The found trends are used to calculate forecast values of the intersectoral dynamics indicators. Sectoral employment and unemployment in the studied multisectoral labor market are then forecast based on the values of these indicators.
The proposed approach was applied to forecast the number of employed persons in the economic sectors of the Russian Federation in 2011–2016. The following types of trends were used to describe changes in the values of intersectoral dynamics indicators: linear, non-linear, and constant. The trend selection procedure is demonstrated by the example of the indicators that determine labor force movements from the "Transport and communications" sector to the "Healthcare and social services" sector, as well as from the "Public administration and military security, social security" sector to the "Education" sector.
Several approaches to forecasting were compared: a) a naive forecast, in which the labor market indicators were forecast using only a constant trend; b) forecasting based on the balance model using only a constant trend for all intersectoral dynamics indicators; c) forecasting directly by the number of employed persons in economic sectors using the types of trends considered in the article; d) forecasting based on the balance model with a trend chosen for each intersectoral dynamics indicator.
The article shows that the use of the balance model provides better forecast quality than forecasting directly by the number of employed persons, and that the use of trends in the intersectoral dynamics indicators further improves the forecast quality. The article also provides examples of analyzing the multisectoral labor market of the Russian Federation. Using the balance model, the following information was obtained: the distribution of labor force flows outgoing from particular sectors across the sectors of the economy, and the sectoral structure of labor force flows ingoing into particular sectors. This information is not directly contained in the data provided by the Federal State Statistics Service.
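The forecasting step of such a balance model can be sketched as a matrix of intersectoral flow shares applied to the sectoral employment vector, $e(t+1) = P^{\top} e(t)$. Here $P$ is a fixed (constant-trend) matrix with made-up numbers for three stand-in sectors; the paper instead fits a linear, non-linear, or constant trend to each entry:

```python
def forecast(employment, P, years=1):
    """employment: persons per sector; P[i][j]: share of sector i's
    labor force moving to sector j per year (rows sum to 1)."""
    n = len(employment)
    e = list(employment)
    for _ in range(years):
        e = [sum(e[i] * P[i][j] for i in range(n)) for j in range(n)]
    return e

# Illustrative 3-sector flow-share matrix (e.g. transport, healthcare,
# education); diagonal entries are the shares that stay put.
P = [[0.90, 0.06, 0.04],
     [0.03, 0.95, 0.02],
     [0.02, 0.03, 0.95]]
e0 = [1000.0, 800.0, 600.0]
e1 = forecast(e0, P)
print([round(v, 1) for v in e1])
```

Because each row of P sums to 1, the total labor force is conserved; the balance property is what lets the model recover flow structure that the aggregate statistics do not report directly.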
-
Theoretical modeling of consensus building in the work of standardization technical committees with coalitions based on regular Markov chains
Computer Research and Modeling, 2020, v. 12, no. 5, pp. 1247-1256
Decisions in social groups are often made by consensus. This applies, for example, to the examination in a technical committee for standardization (TC) before the approval of a national standard by Rosstandart: the standard is approved if and only if consensus is secured in the TC. The same approach to standards development has been adopted in almost all countries and at the regional and international levels. Previously published works by the authors were dedicated to constructing a mathematical model of the time to reach consensus in technical committees for standardization under variation of the number of TC members and their level of authoritarianism. The present study continues these works for the case of coalitions, which are often formed during the consideration of a draft standard in the TC. In the article a mathematical model of consensus building in the work of technical standardization committees in the presence of coalitions is constructed. In the framework of the model it is shown that in the presence of coalitions consensus is not achievable. However, coalitions are, as a rule, overcome during the negotiation process; otherwise the number of adopted standards would be extremely small. The paper analyzes the factors that influence the overcoming of coalitions: the size of the concession and the coalition influence index. On the basis of statistical modeling of regular Markov chains, their effect on the time to secure consensus in the technical committee is investigated. It is shown that the time to reach consensus depends significantly on the size of a unilateral concession by a coalition and depends weakly on the size of the coalitions. A regression model of the dependence of the average number of approval iterations on the size of the concession is built.
It was revealed that even a small concession leads to the onset of consensus, and increasing the size of the concession leads (other factors being equal) to a sharp decline in the time to consensus. It is shown that a concession by a larger coalition to smaller coalitions takes, on average, more time before consensus. The result has practical value for all organizational structures where the emergence of coalitions makes decision-making by consensus impossible and requires the consideration of various methods for reaching a consensus decision.
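The qualitative mechanism can be sketched with DeGroot-style opinion averaging $x(t+1) = Wx(t)$, where $W$ is a row-stochastic influence matrix. With two coalitions that put zero weight on each other, $W$ is reducible and opinions never converge to a common value; even a small mutual concession eps makes the chain regular and consensus is reached. The committee size, opinions, and weights below are illustrative, not the paper's model parameters:

```python
def step_opinions(x, W):
    """One round of opinion averaging: x <- W x."""
    n = len(x)
    return [sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]

def time_to_consensus(x, W, tol=1e-3, max_steps=10000):
    """Rounds until all opinions agree within tol; None if never."""
    for t in range(max_steps):
        if max(x) - min(x) < tol:
            return t
        x = step_opinions(x, W)
    return None

def two_coalitions(eps):
    """4 members, coalitions {0,1} and {2,3}; eps = total weight
    each member concedes to the other coalition."""
    a, b = 0.5 - eps / 2, eps / 2
    return [[a, a, b, b],
            [a, a, b, b],
            [b, b, a, a],
            [b, b, a, a]]

x0 = [0.0, 0.1, 0.9, 1.0]
print(time_to_consensus(x0, two_coalitions(0.0)))   # blocked: None
print(time_to_consensus(x0, two_coalitions(0.2)) <
      time_to_consensus(x0, two_coalitions(0.02)))  # bigger concession -> faster
```

This reproduces the two effects the abstract reports: zero concession makes consensus unreachable, and a larger concession sharply shortens the time to consensus.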
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index
International Interdisciplinary Conference "Mathematics. Computing. Education"