All issues
- 2024 Vol. 16
- 2023 Vol. 15
- 2022 Vol. 14
- 2021 Vol. 13
- 2020 Vol. 12
- 2019 Vol. 11
- 2018 Vol. 10
- 2017 Vol. 9
- 2016 Vol. 8
- 2015 Vol. 7
- 2014 Vol. 6
- 2013 Vol. 5
- 2012 Vol. 4
- 2011 Vol. 3
- 2010 Vol. 2
- 2009 Vol. 1
- Noise removal from images using the proposed three-term conjugate gradient algorithm
Computer Research and Modeling, 2024, v. 16, no. 4, pp. 841-853
Conjugate gradient algorithms form an important class of unconstrained optimization algorithms with strong local and global convergence properties and modest memory requirements. They occupy a position between the steepest descent method and Newton's method: they require only first derivatives, so they avoid computing and storing the second derivatives (the Hessian matrix or its approximations) that Newton's method needs, and they converge faster than steepest descent, overcoming its slow convergence. For these reasons they are widely used in optimization applications. This study proposes a novel method for image restoration that fuses the convex combination approach with a hybrid conjugate gradient (CG) method to create a hybrid three-term CG algorithm. The conjugate parameter of the search direction combines the features of the Fletcher-Reeves (FR) parameter and its hybrid variant. The search direction itself is a combination of the current gradient direction, the previous search direction, and the gradient from the previous iteration. We show that, under standard assumptions and an inexact line search satisfying the standard Wolfe conditions, the new algorithm generates descent directions and is globally convergent. Numerical experiments on image restoration problems confirm the effectiveness of the suggested algorithm: it shows high efficiency, accuracy, and speed of convergence compared with the Fletcher-Reeves (FR) and three-term Fletcher-Reeves (TTFR) methods.
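Below is a minimal sketch of how a three-term FR-type conjugate gradient iteration with a Wolfe-conditions line search can be organized. It is an illustration only, not the paper's exact hybrid algorithm: the function name `three_term_fr_cg`, the third-term coefficient `theta`, and the fallback step size are assumptions made for this example.

```python
import numpy as np
from scipy.optimize import line_search  # line search enforcing (strong) Wolfe conditions

def three_term_fr_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term Fletcher-Reeves CG loop (illustrative, not the paper's hybrid)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]  # Wolfe-conditions step length
        if alpha is None:                     # line search failed: take a small fixed step
            alpha = 1e-3
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves conjugate parameter
        theta = (g_new @ d) / (g @ g)         # illustrative third-term coefficient
        # Three-term direction: current gradient, previous direction, previous gradient
        d = -g_new + beta_fr * d - theta * g
        x, g = x_new, g_new
    return x
```

In an image restoration setting, `f` would typically be a smooth data-fidelity-plus-regularization functional over the pixel values and `grad` its gradient.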
- A New Method For Point Estimating Parameters Of Simple Regression
Computer Research and Modeling, 2014, v. 6, no. 1, pp. 57-77
A new method for finding the parameters of a univariate regression model is described: the greatest cosine method. The method divides the regression model parameters into two groups. The first group, which determines the angle between the experimental data vector and the regression model vector, is found by maximizing the cosine of the angle between these vectors. The second group consists of the scale factor, which is determined by "straightening" the relationship between the experimental data vector and the regression model vector. The interrelation between the greatest cosine method and the method of least squares is examined, and the efficiency of the method is illustrated by examples.
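The split into an "angle" group and a scale factor can be illustrated on a model of the hypothetical form y ≈ c · φ(x, θ): θ is chosen to maximize the cosine between the data vector and the model vector, and the scale c is then recovered as an orthogonal projection coefficient. The concrete model φ, the bounded scalar optimizer, and the names below are assumptions for the sketch, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fit_greatest_cosine(x, y, phi, bounds):
    """Fit y ≈ c * phi(x, theta): theta by maximum cosine, c by projection (illustrative)."""
    def neg_cosine(theta):
        v = phi(x, theta)                     # model vector for a given theta
        return -(y @ v) / (np.linalg.norm(y) * np.linalg.norm(v))

    theta = minimize_scalar(neg_cosine, bounds=bounds, method="bounded").x
    v = phi(x, theta)
    c = (y @ v) / (v @ v)                     # scale factor from orthogonal projection
    return theta, c

# Hypothetical usage: data roughly following y = 2 * exp(0.5 * x)
x = np.linspace(0.0, 3.0, 50)
rng = np.random.default_rng(0)
y = 2.0 * np.exp(0.5 * x) + 0.05 * rng.normal(size=x.size)
theta, c = fit_greatest_cosine(x, y, lambda xs, t: np.exp(t * xs), bounds=(0.0, 2.0))
```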
Indexed in Scopus
Full-text version of the journal is also available on the web site of the scientific electronic library eLIBRARY.RU
The journal is included in the Russian Science Citation Index