Title: The Steepest Descent Method Using the Empirical Mode Gradient Decomposition
Author(s): Esaulov, Vasiliy
Sinetsky, Roman
Issue Date: 2020
Language: English
Subjects: Data processing
Abstract: The aim of this article is to study possible improvements to gradient optimization methods. The approach rests on a structured description of the gradient that sets the direction of the search for a solution. A modification of the steepest descent method for global optimization based on the Hilbert-Huang transform is proposed: the gradient of the objective function is decomposed into empirical modes. The main results are iterative optimization methods in which, in addition to the gradient itself, its empirical modes are taken into account. New estimates of the descent step are obtained that could not be derived in the classical formulation of the steepest descent method. Their correctness follows from the fact that, when the gradient cannot be decomposed, they reduce to the existing estimates for the steepest descent method. The theoretical significance of the results lies in extending existing gradient methods with a previously unused description of the gradient. The practical significance is that the proposed recommendations can accelerate the convergence of gradient methods and improve the accuracy of their results. Computational experiments carried out in Python confirmed the adequacy and robustness of the proposed method.
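The core idea in the abstract can be illustrated with a minimal sketch: at each iteration the gradient is decomposed into modes, and each mode contributes its own descent direction with its own step size. This is not the paper's implementation. The decomposition below is a crude smoothing-plus-residual split standing in for empirical mode decomposition, the fixed per-mode step sizes are hypothetical placeholders for the step estimates derived in the paper, and the quadratic test function is invented for illustration.

```python
import numpy as np

def objective(x):
    # Simple separable quadratic used only as an illustrative test function.
    return 0.5 * np.sum(x ** 2)

def gradient(x):
    # Gradient of the quadratic above.
    return x

def decompose(g):
    # Stand-in for empirical mode decomposition (EMD) of the gradient signal:
    # a moving-average "low-frequency" component plus the residual. The paper
    # uses the Hilbert-Huang transform instead; note the modes still sum to g.
    kernel = np.ones(3) / 3.0
    low = np.convolve(g, kernel, mode="same")
    high = g - low
    return [low, high]

def mode_steepest_descent(x0, steps=100):
    # Sketch of mode-wise steepest descent: each mode of the gradient is
    # applied with its own step size (fixed here, not the paper's estimates).
    x = np.asarray(x0, dtype=float)
    mode_steps = [0.5, 0.5]  # hypothetical per-mode step sizes
    for _ in range(steps):
        modes = decompose(gradient(x))
        for h, mode in zip(mode_steps, modes):
            x = x - h * mode
    return x
```

With equal per-mode steps the update collapses to ordinary steepest descent, which matches the abstract's remark that the new estimates reduce to the classical ones when no decomposition is available; the interesting behaviour comes from choosing the per-mode steps differently.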
Open Access: Open access publication
License: In Copyright
Appears in Collections:International Conference on Applied Innovations in IT (ICAIIT)

Files in This Item:
File: 2_1_Esaulov.pdf
Description: The Steepest Descent Method Using the Empirical Mode Gradient Decomposition
Size: 507.3 kB
Format: Adobe PDF