Impact Factor (2025): 6.9
DOI Prefix: 10.47001/IRJIET
Vol 10 No 3 (2026): Volume 10, Issue 3, March 2026 | Pages: 145-155
International Research Journal of Innovations in Engineering and Technology
OPEN ACCESS | Research Article | Published Date: 21-03-2026
This study investigates the impact of an inexact (δ,L) oracle on the convergence of the classical gradient algorithm and Nesterov's accelerated algorithm for smooth convex optimization problems [1]. We show that the classical gradient method maintains a convergence rate of O(1/k) with an accuracy floor determined by δ, without error accumulation [2]. In contrast, Nesterov's algorithm achieves an accelerated rate of O(1/k²) at the expense of error accumulation of order O(kδ) [1,2]. Through a numerical example, we confirm that acceleration provides an advantage in the early iterations, while the classical method exhibits greater stability as the oracle error grows, demonstrating that the choice of algorithm is directly linked to the quality of the oracle.
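The comparison described in the abstract can be illustrated with a minimal sketch (this is not the paper's experiment). It assumes a toy one-dimensional quadratic f(x) = ½μx², an inexact oracle that perturbs the true gradient by bounded noise of magnitude δ, and the standard step size 1/L for both methods:

```python
import random

MU = 0.1   # curvature of the toy objective f(x) = 0.5 * MU * x**2
L = 1.0    # assumed Lipschitz constant of the gradient (MU <= L)

def inexact_grad(x, delta, rng):
    # (delta, L) oracle model: true gradient plus bounded noise of size delta.
    return MU * x + rng.uniform(-delta, delta)

def gradient_method(x0, delta, iters, seed=0):
    # Classical gradient method with step 1/L.
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        x -= (1.0 / L) * inexact_grad(x, delta, rng)
    return x

def nesterov_method(x0, delta, iters, seed=0):
    # Nesterov's accelerated method with the standard t-momentum sequence.
    rng = random.Random(seed)
    x, y, t = x0, x0, 1.0
    for _ in range(iters):
        x_next = y - (1.0 / L) * inexact_grad(y, delta, rng)
        t_next = 0.5 * (1.0 + (1.0 + 4.0 * t * t) ** 0.5)
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# With an exact oracle (delta = 0), acceleration pulls ahead of the
# classical method within a few iterations.
print(abs(gradient_method(10.0, 0.0, 20)))
print(abs(nesterov_method(10.0, 0.0, 20)))
```

With δ > 0 the sketch lets one observe the trade-off numerically: the classical iterates settle near an accuracy floor set by δ, while the accelerated iterates reach it sooner but are more sensitive to the noise as the iteration count grows.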
First-order methods, inexact oracle (δ,L), convex optimization, Lipschitz constant, classical gradient method, Nesterov's accelerated method, error accumulation, convergence rate
Abdulrahman Al-Younes, & Iyad Al-Hammada. (2026). Comparison between the Classical Gradient Algorithm and Nesterov's Accelerated Algorithm under an Inexact Oracle (δ,L). International Research Journal of Innovations in Engineering and Technology - IRJIET, 10(3), 145-155. Article DOI: https://doi.org/10.47001/IRJIET/2026.103020
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.