This study investigates the impact of an inexact (δ, L)-oracle on the convergence of the classical gradient method and Nesterov's accelerated method for smooth convex optimization problems [1].
We show that the classical gradient method retains its O(1/k) convergence rate, with an accuracy floor of order δ and no error accumulation [2].
In contrast, Nesterov's method achieves the accelerated rate O(1/k²) at the cost of error accumulation of order O(kδ) [1,2].
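For concreteness, the rates in question take schematically the following form (a sketch in the (δ, L)-oracle model; here R denotes the distance from the starting point to a minimizer and C is an absolute constant, with the exact constants depending on the precise oracle definition in [1]):

```latex
% Schematic convergence bounds under a (\delta, L)-oracle.
% R = \|x_0 - x^*\|; C is an absolute constant (exact values depend on [1]).
\begin{align*}
  \text{Gradient method:} \quad
    & f(x_k) - f^* \;\le\; \frac{L R^2}{2k} + \delta, \\
  \text{Nesterov's method:} \quad
    & f(x_k) - f^* \;\le\; \frac{4 L R^2}{(k+1)^2} + C\, k\, \delta.
\end{align*}
```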
Through a numerical example, we confirm that acceleration provides an advantage in the early iterations, while the classical method remains more stable as the error grows, demonstrating that the choice of algorithm is directly tied to the quality of the oracle.
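A minimal numerical sketch of this comparison follows; the quadratic objective, the noise model (a bounded perturbation of the gradient emulating a δ-inexact oracle), and all parameter values are illustrative assumptions, not the paper's exact experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative smooth convex problem: f(x) = 0.5 * x^T A x (an assumption,
# not the paper's test case). L is the largest eigenvalue of A.
n = 50
A = np.diag(np.linspace(0.1, 10.0, n))
L = 10.0
f = lambda x: 0.5 * x @ (A @ x)

def inexact_grad(x, delta):
    """Emulate a delta-inexact oracle: exact gradient plus bounded noise."""
    noise = rng.standard_normal(n)
    return A @ x + delta * noise / np.linalg.norm(noise)

def gradient_method(x0, delta, iters):
    """Classical gradient method with step size 1/L."""
    x = x0.copy()
    vals = []
    for _ in range(iters):
        x = x - (1.0 / L) * inexact_grad(x, delta)
        vals.append(f(x))
    return vals

def nesterov_method(x0, delta, iters):
    """Nesterov's accelerated method with a standard momentum weight."""
    x, y = x0.copy(), x0.copy()
    vals = []
    for k in range(iters):
        x_new = y - (1.0 / L) * inexact_grad(y, delta)
        y = x_new + (k / (k + 3)) * (x_new - x)  # momentum extrapolation
        x = x_new
        vals.append(f(x))
    return vals

x0 = np.ones(n)
for delta in (0.0, 0.1):
    gd = gradient_method(x0, delta, 200)
    ng = nesterov_method(x0, delta, 200)
    print(f"delta={delta}: f after 200 iters  GD={gd[-1]:.3e}  Nesterov={ng[-1]:.3e}")
```

With δ = 0, Nesterov's method dominates at every iteration count; with δ > 0 it typically reaches a lower value early but plateaus at a higher error level as k grows, consistent with the O(kδ) accumulation described above.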