Comparison between the Classical Gradient Algorithm and Nesterov's Accelerated Algorithm under an Inexact Oracle (δ,L)

Abstract

This study investigates the impact of an inexact oracle (δ,L) on the convergence of the classical gradient algorithm and Nesterov's accelerated algorithm for smooth convex optimization problems [1]. We show that the classical gradient method maintains a convergence rate of O(1/k) with an accuracy floor determined by δ, without error accumulation [2]. In contrast, Nesterov's algorithm achieves an accelerated rate of O(1/k²) at the expense of an accumulated error of order O(kδ) [1,2]. A numerical example confirms that the acceleration provides an advantage in the early iterations, while the classical method exhibits greater stability as the error grows, demonstrating that the choice of algorithm depends directly on the quality of the oracle.
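To make the contrast concrete, the following minimal sketch (not the paper's own experiment) compares plain gradient descent and Nesterov's accelerated method on a smooth convex quadratic, using an oracle whose gradient is corrupted by a bounded error of norm δ as a stand-in for an inexact (δ,L) oracle; the problem, noise model, and step size 1/L are illustrative assumptions.

```python
# Sketch: gradient method vs. Nesterov's method with a delta-perturbed gradient oracle.
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
Q = A.T @ A / n                          # positive definite Hessian (w.p. 1)
b = rng.standard_normal(n)
L = np.linalg.eigvalsh(Q).max()          # smoothness constant

def f(x):
    return 0.5 * x @ Q @ x - b @ x

def inexact_grad(x, delta):
    """Exact gradient plus a random error of norm exactly delta."""
    g = Q @ x - b
    e = rng.standard_normal(n)
    return g + delta * e / np.linalg.norm(e)

def gradient_method(delta, iters=500):
    x = np.zeros(n)
    for _ in range(iters):
        x = x - inexact_grad(x, delta) / L   # step size 1/L
    return f(x)

def nesterov(delta, iters=500):
    x = y = np.zeros(n)
    t = 1.0
    for _ in range(iters):
        x_new = y - inexact_grad(y, delta) / L
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + (t - 1.0) / t_new * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return f(x)

f_star = f(np.linalg.solve(Q, b))        # exact minimizer of the quadratic
for delta in (0.0, 0.1):
    print(f"delta={delta}: gradient gap {gradient_method(delta) - f_star:.2e}, "
          f"Nesterov gap {nesterov(delta) - f_star:.2e}")
```

With δ = 0 the accelerated method reaches a much smaller gap in the same number of iterations; with δ > 0 its final accuracy degrades more visibly than that of the plain gradient method, which stalls at a floor proportional to δ, in line with the behavior summarized in the abstract.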

Country: Syria

Abdulrahman Al-Younes¹, Iyad Al-Hammada²

  1. Postgraduate (Master's) Student, Department of Mathematics, Faculty of Sciences, University of Aleppo, Syria
  2. Lecturer (Dr.), Department of Mathematics, Faculty of Sciences, University of Aleppo, Syria

IRJIET, Volume 10, Issue 3, March 2026, pp. 145-155

doi.org/10.47001/IRJIET/2026.103020

References

  1. Devolder, O., Glineur, F., & Nesterov, Y. (2014). First-order methods with inexact oracle: The strongly convex case. Mathematical Programming, 147(1), 193–228.
  2. Nesterov, Y. (2018). Lectures on convex optimization (2nd ed.). Springer.
  3. Lan, G. (2020). First-order and stochastic optimization methods for machine learning. Springer.
  4. Boyd, S., & Vandenberghe, L. (2004). Convex optimization. Cambridge University Press.
  5. Bubeck, S. (2015). Convex optimization: Algorithms and complexity. Foundations and Trends in Machine Learning, 8(3–4), 231–357.
  6. Garrigos, G., & Gower, R. M. (2024). Handbook of convergence theorems for (stochastic) gradient methods. arXiv preprint.
  7. Polyak, B. T. (1987). Introduction to optimization. Optimization Software.