Yurii Nesterov

Source: Wikipedia, the free encyclopedia.
Born: USSR
Citizenship: Belgium
Alma mater: Moscow State University (1977)
Doctoral advisor: Boris Polyak

Yurii Nesterov is a Russian mathematician and an internationally recognized expert in convex optimization. He is a professor at the University of Louvain (UCLouvain).

Biography

In 1977, Yurii Nesterov graduated from Moscow State University. He later joined UCLouvain, specifically the Department of Mathematical Engineering of the Louvain School of Engineering and the Center for Operations Research and Econometrics.

In 2000, Nesterov received the Dantzig Prize.[2]

In 2009, Nesterov won the John von Neumann Theory Prize.[3]

In 2016, Nesterov received the EURO Gold Medal.[4]

In 2023, Yurii Nesterov and Arkadi Nemirovski received the WLA Prize in Computer Science or Mathematics, "for their seminal work in convex optimization theory".[5]

Academic work

Nesterov is most famous for his work in convex optimization, including his 2004 book, considered a canonical reference on the subject.[6] His main novel contribution is an accelerated version of gradient descent that converges considerably faster than ordinary gradient descent (commonly referred to as Nesterov momentum, Nesterov acceleration, or Nesterov accelerated gradient, NAG for short).[7][8][9][10][11] The approach was further developed by Beck & Teboulle into the FISTA algorithm, presented in their 2009 paper "A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems".[12]
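
As an illustration (not drawn from the cited sources), the following is a minimal Python sketch of a Nesterov-style accelerated gradient step for a smooth convex quadratic; the function name, step size, and momentum coefficient are illustrative choices rather than Nesterov's original parameter schedule:

```python
import numpy as np

def nesterov_accelerated_gradient(grad, x0, lr, momentum=0.9, iters=100):
    """Minimize a smooth convex function with a Nesterov-style accelerated step.

    grad     -- callable returning the gradient at a point
    x0       -- starting point (NumPy array)
    lr       -- step size (roughly 1/L for an L-smooth function)
    momentum -- momentum coefficient in [0, 1)
    """
    x = x0.copy()
    v = np.zeros_like(x0)
    for _ in range(iters):
        # Evaluate the gradient at the "look-ahead" point x + momentum * v,
        # which is what distinguishes NAG from classical heavy-ball momentum.
        g = grad(x + momentum * v)
        v = momentum * v - lr * g
        x = x + v
    return x

# Example: minimize f(x) = 0.5 * x^T A x - b^T x, whose gradient is A x - b.
A = np.array([[3.0, 0.2], [0.2, 1.0]])
b = np.array([1.0, -2.0])
x_star = nesterov_accelerated_gradient(lambda x: A @ x - b, np.zeros(2), lr=0.3)
print(x_star, np.linalg.solve(A, b))  # the iterate approaches the true minimizer
```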

His work with Arkadi Nemirovski in their 1994 book[13] was the first to point out that interior point methods can solve convex optimization problems, and the first to make a systematic study of semidefinite programming (SDP). Also in this book, they introduced self-concordant functions, which are useful in the analysis of Newton's method.[14]
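
To illustrate how a self-concordant barrier enters the analysis of Newton's method, below is a minimal Python sketch (not taken from the book) of a log-barrier interior point method for linear inequality constraints using damped Newton steps; the function name and parameter values are illustrative assumptions:

```python
import numpy as np

def barrier_method(c, A, b, x0, t=1.0, mu=10.0, outer=8, inner=50):
    """Minimize c @ x subject to A @ x <= b with a logarithmic barrier.

    The log barrier -sum(log(b - A @ x)) is a canonical self-concordant
    function; each outer iteration applies damped Newton steps to
    t * (c @ x) - sum(log(b - A @ x)) and then increases t.
    """
    x = x0.astype(float)
    for _ in range(outer):
        for _ in range(inner):
            r = b - A @ x                      # slacks, must stay positive
            grad = t * c + A.T @ (1.0 / r)     # gradient of the barrier objective
            hess = A.T @ np.diag(1.0 / r**2) @ A
            step = np.linalg.solve(hess, grad)
            lam2 = grad @ step                 # squared Newton decrement
            if lam2 < 1e-10:
                break
            # Damped step length keeps the iterate strictly feasible.
            alpha = 1.0 / (1.0 + np.sqrt(lam2))
            x = x - alpha * step
        t *= mu
    return x

# Example: minimize x + y over the box 0 <= x, y <= 1 (optimum at the origin).
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([1.0, 1.0, 0.0, 0.0])
print(barrier_method(np.array([1.0, 1.0]), A, b, x0=np.array([0.5, 0.5])))
```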

References

  1. ^ "2023 WLA Prize Laureates". 2023. Retrieved September 14, 2023.
  2. ^ "The George B. Dantzig Prize". 2000. Retrieved December 12, 2014.
  3. ^ "John Von Neumann Theory Prize". 2009. Retrieved June 4, 2014.
  4. ^ "EURO Gold Medal". 2016. Retrieved August 20, 2016.
  5. ^ "Laureates of the 2023 WLA Prize Announced". 2023. Retrieved October 4, 2023.
  6. ^ Nesterov, Yurii (2004). Introductory Lectures on Convex Optimization: A Basic Course. Kluwer Academic Publishers.
  7. ^ Nesterov, Y. (1983). "A method for unconstrained convex minimization problem with the rate of convergence O(1/k²)". Doklady AN USSR. 269: 543–547.
  8. ^ ISSN 0036-1445.
  9. ^ Bubeck, Sebastien (April 1, 2013). "ORF523: Nesterov's Accelerated Gradient Descent". Retrieved June 4, 2014.
  10. ^ Bubeck, Sebastien (March 6, 2014). "Nesterov's Accelerated Gradient Descent for Smooth and Strongly Convex Optimization". Retrieved June 4, 2014.
  11. ^ "The zen of gradient descent". blog.mrtz.org. Retrieved 2023-05-13.
  12. ^ Beck, Amir; Teboulle, Marc (2009). "A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems". SIAM Journal on Imaging Sciences. 2 (1): 183–202.
  13. ^ Nesterov, Yurii; Nemirovskii, Arkadii (1994). Interior-Point Polynomial Algorithms in Convex Programming. SIAM Studies in Applied Mathematics. Philadelphia: SIAM.
  14. . Retrieved October 15, 2011.
