
Numerical Optimization with Computational Errors

Alexander J. Zaslavski

PDF
approx. 96.29

Springer International Publishing

Natural Sciences, Medicine, Computer Science, Technology / Miscellaneous

Description

This book studies approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space, with these algorithms examined while taking computational errors into account. The author shows that such algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are also examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative.

 

This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
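
To give a flavor of the setting, the following minimal Python sketch (not taken from the book; the objective, feasible set, step sizes, and error bound are hypothetical choices for illustration) runs a projected subgradient method in which every subgradient evaluation is perturbed by a bounded computational error. With a small error bound, the method still reaches a good approximate solution, which is the kind of behavior the book analyzes rigorously.

```python
# Illustrative sketch only: projected subgradient method with bounded
# computational errors, for min f(x) over a closed ball C in R^n.
# All problem data below are hypothetical choices for this example.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical convex objective: f(x) = ||x - target||_1,
# with the minimizer chosen to lie inside the unit ball.
target = np.array([0.8, -0.3, 0.5])

def f(x):
    return np.abs(x - target).sum()

def subgradient(x):
    # A subgradient of the l1 distance (sign convention at 0 is arbitrary).
    return np.sign(x - target)

def project_onto_ball(x, radius=1.0):
    # Projection onto the feasible set C = {x : ||x|| <= radius}.
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def noisy_subgradient_method(x0, steps=2000, delta=1e-2):
    # "delta" bounds the computational error added to each subgradient.
    x = x0.copy()
    best = f(x)
    for k in range(1, steps + 1):
        error = rng.uniform(-delta, delta, size=x.shape)   # bounded error
        g = subgradient(x) + error                         # inexact subgradient
        x = project_onto_ball(x - (1.0 / np.sqrt(k)) * g)  # diminishing steps
        best = min(best, f(x))
    return best

# Best objective value found: close to the minimum value 0,
# up to a small error floor governed by delta.
print(noisy_subgradient_method(np.zeros(3)))
```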

  



Keywords

nonlinear programming, mathematical programming, proximal point methods, continuous subgradient method, extragradient methods