Optimization for Machine Learning (Neural Information Processing Series)

  • Type: Physical Book
  • Publisher: MIT Press
  • Year: 2011
  • Language: English
  • Pages: 512
  • Format: Paperback
  • Dimensions: 25.1 x 20.1 x 2.5 cm
  • Weight: 1.07 kg
  • ISBN13: 9780262537766

Optimization for Machine Learning (Neural Information Processing Series)

Suvrit Sra (Edited by) · Sebastian Nowozin (Edited by) · Stephen J. Wright (Edited by) · MIT Press · Paperback


New Book

£ 72.35

  • Condition: New
  • Origin: U.S.A. (import costs included in the price)
  • Ships from our warehouse between Tuesday, June 11 and Thursday, June 27.
  • Delivery anywhere in the United Kingdom within 1 to 3 business days after shipment.

Synopsis "Optimization for Machine Learning (Neural Information Processing Series)"

An up-to-date account of the interplay between optimization and machine learning, accessible to students and researchers in both communities.

The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to researchers in both fields.

Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts the process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.
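To make the blurb's mention of first-order and proximal methods concrete, here is a minimal sketch of proximal gradient descent (ISTA) applied to the l1-regularized least-squares (lasso) problem. It is not taken from the book; the function names and the synthetic data are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iters=500):
    """Proximal gradient (ISTA) for: minimize 0.5*||Ax - b||^2 + lam*||x||_1."""
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant
    # of the gradient of the smooth least-squares term.
    L = np.linalg.norm(A, ord=2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)   # proximal (shrinkage) step
    return x

# Small synthetic problem: sparse ground truth, noisy observations.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(50)

x_hat = ista(A, b, lam=0.1)
print("nonzero coefficients recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```

Adding Nesterov acceleration to the same proximal step yields FISTA, a standard accelerated variant of this first-order scheme.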

Customer reviews

No customer reviews have been posted for this book yet.

Frequently Asked Questions about the Book

  • All books in our catalog are original.
  • The book is written in English.
  • The binding of this edition is Paperback.

