The goal of a learning machine is often formalized as an optimization problem, i.e., the minimization or maximization of a mathematical function. Numerical methods are crucial when no analytical solution to the optimization problem exists. The most popular numerical optimization algorithms in ML use derivative information, for example gradient descent, accelerated gradient methods (Momentum, Nesterov's method, etc.), Newton's method, and L-BFGS.
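As a minimal sketch of the idea (not code from the group), the Python example below runs gradient descent with momentum on a toy quadratic whose analytical minimizer is known, so the numerical and analytical solutions can be compared. The matrix A, vector b, step size, and momentum coefficient are illustrative choices, not values taken from the text.

```python
import numpy as np

# Toy quadratic f(x) = 0.5 * x^T A x - b^T x, with gradient A x - b.
# A is symmetric positive definite, so f is convex with a unique minimizer.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, -2.0])

def grad(x):
    """Gradient of f at x."""
    return A @ x - b

def gradient_descent_momentum(x0, lr=0.1, momentum=0.9, steps=200):
    """Run heavy-ball momentum gradient descent from x0."""
    x = x0.copy()
    v = np.zeros_like(x)          # velocity accumulates past gradient steps
    for _ in range(steps):
        v = momentum * v - lr * grad(x)
        x = x + v
    return x

x_star = np.linalg.solve(A, b)    # analytical minimizer, exists here because A is invertible
x_hat = gradient_descent_momentum(np.zeros(2))
print("numerical:", x_hat, "analytical:", x_star)
```

When no such closed-form solution exists (as is typical for neural-network losses), only the iterative update inside the loop remains available, which is why derivative-based numerical methods are central in ML.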
Application area of the research group:
Faculty associated with the research group: