- In mathematics, the subderivative (or subgradient) generalizes the derivative to convex functions that are not necessarily differentiable. The set of subderivatives...
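The set-valued nature of the subderivative can be made concrete with the absolute value function, whose subdifferential at 0 is the whole interval [-1, 1]. A minimal Python sketch (the function name and interface are illustrative assumptions, not from the source):

```python
def subgradient_abs(x):
    """Return one subgradient of f(x) = |x| at x.

    For x != 0 the function is differentiable, so the only subgradient
    is sign(x); at the kink x = 0, any value in [-1, 1] is a valid
    subgradient, and we arbitrarily return 0.
    """
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0  # any g in [-1, 1] would be equally valid here
```

Returning an arbitrary element of the subdifferential is exactly what a subgradient-based algorithm needs: any valid choice preserves the convergence guarantees.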
- Subgradient methods are convex optimization methods which use subderivatives. Originally developed by Naum Z. Shor and others in the 1960s and 1970s,...
- Cutting-plane methods · Ellipsoid method · Subgradient method · Dual subgradients and the drift-plus-penalty method. Subgradient methods can be implemented simply...
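A hedged sketch of how simply the basic iteration x_{k+1} = x_k - α_k g_k can be implemented, with g_k any subgradient of f at x_k (all names and the toy problem f(x) = |x - 2| are illustrative assumptions):

```python
def subgradient_method(f, subgrad, x0, step_sizes):
    """Minimize a convex, possibly non-smooth f via x <- x - alpha * g.

    Subgradient steps need not decrease f at every iteration (it is not
    a descent method), so the best iterate seen so far is tracked.
    """
    x = best_x = x0
    best_f = f(x0)
    for alpha in step_sizes:
        g = subgrad(x)
        x = x - alpha * g
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Toy run: minimize f(x) = |x - 2| from x0 = 5 with steps 1/(k+1).
f = lambda x: abs(x - 2.0)
sg = lambda x: 1.0 if x > 2.0 else (-1.0 if x < 2.0 else 0.0)
best_x, best_f = subgradient_method(
    f, sg, 5.0, [1.0 / (k + 1) for k in range(500)]
)
```

Tracking the best iterate is essential in practice: unlike gradient descent on a smooth function, the objective value along the raw iterates can oscillate.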
- size rules, which were first developed for classical subgradient methods. Classical subgradient methods using divergent-series rules are much slower than...
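The behaviour of different step-size rules can be sketched numerically on the toy problem f(x) = |x| (a hypothetical experiment; none of the names or numbers below come from the source). A divergent-series rule (α_k → 0 with Σ α_k = ∞, e.g. α_k = 1/k) drives the iterates to the minimizer, whereas a constant step only reaches a neighbourhood whose size scales with the step:

```python
def run(step_rule, n=2000, x0=5.0):
    """Subgradient iteration on f(x) = |x|; return best objective seen."""
    x, best = x0, abs(x0)
    for k in range(1, n + 1):
        g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)  # g in d|x|
        x -= step_rule(k) * g
        best = min(best, abs(x))
    return best

# Divergent-series (diminishing) rule: converges to the minimum.
best_diminishing = run(lambda k: 1.0 / k)
# Constant rule: stalls in a neighbourhood of radius ~ alpha/2.
best_constant = run(lambda k: 0.4)
```

The trade-off this illustrates: diminishing rules give convergence but slow progress, which is the slowness the excerpt above refers to.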
- have a "
subgradient oracle": a
routine that can
compute a
subgradient of f at any
given point (if f is differentiable, then the only
subgradient is the...
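One way such an oracle can be realised, sketched here for a pointwise maximum of affine functions, a standard non-smooth convex example (the names and interface are illustrative assumptions):

```python
def max_affine_oracle(pieces):
    """Build a subgradient oracle for f(x) = max_i (a_i * x + b_i).

    At any x, the slope a_i of a piece attaining the maximum is a valid
    subgradient; where exactly one piece is active, f is differentiable
    and that slope is the ordinary derivative.
    """
    def oracle(x):
        # max over (value, slope) pairs: ties broken arbitrarily,
        # which is fine since any active slope is a subgradient.
        value, slope = max((a * x + b, a) for a, b in pieces)
        return value, slope
    return oracle

# Example: pieces (1, 0) and (-1, 0) give f(x) = max(x, -x) = |x|.
oracle = max_affine_oracle([(1.0, 0.0), (-1.0, 0.0)])
```

Returning both the value and a subgradient in one call matches how such oracles are typically consumed by bundle and subgradient methods.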
- z_t⟩. To generalise the algorithm to any convex loss function, the subgradient ∂v_t(w_t) of v_t...
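The online update w_{t+1} = w_t - η g_t with g_t ∈ ∂v_t(w_t) can be sketched for the hinge loss v_t(w) = max(0, 1 - y_t⟨w, z_t⟩), a common non-smooth convex loss chosen here purely as an illustrative assumption:

```python
def online_subgradient_step(w, z, y, eta):
    """One online update w <- w - eta * g, with g a subgradient of the
    hinge loss v(w) = max(0, 1 - y * <w, z>) at the current w.
    """
    margin = y * sum(wi * zi for wi, zi in zip(w, z))
    if margin >= 1.0:
        g = [0.0] * len(w)          # loss is flat here; 0 is a subgradient
    else:
        g = [-y * zi for zi in z]   # slope of the active linear piece
    return [wi - eta * gi for wi, gi in zip(w, g)]
```

Because the hinge loss is not differentiable at margin 1, the subgradient (rather than the gradient) is exactly what makes this update well defined everywhere.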
- include coordinate descent, subgradient methods, least-angle regression (LARS), and proximal gradient methods. Subgradient methods are the natural generalization...
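Of the solvers listed above, the proximal gradient approach can be sketched on a one-dimensional lasso-style objective 0.5(x - b)² + λ|x| (a minimal toy illustration; the names and the specific objective are assumptions):

```python
def soft_threshold(v, t):
    """Proximal operator of t * |.|: the soft-thresholding map."""
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def ista_step(x, b, lam, step):
    """One proximal gradient (ISTA) step for 0.5*(x - b)**2 + lam*|x|:
    a gradient step on the smooth part, then the prox of the |.| part."""
    grad = x - b                                # gradient of 0.5*(x - b)**2
    return soft_threshold(x - step * grad, step * lam)
```

With step size 1 this toy problem is solved in a single step, which hints at why proximal methods typically outperform plain subgradient methods when the non-smooth part has a cheap prox.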
- Kiwiel, Krzysztof C. (2001). "Convergence and efficiency of subgradient methods for quasiconvex minimization". Mathematical Programming, Series...
- non-differentiable convex minimization, where a convex objective function and its subgradient can be evaluated efficiently but usual gradient methods for differentiable...