- $\mathrm{LSE}(x_1,\dots,x_n) = \log\left(\exp(x_1)+\cdots+\exp(x_n)\right).$ The LogSumExp function domain is $\mathbb{R}^n$...
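A direct transcription of this definition overflows for large inputs, so implementations usually shift by the maximum first and add the shift back outside the log. Below is a minimal NumPy sketch (the helper name `logsumexp` is illustrative; SciPy ships an equivalent as `scipy.special.logsumexp`):

```python
import numpy as np

def logsumexp(x):
    """LSE(x_1, ..., x_n) = log(exp(x_1) + ... + exp(x_n)).

    Shifting by max(x) keeps every exponent <= 0, avoiding overflow.
    """
    x = np.asarray(x, dtype=float)
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

print(logsumexp([1000.0, 1000.0]))  # ~1000.6931; the naive form would overflow
```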
- The multivariable generalization of the single-variable softplus is the LogSumExp with the first argument set to zero: $\mathrm{LSE}_0^+(x_1,\dots,x_n) := \mathrm{LSE}(0,x_1,\dots,x_n) = \log\left(1+e^{x_1}+\cdots+e^{x_n}\right)$.
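As a quick sanity check, with a single argument this construction reduces to the ordinary softplus $\log(1+e^x)$. A small sketch, with illustrative function names:

```python
import numpy as np

def softplus(x):
    # single-variable softplus: log(1 + exp(x))
    return np.log1p(np.exp(x))

def lse0(*xs):
    # LSE_0^+(x_1, ..., x_n) := LSE(0, x_1, ..., x_n), computed stably
    z = np.asarray((0.0,) + xs)
    m = z.max()
    return m + np.log(np.sum(np.exp(z - m)))

print(softplus(2.0))  # 2.1269...
print(lse0(2.0))      # same value: for n = 1 the two functions coincide
```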
- ... Boltzmann distribution. Another smooth maximum is LogSumExp: $\mathrm{LSE}_\alpha(x_1,\dots,x_n) = \frac{1}{\alpha}\log\sum_{i=1}^{n}\exp(\alpha x_i)$...
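The parameter $\alpha$ controls how sharply $\mathrm{LSE}_\alpha$ approximates the maximum: as $\alpha \to \infty$ it converges to $\max_i x_i$ (and to $\min_i x_i$ as $\alpha \to -\infty$). A minimal sketch of this tempered form:

```python
import numpy as np

def lse_alpha(x, alpha):
    # LSE_alpha(x) = (1/alpha) * log(sum_i exp(alpha * x_i)), computed stably
    z = alpha * np.asarray(x, dtype=float)
    m = z.max()
    return (m + np.log(np.sum(np.exp(z - m)))) / alpha

x = [1.0, 2.0, 3.0]
for alpha in (1.0, 10.0, 100.0):
    print(alpha, lse_alpha(x, alpha))  # 3.4076..., 3.0000045..., ~3.0 -> max(x)
```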
- ... $\mathbb{R}^K$, where the LogSumExp function is defined as $\mathrm{LSE}(z_1,\dots,z_n) = \log\left(\exp(z_1)+\cdots+\exp(z_n)\right)$...
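One common place this definition appears is as the softmax normalizer: the log-probabilities are $z_j - \mathrm{LSE}(z_1,\dots,z_n)$. A minimal sketch of that use, with illustrative names:

```python
import numpy as np

def log_softmax(z):
    # log softmax_j(z) = z_j - LSE(z); subtracting max(z) first keeps it stable
    z = np.asarray(z, dtype=float)
    m = z.max()
    return z - (m + np.log(np.sum(np.exp(z - m))))

z = np.array([1.0, 2.0, 3.0])
p = np.exp(log_softmax(z))
print(p, p.sum())  # softmax probabilities, summing to 1
```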
- ... (log multiplication), and takes addition to log addition (LogSumExp), giving an isomorphism of semirings between the probability semiring and the log semiring...
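Concretely, the isomorphism is the logarithm map: multiplication of probabilities becomes addition of log-values, and addition of probabilities becomes log addition. A small numeric check (NumPy's `np.logaddexp` is the two-term log addition):

```python
import numpy as np

# log maps the probability semiring (R>=0, +, *) onto the log semiring,
# whose "addition" is logaddexp and whose "multiplication" is ordinary +.
a, b = 0.3, 0.5
la, lb = np.log(a), np.log(b)

print(np.log(a + b), np.logaddexp(la, lb))  # semiring addition maps to logaddexp
print(np.log(a * b), la + lb)               # semiring multiplication maps to +
```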
- ... then the exponential function of Y, $X = \exp(Y)$, has a log-normal distribution. A random variable which is log-normally distributed takes only positive...
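A quick simulation illustrates this: exponentiating normal draws yields strictly positive samples, and the sample median sits near $e^\mu$. A sketch using NumPy's random generator:

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=0.0, scale=1.0, size=100_000)  # Y ~ Normal(0, 1)
x = np.exp(y)                                     # X = exp(Y) is log-normal

print(x.min() > 0)   # True: log-normal values are strictly positive
print(np.median(x))  # ~1.0, i.e. exp(mu) with mu = 0
```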
- ... operation, logadd (for multiple terms, LogSumExp) can be viewed as a deformation of maximum or minimum. The log semiring has applications in mathematical...
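The deformation can be made explicit with a temperature parameter: scaling by $t$ inside and $1/t$ outside, logadd tends to the maximum as $t \to +\infty$ and to the minimum as $t \to -\infty$. A minimal sketch:

```python
import numpy as np

def tempered_logadd(x, y, t):
    # (1/t) * log(exp(t*x) + exp(t*y)); logaddexp computes the inner part stably
    return np.logaddexp(t * x, t * y) / t

x, y = 1.0, 2.0
for t in (1.0, 10.0, -10.0):
    print(t, tempered_logadd(x, y, t))  # t -> +inf gives max(x, y); t -> -inf gives min
```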
- ... product of separate exponentials, $\exp(x+y) = \exp x \cdot \exp y$. Its inverse function, the natural...
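This identity is exactly what lets log-domain code replace products by sums (and why the log semiring above uses ordinary $+$ as its multiplication). A trivial numeric check:

```python
import math

x, y = 0.7, 1.9
print(math.exp(x + y))            # 13.4637...
print(math.exp(x) * math.exp(y))  # same value: exp turns sums into products
```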
- ... entropy (negentropy) function is convex, and its convex conjugate is LogSumExp. The inspiration for adopting the word entropy in information theory came...
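The conjugacy can be checked numerically: over the probability simplex, $\sup_p \langle p, x\rangle - \sum_i p_i \log p_i = \mathrm{LSE}(x)$, with the supremum attained at $p = \mathrm{softmax}(x)$. A small verification sketch:

```python
import numpy as np

x = np.array([0.5, -1.0, 2.0])
p = np.exp(x) / np.sum(np.exp(x))  # softmax(x), the maximizing distribution

conjugate_value = p @ x - np.sum(p * np.log(p))  # <p, x> minus negative entropy
lse = np.log(np.sum(np.exp(x)))
print(conjugate_value, lse)  # equal: LSE is the convex conjugate of negentropy
```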