User:PAR/Test

From Wikipedia, the free encyclopedia

Entropy

The information entropy of the von Mises distribution is defined as[1]

H = -\int_\Gamma f(\theta;\mu,\kappa)\,\ln(f(\theta;\mu,\kappa))\,d\theta

where \Gamma is any interval of length 2\pi. The logarithm of the density of the von Mises distribution is straightforward:

\ln(f(\theta;\mu,\kappa)) = \kappa\cos(\theta-\mu) - \ln(2\pi I_0(\kappa))

The characteristic function representation for the von Mises distribution is:

f(\theta;\mu,\kappa) = \frac{1}{2\pi}\left(1 + 2\sum_{n=1}^{\infty}\phi_n\cos(n(\theta-\mu))\right)

where \phi_n = I_n(\kappa)/I_0(\kappa). Substituting these expressions into the entropy integral, exchanging the order of integration and summation, and using the orthogonality of the cosines, the entropy may be written:

H = \ln(2\pi I_0(\kappa)) - \kappa\,\frac{I_1(\kappa)}{I_0(\kappa)}
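The closed form above can be checked numerically: the sketch below (using SciPy's Bessel functions and quadrature; the function names are illustrative, not from the original) compares the closed-form entropy against direct numerical integration of -f ln f over one period.

```python
import numpy as np
from scipy.special import i0, i1
from scipy.integrate import quad

def vonmises_entropy_closed(kappa):
    """Closed form: H = ln(2*pi*I0(kappa)) - kappa*I1(kappa)/I0(kappa)."""
    return np.log(2.0 * np.pi * i0(kappa)) - kappa * i1(kappa) / i0(kappa)

def vonmises_entropy_numeric(kappa, mu=0.0):
    """Direct evaluation of H = -integral of f*ln(f) over an interval of length 2*pi."""
    def f(theta):
        # von Mises density with mean mu and concentration kappa
        return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * i0(kappa))
    H, _ = quad(lambda t: -f(t) * np.log(f(t)), -np.pi, np.pi)
    return H

print(vonmises_entropy_closed(2.0), vonmises_entropy_numeric(2.0))
```

As kappa approaches 0 the density tends to the uniform distribution on the circle, so the entropy tends to ln(2*pi), the maximum; larger kappa concentrates the mass and lowers the entropy.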


Table of closed-form stable distribution PDFs

Symmetric distributions (β = 0):

  • A PDF expressible in terms of a Lommel function (Garoni & Frankel[2]).
  • A PDF expressible in terms of the Fresnel integrals S(x) and C(x) (Hopcraft et al.[3]).
  • A PDF expressible in terms of a Whittaker function (Uchaikin & Zolotarev[4]).
  • The Cauchy distribution (Garoni & Frankel[2]).
  • A further symmetric PDF (Garoni & Frankel[2]).

Asymmetric distributions (specifically, β = 1):

  • A PDF in which Kv(x) is a modified Bessel function of the second kind (Hopcraft et al.[3]).
  • A PDF given by Zolotarev 1961[5].
  • A PDF given by Kagan et al.[6].

The symmetric distributions for which α = p / q with p > q can be derived from a result in Garoni & Frankel[2]. Symmetric stable densities can also be expressed in terms of Meijer G-functions (Zolotarev).
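One entry of the table is easy to verify numerically: the symmetric stable distribution with α = 1 (and β = 0) is the Cauchy distribution, whose closed-form density is 1 / (π(1 + x²)). The sketch below (using SciPy's general `levy_stable` density, which is not part of the original table) checks this identity at a few points.

```python
import numpy as np
from scipy.stats import levy_stable

# alpha = 1, beta = 0: the stable density reduces to the Cauchy density
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
stable_pdf = levy_stable.pdf(x, 1.0, 0.0)      # general stable density (numerical)
closed_form = 1.0 / (np.pi * (1.0 + x**2))     # closed-form Cauchy density

print(np.max(np.abs(stable_pdf - closed_form)))
```

The other table entries could be checked the same way once their closed forms (Lommel, Fresnel, Whittaker, and Bessel expressions) are restored from the cited references.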

Relationship to Tsallis entropy
