title: |
Towards an Axiomatization for the Generalization of the Kullback-Leibler Divergence to Belief Functions
|
publication: |

part of series: |
Advances in Intelligent Systems Research

pages: |
1090 - 1097

DOI: |
To be assigned soon
author(s): |
Hélène Soubaras
|
publication date: |
July 2011 |
|
keywords: |
Dempster-Shafer theory of belief functions,
channel capacity, Kullback-Leibler divergence
|
abstract: |
In his information theory, Shannon [1] defined a notion
of uncertainty, the entropy, which has been generalized
in several ways to belief functions [2]. He also defined
the channel capacity, for which this paper proposes the
first generalization to belief functions. To do so, we
first need to generalize the Kullback-Leibler (KL)
divergence, for which the present work proposes some
axioms. The list of axioms is not yet exhaustive, since
the proposed solution is not unique. The result is
nevertheless of practical interest: the notion of channel
capacity is useful to characterize and optimize, for
example, systems of sensors, and its generalization to
belief functions allows imprecise sensors, such as
humans, to be included. Finally, we show an example of a
gradient algorithm to compute the generalized channel
capacity.
|
copyright: |
© Atlantis Press. This article is distributed under the
terms of the Creative Commons Attribution License, which permits
non-commercial use, distribution and reproduction in any medium,
provided the original work is properly cited.
|
full text: |
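The full text is not reproduced here. As background for the abstract, the classical (probabilistic) quantities the paper generalizes — the Kullback-Leibler divergence and Shannon's channel capacity — can be sketched as follows. The sketch uses the standard Blahut-Arimoto iteration for capacity; this is an assumption for illustration, not the gradient algorithm proposed in the paper, and the function names are hypothetical.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, with the
    convention 0 * log(0/q) = 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def channel_capacity(W, n_iter=200):
    """Capacity (in bits) of a discrete memoryless channel via the
    classical Blahut-Arimoto iteration.  W[x][y] = P(y | x)."""
    n_in, n_out = len(W), len(W[0])
    p = [1.0 / n_in] * n_in                    # start from the uniform input law
    for _ in range(n_iter):
        # output marginal induced by the current input distribution
        q = [sum(p[x] * W[x][y] for x in range(n_in)) for y in range(n_out)]
        # KL divergence of each channel row from the output marginal
        d = [kl_divergence(W[x], q) for x in range(n_in)]
        # multiplicative update, renormalized
        z = sum(p[x] * 2.0 ** d[x] for x in range(n_in))
        p = [p[x] * 2.0 ** d[x] / z for x in range(n_in)]
    q = [sum(p[x] * W[x][y] for x in range(n_in)) for y in range(n_out)]
    # at the optimum, capacity = mutual information of the optimal input law
    return sum(p[x] * kl_divergence(W[x], q) for x in range(n_in))

# Binary symmetric channel with crossover probability 0.1:
# the known closed form is C = 1 - H(0.1) ≈ 0.531 bits.
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(f"{channel_capacity(bsc):.4f}")  # prints 0.5310
```

The paper's contribution is to replace the probability distributions above by belief functions, which requires first fixing axioms for a KL divergence between belief functions.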