Probabilistic Merging Operators

Martin Adamčík, George Wilmers


This paper presents a general theoretical framework for the study of operators which merge partial probabilistic knowledge from different sources that are individually consistent but may be collectively inconsistent. We consider a number of principles for such an operator to satisfy, including a set of principles derived from those of Konieczny and Pino Pérez [14], which were formulated for the different context of propositional merging. Finally, we investigate two specific merging operators derived from the Kullback–Leibler notion of informational distance: the social entropy operator and its dual, the linear entropy operator. The first of these is strongly related both to the multi-agent normalised geometric mean pooling operator and to the single-agent maximum entropy inference process, ME. By contrast, the linear entropy operator is similarly related both to the arithmetic mean pooling operator and to the limit centre of mass inference process, CM.
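The two pooling operators mentioned in the abstract can be illustrated concretely. The sketch below is an assumption: it shows only the standard textbook forms of arithmetic (linear) pooling and normalised geometric (log-linear) pooling of discrete probability distributions, not the full merging operators developed in the paper.

```python
import numpy as np

def arithmetic_pool(dists, weights=None):
    """Linear (arithmetic mean) opinion pooling: a weighted average
    of the agents' probability distributions."""
    dists = np.asarray(dists, dtype=float)
    n = len(dists)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    return w @ dists  # each row sums to 1, so the result does too

def geometric_pool(dists, weights=None):
    """Normalised geometric mean (log-linear) opinion pooling:
    a weighted geometric mean, renormalised to sum to 1."""
    dists = np.asarray(dists, dtype=float)
    n = len(dists)
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    pooled = np.prod(dists ** w[:, None], axis=0)
    return pooled / pooled.sum()  # renormalise to obtain a distribution

# Example: two agents' distributions over a binary outcome.
agents = [[0.5, 0.5], [0.3, 0.7]]
print(arithmetic_pool(agents))  # average of the two distributions
print(geometric_pool(agents))   # normalised geometric mean
```

Note that geometric pooling assigns probability zero to any outcome that some agent rules out entirely, whereas arithmetic pooling does not; this is one of the behavioural differences that distinguishes the two families of operators.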
