Full Name
Peter Harremoes
Job Title
Assoc. Prof.
Company
Copenhagen Business College
Speaker Bio
My research is centered on information theory. One of my interests is how ideas from information theory can be used to derive results in probability theory. Many of the most important results in probability theory are convergence theorems, and many of these can be reformulated as statements that the entropy of a system increases to a maximum or that a divergence converges to a minimum. These ideas are also relevant in the theory of statistical tests. Recently I have formalized a method for deriving Jeffreys prior as the optimal prior using the minimum description length principle.
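A standard example of such a reformulation, sketched here only for illustration and with notation chosen for this sketch, is the information-theoretic form of the central limit theorem: under suitable regularity conditions the relative entropy between the normalized sum of i.i.d. variables (mean 0, variance 1) and the standard Gaussian tends to zero,
\[
D\!\left(P_{S_n}\,\middle\|\,\mathcal{N}(0,1)\right)\;\longrightarrow\;0
\quad\text{as } n\to\infty,
\qquad S_n=\frac{X_1+\dots+X_n}{\sqrt{n}},
\]
which, since the variance is fixed, is the same as saying that the entropy of S_n increases to the Gaussian maximum-entropy value.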
I am also interested in quantum information theory, and I think that information theory sheds new light on the foundational problems of quantum mechanics. In a sense, the distinction between matter and information about matter disappears on the quantum level. Combining this idea with group representations should be a key to a better understanding of quantum theory.
I have also worked on the relation between Bayesian networks and irreversibility, and my ultimate goal is to build a bridge between these ideas and information theory. I am working on a new theory that uses methods from lattice theory. I think lattices of functional dependence will provide a more transparent framework for describing causation. Hopefully it will lead to better algorithms for detecting causal relationships, but the most important application might be in our description of quantum systems, where we know that our usual notion of causation breaks down. http://www.harremoes.dk/Peter/
Abstract
Inspired by some problems and results in the intersection of probability theory, statistics, and information theory, we will take a new look at the foundations of probability theory. Since the seminal work of Kolmogorov, probability theory has been based on measures with total mass one, i.e. so-called probability measures. In Kolmogorov's theory a probability measure is used to model an experiment with a single outcome that belongs to exactly one of several disjoint sets. We will present a different basic model in which an experiment results in a multiset, i.e. for each of the disjoint sets we get the number of observations in the set. This new framework is formally equivalent to Kolmogorov's theory, but the focus is on expected values rather than probabilities. We present examples from goodness-of-fit testing, information theory and quantum information theory, where the shifted focus gives new insight or better performance.
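As a minimal sketch of the contrast, with notation chosen here purely for illustration and not taken from the talk: for a finite partition A_1, ..., A_k of the sample space, Kolmogorov's model specifies probabilities, while the multiset model records counts and works with their expectations,
\[
\text{Kolmogorov:}\quad P(A_i)\ge 0,\qquad \sum_{i=1}^{k} P(A_i)=1,
\]
\[
\text{Multiset model:}\quad \text{observe } \big(N(A_1),\dots,N(A_k)\big)\in\mathbb{N}_0^{\,k},
\qquad \lambda_i=\mathbb{E}\!\left[N(A_i)\right]\ge 0.
\]
If exactly one observation is made, then \(\sum_i N(A_i)=1\) and \(\lambda_i=P(A_i)\), recovering the usual picture; in general the expected counts \(\lambda_i\) need not sum to one.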