Information about Information
01 January 2001
In a situation where one can obtain probabilistic information about an event, it may also be possible to learn something about the reliability of that information. This "information about information" may not improve one's guessing position in a single instance, but it helps when there are multiple independent trials based on one event. We offer a means of measuring the amount of this elusive type of information and gauging its effect. We analyze here only the simplest possible circumstance, where the event in question is the result of a fair coin flip. By fixing our measure of information about information and minimizing error when there are multiple observations, we create a family of channels, each with at most a three-point range, stretching from a binary symmetric channel to a binary erasure channel.
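As a minimal sketch of the setting described above (not the paper's construction), the following models a three-output channel that flips the bit with one probability and erases it with another; setting the erasure probability to zero gives a binary symmetric channel, and setting the flip probability to zero gives a binary erasure channel. The function names and the majority-vote guessing rule are illustrative assumptions.

```python
import random

def observe(bit, flip_p, erase_p):
    """One noisy observation of `bit` through a three-point channel:
    erased with probability erase_p, flipped with probability flip_p,
    otherwise delivered correctly. (flip_p + erase_p must be <= 1.)"""
    r = random.random()
    if r < erase_p:
        return None            # erasure symbol
    if r < erase_p + flip_p:
        return 1 - bit         # bit flip
    return bit

def majority_guess(bit, n, flip_p, erase_p):
    """Guess `bit` from n independent observations of the same coin flip,
    ignoring erasures and breaking ties with a fair coin."""
    votes = [o for o in (observe(bit, flip_p, erase_p) for _ in range(n))
             if o is not None]
    ones = sum(votes)
    if 2 * ones == len(votes):
        return random.randint(0, 1)  # tie (or all erased): guess at random
    return int(2 * ones > len(votes))
```

For example, `observe(b, p, 0.0)` behaves as a binary symmetric channel with crossover probability `p`, while `observe(b, 0.0, q)` behaves as a binary erasure channel with erasure probability `q`; intermediate settings interpolate between the two extremes.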