Distinguishing Stable Probability Measures - Part II: Continuous Time
01 October 1976
In this paper, the work begun in Part I [1] on discrete-time hypothesis testing of stable probability measures is extended to continuous time. In contrast to the earlier work, analytic closed-form expressions are found both for the log likelihood functional and for Chernoff-type upper and lower bounds on various error probabilities for the log likelihood test. As in Part I, the singular role played by the gaussian probability measure within the family of stable probability measures is emphasized, both in the form of the log likelihood functional and in the expressions for the Chernoff-type bounds on error probabilities. The earlier work dealt with observing N samples from a stable process, with one of two sets of parameters, at time instants Δt apart; here, we fix the observation interval at duration T and allow the number of observations to become infinite while the spacing between samples shrinks to zero (N → ∞ and Δt → 0, such that N · Δt = T). Section II briefly reviews some properties of independent-increment processes.
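The limiting scheme just described (N observations on a fixed interval T, with spacing Δt = T/N shrinking to zero) can be illustrated for the one stable member whose density has an elementary closed form, the gaussian (α = 2) case. The sketch below is not the paper's likelihood functional; it simply evaluates the log likelihood ratio for N independent gaussian increments under two assumed scale parameters σ0 and σ1 and shows how the statistic behaves as N grows with N · Δt = T held fixed. All parameter values are illustrative assumptions.

```python
import numpy as np

def gaussian_increment_llr(x, dt, sigma0, sigma1):
    """Log likelihood ratio log[p1(x)/p0(x)] for independent increments
    x[i] ~ N(0, sigma_j**2 * dt) under hypothesis H_j (j = 0, 1)."""
    v0, v1 = sigma0 ** 2 * dt, sigma1 ** 2 * dt
    # Sum over increments of the per-sample gaussian log density ratio.
    return np.sum(0.5 * np.log(v0 / v1) + 0.5 * x ** 2 * (1.0 / v0 - 1.0 / v1))

# Illustrative (assumed) parameter values, not taken from the paper.
T = 1.0                    # fixed observation interval
sigma0, sigma1 = 1.0, 1.5  # scale parameters under H0 and H1
rng = np.random.default_rng(seed=0)

for N in (10, 100, 1000, 10000):  # N -> infinity, dt -> 0, N * dt = T
    dt = T / N
    # Data generated under H1: increments of a scaled Brownian motion on [0, T].
    x = rng.normal(0.0, sigma1 * np.sqrt(dt), size=N)
    print(f"N = {N:6d}, dt = {dt:.1e}, "
          f"log likelihood ratio = {gaussian_increment_llr(x, dt, sigma0, sigma1):9.2f}")
```

In this gaussian illustration, with σ0 ≠ σ1 the printed statistic grows roughly linearly in N, reflecting the fact that gaussian measures on path space with different diffusion coefficients become perfectly distinguishable as the sampling becomes dense; the continuous-time treatment of the general stable case is the subject of the paper itself.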