KL-Divergence Kernel Regression for Non-Gaussian Fingerprint Based Localization

21 September 2011

Various methods have been developed for indoor localization using WLAN signals. Algorithms that fingerprint the Received Signal Strength Indication (RSSI) of WiFi at different locations can achieve tracking accuracies on the order of a few meters. RSSI fingerprinting, however, suffers from two main limitations: first, as the signal environment changes, so does the fingerprint database, which then needs recalibration; second, it has been reported that, in practice, certain devices record more complex (e.g., bimodal) distributions of WiFi signals, precluding algorithms based on the mean RSSI alone. In this article we propose a simple methodology that takes the full signal distribution into account, computing similarities to fingerprints with the Kullback-Leibler (KL) divergence and performing localization through kernel regression. Our method provides a natural way of smoothing over time and over trajectories. Moreover, we propose an unsupervised, KL-divergence-based recalibration of the training fingerprints. Finally, we apply our method to histograms of WiFi connections with access points, ignoring RSSI distributions and thus removing the need for recalibration. We demonstrate that our results outperform nearest-neighbor, Kalman filter, and particle filter approaches, achieving 1 m accuracy in office environments, and we show that our method generalizes to non-Gaussian RSSI distributions.
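To make the core idea concrete, the following is a minimal sketch of KL-divergence kernel regression over RSSI histograms. It is not the authors' implementation: the function names, the choice of a plain (non-symmetrized) KL divergence averaged over shared access points, and the exponential kernel with bandwidth `sigma` are illustrative assumptions.

```python
import numpy as np


def kl_divergence(p, q, eps=1e-12):
    """KL divergence between two discrete RSSI histograms (normalized internally).

    A small epsilon avoids division by zero for empty bins; this smoothing
    choice is an assumption, not taken from the paper.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))


def kernel_regression_location(obs_hists, fingerprints, sigma=1.0):
    """Estimate position as a kernel-weighted average of fingerprint locations.

    obs_hists    : dict mapping access-point id -> RSSI histogram for the
                   current observation.
    fingerprints : list of (location, hists_by_ap) pairs, where location is an
                   (x, y) array and hists_by_ap maps access-point id -> histogram.
    sigma        : kernel bandwidth (hypothetical parameter for this sketch).
    """
    weights, locations = [], []
    for loc, hists_by_ap in fingerprints:
        shared = set(obs_hists) & set(hists_by_ap)
        if not shared:
            continue
        # Average KL divergence over access points seen in both observation
        # and fingerprint; turning it into a kernel weight via exp(-d / sigma).
        d = sum(kl_divergence(obs_hists[ap], hists_by_ap[ap]) for ap in shared) / len(shared)
        weights.append(np.exp(-d / sigma))
        locations.append(np.asarray(loc, dtype=float))
    return np.average(np.stack(locations), axis=0, weights=np.array(weights))
```

Because the estimate is a smooth weighted average of fingerprint locations rather than a hard nearest-neighbor pick, successive estimates along a trajectory vary gradually, which is one way to read the abstract's claim about natural smoothing over time and trajectories.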