Automated iBeacon-Based Community Detection: Data-Driven Approach to Recover Face-to-Face Interaction from Noisy & Incomplete Sensor Data

01 January 2018


In this paper, we address the detection of face-to-face (f2f, i.e., physical) interactions between people from noisy sensor data, and the determination of the groups and communities that individuals form in the physical world. The sensor data may include, for example, received signal strength indicator (RSSI) information based on iBeacon, Wi-Fi, Zigbee or similar technologies. This is a challenging problem since this type of sensor data is very noisy, is often incomplete with many missing values, and is easily perturbed by the mobility of people and by nearby obstacles (for example, people and/or objects between or around receivers and transmitters). These effects are especially pronounced in highly dynamic indoor environments. Furthermore, interaction detection must be updated in real time and adapt quickly to the indoor mobility of people. The key idea of the paper is to transform the original noisy sensor data into specific feature vectors that enable measurement of proximity between the sensors. To obtain reliable features that faithfully reflect interaction between people, the feature vectors are generated through a series of statistical methods, such as time series smoothing, change point detection, and exploratory data analysis (EDA). Clustering similar feature vectors correlates people who exhibit similar signal patterns, based on their co-varying information, without relying on localization information. This purely data-driven approach therefore provides community detection that is robust to noise and missing or dropped signals, while automatically capturing the dynamic interactions between people.
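The pipeline described above — smoothing noisy RSSI time series, deriving statistical feature vectors, and clustering people whose vectors co-vary — can be sketched roughly as follows. This is a minimal illustration, not the paper's actual method: all function names, the choice of features (mean, spread, median, a crude change-point count), and the single-linkage distance threshold are illustrative assumptions.

```python
import numpy as np


def smooth(rssi, window=5):
    """Moving-average smoothing; NaNs (missing readings) are skipped."""
    rssi = np.asarray(rssi, dtype=float)
    out = np.empty_like(rssi)
    for i in range(len(rssi)):
        seg = rssi[max(0, i - window + 1): i + 1]
        seg = seg[~np.isnan(seg)]
        out[i] = seg.mean() if len(seg) else np.nan
    return out


def change_point_count(series, jump=5.0):
    """Crude change-point proxy: number of large jumps in the series."""
    return int((np.abs(np.diff(series)) > jump).sum())


def feature_vector(rssi):
    """Statistical features over the smoothed, gap-tolerant series."""
    s = smooth(rssi)
    s = s[~np.isnan(s)]
    return np.array([s.mean(), s.std(), np.median(s),
                     change_point_count(s)])


def detect_groups(series_by_person, threshold=3.0):
    """Group people whose feature vectors lie within `threshold`
    (single-linkage via union-find); no localization info is used."""
    names = list(series_by_person)
    feats = [feature_vector(series_by_person[n]) for n in names]
    parent = list(range(len(names)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if np.linalg.norm(feats[i] - feats[j]) < threshold:
                parent[find(i)] = find(j)

    groups = {}
    for i, name in enumerate(names):
        groups.setdefault(find(i), []).append(name)
    return list(groups.values())


# Toy data: two people near the same beacon (similar RSSI traces,
# with dropped readings as NaN) versus one person farther away.
data = {
    "alice": [-60, -61, np.nan, -59, -60, -62],
    "bob":   [-61, -60, -60, np.nan, -61, -60],
    "carol": [-85, -84, -86, -85, np.nan, -84],
}
groups = detect_groups(data)
print(groups)
```

In this toy run, alice and bob end up in one group and carol in another, despite the missing values, because the features are computed only over the readings that are present. A real system would use richer features (e.g., per-beacon co-variation across people) and an adaptive threshold rather than a fixed one.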