
    Are You in the Line? RSSI-based Queue Detection in Crowds

    Crowd behaviour analytics focuses on the behavioural characteristics of groups of people rather than individuals' activities. This work considers human queuing, a specific group behaviour in crowds. We design a plug-and-play system solution to the queue detection problem based on Wi-Fi/Bluetooth Low Energy (BLE) received signal strength indicators (RSSIs) captured by multiple signal sniffers. The goal of this work is to determine whether a device is in the queue based only on RSSIs. The key idea is to extract features not only from an individual device's data but also from the mobility similarity between data from multiple devices and the mobility correlation observed by multiple sniffers. Thus, we propose single-device feature extraction, cross-device feature extraction, and cross-sniffer feature extraction for model training and classification. We systematically conduct experiments with simulated queue movements to study the detection accuracy. Finally, we compare our signal-based approach against a camera-based face detection approach in a real-world social event with a real human queue. The experimental results indicate that our approach achieves a minimum accuracy of 77% and significantly outperforms camera-based face detection, because people block each other's visibility, whereas wireless signals can be detected without such occlusion.
    Comment: This work has been partially funded by the European Union's Horizon 2020 research and innovation programme within the project "Worldwide Interoperability for SEmantics IoT" under grant agreement Number 72315
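The single-device and cross-device feature families the abstract names can be sketched concretely. Below is a minimal illustration, assuming each device yields an RSSI time series per sniffer; the function names and the specific statistics (mean, standard deviation, linear trend, pairwise correlation) are illustrative choices, not the paper's exact feature set.

```python
import numpy as np

def single_device_features(rssi):
    """Summary statistics of one device's RSSI time series at one sniffer."""
    rssi = np.asarray(rssi, dtype=float)
    t = np.arange(len(rssi))
    return {
        "mean": rssi.mean(),
        "std": rssi.std(),
        # Slope of a linear fit: a queueing device slowly approaches the
        # sniffer at the queue head, giving a gentle positive trend.
        "trend": np.polyfit(t, rssi, 1)[0],
    }

def cross_device_similarity(rssi_a, rssi_b):
    """Mobility similarity between two devices: correlation of their RSSI
    traces at the same sniffer. Devices moving through the same queue
    show correlated signal changes."""
    a = np.asarray(rssi_a, dtype=float)
    b = np.asarray(rssi_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])
```

A device standing in a queue would typically show a slow, step-wise RSSI trend toward the head-of-queue sniffer and strongly correlated traces with its queue neighbours, which is what such features try to capture.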

    Differentially Private Empirical Risk Minimization

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006) to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and we provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters of general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that, both theoretically and empirically, objective perturbation is superior to the previous state of the art, output perturbation, in managing the inherent tradeoff between privacy and learning performance.
    Comment: 40 pages, 7 figures, accepted to the Journal of Machine Learning Research
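Objective perturbation for regularized logistic regression can be sketched as follows. This is a rough illustration, assuming feature vectors with norm at most 1, labels in {-1, +1}, and the logistic loss (curvature bound c = 1/4); the noise vector with Gamma-distributed norm and uniform direction follows the general recipe of this line of work, but the sketch is not the authors' reference implementation.

```python
import numpy as np

def objective_perturbation_logreg(X, y, lam, eps, rng=None):
    """Sketch of objective perturbation for L2-regularized logistic
    regression. Assumes ||x_i|| <= 1 and y_i in {-1, +1}."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    c = 0.25  # curvature bound of the logistic loss
    # Privacy budget adjustment accounting for the loss curvature.
    eps_p = eps - 2.0 * np.log(1.0 + c / (n * lam))
    if eps_p <= 0:
        raise ValueError("regularization too weak for this epsilon")
    # Sample perturbation b: uniform direction, Gamma(d, 2/eps_p) norm.
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)
    b = direction * rng.gamma(shape=d, scale=2.0 / eps_p)
    # Minimize (1/n) sum log(1+exp(-y_i w.x_i)) + (lam/2)||w||^2 + b.w/n
    # by plain gradient descent.
    w = np.zeros(d)
    for _ in range(2000):
        z = y * (X @ w)
        grad = -(X.T @ (y / (1.0 + np.exp(z)))) / n + lam * w + b / n
        w -= 0.5 * grad
    return w
```

The only difference from non-private training is the extra linear term b·w/n added before optimization, which is what makes the *minimizer itself* noisy rather than noising the output afterwards.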

    Privately Connecting Mobility to Infectious Diseases via Applied Cryptography

    Human mobility is undisputedly one of the critical factors in infectious disease dynamics. Until a few years ago, researchers had to rely on static data to model human mobility, which was then combined with a transmission model of a particular disease, resulting in an epidemiological model. Recent works have consistently shown that substituting the static mobility data with mobile phone data leads to significantly more accurate models. While prior studies have relied exclusively on aggregated data from all of a mobile network operator's subscribers, it may be preferable to consider aggregated mobility data of infected individuals only. Clearly, naively linking mobile phone data with infected individuals would massively intrude on privacy. This research aims to develop a solution that reports the aggregated mobile phone location data of infected individuals while still maintaining compliance with privacy expectations. To achieve privacy, we use homomorphic encryption, zero-knowledge proof techniques, and differential privacy. Our protocol's open-source implementation can process eight million subscribers in one and a half hours. Additionally, we provide a legal analysis of our solution with regard to the EU General Data Protection Regulation.
    Comment: Added differential privacy experiments and new benchmark
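The additively homomorphic building block can be illustrated with the Paillier cryptosystem, in which multiplying ciphertexts yields an encryption of the sum of the plaintexts, so per-region counts of infected subscribers can be aggregated without decrypting any individual contribution. The abstract does not name the concrete scheme; Paillier here is a stand-in, and the tiny fixed primes make this a toy, completely insecure illustration.

```python
import random
from math import gcd

def paillier_keygen(p=2003, q=2011):
    """Toy Paillier keypair from fixed small primes (insecure, demo only)."""
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^{-1} mod n, with L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)
    return (n, g), (lam, mu)

def paillier_encrypt(pk, m):
    n, g = pk
    while True:
        r = random.randrange(1, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def paillier_decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return ((pow(c, lam, n * n) - 1) // n * mu) % n

# Additive homomorphism: the product of ciphertexts decrypts to the
# sum of the plaintexts, here toy per-antenna counts of infected users.
pk, sk = paillier_keygen()
counts = [3, 5, 7]
agg = 1
for m in counts:
    agg = (agg * paillier_encrypt(pk, m)) % (pk[0] ** 2)
# paillier_decrypt(pk, sk, agg) recovers 3 + 5 + 7 = 15
```

In a protocol of this shape, each party (or the operator, per region) contributes an encrypted count, only the product of ciphertexts is ever exchanged, and a single key holder decrypts the aggregate; zero-knowledge proofs and differential privacy would then constrain what that aggregate can reveal.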