Bayesian Active Malware Analysis
We propose a novel technique for Active Malware Analysis (AMA), formalized as a Bayesian game between an analyzer agent and a malware agent, focusing on the decision-making strategy of the analyzer. In our model, the analyzer performs an action on the system to trigger the malware into showing malicious behavior, i.e., activating its payload. The formalization builds on the link between malware families and the notion of types in Bayesian games. A key point is the design of the utility function, which reflects the amount of uncertainty about the type of the adversary after the execution of an analyzer action. This allows us to devise an algorithm that plays the game so as to myopically minimize the entropy of the analyzer's belief at every stage. Empirical evaluation indicates that our approach yields a significant improvement in both learning speed and classification score when compared with other state-of-the-art AMA techniques.
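The myopic entropy-minimization loop described in this abstract lends itself to a compact sketch. The following is a minimal illustration in Python, assuming the analyzer knows (or has estimated) observation likelihoods P(obs | type, action) for each probing action; the function names and the likelihood model are ours, not the authors' implementation.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a belief vector; 0*log(0) taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def expected_posterior_entropy(belief, likelihood):
    """Expected entropy of the analyzer's belief after one action.

    belief[t]        = current probability that the malware has type t
    likelihood[t, o] = P(observation o | type t, this action)  (assumed known)
    """
    p_obs = belief @ likelihood          # marginal P(o) under this action
    exp_h = 0.0
    for o, po in enumerate(p_obs):
        if po > 0:
            posterior = belief * likelihood[:, o] / po   # Bayes' rule
            exp_h += po * entropy(posterior)
    return exp_h

def choose_action(belief, likelihoods):
    """Myopic rule: pick the action minimizing expected belief entropy."""
    return int(np.argmin([expected_posterior_entropy(belief, lik)
                          for lik in likelihoods]))

def update_belief(belief, likelihood, observation):
    """Update the belief over malware types after observing the response."""
    posterior = belief * likelihood[:, observation]
    return posterior / posterior.sum()
```

At each stage the analyzer calls choose_action, executes the chosen probe, and folds the observed response back in with update_belief, so the belief over malware families sharpens greedily step by step.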
User-Defined Privacy Location-Sharing System in Mobile Online Social Networks
With the fusion of social networks and location-based services, location sharing is one of the most important services in mobile online social networks (mOSNs). In location-sharing services, users have to provide their location information to the service provider. However, location information is sensitive, which raises a privacy-preserving issue that needs to be solved. Existing research on location-sharing services, such as friend queries, does not consider attacks from friends. In fact, a user may not trust all of his/her friends, so only a subset of them should be allowed to obtain the user's location information. In addition, both users' location privacy and their social network privacy should be guaranteed. To solve these problems, we propose a new architecture and a new scheme, called the User-Defined Privacy Location-Sharing (UDPLS) system, for mOSNs. In our scheme, the query time is almost independent of the number of friends. We also evaluate the performance and validate the correctness of our proposed algorithm through extensive simulations.
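The abstract does not detail the UDPLS construction, so the following is only a toy illustration of one property it claims: when a user-defined whitelist is stored as a hash set, an authorization check is an O(1) average-case membership test, making query time essentially independent of how many friends a user has. All class and method names here are hypothetical.

```python
class LocationSharingService:
    """Toy model of user-defined location sharing: each user
    whitelists the friends allowed to read their location."""

    def __init__(self):
        self._whitelist = {}  # user -> set of authorized friends
        self._location = {}   # user -> last reported location

    def set_policy(self, user, allowed_friends):
        # User-defined privacy: only an explicit subset of friends.
        self._whitelist[user] = set(allowed_friends)

    def report_location(self, user, location):
        self._location[user] = location

    def query(self, requester, target):
        # O(1) average-case set membership, so lookup time does not
        # grow with the size of the target's friend list.
        if requester in self._whitelist.get(target, ()):
            return self._location.get(target)
        return None  # not authorized
```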
Big privacy: challenges and opportunities of privacy study in the age of big data
One of the biggest concerns about big data is privacy, yet the study of big data privacy is still at a very early stage. We believe that forthcoming solutions and theories for big data privacy will be rooted in the existing research output of the privacy discipline. Motivated by this, we extensively survey the existing research outputs and achievements of the privacy field from both application and theoretical angles, aiming to pave a solid starting ground for interested readers to address the challenges of the big data case. We first present an overview of the battleground by defining the roles and operations of privacy systems. Second, we review the milestones of the two current major research categories of privacy: data clustering and privacy frameworks. Third, we discuss the efforts of privacy study from the perspectives of different disciplines. Fourth, we present the mathematical description, measurement, and modeling of privacy. We conclude by summarizing the challenges and opportunities of this promising topic, hoping to shed light on this exciting and largely uncharted territory.
Efficient location privacy algorithm for Internet of Things (IoT) services and applications
Location-Based Services (LBS) have become a very important research area with the rapid development of Internet of Things (IoT) technology and the ubiquitous use of smartphones and social networks in our daily lives. Although users can enjoy great flexibility and convenience from LBS with IoT, they may also lose their privacy: untrusted or malicious LBS servers holding all users' information can track users in various ways or release personal data to third parties. In this work, we first analyze the current dummy-location selection (DLS) algorithm, an efficient location privacy preservation approach, and design an attack algorithm against DLS (ADLS) to test emerging IoT security. To efficiently preserve a user's location privacy, we propose a novel dummy location privacy-preserving (DLP) algorithm that considers both computational costs and the varying privacy requirements of different users. Extensive simulation experiments have been carried out to evaluate the efficiency of the proposed schemes. Evaluation results show that the ADLS algorithm has a high probability of identifying the user's real location among the dummy locations chosen by the DLS algorithm, and that our proposed DLP algorithm has clear advantages over DLS in terms of a lower probability of revealing the user's real location and improved computational cost and efficiency (i.e., time, speed, accuracy, and complexity), while preserving the same privacy level as the DLS algorithm.
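As a rough sketch of the entropy-based idea behind dummy-location selection (not the paper's DLP algorithm), the snippet below picks candidate cells whose historical query probabilities are close to the real cell's and then searches for a k-cell set with near-maximal entropy, so the real location is hard to single out; the candidate pool size and search budget are arbitrary choices of ours.

```python
import math
import random

def cell_entropy(probs):
    """Entropy of the normalized query probabilities of a candidate set;
    maximal when all cells in the set look equally popular."""
    total = sum(probs)
    norm = [p / total for p in probs if p > 0]
    return -sum(p * math.log2(p) for p in norm)

def select_dummies(real_cell, query_prob, k, pool_size=200, trials=1000):
    """Pick k-1 dummy cells so the k-anonymous set has near-uniform
    query probability, hiding the real cell among the dummies.

    query_prob: dict mapping cell -> historical query probability.
    """
    # Keep cells whose popularity is close to the real cell's, so the
    # dummies are not trivially filtered out by an adversary.
    target = query_prob[real_cell]
    pool = sorted(query_prob, key=lambda c: abs(query_prob[c] - target))
    pool = [c for c in pool if c != real_cell][:pool_size]

    best_set, best_h = None, -1.0
    for _ in range(trials):  # random search over candidate subsets
        dummies = random.sample(pool, k - 1)
        h = cell_entropy([query_prob[c] for c in dummies + [real_cell]])
        if h > best_h:
            best_set, best_h = dummies, h
    return best_set + [real_cell]
```

An attack in the spirit of ADLS would run the selection logic in reverse, scoring each cell of a submitted k-set by how plausibly it could have been chosen as a dummy.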
Information-Theoretic Limits on Statistical Matching with Applications to Privacy
Modern applications significantly enhance the user experience by adapting to each user's individual condition and/or preferences. While this adaptation can greatly improve a user's experience or be essential for the application to work, the exposure of user data to the application presents a significant privacy threat to users, even when the traces are anonymized, since the statistical matching of an anonymized trace to prior user behavior can identify a user and their habits. Because of the current and growing algorithmic and computational capabilities of adversaries, provable privacy guarantees as a function of the degree of anonymization and obfuscation of the traces are necessary. This dissertation focuses on deriving theoretical bounds on the privacy of users in such a scenario. We derive the fundamental limits of user privacy when both anonymization- and obfuscation-based protection mechanisms are applied to users' time series of data, and we investigate the impact of such mechanisms on the trade-off between privacy protection and user utility. In the first part, we obtain the requirements on anonymization and obfuscation in the case that data traces are independent between users. However, in many applications the data traces of different users are dependent, and an adversary can potentially exploit such dependencies. In the next part, we therefore consider the impact of dependency between user traces on their privacy. We demonstrate that the adversary can readily identify the association graph of the obfuscated and anonymized data, revealing which user traces are dependent, and can then use this graph to break user privacy with significantly shorter traces than in the case of independent users. As a result, we show that inter-user dependency degrades user privacy, and that obfuscating data traces independently across users is often insufficient to remedy such leakage. We therefore discuss how users can improve privacy by employing joint obfuscation that removes the data dependency. Finally, we discuss how a remapping technique can improve user utility, and how much information remapping leaks to the adversary when the adversary does not have full prior information.
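To make the statistical-matching threat concrete, here is a toy adversary under an assumed i.i.d. model of user traces: it matches anonymized traces to prior user profiles by maximizing total log-likelihood over all assignments (brute force over permutations, so only feasible for a handful of users). This is purely illustrative and not the dissertation's analysis.

```python
import itertools
import numpy as np

def log_likelihood(trace, profile):
    """Log-probability of a trace under a user's prior profile,
    where profile[s] = P(symbol s) and symbols are drawn i.i.d."""
    return float(np.sum(np.log(profile[trace])))

def match_traces(traces, profiles):
    """Statistical matching: find the assignment of anonymized traces
    to users that maximizes the total log-likelihood."""
    n = len(profiles)
    scores = np.array([[log_likelihood(t, p) for p in profiles]
                       for t in traces])
    best_perm, best = None, -np.inf
    for perm in itertools.permutations(range(n)):   # brute force
        total = scores[np.arange(n), list(perm)].sum()
        if total > best:
            best_perm, best = perm, total
    return best_perm    # best_perm[i] = user index matched to trace i

# Example: two users with skewed symbol distributions over {0, 1, 2}.
profiles = [np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.2, 0.7])]
rng = np.random.default_rng(0)
traces = [rng.choice(3, size=50, p=p) for p in profiles[::-1]]  # shuffled
print(match_traces(traces, profiles))   # recovers the hidden order: (1, 0)
```

Even with anonymized labels, 50 samples per trace suffice here to re-identify both users, which is precisely why the dissertation studies obfuscation and trace length as parameters of the privacy guarantee.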