
    Quality of Information in Mobile Crowdsensing: Survey and Research Challenges

    Smartphones have become the most pervasive devices in people's lives, and are clearly transforming the way we live and perceive technology. Today's smartphones benefit from almost ubiquitous Internet connectivity and come equipped with a plethora of inexpensive yet powerful embedded sensors, such as the accelerometer, gyroscope, microphone, and camera. This unique combination has enabled revolutionary applications based on the mobile crowdsensing paradigm, such as real-time road traffic monitoring, air and noise pollution monitoring, crime control, and wildlife monitoring, just to name a few. Unlike prior sensing paradigms, humans are now the primary actors of the sensing process, since they are fundamental in retrieving reliable and up-to-date information about the event being monitored. As humans may behave unreliably or maliciously, assessing and guaranteeing Quality of Information (QoI) becomes more important than ever. In this paper, we provide a new framework for defining and enforcing QoI in mobile crowdsensing, and analyze in depth the current state of the art on the topic. We also outline novel research challenges, along with possible directions for future work. To appear in ACM Transactions on Sensor Networks (TOSN).

    Data-Driven Models, Techniques, and Design Principles for Combatting Healthcare Fraud

    In the U.S., approximately $700 billion of the $2.7 trillion spent on healthcare is linked to fraud, waste, and abuse. This presents a significant challenge for healthcare payers as they navigate fraudulent activities from dishonest practitioners, sophisticated criminal networks, and even well-intentioned providers who inadvertently submit incorrect billing for legitimate services. This thesis adopts Hevner's research methodology to guide the creation, assessment, and refinement of a healthcare fraud detection framework and recommended design principles for fraud detection. The thesis provides the following significant contributions to the field:
    1. A formal literature review of the field of fraud detection in Medicaid. Chapters 3 and 4 provide formal reviews of the available literature on healthcare fraud. Chapter 3 focuses on defining the types of fraud found in healthcare. Chapter 4 reviews fraud detection techniques in the literature across healthcare and other industries. Chapter 5 focuses on literature covering fraud detection methodologies utilized explicitly in healthcare.
    2. A multidimensional data model and analysis techniques for fraud detection in healthcare. Chapter 5 applies Hevner et al. to help develop a framework for fraud detection in Medicaid that provides specific data models and techniques to identify the most prevalent fraud schemes. A multidimensional schema based on Medicaid data and a set of multidimensional models and techniques to detect fraud are presented. These artifacts are evaluated through functional testing against known fraud schemes. This chapter contributes a set of multidimensional data models and analysis techniques that can be used to detect the most prevalent known fraud types.
    3. A framework for deploying outlier-based fraud detection methods in healthcare. Chapter 6 proposes and evaluates methods for applying outlier detection to healthcare fraud based on literature review, comparative research, direct application on healthcare claims data, and known fraudulent cases. A method for outlier-based fraud detection is presented and evaluated using Medicaid dental claims, providers, and patients (a minimal sketch of this kind of screening follows the abstract).
    4. Design principles for fraud detection in complex systems. Based on literature and applied research in Medicaid healthcare fraud detection, Chapter 7 offers generalized design principles for fraud detection in similar complex, multi-stakeholder systems.
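    Contribution 3 above centres on outlier-based screening of Medicaid claims. The thesis's concrete models are not reproduced here; the following is a minimal, hypothetical Python sketch of one common way such screening can be set up, a robust (median/MAD-based) z-score over peer claims for the same procedure, with invented column names (provider_id, procedure_code, paid_amount).

    import pandas as pd

    def flag_outlier_claims(claims: pd.DataFrame, z_cutoff: float = 3.0) -> pd.DataFrame:
        """Flag claims whose paid amount deviates strongly from peer claims for the
        same procedure code, using a robust (median/MAD-based) z-score."""
        def robust_z(x: pd.Series) -> pd.Series:
            med = x.median()
            mad = (x - med).abs().median()
            if mad == 0:
                return pd.Series(0.0, index=x.index)
            return 0.6745 * (x - med) / mad  # 0.6745 scales the MAD to be comparable to a std dev

        claims = claims.copy()
        claims["robust_z"] = claims.groupby("procedure_code")["paid_amount"].transform(robust_z)
        claims["flagged"] = claims["robust_z"].abs() > z_cutoff
        return claims

    # Toy usage: the last claim is priced far above its peers and gets flagged.
    claims = pd.DataFrame({
        "provider_id":    ["A", "A", "B", "B", "C"],
        "procedure_code": ["D0120"] * 5,
        "paid_amount":    [45.0, 50.0, 48.0, 47.0, 400.0],
    })
    print(flag_outlier_claims(claims)[["provider_id", "paid_amount", "robust_z", "flagged"]])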

    Enabling Trustworthy Service Evaluation in Service-Oriented Mobile Social Network

    We propose a Trustworthy Service Evaluation (TSE) system to enable users to share service reviews in service-oriented mobile social networks (S-MSNs). Each service provider independently maintains a TSE for itself, which collects and stores users' reviews about its services without requiring any trusted third authority. The service reviews can then be made available to interested users to help them make wise service selection decisions. We identify three unique service review attacks, i.e., linkability, rejection, and modification attacks, and develop sophisticated security mechanisms for the TSE to deal with these attacks. Specifically, the basic TSE (bTSE) enables users to distributedly and cooperatively submit their reviews in an integrated chain form by using hierarchical and aggregate signature techniques. It restricts the service providers from rejecting, modifying, or deleting the reviews, thus improving the integrity and authenticity of reviews. Further, we extend the bTSE to a Sybil-resisted TSE (SrTSE) to enable the detection of two typical Sybil attacks. In the SrTSE, if a user generates multiple reviews toward a vendor in a predefined time slot with different pseudonyms, the real identity of that user is revealed. Through security analysis and numerical results, we show that the bTSE and the SrTSE effectively resist the service review attacks, and that the SrTSE additionally detects the Sybil attacks in an efficient manner. Through performance evaluation, we show that the bTSE achieves better performance, in terms of submission rate and delay, than a service review system that does not adopt user cooperation.
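    The bTSE's integrity guarantee comes from linking cooperative reviews into an integrated chain with hierarchical and aggregate signatures. As a rough illustration of why chaining makes silent deletion or modification detectable, here is a toy Python sketch that replaces those signature techniques with a plain hash chain; it is not the paper's construction, and all names are invented.

    import hashlib
    from dataclasses import dataclass

    @dataclass
    class Review:
        pseudonym: str
        content: str
        prev_digest: str   # digest of the previous review in the chain
        digest: str        # digest covering this review and the previous digest

    def append_review(chain: list[Review], pseudonym: str, content: str) -> list[Review]:
        prev = chain[-1].digest if chain else "GENESIS"
        digest = hashlib.sha256(f"{prev}|{pseudonym}|{content}".encode()).hexdigest()
        return chain + [Review(pseudonym, content, prev, digest)]

    def verify_chain(chain: list[Review]) -> bool:
        prev = "GENESIS"
        for r in chain:
            expected = hashlib.sha256(f"{prev}|{r.pseudonym}|{r.content}".encode()).hexdigest()
            if r.prev_digest != prev or r.digest != expected:
                return False   # a deleted or modified review breaks the chain here
            prev = r.digest
        return True

    # Example: tampering with one review invalidates every later link.
    chain = []
    chain = append_review(chain, "alias-1", "Great service")
    chain = append_review(chain, "alias-2", "Slow response")
    print(verify_chain(chain))             # True
    chain[0].content = "Terrible service"  # provider tries to rewrite a review
    print(verify_chain(chain))             # False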

    Evaluating Leniency with Missing Information on Undetected Cartels: Exploring Time-Varying Policy Impacts on Cartel Duration

    This paper examines the effects of the European Commission's (EC) new leniency program on the EC's capabilities in detecting and deterring cartels. As a supplementary analysis, the US leniency program is also studied. I discuss a dynamic model of cartel formation and dissolution to illustrate how changes in antitrust policies and economic conditions might affect cartel duration. Comparative statics results are then corroborated with empirical estimates of hazard functions adjusted to account for both the heterogeneity of cartels and the time-varying policy impacts suggested by theory. Contrary to earlier studies, my statistical tests are consistent with the theoretical predictions that, following an efficacious leniency program, the average duration of discovered cartels rises in the short run and falls in the long run. The results shed light on the design of enforcement programs against cartels and other forms of conspiracy.
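    The empirical part of the paper rests on hazard (duration) models whose covariates, such as the leniency regime, vary over a cartel's lifetime. The paper's specification and data are not reproduced here; as a purely hypothetical sketch of how time-varying policy exposure can enter a duration analysis, the Python snippet below computes a piecewise-constant hazard (dissolutions per year at risk) separately for cartel-time spent before and after a leniency reform, using invented data.

    import pandas as pd

    # One row per cartel per policy regime: how long the cartel was exposed to the
    # regime and whether it dissolved (or was discovered) while under that regime.
    spells = pd.DataFrame({
        "cartel_id": [1, 1, 2, 2, 3, 4, 4],
        "regime":    ["pre", "post", "pre", "post", "pre", "pre", "post"],
        "exposure":  [3.0, 2.0, 5.0, 1.0, 4.0, 2.0, 6.0],  # years at risk under the regime
        "dissolved": [0,    1,    0,   1,   1,   0,   0],   # event indicator
    })

    # Piecewise-constant hazard: events divided by total time at risk in each regime.
    hazard = spells.groupby("regime").agg(events=("dissolved", "sum"),
                                          exposure=("exposure", "sum"))
    hazard["hazard_per_year"] = hazard["events"] / hazard["exposure"]
    print(hazard)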

    Predictive Analytics For Controlling Tax Evasion

    Tax evasion is an illegal practice in which a person or a business entity intentionally avoids paying his/her true tax liability. Any business entity is required by law to file its tax return statements following a periodic schedule. Failing to file the tax return statement is among the most rudimentary forms of tax evasion, and the dealers committing tax evasion in this way are called return defaulters. We constructed a logistic regression model that predicts with high accuracy whether a business entity is a potential return defaulter for the upcoming tax-filing period. To do so, we analyzed the effect of the amount of sales/purchase transactions among the business entities (dealers) and the mean absolute deviation (MAD) value of a first-digit Benford's analysis on the sales transactions of a business entity. We developed and deployed this model for the Commercial Taxes Department, Government of Telangana, India. Another, far more sophisticated technique used for tax evasion is known as circular trading. Circular trading is a fraudulent trading scheme used by notorious tax evaders to prevent the tax enforcement authorities from identifying their suspicious transactions. Dealers use this technique to collude with each other and conduct heavy illegitimate trade among themselves to hide suspicious sales transactions. We developed an algorithm to detect groups of colluding dealers who conduct heavy illegitimate trading among themselves. To do so, we formulated the problem as finding clusters in a weighted directed graph. The novelty of our approach is that we used Benford's analysis to define the edge weights and defined a measure similar to the F1 score to quantify the similarity between two clusters. The proposed algorithm was run on the commercial tax data set, and the results obtained contain several groups of colluding dealers.
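    One of the features mentioned above is the mean absolute deviation (MAD) between a dealer's observed first-digit distribution and the distribution predicted by Benford's law. The deployed model is not reproduced here; the following minimal Python sketch, with invented example amounts, shows how such a MAD value can be computed from a dealer's sales transaction amounts.

    import math
    from collections import Counter

    BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}  # expected P(first digit = d)

    def first_digit(amount: float) -> int:
        s = f"{abs(amount):.15g}".lstrip("0.")
        return int(s[0])

    def benford_mad(amounts: list[float]) -> float:
        digits = [first_digit(a) for a in amounts if a != 0]
        if not digits:
            return 0.0
        counts = Counter(digits)
        n = len(digits)
        observed = {d: counts.get(d, 0) / n for d in range(1, 10)}
        return sum(abs(observed[d] - BENFORD[d]) for d in range(1, 10)) / 9

    # Example: amounts that pile up on a single leading digit yield a large MAD,
    # which can feed a logistic regression alongside transaction-volume features.
    print(benford_mad([912.0, 940.5, 987.0, 905.0, 118.0, 230.0, 77.0, 96.0]))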

    The determinants of cartel success : an empirical approach

    Collusive agreements between firms have been a focus of empirical research because of the effects they produce on society. However, it is challenging to determine which factors contribute to the success of these agreements, and there are still several disagreements among economists regarding what determines cartel success. This difficulty is partially attributable to the dearth of accessible data and the limited number of indicators reflecting cartel performance. This bachelor's thesis utilizes the Private International Cartels database to study what determines the success of cartels. To answer this question, we assess the influence of macroeconomic variables on cartel duration and overcharge. The analysis indicates that cartels are more likely to succeed in periods of lower economic growth. In addition, higher market concentration is associated with lower duration.

    Ensuring the resilience of wireless sensor networks to malicious data injections through measurements inspection

    Malicious data injections pose a severe threat to systems based on Wireless Sensor Networks (WSNs), since they give the attacker control over the measurements and, in turn, over the system's status and response. Malicious measurements are particularly threatening when used to spoof or mask events of interest, thus eliciting or preventing desirable responses. Spoofing and masking attacks are particularly difficult to detect since they depict plausible behaviours, especially if multiple sensors have been compromised and collude to inject a coherent set of malicious measurements. Previous work has tackled the problem through measurements inspection, which analyses the inter-measurement correlations induced by the physical phenomena. However, these techniques consider simplistic attacks and are not robust to collusion. Moreover, they assume highly predictable patterns in the measurements distribution, which are invalidated by the unpredictability of events. We design a set of techniques that effectively detect malicious data injections in the presence of sophisticated collusion strategies, when one or more events manifest. Moreover, we build a methodology to characterise the likely compromised sensors. We also design diagnosis criteria that allow us to distinguish anomalies arising from malicious interference from those arising from faults. In contrast with previous work, we test the robustness of our methodology with automated and sophisticated attacks, where the attacker aims to evade detection. We conclude that our approach outperforms state-of-the-art approaches. Moreover, we estimate quantitatively the WSN's degree of resilience and provide a methodology to give a WSN owner an assured degree of resilience by automatically designing the WSN deployment. To deal also with the extreme scenario where the attacker has compromised most of the WSN, we propose a combination with software attestation techniques, which are more reliable when malicious data originates from compromised software, but also more expensive, and achieve an excellent trade-off between cost and resilience.
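    The detection techniques build on measurements inspection, i.e., exploiting the correlations that the monitored physical phenomenon induces across sensors. The thesis's actual algorithms are not reproduced here; the Python sketch below is only a simplified illustration of the underlying idea, flagging a sensor whose readings persistently deviate from the median of its (assumed known) neighbours, with invented data and thresholds.

    import numpy as np

    def flag_suspect_sensors(readings: np.ndarray, neighbours: dict[int, list[int]],
                             threshold: float = 5.0, min_fraction: float = 0.5) -> list[int]:
        """readings: array of shape (time_steps, n_sensors).
        neighbours: for each sensor index, the indices of spatially correlated sensors.
        A sensor is flagged if its reading deviates from its neighbours' median by more
        than `threshold` in at least `min_fraction` of the time steps."""
        time_steps, n_sensors = readings.shape
        suspects = []
        for s in range(n_sensors):
            ref = np.median(readings[:, neighbours[s]], axis=1)  # per-step neighbour consensus
            deviant_steps = np.abs(readings[:, s] - ref) > threshold
            if deviant_steps.mean() >= min_fraction:
                suspects.append(s)
        return suspects

    # Toy example: sensor 2 injects inflated measurements while 0, 1 and 3 agree.
    rng = np.random.default_rng(0)
    base = rng.normal(20.0, 1.0, size=(100, 4))
    base[:, 2] += 15.0  # malicious offset
    topology = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2]}
    print(flag_suspect_sensors(base, topology))  # expected: [2]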
