
    Efficient and Privacy-Preserving Ride Sharing Organization for Transferable and Non-Transferable Services

    Ride-sharing allows multiple people to share a trip in one vehicle instead of using multiple vehicles. This can reduce the number of vehicles on the street, which in turn can reduce air pollution, traffic congestion, and transportation cost. However, organizing shared rides requires passengers to report sensitive location information about their trips to a trip organizing server (TOS), which creates a serious privacy issue. In addition, existing ride-sharing schemes are inflexible, i.e., they require a driver and a rider to have exactly the same trip to share a ride. Moreover, they are non-scalable, i.e., inefficient when applied to large geographic areas. In this paper, we propose two efficient privacy-preserving ride-sharing organization schemes for Non-transferable Ride-sharing Services (NRS) and Transferable Ride-sharing Services (TRS). In the NRS scheme, a rider can share a ride from source to destination with only one driver, whereas in the TRS scheme, a rider can transfer between multiple drivers en route until he reaches his destination. In both schemes, the ride-sharing area is divided into a number of small geographic areas, called cells, each with a unique identifier. Each driver/rider encrypts his trip data and sends an encrypted ride-sharing offer/request to the TOS. In the NRS scheme, Bloom filters are used to compactly represent the trip information before encryption. The TOS can then measure the similarity between the encrypted trip data to organize shared rides without revealing either the users' identities or the location information. In the TRS scheme, drivers report their encrypted routes, and then the TOS builds an encrypted directed graph that is passed to a modified version of Dijkstra's shortest path algorithm to search for an optimal path of rides that satisfies a set of preferences defined by the riders.
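    A minimal sketch (in Python) of the cell/Bloom-filter idea described above: a trip is represented as a Bloom filter over the identifiers of the cells it crosses, and the overlap between a rider's and a driver's trips can then be estimated from the filters. The filter length, hash count, and cell IDs below are illustrative assumptions; in the actual schemes the filters are encrypted before being sent to the TOS and matching is performed over the ciphertexts.

```python
# Illustrative sketch, not the paper's actual protocol: represent a trip as a
# Bloom filter over the cell identifiers it passes through, then estimate how
# much of a rider's trip lies on a driver's route.
import hashlib

FILTER_BITS = 256   # assumed filter length
NUM_HASHES = 4      # assumed number of hash functions

def _positions(cell_id: str):
    """Map one cell identifier to NUM_HASHES bit positions."""
    for i in range(NUM_HASHES):
        digest = hashlib.sha256(f"{i}:{cell_id}".encode()).hexdigest()
        yield int(digest, 16) % FILTER_BITS

def trip_filter(cell_ids):
    """Compactly represent a trip (a sequence of cell IDs) as a Bloom filter."""
    bits = [0] * FILTER_BITS
    for cell in cell_ids:
        for pos in _positions(cell):
            bits[pos] = 1
    return bits

def coverage(rider_cells, driver_filter):
    """Fraction of the rider's cells that appear on the driver's route.
    Bloom filters can give false positives, so this is an upper bound."""
    hits = sum(all(driver_filter[p] for p in _positions(c)) for c in rider_cells)
    return hits / len(rider_cells)

# Example: a driver's route fully covers the rider's shorter trip.
driver = trip_filter(["c12", "c13", "c14", "c15", "c16"])
print(coverage(["c13", "c14", "c15"], driver))  # -> 1.0
```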

    Premium Incentives to Drive Wellness in the Workplace: A Review of the Issues and Recommendations for Policymakers

    Outlines trends in workplace wellness programs; healthcare reform law provisions allowing greater financial incentives for employees; policy considerations for vulnerable populations, privacy issues, and affordability of coverage; and recommendations for policymakers.

    After Over-Privileged Permissions: Using Technology and Design to Create Legal Compliance

    Consumers in the mobile ecosystem can putatively protect their privacy through application permissions. However, this requires mobile device owners to understand permissions and their privacy implications. Yet few consumers appreciate the nature of permissions within the mobile ecosystem, often failing to notice the permissions that are altered when updating an app. Even more concerning is the lack of understanding of the wide use of third-party libraries, most of which are installed with automatic permissions, that is, permissions that must be granted for the application to function properly. Unsurprisingly, many of these third-party permissions violate consumers’ privacy expectations and thereby become “over-privileged” from the user’s perspective. The result is a gap between the privacy practices of the private sector and what the public sector deems appropriate. Despite the growing attention given to privacy in the mobile ecosystem, legal literature has largely ignored the implications of mobile permissions. This article seeks to address that omission by analyzing the impact of mobile permissions and the privacy harms experienced by consumers of mobile applications. The authors call for a review of industry self-regulation and of the overreliance on simple notice and consent. Instead, the authors set out a plan for greater attention to socio-technical solutions, focusing on better privacy protections and technology embedded within the automatic permission-based application ecosystem.

    An Automated Approach to Auditing Disclosure of Third-Party Data Collection in Website Privacy Policies

    A dominant regulatory model for web privacy is "notice and choice". In this model, users are notified of data collection and provided with options to control it. To examine the efficacy of this approach, this study presents the first large-scale audit of disclosure of third-party data collection in website privacy policies. Data flows on one million websites are analyzed and over 200,000 websites' privacy policies are audited to determine whether users are notified of the names of the companies that collect their data. Policies from 25 prominent third-party data collectors are also examined to provide deeper insight into the totality of the policy environment. Policies are additionally audited to determine whether the choice expressed by the "Do Not Track" browser setting is respected. Third-party data collection is widespread, but fewer than 15% of attributed data flows are disclosed. The third parties most likely to be disclosed are those with consumer services users may be aware of; those without consumer services are less likely to be mentioned. Policies are difficult to understand, and the average time required to read both a given site's policy and the associated third-party policies exceeds 84 minutes. Only 7% of first-party site policies mention the Do Not Track signal, and the majority of such mentions specify that the signal is ignored. Among the third-party policies examined, none offers unqualified support for the Do Not Track signal. The findings indicate that current implementations of "notice and choice" fail to provide notice or respect choice.
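    A rough sketch of the disclosure check such an audit performs: given the third-party domains observed in a site's traffic and the text of its privacy policy, report which attributed collectors are named and which are not. The domain-to-company mapping below is an illustrative assumption, not the study's attribution data, and the real pipeline operates over crawled data flows and parsed policies at scale.

```python
# Illustrative disclosure check for one site: is each observed third-party
# collector named anywhere in the site's privacy policy text?
THIRD_PARTY_COMPANIES = {        # assumed domain -> company attribution
    "doubleclick.net": "Google",
    "facebook.net": "Facebook",
    "scorecardresearch.com": "comScore",
}

def audit_disclosure(observed_domains, policy_text):
    """Return (disclosed, undisclosed) company names for one site."""
    policy = policy_text.lower()
    disclosed, undisclosed = set(), set()
    for domain in observed_domains:
        company = THIRD_PARTY_COMPANIES.get(domain)
        if company is None:
            continue  # unattributed flow; only attributed flows are audited
        (disclosed if company.lower() in policy else undisclosed).add(company)
    return disclosed, undisclosed

disclosed, undisclosed = audit_disclosure(
    ["doubleclick.net", "scorecardresearch.com"],
    "We share usage data with Google and other advertising partners.",
)
print(disclosed)    # {'Google'}
print(undisclosed)  # {'comScore'}
```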

    Privacy as a Public Good

    Privacy is commonly studied as a private good: my personal data is mine to protect and control, and yours is yours. This conception of privacy misses an important component of the policy problem. An individual who is careless with data exposes not only extensive information about herself, but about others as well. The negative externalities imposed on nonconsenting outsiders by such carelessness can be productively studied in terms of welfare economics. If all relevant individuals maximize private benefit, and expect all other relevant individuals to do the same, neoclassical economic theory predicts that society will achieve a suboptimal level of privacy. This prediction holds even if all individuals cherish privacy with the same intensity. As the theoretical literature would have it, the struggle for privacy is destined to become a tragedy. But according to the experimental public-goods literature, there is hope. As in real life, people in experiments cooperate in groups at rates well above those predicted by neoclassical theory. Groups can be aided in their struggle to produce public goods by institutions, such as communication, framing, or sanction. With these institutions, communities can manage public goods without heavy-handed government intervention. Legal scholarship has not fully engaged this problem in these terms. In this Article, we explain why privacy has aspects of a public good, and we draw lessons from both the theoretical and the empirical literature on public goods to inform the policy discourse on privacy.