181,209 research outputs found

    Releasing Individually Identifiable Microdata with Privacy Protection Against Stochastic Threat: An Application to Health Information

    The ability to collect and disseminate individually identifiable microdata is becoming increasingly important in a number of arenas. This is especially true in health care and national security, where this data is considered vital for a number of public health and safety initiatives. In some cases legislation has been used to establish some standards for limiting the collection of and access to such data. However, all such legislative efforts contain many provisions that allow for access to individually identifiable microdata without the consent of the data subject. Furthermore, although legislation is useful in that penalties are levied for violating the law, these penalties occur after an individual’s privacy has been compromised. Such deterrent measures can only serve as disincentives and offer no true protection. This paper considers security issues involved in releasing microdata, including individual identifiers. The threats to the confidentiality of the data subjects come from users who possess statistical information relating the revealed microdata to suppressed confidential information. The general strategy is to recode the initial data, in which some subjects are “safe” and some are at risk, into a data set in which no subjects are at risk. We develop a technique that enables the release of individually identifiable microdata in a manner that maximizes the utility of the released data while providing preventive protection of confidential data. Extensive computational results show that the proposed method is practical and viable and that useful data can be released even when the level of risk in the data is high.
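
The recoding strategy the abstract describes can be illustrated with a toy sketch: generalize quasi-identifiers (exact age, full ZIP code) into coarser bands, then suppress any record whose recoded combination is still rare enough to put its subject at risk. The field names, banding scheme, and threshold `k` are assumptions for illustration, not the paper's actual method.

```python
from collections import Counter

def recode_records(records, k=2):
    """Generalize exact ages into 10-year bands and ZIPs into 3-digit
    prefixes, then suppress any record whose recoded combination still
    occurs fewer than k times (i.e. whose subject remains at risk).
    A toy illustration of recoding, not the paper's algorithm."""
    def band(age):
        lo = (age // 10) * 10
        return f"{lo}-{lo + 9}"
    coded = [{"age": band(r["age"]), "zip": r["zip"][:3] + "**"} for r in records]
    counts = Counter((c["age"], c["zip"]) for c in coded)
    # Records whose recoded combination is still rarer than k are fully suppressed.
    return [c if counts[(c["age"], c["zip"])] >= k else {"age": "*", "zip": "*"}
            for c in coded]
```

In this sketch, two subjects sharing an age band and ZIP prefix protect each other, while a subject who remains unique after generalization is suppressed outright.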

    Common Sense: Rethinking the New Common Rule's Weak Protections for Human Subjects

    Since 1991, the Federal Policy for the Protection of Human Subjects, known as the Common Rule, has protected the identifiable private information of human subjects who participate in federally funded research initiatives. Although the research landscape has drastically changed since 1991, the Common Rule has remained mostly unchanged since its promulgation. In an effort to modernize the Common Rule, the Federal Policy for the Protection of Human Subjects Final Rule ('Final Rule') was published on January 19, 2017. The Final Rule, however, decreases human-subject protections by increasing access to identifiable data with limited administrative oversight. Accordingly, the Final Rule demands reconsideration. This Note conducts a comparative analysis of the Final Rule and the Health Insurance Portability and Accountability Act Standards for Privacy of Individually Identifiable Health Information ('Privacy Rule'). Ultimately, this Note argues that a revised Final Rule should incorporate a modified version of the Privacy Rule that in turn provides human subjects with legally enforceable rights, remedies, and control over how information about them is used.

    Beyond Classification: Latent User Interests Profiling from Visual Contents Analysis

    User preference profiling is an important task in modern online social networks (OSN). With the proliferation of image-centric social platforms, such as Pinterest, visual contents have become one of the most informative data streams for understanding user preferences. Traditional approaches usually treat visual content analysis as a general classification problem where one or more labels are assigned to each image. Although such an approach simplifies the process of image analysis, it misses the rich context and visual cues that play an important role in people's perception of images. In this paper, we explore the possibilities of learning a user's latent visual preferences directly from image contents. We propose a distance metric learning method based on Deep Convolutional Neural Networks (CNN) to directly extract similarity information from visual contents and use the derived distance metric to mine individual users' fine-grained visual preferences. Through our preliminary experiments using data from 5,790 Pinterest users, we show that even for the images within the same category, each user possesses distinct and individually-identifiable visual preferences that are consistent over their lifetime. Our results underscore the untapped potential of finer-grained visual preference profiling in understanding users' preferences.
    Comment: 2015 IEEE 15th International Conference on Data Mining Workshop
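
The core objective in the distance metric learning the abstract describes is typically a triplet-style loss: an anchor embedding is pulled toward embeddings of images the user prefers and pushed away from others by at least a margin. The sketch below uses toy vectors in place of CNN features and a standard hinge-form triplet loss; the paper's exact formulation may differ.

```python
import math

def euclidean(u, v):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss used in distance metric learning:
    zero when the positive is already at least `margin` closer to the
    anchor than the negative; positive otherwise. Vectors here stand
    in for CNN feature embeddings."""
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)
```

Training drives this loss toward zero, so that the learned distance metric ranks a user's preferred images as nearest to their anchor representation.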

    Hacking HIPAA: Best Practices for Avoiding Oversight in the Sale of Your Identifiable Medical Information

    In light of the confusion invited by applying the label de-identified to information that can be used to identify patients, it is paramount that regulators, compliance professionals, patient advocates and the general public understand the significant differences between the standards applied by HIPAA and those applied by permissive de-identification guidelines. This Article discusses those differences in detail. The discussion proceeds in four Parts. Part II (HIPAA's Heartbeat: Why HIPAA Protects Identifiable Patient Information) examines Congress's motivations for defining individually identifiable health information broadly, which included stopping the harms patients endured prior to 1996 arising from the commercial sale of their medical records. Part III (Taking the I Out of Identifiable Information: HIPAA's Requirements for De-Identified Health Information) discusses HIPAA's requirements for de-identification, which were never intended to create a loophole for identifiable patient information to escape HIPAA's protections. Part IV (Anatomy of a Hack: Methods for Labeling Identifiable Information 'De-Identified') examines the goals, methods, and results of permissive de-identification guidelines and compares them to HIPAA's requirements. Part V (Protecting Un-Protected Health Information) evaluates the suitability of permissive de-identification guidelines, concluding that the vulnerabilities inherent in their current articulation render them ineffective as a data protection standard. It also discusses ways in which compliance professionals, regulators, and advocates can foster accountability and transparency in the utilization of health information that can be used to identify patients.
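
For context on the de-identification standard the Article discusses, HIPAA's Safe Harbor method (45 CFR 164.514(b)(2)) requires removing 18 categories of identifiers, truncating ZIP codes to three digits, and reducing most dates to the year. The sketch below shows the mechanical flavor of that rule on a toy record; the field names are assumptions and only a few of the 18 identifier categories are represented.

```python
# A few of the 18 Safe Harbor identifier categories, as toy field names.
DIRECT_IDENTIFIERS = {"name", "street_address", "phone", "email", "ssn", "mrn"}

def safe_harbor_strip(record):
    """Drop direct identifiers, truncate the ZIP to its 3-digit prefix,
    and keep only the year of the birth date. A partial sketch of the
    Safe Harbor method; the full rule covers 18 identifier categories
    and further restricts ZIP prefixes for small populations."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "00"
    if "birth_date" in out:
        out["birth_date"] = out["birth_date"][:4]  # year only
    return out
```

The Article's point is that such field-level stripping can still leave records re-identifiable, which is why the label "de-identified" invites confusion.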

    Personal Privacy and Common Goods: A Framework for Balancing Under the National Health Information Privacy Rule

    In this Article, we discuss how these principles for balancing apply in a number of important contexts where individually identifiable health data are shared. In Part I, we analyze the modern view favoring autonomy and privacy. In the last several decades, individual autonomy has been used as a justification for preventing sharing of information irrespective of the good to be achieved. Although respect for privacy can sometimes be important for achieving public purposes (e.g., fostering the physician/patient relationship), it can also impair the achievement of goals that are necessary for any healthy and prosperous society. A framework for balancing that strictly favors privacy can lead to reduced efficiencies in clinical care, research, and public health. We reason that society would be better served, and individuals would be only marginally less protected, if privacy rules permitted exchange of data for important public benefits. In Part II, we explain the national health information privacy regulations: (1) what do they cover?; (2) to whom do they apply?; and (3) how do they safeguard personal privacy? Parts III and IV focus on whether the standards adhere, or fail to adhere, to the privacy principles discussed in Part I. In Part III, we examine two autonomy rules established in the national privacy regulations: informed consent (for uses or disclosures of identifiable health data for health-care related purposes) and written authorization (for uses or disclosures of health data for non-health care related purposes). We observe that the informed consent rule is neither informed nor consensual. The rule is likely to thwart the effective management of health organizations without benefiting the individual. Requiring written authorization, on the other hand, protects individual privacy by preventing disclosures to entities that do not perform health-related functions, such as employers and life insurers.
    In Part IV, we examine various contexts in which data can be shared for public purposes under the national privacy rule: public health, research, law enforcement, familial notification, and commercial marketing. We apply our framework for balancing in each context and observe the relative strengths and weaknesses of the privacy regulations in achieving a fair balance of private and public interests.

    HIPAA and Advanced Scientific Computing

    Demand for compute cycles and massive data storage has been growing rapidly in biomedical research. Activities on topics such as electronic health record analytics and gene sequencing are placing an increasing burden on academic medical college IT departments with limited ability to scale. As a result, campus and national advanced scientific computing centers (ASCCs) are being asked to accommodate biomedical researchers. This presents a challenge to these organizations, since clinical research data and electronic health records contain identifiable patient information protected by the federal Privacy and Security Rules promulgated under the Health Insurance Portability and Accountability Act (HIPAA) of 1996. The HIPAA Privacy and Security Rules require entities to protect the privacy of individually identifiable health information, or protected health information (PHI). The rules specify the types of safeguards that must be put in place, including required security controls to ensure patient privacy.

    An efficient solution for privacy-preserving, secure remote access to sensitive data

    Sharing data that contains personally identifiable or sensitive information, such as medical records, always has privacy and security implications. The issues can become rather complex when the methods of access vary, and accurate individual data must be provided while mass data release for specific purposes (for example, medical research) also has to be catered for. Although various solutions have been proposed to address the different aspects individually, a comprehensive approach is highly desirable. This paper presents a solution for maintaining the privacy of data released en masse in a controlled manner, and for providing secure access to the original data for authorized users. The results show that the solution is provably secure and maintains privacy in a more efficient manner than previous solutions.
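
The dual access modes the abstract describes (full records for authorized users, privacy-preserving views for everyone else) can be sketched as a simple role-gated accessor. The role names and masked fields here are assumptions for illustration, not the paper's actual protocol, which additionally provides cryptographic security guarantees this sketch does not model.

```python
MASKED_FIELDS = {"name", "ssn"}  # assumed direct identifiers

def fetch_record(record, user_role):
    """Return the full record to an authorized role and a masked view
    otherwise. A toy stand-in for the paper's access-control layer:
    the real solution enforces this distinction cryptographically,
    not with a simple role check."""
    if user_role == "clinician":  # assumed authorized role
        return dict(record)
    return {k: ("<masked>" if k in MASKED_FIELDS else v) for k, v in record.items()}
```

The design point is that both access paths read the same underlying store, so individual-level accuracy for authorized users and mass-release privacy are not served by separate copies of the data.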

    Automatic blurring of specific faces in video

    With the introduction of the General Data Protection Regulation (GDPR) into European Union law, it became more important than ever before to properly handle personal data. This is an issue for media companies which distribute large amounts of media containing identifiable people, which thus may require the subjects' permission for distribution. In this Master's thesis, I propose a solution which supports and facilitates compliance with GDPR regarding the distribution of video containing identifiable subjects by automatically blurring a select group of people in the videos. The proposed solution is a pipeline for detecting, identifying and blurring select faces, where the video frames are processed like individual images to detect and recognize faces, and the interrelatedness of adjacent frames in continuous videos is exploited both to improve prediction quality and to reduce running time. Each part of the pipeline is interchangeable and may be replaced individually, and the deployment of the entire pipeline has been automated. Aspects related to video processing, facial detection and facial recognition were explored for this purpose, and various existing tools and solutions were utilized.
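
The final stage of such a pipeline, blurring a detected face's bounding box within a frame, can be sketched with a plain box blur over one rectangular region of a grayscale image (represented here as a list of rows of integers). In the thesis's pipeline this step would run per detected face in every frame, typically via an image library rather than pure Python; the box format and radius parameter here are assumptions.

```python
def blur_region(img, box, radius=1):
    """Box-blur one rectangular region of a grayscale image.
    `img` is a list of rows of pixel intensities; `box` is
    (top, left, bottom, right) with bottom/right exclusive.
    Pixels outside the box are left untouched, mirroring how a
    face-blurring stage only anonymizes detected face regions."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    top, left, bottom, right = box
    for y in range(top, bottom):
        for x in range(left, right):
            # Average the pixel with its neighbors inside the blur window,
            # clamped to the image bounds.
            vals = [img[j][i]
                    for j in range(max(0, y - radius), min(h, y + radius + 1))
                    for i in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) // len(vals)
    return out
```

Because only the boxed pixels change, the same frame can carry several blurred faces while everything else, including faces of consenting subjects, stays sharp.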