247 research outputs found

    Biometric Boom: How the Private Sector Commodifies Human Characteristics

    Get PDF
    Biometric technology has become an increasingly common part of daily life. Although biometrics have been used for decades, recent advances and new uses have made the technology more prevalent, particularly in the private sector. This Note examines how widespread use of biometrics by the private sector is commodifying human characteristics. As the use of biometrics becomes more extensive, it exacerbates and exposes individuals and industry to a number of risks and problems associated with biometrics. Contrary to public belief, biometric systems may be bypassed, hacked, or simply fail. The more a characteristic is utilized, the less value it will hold for security purposes. Once compromised, a biometric cannot be replaced the way a password or other security device can. This Note argues that there are strong justifications for a legal structure that builds hurdles to slow the adoption of biometrics in the private sector. By examining the law and economics and personality theories of commodification, this Note identifies market failure and potential harm to personhood due to biometrics. The competing theories justify a reform to protect human characteristics from commodification. This Note presents a set of principles and tools based on defaults, disclosures, incentives, and taxation to discourage the use of biometrics, buying time to strengthen the technology, educate the public, and establish legal safeguards for when the technology is compromised or fails.

    Improving k-nn search and subspace clustering based on local intrinsic dimensionality

    Get PDF
    In several novel applications such as multimedia and recommender systems, data is often represented as object feature vectors in high-dimensional spaces. High-dimensional data is always a challenge for state-of-the-art algorithms because of the so-called curse of dimensionality. As the dimensionality increases, the discriminative ability of similarity measures diminishes to the point where many data analysis algorithms, such as similarity search and clustering, that depend on them lose their effectiveness. One way to handle this challenge is by selecting the most important features, which is essential for providing compact object representations as well as improving the overall search and clustering performance. Having compact feature vectors can further reduce the storage space and the computational complexity of search and learning tasks. Support-Weighted Intrinsic Dimensionality (support-weighted ID) is a promising new feature selection criterion that estimates the contribution of each feature to the overall intrinsic dimensionality. Support-weighted ID identifies relevant features locally for each object, and penalizes those features that have locally lower discriminative power as well as higher density. In fact, support-weighted ID measures the ability of each feature to locally discriminate between objects in the dataset. Based on support-weighted ID, this dissertation introduces three main research contributions. First, this dissertation proposes NNWID-Descent, a similarity graph construction method that utilizes the support-weighted ID criterion to identify and retain relevant features locally for each object and enhance the overall graph quality. Second, with the aim of improving the accuracy and performance of cluster analysis, this dissertation introduces k-LIDoids, a subspace clustering algorithm that extends the utility of support-weighted ID within a clustering framework in order to gradually select the subset of informative and important features per cluster. k-LIDoids is able to construct clusters while finding a low-dimensional subspace for each cluster. Finally, using the compact object and cluster representations from NNWID-Descent and k-LIDoids, this dissertation defines LID-Fingerprint, a new binary fingerprinting and multi-level indexing framework for high-dimensional data. LID-Fingerprint can be used for hiding information as a way of defending against passive adversaries, as well as providing efficient and secure similarity search and retrieval for data stored on the cloud. Compared to other state-of-the-art algorithms, the good practical performance provides evidence of the effectiveness of the proposed algorithms for data in high-dimensional spaces.
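
    Support-weighted ID builds on estimates of local intrinsic dimensionality (LID) computed from each object's nearest-neighbour distances. The sketch below shows the standard maximum-likelihood LID estimator that such criteria typically start from; the function name and the assumptions (distances sorted ascending, strictly positive, not all equal) are illustrative and not taken from the dissertation itself.

        #include <cmath>
        #include <vector>

        // Maximum-likelihood estimate of local intrinsic dimensionality (LID)
        // from a query point's distances to its k nearest neighbours:
        //   LID_hat = -k / sum_{i=1..k} ln(r_i / r_k)
        // Assumes knn_dists is sorted ascending, strictly positive, and not
        // all entries equal to the largest distance r_k.
        double lid_mle(const std::vector<double>& knn_dists) {
            const double r_max = knn_dists.back();   // neighbourhood radius r_k
            double sum_log = 0.0;
            for (double r : knn_dists)
                sum_log += std::log(r / r_max);      // last term is ln(1) = 0
            return -static_cast<double>(knn_dists.size()) / sum_log;
        }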

    How Can and Would People Protect From Online Tracking?

    Get PDF
    Online tracking is complex and users find it challenging to protect themselves from it. While the academic community has extensively studied systems and users for tracking practices, the link between the data protection regulations, websites’ practices of presenting privacy-enhancing technologies (PETs), and how users learn about PETs and practice them is not clear. This paper takes a multidimensional approach to find such a link. We conduct a study to evaluate the 100 top EU websites, where we find that information about PETs is provided far beyond the cookie notice. We also find that opting-out from privacy settings is not as easy as opting-in and becomes even more difficult (if not impossible) when the user decides to opt-out of previously accepted privacy settings. In addition, we conduct an online survey with 614 participants across three countries (UK, France, Germany) to gain a broad understanding of users’ tracking protection practices. We find that users mostly learn about PETs for tracking protection via their own research or with the help of family and friends. We find a disparity between what websites offer as tracking protection and the ways individuals report to do so. Observing such a disparity sheds light on why current policies and practices are ineffective in supporting the use of PETs by users.

    Meeting the Stranger: Closing the Distance in Ernest Hemingway’s A Moveable Feast

    Get PDF
    This thesis provides an in-depth analysis of Ernest Hemingway’s memoir, A Moveable Feast. The analysis focuses on how AMF functions as a memoir, given its complicated publication history. The thesis uses the 2009 Restored Edition, which is most closely associated with Hemingway’s original manuscripts. He crafts his memories of Paris between 1921 and 1926, develops interactive scenes for twenty-first-century readers to discover his story, and constructs a blended voice that closes the distance between his present and his past by writing about his writing process. This thesis adds to the academic conversation on A Moveable Feast, attempting to show how important Hemingway’s final work is to the rest of his writing and how relevant it is to twenty-first-century readers.

    The 'Uncanny' and The Android

    Get PDF
    The character of the android is found widely in film and literature. While she appears across the entire spectrum of genres, she most often makes her appearance in the uncanny text. This appearance is nearly always accompanied by some variation of the vision motif. Despite widespread interest in both the 'Uncanny' and the android, to date there is no theory that accounts for the uncanny nature of the android and the prevalence of the vision motif in the android text. This paper will attempt to develop just such a theory. Any paper that addresses the 'Uncanny' must begin with Freud's 1919 essay, 'The Uncanny.' While this paper does not propose a psychoanalytic reading of the android, Freud's work establishes the relationship between the android and the binary oppositions of strange/familiar, alive/dead and animate/inanimate. This discussion of binary oppositions leads to Ernst Jentsch's 1909 publication, 'On the Psychology of the Uncanny.' Jentsch's work is used to develop the uncanniness of the mechanical nature of life. Following Jentsch, Masahiro Mori's 1970 publication, 'The Uncanny Valley,' places the human and the android on the same continuum, thus eliminating the opposition of man/machine. This, in turn, leads into a discussion of Donna Haraway's 'The Cyborg Manifesto.' Haraway's model of the cyborg moves the discussion even further from dichotomous thought. The 'Uncanny,' it is concluded, is located at the midpoint of the binary pair. The android is uncanny because of her pivotal role in the dissolution of such pairs. Specifically, she compromises the mechanical/organic dichotomy. The android illustrates the mechanical nature of all life, thus making all life uncanny. The absolute foregrounding of vision in the android text requires a rethinking of the android. While android life is no different from human life in its mechanical qualities, the android nonetheless retains one fundamental difference: the android is designed. Thus androids, through an adaptation of Laura Mulvey's 'Visual Pleasure and Narrative Cinema,' can be thought of as to-be-looked-at-ness machines. This enters the android into a reciprocal relationship with the camera, the looking-at-machine. It is this reciprocal machine-machine relationship which explains the ubiquitous pairing of the android with themes of vision.

    Applications of a Graph Theoretic Based Clustering Framework in Computer Vision and Pattern Recognition

    Full text link
    Recently, several clustering algorithms have been used to solve a variety of problems from different disciplines. This dissertation aims to address different challenging tasks in computer vision and pattern recognition by casting the problems as clustering problems. We proposed novel approaches to solve multi-target tracking, visual geo-localization and outlier detection problems using a unified underlying clustering framework, i.e., dominant set clustering and its extensions, and presented results superior to several state-of-the-art approaches.
    Comment: doctoral dissertation
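
    The unifying engine behind these applications, dominant set clustering, extracts a cluster as the support of an equilibrium of replicator dynamics run on a pairwise affinity matrix. The sketch below illustrates that core iteration under stated assumptions (a symmetric, non-negative affinity matrix with a zero diagonal); the function name, the stopping rule, and the idea of thresholding the result to read off cluster members are illustrative, not code from the dissertation.

        #include <cmath>
        #include <cstddef>
        #include <vector>

        using Matrix = std::vector<std::vector<double>>;

        // One dominant set via replicator dynamics: x_i <- x_i * (A x)_i / (x' A x).
        // Entries of the returned vector that stay above a small threshold form
        // the extracted cluster; peeling those objects off and repeating yields
        // the remaining clusters.
        std::vector<double> dominant_set(const Matrix& A, int max_iter = 1000,
                                         double tol = 1e-8) {
            const std::size_t n = A.size();
            std::vector<double> x(n, 1.0 / n);            // start at the barycentre
            for (int it = 0; it < max_iter; ++it) {
                std::vector<double> Ax(n, 0.0);
                for (std::size_t i = 0; i < n; ++i)
                    for (std::size_t j = 0; j < n; ++j)
                        Ax[i] += A[i][j] * x[j];
                double xAx = 0.0;
                for (std::size_t i = 0; i < n; ++i) xAx += x[i] * Ax[i];
                if (xAx <= 0.0) break;                    // degenerate affinities
                double change = 0.0;
                for (std::size_t i = 0; i < n; ++i) {
                    const double next = x[i] * Ax[i] / xAx;
                    change += std::fabs(next - x[i]);
                    x[i] = next;
                }
                if (change < tol) break;                  // converged
            }
            return x;
        }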

    Motion Scalability for Video Coding with Flexible Spatio-Temporal Decompositions

    Get PDF
    The research presented in this thesis aims to extend the scalability range of wavelet-based video coding systems in order to achieve fully scalable coding with a wide range of available decoding points. Since the temporal redundancy regularly comprises the main portion of the global video sequence redundancy, the techniques that can be generally termed motion decorrelation techniques have a central role in the overall compression performance. For this reason scalable motion modelling and coding are of utmost importance, and specifically, in this thesis possible solutions are identified and analysed. The main contributions of the presented research are grouped into two interrelated and complementary topics. Firstly, a flexible motion model with a rate-optimised estimation technique is introduced. The proposed motion model is based on tree structures and allows the high adaptability needed for layered motion coding. The flexible structure for motion compensation allows for optimisation at different stages of the adaptive spatio-temporal decomposition, which is crucial for scalable coding that targets decoding at different resolutions. By utilising an adaptive choice of wavelet filterbank, the model enables high compression based on efficient mode selection. Secondly, solutions for scalable motion modelling and coding are developed. These solutions are based on precision limiting of motion vectors and the creation of a layered motion structure that describes hierarchically coded motion. The solution based on precision limiting relies on layered bit-plane coding of motion vector values. The second solution builds on recently established techniques that impose scalability on a motion structure. The new approach is based on two major improvements: the evaluation of distortion in temporal subbands and a motion search in temporal subbands that finds the optimal motion vectors for the layered motion structure. Exhaustive tests of the rate-distortion performance in demanding scalable video coding scenarios show the benefits of applying both the developed flexible motion model and the various solutions for scalable motion coding.
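
    To make the precision-limiting idea concrete, the sketch below shows one way layered bit-plane coding of motion vector components can behave: a decoder that has received only the most significant bit-planes reconstructs a coarser, lower-precision vector. The plane ordering, the separate handling of signs, and the function names are illustrative assumptions, not the thesis codec.

        #include <cstdint>
        #include <cstdlib>
        #include <vector>

        // Bit-plane p (p = 0 is the least significant) of the magnitudes of all
        // motion-vector components; signs would be coded once, separately.
        std::vector<int> bit_plane(const std::vector<int16_t>& mv, int p) {
            std::vector<int> plane;
            plane.reserve(mv.size());
            for (int16_t c : mv)
                plane.push_back((std::abs(c) >> p) & 1);
            return plane;
        }

        // Magnitude reconstructed from only the `kept` most significant of
        // `total` bit-planes: the discarded low-order planes are zeroed out,
        // which is the precision limiting that makes the motion layer scalable.
        int16_t limit_precision(int16_t magnitude, int total, int kept) {
            const int drop = total - kept;
            return static_cast<int16_t>((magnitude >> drop) << drop);
        }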

    Worst-Case Execution Time Analysis for C++ based Real-Time On-Board Software Systems

    Get PDF
    Autonomous systems are today’s trend in the aerospace domain. These systems require more on-board data processing capabilities. They follow data-flow programming and have similar software architectures. Developing a framework that is applicable to these architectures reduces development effort and improves re-usability. However, an essential requirement of its design is a programming language that can offer both abstraction and static memory capabilities. As a result, C++ was chosen to develop the Tasking Framework, which is used to develop on-board data-flow-oriented applications. Validating the timing requirements for such a framework is a long, complicated process. Estimating the worst-case execution time (WCET) is the first step within this process. Thus, in this thesis, we focus on performing WCET analysis for C++ model-based applications developed with the Tasking Framework. This work deals with two main challenges that emerge from using C++: using objects imposes the need for a memory model, and using virtual methods implies indirect jumps. To this end, we developed a tool based on symbolic execution that can handle both challenges. The tool showed a high precision of roughly 90 % in bounding the loops of the benchmark suite. We then integrated our advanced analysis with an open toolbox for adaptive WCET analysis. Finally, we evaluated our approach for estimating the WCET of tasks developed with the Tasking Framework.
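
    The virtual-method challenge can be illustrated with a few lines of C++: the call through the base-class reference below compiles to an indirect jump, so a WCET analysis must enumerate the possible dynamic types and bound the call by its most expensive reachable override. The class names here are hypothetical and not taken from the Tasking Framework.

        // Dispatch through a vtable: at the call site in run(), the target of
        // t.execute() is not a fixed address, so the analysis has to consider
        // every override that can be reached at this point.
        struct Task {
            virtual void execute() = 0;
            virtual ~Task() = default;
        };

        struct TelemetryTask : Task {
            void execute() override { /* short, constant-time body */ }
        };

        struct HousekeepingTask : Task {
            void execute() override { /* longer body with a bounded loop */ }
        };

        void run(Task& t) {
            t.execute();  // indirect jump: WCET = max over all candidate targets
        }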

    Pleasure Patents

    Full text link
    The United States Patent and Trademark Office has granted thousands of patents for inventions whose purpose is to facilitate the sexual pleasure of their users. These pleasure patents raise a range of novel questions about both patent theory and the relationship between law and sexuality more broadly. Given that immoral inventions were long excluded from the patent system, and that sexual devices were widely criminalized for much of the past 150 years, how have patentees successfully framed the contributions of their sexual inventions? If a patentable invention must be both new and useful, how have patentees described the utility of sexual pleasure? This Article identifies several hundred patents that the USPTO has formally classified as improving sexual stimulation and intercourse, and it closely examines how patentees have described the utility of sexual pleasure over time. In describing the utility of technologies such as phalluses, vibrators, and virtual reality systems, patentees employ a diverse and rich set of themes about the purposes and social values of sexual pleasure. By facilitating sexual pleasure, these patented technologies can, according to their inventors: improve marital harmony, overcome female frigidity, calm fears of HIV transmission, reduce sexual assault, suppress demand for sex work, minimize the loneliness of single people, facilitate LGBTQIA relationships, and promote the emotional well-being of people with disabilities. As social and sexual norms have changed over time, so too have the various explanations for the social value of pleasure patents. This Article shows that the patent system is an underappreciated, and perhaps unexpected, archive of historical and contemporary sexual norms.