    Do Predictive Brain Implants Threaten Patient's Autonomy or Authenticity?

    The development of predictive brain implant (PBI) technology that is able to forecast specific neuronal events and advise and/or automatically administer appropriate therapy for diseases of the brain raises a number of ethical issues. Provided that this technology satisfies basic safety and functionality conditions, one of the most pressing questions to address is its relation to the autonomy of patients. As Frederic Gilbert asks in his article, if autonomy implies a certain idea of freedom, or self-government, how can an individual be considered to decide freely if the implanted device stands at the inception of the causal chain producing his decisions? He claims that PBIs threaten persons' autonomy by diminishing their post-operative experience of self-control. In this commentary, I wish to discuss this claim. Contrary to Gilbert, I will suggest that PBIs do not pose a significant threat to patients' autonomy, understood as self-control, but rather to their sense of authenticity. My claim is that the language of authenticity, already introduced in the recent bioethical literature, may offer a better way to voice some of the concerns with PBIs that Gilbert recognized.

    Digital Expungement

    Digital technology might lead to the extinction of criminal rehabilitation. In the digital era, criminal history records that were expunged by the state remain widely available through commercial vendors (data brokers) who sell this information to interested parties, or simply through a basic search of the Internet. The wide availability of information on expunged criminal history records increases the collateral consequences a criminal record entails, thereby eliminating the possibility of reintegration into society. Acknowledging the social importance of rehabilitation, policymakers attempted to regulate the practices of data brokers by imposing various legal obligations and restrictions, usually relating to the nature and accuracy of criminal records and the purposes for which they may be used. These regulations have proven insufficient to ensure rehabilitation. But regardless of future outcomes of such regulatory attempts, policymakers have largely overlooked the risks the Internet poses to expungement. Many online service providers and hosting services enable the wide dissemination and accessibility of criminal history records that were expunged. Legal research websites, websites that publish booking photographs taken during investigation (mugshots), social media platforms, and media archives all offer access to expunged criminal histories, often without charge, and all with the simple use of a search engine. Without legal intervention, rehabilitation in the digital age in the U.S. has become nearly impossible. This Article offers a legal framework for reducing the collateral consequences of expunged criminal records by re-conceptualizing the public nature of criminal records. It proceeds as follows. After an introduction, Part II examines rehabilitation and expungement as facets of criminal law. Part III explores the challenges digital technology poses to rehabilitation measures. Part IV evaluates and discusses potential ex-ante and ex-post measures that could enable rehabilitation in the digital age. It argues that while ex-post measures are both unconstitutional and unrealistic for enabling digital expungement, ex-ante measures could be a viable solution. Accordingly, this Article suggests implementing a graduated approach towards the public nature of criminal history records, which would be narrowly tailored to serve the interests of rehabilitation-by-expungement. Finally, the last Part concludes the discussion and warns against reluctance in regulating expunged criminal histories.

    Vote-Trading in International Institutions

    There is evidence that countries trade votes among each other in international institutions on a wide range of issues, including the use of force, trade issues and elections of judges. Vote-trading has been criticized as a form of corruption, undue influence and coercion. Contrary to common wisdom, however, I argue in this paper that the case for introducing policy measures against vote-trading cannot be made out on the basis of available evidence. This paper sets out a framework for analyzing vote-trading in international institutions, focusing on three major contexts in which vote-trading may generate benefits and costs: (1) agency costs (collective good), (2) coercive tendering and (3) agency costs (constituents). The applicability of each context depends primarily on the type of decision in question - i.e. preference-decision or judgment-decision - and the interests that countries are expected to maximize when voting. The framework is applied to evidence of vote-trading in four institutions: the Security Council, the General Assembly, the World Trade Organization and the International Whaling Commission. The analysis reveals that while vote-trading can create significant costs, there is only equivocal evidence to this effect, and in several cases vote-trading generates important benefits.

    Generalized SURE for Exponential Families: Applications to Regularization

    Stein's unbiased risk estimate (SURE) was proposed by Stein for the independent, identically distributed (iid) Gaussian model in order to derive estimates that dominate least-squares (LS). In recent years, the SURE criterion has been employed in a variety of denoising problems for choosing regularization parameters that minimize an estimate of the mean-squared error (MSE). However, its use has been limited to the iid case, which precludes many important applications. In this paper we begin by deriving a SURE counterpart for general, not necessarily iid distributions from the exponential family. This enables extending the SURE design technique to a much broader class of problems. Based on this generalization we suggest a new method for choosing regularization parameters in penalized LS estimators. We then demonstrate its superior performance over the conventional generalized cross validation approach and the discrepancy method in the context of image deblurring and deconvolution. The SURE technique can also be used to design estimates without predefining their structure. However, allowing for too many free parameters impairs the performance of the resulting estimates. To address this inherent tradeoff we propose a regularized SURE objective. Based on this design criterion, we derive a wavelet denoising strategy that is similar in spirit to the standard soft-threshold approach but can lead to improved MSE performance.
    Comment: to appear in the IEEE Transactions on Signal Processing
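    As a concrete illustration of the classical iid Gaussian setting that this paper generalizes, the sketch below (my own illustration, not code from the paper) computes SURE for the soft-threshold estimator and picks the threshold minimizing it, in the style of SureShrink wavelet denoising:

```python
import numpy as np

def soft_threshold(y, t):
    """Soft-thresholding estimator: shrink each coefficient toward zero by t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def sure_soft(y, t, sigma2=1.0):
    """Stein's unbiased estimate of the MSE of soft thresholding for
    y = x + noise with iid N(0, sigma2) noise (the classical iid case)."""
    n = y.size
    return (n * sigma2
            + np.sum(np.minimum(y**2, t**2))
            - 2.0 * sigma2 * np.count_nonzero(np.abs(y) <= t))

def sure_shrink(y, sigma2=1.0):
    """Choose the threshold minimizing SURE over the candidate set {|y_i|}."""
    candidates = np.abs(y)
    risks = np.array([sure_soft(y, t, sigma2) for t in candidates])
    return candidates[np.argmin(risks)]
```

    A quick sanity check on the formula: at t = 0 the estimator is the identity and SURE equals n*sigma2, its exact risk; as t grows, SURE tends to sum(y**2) - n*sigma2, an unbiased estimate of the signal energy.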

    A Semidefinite Programming Approach to Optimal Unambiguous Discrimination of Quantum States

    In this paper we consider the problem of unambiguous discrimination between a set of linearly independent pure quantum states. We show that the design of the optimal measurement that minimizes the probability of an inconclusive result can be formulated as a semidefinite programming problem. Based on this formulation, we develop a set of necessary and sufficient conditions for an optimal quantum measurement. We show that the optimal measurement can be computed very efficiently in polynomial time by exploiting the many well-known algorithms for solving semidefinite programs, which are guaranteed to converge to the global optimum. Using the general conditions for optimality, we derive necessary and sufficient conditions so that the measurement that results in an equal probability of an inconclusive result for each one of the quantum states is optimal. We refer to this measurement as the equal-probability measurement (EPM). We then show that for any state set, the prior probabilities of the states can be chosen such that the EPM is optimal. Finally, we consider state sets with strong symmetry properties and equal prior probabilities for which the EPM is optimal. We first consider geometrically uniform state sets that are defined over a group of unitary matrices and are generated by a single generating vector. We then consider compound geometrically uniform state sets which are generated by a group of unitary matrices using multiple generating vectors, where the generating vectors satisfy a certain (weighted) norm constraint.
    Comment: To appear in IEEE Transactions on Information Theory
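    The two-state special case has a well-known closed-form solution (Ivanovic-Dieks-Peres): for equal priors the optimal failure probability equals the overlap |<psi1|psi2>|. The numpy sketch below (an illustration of that special case, not the paper's SDP algorithm) builds the unambiguous POVM and checks positivity, unambiguity, and the failure probability:

```python
import numpy as np

# Two linearly independent pure states with overlap <psi1|psi2> = cos(theta).
theta = np.pi / 3
psi1 = np.array([1.0, 0.0])
psi2 = np.array([np.cos(theta), np.sin(theta)])

# Each conclusive element is proportional to the projector onto the
# complement of the *other* state, so it can never fire on that state.
phi1 = np.array([np.sin(theta), -np.cos(theta)])   # orthogonal to psi2
phi2 = np.array([0.0, 1.0])                        # orthogonal to psi1
a = 1.0 / (1.0 + np.cos(theta))                    # largest weight keeping E0 >= 0
E1 = a * np.outer(phi1, phi1)
E2 = a * np.outer(phi2, phi2)
E0 = np.eye(2) - E1 - E2                           # inconclusive outcome

# Unambiguity and positivity of the inconclusive element.
assert abs(psi2 @ E1 @ psi2) < 1e-12
assert abs(psi1 @ E2 @ psi1) < 1e-12
assert np.all(np.linalg.eigvalsh(E0) > -1e-12)

# Failure probability for equal priors equals the overlap cos(theta) = 0.5.
p_fail = 0.5 * (psi1 @ E0 @ psi1) + 0.5 * (psi2 @ E0 @ psi2)
print(round(p_fail, 6))  # 0.5
```

    Here both states fail with the same probability, so this measurement is exactly the EPM of the abstract for this symmetric pair; the paper's SDP framework is what handles larger, less symmetric state sets.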

    Unicity conditions for low-rank matrix recovery

    Low-rank matrix recovery addresses the problem of recovering an unknown low-rank matrix from few linear measurements. Nuclear-norm minimization is a tractable approach with a recent surge of strong theoretical backing. Analogous to the theory of compressed sensing, these results have required random measurements. For example, m >= Cnr Gaussian measurements are sufficient to recover any rank-r n x n matrix with high probability. In this paper we address the theoretical question of how many measurements are needed via any method whatsoever, tractable or not. We show that for a family of random measurement ensembles, m >= 4nr - 4r^2 measurements are sufficient to guarantee that no rank-2r matrix lies in the null space of the measurement operator with probability one. This is a necessary and sufficient condition to ensure uniform recovery of all rank-r matrices by rank minimization. Furthermore, this value of m precisely matches the dimension of the manifold of all rank-2r matrices. We also prove that for a fixed rank-r matrix, m >= 2nr - r^2 + 1 random measurements are enough to guarantee recovery using rank minimization. These results give a benchmark to which we may compare the efficacy of nuclear-norm minimization.
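    The dimension count behind the m >= 4nr - 4r^2 bound can be checked numerically: the manifold of rank-k n x n matrices has dimension k(2n - k), which at k = 2r equals exactly 4nr - 4r^2. The numpy sketch below (illustrative, not from the paper) computes the tangent-space dimension at a generic rank-k point X = U V^T, whose tangent space is spanned by perturbations A V^T + U B^T:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 6, 1
k = 2 * r                      # rank whose manifold dimension matches the bound
U = rng.standard_normal((n, k))
V = rng.standard_normal((n, k))

# Stack the 2nk spanning tangent vectors as rows; the k x k block of
# directions shared between the two factors is counted twice, so the
# rank should be 2nk - k^2 = k(2n - k).
tangent = []
for i in range(n):
    for j in range(k):
        A = np.zeros((n, k)); A[i, j] = 1.0
        tangent.append((A @ V.T).ravel())   # perturb the left factor U
        tangent.append((U @ A.T).ravel())   # perturb the right factor V
dim = np.linalg.matrix_rank(np.array(tangent))
print(dim, k * (2 * n - k), 4 * n * r - 4 * r**2)  # all three agree (20)
```

    With n = 6 and r = 1 all three quantities equal 20, matching the abstract's claim that the measurement bound coincides with the dimension of the rank-2r manifold.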