
    Editing the gene editing debate: Reassessing the normative discussions on emerging genetic technologies

    The revolutionary potential of the CRISPR-Cas9 gene editing technique has created a resurgence of enthusiasm and concern in genetic research perhaps not seen since the mapping of the human genome at the turn of the century. Some of these concerns and anxieties revolve around crossing the lines between somatic and germline interventions, and between treatment and enhancement applications. Underpinning these concerns are familiar concepts of safety, unintended consequences, damage to genetic identity, and the creation of designer children through the pursuit of human enhancement and eugenics. In the policy realm, these morally laden distinctions and anxieties are emerging as the basis for important applied measures responding to fast-evolving scientific developments. This paper argues that the dominant normative framing for such responses is insufficient for this task. It illustrates that this insufficiency arises from a continued reliance on misleading genetic essentialist assumptions that generate groundless speculation and over-reactionary normative responses. This phenomenon is explicit with regard to prospective human (germline) genetic enhancements. While many normative theorists and state-of-the-art reports continue to gesture toward the influence of environmental and social factors on a person and their traits and capacities, this recognition does not extend to the substance of the arguments themselves, which tend to revert to the debunked genetic determinist framework. Given the above, this paper argues that there is a pressing need for a more central role for sociological input into particular aspects of this “enhancement myth”, in order to give added weight, detail and substance to these environmental and social-structural influences.

    Genome editing and ‘disenhancement’: Considerations on issues of non-identity and genetic pluralism

    In the decade prior to CRISPR-Cas9, Michael Parker criticised Julian Savulescu’s Procreative Beneficence (PB) Principle by arguing against the confidence to know what’s best in terms of genetic traits for our offspring. One important outcome of this criticism was a greater moral acceptance of deaf people genetically selecting deaf children. Although this outcome may have been morally controversial in an impersonal harm context, in such genetic selection (PGD) cases a deaf child is not harmed in person-affecting terms because no other life is available to that child. We highlight that the person-affecting versus impersonal harm distinction is still held by many as making a significant moral difference to their overall argument (i.e. Savulescu, Parker, Boardman, De Miguel Beriain), and so for the purposes of this paper we will assume it makes ‘some difference’ (even if only at the level of the message it sends out). Insofar as one considers the presence of person-affecting harm to be morally important (and to whatever extent), the impersonal harm context in which the Parker–Savulescu debate arose thereby blunts an arguably even more radical outcome—that of genetically engineering, or gene editing, deafness into pre-existing embryos of future children. Now, the potential of CRISPR-Cas9 has revitalised such debates by reframing impersonal and person-affecting benefits/harms in the context of such disputes on the harm or not of a (chosen) disability. Replacing the genetic selection context with a genome editing context, we argue that Parker’s argument should also deem it morally acceptable for people who are deaf to genetically edit embryos to become children who are also deaf. Felicity Boardman’s recent comments suggest a similar radical potential as Parker’s, with the radicalness also blunted by an impersonal context (a context that Boardman, at least, sees as significant). We conclude that the genome editing reframing will push such arguments beyond what was originally intended, and that this will create a more radical message that may help further define the relationship between new genomic technologies and disability.

    Genuine participation in participant-centred research initiatives: the rhetoric and the potential reality

    The introduction of Web 2.0 technology, along with a population increasingly proficient in Information and Communications Technology (ICT), coupled with rapid advancements in genetic testing methods, has seen an increase in the presence of participant-centred research initiatives. Such initiatives, aided by the centrality of ICT interconnections, and the ethos they propound, seem to further embody the ideal of increasing the participatory nature of research beyond what might be possible in non-ICT contexts alone. However, the majority of such research seems to actualise a much narrower definition of ‘participation’—one in which such research initiatives merely have increased contact with participants through ICT but are otherwise non-participatory in any important normative sense. Furthermore, the rhetoric of participant-centred initiatives tends to inflate this minimalist form of participation into something that it is not, i.e. something genuinely participatory, with greater connections with both the ICT-facilitated political contexts and the largely non-ICT participatory initiatives that have expanded in contemporary health and research contexts. In this paper, we highlight that genuine (ICT-based) ‘participation’ should enable a reasonable minimum threshold of participatory engagement through at least three central participatory elements: an educative element, a sense of being involved, and a degree of control. While we agree with criticisms that, at present, genuine participation seems more rhetoric than reality, we believe that there is clear potential for greater ICT-facilitated participatory engagement on all three participatory elements. We outline some practical steps such initiatives could take to further develop these elements and thereby their level of ICT-facilitated participatory engagement.

    An EU comparative analysis of the regulation of clinical trials supervisory bodies in the aftermath of Regulation 536/2014

    The new EU regulation on clinical trials is intended to promote a greater level of harmonization of European Union rules in this area. However, it does not elaborate a common normative framework regarding the functioning of research ethics committees, leaving this responsibility to the Member States. This article offers a comparative analysis of the resulting regulatory situation. It demonstrates that this scenario is defined by considerable variability in the regulation of ethics monitoring between the EU Member States. We argue that this disparity should not necessarily be a negative factor for the optimization of the trial supervision regime in the EU. Moreover, we consider that it may be a stimulus for the achievement of excellence in the performance of this monitoring task. On the other hand, we also highlight risks for the rights of participants if an adequate monitoring framework is not ensured. Under these circumstances, we observe that the EU faces a dilemma. On the one hand, it may promote a rigid uniformity in the regulation of ethics committees across Member States, but this might diminish the quality of their performance. On the other hand, it may opt for maintaining the current situation, but this might increase differences in the performance of the ethics committees between Member States, including the number of trials performed by country. A third option would be to allow the competitive framework to remain for a set period of time, in order to learn from the best practices reached in individual Member States before finally harmonizing national legislative provisions on this basis.

    This work was supported by Eusko Jaurlaritza [grant number Ayudas a grupos de investigación IT-1066-16] and by H2020 Science with and for Society [grant agreement number 788039 — PANELFIT].

    Structural study of a new HIV-1 entry inhibitor and interaction with the HIV-1 fusion peptide in dodecylphosphocholine micelles

    Previous studies support the hypothesis that the envelope E1 protein of GB virus C (GBV-C) interferes with HIV-1 entry, and a peptide derived from region 139-156 of this protein has been defined as a novel HIV-1 entry inhibitor. In this work, we focus first on the characterization of the structural features of this peptide that are determinant for its anti-HIV-1 activity and, second, on the study of its interaction with the proposed viral target (i.e., the HIV-1 fusion peptide). We report the structure of the peptide, determined by NMR spectroscopy in dodecylphosphocholine (DPC) micelles and solved using restrained molecular dynamics calculations. The acquisition of different NMR experiments in DPC micelles (i.e., peptide-peptide titration, diffusion NMR spectroscopy, and addition of paramagnetic relaxation agents) allows us to propose an inhibition mechanism. We conclude that an 18-mer peptide from the non-pathogenic GBV-C E1 protein, with a helix-turn-helix structure, inhibits HIV-1 by binding to the HIV-1 fusion peptide at the membrane level, thereby interfering with those domains of HIV-1 that are critical for stabilizing six-helix bundle formation in a membranous environment.

    Making maps of cosmic microwave background polarization for B-mode studies: The POLARBEAR example

    Analysis of cosmic microwave background (CMB) datasets typically requires some filtering of the raw time-ordered data. For instance, in the context of ground-based observations, filtering is frequently used to minimize the impact of low frequency noise, atmospheric contributions and/or scan synchronous signals on the resulting maps. In this work we have explicitly constructed a general filtering operator, which can unambiguously remove any set of unwanted modes in the data, and then amended the map-making procedure in order to incorporate and correct for it. We show that such an approach is mathematically equivalent to the solution of a problem in which the sky signal and unwanted modes are estimated simultaneously and the latter are marginalized over. We investigated the conditions under which this amended map-making procedure can render an unbiased estimate of the sky signal in realistic circumstances. We then discuss the potential implications of these observations for the choice of map-making and power spectrum estimation approaches in the context of B-mode polarization studies. Specifically, we have studied the effects of time-domain filtering on the noise correlation structure in the map domain, as well as the impact it may have on the performance of the popular pseudo-spectrum estimators. We conclude that although maps produced by the proposed estimators arguably provide the most faithful representation of the sky possible given the data, they may not straightforwardly lead to the best constraints on the power spectra of the underlying sky signal, and special care may need to be taken to ensure this is the case. By contrast, simplified map-makers which do not explicitly correct for time-domain filtering, but leave it to subsequent steps in the data analysis, may perform equally well and be easier and faster to implement. We focus on polarization-sensitive measurements targeting the B-mode component of the CMB signal and apply the proposed methods to realistic simulations based on the characteristics of an actual CMB polarization experiment, POLARBEAR. Our analysis and conclusions are, however, more generally applicable. © ESO, 2017
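    The filtering-aware estimator described above can be sketched schematically; the notation here is a standard map-making convention assumed for illustration, not taken verbatim from the paper ($d$: time-ordered data, $A$: pointing matrix, $N$: time-domain noise covariance, $Z$: a matrix whose columns span the unwanted modes):

    ```latex
    % Filtering operator projecting out the modes spanned by the columns of Z:
    F_Z = \mathbb{1} - Z \left( Z^{\mathsf{T}} Z \right)^{-1} Z^{\mathsf{T}}

    % Amended map estimate, incorporating and correcting for the filtering;
    % this coincides with jointly fitting the sky signal and the unwanted-mode
    % amplitudes and marginalizing over the latter:
    \hat{m} = \left( A^{\mathsf{T}} F_Z^{\mathsf{T}} N^{-1} F_Z A \right)^{-1}
              A^{\mathsf{T}} F_Z^{\mathsf{T}} N^{-1} F_Z \, d
    ```

    Setting $F_Z = \mathbb{1}$ recovers the ordinary generalized least-squares map-maker, which is the sense in which the simplified map-makers mentioned above defer the filtering correction to later analysis steps.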

    Identification of unique neoantigen qualities in long-term survivors of pancreatic cancer

    Pancreatic ductal adenocarcinoma is a lethal cancer with fewer than 7% of patients surviving past 5 years. T-cell immunity has been linked to the exceptional outcome of the few long-term survivors [1, 2], yet the relevant antigens remain unknown. Here we use genetic, immunohistochemical and transcriptional immunoprofiling, computational biophysics, and functional assays to identify T-cell antigens in long-term survivors of pancreatic cancer. Using whole-exome sequencing and in silico neoantigen prediction, we found that tumours with both the highest neoantigen number and the most abundant CD8+ T-cell infiltrates, but neither alone, stratified patients with the longest survival. Investigating the specific neoantigen qualities promoting T-cell activation in long-term survivors, we discovered that these individuals were enriched in neoantigen qualities defined by a fitness model, and in neoantigens in the tumour antigen MUC16 (also known as CA125). A neoantigen quality fitness model conferring greater immunogenicity to neoantigens with differential presentation and homology to infectious disease-derived peptides identified long-term survivors in two independent datasets, whereas a neoantigen quantity model ascribing greater immunogenicity to increasing neoantigen number alone did not. We detected intratumoural and lasting circulating T-cell reactivity to both high-quality and MUC16 neoantigens in long-term survivors of pancreatic cancer, including clones with specificity to both high-quality neoantigens and predicted cross-reactive microbial epitopes, consistent with neoantigen molecular mimicry. Notably, we observed selective loss of high-quality and MUC16 neoantigenic clones on metastatic progression, suggesting neoantigen immunoediting. Our results identify neoantigens with unique qualities as T-cell targets in pancreatic ductal adenocarcinoma. More broadly, we identify neoantigen quality as a biomarker for immunogenic tumours that may guide the application of immunotherapies.
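    The quality-versus-quantity distinction drawn above can be illustrated with a minimal sketch. This is a toy model with hypothetical names and numbers, not the authors' code: "quality" is taken, in the spirit of the fitness model described in the abstract, as differential MHC presentation (amplitude) times a homology-based recognition probability, while "quantity" is just the neoantigen count.

    ```python
    # Toy illustration (hypothetical values) of scoring a tumour by neoantigen
    # quality versus neoantigen quantity, following the idea described above.

    def neoantigen_quality(wt_affinity_nm, mut_affinity_nm, recognition_prob):
        """Higher when the mutant peptide binds MHC better than wild type
        (amplitude > 1) and when it resembles infectious-disease epitopes."""
        amplitude = wt_affinity_nm / mut_affinity_nm  # differential presentation
        return amplitude * recognition_prob

    def score_tumour(neoantigens, by="quality"):
        """Score a tumour either by its neoantigen count or by its single
        fittest (highest-quality) neoantigen clone."""
        if by == "quantity":
            return len(neoantigens)
        return max(neoantigen_quality(*n) for n in neoantigens)

    # Two hypothetical tumours: many weakly immunogenic neoantigens versus a
    # few strong ones. Tuples are (wild-type nM, mutant nM, recognition prob).
    many_weak = [(500.0, 450.0, 0.01)] * 20
    few_strong = [(500.0, 25.0, 0.8), (300.0, 200.0, 0.05)]

    print(score_tumour(many_weak, by="quantity"))   # → 20
    print(score_tumour(few_strong, by="quality"))   # → 16.0
    ```

    Under this toy scoring, the tumour with many weak neoantigens wins on quantity but the tumour with one strong clone wins on quality, mirroring the abstract's finding that the quantity model alone failed to identify long-term survivors.
    
    
    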