    Designing personalised cancer treatments

    The concept of personalised medicine for cancer is not new. It arguably began with the attempts by Salmon and Hamburger to produce a viable cellular chemosensitivity assay in the 1970s, and it continues to this day. While clonogenic assays soon fell out of favour due to their high failure rate, other cellular assays fared better, and although they have not entered widespread clinical practice, they have proved to be very useful research tools. For instance, the ATP-based chemosensitivity assay, developed in the early 1990s, is highly standardised. It has proved useful for evaluating new drugs and combinations, and in recent years it has been used to understand the molecular basis of resistance and sensitivity to anti-cancer drugs. Recent developments allow unparalleled genotyping and phenotyping of tumours, providing a plethora of targets for the development of new cancer treatments. However, validation of such targets and of new agents to permit translation to the clinic remains difficult. One major disappointment has been that cell lines often fail to reflect the behaviour of their parent cancers with sufficient fidelity to be useful. Low-passage cell lines, either in culture or as xenografts, are being used to overcome some of these issues, but they have several problems of their own. Primary cell culture remains useful, but large tumours are likely to receive neo-adjuvant treatment before removal, which limits the tumour types that can be studied. The development of new treatments remains difficult, and prediction of the clinical efficacy of new treatments from pre-clinical data is as hard as ever. One lesson has certainly been that one cannot buck the biology: understanding the genome alone is not sufficient to guarantee success. Nowhere has this been more evident than in the development of EGFR inhibitors. Despite overexpression of EGFR by many tumour types, only those with activating EGFR mutations and an inability to circumvent EGFR blockade have proved susceptible to treatment. The challenge is how to use advanced molecular understanding, together with limited cellular assay information, to improve both drug development and the design of companion diagnostics to guide their use. This has the capacity to remove much of the guesswork from the process and should improve success rates.

    The 2016 Reactivations of Main-Belt Comets 238P/Read and 288P/(300163) 2006 VW139

    We report observations of the reactivations of main-belt comets 238P/Read and 288P/(300163) 2006 VW139, which also track the evolution of each object's activity over several months in 2016 and 2017. We additionally identify and analyze archival SDSS data showing 288P to be active in 2000, meaning that both 238P and 288P have now each been confirmed to be active near perihelion on three separate occasions. From data obtained for 288P from 2012 to 2015, when it appeared inactive, we find best-fit R-band H,G phase function parameters of H_R=16.80+/-0.12 mag and G_R=0.18+/-0.11, corresponding to effective component radii of r_c=0.80+/-0.04 km, assuming a binary system with equally sized components. Fitting linear functions to the ejected dust masses inferred for 238P and 288P soon after their observed reactivations in 2016, we find an initial average net dust production rate of 0.7+/-0.3 kg/s and a best-fit start date of 2016 March 11 (when the object was at a true anomaly of -63 deg) for 238P, and an initial average net dust production rate of 5.6+/-0.7 kg/s and a best-fit start date of 2016 August 5 (when the object was at a true anomaly of -27 deg) for 288P. Applying similar analyses to archival data, we find similar start points for previous active episodes for both objects, suggesting that minimal mantle growth or ice recession occurred between the active episodes in question. Some changes in dust production rates between active episodes are detected, however. More detailed dust modeling is suggested to further clarify the process of activity evolution in main-belt comets.
    Comment: 21 pages, 9 figures, accepted by A
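
    To reproduce the order of magnitude of the radius estimate above, one can use the standard conversion from absolute magnitude to diameter, D(km) = 1329 p_V^(-1/2) 10^(-H_V/5). The sketch below is illustrative only: the geometric albedo and the V-R colour used to convert H_R to H_V are assumed values, not those adopted in the paper.

        import math

        def radius_km(H_R, p_V=0.05, V_minus_R=0.36):
            """Effective radius (km) from an R-band absolute magnitude.

            p_V and V_minus_R are assumed values (a typical cometary
            geometric albedo and a roughly solar colour), not the
            paper's adopted parameters.
            """
            H_V = H_R + V_minus_R                    # convert R-band H to V-band
            diameter = 1329.0 / math.sqrt(p_V) * 10 ** (-H_V / 5)
            return diameter / 2.0

        r_eff = radius_km(16.80)                     # total effective radius
        r_comp = r_eff / math.sqrt(2)                # equal-sized binary components
        print(f"r_eff = {r_eff:.2f} km, per component = {r_comp:.2f} km")

    With these assumptions the result lands near the quoted 0.80 km per component; the exact figure depends on the albedo and colour adopted.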

    Study of aluminoborane compound AlB_4H_(11) for hydrogen storage

    Aluminoborane compounds AlB_4H_(11), AlB_5H_(12), and AlB_6H_(13) were reported by Himpsl and Bond in 1981, but they have eluded the attention of the worldwide hydrogen storage research community for more than a quarter of a century. These aluminoborane compounds have very attractive properties for hydrogen storage: high hydrogen capacity (13.5, 12.9, and 12.4 wt % H, respectively) and an attractive hydrogen desorption temperature (AlB_4H_(11) decomposes at ~125 °C). We have synthesized AlB_4H_(11) and studied its thermal desorption behavior using temperature-programmed desorption with mass spectrometry, gas volumetric (Sieverts) measurement, infrared (IR) spectroscopy, and solid-state nuclear magnetic resonance (NMR). Rehydrogenation of the hydrogen-desorbed products was performed, and we observed encouraging evidence of at least partial reversibility under relatively mild conditions. Our chemical analysis indicates that the formula of the compound is closer to AlB_4H_(12) than to AlB_4H_(11).
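
    The quoted gravimetric capacities follow directly from the molecular formulae; a minimal sketch using standard atomic masses reproduces them to within about 0.1 wt %. AlB_4H_(12), the formula suggested by the chemical analysis, is included for comparison.

        # Gravimetric hydrogen capacity (wt % H) from a composition dict.
        ATOMIC_MASS = {"Al": 26.982, "B": 10.811, "H": 1.008}

        def wt_percent_H(composition):
            total = sum(ATOMIC_MASS[el] * n for el, n in composition.items())
            return 100.0 * ATOMIC_MASS["H"] * composition["H"] / total

        for name, comp in [("AlB4H11", {"Al": 1, "B": 4, "H": 11}),
                           ("AlB5H12", {"Al": 1, "B": 5, "H": 12}),
                           ("AlB6H13", {"Al": 1, "B": 6, "H": 13}),
                           ("AlB4H12", {"Al": 1, "B": 4, "H": 12})]:
            print(f"{name}: {wt_percent_H(comp):.1f} wt % H")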

    Simulation of quantum random walks using the interference of a classical field

    We suggest a theoretical scheme for the simulation of quantum random walks on a line using beam splitters, phase shifters, and photodetectors. Our model enables us to simulate a quantum random walk using the wave nature of classical light fields. Furthermore, the proposed set-up allows analysis of the effects of decoherence. The transition from a pure mean photon-number distribution to a classical one is studied by varying the decoherence parameters.
    Comment: extensively revised version; title changed; to appear in Phys. Rev.
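
    For orientation, the dynamics the optical network simulates is the standard coined (Hadamard) walk on a line. The sketch below is not the paper's beam-splitter construction; it is a generic numerical walk in which, as one common stand-in for decoherence (an assumption here, not the paper's model), the coin is measured with probability p_dec at each step, driving the ballistic twin-peaked quantum distribution towards the classical binomial one.

        import numpy as np

        def hadamard_walk(steps, runs=500, p_dec=0.0, seed=0):
            """Position distribution of a coined walk on a line.

            p_dec is the per-step probability of measuring the coin, a
            simple stand-in for the decoherence studied in the paper.
            """
            rng = np.random.default_rng(seed)
            n = 2 * steps + 1                          # positions -steps..+steps
            H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
            dist = np.zeros(n)
            for _ in range(runs):
                psi = np.zeros((n, 2), dtype=complex)
                psi[steps] = np.array([1, 1j]) / np.sqrt(2)   # symmetric start
                for _ in range(steps):
                    psi = psi @ H.T                    # coin toss
                    new = np.zeros_like(psi)
                    new[1:, 0] = psi[:-1, 0]           # coin 0 steps right
                    new[:-1, 1] = psi[1:, 1]           # coin 1 steps left
                    psi = new
                    if rng.random() < p_dec:           # collapse the coin
                        p0 = (np.abs(psi[:, 0]) ** 2).sum()
                        c = 0 if rng.random() < p0 else 1
                        psi[:, 1 - c] = 0
                        psi /= np.linalg.norm(psi)
                dist += (np.abs(psi) ** 2).sum(axis=1)
            return dist / runs

        quantum = hadamard_walk(50, runs=1)            # p_dec=0: deterministic
        classical = hadamard_walk(50, p_dec=1.0)       # coin measured every step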

    Fashioning Circuits

    Curatorial note from Digital Pedagogy in the Humanities: Kim Knight’s course materials for Fashioning Circuits bring together the history of fashion and wearable electronics to explore the effects of media on bodies at the intersections of race, gender, class, ability, and sexuality. After completing small-scale projects that introduce them to creating wearables, and participating in class discussions about intersectionality and media culture, students produce social-justice-oriented wearable projects intended to provide a solution to a problem, make a statement, or create a social intervention. This hands-on experience in critical making is accompanied by discussion of the affordances and limitations of fashion and its relationship with wearable electronics. While other course materials that blend new media with intersectionality tend to emphasize analysis and multimodal writing to assess student outcomes, Fashioning Circuits asks students to perform the critiques they are making by creating digital objects of a different kind (LED safety jackets for dogs, a carbon-monoxide-sensing hat, an anti-anxiety bracelet) to demonstrate their understanding of intersectionality and technology. Instructors can incorporate Knight’s course materials, whether prototyping or implementation exercises, to offer students hands-on experience of social justice innovation.

    Optimal estimation of joint parameters in phase space

    We address the joint estimation of the two defining parameters of a displacement operation in phase space. In a measurement scheme based on a Gaussian probe field and two homodyne detectors, we show that both conjugate parameters can be measured below the standard quantum limit when the probe field is entangled. We derive the most informative Cramér-Rao bound, providing the theoretical benchmark for the estimation, and observe that our scheme is nearly optimal over a wide range of parameters characterizing the probe field. We discuss the role of entanglement as well as the relation between our measurement strategy and the generalized uncertainty relations.
    Comment: 8 pages, 3 figures; v2: references added and sections added to the supplemental material; v3: minor changes (published version)
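
    To see operationally what a Cramér-Rao bound asserts: if each measurement outcome is Gaussian-distributed about the unknown displacement (x0, p0), the Fisher information matrix is the inverse of the outcome covariance, and the sample mean saturates the bound. The covariance below is hypothetical, and this classical sketch does not reproduce the paper's most informative bound for an entangled probe; it only illustrates the benchmark being computed.

        import numpy as np

        rng = np.random.default_rng(1)
        theta = np.array([1.0, -0.5])             # true displacement (x0, p0)
        Sigma = np.array([[0.5, 0.2],             # hypothetical outcome covariance
                          [0.2, 0.8]])
        M, trials = 500, 2000                     # shots per run, Monte Carlo runs

        # Gaussian location model: Fisher information = inv(Sigma), so the
        # Cramer-Rao bound on the covariance of any unbiased estimator from
        # M shots is Sigma / M; the sample mean attains it.
        estimates = rng.multivariate_normal(theta, Sigma,
                                            size=(trials, M)).mean(axis=1)
        print("CRB:\n", Sigma / M)
        print("Monte Carlo covariance of the sample mean:\n",
              np.cov(estimates.T))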

    Entanglement induced by a single-mode heat environment

    A thermal field, which frequently appears in problems of decoherence, provides minimal information about its state. We study the interaction of a thermal field with a quantum system composed of two qubits and find that such a chaotic field, carrying minimal information, can nevertheless entangle qubits prepared initially in a separable state. This simple model of a quantum register interacting with a noisy environment allows us to understand how memory of the environment affects the state of the register.
    Comment: 13 pages, 3 figures
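
    A quick numerical illustration of the effect, not the paper's exact model: two ground-state qubits resonantly coupled to a single truncated thermal mode, with the concurrence of the reduced two-qubit state tracked in time. The coupling g, mean photon number nbar, time scale, and Fock-space truncation are assumed values, and QuTiP is used for convenience.

        import numpy as np
        from qutip import (tensor, qeye, destroy, sigmam, basis,
                           thermal_dm, mesolve, concurrence)

        N, nbar, g = 15, 0.5, 1.0       # truncation, occupancy, coupling (assumed)

        a   = tensor(destroy(N), qeye(2), qeye(2))
        sm1 = tensor(qeye(N), sigmam(), qeye(2))
        sm2 = tensor(qeye(N), qeye(2), sigmam())

        # Resonant Tavis-Cummings coupling of both qubits to one thermal mode
        H = g * (a.dag() * (sm1 + sm2) + a * (sm1.dag() + sm2.dag()))

        ground = basis(2, 1) * basis(2, 1).dag()       # qubit ground state
        rho0 = tensor(thermal_dm(N, nbar), ground, ground)

        tlist = np.linspace(0.0, 5.0, 101)
        states = mesolve(H, rho0, tlist, [], []).states
        C = [concurrence(rho.ptrace([1, 2])) for rho in states]
        print(f"max concurrence: {max(C):.3f}")        # > 0: qubits entangled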