
    Algorithm and performance of a clinical IMRT beam-angle optimization system

    This paper describes the algorithm and examines the performance of an IMRT beam-angle optimization (BAO) system. In this algorithm, successive sets of beam angles are selected from a set of predefined directions using a fast simulated annealing (FSA) algorithm. An IMRT beam-profile optimization is performed on each generated set of beams. The IMRT optimization is accelerated by using a fast dose calculation method that utilizes a precomputed dose kernel. A compact kernel is constructed for each of the predefined beams prior to starting the FSA algorithm. The IMRT optimizations during the BAO are then performed using these kernels in a fast dose calculation engine. This technique allows the IMRT optimization to be performed more than two orders of magnitude faster than a similar optimization that uses a convolution dose calculation engine. Comment: Final version that appeared in Phys. Med. Biol. 48 (2003) 3191-3212. Original EPS figures have been converted to PNG files due to size limitations.
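    As a rough illustration of the search strategy described above, the sketch below runs a fast-simulated-annealing loop over candidate gantry angles. The scoring function score_beam_set is an invented stand-in for the kernel-based IMRT fluence optimization (the expensive inner step that the precomputed kernels accelerate); all names and parameters here are illustrative, not taken from the paper.

```python
import math
import random

def score_beam_set(angles):
    # Toy surrogate for the kernel-based IMRT optimization cost:
    # penalize uneven angular spacing around the 360-degree gantry circle.
    a = sorted(angles)
    gaps = [(b - x) % 360 for x, b in zip(a, a[1:] + a[:1])]
    ideal = 360.0 / len(a)
    return sum((g - ideal) ** 2 for g in gaps)

def beam_angle_fsa(candidates, n_beams=5, steps=500, t0=100.0, seed=0):
    rng = random.Random(seed)
    current = rng.sample(candidates, n_beams)
    current_cost = score_beam_set(current)
    best, best_cost = list(current), current_cost
    for k in range(1, steps + 1):
        temp = t0 / k  # fast-annealing schedule T_k = T_0 / k
        # Propose a new set: swap one beam for an unused candidate direction.
        trial = list(current)
        trial[rng.randrange(n_beams)] = rng.choice(
            [a for a in candidates if a not in current])
        trial_cost = score_beam_set(trial)
        # Metropolis rule: accept improvements always, regressions sometimes,
        # so the search can escape local minima.
        if trial_cost <= current_cost or \
           rng.random() < math.exp((current_cost - trial_cost) / temp):
            current, current_cost = trial, trial_cost
            if current_cost < best_cost:
                best, best_cost = list(current), current_cost
    return best, best_cost

if __name__ == "__main__":
    candidates = list(range(0, 360, 10))  # predefined candidate directions
    print(beam_angle_fsa(candidates))
```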

    Organization of Block Copolymers using NanoImprint Lithography: Comparison of Theory and Experiments

    We present NanoImprint lithography experiments and modeling of thin films of block copolymers (BCP). The NanoImprint lithography is used to align lamellar phases perpendicularly, over distances much larger than the natural lamellar periodicity. The modeling relies on self-consistent field calculations done in two and three dimensions. We obtain good agreement between the calculations and the NanoImprint lithography setups. We find that, at thermodynamic equilibrium, the ordered BCP lamellae are much better aligned than when the films are deposited on uniform planar surfaces.
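    For readers unfamiliar with self-consistent field calculations, the sketch below iterates a minimal one-dimensional SCFT for a symmetric diblock melt: chain propagators are solved pseudo-spectrally, monomer densities are assembled from them, and the fields are updated by damped mixing until self-consistency. The paper's calculations are two- and three-dimensional and include the imprint geometry; the 1D setting and every parameter here are an invented toy.

```python
import numpy as np

f, chiN = 0.5, 20.0      # block fraction and segregation strength (invented)
Nz, Ns = 64, 200         # spatial grid points and chain-contour steps
L = 4.0                  # box length in units of R_g, near one lamellar period
dz, ds = L / Nz, 1.0 / Ns
z = np.arange(Nz) * dz
k2 = (2 * np.pi * np.fft.fftfreq(Nz, d=dz)) ** 2

def propagate(w_of_s):
    # Pseudo-spectral operator splitting for dq/ds = laplacian(q) - w q.
    q = np.ones((Ns + 1, Nz))
    expk = np.exp(-ds * k2)
    for s in range(Ns):
        expw = np.exp(-0.5 * ds * w_of_s(s))
        q[s + 1] = expw * np.fft.ifft(expk * np.fft.fft(expw * q[s])).real
    return q

wA = chiN * 0.1 * np.cos(2 * np.pi * z / L)  # seeded lamellar guess
wB = -wA
sA = int(f * Ns)                             # contour step where block A ends

for it in range(500):
    q  = propagate(lambda s: wA if s < sA else wB)       # from the s = 0 end
    qd = propagate(lambda s: wB if s < Ns - sA else wA)  # from the s = 1 end
    Q = q[Ns].mean()                                     # partition function
    prod = q * qd[::-1]       # q(z, s) * qdag(z, 1 - s) along the contour
    phiA = prod[:sA + 1].sum(axis=0) * ds / Q
    phiB = prod[sA:].sum(axis=0) * ds / Q
    # Self-consistency: wA = chiN*phiB + xi, wB = chiN*phiA + xi, with the
    # pressure field xi enforcing phiA + phiB = 1; damped Picard mixing.
    xi = 0.5 * (wA + wB - chiN)
    wA += 0.1 * (chiN * phiB + xi - wA)
    wB += 0.1 * (chiN * phiA + xi - wB)

print("A-block density across one period:", np.round(phiA, 2))
```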

    Long-Term Outcome of Otherwise Healthy Individuals with Incidentally Discovered Borderline Thrombocytopenia

    BACKGROUND: The long-term outcome of individuals with mild degrees of thrombocytopenia is unknown. METHODS AND FINDINGS: In a prospective study conducted between August 1992 and December 2002, 260 apparently healthy individuals with incidentally discovered platelet counts between 100 × 10(9)/l and 150 × 10(9)/l were monitored for 6 mo to determine whether their condition persisted. The monitoring period was completed in 217 cases, of whom 191 (88%) maintained stable platelet counts. These 191 individuals were included in a long-term follow-up study to gain knowledge of their natural history. With a median observation time of 64 mo, the thrombocytopenia resolved spontaneously or persisted with no other disorders becoming apparent in 64% of cases. The most frequent event during the study period was the subsequent development of an autoimmune disease. The 10-y probability of developing idiopathic thrombocytopenic purpura (ITP), as defined by platelet counts persistently below 100 × 10(9)/l, was 6.9% (95% confidence interval [CI]: 4.0%–12.0%). The 10-y probability of developing autoimmune disorders other than ITP was 12.0% (95% CI: 6.9%–20.8%). Most cases (85%) of autoimmune disease occurred in women. CONCLUSIONS: Healthy individuals with a sustained platelet count between 100 × 10(9)/l and 150 × 10(9)/l have a 10-y probability of developing autoimmune disorders of 12%. Further investigation is required to establish whether this risk is higher than in the general population and whether intensive follow-up improves prognosis.
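    The 10-y probabilities quoted above are cumulative figures of the kind obtained from a Kaplan-Meier analysis (one minus the survival estimate). As a worked illustration, the sketch below computes such an estimate on invented follow-up data; the study's individual-level data are not reproduced here.

```python
def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = developed ITP, 0 = censored.
    Returns (event times, Kaplan-Meier survival probabilities)."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv, out_t, out_s = 1.0, [], []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(e for tt, e in pairs if tt == t)  # events at time t
        n = sum(1 for tt, _ in pairs if tt == t)  # subjects leaving at t
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            out_t.append(t)
            out_s.append(surv)
        n_at_risk -= n
        i += n
    return out_t, out_s

# Invented example: 20 subjects, three ITP events, the rest censored.
times  = [12, 24, 30, 36, 48, 60, 60, 72, 84, 90, 96, 100,
          108, 110, 115, 118, 120, 120, 120, 120]
events = [0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]

for t, s in zip(*kaplan_meier(times, events)):
    print(f"{t:>4} mo: cumulative ITP probability = {1 - s:.3f}")
```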

    The Expression and Localization of N-Myc Downstream-Regulated Gene 1 in Human Trophoblasts

    The protein N-Myc downstream-regulated gene 1 (NDRG1) is implicated in the regulation of cell proliferation, differentiation, and cellular stress response. NDRG1 is expressed in primary human trophoblasts, where it promotes cell viability and resistance to hypoxic injury. The mechanism of action of NDRG1 remains unknown. To gain further insight into the intracellular action of NDRG1, we analyzed the expression pattern and cellular localization of endogenous NDRG1 and transfected Myc-tagged NDRG1 in human trophoblasts exposed to diverse injuries. In standard conditions, NDRG1 was diffusely expressed in the cytoplasm at a low level. Hypoxia or the hypoxia mimetic cobalt chloride, but not serum deprivation, ultraviolet (UV) light, or ionizing radiation, induced the expression of NDRG1 in human trophoblasts and the redistribution of NDRG1 into the nucleus and cytoplasmic membranes associated with the endoplasmic reticulum (ER) and microtubules. Mutation of the phosphopantetheine attachment site (PPAS) within NDRG1 abrogated this pattern of redistribution. Our results shed new light on the impact of cell injury on NDRG1 expression patterns, and suggest that the PPAS domain plays a key role in NDRG1's subcellular distribution.

    Net Efficacy Adjusted for Risk (NEAR): A Simple Procedure for Measuring Risk:Benefit Balance

    BACKGROUND: Although several mathematical models have been proposed to assess the risk:benefit of drugs in one measure, their use in practice has been rather limited. Our objective was to design a simple, easily applicable model. In this respect, measuring the proportion of patients who respond favorably to treatment without being affected by adverse drug reactions (ADR) could be a suitable endpoint. However, remarkably few published clinical trials report the data required to calculate this proportion. As an approach to the problem, we calculated the expected proportion of this type of patients. METHODOLOGY/PRINCIPAL FINDINGS: Theoretically, the number of responders without ADR may be obtained by multiplying the total number of responders by the total number of subjects that did not suffer ADR, and dividing the product by the total number of subjects studied. When two drugs are studied, the same calculation may be repeated for the second drug. Then, by constructing a 2 × 2 table with the expected frequencies of responders with and without ADR, and non-responders with and without ADR, the odds ratio and relative risk with their confidence intervals may be easily calculated and graphically represented on a logarithmic scale. Such measures represent "net efficacy adjusted for risk" (NEAR). We tested the model on results extracted from several published clinical trials and meta-analyses. On comparing our results with those originally reported by the authors, marked differences were found in some cases, with ADR emerging as a relevant factor in balancing the clinical benefit obtained. The particular features of the adverse reactions that must be weighed against benefit are discussed in the paper. CONCLUSION: NEAR, representing overall risk:benefit, may contribute to improving knowledge of the clinical usefulness of drugs. As most published clinical trials tend to overestimate benefits and underestimate toxicity, our measure represents an effort to change this trend.
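    A minimal sketch of the NEAR arithmetic described above, under one plausible reading: in each arm, the expected number of responders free of ADR is responders × (subjects without ADR) / n, and the two arms are then compared as "favorable" (responder without ADR) versus all other outcomes, with standard log-scale (Woolf- and Katz-type) confidence intervals. All counts below are invented.

```python
import math

def favorable(n, responders, with_adr):
    # Expected responders free of ADR, treating response and ADR as
    # independent: responders * (subjects without ADR) / total subjects.
    return responders * (n - with_adr) / n

def near_or_rr(arm_a, arm_b):
    """Each arm is (n, responders, subjects with ADR).
    Returns (OR, 95% CI) and (RR, 95% CI), intervals on the log scale."""
    n1, n2 = arm_a[0], arm_b[0]
    a = favorable(*arm_a); b = n1 - a   # drug A: favorable vs. the rest
    c = favorable(*arm_b); d = n2 - c   # comparator drug B
    odds_ratio = (a * d) / (b * c)
    rel_risk = (a / n1) / (c / n2)
    se_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # Woolf
    se_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)        # Katz
    ci = lambda est, se: (est * math.exp(-1.96 * se), est * math.exp(1.96 * se))
    return (odds_ratio, ci(odds_ratio, se_or)), (rel_risk, ci(rel_risk, se_rr))

# Invented example: drug A, 100 patients, 70 responders, 20 with ADR;
# drug B, 100 patients, 55 responders, 10 with ADR.
(o, o_ci), (r, r_ci) = near_or_rr((100, 70, 20), (100, 55, 10))
print(f"NEAR odds ratio    {o:.2f} (95% CI {o_ci[0]:.2f}-{o_ci[1]:.2f})")
print(f"NEAR relative risk {r:.2f} (95% CI {r_ci[0]:.2f}-{r_ci[1]:.2f})")
```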

    Decoherence, einselection, and the quantum origins of the classical

    Decoherence is caused by the interaction of a quantum system with its environment. The environment monitors certain observables of the system, destroying interference between the pointer states corresponding to their eigenvalues. This leads to environment-induced superselection, or einselection, a quantum process associated with selective loss of information. Einselected pointer states are stable. They can retain correlations with the rest of the Universe in spite of the environment. Einselection enforces classicality by imposing an effective ban on the vast majority of the Hilbert space, eliminating especially the flagrantly non-local "Schrödinger cat" states. The classical structure of phase space emerges from the quantum Hilbert space in the appropriate macroscopic limit: the combination of einselection with dynamics leads to the idealizations of a point and of a classical trajectory. In measurements, einselection replaces quantum entanglement between the apparatus and the measured system with classical correlation. Comment: Final version of the review, with brutally compressed figures. Apart from the changes introduced in the editorial process, the text is identical with that in the Rev. Mod. Phys. July issue. Also available from http://www.vjquantuminfo.or
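    A standard spin-environment toy model (a sketch, not the review's full treatment) makes the mechanism concrete: a qubit coupled to N environment spins through sigma_z interactions keeps its populations intact, but its off-diagonal coherence equals the overlap of the two environment branches and decays as a product of cosines. The couplings below are random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20
g = rng.uniform(0.5, 1.5, N)   # qubit-environment couplings (invented)
t = np.linspace(0, 5, 200)

# Each environment spin starts in |+>; conditioned on the qubit being in
# |0> or |1>, it rotates in opposite directions, so the branch overlap is
#   <E_0(t)|E_1(t)> = prod_k cos(2 g_k t).
coherence = np.prod(np.cos(2 * np.outer(t, g)), axis=1)

# |rho_01(t)| = |rho_01(0)| * |<E_0|E_1>|: interference between the pointer
# states |0> and |1> is suppressed while their populations are untouched;
# the environment has einselected the sigma_z eigenbasis.
print(f"|<E_0|E_1>| at t = 0: {abs(coherence[0]):.3f}")
print(f"|<E_0|E_1>| at t = 5: {abs(coherence[-1]):.3e}")
```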

    Architectures for Multinode Superconducting Quantum Computers

    Many proposals to scale quantum technology rely on modular or distributed designs where individual quantum processors, called nodes, are linked together to form one large multinode quantum computer (MNQC). One scalable method to construct an MNQC is using superconducting quantum systems with optical interconnects. However, a limiting factor of these machines will be internode gates, which may be two to three orders of magnitude noisier and slower than local operations. Surmounting the limitations of internode gates will require a range of techniques, including improvements in entanglement generation, the use of entanglement distillation, and optimized software and compilers. It remains unclear how improvements to these components interact to affect overall system performance, what performance is required from each, or even how to quantify the performance of each. In this paper, we employ a co-design-inspired approach to quantify overall MNQC performance in terms of hardware models of internode links, entanglement distillation, and local architecture. In the case of superconducting MNQCs with microwave-to-optical links, we uncover a tradeoff between entanglement generation and distillation that threatens to degrade performance. We show how to navigate this tradeoff, lay out how compilers should optimize between local and internode gates, and discuss when noisy quantum links have an advantage over purely classical links. Using these results, we introduce a roadmap for the realization of early MNQCs which illustrates potential improvements to the hardware and software of MNQCs and outlines criteria for evaluating the landscape, from progress in entanglement generation and quantum memory to dedicated algorithms such as distributed quantum phase estimation. While we focus on superconducting devices with optical interconnects, our approach is general across MNQC implementations. Comment: 23 pages, white paper.
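    The generation-versus-distillation tradeoff can be made concrete with the textbook BBPSSW recurrence for Werner pairs: each round raises fidelity but consumes two pairs and succeeds only with some probability, so the distilled-pair rate falls as fidelity climbs. The raw fidelity and rate below are invented and stand in for the paper's detailed hardware models.

```python
def bbpssw_round(F):
    """One BBPSSW purification round on two Werner pairs of fidelity F:
    returns (output fidelity, success probability)."""
    num = F**2 + ((1 - F) / 3) ** 2
    p = F**2 + 2 * F * (1 - F) / 3 + 5 * ((1 - F) / 3) ** 2
    return num / p, p

def distill(F, raw_rate_hz, rounds):
    """Fidelity and distilled-pair rate after `rounds` of pairwise
    distillation, ignoring memory decay and classical-signaling latency."""
    rate = raw_rate_hz
    for _ in range(rounds):
        F, p = bbpssw_round(F)
        rate = rate / 2 * p   # two pairs consumed; success with probability p
    return F, rate

for rounds in range(4):
    F, rate = distill(0.90, 1000.0, rounds)
    print(f"{rounds} rounds: F = {F:.4f}, rate = {rate:7.1f} pairs/s")
```

    With these invented numbers, one round lifts the fidelity from 0.90 to about 0.93 while cutting the rate by more than half, which is the shape of the tradeoff discussed above.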

    Search for the standard model Higgs boson in the H to ZZ to 2l 2nu channel in pp collisions at sqrt(s) = 7 TeV

    A search for the standard model Higgs boson in the H to ZZ to 2l 2nu decay channel, where l = e or mu, in pp collisions at a center-of-mass energy of 7 TeV is presented. The data were collected at the LHC with the CMS detector and correspond to an integrated luminosity of 4.6 inverse femtobarns. No significant excess is observed above the background expectation, and upper limits are set on the Higgs boson production cross section. The presence of the standard model Higgs boson with a mass in the 270-440 GeV range is excluded at 95% confidence level. Comment: Submitted to JHEP.
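    To make the phrase "excluded at 95% confidence level" concrete, the toy sketch below sets a 95% CL upper limit with the CLs method for a single counting experiment. The real analysis combines many channels with a profile-likelihood treatment of systematic uncertainties; the event counts here are invented.

```python
import math

def poisson_cdf(n, mu):
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def cls(s, b, n_obs):
    # CLs = CL_{s+b} / CL_b: dividing by CL_b protects against excluding
    # signals the experiment has no real sensitivity to.
    return poisson_cdf(n_obs, s + b) / poisson_cdf(n_obs, b)

def upper_limit(b, n_obs, cl=0.95):
    """Smallest signal yield s with CLs <= 1 - cl, found by bisection."""
    lo, hi = 0.0, 100.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cls(mid, b, n_obs) > 1 - cl:
            lo = mid
        else:
            hi = mid
    return hi

# Invented counting experiment: 4 events observed, 3.2 expected background.
print(f"95% CL upper limit on signal yield: {upper_limit(3.2, 4):.2f} events")
```

    A mass hypothesis is then excluded wherever its predicted signal yield exceeds the limit obtained this way.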

    Search for New Physics with Jets and Missing Transverse Momentum in pp collisions at sqrt(s) = 7 TeV

    A search for new physics is presented based on an event signature of at least three jets accompanied by large missing transverse momentum, using a data sample corresponding to an integrated luminosity of 36 inverse picobarns collected in proton-proton collisions at sqrt(s) = 7 TeV with the CMS detector at the LHC. No excess of events is observed above the expected standard model backgrounds, which are all estimated from the data. Exclusion limits are presented for the constrained minimal supersymmetric extension of the standard model. Cross section limits are also presented using simplified models with new particles decaying to an undetected particle and one or two jets.