
    Time Averaged Quantum Dynamics and the Validity of the Effective Hamiltonian Model

    We develop a technique for finding the dynamical evolution in time of an averaged density matrix. The result is an equation of evolution that includes an Effective Hamiltonian, as well as decoherence terms in Lindblad form. Applying the general equation to harmonic Hamiltonians, we confirm a previous formula for the Effective Hamiltonian, together with a new decoherence term that should in general be included and whose vanishing provides the criterion for validity of the Effective Hamiltonian approach. Finally, we apply the theory to the examples of the AC Stark Shift and Three-Level Raman Transitions, recovering a new decoherence effect in the latter. Comment: 7 pages, 2 figures
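The structure of the result described above, unitary evolution under an effective Hamiltonian plus decoherence terms in Lindblad form, can be sketched schematically as follows (illustrative notation only; the paper's specific operators and rates will differ):

```latex
\frac{d\bar{\rho}}{dt}
  = -\frac{i}{\hbar}\,\bigl[H_{\mathrm{eff}},\,\bar{\rho}\bigr]
  + \sum_k \gamma_k \Bigl( L_k \bar{\rho} L_k^{\dagger}
  - \tfrac{1}{2}\bigl\{ L_k^{\dagger} L_k,\, \bar{\rho} \bigr\} \Bigr)
```

Here $\bar{\rho}$ is the time-averaged density matrix, the $L_k$ are Lindblad operators, and the $\gamma_k$ are decoherence rates; the Effective Hamiltonian model alone is valid precisely when the decoherence terms vanish.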

    Facilitators and barriers to the successful implementation of pediatric antibacterial drug trials: Findings from CTTI's survey of investigators.

    An urgent need exists to develop new antibacterial drugs for children. We conducted research with investigators of pediatric antibacterial drug trials to identify facilitators of and barriers to the conduct of these trials. Seventy-three investigators completed an online survey assessing the importance of 15 facilitators (grouped into 5 topical categories) and the severity of 36 barriers (grouped into 6 topical categories) to implementing pediatric antibacterial drug trials. The analysis focused on identifying the key factors that facilitate successful implementation of these trials and the key barriers to implementation. Almost all investigators identified two factors as very important facilitators: having site personnel for enrollment and having adequate funding. Other top factors were related to staffing. Among the barriers, factors related to parent concerns and consent were prominent, particularly obtaining parental consent when parents disagreed, concerns about the number of blood draws, and concerns about the number of invasive procedures. Overly narrow eligibility criteria were also identified as a major barrier. The survey findings suggest three areas in which to focus efforts to facilitate ongoing drug development: (1) improving engagement with parents of children who may be eligible to enroll in a pediatric antibacterial drug trial, (2) broadening inclusion criteria to allow more participants to enroll, and (3) ensuring adequate staffing and establishing sustainable financial strategies, such as funding pediatric trial networks. The pediatric antibacterial drug trials enterprise is likely to benefit from focused efforts by all stakeholders to remove barriers and enhance facilitation.

    Integration of NEMO into an existing particle physics environment through virtualization

    With the ever-growing amount of data collected by the experiments at the Large Hadron Collider (LHC) (Evans et al., 2008), the need for computing resources that can handle the analysis of this data is also rapidly increasing. This increase will be amplified further after the upgrade to the High-Luminosity LHC (Apollinari et al., 2017). High-Performance Computing (HPC) and other cluster computing resources provided by universities can be useful supplements to the resources dedicated to the experiments as part of the Worldwide LHC Computing Grid (WLCG) (Eck et al., 2005) for data analysis and the production of simulated event samples. Computing resources in the WLCG are structured in four layers, so-called Tiers. The first layer comprises two Tier-0 computing centres, located at CERN in Geneva, Switzerland and at the Wigner Research Centre for Physics in Budapest, Hungary. The second layer consists of thirteen Tier-1 centres, followed by 160 Tier-2 sites, which are typically universities and other scientific institutes. The final layer comprises Tier-3 sites, which are used directly by local users. The University of Freiburg operates a combined Tier-2/Tier-3 site, the ATLAS-BFG (Backofen et al., 2006). The shared HPC cluster »NEMO« at the University of Freiburg has been made available to local ATLAS (Aad et al., 2008) users through the provisioning of virtual machines incorporating the ATLAS software environment, analogous to the bare-metal system at the Tier-3. In addition to the provisioning of the virtual environment, the on-demand, dynamic integration of these resources into the Tier-3 scheduler is described. To provide the external NEMO resources to users in a transparent way, an intermediate layer connecting the two batch systems is put in place. This resource scheduler monitors requirements on the user-facing system and requests resources on the backend system.
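The decision logic of such an intermediate layer can be sketched as follows. This is a minimal illustration, not the actual NEMO/Tier-3 integration code: the function name, slot counts, and VM limit are all hypothetical.

```python
# Minimal sketch (hypothetical API) of the intermediate layer described above:
# it watches demand on the user-facing batch system and decides how many
# virtual machines to request from (or release back to) the backend HPC cluster.

def plan_backend_requests(pending_jobs: int, running_vms: int,
                          slots_per_vm: int = 4, max_vms: int = 100) -> int:
    """Return how many additional VMs to request (negative = release VMs)."""
    needed_vms = -(-pending_jobs // slots_per_vm)  # ceiling division
    target = min(needed_vms, max_vms)              # respect the backend quota
    return target - running_vms

# Example: 10 pending jobs, 1 VM already running, 4 job slots per VM
# -> ceil(10/4) = 3 VMs needed, so request 2 more.
print(plan_backend_requests(10, 1))  # -> 2
```

In a real deployment this decision would be re-evaluated periodically, with the actual requests and releases handled by the two batch systems' own interfaces.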

    A system of ODEs for a Perturbation of a Minimal Mass Soliton

    We study soliton solutions to a nonlinear Schrödinger equation with a saturated nonlinearity. Such nonlinearities are known to possess minimal mass soliton solutions. We consider a small perturbation of a minimal mass soliton and identify a system of ODEs, similar to those from Comech and Pelinovsky (2003), which model the behavior of the perturbation for short times. We then provide numerical evidence that under this system of ODEs there are two possible dynamical outcomes, in accord with the conclusions of Pelinovsky, Afanasjev, and Kivshar (1996). For initial data which supports a soliton structure, a generic initial perturbation oscillates around the stable family of solitons. For initial data which is expected to disperse, the finite-dimensional dynamics follow the unstable portion of the soliton curve. Comment: Minor edit
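The abstract does not specify the saturated nonlinearity; a commonly studied model of this type (an assumption here, not necessarily the paper's exact choice) is

```latex
i\,\partial_t u + \Delta u + \frac{|u|^{2}}{1 + \gamma\,|u|^{2}}\,u = 0 ,
```

whose soliton solutions $u(t,x) = e^{i\lambda t} R_{\lambda}(x)$ carry mass $M(\lambda) = \lVert R_{\lambda} \rVert_{L^2}^2$. The minimal mass soliton sits at the critical point $dM/d\lambda = 0$, which is exactly where the standard slope-based stability criterion degenerates, motivating the finite-dimensional ODE analysis of perturbations.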

    Minimum follow-up time required for the estimation of statistical cure of cancer patients: verification using data from 42 cancer sites in the SEER database

    BACKGROUND: The commonly used five-year survival rates are not adequate to represent statistical cure. In the present study, we established the minimum number of years of follow-up required to estimate the statistical cure rate, using a lognormal distribution of the survival times of those who died of their cancer. We introduced the term threshold year: the follow-up time by which the survival data of patients dying from the specific cancer cover most of the distribution, leaving less than 2.25% uncovered. This is close enough to cure from that specific cancer. METHODS: Data from the Surveillance, Epidemiology and End Results (SEER) database were tested, using a minimum chi-square method, to determine whether the survival times of cancer patients who died of their disease followed a lognormal distribution. Patients diagnosed from 1973–1992 in the registries of Connecticut and Detroit were chosen, so that a maximum of 27 years of follow-up to 1999 was available. A total of 49 specific organ sites were tested. The parameters of the lognormal distribution were found for each cancer site. The cancer-specific survival rates at the threshold years were compared with the longest available Kaplan-Meier survival estimates. RESULTS: The cancer-specific survival times of patients who died of their disease were verified to follow different lognormal distributions for 42 of the 49 cancer sites. The threshold years validated for statistical cure varied by cancer site, from 2.6 years for pancreatic cancer to 25.2 years for cancer of the salivary gland. At the threshold year, the statistical cure rates estimated for 40 cancer sites matched the actuarial long-term survival rates estimated by the Kaplan-Meier method within six percentage points. For two cancer sites, breast and thyroid, the threshold years were so long that the cancer-specific survival rates could not yet be obtained, because the SEER data do not provide sufficiently long follow-up.
    CONCLUSION: The present study suggests that a certain threshold year must be reached before the statistical cure rate can be estimated for each cancer site. For some cancers, such as breast and thyroid, the 5- or 10-year survival rates inadequately reflect statistical cure rates, highlighting the need for long-term follow-up of these patients.
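The threshold-year definition above (follow-up covering all but 2.25% of the lognormal survival-time distribution) amounts to taking the 97.75th percentile of the fitted lognormal. A minimal sketch, with hypothetical parameters rather than the fitted SEER values:

```python
import math
from statistics import NormalDist

# Sketch of the "threshold year": the follow-up time covering all but 2.25%
# of a lognormal survival-time distribution, i.e. its 97.75th percentile.
# The (mu, sigma) values used below are hypothetical, not fitted SEER values.

def threshold_year(mu: float, sigma: float, coverage: float = 0.9775) -> float:
    """Quantile of a lognormal distribution with log-scale parameters
    (mu, sigma): exp(mu + sigma * z), where z is the normal quantile."""
    z = NormalDist().inv_cdf(coverage)  # roughly 2 standard deviations
    return math.exp(mu + sigma * z)
```

For example, hypothetical parameters mu = 0.5, sigma = 1.0 (log of survival time in years) give a threshold year of about 12 years; the paper's fitted values yielded thresholds from 2.6 years (pancreas) to 25.2 years (salivary gland).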

    Survival of patients with metastatic breast cancer: twenty-year data from two SEER registries

    BACKGROUND: Many researchers are interested in whether there have been any improvements in recent treatment results for metastatic breast cancer in the community, especially in 10- or 15-year survival. METHODS: For the period 1981–1985, 782 and 580 female patients with metastatic breast cancer were extracted from the Connecticut and San Francisco-Oakland registries of the Surveillance, Epidemiology, and End Results (SEER) database, respectively. The lognormal statistical method for estimating survival was retrospectively validated, since the 15-year cause-specific survival rates could be calculated using the standard life-table actuarial method. Estimated rates were compared to the actuarial data available in 2000. For the period 1991–1995, a further 752 and 632 female patients with metastatic breast cancer were extracted from the Connecticut and San Francisco-Oakland registries, respectively. These data were analyzed to estimate the 15-year cause-specific survival rates before the year 2005. RESULTS: The 5-year period (1981–1985) was chosen, and patients were followed as a cohort for an additional 3 years. The estimated 15-year cause-specific survival rates were 7.1% (95% confidence interval, CI, 1.8–12.4) and 9.1% (95% CI, 3.8–14.4) by the lognormal model for the Connecticut and San Francisco-Oakland registries, respectively. Since the SEER database provides follow-up information to the end of the year 2000, an actuarial calculation could be performed to confirm (validate) the estimation. The Kaplan-Meier calculations of the 15-year cause-specific survival rates were 8.3% (95% CI, 5.8–10.8) and 7.0% (95% CI, 4.3–9.7), respectively. Using the 1991–1995 5-year period cohort, followed for an additional 3 years, the 15-year cause-specific survival rates were estimated to be 9.1% (95% CI, 3.8–14.4) and 14.7% (95% CI, 9.8–19.6) for the Connecticut and San Francisco-Oakland registries, respectively.
    CONCLUSIONS: For the period 1981–1985, the 15-year cause-specific survival rates for the Connecticut and San Francisco-Oakland registries were comparable. For the period 1991–1995, there was little change in survival for the Connecticut registry patients, but there was an improvement in survival for the San Francisco-Oakland registry patients.
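The lognormal method used here extrapolates long-term survival from short follow-up by assuming a cured fraction plus a lognormal distribution of times to cancer death (a Boag-type cure model). A minimal sketch with hypothetical parameters, not the paper's fitted values:

```python
import math
from statistics import NormalDist

def cause_specific_survival(t: float, cure: float, mu: float, sigma: float) -> float:
    """Lognormal (Boag-type) cure model: a fraction `cure` of patients is
    statistically cured; the rest have lognormal survival times with
    log-scale parameters (mu, sigma).  Returns S(t) = cure + (1 - cure) * P(T > t).
    All parameter values passed in are illustrative assumptions."""
    if t <= 0:
        return 1.0
    z = (math.log(t) - mu) / sigma
    return cure + (1.0 - cure) * (1.0 - NormalDist().cdf(z))
```

As t grows, S(t) flattens toward the cured fraction, which is how a 15-year rate can be estimated from a cohort with only about 8 years of follow-up.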

    AUDITOR: Accounting for opportunistic resources

    The increasing computational demand in High Energy Physics (HEP), as well as increasing concerns about energy efficiency in high-performance/high-throughput computing, are driving forces in the search for more efficient ways to utilise available resources. Since avoiding idle resources is key to achieving high efficiency, an appropriate measure is the sharing of idle resources of underutilised sites with fully occupied sites. The software COBalD/TARDIS can automatically, transparently, and dynamically (dis)integrate such resources in an opportunistic manner. Sharing resources, however, also requires accounting. In this work we introduce AUDITOR (AccoUnting DatahandlIng Toolbox for Opportunistic Resources), a flexible and extensible accounting system that is able to cover a wide range of use cases and infrastructures. AUDITOR gathers accounting data via so-called collectors, which are designed to monitor batch systems, COBalD/TARDIS, cloud schedulers, or other sources of information. The data is stored in a database and provided to so-called plugins, which act based on accounting records. An action could, for instance, be creating a bill of utilised resources, computing the CO2 footprint, adjusting parameters of a service, or forwarding accounting information to other accounting systems. Depending on the use case, a suitable collector and plugin can be chosen from a growing ecosystem of collectors and plugins. Libraries for interacting with AUDITOR are provided to facilitate the development of collectors and plugins by the community.
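The collector-database-plugin flow can be illustrated with a toy record type and a billing plugin. The field names and plugin signature below are hypothetical illustrations of the concept, not AUDITOR's actual schema or API:

```python
from dataclasses import dataclass, field

@dataclass
class AccountingRecord:
    """Toy shape of an accounting record (illustrative fields, not
    AUDITOR's actual schema): an identifier, the job runtime, and the
    resource components it consumed."""
    record_id: str
    runtime_s: int                                   # wall-clock seconds
    components: dict = field(default_factory=dict)   # e.g. {"cores": 8}

def billing_plugin(records, price_per_core_hour: float) -> float:
    """Example of a plugin acting on accounting records: compute a bill
    from the total core-hours consumed."""
    core_hours = sum(r.components.get("cores", 0) * r.runtime_s / 3600
                     for r in records)
    return core_hours * price_per_core_hour

records = [AccountingRecord("job-1", 3600, {"cores": 8}),
           AccountingRecord("job-2", 1800, {"cores": 4})]
print(billing_plugin(records, 0.05))  # 10 core-hours at 0.05 -> 0.5
```

A CO2-footprint or forwarding plugin would consume the same records, which is the point of separating data gathering (collectors) from actions (plugins).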