
    Dimensionality effects in restricted bosonic and fermionic systems

    The phenomenon of Bose-like condensation, the continuous change of the dimensionality of the particle distribution as a consequence of the freezing-out of one or more degrees of freedom in the low particle density limit, is investigated theoretically in the case of closed systems of massive bosons and fermions, described by general single-particle Hamiltonians. This phenomenon is similar for both types of particles and, for some energy spectra, exhibits features specific to multiple-step Bose-Einstein condensation, for instance the appearance of maxima in the specific heat. In the case of fermions, as the particle density increases, another phenomenon is also observed. For certain types of single-particle Hamiltonians, the specific heat asymptotically approaches a divergent behavior at zero temperature as the Fermi energy $\epsilon_{\rm F}$ converges towards any value from an infinite discrete set of energies $\{\epsilon_i\}_{i\ge 1}$. If $\epsilon_{\rm F}=\epsilon_i$ for any $i$, the specific heat is divergent at $T=0$ only in infinite systems, whereas for any finite system the specific heat approaches zero at low enough temperatures. The results are particularized for particles trapped inside parallelepipedic boxes and harmonic potentials. PACS numbers: 05.30.Ch, 64.90.+b, 05.30.Fk, 05.30.Jp. Comment: 7 pages, 3 figures (included)
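
    As a rough illustration of how such finite-system specific-heat maxima can be computed, here is a minimal grand-canonical sketch (not the authors' code) for N ideal bosons in an isotropic 3D harmonic trap; all numerical choices are illustrative.

    ```python
    # Grand-canonical specific heat of N ideal bosons in an isotropic 3D
    # harmonic trap. Units: hbar*omega = k_B = 1; level n has energy n
    # (zero-point energy dropped) and degeneracy (n+1)(n+2)/2.
    import numpy as np
    from scipy.optimize import brentq

    NMAX = 400                        # single-particle level cutoff
    n = np.arange(NMAX)
    eps = n.astype(float)             # level energies
    g = (n + 1) * (n + 2) / 2         # level degeneracies

    def occupations(mu, T):
        x = np.minimum((eps - mu) / T, 700.0)   # cap to avoid overflow
        return g / np.expm1(x)

    def solve_mu(N, T):
        # The chemical potential must stay below the ground-state energy 0.
        return brentq(lambda mu: occupations(mu, T).sum() - N,
                      -60.0 * T, -1e-12)

    def total_energy(N, T):
        return (occupations(solve_mu(N, T), T) * eps).sum()

    N = 1000
    Ts = np.linspace(1.0, 15.0, 200)
    E = np.array([total_energy(N, T) for T in Ts])
    C = np.gradient(E, Ts)            # specific heat C = dE/dT
    print("C(T) peaks near T =", Ts[np.argmax(C)])
    # Expected near the bulk Tc ~ (N/zeta(3))^(1/3) ~ 9.4 for N = 1000.
    ```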

    Bose-Einstein condensation as symmetry breaking in compact curved spacetimes

    We examine Bose-Einstein condensation as a form of symmetry breaking in the specific model of the Einstein static universe. We show that symmetry breaking never occurs in the sense that the chemical potential $\mu$ never reaches its critical value. This leads us to some statements about spaces of finite volume in general. In an appendix we clarify the relationship between the standard statistical mechanical approaches and the field theory method using zeta functions. Comment: Revtex, 25 pages, 3 figures, uses EPSF.sty. To be published in Phys. Rev.
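
    To make the finite-volume mechanism explicit (the standard argument for a discrete spectrum, not a quotation from the paper): the total particle number

    ```latex
    N(\mu,T) \;=\; \sum_{k} \frac{1}{e^{(E_k-\mu)/k_B T} - 1}
    \;\longrightarrow\; \infty
    \qquad \text{as } \mu \to E_0^{-},
    ```

    so any finite $N$ is accommodated by a chemical potential strictly below the ground-state energy $E_0$, and the critical value is attained only in the infinite-volume limit.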

    Photon-Photon Scattering, Pion Polarizability and Chiral Symmetry

    Recent attempts to detect the pion polarizability via analysis of $\gamma\gamma\rightarrow\pi\pi$ measurements are examined. The connection between calculations based on dispersion relations and on chiral perturbation theory is established by matching the low energy chiral amplitude with that given by a full dispersive treatment. Using the values for the polarizability required by chiral symmetry, predicted and experimental cross sections are shown to be in agreement. Comment: 21 pages (+10 figures available on request), LATEX, UMHEP-38

    Bose-Einstein Condensation in a Harmonic Potential

    We examine several features of Bose-Einstein condensation (BEC) in an external harmonic potential well. In the thermodynamic limit, there is a phase transition to a spatial Bose-Einstein condensed state for dimension $D\ge 2$. The thermodynamic limit requires maintaining constant average density by weakening the potential while increasing the particle number N to infinity, while of course in real experiments the potential is fixed and N stays finite. For such finite ideal harmonic systems we show that a BEC still occurs, although without a true phase transition, below a certain "pseudo-critical" temperature, even for $D=1$. We study the momentum-space condensate fraction and find that it vanishes as $1/\sqrt{N}$ in any number of dimensions in the thermodynamic limit. In $D\le 2$ the lack of a momentum condensation is in accord with the Hohenberg theorem, but must be reconciled with the existence of a spatial BEC in $D=2$. For finite systems we derive the N-dependence of the spatial and momentum condensate fractions and the transition temperatures, features that may be experimentally testable. We show that the N-dependence of the 2D ideal-gas transition temperature for a finite system cannot persist in the interacting case because it violates a theorem due to Chester, Penrose, and Onsager. Comment: 34 pages, LaTeX, 6 Postscript figures, Submitted to Jour. Low Temp. Phys.
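
    For orientation, the textbook ideal-gas results for an isotropic harmonic trap in $D\ge 2$ dimensions (standard formulas consistent with, though not quoted from, the paper) follow from saturating the excited-state population:

    ```latex
    N - N_0 \;\approx\; \zeta(D)\left(\frac{k_B T}{\hbar\omega}\right)^{D},
    \qquad
    k_B T_c \;=\; \hbar\omega\left[\frac{N}{\zeta(D)}\right]^{1/D},
    \qquad
    \frac{N_0}{N} \;=\; 1 - \left(\frac{T}{T_c}\right)^{D}.
    ```

    The $N^{1/D}$ growth of $T_c$ also shows why a fixed trap with finite N exhibits a smooth crossover at a pseudo-critical temperature rather than a sharp transition.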

    Tradeoffs in jet inlet design: a historical perspective

    The design of the inlet(s) is one of the most demanding tasks in the development of any gas turbine-powered aircraft, mainly because of the multi-objective and multidisciplinary nature of the exercise. The solution is generally a compromise between a number of conflicting goals, and these conflicts are the subject of the present paper. We look into how these design tradeoffs have been reflected in actual inlet designs over the years and how the emphasis has shifted from one driver to another. We also review some of the relevant developments of the jet age in aerodynamics and in design and manufacturing technology, and we examine how they have influenced and informed inlet design decisions.

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded approach to mixture risk assessment.
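
    As a toy demonstration of that distributional dependence (an invented Monte Carlo sketch, not an analysis from the paper), consider two sub-factors, each calibrated so that a factor of 10 covers its own 95th percentile; the protection implied by their product of 100 then depends on the assumed shapes:

    ```python
    # Two sub-factors (e.g. toxicokinetic and toxicodynamic), each with a
    # 95th percentile of 10 by construction; the distribution shapes and
    # parameters below are arbitrary assumptions for illustration only.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000

    # Scenario A: lognormal sub-factors; log-sd set so P(X <= 10) = 0.95.
    s_ln = np.log(10) / 1.6449                 # 1.6449 = z_0.95
    prod_ln = np.exp(rng.normal(0.0, s_ln, (2, n)).sum(axis=0))

    # Scenario B: heavier-tailed log-Student-t (df = 3), same per-factor
    # 95th percentile of 10 (the t(3) 95th percentile is 2.3534).
    s_t = np.log(10) / 2.3534
    prod_t = np.exp((s_t * rng.standard_t(3, (2, n))).sum(axis=0))

    for name, prod in [("lognormal", prod_ln), ("log-t(3)", prod_t)]:
        print(f"{name:9s}: P(combined factor <= 100) = "
              f"{(prod <= 100).mean():.3f}")
    # Identically calibrated sub-factors, different shapes: the fixed
    # default of 100 corresponds to different levels of protection.
    ```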

    Proprioceptive performance of bilateral upper and lower limb joints: side-general and site-specific effects

    Superiority of the left upper limb in proprioception tasks performed by right-handed individuals has been attributed to better utilization of proprioceptive information by a non-preferred arm/hemisphere system. However, it is undetermined whether this holds for multiple upper and lower limb joints. Accordingly, the present study tested active movement proprioception at four pairs of upper and lower limb joints, after selecting twelve participants with both strong right arm and right leg preference. A battery of versions of the active movement extent discrimination apparatus was employed to generate the stimuli for movements of different extents at the ankle, knee, shoulder and fingers on the right and left sides of the body, and discrimination scores were derived from participants' responses. Proprioceptive performance on the non-preferred left side was significantly better than on the preferred right side at all four joints tested (overall F(1, 11) = 36.36, p < 0.001, partial η² = 0.77). In the 8 × 8 matrix formed by all joints, only the correlations between the proprioceptive accuracy scores for the right and left sides at the same joint were significant (ankles 0.93, knees 0.89, shoulders 0.87, fingers 0.91, p ≤ 0.001; all others r ≤ 0.40, p ≥ 0.20). The results point to both a side-general effect and a site-specific effect in the integration of proprioceptive information during active movement tasks, whereby the non-preferred limb/hemisphere system is specialized in the utilization of the best proprioceptive sources available at each specific joint, but the combination of sources employed differs between body sites.
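
    As a quick consistency check on the reported statistics, partial eta squared follows directly from the F ratio and its degrees of freedom:

    ```latex
    \eta_p^2 \;=\; \frac{F \cdot df_{\text{effect}}}{F \cdot df_{\text{effect}} + df_{\text{error}}}
    \;=\; \frac{36.36 \times 1}{36.36 \times 1 + 11} \;\approx\; 0.77 .
    ```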

    A randomized, phase III trial to evaluate rucaparib monotherapy as maintenance treatment in patients with newly diagnosed ovarian cancer (ATHENA–MONO/GOG-3020/ENGOT-ov45)

    PURPOSE: ATHENA (ClinicalTrials.gov identifier: NCT03522246) was designed to evaluate rucaparib first-line maintenance treatment in a broad patient population, including those without BRCA1 or BRCA2 (BRCA) mutations or other evidence of homologous recombination deficiency (HRD), or high-risk clinical characteristics such as residual disease. We report the results from the ATHENA–MONO comparison of rucaparib versus placebo. METHODS: Patients with stage III-IV high-grade ovarian cancer undergoing surgical cytoreduction (R0/complete resection permitted) and responding to first-line platinum-doublet chemotherapy were randomly assigned 4:1 to oral rucaparib 600 mg twice a day or placebo. Stratification factors were HRD test status, residual disease after chemotherapy, and timing of surgery. The primary end point of investigator-assessed progression-free survival was assessed in a step-down procedure, first in the HRD population (BRCA-mutant or BRCA wild-type/loss of heterozygosity high tumor), and then in the intent-to-treat population. RESULTS: As of March 23, 2022 (data cutoff), 427 and 111 patients were randomly assigned to rucaparib or placebo, respectively (HRD population: 185 v 49). Median progression-free survival (95% CI) was 28.7 months (23.0 to not reached) with rucaparib versus 11.3 months (9.1 to 22.1) with placebo in the HRD population (log-rank P = .0004; hazard ratio [HR], 0.47; 95% CI, 0.31 to 0.72); 20.2 months (15.2 to 24.7) versus 9.2 months (8.3 to 12.2) in the intent-to-treat population (log-rank P < .0001; HR, 0.52; 95% CI, 0.40 to 0.68); and 12.1 months (11.1 to 17.7) versus 9.1 months (4.0 to 12.2) in the HRD-negative population (HR, 0.65; 95% CI, 0.45 to 0.95). The most common grade ≥ 3 treatment-emergent adverse events were anemia (rucaparib, 28.7% v placebo, 0%) and neutropenia (14.6% v 0.9%). CONCLUSION: Rucaparib monotherapy is effective as first-line maintenance, conferring significant benefit versus placebo in patients with advanced ovarian cancer with and without HRD.
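
    For readers unfamiliar with how median progression-free survival figures like those above are estimated, here is a minimal, self-contained Kaplan-Meier sketch on invented data (the trial's actual analysis used stratified log-rank tests; nothing below is trial data):

    ```python
    # Kaplan-Meier estimate of median PFS on synthetic right-censored data.
    import numpy as np

    def kaplan_meier(time, event):
        """Return event times and the survival estimate S(t) at each."""
        order = np.argsort(time)
        time, event = time[order], event[order]
        s, times, surv = 1.0, [], []
        n_at_risk = len(time)
        for t in np.unique(time):
            at_t = time == t
            d = int(event[at_t].sum())       # events observed at time t
            if d > 0:
                s *= 1.0 - d / n_at_risk
                times.append(t)
                surv.append(s)
            n_at_risk -= int(at_t.sum())     # events and censored leave risk set
        return np.array(times), np.array(surv)

    rng = np.random.default_rng(1)
    pfs = rng.exponential(20.0, size=200)    # hypothetical PFS, months
    censor = rng.uniform(0.0, 40.0, size=200)
    time = np.minimum(pfs, censor)
    event = (pfs <= censor).astype(int)      # 1 = progression observed

    t, s = kaplan_meier(time, event)
    median_pfs = t[np.searchsorted(-s, -0.5)]  # first t with S(t) <= 0.5
    print(f"estimated median PFS ~ {median_pfs:.1f} months")
    ```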

    Theoretical and technological building blocks for an innovation accelerator

    The scientific system that we use today was devised centuries ago and is inadequate for our current ICT-based society: the peer review system encourages conservatism, journal publications are monolithic and slow, data is often not available to other scientists, and the independent validation of results is limited. Building on the Innovation Accelerator paper by Helbing and Balietti (2011), this paper takes the initial global vision and reviews the theoretical and technological building blocks that can be used for implementing an innovation accelerator platform (first and foremost, a science accelerator) driven by re-imagining the science system. The envisioned platform would rest on four pillars: (i) redesign the incentive scheme to reduce behavior such as conservatism, herding and hyping; (ii) advance scientific publications by breaking up the monolithic paper unit and introducing other building blocks such as data, tools, experiment workflows and resources; (iii) use machine-readable semantics for publications, debate structures, provenance, etc., in order to include the computer as a partner in the scientific process; and (iv) build an online platform for collaboration, including a network of trust and reputation among the different types of stakeholders in the scientific system: scientists, educators, funding agencies, policy makers, students and industrial innovators, among others. Any such improvements to the scientific system must support the entire scientific process (unlike current tools that chop up the scientific process into disconnected pieces), must facilitate and encourage collaboration and interdisciplinarity (again unlike current tools), must facilitate the inclusion of intelligent computing in the scientific process, and must accommodate not only the core scientific process but also other stakeholders such as science policy makers, industrial innovators, and the general public.
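
    As a purely speculative sketch of pillar (ii), a decomposed publication could be modelled as a graph of typed building blocks with machine-readable provenance; every class and field name below is invented for illustration.

    ```python
    # A toy data model: publications as graphs of typed building blocks.
    from dataclasses import dataclass, field

    @dataclass
    class Block:
        """One building block of a decomposed publication."""
        block_id: str
        kind: str                  # e.g. "claim", "dataset", "tool", "workflow"
        content_uri: str           # where the artifact itself lives
        derived_from: list[str] = field(default_factory=list)  # provenance edges

    @dataclass
    class Publication:
        pub_id: str
        title: str
        blocks: list[Block] = field(default_factory=list)

        def provenance_of(self, block_id: str) -> list[str]:
            """Trace every upstream block a given block depends on."""
            index = {b.block_id: b for b in self.blocks}
            seen, stack = [], list(index[block_id].derived_from)
            while stack:
                bid = stack.pop()
                if bid not in seen and bid in index:
                    seen.append(bid)
                    stack.extend(index[bid].derived_from)
            return seen

    pub = Publication("pub1", "Example study", [
        Block("d1", "dataset", "doi:example-dataset"),
        Block("w1", "workflow", "git:example/pipeline", derived_from=["d1"]),
        Block("c1", "claim", "text:finding-1", derived_from=["w1"]),
    ])
    print(pub.provenance_of("c1"))   # ['w1', 'd1']
    ```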