
    Smoothed Complexity Theory

    Smoothed analysis is a new way of analyzing algorithms introduced by Spielman and Teng (J. ACM, 2004). Classical methods like worst-case or average-case analysis have accompanying complexity classes, like P and AvgP, respectively. While worst-case and average-case analysis give us a means to talk about the running time of a particular algorithm, complexity classes allow us to talk about the inherent difficulty of problems. Smoothed analysis is a hybrid of worst-case and average-case analysis and compensates for some of their drawbacks. Despite its success for the analysis of single algorithms and problems, there is no embedding of smoothed analysis into computational complexity theory, which is necessary to classify problems according to their intrinsic difficulty. We propose a framework for smoothed complexity theory, define the relevant classes, and prove some first hardness results (of bounded halting and tiling) and tractability results (binary optimization problems, graph coloring, satisfiability). Furthermore, we discuss extensions and shortcomings of our model and relate it to semi-random models.
    Comment: to be presented at MFCS 201
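
    For orientation, the following schematic is not quoted from the abstract; it is the standard Spielman-Teng-style definition of smoothed complexity, simplified: the adversary chooses an input, nature perturbs it with strength sigma, and the worst expected running time over adversarial inputs is taken.

        % Schematic smoothed-complexity definition (standard form, not from the abstract).
        C_A(n, \sigma) \;=\; \max_{\|x\| \le 1} \;
          \mathbb{E}_{g \sim \mathcal{N}(0, I_n)} \big[\, T_A(x + \sigma g) \,\big]

    Setting \sigma = 0 recovers worst-case analysis, while large \sigma approaches average-case analysis, which is the sense in which smoothed analysis interpolates between the two.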

    Scintillation light produced by low-energy beams of highly-charged ions

    Measurements have been performed of the scintillation light intensities emitted from various inorganic scintillators irradiated with low-energy beams of highly charged ions from an electron beam ion source (EBIS) and an electron cyclotron resonance ion source (ECRIS). Beams of xenon ions Xe^{q+} with various charge states between q=2 and q=18, generated by the ECRIS, have been used at energies between 5 keV and 17.5 keV per charge. The intensity of the beam was typically varied between 1 and 100 nA. Beams of highly charged residual gas ions have been produced by the EBIS at 4.5 keV per charge and with low intensities down to 100 pA. The scintillator materials used are flat screens of P46 YAG and P43 phosphor. In all cases, scintillation light emitted from the screen surface was detected by a CCD camera. The scintillation light intensity has been found to depend linearly on the kinetic ion energy deposited into the scintillator per unit time, while up to q=18 no significant contribution from the ions' potential energy was found. We discuss the results with regard to a possible use as beam diagnostics, e.g. for the new HITRAP facility at GSI, Germany.
    Comment: 6 pages, 8 figures
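
    The quantity the abstract singles out, kinetic energy deposited per unit time, follows directly from the beam parameters quoted above. The helper below is hypothetical (not from the paper); it only makes explicit that the charge state cancels, so the deposited kinetic power depends on the energy per charge and the electrical beam current alone.

        # Hypothetical helper (not from the paper): each ion carries q * E_q of kinetic
        # energy (E_q = energy per charge, in eV) and ions arrive at a rate I / (q * e),
        # so q cancels and the deposited kinetic power is P[W] = E_q[eV] * I[A].
        def deposited_kinetic_power(energy_per_charge_eV: float, beam_current_A: float) -> float:
            """Kinetic beam power (watts) deposited in the scintillator screen."""
            return energy_per_charge_eV * beam_current_A

        # Illustrative numbers in the range quoted by the abstract:
        # a 10 keV/charge xenon beam at 100 nA deposits about 1 mW, independent of q.
        print(deposited_kinetic_power(10e3, 100e-9))  # -> 0.001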

    Evidence for the absence of regularization corrections to the partial-wave renormalization procedure in one-loop self energy calculations in external fields

    The equivalence of the covariant renormalization and the partial-wave renormalization (PWR) approach is proven explicitly for the one-loop self-energy correction (SE) of a bound electron state in the presence of external perturbation potentials. No spurious correction terms to the noncovariant PWR scheme are generated for Coulomb-type screening potentials and for external magnetic fields. It is shown that in numerical calculations of the SE with a Coulombic perturbation potential, spurious terms result from an improper treatment of the unphysical high-energy contribution. A method for performing the PWR utilizing the relativistic B-spline approach for the construction of the Dirac spectrum in external magnetic fields is proposed. This method is applied to calculating QED corrections to the bound-electron g factor in H-like ions. Within the level of accuracy of about 0.1%, no spurious terms are generated in numerical calculations of the SE in magnetic fields.
    Comment: 22 pages, LaTeX, 1 figure
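
    As a schematic in my own notation (not taken from the paper), the PWR prescription expands both the bound-state self-energy and its free-electron mass counterterm in relativistic partial waves labelled by kappa and subtracts them wave by wave, so each difference is finite; the spurious terms discussed above would arise from mishandling the high-|kappa|, high-energy tail of this sum.

        % Schematic partial-wave renormalization (notation mine, not the paper's).
        \Delta E_{\mathrm{SE}}^{\mathrm{ren}} \;=\;
          \sum_{|\kappa| = 1}^{\infty}
          \Big( \Delta E^{\mathrm{bound}}_{|\kappa|} - \Delta E^{\mathrm{free}}_{|\kappa|} \Big)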

    Recoil correction to the bound-electron g factor in H-like atoms to all orders in αZ

    The nuclear recoil correction to the bound-electron g factor in H-like atoms is calculated to first order in m/M and to all orders in αZ. The calculation is performed in the range Z = 1-100. A large contribution of terms of order (αZ)^5 and higher is found. Even for hydrogen, the higher-order correction exceeds the (αZ)^4 term, while for uranium it is above the leading (αZ)^2 correction.
    Comment: 6 pages, 3 tables, 1 figure
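
    Schematically (my notation; only the orders named in the abstract are written out, and the coefficients a_k are placeholders rather than values from the paper), the correction has the structure below, the point being that the all-order evaluation makes the (αZ)^5-and-higher part exceed the (αZ)^4 term already at Z = 1.

        % Schematic structure of the recoil correction (coefficients a_k are placeholders).
        \Delta g_{\mathrm{rec}} \;=\; \frac{m}{M}\Big[\, a_2\,(\alpha Z)^2
          + a_4\,(\alpha Z)^4 + a_5\,(\alpha Z)^5 + \cdots \Big]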

    Where to restore ecological connectivity? Detecting barriers and quantifying restoration benefits

    Landscape connectivity is crucial for many ecological processes, including dispersal, gene flow, demographic rescue, and movement in response to climate change. As a result, governmental and non-governmental organizations are focusing efforts to map and conserve areas that facilitate movement to maintain population connectivity and promote climate adaptation. In contrast, little focus has been placed on identifying barriers—landscape features that impede movement between ecologically important areas—where restoration could most improve connectivity. Yet knowing where barriers most strongly reduce connectivity can complement traditional analyses aimed at mapping the best movement routes. We introduce a novel method to detect important barriers and provide example applications. Our method uses GIS neighborhood analyses in conjunction with effective distance analyses to detect barriers that, if removed, would significantly improve connectivity. Applicable in least-cost, circuit-theoretic, and simulation modeling frameworks, the method detects both complete (impermeable) barriers and those that impede but do not completely block movement. Barrier mapping complements corridor mapping by broadening the range of connectivity conservation alternatives available to practitioners. The method can help practitioners move beyond maintaining currently important areas to restoring and enhancing connectivity through active barrier removal. It can inform decisions on trade-offs between restoration and protection; for example, purchasing an intact corridor may be substantially more costly than restoring a barrier that blocks an alternative corridor. And it extends the concept of centrality to barriers, highlighting areas that most diminish connectivity across broad networks. Identifying which modeled barriers have the greatest impact can also help prioritize error checking of land cover data and collection of field data to improve connectivity maps. Barrier detection provides a different way to view the landscape, broadening thinking about connectivity and fragmentation while increasing conservation options.
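
    The scoring idea in the least-cost framework can be made concrete with a small sketch. This is not the authors' tool or code: the resistance grid, window size, and function names are illustrative, and a real analysis would use raster GIS data rather than a toy graph. Each candidate position is scored by how much the least-cost distance between two core areas would drop if the cells under a restoration window were set to minimum resistance.

        # Minimal sketch of moving-window barrier scoring on a resistance grid
        # (illustrative only; not the authors' implementation).
        import itertools

        import networkx as nx
        import numpy as np

        def grid_graph(resistance):
            """Build a 4-neighbour graph; each edge weight is the mean resistance of its endpoints."""
            rows, cols = resistance.shape
            g = nx.Graph()
            for r, c in itertools.product(range(rows), range(cols)):
                for dr, dc in ((0, 1), (1, 0)):
                    rr, cc = r + dr, c + dc
                    if rr < rows and cc < cols:
                        w = (resistance[r, c] + resistance[rr, cc]) / 2.0
                        g.add_edge((r, c), (rr, cc), weight=w)
            return g

        def least_cost_distance(resistance, src, dst):
            return nx.shortest_path_length(grid_graph(resistance), src, dst, weight="weight")

        def barrier_scores(resistance, src, dst, window=1, restored_value=1.0):
            """Improvement in least-cost distance if each window of cells were restored."""
            base = least_cost_distance(resistance, src, dst)
            rows, cols = resistance.shape
            scores = np.zeros_like(resistance, dtype=float)
            for r in range(rows - window + 1):
                for c in range(cols - window + 1):
                    trial = resistance.copy()
                    trial[r:r + window, c:c + window] = restored_value  # hypothetical restoration
                    scores[r, c] = base - least_cost_distance(trial, src, dst)
            return scores  # larger value = restoring this spot improves connectivity more

        # Toy landscape: a high-resistance band separates the two core-area cells.
        resistance = np.ones((6, 6))
        resistance[:, 3] = 50.0
        print(barrier_scores(resistance, (0, 0), (5, 5)).round(1))

    In this toy case every cell of the high-resistance band scores highly and everything else scores near zero, which is the pattern the method uses to flag restorable barriers rather than intact corridors.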

    The Evolution of Central Volcanoes in Ultraslow Rift Systems: Constraints From D. Joao de Castro Seamount, Azores

    The Dom Joao de Castro seamount in the Hirondelle Basin (Azores) is a central volcano on the ultraslow-diverging Terceira Rift axis. The combination of structural and geochemical data provides insights into the evolution of central volcanoes in oceanic rift systems above the Azores melting anomaly. The orientation of fault scarps and volcanic structures at D. Joao de Castro and the adjacent Castro fissure zone indicates that the regional SW-NE extending stress field dominates the morphology of the NW Hirondelle Basin. The regional tectonic stress field controls the crustal melt pathways and leads to dike emplacement along fissure zones and the prevalent eruption of mafic lavas. The occurrence of mafic to felsic lavas at D. Joao de Castro provides evidence for both a deep and a shallow crustal melt reservoir generating a subordinate local stress field at the seamount. New Sr-Nd-Pb isotope data along with incompatible trace element ratios indicate that D. Joao de Castro and the Castro Ridges originated from a similarly heterogeneous mantle source but did not form simultaneously. Our new model implies that central volcanoes along the Terceira Rift form by the growth of volcanic ridges and transition into circular edifices after magmatic systems generate local changes in the regional lithospheric stress field. The geometry of D. Joao de Castro and other magmatic systems along the Terceira Rift, combined with the alkaline nature of the erupted lavas and the large lithosphere thickness, indicates that young oceanic rifts are more similar to continental rifts than to mid-ocean ridges.
    Peer reviewed

    QED theory of the nuclear recoil effect on the atomic g factor

    The quantum electrodynamic theory of the nuclear recoil effect on the atomic g factor to all orders in αZ and to first order in m/M is formulated. The complete αZ-dependence formula for the recoil correction to the bound-electron g factor in a hydrogenlike atom is derived. This formula is used to calculate the recoil correction to the bound-electron g factor in the order (αZ)^2 m/M for an arbitrary state of a hydrogenlike atom.
    Comment: 17 pages

    Recoil correction to the ground state energy of hydrogenlike atoms

    The recoil correction to the ground state energy of hydrogenlike atoms is calculated to all orders in αZ in the range Z = 1-110. The nuclear size corrections to the recoil effect are partially taken into account. In the case of hydrogen, the relativistic recoil correction beyond the Salpeter contribution and the nonrelativistic nuclear size correction to the recoil effect amount to -7.2(2) kHz. The total recoil correction to the ground state energy in hydrogenlike uranium (^{238}U^{91+}) constitutes 0.46 eV.
    Comment: 16 pages, 1 figure (eps), LaTeX, submitted to Phys. Rev.

    The legacy of 1300 years of land use in Jamaica

    Despite decades of archaeological research on Jamaica, little is known about how settlers influenced landscape change on the island over time. Here, we examine the impact of human occupation through a multi-proxy approach using phytolith, charcoal, and stratigraphic analyses. White Marl was a continuously inhabited village settlement (ca. 1050–450 cal yrs BP) with large mounded midden areas, precolonial house structures, and human landscape management practices. We have shown that the local vegetation at White Marl was directly affected by human settlement through the use of agroforestry and burning, and suggest that fire was used to modify vegetation. Manioc phytoliths were found throughout human occupation and are broadly associated with increases in evidence for burning, suggesting fire was used to modify the landscape and clear vegetation for crop cultivation. The phytolith assemblages relate to three distinct temporal vegetation phases: (1) the earliest occupation dominated by arboreal vegetation (pre-ca. 870 cal yrs BP); (2) a transition to palm-dominated vegetation (ca. 870–670 cal yrs BP); and (3) the latest occupation representing European colonization associated with a more open, grass-dominated landscape (after ca. 670 cal yrs BP). These transitions occur independently of changes in paleoclimate records, suggesting humans were the dominant driver of vegetation change.
    Contents:
    Introduction
    Archaeological context
    Archaeobotany in Jamaica
    The White Marl site
    Materials and methods
    - Sampling, stratigraphic analysis, and recording
    - Phytoliths
    - Phytolith extraction
    - Phytolith identification, counting, and quantification
    - Charcoal extraction and quantification
    Results
    - Vegetation phase 1: Arboreal-dominated canopy
    - Vegetation phase 2: Palm-dominated canopy
    - Vegetation phase 3: Open grassland-dominated landscape
    - Crops
    - Burning indicators
    - Vegetation changes and climate
    Discussion
    Conclusion