    A moored array along the southern boundary of the Brazil Basin for the Deep Basin Experiment: report on a joint experiment 1991-1992

    The Deep Basin Experiment (DBE) is an international effort and a part of the World Ocean Circulation Experiment with the principal objective of improving our knowledge of the subthermocline circulation. The DBE fieldwork is focused on the Brazil Basin, and this report is concerned with a moored array situated along its southern boundary, which was installed in early 1991 to measure the inflow and outflow to the Basin and to investigate the Brazil Current near 30°S. This moored array was a joint undertaking by the Institut für Meereskunde of the University of Kiel and the Woods Hole Oceanographic Institution. Moorings were deployed on Meteor cruise 15, leg 1 and retrieved on Meteor cruise 22, legs 3 and 4. A total of 57 conventional current meters and two Acoustic Doppler Current Profilers were set on 13 moorings, with some concentration within the Brazil Current and the Vema Channel. CTDs were taken at each mooring site as well as in between. Some of the recovered instruments were reset in the Hunter Channel, a suspected additional connection between the Argentine Basin and the Brazil Basin. A later report will summarize these data after they are recovered in May 1994. Funding was provided by the Deutsche Forschungsgemeinschaft (Si 111/38-1, Si 111/39-1), the Bundesministerium für Forschung und Technologie (03F0535A, 03F0050D), and the National Science Foundation under Grant OCE-9004396.

    The Baum-Connes Conjecture via Localisation of Categories

    We redefine the Baum-Connes assembly map using simplicial approximation in the equivariant Kasparov category. This new interpretation is ideal for studying functorial properties and gives analogues of the assembly maps for all equivariant homology theories, not just for the K-theory of the crossed product. We extend many of the known techniques for proving the Baum-Connes conjecture to this more general setting.
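    For orientation, the assembly map referred to above has the following classical form (standard in the Baum-Connes literature; the paper's categorical redefinition via localisation is not reproduced here). For a locally compact group G and a G-C*-algebra A:

```latex
% Classical Baum-Connes assembly map, stated for orientation only;
% this is NOT the categorical reformulation introduced in the paper.
\[
  \mu_{A}\colon \; K^{G}_{*}\bigl(\underline{E}G;\,A\bigr)
  \;\longrightarrow\;
  K_{*}\bigl(A \rtimes_{r} G\bigr)
\]
% Here \underline{E}G is the classifying space for proper G-actions and
% A \rtimes_r G is the reduced crossed product; the conjecture (with
% coefficients) asserts that \mu_A is an isomorphism.
```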

    Measuring structure deformations of a composite glider by optical means with on-ground and in-flight testing

    © 2016 IOP Publishing Ltd. In aeronautical research, experimental data sets of high quality are essential to verify and improve simulation algorithms. For this reason the experimental techniques need to be constantly refined. The shape, movement or deformation of structural aircraft elements can be measured implicitly in multiple ways; however, only optical, correlation-based techniques are able to deliver direct high-order and spatial results. In this paper two different optical metrologies are used for on-ground preparation and the actual execution of in-flight wing deformation measurements on a PW-6U glider. Firstly, the commercial PONTOS system is used for static tests on the ground and for wind tunnel investigations to successfully certify an experimental sensor pod mounted on top of the test bed fuselage. Secondly, a modification of the glider is necessary to implement the optical method named image pattern correlation technique (IPCT), which has been developed by the German Aerospace Center (DLR). This scientific technology uses a stereoscopic camera set-up placed inside the experimental pod and a stochastic dot matrix applied to the area of interest on the glider wing to measure the deformation of the upper wing surface in flight. The flight test installation, including the preparation, is described and results are presented briefly. Focusing on the compensation for typical error sources, the paper concludes with a recommended procedure to enhance the data processing for better results. Within the presented project, IPCT has been developed and optimized for a new type of test bed. Adapted to the special requirements of the glider, the IPCT measurements were able to deliver a valuable wing deformation database which can now be used to improve corresponding numerical models and simulations.
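    As a rough illustration of the correlation principle behind such dot-pattern measurements (a minimal sketch, not the DLR IPCT implementation: camera calibration, stereo reconstruction and sub-pixel refinement are omitted, and all window sizes are made up), the in-plane displacement of a speckled patch between a reference and a deformed image can be found at the peak of the normalized cross-correlation:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def find_displacement(ref_img, def_img, top_left, win=32, search=10):
    """Integer-pixel displacement of a window between reference and deformed image."""
    y0, x0 = top_left
    ref_patch = ref_img[y0:y0 + win, x0:x0 + win]
    best, best_dy, best_dx = -2.0, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + win > def_img.shape[0] or x + win > def_img.shape[1]:
                continue
            score = ncc(ref_patch, def_img[y:y + win, x:x + win])
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx, best

# Synthetic demo: a random dot pattern shifted by (3, -2) pixels.
rng = np.random.default_rng(1)
ref = rng.random((128, 128))
deformed = np.roll(ref, shift=(3, -2), axis=(0, 1))
print(find_displacement(ref, deformed, top_left=(48, 48)))  # -> (3, -2, ~1.0)
```

    In the actual technique, such displacement fields from two calibrated cameras are combined into a 3D surface shape, from which the wing deformation is derived.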

    Predicting the risk of falling – efficacy of a risk assessment tool compared to nurses' judgement: a cluster-randomised controlled trial [ISRCTN37794278]

    BACKGROUND: Older people living in nursing homes are at high risk of falling because of their general frailty and multiple pathologies. Prediction of falls might lead to an efficient allocation of preventive measures. Although several tools to assess the risk of falling have been developed, their impact on clinically relevant endpoints has never been investigated. The present study will evaluate the clinical efficacy and consequences of different fall risk assessment strategies. STUDY DESIGN: Cluster-randomised controlled trial with nursing home clusters randomised either to the use of a standard fall risk assessment tool alongside nurses' clinical judgement or to nurses' clinical judgement alone. Standard care of all clusters will be optimised by structured education on best evidence strategies to prevent falls and fall-related injuries. 54 nursing home clusters including 1,080 residents will be recruited. Residents must be ≥ 70 years, not bedridden, and living in the nursing home for more than three months. The primary endpoint is the number of participants with at least one fall at 12 months. Secondary outcome measures are the number of falls and the clinical consequences, including side effects, of the two risk assessment strategies. Other measures are fall-related injuries, hospital admissions and consultations with a physician, and costs.
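    As a purely illustrative sketch of the allocation step (hypothetical seed and cluster identifiers; not the trial's actual randomisation procedure), a 1:1 cluster randomisation of the 54 nursing homes could look like this:

```python
import random

# Illustrative 1:1 cluster randomisation of 54 nursing homes into the two
# study arms described in the abstract. The seed and cluster identifiers are
# arbitrary placeholders; the trial's real allocation procedure is not shown.

N_CLUSTERS = 54
clusters = [f"home_{i:02d}" for i in range(1, N_CLUSTERS + 1)]

rng = random.Random(2024)          # fixed seed for a reproducible allocation list
shuffled = clusters[:]
rng.shuffle(shuffled)

allocation = {
    "assessment_tool_plus_judgement": sorted(shuffled[:N_CLUSTERS // 2]),
    "clinical_judgement_only":        sorted(shuffled[N_CLUSTERS // 2:]),
}

for arm, homes in allocation.items():
    print(arm, len(homes), homes[:3], "...")
```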

    Creation of multiple nanodots by single ions

    In the challenging search for tools that are able to modify surfaces on the nanometer scale, heavy ions with energies of several tens of MeV are becoming more and more attractive. In contrast to slow ions, where nuclear stopping is important and the energy is dissipated into a large volume in the crystal, in the high energy regime the stopping is due to electronic excitations only. Because of the extremely local (< 1 nm) energy deposition with densities of up to 10^19 W/cm^2, nanoscaled hillocks can be created under normal incidence. Usually, each nanodot is due to the impact of a single ion and the dots are randomly distributed. We demonstrate that multiple periodically spaced dots separated by a few tens of nanometers can be created by a single ion if the sample is irradiated under grazing angles of incidence. By varying this angle the number of dots can be controlled.
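    One simplified way to picture why the grazing angle controls the dot count (a geometric assumption made here for illustration, not the mechanism quantified in the paper): if dots can only form while the ion travels within a shallow depth of the surface, the length of that near-surface path, and with it the number of equally spaced dots, grows as the angle of incidence decreases.

```python
import math

# Simplified geometric picture (an assumption for illustration only): an ion
# entering at grazing angle theta stays within a near-surface depth d over a
# lateral distance of roughly d / tan(theta). If hillocks form with a fixed
# spacing s along this stretch, the dot count scales roughly as 1 / tan(theta).
# d_nm and s_nm are made-up values ("a few tens of nm" spacing).

d_nm = 5.0     # assumed "active" depth below the surface, in nm
s_nm = 30.0    # assumed dot spacing along the track, in nm

for theta_deg in (0.5, 1.0, 2.0, 5.0):
    path_nm = d_nm / math.tan(math.radians(theta_deg))
    n_dots = max(1, int(path_nm // s_nm))
    print(f"theta = {theta_deg:4.1f} deg: near-surface path ~ {path_nm:7.1f} nm, ~ {n_dots} dots")
```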

    Models of natural pest control: Towards predictions across agricultural landscapes

    Natural control of invertebrate crop pests has the potential to complement or replace conventional insecticide-based practices, but its mainstream application is hampered by predictive unreliability across agroecosystems. Inconsistent responses of natural pest control to changes in landscape characteristics have been attributed to ecological complexity and system-specific conditions. Here, we review agroecological models and their potential to provide predictions of natural pest control across agricultural landscapes. Existing models have used a multitude of techniques to represent specific crop-pest-enemy systems at various spatiotemporal scales, but less wealthy regions of the world are underrepresented. A realistic representation of natural pest control across systems appears to be hindered by a practical trade-off between generality and realism. Nonetheless, observations of context-sensitive, trait-mediated responses of natural pest control to land-use gradients indicate the potential of ecological models that explicitly represent the underlying mechanisms. We conclude that modelling natural pest control across agroecosystems should exploit existing mechanistic techniques towards a framework of contextually bound generalizations. Observed similarities in causal relationships can inform the functional grouping of diverse agroecosystems worldwide and the development of the respective models based on general, but context-sensitive, ecological mechanisms. The combined use of qualitative and quantitative techniques should allow the flexible integration of empirical evidence and ecological theory for robust predictions of natural pest control across a wide range of agroecological contexts and levels of knowledge availability. We highlight challenges and promising directions towards developing such a general modelling framework.

    The numerical renormalization group method for quantum impurity systems

    At the beginning of the 1970s, Wilson developed the concept of a fully non-perturbative renormalization group transformation. Applied to the Kondo problem, this numerical renormalization group (NRG) method gave for the first time the full crossover from the high-temperature phase of a free spin to the low-temperature phase of a completely screened spin. The NRG was later generalized to a variety of quantum impurity problems. The purpose of this review is to give a brief introduction to the NRG method, including some guidelines on how to calculate physical quantities, and to survey the development of the NRG method and its various applications over the last 30 years. These applications include variants of the original Kondo problem such as the non-Fermi-liquid behavior in the two-channel Kondo model, dissipative quantum systems such as the spin-boson model, and lattice systems in the framework of the dynamical mean-field theory.
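    To make the iterative structure of the method concrete, the following toy sketch (an illustration only: a spinless, non-interacting Wilson chain with hoppings proportional to Λ^(-n/2), no symmetry bookkeeping, and the customary per-step energy rescaling omitted) shows the add-a-site / diagonalize / truncate loop at the heart of the NRG:

```python
import numpy as np

# Toy NRG-style iterative diagonalization of a spinless, non-interacting
# Wilson chain. Real NRG treats interacting impurity models (Anderson/Kondo),
# exploits symmetries and rescales the Hamiltonian at each step; none of that
# is reproduced here. Parameter values are illustrative.

LAMBDA = 2.0      # logarithmic discretization parameter
N_SITES = 20      # number of Wilson-chain sites to add
N_KEEP = 64       # number of low-energy states kept after each truncation
EPS_IMP = -0.1    # impurity on-site energy (illustrative)

# Local operators for one spinless fermionic site (basis |0>, |1>)
c_loc = np.array([[0.0, 1.0], [0.0, 0.0]])   # annihilation operator
n_loc = c_loc.T @ c_loc                       # number operator
parity_loc = np.diag([1.0, -1.0])             # fermion parity (-1)^n

# Start with the impurity site alone
H = EPS_IMP * n_loc          # Hamiltonian in the current (truncated) basis
c_last = c_loc.copy()        # annihilation operator of the last added site
parity = parity_loc.copy()   # total fermion parity in the current basis

for n in range(N_SITES):
    t_n = LAMBDA ** (-n / 2.0)   # simplified Wilson-chain hopping amplitude

    # Enlarge the Hilbert space by one site via tensor products.
    H_big = np.kron(H, np.eye(2))
    c_new = np.kron(parity, c_loc)       # Jordan-Wigner string keeps fermionic signs
    c_old = np.kron(c_last, np.eye(2))
    hop = t_n * (c_old.conj().T @ c_new)  # hopping between last and new site
    H_big += hop + hop.conj().T

    # Diagonalize and truncate to the N_KEEP lowest-energy states.
    evals, evecs = np.linalg.eigh(H_big)
    U = evecs[:, :min(N_KEEP, len(evals))]

    H = U.conj().T @ H_big @ U
    c_last = U.conj().T @ c_new @ U
    parity = U.conj().T @ np.kron(parity, parity_loc) @ U

    print(f"site {n:2d}: lowest excitation energies",
          np.round(evals[:4] - evals[0], 4))
```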

    Graph-based description of tertiary lymphoid organs at single-cell level

    Our aim is to complement observer-dependent approaches of immune cell evaluation in microscopy images with reproducible measures for the spatial composition of lymphocytic infiltrates. Analyzing such patterns of inflammation is becoming increasingly important for therapeutic decisions, for example in transplantation medicine or cancer immunology. We developed a graph-based assessment of lymphocyte clustering in full whole-slide images. Based on cell coordinates detected in the full image, a Delaunay triangulation and distance criteria are used to build neighborhood graphs. The composition of nodes and edges is used for classification, e.g. using a support vector machine. We describe the variability of these infiltrates on CD3/CD20 duplex staining in renal biopsies of long-term functioning allografts, in breast cancer cases, and in lung tissue of cystic fibrosis patients. The assessment includes automated cell detection, identification of regions of interest, and classification of lymphocytic clusters according to their degree of organization. We propose a neighborhood feature which considers the occurrence of edges of a certain type in the graph to distinguish between phenotypically different immune infiltrates. Our work addresses a medical need and provides a scalable framework that can be easily adjusted to the requirements of different research questions.
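    A minimal sketch of the graph-construction step described above, assuming synthetic cell coordinates and CD3/CD20 labels and an illustrative 30 µm edge cut-off (the paper's cell detection pipeline, thresholds and classifier training are not reproduced):

```python
import numpy as np
from collections import Counter
from scipy.spatial import Delaunay

# Build a cell neighborhood graph from detected cell centroids via Delaunay
# triangulation plus a distance criterion, then count edge types
# (CD3-CD3, CD3-CD20, CD20-CD20) as simple features for a downstream
# classifier such as an SVM. Coordinates, labels and cut-off are synthetic.

rng = np.random.default_rng(42)
coords = rng.uniform(0, 500, size=(300, 2))              # cell positions in µm
labels = rng.choice(["CD3", "CD20"], size=coords.shape[0])

MAX_EDGE_UM = 30.0   # discard triangulation edges longer than this

tri = Delaunay(coords)
edges = set()
for simplex in tri.simplices:                            # each simplex is a triangle
    for i in range(3):
        a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
        if np.linalg.norm(coords[a] - coords[b]) <= MAX_EDGE_UM:
            edges.add((a, b))

# Edge-type histogram: occurrence of edges of a certain type, usable as a
# feature vector (e.g. for sklearn.svm.SVC).
edge_types = Counter("-".join(sorted((labels[a], labels[b]))) for a, b in edges)
feature_vector = [edge_types.get(k, 0) for k in ("CD3-CD3", "CD20-CD3", "CD20-CD20")]
print(edge_types, feature_vector)
```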

    Reconciliation of essential process parameters for an enhanced predictability of Arctic stratospheric ozone loss and its climate interactions

    Significant reductions in stratospheric ozone occur inside the polar vortices each spring when chlorine radicals produced by heterogeneous reactions on cold particle surfaces in winter destroy ozone, mainly in two catalytic cycles, the ClO dimer cycle and the ClO/BrO cycle. Chlorofluorocarbons (CFCs), which are responsible for most of the chlorine currently present in the stratosphere, have been banned by the Montreal Protocol and its amendments, and the ozone layer is predicted to recover to 1980 levels within the next few decades. During the same period, however, climate change is expected to alter the temperature, circulation patterns and chemical composition in the stratosphere, and possible geo-engineering ventures to mitigate climate change may lead to additional changes. Realistically predicting the response of the ozone layer to such influences requires the correct representation of all relevant processes. The European project RECONCILE has comprehensively addressed remaining questions in the context of polar ozone depletion, with the objective to quantify the rates of some of the most relevant, yet still uncertain, physical and chemical processes. To this end RECONCILE used a broad approach of laboratory experiments, two field missions in the Arctic winter 2009/10 employing the high-altitude research aircraft M55-Geophysica and an extensive match ozone sonde campaign, as well as microphysical and chemical transport modelling and data assimilation. Some of the main outcomes of RECONCILE are as follows. (1) Vortex meteorology: the 2009/10 Arctic winter was unusually cold at stratospheric levels during the six-week period from mid-December 2009 until the end of January 2010, with reduced transport and mixing across the polar vortex edge; the stability of the polar vortex, influenced by dynamic processes in the troposphere, led to unprecedented, synoptic-scale stratospheric regions with temperatures below the frost point; in these regions stratospheric ice clouds have been observed, extending over more than 10^6 km^2 for more than 3 weeks. (2) Particle microphysics: heterogeneous nucleation of nitric acid trihydrate (NAT) particles in the absence of ice has been unambiguously demonstrated; conversely, the synoptic-scale ice clouds also appear to nucleate heterogeneously; a variety of possible heterogeneous nuclei has been characterised by chemical analysis of the non-volatile fraction of the background aerosol; substantial formation of solid particles and denitrification via their sedimentation has been observed and model parameterizations have been improved. (3) Chemistry: strong evidence has been found for significant chlorine activation not only on polar stratospheric clouds (PSCs) but also on cold binary aerosol; laboratory experiments and field data on the ClOOCl photolysis rate and other kinetic parameters have been shown to be consistent with an adequate degree of certainty; no evidence has been found that would support the existence of yet unknown chemical mechanisms making a significant contribution to polar ozone loss. (4) Global modelling: results from process studies have been implemented in a prognostic chemistry climate model (CCM); simulations with improved parameterisations of processes relevant for polar ozone depletion are evaluated against satellite data and other long-term records using data assimilation and detrended fluctuation analysis.
    Finally, measurements and process studies within RECONCILE were also applied to the winter 2010/11, when special meteorological conditions led to the highest chemical ozone loss ever observed in the Arctic. In addition to quantifying the 2010/11 ozone loss and understanding its causes, including possible connections to climate change, its impacts were addressed, such as changes in surface ultraviolet (UV) radiation in the densely populated northern mid-latitudes.
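    For reference, the ClO dimer cycle mentioned above follows the standard textbook scheme (general stratospheric chemistry, not a RECONCILE-specific result):

```latex
% Standard ClO dimer (ClOOCl) catalytic ozone-loss cycle; textbook chemistry,
% stated here for orientation only.
\begin{align*}
  \mathrm{ClO + ClO + M} &\rightarrow \mathrm{ClOOCl + M} \\
  \mathrm{ClOOCl} + h\nu &\rightarrow \mathrm{Cl + ClOO} \\
  \mathrm{ClOO + M} &\rightarrow \mathrm{Cl + O_2 + M} \\
  2\,[\,\mathrm{Cl + O_3} &\rightarrow \mathrm{ClO + O_2}\,] \\[2pt]
  \text{net:}\quad \mathrm{2\,O_3} &\rightarrow \mathrm{3\,O_2}
\end{align*}
```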