
    Solid immersion lens applications for nanophotonic devices

    Solid immersion lens (SIL) microscopy combines the advantages of conventional microscopy with those of near-field techniques, and is being increasingly adopted across a diverse range of technologies and applications. A comprehensive overview of the state of the art in this rapidly expanding subject is therefore increasingly relevant. Important benefits are enabled by SIL focusing, including improved lateral and axial spatial resolution when a SIL is used in laser-scanning microscopy or excitation, and improved collection efficiency when a SIL is used in a light-collection mode, for example in fluorescence micro-spectroscopy. These advantages arise from the increase in numerical aperture (NA) provided by a SIL. Other SIL-enhanced improvements, for example spherical-aberration-free sub-surface imaging, are a fundamental consequence of the aplanatic imaging condition that results from the spherical geometry of the SIL. Beginning with an introduction to the theory of SIL imaging, the unique properties of SILs are shown to provide advantages in applications involving the interrogation of photonic and electronic nanostructures. Such applications range from the sub-surface examination of the complex three-dimensional microstructures fabricated in silicon integrated circuits to quantum photoluminescence and transmission measurements in semiconductor quantum dot nanostructures.
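
    As context (not part of the abstract above), the NA gain referred to here follows the standard SIL scaling relations; the formulas and the worked numbers below are a reminder added for orientation, with the illustrative values assumed rather than taken from the paper:
        \[
        \mathrm{NA}_{\mathrm{eff}} = n\,\mathrm{NA}_{0} \ \text{(hemispherical SIL)}, \qquad
        \mathrm{NA}_{\mathrm{eff}} = n^{2}\,\mathrm{NA}_{0} \ \text{(superhemispherical SIL, bounded above by } n\text{)},
        \]
        \[
        \delta x \approx \frac{\lambda}{2\,\mathrm{NA}_{\mathrm{eff}}}.
        \]
    For example, assuming a silicon SIL (n ≈ 3.5) with an NA_0 = 0.5 objective at λ ≈ 1.3 ÎŒm, the hemispherical case gives NA_eff ≈ 1.75 and a diffraction-limited lateral resolution of roughly 0.37 ÎŒm, which is the kind of gain exploited in sub-surface imaging of silicon integrated circuits.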

    Does nature conservation enhance ecosystem services delivery?

    Whilst a number of studies have examined the effects of biodiversity conservation on the delivery of ecosystem services, they have often been limited by the scope of the ecosystem services (ES) assessed and often suffer from confounding spatial issues. This paper examines the impacts of nature conservation (designation) on the delivery of a full suite of ES across nine case studies in the UK, using expert opinion. The case studies covered a range of habitats and explored the delivery of ES from a ‘protected site’ and a comparable ‘non-protected’ site. By conducting pair-wise comparisons between comparable sites, our study is one of the first to attempt to mitigate confounding cause-and-effect factors relating to spatial context in correlative studies. Protected sites delivered higher levels of ecosystem services than non-protected sites, with the main differences being in the cultural and regulating ecosystem services. Against expectations, there was no consistent negative impact of protection on provisioning services across the case studies. Whilst the analysis demonstrated general patterns and differences in ES delivery between protected and non-protected sites, the individual responses in each case study highlight the importance of the social, biophysical, economic and temporal context of individual protected areas and their associated management.
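
    As context (not part of the abstract above), the pair-wise design described here lends itself to a matched-pairs analysis. The sketch below is a purely hypothetical illustration: the expert-opinion scores, the choice of a Wilcoxon signed-rank test, and all names are assumptions, not the authors' method or data.

        # Hypothetical sketch of a paired (site-by-site) comparison of ecosystem-service
        # scores between protected and matched non-protected sites.
        from scipy.stats import wilcoxon

        # Illustrative expert-opinion ES scores (0-5) for nine matched site pairs.
        protected     = [4, 5, 3, 4, 4, 5, 3, 4, 5]
        non_protected = [3, 4, 3, 2, 4, 3, 2, 3, 4]

        # Pairing each protected site with its comparable non-protected site is what
        # controls for spatial context in this kind of correlative comparison.
        stat, p = wilcoxon(protected, non_protected)
        print(f"Wilcoxon signed-rank statistic = {stat}, p = {p:.3f}")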

    Type-Decomposition of a Pseudo-Effect Algebra

    The theory of direct decomposition of a centrally orthocomplete effect algebra into direct summands of various types utilizes the notion of a type-determining (TD) set. A pseudo-effect algebra (PEA) is a (possibly) noncommutative version of an effect algebra. In this article we develop the basic theory of centrally orthocomplete PEAs, generalize the notion of a TD set to PEAs, and show that TD sets induce decompositions of centrally orthocomplete PEAs into direct summands. Comment: 18 pages.
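
    As a reminder (not part of the abstract above), an effect algebra is a structure $(E; \oplus, 0, 1)$ with a partial binary operation $\oplus$ satisfying, for all $a, b, c \in E$:
        \[
        \begin{aligned}
        &\text{(E1)}\quad a \oplus b = b \oplus a \ \text{whenever either side is defined;}\\
        &\text{(E2)}\quad \text{if } a \oplus b \text{ and } (a \oplus b) \oplus c \text{ are defined, then } b \oplus c \text{ and } a \oplus (b \oplus c) \text{ are defined and equal } (a \oplus b) \oplus c;\\
        &\text{(E3)}\quad \text{for each } a \text{ there is a unique } a' \text{ with } a \oplus a' = 1;\\
        &\text{(E4)}\quad \text{if } a \oplus 1 \text{ is defined, then } a = 0.
        \end{aligned}
        \]
    A pseudo-effect algebra drops (E1), so the partial sum need not be commutative and an element's left and right complements may differ; the article develops the central-orthocompleteness and type-decomposition machinery in that noncommutative setting.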

    A Monitor of Beam Polarization Profiles for the TRIUMF Parity Experiment

    TRIUMF experiment E497 is a study of parity violation in pp scattering at an energy where the leading term in the analyzing power is expected to vanish, thus measuring a unique combination of flavour-conserving weak-interaction terms. It is desired to reach a sensitivity of 2×10^-8 in both statistical and systematic errors. The leading systematic errors depend on transverse polarization components and, at least, the first moment of transverse polarization. A novel polarimeter that measures profiles of both transverse components of polarization as a function of position is described. Comment: 19 pages LaTeX, 10 PostScript figures. To appear in Nuclear Instruments and Methods in Physics Research, Section A.
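
    As context (not part of the abstract above), the "first moment of transverse polarization" that drives such systematic errors is conventionally an intensity-weighted moment over the beam spot; one common, schematic way to write it (an assumed form, not quoted from the paper) is
        \[
        \langle x\,P_{y} \rangle \;=\; \frac{\int x\,P_{y}(x,y)\,I(x,y)\,\mathrm{d}x\,\mathrm{d}y}{\int I(x,y)\,\mathrm{d}x\,\mathrm{d}y},
        \]
    with an analogous moment $\langle y\,P_{x} \rangle$. A profile polarimeter measures $P_{x}(x,y)$ and $P_{y}(x,y)$ across the beam, so such moments can be evaluated rather than only the spot-averaged polarization.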

    The G0 Experiment: Apparatus for Parity-Violating Electron Scattering Measurements at Forward and Backward Angles

    In the G0 experiment, performed at Jefferson Lab, the parity-violating elastic scattering of electrons from protons and quasi-elastic scattering from deuterons are measured in order to determine the neutral weak currents of the nucleon. Asymmetries as small as 1 part per million in the scattering of a polarized electron beam are determined using a dedicated apparatus. It consists of specialized beam-monitoring and control systems, a cryogenic hydrogen (or deuterium) target, and a superconducting, toroidal magnetic spectrometer equipped with plastic scintillation and aerogel Cherenkov detectors, as well as fast readout electronics for the measurement of individual events. The overall design and performance of this experimental system are discussed. Comment: Submitted to Nuclear Instruments and Methods.
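
    As a general reminder of the measurement principle (not a detail taken from this abstract), the raw parity-violating asymmetry is formed from the helicity-correlated detector yields $Y^{\pm}$ and corrected for the beam polarization $P_{b}$ (ignoring backgrounds and other dilutions):
        \[
        A_{\mathrm{meas}} = \frac{Y^{+} - Y^{-}}{Y^{+} + Y^{-}}, \qquad
        A_{\mathrm{phys}} \approx \frac{A_{\mathrm{meas}}}{P_{b}},
        \]
    and the statistical uncertainty scales as $1/\sqrt{N}$ in the total number of detected events, which is why very large event samples and fast readout electronics are needed to reach part-per-million precision.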

    Stable isotope food-web analysis and mercury biomagnification in polar bears (Ursus maritimus)

    Mercury (Hg) biomagnification occurs in many ecosystems, resulting in a greater potential for toxicological effects in higher-level trophic feeders. However, Hg transport pathways through different food-web channels are not well known, particularly in high-latitude systems affected by the atmospheric Hg deposition associated with snow and ice. Here, we report stable carbon and nitrogen isotope ratios, and Hg concentrations, determined for 26 late-19th- and early-20th-century polar bear (Ursus maritimus) hair specimens collected from catalogued museum collections. These data elucidate relationships between the high-latitude marine food-web structure and Hg concentrations in polar bears. The carbon isotope compositions of the polar bear hairs suggest that polar bears derive nutrition from coupled food-web channels based in pelagic and sympagic primary producers, whereas the nitrogen isotope compositions indicate that polar bears occupy trophic positions above the fourth level. Our results show a positive correlation between polar bear hair Hg concentrations and δ15N. Interpretation of the stable isotope data in combination with the Hg concentrations tentatively suggests that polar bears participating in predominantly pelagic food webs exhibit higher Hg concentrations than polar bears participating in predominantly sympagic food webs. Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/73930/1/j.1751-8369.2009.00114.x.pd
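
    As context (not part of the abstract above), trophic-position estimates of this kind rest on the standard δ15N enrichment relation; the per-step enrichment factor below is the commonly assumed literature value, not a number from this paper:
        \[
        \mathrm{TL}_{\mathrm{consumer}} \;\approx\; \mathrm{TL}_{\mathrm{base}} \;+\; \frac{\delta^{15}\mathrm{N}_{\mathrm{consumer}} - \delta^{15}\mathrm{N}_{\mathrm{base}}}{\Delta^{15}\mathrm{N}}, \qquad \Delta^{15}\mathrm{N} \approx 3.4\text{‰ per trophic step},
        \]
    so an enrichment of roughly 3.4‰ per step is what underlies inferences that a consumer sits above a given trophic level, and a positive Hg-versus-δ15N correlation is the usual signature of biomagnification up the food web.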

    High-dimensional maximum marginal likelihood item factor analysis by adaptive quadrature

    Although the Bock–Aitkin likelihood-based estimation method for factor analysis of dichotomous item response data has important advantages over classical analysis of item tetrachoric correlations, a serious limitation of the method is its reliance on fixed-point Gauss–Hermite (G-H) quadrature in the solution of the likelihood equations and likelihood-ratio tests. When the number of latent dimensions is large, computational considerations require that the number of quadrature points per dimension be few. But with large numbers of items, the dispersion of the likelihood, given the response pattern, becomes so small that the likelihood cannot be accurately evaluated with the sparse fixed points in the latent space. In this paper, we demonstrate that substantial improvement in accuracy can be obtained by adapting the quadrature points to the location and dispersion of the likelihood surface corresponding to each distinct response pattern in the data. In particular, we show that adaptive G-H quadrature, combined with mean and covariance adjustments at each iteration of an EM algorithm, produces an accurate, fast-converging solution with as few as two points per dimension. Evaluations of this method with simulated data are shown to yield accurate recovery of the generating factor loadings for models of up to eight dimensions. Unlike an earlier application of adaptive Gibbs sampling to this problem by Meng and Schilling, the simulations also confirm the validity of the present method in calculating likelihood-ratio chi-square statistics for determining the number of factors required in the model. Finally, we apply the method to a sample of real data from a test of teacher qualifications. Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/43596/1/11336_2003_Article_1141.pd
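
    As context (not part of the abstract above), the core idea of adaptive G-H quadrature, recentring and rescaling the quadrature grid to the location and dispersion of each response pattern's posterior, can be sketched in one dimension as below. This is an illustrative sketch only: the function adaptive_gh and the toy integrand are assumptions for demonstration, not the authors' multidimensional EM implementation.

        import numpy as np
        from numpy.polynomial.hermite import hermgauss

        def adaptive_gh(g, mu, sigma, n_points=2):
            """Approximate the integral of g over the real line, assuming g is
            concentrated near mu with spread sigma (e.g. a posterior kernel)."""
            x, w = hermgauss(n_points)             # nodes/weights for weight exp(-x^2)
            theta = mu + np.sqrt(2.0) * sigma * x  # shift and scale the nodes
            # Undo the exp(-x^2) weight so the rule integrates g itself.
            return np.sqrt(2.0) * sigma * np.sum(w * np.exp(x**2) * g(theta))

        # Toy check: a normal density integrates to 1, and the adapted rule recovers
        # this with as few as two nodes once mu and sigma match the integrand.
        g = lambda t: np.exp(-0.5 * ((t - 1.2) / 0.3) ** 2) / (0.3 * np.sqrt(2.0 * np.pi))
        print(adaptive_gh(g, mu=1.2, sigma=0.3, n_points=2))  # ~1.0

    With a fixed (non-adapted) grid, the same two nodes would miss a likelihood concentrated away from the origin, which is the failure mode the abstract describes for large numbers of items.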

    Pre-operatieve hemodynamische optimalisatie door middel van het verhogen van het zuurstof-aanbod bij hoog-risico-chirurgie [Preoperative hemodynamic optimization by increasing oxygen delivery in high-risk surgery]

    Preoperative hemodynamic optimization by means of augmenting oxygen delivery is an attractive concept for reducing morbidity after major surgery. Several prospective, controlled, randomised studies have demonstrated a positive effect on mortality and morbidity. However, it is still uncertain which patients will benefit most from such a policy and by what means preoperative optimization is best achieved. We believe that the target of optimization should not be a prefixed value for oxygen delivery, but rather one individualized for each patient ("tune-up"). In this regard, optimizing the preoperative fluid status appears to be the most essential element.
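
    As context (not part of the abstract above), "oxygen delivery" here is the standard haemodynamic quantity defined by the textbook relations
        \[
        \mathrm{DO}_{2} = \mathrm{CO} \times \mathrm{CaO}_{2} \times 10, \qquad
        \mathrm{CaO}_{2} \approx (1.34 \times \mathrm{Hb} \times \mathrm{SaO}_{2}) + (0.003 \times \mathrm{PaO}_{2}),
        \]
    with CO the cardiac output in L/min, Hb in g/dL, SaO2 as a fraction, PaO2 in mmHg, CaO2 in mL O2/dL and DO2 in mL O2/min. Augmenting oxygen delivery therefore means raising CO (typically with fluids and/or inotropes) and/or CaO2, which is why the authors emphasise preoperative fluid status.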
