
    Low-temperature thermochronology and thermokinematic modeling of deformation, exhumation, and development of topography in the central Southern Alps, New Zealand

    Apatite and zircon (U-Th)/He and fission track ages were obtained from ridge transects across the central Southern Alps, New Zealand. Interpretation of local profiles is difficult because relationships between ages and topography or local faults are complex and the data contain large uncertainties, with poor reproducibility between sample duplicates. The data do form regional patterns, however, consistent with theoretical systematics and corroborating previous observations: young Neogene ages occur immediately southeast of the Alpine Fault (the main plate boundary structure on which rocks are exhumed); partially reset ages occur in the central Southern Alps; and older Mesozoic ages occur further toward the southeast. Zircon apparent ages are older than apatite apparent ages for the equivalent method. Three-dimensional thermokinematic modeling of plate convergence incorporates advection of the upper Pacific plate along a low-angle detachment and then up an Alpine Fault ramp, adopting a generally accepted tectonic scenario for the Southern Alps. The modeling incorporates heat flow, evolving topography, and the detailed kinetics of different thermochronometric systems, and explains both complex local variations and regional patterns. Inclusion of the effects of radiation damage on He diffusion in detrital apatite is shown to have dramatic effects on results. Geometric and velocity parameters are tuned to fit model ages to observed data. The best fit is achieved at 9 mm a⁻¹ plate convergence, with Pacific plate delamination on a gentle 10° SE-dipping detachment and more rapid uplift on a 45–60° dipping Alpine Fault ramp from 15 km depth. Thermokinematic modeling suggests dip-slip motion on reverse faults within the Southern Alps should be highest ∼22 km from the Alpine Fault and much lower toward the southeast.
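    The tuning step described above can be pictured with a very simplified, hypothetical sketch: predict a cooling age from a closure depth and a rock uplift rate derived from the convergence rate and ramp dip, then grid-search those two parameters against observed ages. This toy one-dimensional, steady-state calculation (with made-up geotherm, closure temperatures, and observations) is only meant to illustrate the misfit-minimisation idea; it is not the paper's three-dimensional thermokinematic model.

```python
import numpy as np

# Illustrative constants only (not values from the study)
GEOTHERM_C_PER_KM = 30.0
CLOSURE_T_C = {"apatite_He": 70.0, "zircon_He": 180.0, "zircon_FT": 240.0}

def predict_age_myr(system, convergence_mm_yr, ramp_dip_deg):
    """Toy steady-state cooling age: closure depth divided by uplift rate.

    The uplift rate is taken as the vertical component of horizontal plate
    convergence carried up a fault ramp of the given dip.
    """
    closure_depth_km = CLOSURE_T_C[system] / GEOTHERM_C_PER_KM
    uplift_mm_yr = convergence_mm_yr * np.tan(np.radians(ramp_dip_deg))
    return closure_depth_km / uplift_mm_yr  # km per (mm/yr) equals Myr

# Hypothetical observations: (system, observed age in Myr, 1-sigma in Myr)
observations = [("apatite_He", 0.4, 0.2),
                ("zircon_He", 1.2, 0.4),
                ("zircon_FT", 2.0, 0.6)]

best = None
for v in np.arange(5.0, 15.5, 0.5):          # convergence rate, mm/yr
    for dip in np.arange(30.0, 75.0, 5.0):   # fault ramp dip, degrees
        chi2 = sum(((predict_age_myr(s, v, dip) - age) / sig) ** 2
                   for s, age, sig in observations)
        if best is None or chi2 < best[0]:
            best = (chi2, v, dip)

print("best fit: %.1f mm/yr convergence, %.0f deg ramp dip (chi2 = %.2f)"
      % (best[1], best[2], best[0]))
```

    In the study itself the forward model additionally tracks heat advection, evolving topography, radiation-damage-dependent He diffusion, and the full kinetics of each thermochronometric system, but the parameter-fitting logic is the same in spirit.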

    Orbital Kondo behavior from dynamical structural defects

    The interaction between an atom moving in a model double-well potential and the conduction electrons is treated using renormalization group methods in next-to-leading logarithmic order. A large number of excited states is taken into account and the Kondo temperature T_K is computed as a function of barrier parameters. We find that for special parameters T_K can be close to 1 K and it can be of the same order of magnitude as the renormalized splitting Δ. However, in the perturbative regime we always find that T_K ≲ Δ, with T_K ≲ 1 K [Aleiner et al., Phys. Rev. Lett. 86, 2629 (2001)]. We also find that Δ remains unrenormalized at energies above the Debye frequency, ω_Debye.

    Statistical disclosure control in tabular data

    Data disseminated by National Statistical Agencies (NSAs) can be classified as either microdata or tabular data. Tabular data is obtained from microdata by crossing one or more categorical variables. Although tables provide aggregated information, they also need to be protected. This chapter is a short introduction to tabular data protection. It contains three main sections. The first one shows the different types of tables that can be obtained, and how they are modeled. The second describes the practical rules used by NSAs for the detection of sensitive cells. Finally, an overview of protection methods is provided, with a particular focus on two of them: the “cell suppression problem” and “controlled tabular adjustment”.
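    For readers unfamiliar with the sensitivity rules mentioned above, two widely used examples are the (n,k)-dominance rule and the p%-rule. The following sketch (with illustrative thresholds; the chapter itself may define or parameterise these differently) shows how a magnitude cell would be tested:

```python
def nk_dominance_sensitive(contributions, n=2, k=0.85):
    """(n,k)-dominance rule: a cell is sensitive if its n largest
    contributors account for more than a fraction k of the cell total."""
    c = sorted(contributions, reverse=True)
    total = sum(c)
    return total > 0 and sum(c[:n]) > k * total

def p_percent_sensitive(contributions, p=0.10):
    """p%-rule: a cell is sensitive if the remaining contributors could
    estimate the largest contribution to within a fraction p of its value,
    i.e. if the cell total minus the two largest contributions is below
    p times the largest contribution."""
    c = sorted(contributions, reverse=True)
    if len(c) < 2:
        return True  # a cell with a single contributor is always disclosive
    remainder = sum(c) - c[0] - c[1]
    return remainder < p * c[0]

# Illustrative magnitude cell with three contributors (made-up values)
cell = [120.0, 30.0, 5.0]
print(nk_dominance_sensitive(cell), p_percent_sensitive(cell))  # True True
```

    Cells flagged as sensitive by rules of this kind are then the input to the protection methods discussed in the chapter, for example suppression (together with complementary cells) or controlled adjustment.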

    Automatic Structure Detection in Constraints of Tabular Data

    Methods for the protection of statistical tabular data, such as controlled tabular adjustment, cell suppression, or controlled rounding, need to solve several linear programming subproblems. For large multidimensional linked and hierarchical tables, such subproblems turn out to be computationally challenging. One of the techniques used to reduce the solution time of mathematical programming problems is to exploit the constraint structure using a specialized algorithm. Two of the most common structures are block-angular matrices with either linking rows (primal block-angular structure) or linking columns (dual block-angular structure). Although constraints associated with tabular data are intrinsically highly structured, current software for tabular data protection neither details nor exploits this structure, and simply provides a single matrix, or at most a set of smaller submatrices. In this work we provide an efficient tool for the automatic detection of primal or dual block-angular structure in constraint matrices. We test it on some of the complex CSPLIB instances, showing that when the number of linking rows or columns is small, the computational savings are significant.
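    The abstract does not spell out the detection algorithm, but the idea can be illustrated with a simple (and certainly cruder) heuristic: repeatedly peel off the densest rows as candidate linking rows until the bipartite row–column graph of what remains splits into two or more connected components, which then form the diagonal blocks of a primal block-angular form. Dual block-angular structure (linking columns) can be probed the same way on the transpose. This sketch is purely illustrative and is not the authors' tool.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import connected_components

def primal_block_angular(A, max_linking=10):
    """Greedy sketch: remove dense rows as candidate linking rows until the
    remaining constraint matrix decomposes into >= 2 diagonal blocks.

    Returns (linking_row_indices, column_block_labels), or None if no
    decomposition is found within the budget of max_linking removed rows.
    """
    A = sp.csr_matrix(A)
    m = A.shape[0]
    linking, remaining = [], np.ones(m, dtype=bool)
    for _ in range(max_linking + 1):
        B = A[remaining]
        # bipartite graph: remaining rows on one side, all columns on the other
        G = sp.bmat([[None, B], [B.T, None]], format="csr")
        _, labels = connected_components(G, directed=False)
        col_labels = labels[B.shape[0]:]
        if len(set(col_labels)) >= 2:        # the matrix falls apart into blocks
            return linking, col_labels
        nnz_per_row = np.where(remaining, np.diff(A.indptr), -1)
        r = int(np.argmax(nnz_per_row))      # densest remaining row
        linking.append(r)
        remaining[r] = False
    return None

# Tiny illustration: two 2x2 diagonal blocks plus one dense linking row
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 1, 1, 1]])
print(primal_block_angular(A))  # ([2], labels grouping columns {0,1} and {2,3})
```

    Once a (near-)block-angular structure is identified, specialised decomposition solvers can exploit it, which is where the computational savings reported in the abstract come from.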

    Exploring the effects of non-monetary reimbursement for participants in HCI research

    When running experiments within the field of Human-Computer Interaction (HCI), it is common practice to ask participants to come to a specified lab location and to reimburse them monetarily for their time and travel costs. This, however, is not the only means by which to encourage participation in scientific study. Citizen science projects, which encourage the public to become involved in scientific research, have had great success in getting people to act as sensors to collect data or to volunteer their idle computer or brain power to classify large data sets across a broad range of fields, including biology, cosmology, and physical and environmental science. This is often done without the expectation of payment. Additionally, data collection need not be done on behalf of an external researcher; the Quantified Self (QS) movement allows people to reflect on data they have collected about themselves. This, too, is a form of non-reimbursed data collection. Here we investigate whether citizen HCI scientists and those interested in personal data produce results as reliable as those from participants in more traditional lab-based studies. Through six studies, we explore how participation rates and data quality are affected by recruiting participants without monetary reimbursement: either by providing participants with data about themselves as a reward (a QS approach), or by simply requesting help with no extrinsic reward (as in citizen science projects). We show that people are indeed willing to take part in online HCI research in the absence of extrinsic monetary reward, and that the data generated by participants who take part for selfless reasons, rather than for monetary reward, can be of as high quality as data gathered in the lab, and may even be of higher quality than data generated by participants given monetary reimbursement online. This suggests that large HCI experiments could be run online in the future without incurring correspondingly large reimbursement costs, while also opening up the possibility of running experiments in environments outside the lab.

    Kondo effect in multielectron quantum dots at high magnetic fields

    We present a general description of low-temperature transport through a quantum dot with any number of electrons at filling factor 1 < ν < 2. We provide a general description of a novel Kondo effect which is turned on by the application of an appropriate magnetic field. The spin-flip scattering of carriers by the quantum dot only involves two states of the scatterer, which may have a large spin. This process is described by spin-flip Hubbard operators, which change the angular momentum, leading to a Kondo Hamiltonian. We obtain antiferromagnetic exchange couplings depending on tunneling amplitudes and correlation effects. Since the Kondo temperature has an exponential dependence on the exchange couplings, quantitative variations of the parameters in different regimes have important experimental consequences. In particular, we discuss the "chess board" aspect of the experimental conductance when represented on a grey scale as a function of both the magnetic field and the gate potential affecting the quantum dot.
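    The exponential sensitivity mentioned above is the usual weak-coupling behaviour of the Kondo scale. Schematically, and up to convention-dependent prefactors (this is the textbook estimate, not an expression taken from the paper):

```latex
% D: bandwidth / high-energy cutoff, \rho: density of states at the Fermi
% level, J > 0: antiferromagnetic exchange coupling obtained from the
% tunneling amplitudes and correlation effects.
T_K \sim D \exp\!\left(-\frac{1}{2\rho J}\right)
```

    Modest changes of J with magnetic field or gate voltage therefore move T_K by orders of magnitude, which is what makes the "chess board" conductance pattern so sensitive to both parameters.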

    Auxiliary particle theory of threshold singularities in photoemission and X-ray absorption spectra: Test of a conserving T-matrix approximation

    We calculate the exponents of the threshold singularities in the photoemission spectrum of a deep core hole and its X-ray absorption spectrum within a systematic many-body theory of slave bosons and pseudofermions (for the empty and occupied core level). In this representation, photoemission and X-ray absorption can be understood on the same footing; no distinction between the orthogonality catastrophe and excitonic effects is necessary. We apply the conserving slave-particle T-matrix approximation (CTMA), recently developed to describe both Fermi and non-Fermi liquid behavior in systems with strong local correlations, to the X-ray problem as a test case. The numerical results for both photoemission and X-ray absorption are found to be in agreement with the exact infrared power-law behavior in the weak- as well as the strong-coupling regimes. We point out a close relation between the CTMA and the parquet equation approach of Nozières et al.
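    The "exact infrared power-law behavior" used as the benchmark here is the classic threshold result for the X-ray problem. In its standard form (quoted for orientation, with δ_l the Fermi-level phase shifts induced by the core-hole potential; sign and normalisation conventions vary between papers):

```latex
% Orthogonality (Anderson) exponent built from the phase shifts:
\alpha = \sum_{l} 2(2l+1)\left(\frac{\delta_l}{\pi}\right)^{2}

% Core-level photoemission spectrum near threshold \omega_{\rm th}:
P(\omega) \propto \theta(\omega-\omega_{\rm th})\,(\omega-\omega_{\rm th})^{\alpha-1}

% X-ray absorption in scattering channel l:
A_l(\omega) \propto \theta(\omega-\omega_{\rm th})\,(\omega-\omega_{\rm th})^{-\alpha_l},
\qquad \alpha_l = \frac{2\delta_l}{\pi} - \alpha
```

    It is against power laws of this form that the CTMA results are compared in both the weak- and strong-coupling regimes.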

    Microdeletion in a FAAH pseudogene identified in a patient with high anandamide concentrations and pain insensitivity

    The study of rare families with inherited pain insensitivity can identify new human-validated analgesic drug targets. Here, a 66-yr-old female presented with nil requirement for postoperative analgesia after a normally painful orthopaedic hand surgery (trapeziectomy). Further investigations revealed a lifelong history of painless injuries, such as frequent cuts and burns, which were observed to heal quickly. We report the causative mutations for this new pain insensitivity disorder: the co-inheritance of (i) a microdeletion in a dorsal root ganglia- and brain-expressed pseudogene, FAAH-OUT, which we cloned from the fatty-acid amide hydrolase (FAAH) chromosomal region; and (ii) a common functional single-nucleotide polymorphism in FAAH conferring reduced expression and activity. Circulating concentrations of anandamide and related fatty-acid amides (palmitoylethanolamide and oleoylethanolamine), which are all normally degraded by FAAH, were significantly elevated in peripheral blood compared with normal control carriers of the hypomorphic single-nucleotide polymorphism. The genetic findings and elevated circulating fatty-acid amides are consistent with a phenotype resulting from enhanced endocannabinoid signalling and a loss of function of FAAH. Our results highlight previously unknown complexity at the FAAH genomic locus involving the expression of FAAH-OUT, a novel pseudogene and long non-coding RNA. These data suggest new routes to develop FAAH-based analgesia by targeting FAAH-OUT, which could significantly improve the treatment of postoperative pain and potentially chronic pain and anxiety disorders. © 2019 The Author(s). Funding: Medical Research Council (Career Development Award G1100340 to JJC); Wellcome Trust (200183/Z/15/Z to JJC; 095698Z/11/Z and 202747/Z/16/Z to DLHB); Alzheimer's Society (research fellowship to JTB); University of Cambridge Academic Foundation Programme (to MCL); Molecular Nociception Group (to MCL); National Institutes of Health (Bethesda, MD, USA) Ruth L. Kirschstein Institutional National Research Service Award (to MCL); Wellcome Trust-funded London Pain Consortium (to JDR); Colciencias through a Francisco Jose de Caldas Scholarship (LASPAU, Harvard University) (to JDR); Canadian Institutes of Health Research (CIHR) (to MNH); CIHR (postdoctoral funding to MM).

    Transform-domain analysis of packet delay in network nodes with QoS-aware scheduling

    In order to differentiate the perceived QoS between traffic classes in heterogeneous packet networks, equipment discriminates between incoming packets based on their class, particularly in the way queued packets are scheduled for further transmission. We first review a common stochastic modelling framework in which scheduling mechanisms can be evaluated, especially with regard to the resulting per-class delay distribution. For this, a discrete-time single-server queue is considered with two classes of packet arrivals, either delay-sensitive (class 1) or delay-tolerant (class 2). The steady-state analysis relies on the use of well-chosen supplementary variables and is mainly carried out in the transform domain. Secondly, we propose and analyse a new type of scheduling mechanism that allows precise control over the amount of delay differentiation between the classes. The idea is to introduce N reserved places in the queue, intended for future arrivals of class 1.
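    As a rough illustration of how such a reservation discipline trades class-1 delay against class-2 delay, here is a minimal discrete-time simulation sketch. It assumes one plausible reading of the mechanism: the queue always holds N reservation markers; an arriving class-1 packet takes the foremost marker's position (jumping ahead of class-2 packets queued behind that marker) and a fresh marker is appended at the tail, while class-2 arrivals simply join the tail. The Bernoulli arrival processes and this exact interpretation are assumptions for illustration, not the model analysed in the paper, which is solved analytically in the transform domain.

```python
import random

def simulate(n_reserved=2, p1=0.15, p2=0.55, slots=200_000, seed=1):
    """Discrete-time, single-server, two-class queue with N reserved places.

    Bernoulli arrivals per slot (p1 for class 1, p2 for class 2), one packet
    served per slot. Returns the mean delay (in slots) per class.
    """
    random.seed(seed)
    queue = ["R"] * n_reserved            # reservation markers
    delays = {1: [], 2: []}
    for t in range(slots):
        if random.random() < p1:          # class-1 arrival
            if "R" in queue:
                i = queue.index("R")      # take the foremost reserved place
                queue[i] = (1, t)
                queue.append("R")         # keep N reservations in the queue
            else:                         # N = 0 degenerates to plain FIFO
                queue.append((1, t))
        if random.random() < p2:          # class-2 arrival joins the tail
            queue.append((2, t))
        for i, item in enumerate(queue):  # serve the foremost real packet
            if item != "R":
                cls, arrival = queue.pop(i)
                delays[cls].append(t - arrival + 1)
                break
    return {c: sum(d) / len(d) for c, d in delays.items() if d}

for n in (0, 1, 4):
    print("N =", n, simulate(n_reserved=n))
```

    Because the discipline in this sketch is work-conserving, increasing N mainly shifts delay from class 1 to class 2, which is the kind of controllable delay differentiation the proposed mechanism aims to provide.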