662 research outputs found

    Retained primary teeth in STAT3 hyper-IgE syndrome: early intervention in childhood is essential

    Get PDF
    Background STAT3 hyper-IgE syndrome (STAT3-HIES) is a rare primary immunodeficiency that clinically overlaps with atopic dermatitis. In addition to eczema, elevated serum IgE, and recurrent infections, STAT3-HIES patients suffer from characteristic facies, midline defects, and retained primary teeth. To optimize dental management, we assessed the development of dentition and the long-term outcomes of dental treatment in 13 molecularly defined STAT3-HIES patients using questionnaires, radiographs, and dental investigations. Results Primary tooth eruption was unremarkable in all STAT3-HIES patients evaluated. Primary tooth exfoliation and permanent tooth eruption were delayed in 83% of patients due to unresorbed tooth roots. Complex orthodontic treatment was needed for one patient whose primary molars and canines were extracted late. Permanent teeth erupted spontaneously in all patients whose retained primary teeth were extracted within the average physiologic exfoliation period. Conclusions The association of STAT3-HIES with retained primary teeth is important knowledge for dentists and physicians, as timely extraction of retained primary teeth prevents dental complications. To enable spontaneous eruption of permanent teeth in children with STAT3-HIES, we recommend extracting retained primary incisors when the patient is no older than 9 years of age, and retained primary canines and molars when the patient is no older than 13 years of age, after the presence of the permanent successor teeth has been confirmed by radiograph.

    Aided communication, mind understanding and co-construction of meaning

    Get PDF
    Mind understanding allows for the adaptation of expressive language to a listener and is a core element when communicating new information to a communication partner. There is limited knowledge about the relationship between aided language and mind understanding. This study investigates this relationship using a communication task. The participants were 71 aided communicators using graphic symbols or spelling for expression (38 girls, 33 boys) and a reference group of 40 speaking children (21 girls, 19 boys), aged 5;0-15;11 years. The task was to describe, but not name, drawings to a communication partner. The partner could not see the drawing and had to infer what was depicted from the child's explanation. Dyads with aided communicators solved fewer items than reference dyads (64% vs 93%). The aided spellers presented more precise details than the symbol users (46% vs 38%). In the aided group, the number of correct items correlated with verbal comprehension and age.

    Entangled Quantum Key Distribution with a Biased Basis Choice

    Full text link
    We investigate a quantum key distribution (QKD) scheme which utilizes a biased basis choice in order to increase the efficiency of the scheme. The optimal bias between the two measurement bases, a more refined error analysis, and finite key size effects are all studied in order to assure the security of the final key generated with the system. We then implement the scheme in a local entangled QKD system that uses polarization-entangled photon pairs to securely distribute the key. A 50/50 non-polarizing beamsplitter with different optical attenuators is used to simulate a variable beamsplitter in order to allow us to study the operation of the system for different biases. Over 6 hours of continuous operation with a total bias of 0.9837/0.0163 (Z/X), we were able to generate 0.4567 secure key bits per raw key bit, as compared to 0.2550 secure key bits per raw key bit for the unbiased case. This represents an increase in the efficiency of the key generation rate by 79%. Comment: v2: Revised paper based on referee reports; the Theory section was revised (primarily regarding finite key effects) and the Results section was almost completely rewritten with more experimental data. 16 pages, 5 figures. v1: 14 pages, 6 figures; higher resolution figures will be available in the published article.
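
    For intuition, the sketch below computes the sifted-key fraction under a biased basis choice, assuming both parties independently pick the Z basis with probability q_z and keep an event only when their bases match. It illustrates the sifting gain only; the secure-key figures quoted above additionally account for error estimation, privacy amplification, and finite-key effects, and the function name is illustrative.

```python
# Sifted-key fraction under a biased basis choice (illustrative sketch).
# Assumption: both parties choose the Z basis with probability q_z and the
# X basis with probability 1 - q_z, independently; an event is kept (sifted)
# only when the two basis choices coincide.

def sifted_fraction(q_z: float) -> float:
    """Probability that both parties measure in the same basis."""
    q_x = 1.0 - q_z
    return q_z ** 2 + q_x ** 2

unbiased = sifted_fraction(0.5)     # 0.500: half of all raw events are discarded
biased = sifted_fraction(0.9837)    # ~0.968 for the bias used in the experiment
print(f"unbiased: {unbiased:.3f}  biased: {biased:.3f}  gain: {biased / unbiased:.2f}x")
```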

    Measurement of (anti)deuteron and (anti)proton production in DIS at HERA

    Get PDF
    The first observation of (anti)deuterons in deep inelastic scattering at HERA has been made with the ZEUS detector at a centre-of-mass energy of 300--318 GeV using an integrated luminosity of 120 pb-1. The measurement was performed in the central rapidity region for transverse momentum per unit of mass in the range 0.3 < p_T/M < 0.7. The particle rates have been extracted and interpreted in terms of the coalescence model. The (anti)deuteron production yield is smaller than the (anti)proton yield by approximately three orders of magnitude, consistent with the world measurements. Comment: 26 pages, 9 figures, 5 tables, submitted to Nucl. Phys.
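
    For reference, the coalescence model mentioned above is commonly written as the relation below, where the coalescence parameter B_2 links the invariant (anti)deuteron yield to the square of the (anti)proton yield taken at half the deuteron momentum; this is the standard textbook form and an assumption here, not an equation quoted from the paper.

```latex
E_d \frac{\mathrm{d}^3 N_d}{\mathrm{d}p_d^3}
  \;=\; B_2 \left( E_p \frac{\mathrm{d}^3 N_p}{\mathrm{d}p_p^3} \right)^{2},
  \qquad \vec{p}_p = \tfrac{1}{2}\,\vec{p}_d .
```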

    Space-QUEST: Experiments with quantum entanglement in space

    Get PDF
    The European Space Agency (ESA) has supported a range of studies in the field of quantum physics and quantum information science in space for several years, and consequently we have submitted the mission proposal Space-QUEST (Quantum Entanglement for Space Experiments) to the European Life and Physical Sciences in Space Program. We propose to perform space-to-ground quantum communication tests from the International Space Station (ISS). We present the proposed experiments in space as well as the design of a space-based quantum communication payload. Comment: 4 pages, 1 figure, accepted for the 59th International Astronautical Congress (IAC) 2008.

    High-E_T dijet photoproduction at HERA

    Get PDF
    The cross section for high-E_T dijet production in photoproduction has been measured with the ZEUS detector at HERA using an integrated luminosity of 81.8 pb-1. The events were required to have a virtuality of the incoming photon, Q^2, of less than 1 GeV^2 and a photon-proton centre-of-mass energy in the range 142 < W < 293 GeV. Events were selected if at least two jets satisfied the transverse-energy requirements of E_T(jet1) > 20 GeV and E_T(jet2) > 15 GeV and the pseudorapidity requirements of -1 < eta(jet1,2) < 3, with at least one of the jets satisfying -1 < eta(jet) < 2.5. The measurements show sensitivity to the parton distributions in the photon and proton and to effects beyond next-to-leading order in QCD. Hence these data can be used to further constrain the parton densities in the proton and photon. Comment: 36 pages, 13 figures, 20 tables, including minor revisions from referees. Accepted by Phys. Rev.
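
    To make the compound jet selection above concrete, the small sketch below applies the E_T and pseudorapidity requirements to a list of jets; it is purely illustrative (jet reconstruction, detector corrections, and the Q^2 and W event cuts are not modelled), and the function and variable names are hypothetical.

```python
# Illustrative dijet selection: a "jet" is an (E_T in GeV, eta) pair.
from typing import List, Tuple

def passes_dijet_selection(jets: List[Tuple[float, float]]) -> bool:
    """True if the two highest-E_T jets satisfy the E_T and eta requirements."""
    if len(jets) < 2:
        return False
    jet1, jet2 = sorted(jets, key=lambda j: j[0], reverse=True)[:2]
    et_ok = jet1[0] > 20.0 and jet2[0] > 15.0                    # E_T(jet1), E_T(jet2)
    eta_ok = all(-1.0 < eta < 3.0 for _, eta in (jet1, jet2))    # both jets within -1 < eta < 3
    central = any(-1.0 < eta < 2.5 for _, eta in (jet1, jet2))   # at least one within -1 < eta < 2.5
    return et_ok and eta_ok and central

print(passes_dijet_selection([(25.0, 1.2), (17.0, 2.8)]))  # True
print(passes_dijet_selection([(25.0, 2.7), (17.0, 2.8)]))  # False: no jet with eta < 2.5
```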

    NeuroBench: Advancing Neuromorphic Computing through Collaborative, Fair and Representative Benchmarking

    Full text link
    The field of neuromorphic computing holds great promise for advancing computing efficiency and capabilities by following brain-inspired principles. However, the rich diversity of techniques employed in neuromorphic research has resulted in a lack of clear standards for benchmarking, hindering effective evaluation of the advantages and strengths of neuromorphic methods compared to traditional deep-learning-based methods. This paper presents a collaborative effort, bringing together members from academia and industry, to define benchmarks for neuromorphic computing: NeuroBench. The goal of NeuroBench is to be a collaborative, fair, and representative benchmark suite developed by the community, for the community. In this paper, we discuss the challenges associated with benchmarking neuromorphic solutions and outline the key features of NeuroBench. We believe that NeuroBench will be a significant step towards defining standards that can unify the goals of neuromorphic computing and drive its technological progress. Please visit neurobench.ai for the latest updates on the benchmark tasks and metrics.

    Recent Developments in Fluorescence Correlation Spectroscopy for Diffusion Measurements in Planar Lipid Membranes

    Get PDF
    Fluorescence correlation spectroscopy (FCS) is a single-molecule technique used mainly to determine the mobility and local concentration of molecules. This review describes the specific problems of FCS in planar systems and surveys state-of-the-art experimental approaches such as 2-focus, Z-scan, or scanning FCS, which overcome most of the artefacts and limitations of standard FCS. We focus on diffusion measurements of lipids and proteins in planar lipid membranes and review the contributions of FCS to elucidating membrane dynamics and the factors influencing it, such as membrane composition, ionic strength, the presence of membrane proteins, or frictional coupling with the solid support.
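
    As a concrete illustration of the standard single-focus FCS analysis that the approaches above refine, the sketch below fits the usual single-component 2D free-diffusion model G(tau) = (1/N) / (1 + tau/tau_D), with tau_D = w0^2 / (4D), to an autocorrelation curve. The data, beam waist, and parameter values are assumed placeholders, not results from the review.

```python
# Fit a single-component 2D free-diffusion FCS model to an autocorrelation curve.
import numpy as np
from scipy.optimize import curve_fit

def g2d(tau, n, tau_d):
    """FCS autocorrelation for free 2D diffusion of a single species."""
    return (1.0 / n) / (1.0 + tau / tau_d)

# Placeholder "measured" curve: lag times in seconds plus synthetic noise.
tau = np.logspace(-5, 0, 60)
g_meas = g2d(tau, n=5.0, tau_d=2e-3) + np.random.normal(0.0, 1e-3, tau.size)

popt, _ = curve_fit(g2d, tau, g_meas, p0=(1.0, 1e-3))
n_fit, tau_d_fit = popt

w0 = 0.25e-6                       # assumed lateral beam waist (m)
D = w0**2 / (4.0 * tau_d_fit)      # diffusion coefficient (m^2/s)
print(f"N = {n_fit:.2f}, tau_D = {tau_d_fit * 1e3:.2f} ms, D = {D * 1e12:.2f} um^2/s")
```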

    NeuroBench: A Framework for Benchmarking Neuromorphic Computing Algorithms and Systems

    Get PDF
    Neuromorphic computing shows promise for advancing the computing efficiency and capabilities of AI applications using brain-inspired principles. However, the neuromorphic research field currently lacks standardized benchmarks, making it difficult to accurately measure technological advancements, compare performance with conventional methods, and identify promising future research directions. Prior neuromorphic computing benchmark efforts have not seen widespread adoption due to a lack of inclusive, actionable, and iterative benchmark design and guidelines. To address these shortcomings, we present NeuroBench: a benchmark framework for neuromorphic computing algorithms and systems. NeuroBench is a collaboratively designed effort from an open community of nearly 100 co-authors across over 50 institutions in industry and academia, aiming to provide a representative structure for standardizing the evaluation of neuromorphic approaches. The NeuroBench framework introduces a common set of tools and a systematic methodology for inclusive benchmark measurement, delivering an objective reference framework for quantifying neuromorphic approaches in both hardware-independent (algorithm track) and hardware-dependent (system track) settings. In this article, we present initial performance baselines across various model architectures on the algorithm track and outline the system track benchmark tasks and guidelines. NeuroBench is intended to continually expand its benchmarks and features to foster and track the progress made by the research community.