
    DELPHI - fast and adaptive computational laser point detection and visual footprint quantification for arbitrary underwater image collections

    Marine researchers continue to create large quantities of benthic images, e.g. using AUVs (Autonomous Underwater Vehicles). In order to quantify the size of sessile objects in the images, a pixel-to-centimeter ratio is required for each image, often indirectly provided through a geometric laser point (LP) pattern projected onto the seafloor. Manual annotation of these LPs in all images is too time-consuming and thus infeasible for today's data volumes. Because of the technical evolution of camera rigs, the geometrical layout and color features of the LPs vary between expeditions and projects, which also makes a single algorithm tuned to one strictly defined LP pattern ineffective. Here we present the web tool DELPHI, which efficiently learns the LP layout for one image transect/collection from just a small number of hand-labeled LPs and applies this layout model to the rest of the data. This efficient adaptation to new data allows the LPs and the pixel-to-centimeter ratio to be computed fully automatically and with high accuracy. DELPHI is applied to two real-world examples and shows clear improvements, both in reducing the tuning effort for new LP patterns and in increasing detection performance.
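The pixel-to-centimeter conversion that DELPHI automates can be illustrated with a minimal sketch. The 10 cm laser spacing and the coordinates below are hypothetical illustration values, not taken from the paper:

```python
import math

def pixel_to_cm_ratio(lp_pixels, spacing_cm=10.0):
    """Estimate a pixel-to-centimeter ratio from two detected laser
    points whose true separation on the seafloor is known.

    lp_pixels: two (x, y) pixel coordinates of detected laser points.
    spacing_cm: known physical distance between the laser beams
                (10 cm is an illustrative value, not DELPHI's).
    """
    (x1, y1), (x2, y2) = lp_pixels
    dist_px = math.hypot(x2 - x1, y2 - y1)
    return dist_px / spacing_cm  # pixels per centimeter

# Two LPs 300 px apart -> 30 px/cm; an object spanning 150 px is 5 cm.
ratio = pixel_to_cm_ratio([(100, 200), (400, 200)], spacing_cm=10.0)
object_size_cm = 150 / ratio
```

With the ratio in hand, any pixel measurement in the same image converts directly to centimeters, which is what enables size quantification of sessile objects at scale.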

    Development of Wireless Techniques in Data and Power Transmission - Application for Particle Physics Detectors

    Wireless techniques have developed extremely fast over the last decade, and using them for data and power transmission in particle physics detectors is no longer science fiction. In recent years several research groups have independently considered making it a reality. Wireless techniques have become a mature field for research, and new developments might have an impact on future particle physics experiments. The Instrumentation Frontier was set up as part of the Snowmass 2013 Community Summer Study [1] to examine instrumentation R&D for particle physics research over the coming decades: "To succeed we need to make technical and scientific innovation a priority in the field". Wireless data transmission was identified as one of the innovations that could revolutionize the transmission of data out of the detector. Power delivery was another challenge mentioned in the same report. We propose a collaboration to identify the specific needs of different projects that might benefit from wireless techniques. The objective is to provide a common platform for research and development in order to optimize effectiveness and cost, with the aim of designing and testing wireless demonstrators for large instrumentation systems.

    759–5 Use of an Interactive Electronic Whiteboard to Teach Clinical Cardiology Decision Analysis to Medical Students

    We used innovative state-of-the-art computer and collaboration technologies to teach first-year medical students an analytic methodology for solving difficult clinical cardiology problems and making informed medical decisions. Clinical examples included the decision to administer thrombolytic therapy considering the risk of hemorrhagic stroke, and activity recommendations for athletes at risk for sudden death. Students received instruction on the decision-analytic approach, which integrates pathophysiology, treatment efficacy, diagnostic test interpretation, health outcomes, patient preferences, and cost-effectiveness into a decision-analytic model. The traditional environment of a small group and blackboard was significantly enhanced by using an electronic whiteboard, the Xerox LiveBoard™. The LiveBoard features an 80486-based personal computer, a large (3'×4') display, and wireless pens for input. It allowed the integration of decision-analytic software, statistical software, digital slides, and additional media. We developed TIDAL (Team Interactive Decision Analysis in the Large-screen environment), a software package to interactively construct decision trees, calculate expected utilities, and perform one- and two-way sensitivity analyses using pen and gesture inputs. The LiveBoard also allowed the novel incorporation of Gambler, a utility assessment program obtained from the New England Medical Center. Gambler was used to obtain utilities for outcomes such as non-disabling hemorrhagic stroke. The interactive nature of the LiveBoard allowed real-time decision model development by the class, followed by instantaneous calculation of expected utilities and sensitivity analyses. The multimedia aspect and interactivity were conducive to extensive class participation. Ten of eleven students wanted decision-analytic software available for use during their clinical years, and all students would recommend the course to next year's students. We plan to experiment with the electronic collaboration features of this technology and allow groups separated by time or space to collaborate on decisions and explore the models created.
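The core computation behind a tool like TIDAL, expected utility at a chance node plus a one-way sensitivity analysis, can be sketched as follows. All probabilities and utilities below are invented for illustration and are not values from the course:

```python
def expected_utility(branches):
    """Expected utility of a chance node: sum of p * utility
    over its branches, given as (probability, utility) pairs."""
    return sum(p * u for p, u in branches)

# Hypothetical thrombolytic-therapy decision: treat vs. no treatment,
# weighing survival benefit against hemorrhagic-stroke risk.
treat = expected_utility([(0.01, 0.20),    # hemorrhagic stroke
                          (0.99, 0.95)])   # survival with benefit
no_treat = expected_utility([(1.0, 0.90)])

def preferred(stroke_risk):
    """One-way sensitivity analysis: vary the stroke risk and see
    where the preferred option flips."""
    eu_treat = expected_utility([(stroke_risk, 0.20),
                                 (1 - stroke_risk, 0.95)])
    return "treat" if eu_treat > no_treat else "no treatment"
```

At the baseline risk the treat branch has the higher expected utility, but sweeping `stroke_risk` upward reveals the threshold at which the recommendation reverses, which is exactly the kind of result the class could compute interactively on the whiteboard.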

    Radiation-driven winds of hot luminous stars XVII. Parameters of selected central stars of PN from consistent optical and UV spectral analysis and the universality of the mass-luminosity relation

    Context: The commonly accepted mass-luminosity relation of central stars of planetary nebulae (CSPNs) might not be universally valid. While earlier optical analyses could not derive masses and luminosities independently (instead taking them from theoretical evolutionary models), hydrodynamically consistent modelling of the stellar winds allows fits to the UV spectra to consistently determine stellar radii, masses, and luminosities without assuming a mass-luminosity relation. Recent application to a sample of CSPNs raised questions regarding the validity of the theoretical mass-luminosity relation of CSPNs. Aims: The results of the earlier UV analysis are reassessed by means of a simultaneous comparison of observed optical and UV spectra with corresponding synthetic spectra. Methods: Using published stellar parameters (a) from a consistent UV analysis and (b) from fits to optical H and He lines, we calculate simultaneous optical and UV spectra with our model atmosphere code, which has been improved by implementing Stark broadening for H and He lines. Results: Spectra computed with the parameter sets from the UV analysis yield good agreement with the observations, but spectra computed with the stellar parameters from the published optical analysis, using corresponding consistent wind parameters, show large discrepancies with both the observed optical and UV spectra. The published optical analyses give good fits to the observed spectrum only because the wind parameters assumed in these analyses are inconsistent with their stellar parameters. By enforcing consistency between stellar and wind parameters, stellar parameters are obtained which disagree with the core-mass-luminosity relation for the objects analyzed. This disagreement is also evident from a completely different approach: an investigation of the dynamical wind parameters. Comment: 22 pages, 18 figures

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ∼24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with δ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320-1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ∼ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world. Comment: 57 pages, 32 color figures, version with high-resolution figures available from https://www.lsst.org/overvie
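The quoted coadded depth follows from the single-visit depth under an idealized √N stacking assumption. The sketch below assumes roughly 184 r-band visits, since the abstract gives only the ~800 total summed over six bands, not the per-band share:

```python
import math

def coadded_depth(single_visit_depth, n_visits):
    """Approximate 5-sigma coadded depth from N stacked visits.

    Assumes depth improves as sqrt(N) in flux, i.e. by
    2.5 * log10(sqrt(N)) = 1.25 * log10(N) magnitudes.
    (Idealized scaling; real coadds depend on seeing and sky.)
    """
    return single_visit_depth + 1.25 * math.log10(n_visits)

# Assumed ~184 r-band visits on top of the quoted single-visit r ~ 24.5
depth_r = coadded_depth(24.5, 184)  # lands near the quoted r ~ 27.5
```

This is only a consistency check on the numbers in the abstract, not the survey's actual depth calculation.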

    Measurement of the cross-section and charge asymmetry of W bosons produced in proton-proton collisions at √s = 8 TeV with the ATLAS detector

    This paper presents measurements of the W⁺ → μ⁺ν and W⁻ → μ⁻ν cross-sections and the associated charge asymmetry as a function of the absolute pseudorapidity of the decay muon. The data were collected in proton-proton collisions at a centre-of-mass energy of 8 TeV with the ATLAS experiment at the LHC and correspond to a total integrated luminosity of 20.2 fb⁻¹. The precision of the cross-section measurements varies between 0.8% and 1.5% as a function of the pseudorapidity, excluding the 1.9% uncertainty on the integrated luminosity. The charge asymmetry is measured with an uncertainty between 0.002 and 0.003. The results are compared with predictions based on next-to-next-to-leading-order calculations with various parton distribution functions and have the sensitivity to discriminate between them. Comment: 38 pages in total, author list starting page 22, 5 figures, 4 tables, submitted to EPJC. All figures including auxiliary figures are available at https://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/STDM-2017-13
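The charge asymmetry quoted above is conventionally defined bin by bin in pseudorapidity as A = (σ(W⁺) − σ(W⁻)) / (σ(W⁺) + σ(W⁻)); a minimal sketch, with invented cross-section values rather than the measured ones:

```python
def charge_asymmetry(sigma_plus, sigma_minus):
    """W charge asymmetry A = (s+ - s-) / (s+ + s-).

    Inputs are the W+ and W- (differential) cross-sections in one
    pseudorapidity bin; any common unit (e.g. pb) cancels out.
    """
    return (sigma_plus - sigma_minus) / (sigma_plus + sigma_minus)

# Illustrative values only, not the paper's measurements:
a = charge_asymmetry(600.0, 450.0)  # -> 1/7, about 0.143
```

Because the units cancel, the asymmetry sidesteps the luminosity uncertainty that dominates the absolute cross-sections, which is why it can be measured to ±0.002-0.003.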

    Search for chargino-neutralino production with mass splittings near the electroweak scale in three-lepton final states in √s=13 TeV pp collisions with the ATLAS detector

    A search for supersymmetry through the pair production of electroweakinos with mass splittings near the electroweak scale and decaying via on-shell W and Z bosons is presented for a three-lepton final state. The analyzed proton-proton collision data, taken at a center-of-mass energy of √s = 13 TeV, were collected between 2015 and 2018 by the ATLAS experiment at the Large Hadron Collider, corresponding to an integrated luminosity of 139 fb⁻¹. A search emulating the recursive jigsaw reconstruction technique with easily reproducible laboratory-frame variables is performed. The two excesses observed in the low-mass three-lepton phase space by the recursive jigsaw analysis of the 2015-2016 data are reproduced. Results with the full data set are in agreement with the Standard Model expectations. They are interpreted to set exclusion limits at the 95% confidence level on simplified models of chargino-neutralino pair production for masses up to 345 GeV.

    Search for squarks and gluinos with the ATLAS detector in final states with jets and missing transverse momentum using √s=8 TeV proton-proton collision data

    A search for squarks and gluinos in final states containing high-pT jets, missing transverse momentum and no electrons or muons is presented. The data were recorded in 2012 by the ATLAS experiment in √s = 8 TeV proton-proton collisions at the Large Hadron Collider, with a total integrated luminosity of 20.3 fb⁻¹. Results are interpreted in a variety of simplified and specific supersymmetry-breaking models assuming that R-parity is conserved and that the lightest neutralino is the lightest supersymmetric particle. An exclusion limit at the 95% confidence level on the mass of the gluino is set at 1330 GeV for a simplified model incorporating only a gluino and the lightest neutralino. For a simplified model involving the strong production of first- and second-generation squarks, squark masses below 850 GeV (440 GeV) are excluded for a massless lightest neutralino, assuming mass-degenerate (single light-flavour) squarks. In mSUGRA/CMSSM models with tan β = 30, A₀ = −2m₀ and μ > 0, squarks and gluinos of equal mass are excluded for masses below 1700 GeV. Additional limits are set for non-universal Higgs mass models with gaugino mediation and for simplified models involving the pair production of gluinos, each decaying to a top squark and a top quark, with the top squark decaying to a charm quark and a neutralino. These limits extend the region of supersymmetric parameter space excluded by previous searches with the ATLAS detector.
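As a rough illustration of what an integrated luminosity of 20.3 fb⁻¹ means for event yields, expected counts follow from N = σ × L_int; the 50 fb cross-section below is an arbitrary placeholder, not a value from the paper:

```python
def expected_events(cross_section_fb, luminosity_fb_inv):
    """Expected number of produced events: N = sigma * L_int.

    cross_section_fb: production cross-section in femtobarns.
    luminosity_fb_inv: integrated luminosity in inverse femtobarns,
    so the units cancel and N is a pure count.
    """
    return cross_section_fb * luminosity_fb_inv

# Hypothetical 50 fb signal cross-section over the 20.3 fb^-1 dataset
n = expected_events(50.0, 20.3)  # ~1015 produced events before cuts
```

Selection efficiencies and branching ratios then reduce this raw count to the handful of candidate events a search actually observes.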

    Search for squarks and gluinos in events with isolated leptons, jets and missing transverse momentum at √s = 8 TeV with the ATLAS detector

    The results of a search for supersymmetry in final states containing at least one isolated lepton (electron or muon), jets and large missing transverse momentum with the ATLAS detector at the Large Hadron Collider are reported. The search is based on proton-proton collision data at a centre-of-mass energy √s = 8 TeV collected in 2012, corresponding to an integrated luminosity of 20 fb⁻¹. No significant excess above the Standard Model expectation is observed. Limits are set on supersymmetric particle masses for various supersymmetric models. Depending on the model, the search excludes gluino masses up to 1.32 TeV and squark masses up to 840 GeV. Limits are also set on the parameters of a minimal universal extra dimension model, excluding a compactification radius of 1/Rc = 950 GeV for a cut-off scale times radius (ΛRc) of approximately 30.

    The Mu3e Data Acquisition

    The Mu3e experiment aims to find or exclude the lepton flavor violating decay μ⁺ → e⁺e⁻e⁺ with a sensitivity of one in 10¹⁶ muon decays. The first phase of the experiment is currently under construction at the Paul Scherrer Institute (PSI, Switzerland), where beams with up to 10⁸ muons per second are available. The detector will consist of an ultra-thin pixel tracker made from High-Voltage Monolithic Active Pixel Sensors (HV-MAPS), complemented by scintillating tiles and fibers for precise timing measurements. The experiment produces about 100 Gbit/s of zero-suppressed data, which are transported to a filter farm using a network of field-programmable gate arrays (FPGAs) and fast optical links. On the filter farm, tracks and three-particle vertices are reconstructed using highly parallel algorithms running on graphics processing units, leading to a reduction of the data to 100 Mbyte/s for mass storage and offline analysis. This article introduces the system design and hardware implementation of the Mu3e data acquisition and filter farm.
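The data reduction achieved by the filter farm is easy to misjudge because the input rate is quoted in bits and the output in bytes; a small sketch making the unit conversion explicit:

```python
def reduction_factor(in_gbit_s, out_mbyte_s):
    """Data-rate reduction factor from detector readout to storage.

    in_gbit_s: input rate in gigabits per second.
    out_mbyte_s: output rate in megabytes per second.
    Both are converted to bits per second before dividing.
    """
    in_bits_s = in_gbit_s * 1e9
    out_bits_s = out_mbyte_s * 1e6 * 8  # bytes -> bits
    return in_bits_s / out_bits_s

# 100 Gbit/s into the filter farm, 100 Mbyte/s out to mass storage
factor = reduction_factor(100, 100)  # -> 125.0
```

So the online reconstruction discards all but roughly one part in 125 of the raw stream, which is what makes offline storage of the full physics run feasible.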