
    How stable are Transferability Metrics evaluations?

    Transferability metrics form a maturing field of increasing interest, aiming to provide heuristics for selecting the most suitable source models to transfer to a given target dataset without fine-tuning them all. However, existing works rely on custom experimental setups that differ across papers, leading to inconsistent conclusions about which transferability metrics work best. In this paper we conduct a large-scale study by systematically constructing a broad range of 715k experimental setup variations. We discover that even small variations in an experimental setup lead to different conclusions about the superiority of one transferability metric over another. We then propose better evaluations that aggregate across many experiments, enabling more stable conclusions. As a result, we reveal the superiority of LogME at selecting good source datasets to transfer from in a semantic segmentation scenario, NLEEP at selecting good source architectures in an image classification scenario, and GBC at determining which target task benefits most from a given source model. Yet no single transferability metric works best in all scenarios.
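The aggregation idea above can be illustrated with a minimal sketch: score each metric per experimental setup, find the per-setup winner, and report how often each metric wins across all setups. The metric names match the abstract, but the scores and the win-rate aggregation are purely illustrative, not the paper's actual evaluation protocol.

```python
# Illustrative sketch (not the paper's code): aggregate per-setup winners
# to see how often each transferability metric comes out on top.
from collections import Counter

def winner_per_setup(scores_by_setup):
    """scores_by_setup: list of dicts {metric_name: evaluation_score}.
    Returns the best-scoring metric in each experimental setup."""
    return [max(scores, key=scores.get) for scores in scores_by_setup]

def aggregate(scores_by_setup):
    """Fraction of setups in which each metric wins."""
    wins = Counter(winner_per_setup(scores_by_setup))
    n = len(scores_by_setup)
    return {m: c / n for m, c in wins.items()}

# Hypothetical scores for three metrics under four setup variations:
setups = [
    {"LogME": 0.71, "NLEEP": 0.69, "GBC": 0.65},
    {"LogME": 0.62, "NLEEP": 0.70, "GBC": 0.61},
    {"LogME": 0.74, "NLEEP": 0.66, "GBC": 0.70},
    {"LogME": 0.58, "NLEEP": 0.72, "GBC": 0.60},
]
print(aggregate(setups))  # -> {'LogME': 0.5, 'NLEEP': 0.5}
```

Even in this toy example the "best" metric flips between setups, which is exactly the instability the study quantifies at scale.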

    Preliminary Validation of a Low-Cost Motion Analysis System Based on RGB Cameras to Support the Evaluation of Postural Risk Assessment

    This paper introduces a low-cost, computationally lightweight, marker-less motion capture system based on the acquisition of frame images through standard RGB cameras. It exploits the open-source deep learning model CMU from the tf-pose-estimation project. Its numerical accuracy and its usefulness for ergonomic assessment were evaluated through an experiment designed and performed to: (1) compare the data it provides with those collected from a gold-standard motion capture system; (2) compare the RULA scores obtained with its data against those obtained with data from the Vicon Nexus system and those estimated through video analysis by a team of three expert ergonomists. Tests were conducted in standardized laboratory conditions and involved a total of six subjects. Results suggest that the proposed system can predict angles with good consistency and give evidence of the tool's usefulness for ergonomists.
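The angles such a system predicts are joint angles computed from estimated 2D keypoints. A minimal sketch of that computation, with hypothetical keypoint coordinates (a real pipeline would take them from the pose estimator's output):

```python
# Hedged sketch: one joint angle (e.g. the elbow) from 2D keypoints,
# the kind of quantity a RULA assessment needs. Coordinates are made up.
import math

def joint_angle(a, b, c):
    """Angle at point b (degrees) between segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Shoulder, elbow, wrist keypoints (pixel coordinates, illustrative):
shoulder, elbow, wrist = (100, 100), (140, 160), (200, 150)
print(round(joint_angle(shoulder, elbow, wrist), 1))
```

Comparing angles like these against a Vicon-style reference is the kind of consistency check the experiment performs.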

    JPEG XT: A Compression Standard for HDR and WCG Images [Standards in a Nutshell]

    High bit depth data acquisition and manipulation have been widely studied at the academic level over the last 15 years and are rapidly attracting interest at the industrial level. An example of the increasing interest in high-dynamic-range (HDR) imaging is the use of 32-bit floating-point data for video and image acquisition and manipulation, which allows a variety of visual effects that closely mimic the real-world visual experience of the end user [1] (see Figure 1). At the industrial level, we are witnessing increasing traction toward supporting HDR and wide color gamut (WCG). WCG leverages HDR for each color channel to display a wider range of colors. Consumer cameras are currently available with 14- or 16-bit analog-to-digital converters. Rendering devices are also appearing with the capability to display HDR images and video with a peak brightness of up to 4,000 nits and to support WCG (ITU-R Rec. BT.2020 [2]) rather than the historical ITU-R Rec. BT.709 [3]. This trend calls for a widely accepted standard for higher bit depth support that can be seamlessly integrated into existing products and applications. While standard formats such as Joint Photographic Experts Group (JPEG) 2000 [5] and JPEG XR [6] offer support for high-bit-depth image representations, their adoption requires a non-negligible investment that may not always be affordable in existing imaging ecosystems, and it induces a difficult transition, as they are not backward-compatible with the popular JPEG image format.

    Overview and Evaluation of the JPEG XT HDR Image Compression Standard

    Standards play an important role in providing a common set of specifications and allowing interoperability between devices and systems. Until recently, no standard for High Dynamic Range (HDR) image coding had been adopted by the market, and HDR imaging relied on proprietary, vendor-specific formats unsuitable for the storage or exchange of such images. To resolve this situation, the JPEG Committee is developing a new coding standard called JPEG XT that is backward-compatible with the popular JPEG compression, allowing it to be implemented using standard 8-bit JPEG coding hardware or software. In this paper, we present the design principles and technical details of JPEG XT. It is based on a two-layer design: a base layer containing a Low Dynamic Range (LDR) image accessible to legacy implementations, and an extension layer providing the full dynamic range. The paper introduces three of the currently defined profiles in JPEG XT, each constraining the common decoder architecture to a subset of allowable configurations. We assess the coding efficiency of each profile extensively through subjective assessments, using 24 naive subjects to evaluate 20 images, and objective evaluations, using 106 images with five different tone-mapping operators and at 100 different bit rates. The objective results (benchmarked against the subjective scores) demonstrate that JPEG XT can encode HDR images at bit rates varying from 1.1 to 1.9 bit/pixel for estimated mean opinion score (MOS) values above 4.5 out of 5, which is considered fully transparent in many applications. This corresponds to a 23-fold bitstream reduction compared to lossless OpenEXR PIZ compression.
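The two-layer idea can be sketched in a few lines: an 8-bit base layer that legacy decoders can use on its own, plus a residual extension layer that restores the full dynamic range. This is a conceptual sketch only, NOT the actual JPEG XT decoder; the tone-mapping pair and all numbers are invented for illustration.

```python
# Conceptual two-layer HDR codec sketch (not the JPEG XT specification).

def encode(hdr, inverse_tonemap, tonemap):
    """Split an HDR sample into an 8-bit base and a residual."""
    base = min(255, max(0, round(tonemap(hdr))))   # what legacy JPEG stores
    residual = hdr - inverse_tonemap(base)         # what the extension stores
    return base, residual

def decode(base, residual, inverse_tonemap):
    """Legacy decoders use `base` alone; HDR decoders add the residual."""
    return inverse_tonemap(base) + residual

# Toy tone-mapping pair (a simple gamma curve, chosen arbitrarily):
tm = lambda x: 255.0 * (x / 4000.0) ** (1 / 2.2)
itm = lambda b: 4000.0 * (b / 255.0) ** 2.2

hdr_sample = 1234.5  # e.g. luminance in nits
base, residual = encode(hdr_sample, itm, tm)
assert abs(decode(base, residual, itm) - hdr_sample) < 1e-9
```

The key property shown is backward compatibility: dropping the residual still yields a viewable LDR image, while keeping it reconstructs the HDR sample exactly.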

    Spectral modeling of scintillator for the NEMO-3 and SuperNEMO detectors

    We have constructed a GEANT4-based detailed software model of photon transport in plastic scintillator blocks and have used it to study the NEMO-3 and SuperNEMO calorimeters employed in experiments designed to search for neutrinoless double beta decay. We compare our simulations to measurements using conversion electrons from a ²⁰⁷Bi calibration source and show that the agreement is improved if wavelength-dependent properties of the calorimeter are taken into account. In this article, we briefly describe our modeling approach and the results of our studies. Comment: 16 pages, 10 figures.
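One wavelength-dependent effect such a model captures is bulk attenuation: photons of different wavelengths survive a given path length with different probability. A toy Monte Carlo sketch of this single effect, far simpler than the paper's GEANT4 model, with made-up attenuation lengths:

```python
# Toy illustration only: wavelength-dependent bulk attenuation of
# scintillation photons, estimated with a tiny Monte Carlo.
import math
import random

def surviving_fraction(path_cm, att_length_cm, n_photons=100_000, seed=1):
    """Monte Carlo estimate of exp(-path/att_length): each photon's
    absorption depth is drawn from an exponential distribution."""
    rng = random.Random(seed)
    survived = sum(
        1 for _ in range(n_photons)
        if rng.expovariate(1.0 / att_length_cm) > path_cm
    )
    return survived / n_photons

# Hypothetical attenuation lengths for two wavelengths (cm):
for wavelength_nm, att_len in [(420, 380.0), (480, 160.0)]:
    est = surviving_fraction(30.0, att_len)
    print(wavelength_nm, round(est, 3), round(math.exp(-30.0 / att_len), 3))
```

The Monte Carlo estimate converges to the analytic exponential, which is the sanity check one would apply before folding such a property into a full transport model.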

    Quasi-free photoproduction of η-mesons off ³He nuclei

    Quasi-free photoproduction of η-mesons off nucleons bound in ³He nuclei has been measured for incident photon energies from the threshold region up to 1.4 GeV. The experiment was performed at the tagged-photon facility of the Mainz MAMI accelerator with an electromagnetic calorimeter covering almost 4π, combining the TAPS and Crystal Ball detectors. The η-mesons were detected in coincidence with the recoil nucleons. This allowed a comparison of the production cross sections off quasi-free protons and quasi-free neutrons and a full kinematic reconstruction of the final state, eliminating effects from nuclear Fermi motion. In the S11(1535) resonance peak, the data agree with the neutron/proton cross-section ratio extracted from measurements with deuteron targets. More importantly, the prominent structure observed in photoproduction off quasi-free neutrons bound in the deuteron is also clearly observed here. Its parameters (width, strength) are consistent with the expectations from the deuteron results. On an absolute scale, the cross sections for both quasi-free protons and neutrons are suppressed with respect to the deuteron target, pointing to significant nuclear final-state interaction effects.

    Non-irradiation-derived reactive oxygen species (ROS) and cancer: therapeutic implications

    Owing to their chemical reactivity, radicals have cytocidal properties. Destruction of cells by irradiation-induced radical formation is one of the most frequent interventions in cancer therapy. An alternative to irradiation-induced radical formation is, in principle, drug-induced formation of radicals, as well as the formation of toxic metabolites by enzyme-catalysed reactions. Although these developments are currently still in their infancy, they nevertheless deserve consideration. There are now numerous examples of conventional anti-cancer drugs that may, at least in part, exert cytotoxicity by inducing radical formation. Some drugs, such as arsenic trioxide and 2-methoxy-estradiol, have been shown to induce programmed cell death through radical formation. Enzyme-catalysed radical formation has the advantage that cytotoxic products are produced continuously, over an extended period of time, in the vicinity of tumour cells. Up to now, the enzymatic formation of toxic metabolites has almost exclusively been investigated using bovine serum amine oxidase (BSAO) with spermine as substrate. The metabolites of this reaction, hydrogen peroxide and aldehydes, are cytotoxic. The combination of BSAO and spermine is able to prevent not only tumour cell growth but also tumour growth, particularly well if the enzyme has been conjugated with a biocompatible gel. Since tumour cells release substrates of BSAO, the administration of spermine is not required. Combination with cytotoxic drugs and elevation of temperature improve the cytocidal effect of spermine metabolites. The fact that multidrug-resistant cells are more sensitive to spermine metabolites than their wild-type counterparts makes this new approach especially attractive, since the development of multidrug resistance is one of the major problems of conventional cancer therapy.

    Measurement of the B_s^0 → J/ψ K_S^0 branching fraction

    The B_s^0 → J/ψ K_S^0 branching fraction is measured in a data sample corresponding to 0.41 fb⁻¹ of integrated luminosity collected with the LHCb detector at the LHC. This channel is sensitive to the penguin contributions affecting the sin 2β measurement from B^0 → J/ψ K_S^0. The time-integrated branching fraction is measured to be BF(B_s^0 → J/ψ K_S^0) = (1.83 ± 0.28) × 10⁻⁵. This is the most precise measurement to date.

    Measurement of the CP-violating phase φ_s in B_s^0 → J/ψ π⁺π⁻ decays

    Measurement of the mixing-induced CP-violating phase φ_s in B_s^0 decays is of prime importance in probing new physics. Here, 7421 ± 105 signal events from the dominantly CP-odd final state J/ψ π⁺π⁻ are selected in 1 fb⁻¹ of pp collision data collected at √s = 7 TeV with the LHCb detector. A time-dependent fit to the data yields a value of φ_s = −0.019 +0.173/−0.174 (stat) +0.004/−0.003 (syst) rad, consistent with the Standard Model expectation. No evidence of direct CP violation is found. Comment: 15 pages, 10 figures; minor revisions on May 23, 201
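When a single total uncertainty is wanted, asymmetric statistical and systematic components like those quoted for φ_s are often combined in quadrature, side by side. The quadrature convention here is a common assumption, not something this paper specifies:

```python
# Minimal sketch: quadrature combination of the statistical and
# systematic uncertainty components (a common convention; the
# measurement itself quotes them separately).
import math

def combine(stat, syst):
    """Quadrature sum of two uncertainty components."""
    return math.sqrt(stat ** 2 + syst ** 2)

# phi_s with +0.173/-0.174 (stat) and +0.004/-0.003 (syst) rad:
up = combine(0.173, 0.004)
down = combine(0.174, 0.003)
print(round(up, 3), round(down, 3))  # -> 0.173 0.174
```

As the output shows, the systematic component is negligible here: the total uncertainty is dominated by the statistical one.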