
    Gauge/gravity duality and the interplay of various fractional branes

    We consider different types of fractional branes on a Z_2 orbifold of the conifold and analyze in detail the corresponding gauge/gravity duality. The gauge theory possesses a rich and varied dynamics, both in the UV and in the IR. We find the dual supergravity solution, which contains both untwisted and twisted 3-form fluxes, related to what are known as deformation and N=2 fractional branes, respectively. We analyze the resulting RG flow from the supergravity perspective, developing an algorithm to extract it easily. We find hints of a generalization of the familiar cascade of Seiberg dualities due to a non-trivial interplay between the different types of fractional branes. We finally consider the IR behavior in several limits, where the dominant effective dynamics is confining, in a Coulomb phase, or runaway, and discuss the resolution of singularities in the dual geometric background. Comment: 38 pages + appendices, 15 figures; v2: refs added and typos corrected

    Big Data in HEP: A comprehensive use case study

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches, promise a fresh look at the analysis of very large datasets, and could potentially reduce the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication-quality physics plots. We discuss the advantages and disadvantages of each approach and give an outlook on further studies needed. Comment: Proceedings for the 22nd International Conference on Computing in High Energy and Nuclear Physics (CHEP 2016)
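
    The filter-and-transform pattern described above is framework-independent; as a minimal sketch, here is a pure-Python analogue of the analysis loop, where the event records, selection cuts, and binning are all hypothetical stand-ins for an NTuple-style workflow (the same pipeline maps naturally onto Spark's filter/map/aggregate operations):

    ```python
    from collections import Counter

    # Hypothetical flat event records standing in for an experiment NTuple:
    # each event carries missing transverse energy (met, in GeV) and a jet count.
    events = [
        {"met": 250.0, "njets": 2},
        {"met": 80.0,  "njets": 1},
        {"met": 420.0, "njets": 3},
        {"met": 310.0, "njets": 2},
    ]

    # The pattern the abstract describes: filter on selection cuts,
    # transform to the quantity of interest, aggregate into a histogram.
    selected = (e["met"] for e in events if e["met"] > 200 and e["njets"] >= 2)
    histogram = Counter(int(met // 100) * 100 for met in selected)  # 100 GeV bins

    print(sorted(histogram.items()))  # [(200, 1), (300, 1), (400, 1)]
    ```

    In Spark the same three stages become `DataFrame.filter`, a column expression, and `groupBy().count()`, which is what makes the two approaches directly comparable.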

    Factors affecting soil invertebrate biodiversity in agroecosystems of the Po Plain area (Italy)

    Soil is a fundamental component of the biosphere, whose properties and quality are affected by human activities such as agriculture. Soil health is fundamental for different ecosystem services, and soil biota has a crucial role in maintaining it. Elucidating how different crops and agricultural practices affect soil invertebrate communities is therefore of relevance. In the present study, a DNA metabarcoding approach was adopted to evaluate the effects of different biotic and abiotic factors, including agricultural practices, on the composition and diversity of soil invertebrate communities of different agro-ecosystems (Po Plain, Italy). To this aim, the DNA markers and the most effective primers for retrieving soil metazoan communities were established. Bulk soil samples from different agro-ecosystems (i.e., cornfield, alfalfa, paddy field, vineyard, stable meadow, woodland) were collected and processed to obtain 18S rRNA and COI sequences (raw reads analyzed using QIIME2 and R). Soil physical and chemical parameters were measured for each soil sample (e.g., pH, carbon-nitrogen ratio, texture, porosity), and metadata on farm management were retrieved. The most efficient primer pairs for recovering soil metazoans were M620F/M1260R for 18S rRNA and mlCOIintF/jgHCO2198R for the COI gene. Soil communities were dominated by Nematoda, Arthropoda, Annelida, Rotifera and Tardigrada. The most diverse invertebrate communities were found in the soil of stable meadows and woodlands, while cornfields showed the lowest level of diversity. The diversity of soil invertebrate communities (Hill numbers) was positively correlated with porosity and the carbon-nitrogen ratio, and negatively correlated with phosphate abundance. This pattern probably reflects the negative effect of excessive fertilization with phosphates on soil fauna, while the abundance of organic matter and microhabitats was found to enhance the presence of more complex communities. Other soil properties were correlated only with specific taxa (e.g., pH was negatively correlated with the diversity of Annelida and Rotifera).
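
    The Hill numbers used above as the diversity metric have a compact closed form; the following sketch computes them from raw taxon abundances (the read counts are hypothetical, not data from the study):

    ```python
    import math

    def hill_number(abundances, q):
        """Hill number (effective number of taxa) of order q
        from a list of raw taxon abundances."""
        total = sum(abundances)
        p = [a / total for a in abundances if a > 0]
        if q == 1:
            # q = 1 is the limit case: the exponential of Shannon entropy
            return math.exp(-sum(pi * math.log(pi) for pi in p))
        return sum(pi ** q for pi in p) ** (1 / (1 - q))

    # q = 0 is plain richness; increasing q progressively down-weights rare taxa
    community = [120, 80, 40, 10, 5]   # hypothetical read counts per taxon
    print(hill_number(community, 0))   # 5.0 (five taxa present)
    ```

    Reporting diversity at several orders of q is what lets correlations with soil parameters be tested for both rare-taxon-sensitive and dominance-weighted community structure.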

    Characterization of a Silicon Drift Detector for High-Resolution Electron Spectroscopy

    Silicon Drift Detectors (SDDs), widely employed in high-resolution and high-rate X-ray applications, are considered here also for electron detection. The accurate measurement of the tritium beta decay is the core of the TRISTAN (TRitium Investigation on STerile to Active Neutrino mixing) project. This work presents the characterization of a single-pixel SDD with a mono-energetic electron beam obtained from a Scanning Electron Microscope. The suitability of the SDD to detect electrons, in the energy range spanning from a few keV to tens of keV, is demonstrated. Experimental measurements reveal a strong effect of the detector's entrance window structure on the observed energy response. A detailed detector model is therefore necessary to reconstruct the spectrum of an unknown beta-decay source.

    Impact of image filtering and assessment of volume-confounding effects on CT radiomic features and derived survival models in non-small cell lung cancer

    BACKGROUND No evidence supports the choice of specific image filtering methodologies in radiomics. As the volume of the primary tumor is a well-recognized prognosticator, our purpose is to assess how filtering may impact the feature/volume dependency in computed tomography (CT) images of non-small cell lung cancer (NSCLC), and whether such impact translates into differences in the performance of survival modeling. The role of lesion volume in model performance was also considered and discussed. METHODS CT images of 417 NSCLC patients were retrieved from the NSCLC-Radiomics public repository. Pre-processing and feature extraction were implemented using Pyradiomics v3.0.1. Features showing high correlation with volume across original and filtered images were excluded. Cox proportional hazards (PH) models with least absolute shrinkage and selection operator (LASSO) regularization and CatBoost models were built with and without volume, and their concordance (C-) indices were compared using the Wilcoxon signed-rank test. The Mann-Whitney U test was used to assess model performance after stratification into two groups based on low- and high-volume lesions. RESULTS Radiomic models significantly outperformed models built on clinical variables and volume alone. However, the exclusion/inclusion of volume did not generally alter the performance of radiomic models. Overall, performance was not substantially affected by the choice of imaging filter (overall C-index 0.539-0.590 for Cox PH and 0.589-0.612 for CatBoost). The separation of patients with high-volume lesions resulted in significantly better performance in 2/10 and 7/10 cases for Cox PH and CatBoost models, respectively. Both low- and high-volume models performed significantly better with the inclusion of radiomic features (P<0.0001), but the improvement was largest in the high-volume group (+10.2% against +8.7% improvement for CatBoost models and +10.0% against +5.4% for Cox PH models).
    CONCLUSIONS Radiomic features complement well-known prognostic factors such as volume, but their volume dependency is high and should be managed with vigilance. The informative content of radiomic features may be diminished in small lesion volumes, which could limit the applicability of radiomics in early-stage NSCLC, where tumors tend to be small. Our results also suggest an advantage of CatBoost models over Cox PH models.
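
    The C-index used above to compare the survival models is Harrell's concordance index; a minimal textbook sketch follows, applied to a hypothetical cohort (not data from the study). Higher risk scores are assumed to indicate worse prognosis:

    ```python
    def concordance_index(times, events, risks):
        """Harrell's C-index: the fraction of comparable patient pairs in
        which the patient with the shorter survival time also has the
        higher predicted risk. A pair (i, j) is comparable when i's event
        is observed (events[i] == 1) and occurs before time j."""
        concordant = 0.0
        comparable = 0
        n = len(times)
        for i in range(n):
            for j in range(n):
                if events[i] == 1 and times[i] < times[j]:
                    comparable += 1
                    if risks[i] > risks[j]:
                        concordant += 1       # correctly ranked pair
                    elif risks[i] == risks[j]:
                        concordant += 0.5     # tied risks count half
        return concordant / comparable

    # Hypothetical cohort: a perfect ranking gives C = 1.0, random ~0.5
    times  = [5, 10, 15, 20]
    events = [1, 1, 1, 0]        # last patient censored
    risks  = [0.9, 0.7, 0.4, 0.1]
    print(concordance_index(times, events, risks))  # 1.0
    ```

    The reported values around 0.54-0.61 therefore sit modestly above the 0.5 chance level, which is why volume confounding matters for interpreting them.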

    Quality assurance for automatically generated contours with additional deep learning

    Objective: Deploying an automatic segmentation model in practice should require rigorous quality assurance (QA) and continuous monitoring of the model’s use and performance, particularly in high-stakes scenarios such as healthcare. Currently, however, tools to assist with QA for such models are not available to AI researchers. In this work, we build a deep learning model that estimates the quality of automatically generated contours. Methods: The model was trained to predict segmentation quality by outputting an estimate of the Dice similarity coefficient given an image-contour pair as input. Our dataset contained 60 axial T2-weighted MRI images of prostates with ground truth segmentations, along with 80 automatically generated segmentation masks. The model we used was a 3D version of the EfficientDet architecture with a custom regression head. For validation, we used fivefold cross-validation. To counteract the limitation of the small dataset, we used an extensive data augmentation scheme capable of producing virtually infinite training samples from a single ground truth label mask. In addition, we compared the results against a baseline model that only uses clinical variables for its predictions. Results: Our model achieved a mean absolute error of 0.020 ± 0.026 (2.2% mean percentage error) in estimating the Dice score, with a rank correlation of 0.42. Furthermore, the model managed to correctly identify incorrect segmentations (defined in terms of acceptable/unacceptable) 99.6% of the time. Conclusion: We believe that the trained model can be used alongside automatic segmentation tools to ensure quality and thus allow intervention to prevent undesired segmentation behavior.
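
    The quantity the QA model regresses is the standard Dice similarity coefficient; a minimal sketch on binary masks follows (the 2D arrays are hypothetical stand-ins for contour masks):

    ```python
    import numpy as np

    def dice_score(mask_a, mask_b):
        """Dice similarity coefficient between two binary masks:
        2|A ∩ B| / (|A| + |B|), in [0, 1]."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        denom = a.sum() + b.sum()
        if denom == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return 2.0 * np.logical_and(a, b).sum() / denom

    # Hypothetical 2D contours: the automatic contour over-segments by one column
    gt   = np.zeros((4, 4), int); gt[1:3, 1:3] = 1    # ground truth
    pred = np.zeros((4, 4), int); pred[1:3, 1:4] = 1  # auto-generated contour
    print(dice_score(gt, pred))  # 0.8
    ```

    In deployment no ground truth mask exists, which is exactly why the QA network has to *estimate* this value from the image-contour pair alone.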

    Results from the CUORE Experiment

    The Cryogenic Underground Observatory for Rare Events (CUORE) is the first bolometric experiment searching for neutrinoless double beta decay that has been able to reach the 1-ton scale. The detector consists of an array of 988 TeO2 crystals arranged in a cylindrical compact structure of 19 towers, each of them made of 52 crystals. The construction of the experiment was completed in August 2016, and data taking started in spring 2017 after a period of commissioning and tests. In this work we present the neutrinoless double beta decay results of CUORE from a total TeO2 exposure of 86.3 kg yr, characterized by an effective energy resolution of 7.7 keV FWHM and a background in the region of interest of 0.014 counts/(keV kg yr). In this physics run, CUORE placed a lower limit on the neutrinoless double beta decay half-life of 130Te of T_{1/2}^{0ν} > 1.3 × 10^25 yr (90% C.L.). Moreover, an analysis of the background of the experiment is presented, as well as the measurement of the 130Te 2νββ decay, with a resulting half-life of T_{1/2}^{2ν} = [7.9 ± 0.1 (stat.) ± 0.2 (syst.)] × 10^20 yr, which is the most precise measurement of this half-life and is compatible with previous results.
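
    For context, a half-life limit of this kind follows from the standard counting relation below; this is a generic sketch, not the paper's own analysis, with N_ββ the number of 130Te nuclei in the exposure, ε the detection efficiency, t the live time, and S the upper limit on the number of signal counts in the region of interest:

    ```latex
    T_{1/2}^{0\nu} \;>\; \ln 2 \cdot \frac{N_{\beta\beta}\,\varepsilon\,t}{S}
    ```

    The relation makes explicit why exposure (kg yr), energy resolution, and the background index quoted above are the three quantities that drive the sensitivity.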

    The commissioning of the CUORE experiment: the mini-tower run

    CUORE is a ton-scale experiment approaching the data-taking phase at the Gran Sasso National Laboratory. Its primary goal is to search for the neutrinoless double-beta decay of 130Te using 988 crystals of tellurium dioxide. The crystals are operated as bolometers at about 10 mK, taking advantage of one of the largest dilution cryostats ever built. Concluded in March 2016, the cryostat commissioning consisted of a sequence of cooldown runs, each one integrating new parts of the apparatus. The last run was performed with the fully configured cryostat, and the thermal load at 4 K reached the impressive mass of about 14 tons. During that run a base temperature of 6.3 mK was reached and maintained for more than 70 days. An array of 8 crystals, called the mini-tower, was used to check bolometer operation, readout electronics and DAQ. Results are presented in terms of cooling power, electronic noise, energy resolution and preliminary background measurements.

    Combined Forward-Backward Asymmetry Measurements in Top-Antitop Quark Production at the Tevatron

    The CDF and D0 experiments at the Fermilab Tevatron have measured the asymmetry between yields of forward- and backward-produced top and antitop quarks based on their rapidity difference, as well as the asymmetry between their decay leptons. These measurements use the full data sets collected in proton-antiproton collisions at a center-of-mass energy of $\sqrt{s} = 1.96$ TeV. We report the results of combinations of the inclusive asymmetries and their differential dependencies on relevant kinematic quantities. The combined inclusive asymmetry is $A_{\mathrm{FB}}^{t\bar{t}} = 0.128 \pm 0.025$. The combined inclusive and differential asymmetries are consistent with recent standard model predictions.
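
    The rapidity-based asymmetry quoted above has the standard counting definition, built from the rapidity difference between the top and antitop quark in each event:

    ```latex
    A_{\mathrm{FB}}^{t\bar{t}} \;=\; \frac{N(\Delta y > 0) - N(\Delta y < 0)}{N(\Delta y > 0) + N(\Delta y < 0)},
    \qquad \Delta y = y_t - y_{\bar{t}}
    ```

    A positive value therefore means top quarks are preferentially emitted in the direction of the incoming proton beam.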