
    Playing with readers' expectations: types of predictive infographics in digital media

    Predictive graphics, which require users to think and deduce before they can access the result, broaden the possibilities for media to attract their audience. Recent research suggests that visualizing one's predictions improves recall and comprehension of data. Through a qualitative study of interactive infographics, the present article identifies two types of predictive infographics in digital media: graphics based on users' predictions and calculation-based predictive graphics. Their interaction strategies and the ways they challenge users are analyzed, describing models that could be useful for both researchers and professionals in the field.

    Goodness-of-fit tests of Gaussianity: constraints on the cumulants of the MAXIMA data

    In this work, goodness-of-fit tests are adapted and applied to CMB maps to detect possible non-Gaussianity. We use the Shapiro-Francia test and two smooth goodness-of-fit tests: one developed by Rayner and Best and another developed by Thomas and Pierce. The smooth tests detect small, smooth deviations from a prefixed probability function (in our case the univariate Gaussian). In addition, the Rayner and Best test informs us of the kind of non-Gaussianity present: excess skewness, excess kurtosis, and so on. These tests are optimal when the data are independent. We simulate and analyse non-Gaussian signals in order to study the power of these tests. These non-Gaussian simulations are constructed using the Edgeworth expansion, assuming pixel-to-pixel independence. As an application, we test the Gaussianity of the MAXIMA data. Results indicate that the MAXIMA data are compatible with Gaussianity. Finally, the skewness and kurtosis of the MAXIMA data are constrained by |S| ≤ 0.035 and |K| ≤ 0.036 at the 99% confidence level. Comment: New Astronomy Reviews, in press.
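    As a rough illustration of the approach described above, the following Python sketch draws a weakly non-Gaussian sample via an Edgeworth/Cornish-Fisher-style perturbation and applies a normality test. SciPy's Shapiro-Wilk test stands in for the Shapiro-Francia variant used in the paper, and the sample size and skewness level are illustrative placeholders, not the paper's values.

```python
# Illustrative sketch (not the paper's pipeline): generate a weakly
# non-Gaussian sample and test it for univariate Gaussianity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_pix = 4000                 # hypothetical number of map pixels
target_skew = 0.05           # illustrative skewness level

z = rng.standard_normal(n_pix)
# First-order Cornish-Fisher transform: injects skewness ~ target_skew.
x = z + (target_skew / 6.0) * (z**2 - 1.0)

# Sample cumulant estimates (cf. the |S|, |K| constraints quoted above).
print("skewness:", stats.skew(x))
print("excess kurtosis:", stats.kurtosis(x))

# Shapiro-Wilk as a stand-in for the Shapiro-Francia test.
stat, pval = stats.shapiro(x)
print(f"Shapiro-Wilk W = {stat:.4f}, p = {pval:.3f}")
```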

    Goodness-of-Fit Tests to study the Gaussianity of the MAXIMA data

    Goodness-of-fit tests, including smooth ones, are introduced and applied to detect non-Gaussianity in Cosmic Microwave Background simulations. We study the power of three different tests: the Shapiro-Francia test (1972), the uncategorised smooth test developed by Rayner and Best (1990), and Neyman's smooth goodness-of-fit test for composite hypotheses (Thomas and Pierce 1979). The smooth goodness-of-fit tests are designed to be sensitive to the presence of "smooth" deviations from a given distribution. We study the power of these tests based on the discrimination between Gaussian and non-Gaussian simulations. Non-Gaussian cases are simulated using the Edgeworth expansion and assuming pixel-to-pixel independence. Results show that these tests behave similarly and are more powerful than tests directly based on cumulants of order 3, 4, 5 and 6. We have applied these tests to the released MAXIMA data. The applied tests are built to be powerful against deviations from univariate Gaussianity. The Cholesky matrix corresponding to signal (based on an assumed cosmological model) plus noise is used to decorrelate the observations prior to the analysis. Results indicate that the MAXIMA data are compatible with Gaussianity. Comment: MNRAS, in press.
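    The decorrelation step mentioned above can be sketched as follows. This is a generic Cholesky whitening recipe under an assumed covariance, not the collaboration's actual code; the short-range exponential covariance is a toy stand-in for the signal-plus-noise model.

```python
# Toy sketch of Cholesky decorrelation ("whitening") before a
# univariate Gaussianity test. C is an assumed signal+noise covariance.
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(1)
n = 500

# Hypothetical stationary covariance with short-range correlations.
idx = np.arange(n)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)

# Draw correlated data d ~ N(0, C), then whiten with the Cholesky factor L.
L = np.linalg.cholesky(C)
d = L @ rng.standard_normal(n)

# Solving L x = d gives decorrelated, unit-variance residuals if the
# covariance model is correct; any univariate test can then be applied.
x = solve_triangular(L, d, lower=True)
print("whitened mean/std:", x.mean(), x.std())
```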

    Tests of Gaussianity

    We review two powerful methods to test the Gaussianity of the cosmic microwave background (CMB): one based on the distribution of spherical wavelet coefficients and the other on smooth tests of goodness-of-fit. The spherical wavelet families proposed to analyse the CMB are the Haar and the Mexican Hat ones. The latter is preferred for detecting non-Gaussian homogeneous and isotropic primordial models containing some amount of skewness or kurtosis. Smooth tests of goodness-of-fit have recently been introduced in the field, showing some interesting properties. We discuss the smooth tests of goodness-of-fit developed by Rayner and Best for the univariate as well as the multivariate analysis. Comment: Proceedings of "The Cosmic Microwave Background and its Polarization", New Astronomy Reviews (eds. S. Hanany and K.A. Olive), in press.
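    To make the wavelet route concrete, here is a minimal flat toy: it convolves a 1D signal with a Mexican hat (Ricker) kernel and inspects the skewness and kurtosis of the coefficients. Real CMB analyses use spherical wavelets on the full sky; this only shows the shape of the idea, with made-up signal and width parameters.

```python
# Toy illustration: Mexican hat wavelet coefficients of a 1D signal,
# whose skewness/kurtosis can flag non-Gaussianity.
import numpy as np
from scipy import stats

def mexican_hat(width, n=101):
    """Ricker ('Mexican hat') kernel: second derivative of a Gaussian."""
    t = np.linspace(-4 * width, 4 * width, n)
    u = (t / width) ** 2
    return (1.0 - u) * np.exp(-u / 2.0)

rng = np.random.default_rng(2)
signal = rng.standard_normal(4096)          # Gaussian toy "map"

coeffs = np.convolve(signal, mexican_hat(8.0), mode="valid")
print("coefficient skewness:", stats.skew(coeffs))
print("coefficient excess kurtosis:", stats.kurtosis(coeffs))
```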

    Role of the 4Kscore test as a predictor of reclassification in prostate cancer active surveillance

    Background: Management of active surveillance (AS) in low-risk prostate cancer (PCa) patients could be improved with new biomarkers, such as the 4Kscore test. We analyze its ability to predict tumor reclassification by upgrading at the confirmatory biopsy (Bx) at 6 months. Methods: Observational, prospective, blinded, non-randomized study within the Spanish National Registry on AS (AEU/PIEM/2014/0001; NCT02865330), with 181 patients included after initial Bx. Inclusion criteria: PSA ≤10 ng/mL, cT1c-T2a, Grade group 1, ≤2 positive cores, and ≤5 mm/50% core length involved. Central pathological review of initial and confirmatory Bx was performed on all biopsy specimens. Plasma was collected 6 months after initial Bx, just before confirmatory Bx, to determine the 4Kscore result. To predict reclassification, defined as Grade group ≥2, we analyzed the 4Kscore, percent free-to-total (%f/t) PSA ratio, prostate volume, PSA density, family history, body mass index, and, for the initial Bx, the total cores, positive cores, percentage of positive cores, maximum cancer core length, and cancer percentage involvement. The Wilcoxon rank-sum test, non-parametric trend test, or Fisher's exact test, as appropriate, established differences between reclassification groups. Results: A total of 137 patients met the inclusion criteria. Eighteen patients (13.1%) were reclassified at confirmatory Bx. The %f/t PSA ratio and the 4Kscore differed between the reclassification groups (yes/no). Using 7.5% as the 4Kscore cutoff, we found a sensitivity of 89% and a specificity of 29%, with no reclassifications to Grade group 3 for patients with a 4Kscore below 7.5% and 2 (6%) missed Grade group 2 reclassifications. Using this threshold would reduce biopsies by 27%. Additionally, the 4Kscore was also associated with changes in tumor volume. Conclusions: Our preliminary findings suggest that the 4Kscore may be a useful tool in the decision-making process to perform a confirmatory Bx in active surveillance management
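    The threshold arithmetic above is easy to reproduce. The sketch below computes sensitivity, specificity, and the biopsy-reduction fraction at a 7.5% cutoff; the score and label arrays are fabricated for illustration and are not the study's patient-level data.

```python
# Toy calculation of sensitivity/specificity at a biomarker cutoff.
# Scores and outcome labels are made up for illustration only.
import numpy as np

rng = np.random.default_rng(3)
reclassified = rng.random(137) < 0.131          # ~13% event rate, as in the study
scores = np.where(reclassified,
                  rng.normal(15, 5, 137),       # hypothetical higher scores if upgraded
                  rng.normal(9, 5, 137))

cutoff = 7.5
positive = scores >= cutoff

sens = (positive & reclassified).sum() / reclassified.sum()
spec = (~positive & ~reclassified).sum() / (~reclassified).sum()
spared = (~positive).mean()                     # fraction of biopsies avoided

print(f"sensitivity={sens:.2f} specificity={spec:.2f} biopsies avoided={spared:.2%}")
```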

    gSeaGen: The KM3NeT GENIE-based code for neutrino telescopes

    Program Title: gSeaGen
    CPC Library link to program files: http://dx.doi.org/10.17632/ymgxvy2br4.1
    Licensing provisions: GPLv3
    Programming language: C++
    External routines/libraries: GENIE [1] and its external dependencies. Linkable to MUSIC [2] and PROPOSAL [3].
    Nature of problem: Development of a code to generate detectable events in neutrino telescopes, using modern and maintained neutrino interaction simulation libraries which include state-of-the-art physics models. The default application is the simulation of neutrino interactions within KM3NeT [4].
    Solution method: Neutrino interactions are simulated using GENIE, a modern framework for Monte Carlo event generators. The GENIE framework, used by nearly all modern neutrino experiments, is considered a reference code within the neutrino community.
    Additional comments, including restrictions and unusual features: The code was tested with GENIE version 2.12.10 and is linkable with the release 3 series. It is presently valid up to 5 TeV; this limitation is not intrinsic to the code but is due to the present GENIE validity range.
    References: [1] C. Andreopoulos et al., Nucl. Instrum. Meth. A614 (2010) 87. [2] P. Antonioli et al., Astropart. Phys. 7 (1997) 357. [3] J. H. Koehne et al., Comput. Phys. Commun. 184 (2013) 2070. [4] S. Adrián-Martínez et al., J. Phys. G: Nucl. Part. Phys. 43 (2016) 084001.

    The gSeaGen code is a GENIE-based application developed to efficiently generate high-statistics samples of events, induced by neutrino interactions, detectable in a neutrino telescope. The code is able to generate events induced by all neutrino flavours, considering topological differences between track-type and shower-like events. Neutrino interactions are simulated taking into account the density and the composition of the media surrounding the detector. The main features of gSeaGen are presented together with some examples of its application within the KM3NeT project.
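    As a very rough illustration of the event-generation idea (not gSeaGen's actual algorithm or API, which are written in C++ on top of GENIE), the toy below samples neutrino energies from a power-law flux and weights each neutrino by its interaction probability in a uniform water column. The spectral index, cross-section scaling, and path length are all placeholder numbers.

```python
# Toy neutrino event generator: power-law flux sampling plus a linear
# cross-section approximation. Placeholder physics, not gSeaGen/GENIE.
import numpy as np

rng = np.random.default_rng(4)

E_MIN, E_MAX, GAMMA = 1e2, 5e3, 2.0     # GeV; assumed spectral index
SIGMA0 = 7e-39                           # cm^2/GeV/nucleon, rough DIS scaling
N_NUCLEONS = 6.0e23                      # nucleons per cm^3 of water
PATH_CM = 1e5                            # 1 km of instrumented water

def sample_energies(n):
    """Inverse-CDF sampling of an E^-GAMMA spectrum on [E_MIN, E_MAX]."""
    u = rng.random(n)
    a = 1.0 - GAMMA
    return (E_MIN**a + u * (E_MAX**a - E_MIN**a)) ** (1.0 / a)

E = sample_energies(1_000_000)
w = N_NUCLEONS * SIGMA0 * E * PATH_CM    # thin-target interaction probability
# Interactions are rare, so carry the probability as a per-event weight
# rather than accept/reject; the weighted sum estimates the event yield.
print(f"expected interactions per {E.size} sampled neutrinos: {w.sum():.3f}")
print(f"weighted mean energy of interacting events: {np.average(E, weights=w):.0f} GeV")
```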

    State of the climate in 2018

    In 2018, the dominant greenhouse gases released into Earth’s atmosphere—carbon dioxide, methane, and nitrous oxide—continued their increase. The annual global average carbon dioxide concentration at Earth’s surface was 407.4 ± 0.1 ppm, the highest in the modern instrumental record and in ice core records dating back 800 000 years. Combined, greenhouse gases and several halogenated gases contribute just over 3 W m−2 to radiative forcing and represent a nearly 43% increase since 1990. Carbon dioxide is responsible for about 65% of this radiative forcing. With a weak La Niña in early 2018 transitioning to a weak El Niño by the year’s end, the global surface (land and ocean) temperature was the fourth highest on record, with only 2015 through 2017 being warmer. Several European countries reported record high annual temperatures. There were also more high, and fewer low, temperature extremes than in nearly all of the 68-year extremes record. Madagascar recorded a record daily temperature of 40.5°C in Morondava in March, while South Korea set its record high of 41.0°C in August in Hongcheon. Nawabshah, Pakistan, recorded its highest temperature of 50.2°C, which may be a new daily world record for April. Globally, the annual lower troposphere temperature was third to seventh highest, depending on the dataset analyzed. The lower stratospheric temperature was approximately fifth lowest. The 2018 Arctic land surface temperature was 1.2°C above the 1981–2010 average, tying for third highest in the 118-year record, following 2016 and 2017. June’s Arctic snow cover extent was almost half of what it was 35 years ago. Across Greenland, however, regional summer temperatures were generally below or near average. Additionally, a satellite survey of 47 glaciers in Greenland indicated a net increase in area for the first time since records began in 1999. Increasing permafrost temperatures were reported at most observation sites in the Arctic, with the overall increase of 0.1°–0.2°C between 2017 and 2018 being comparable to the highest rate of warming ever observed in the region. On 17 March, Arctic sea ice extent marked the second smallest annual maximum in the 38-year record, larger than only 2017. The minimum extent in 2018 was reached on 19 September and again on 23 September, tying 2008 and 2010 for the sixth lowest extent on record. The 23 September date tied 1997 as the latest sea ice minimum date on record. First-year ice now dominates the ice cover, comprising 77% of the March 2018 ice pack compared to 55% during the 1980s. Because thinner, younger ice is more vulnerable to melting out in summer, this shift in sea ice age has contributed to the decreasing trend in minimum ice extent. Regionally, Bering Sea ice extent was at record lows for almost the entire 2017/18 ice season. For the Antarctic continent as a whole, 2018 was warmer than average. On the highest points of the Antarctic Plateau, the automatic weather station Relay (74°S) broke or tied six monthly temperature records throughout the year, with August breaking its record by nearly 8°C. However, cool conditions in the western Bellingshausen Sea and Amundsen Sea sector contributed to a low melt season overall for 2017/18. High SSTs contributed to low summer sea ice extent in the Ross and Weddell Seas in 2018, underpinning the second lowest Antarctic summer minimum sea ice extent on record. 
Despite conducive conditions for its formation, the ozone hole at its maximum extent in September was near the 2000–18 mean, likely due to an ongoing slow decline in stratospheric chlorine monoxide concentration. Across the oceans, globally averaged SST decreased slightly since the record El Niño year of 2016 but was still far above the climatological mean. On average, SST is increasing at a rate of 0.10° ± 0.01°C decade−1 since 1950. The warming appeared largest in the tropical Indian Ocean and smallest in the North Pacific. The deeper ocean continues to warm year after year. For the seventh consecutive year, global annual mean sea level became the highest in the 26-year record, rising to 81 mm above the 1993 average. As anticipated in a warming climate, the hydrological cycle over the ocean is accelerating: dry regions are becoming drier and wet regions rainier. Closer to the equator, 95 named tropical storms were observed during 2018, well above the 1981–2010 average of 82. Eleven tropical cyclones reached Saffir–Simpson scale Category 5 intensity. North Atlantic Major Hurricane Michael's landfall intensity of 140 kt was the fourth strongest for any continental U.S. hurricane landfall in the 168-year record. Michael caused more than 30 fatalities and $25 billion (U.S. dollars) in damages. In the western North Pacific, Super Typhoon Mangkhut led to 160 fatalities and $6 billion (U.S. dollars) in damages across the Philippines, Hong Kong, Macau, mainland China, Guam, and the Northern Mariana Islands. Tropical Storm Son-Tinh was responsible for 170 fatalities in Vietnam and Laos. Nearly all the islands of Micronesia experienced at least moderate impacts from various tropical cyclones. Across land, many areas around the globe received copious precipitation, notable at different time scales. Rodrigues and Réunion Island near southern Africa each reported their third wettest year on record. In Hawaii, 1262 mm precipitation at Waipā Gardens (Kauai) on 14–15 April set a new U.S. record for 24-h precipitation. In Brazil, the city of Belo Horizonte received nearly 75 mm of rain in just 20 minutes, nearly half its monthly average. Globally, fire activity during 2018 was the lowest since the start of the record in 1997, with a combined burned area of about 500 million hectares. This reinforced the long-term downward trend in fire emissions driven by changes in land use in frequently burning savannas. However, wildfires burned 3.5 million hectares across the United States, well above the 2000–10 average of 2.7 million hectares. Combined, U.S. wildfire damages for the 2017 and 2018 wildfire seasons exceeded $40 billion (U.S. dollars).

    Scintillation light in SBND: simulation, reconstruction, and expected performance of the photon detection system

    SBND is the near detector of the Short-Baseline Neutrino program at Fermilab. Its location close to the Booster Neutrino Beam source and its relatively large mass will allow the study of neutrino interactions on argon with unprecedented statistics. This paper describes the expected performance of the SBND photon detection system, using a simulated sample of beam neutrinos and cosmogenic particles. The design is a dual readout concept combining a system of 120 photomultiplier tubes, used for triggering, with a system of 192 X-ARAPUCA devices, located behind the anode wire planes. Furthermore, covering the cathode plane with highly reflective panels coated with a wavelength-shifting compound recovers part of the light emitted towards the cathode, where no optical detectors exist. We show how this design provides a high light yield, a more uniform detection efficiency, an excellent timing resolution, and an independent 3D position reconstruction using only the scintillation light. Finally, the whole reconstruction chain is applied to recover the temporal structure of the beam spill, which is resolved at the nanosecond level
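    As a cartoon of light-only position reconstruction (a generic charge-weighted barycenter, not SBND's actual algorithm), one can estimate where an interaction occurred from the photon counts seen by optical detectors at known positions. All geometry and counts below are fabricated.

```python
# Toy light-barycenter position estimate from photon counts recorded by
# optical detectors at known positions. Geometry and counts are made up.
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical grid of optical detectors on one wall (x = 0 plane), in cm.
ys, zs = np.meshgrid(np.linspace(0, 400, 8), np.linspace(0, 500, 10))
det_pos = np.column_stack([np.zeros(ys.size), ys.ravel(), zs.ravel()])

true_vertex = np.array([100.0, 210.0, 260.0])

# Expected photons fall off as 1/r^2; observed counts are Poisson draws.
r2 = ((det_pos - true_vertex) ** 2).sum(axis=1)
counts = rng.poisson(5e6 / r2)

# Charge-weighted barycenter in the detector plane (y, z only; in a TPC
# the drift coordinate would typically come from timing information).
bary = (counts[:, None] * det_pos).sum(axis=0) / counts.sum()
print("true (y, z):", true_vertex[1:], " estimated:", bary[1:].round(1))
```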
