155 research outputs found
Commodities, inflation and monetary policy: a global perspective
Journal article
libstable: Fast, Parallel, and High-Precision Computation of α-Stable Distributions in R, C/C++, and MATLAB
α-stable distributions are a family of well-known probability distributions. However, the lack of closed analytical expressions hinders their application. Currently, several tools have been developed to numerically evaluate their density and distribution functions or to estimate their parameters, but available solutions either do not reach sufficient precision in their evaluations or are excessively slow for practical purposes. Moreover, they do not take full advantage of the parallel processing capabilities of current multi-core machines. Other solutions work only on a subset of the α-stable parameter space. In this paper we present an R package and a C/C++ library with a MATLAB front-end that permit parallelized, fast and high-precision evaluation of density, distribution and quantile functions, as well as random variable generation and parameter estimation of α-stable distributions in their whole parameter space. The described library can be easily integrated into third-party developments.
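The operations the abstract lists (density, distribution and quantile evaluation plus random variate generation) can be illustrated with SciPy's `levy_stable` distribution; this is a stand-in for demonstration, not libstable's own C/R/MATLAB API.

```python
# Sketch of the computations described above, using SciPy's levy_stable
# instead of libstable. alpha is the stability index, beta the skewness.
import numpy as np
from scipy.stats import levy_stable

alpha, beta = 1.5, 0.0                    # a symmetric 1.5-stable law
x = np.linspace(-5.0, 5.0, 11)

pdf = levy_stable.pdf(x, alpha, beta)     # density function
cdf = levy_stable.cdf(x, alpha, beta)     # distribution function
q = levy_stable.ppf(0.975, alpha, beta)   # quantile function (inverse CDF)
r = levy_stable.rvs(alpha, beta, size=1000, random_state=0)  # random variates

# A symmetric (beta = 0) stable law is centered, so its CDF at 0 is 1/2
assert abs(float(levy_stable.cdf(0.0, alpha, beta)) - 0.5) < 1e-4
```

Because no closed-form PDF exists for general (alpha, beta), such calls integrate the characteristic function numerically, which is exactly the precision/speed trade-off the paper addresses.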
Insight into ADHD diagnosis with deep learning on Actimetry: Quantitative interpretation of occlusion maps in age and gender subgroups
Attention Deficit/Hyperactivity Disorder (ADHD) is a prevalent neurodevelopmental disorder in childhood that often persists into adulthood. Objectively diagnosing ADHD can be challenging due to the reliance on subjective questionnaires in clinical assessment. Fortunately, recent advancements in artificial intelligence (AI) have shown promise in providing objective diagnoses through the analysis of medical images or activity recordings. These AI-based techniques have demonstrated accurate ADHD diagnosis; however, the growing complexity of deep learning models has introduced a lack of interpretability. These models often function as black boxes, unable to offer meaningful insights into the data patterns that characterize ADHD.
Funding: Agencia Estatal de Investigación (grants PID2020-115339RB-I00, TED2021-130090B-I00 and TED2021-131536B-I00); EU Horizon 2020 Research and Innovation Programme under the Marie Sklodowska-Curie grant agreement (101008297); company ESAOTE Ltd (grant 18IQBM)
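The occlusion maps named in the title are a generic interpretability technique: mask part of the input and measure how much the model's score drops. A minimal sketch for a 1-D activity recording follows; the toy scoring function is purely illustrative, not the authors' deep network.

```python
# Occlusion-sensitivity sketch: zero out a sliding window of the input and
# accumulate the resulting drop in the model's score as a heat map.
import numpy as np

def occlusion_map(signal, model, window=8):
    """Per-sample importance: score drop caused by occluding each region."""
    base = model(signal)
    heat = np.zeros(len(signal))
    for start in range(0, len(signal) - window + 1):
        occluded = signal.copy()
        occluded[start:start + window] = 0.0        # mask this region
        heat[start:start + window] += base - model(occluded)
    return heat

# Toy stand-in model: only energy in the first half of the recording counts
model = lambda s: float(np.sum(s[: len(s) // 2] ** 2))
sig = np.ones(32)
heat = occlusion_map(sig, model)
# The map highlights the first half and stays flat at zero in the second
assert heat[:8].max() > 0.0 and abs(heat[-4:]).max() == 0.0
```

High heat marks regions whose removal hurts the prediction most, which is what makes the maps interpretable across age and gender subgroups.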
Libstable: Fast, Parallel and High-Precision Computation of α-Stable Distributions in C/C++ and MATLAB
α-stable distributions are a wide family of probability distributions used in many fields where probabilistic approaches are taken. However, the lack of closed analytical expressions is a major drawback for their application. Currently, several tools have been developed to numerically evaluate their density and distribution functions or estimate their parameters, but available solutions either do not reach sufficient precision in their evaluations or are too slow for several practical purposes. Moreover, they do not take full advantage of the parallel processing capabilities of current multi-core machines. Other solutions work only on a subset of the α-stable parameter space. In this paper we present a C/C++ library and a MATLAB front-end that allow fully parallelized, fast and high-precision evaluation of density, distribution and quantile functions (PDF, CDF and CDF⁻¹, respectively), random variable generation and parameter estimation of α-stable distributions in their whole parameter space. The library provided can be easily integrated into third-party developments.
Diffusion sampling schemes: A generalized methodology with nongeometric criteria
Purpose: The aim of this paper is to show that geometrical criteria for designing multishell q-space sampling procedures do not necessarily translate into reconstruction matrices with high figures of merit commonly used in compressed sensing theory. In addition, we show that a well-known method for visiting k-space in radial three-dimensional acquisitions, namely the Spiral Phyllotaxis, is a competitive initialization for the optimization of our nonconvex objective function.
Theory and Methods: We propose the gradient design method WISH (WeIghting SHells), which uses an objective function that accounts for weighted distances between gradients within M-tuples of consecutive shells, with M ranging between 1 and the maximum number of shells S. All the M-tuples share the same weight for a given M. The objective function is optimized for a sample of these weights, using Spiral Phyllotaxis as initialization. State-of-the-art General Electrostatic Energy Minimization (GEEM) and Spherical Codes (SC) were used for comparison. For the three methods, reconstruction matrices of the attenuation signal using MAP-MRI were tested with figures of merit borrowed from compressed sensing theory (namely, the Restricted Isometry Property, RIP, and Coherence); we also tested the gradient design using a geometric criterion based on Voronoi cells.
Results: For RIP and Coherence, WISH obtained better results in at least one combination of weights, whilst the criterion based on Voronoi cells showed an unrelated pattern.
Conclusion: The versatility provided by WISH is supported by better results. Optimization in the weight parameter space is likely to provide additional improvements. For a practical design with an intermediate number of gradients, our results recommend carrying out the methodology used here to determine the appropriate gradient table.
Funding: Agencia Estatal de Investigación (grants RTI2018-094569-B-I00, PID2020-115339RB-I00 and TED2021-130090B-I00); ESAOTE, Ltd (grant 18IQBM)
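One of the compressed-sensing figures of merit mentioned above, mutual coherence, is simple to compute: it is the largest absolute inner product between distinct unit-normalized columns of the reconstruction matrix. A minimal sketch, with a random matrix standing in for the MAP-MRI reconstruction matrix:

```python
# Mutual coherence of a matrix A: max_{i != j} |<a_i, a_j>| over
# unit-normalized columns. Lower coherence is better for sparse recovery.
import numpy as np

def mutual_coherence(A):
    cols = A / np.linalg.norm(A, axis=0, keepdims=True)
    G = np.abs(cols.T @ cols)          # Gram matrix of normalized columns
    np.fill_diagonal(G, 0.0)           # drop the trivial i == j entries
    return float(G.max())

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 16))      # illustrative stand-in matrix
mu = mutual_coherence(A)
assert 0.0 <= mu <= 1.0

# Orthonormal columns achieve the ideal value 0
Q, _ = np.linalg.qr(rng.standard_normal((16, 16)))
assert mutual_coherence(Q) < 1e-10
```

Verifying the full Restricted Isometry Property is combinatorial in general, which is why coherence is the figure of merit that is checked directly in practice.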
OpenCLIPER: an OpenCL-based C++ Framework for Overhead-Reduced Medical Image Processing and Reconstruction on Heterogeneous Devices
Medical image processing is often limited by the computational cost of the
involved algorithms. Whereas dedicated computing devices (GPUs in particular)
exist and do provide significant efficiency boosts, they have an extra cost of
use in terms of housekeeping tasks (device selection and initialization, data
streaming, synchronization with the CPU and others), which may hinder
developers from using them. This paper describes an OpenCL-based framework that
is capable of handling dedicated computing devices seamlessly and that allows
the developer to concentrate on image processing tasks.
The framework automatically handles device discovery and initialization, data
transfers to and from the device and the file system, and kernel loading and
compilation. Data structures need to be defined only once, independently of the
computing device; consequently, code is unique for every device, including the
host CPU. Pinned memory/buffer mapping is used to achieve maximum performance
in data transfers.
Code fragments included in the paper show how the computing device is almost
immediately and effortlessly available to the user's algorithms, so they can
focus on productive work. Code required for device selection and
initialization, data loading and streaming, and kernel compilation is minimal
and systematic. Algorithms can be thought of as mathematical operators (called
processes), with input, output and parameters, and they may be chained one
after another easily and efficiently. Also for efficiency, processes can have
their initialization work split from their core workload, so process chains and
loops do not incur performance penalties. Algorithm code is independent of the
targeted device type.
An extended BEPU approach integrating probabilistic assumptions on the availability of safety systems in deterministic safety analyses
The International Atomic Energy Agency (IAEA) produced guidance on the use of Deterministic Safety Analysis (DSA) for the design and licensing of Nuclear Power Plants (NPPs) in "DSA for NPP Specific Safety Guide, No. SSG-2", which proposes four options for the application of DSA. Option 3 involves the use of Best Estimate codes and data together with an evaluation of the uncertainties, the so-called BEPU methodology. Several BEPU approaches have been developed within scopes that are accepted by regulatory authorities nowadays. They normally adopt conservative assumptions on the availability of safety systems. Option 4 goes beyond this by pursuing the incorporation of realistic assumptions on the availability of safety systems into the DSA. This paper proposes an Extended BEPU (EBEPU) approach that integrates insights from Probabilistic Safety Analysis into a typical BEPU approach, aiming to combine well-established BEPU methods with realistic ("probabilistic") assumptions on safety system availability. The paper presents the fundamentals of the EBEPU approach and the main results obtained for an example application focused on an accident scenario corresponding to the initiating event "Loss of Feed Water (LOFW)" for a typical three-loop Pressurized Water Reactor (PWR) NPP.
This work was developed partially with the support of the Programa de Apoyo a la Investigación y Desarrollo of the Universitat Politècnica de València (PAID-UPV).
Martorell Alsina, S. S.; Sánchez Sáez, F.; Villanueva López, J. F.; Carlos Alberola, S. (2017). An extended BEPU approach integrating probabilistic assumptions on the availability of safety systems in deterministic safety analyses. Reliability Engineering & System Safety, 167:474-483. doi:10.1016/j.ress.2017.06.020
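The "evaluation of the uncertainties" in BEPU practice is very often done with nonparametric order statistics (Wilks' formula), although the abstract does not name the specific method this paper uses; the following is a hedged sketch of that common approach, with a toy stand-in for the best-estimate code.

```python
# First-order Wilks sizing: the smallest number of code runs N such that
# the sample maximum is a one-sided (coverage, confidence) tolerance bound,
# i.e. 1 - coverage**N >= confidence.
import math
import random

def wilks_n(coverage=0.95, confidence=0.95):
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

# Toy stand-in for one best-estimate code run with sampled uncertain inputs
# (a real analysis would run a thermal-hydraulic code per sample).
random.seed(0)
peak_temp = lambda: 900.0 + random.gauss(0.0, 25.0)

n = wilks_n()                        # classic 95%/95% result: 59 runs
runs = [peak_temp() for _ in range(n)]
bound = max(runs)                    # 95/95 upper bound on the safety variable
assert n == 59
```

In an EBEPU setting, such sampling would additionally draw the availability state of safety systems from their probabilistic models rather than fixing them conservatively.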