86 research outputs found

    Measuring Loss Potential of Hedge Fund Strategies

    Get PDF
    We measure the loss potential of Hedge Funds by combining three market risk measures: VaR, Draw-Down and Time Under-The-Water. Calculations are carried out under three different frameworks for Hedge Fund returns: i) Normality and time-independence, ii) Non-normality and time-independence, and iii) Non-normality and time-dependence. In the case of Hedge Funds, our results clearly show that market risk may be substantially underestimated by models which assume Normality or which, even allowing for Non-Normality, neglect to model time-dependence. Moreover, VaR is an incomplete measure of market risk whenever the Normality assumption does not hold. In this case, VaR results must be compared with Draw-Down and Time Under-The-Water measures in order to accurately assess the loss potential of Hedge Funds.
    Keywords: Hedge Fund, Value-at-Risk, risk, performance, drawdown, under-the-water, normal returns, non-normal returns, time-dependence, ARMA, Monte Carlo, skewness, kurtosis, mixture of Gaussian distributions, survival probability, styles, investment strategies
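    A minimal sketch of the three measures the abstract combines, computed on a simulated heavy-tailed return path. This is not the authors' code; the distribution, sample size and confidence level are illustrative assumptions.

```python
# Hedged sketch: VaR, maximum draw-down and time under the water for one
# simulated return series. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=2_000) * 0.01   # heavy-tailed returns (assumed)

# Historical 95% Value-at-Risk: the loss threshold exceeded 5% of the time.
var_95 = -np.quantile(returns, 0.05)

# Cumulative wealth path, running peak, and draw-down series.
wealth = np.cumprod(1.0 + returns)
peak = np.maximum.accumulate(wealth)
drawdown = 1.0 - wealth / peak
max_drawdown = drawdown.max()

# Time under the water: longest consecutive stretch spent below the previous peak.
longest, current = 0, 0
for below_peak in drawdown > 0:
    current = current + 1 if below_peak else 0
    longest = max(longest, current)

print(f"VaR(95%)                  : {var_95:.4f}")
print(f"Max draw-down             : {max_drawdown:.4f}")
print(f"Time under water (periods): {longest}")
```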

    Solving the Optimal Trading Trajectory Problem Using a Quantum Annealer

    Get PDF
    We solve a multi-period portfolio optimization problem using D-Wave Systems' quantum annealer. We derive a formulation of the problem, discuss several possible integer encoding schemes, and present numerical examples that show high success rates. The formulation incorporates transaction costs (including permanent and temporary market impact), and, significantly, the solution does not require the inversion of a covariance matrix. The discrete multi-period portfolio optimization problem we solve is significantly harder than the continuous variable problem. We present insight into how results may be improved using suitable software enhancements, and why current quantum annealing technology limits the size of the problems that can be successfully solved today. The formulation presented is specifically designed to be scalable, with the expectation that as quantum annealing technology improves, larger problems will be solvable using the same techniques.Comment: 7 pages; expanded and updated
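    A toy sketch of the general idea, not the authors' D-Wave formulation: integer holdings are encoded in binary variables and a multi-period objective with transaction costs is scored over all bit assignments; exhaustive search stands in for the annealer on this tiny instance. Asset count, returns, covariance, risk aversion and cost rate are all assumed placeholders.

```python
# Hedged sketch: binary-encoded integer holdings scored against a toy
# multi-period mean-variance objective with linear transaction costs.
import itertools
import numpy as np

n_assets, n_periods, bits = 2, 2, 2          # holdings in {0,...,3} via a 2-bit encoding
mu = np.array([[0.05, 0.03], [0.04, 0.06]])  # expected returns per period (assumed)
cov = np.array([[0.02, 0.005], [0.005, 0.03]])
gamma, cost = 1.0, 0.01                      # risk aversion and transaction cost (assumed)

def decode(bitstring):
    """Map a flat bit vector to integer holdings w[period, asset]."""
    w = np.zeros((n_periods, n_assets))
    for t in range(n_periods):
        for a in range(n_assets):
            chunk = bitstring[(t * n_assets + a) * bits:(t * n_assets + a + 1) * bits]
            w[t, a] = sum(b << k for k, b in enumerate(chunk))
    return w

def objective(w):
    """Return minus risk minus trading costs, summed over periods."""
    prev, total = np.zeros(n_assets), 0.0
    for t in range(n_periods):
        total += mu[t] @ w[t] - gamma * w[t] @ cov @ w[t] - cost * np.abs(w[t] - prev).sum()
        prev = w[t]
    return total

# Exhaustive search over all bit assignments stands in for the annealer here.
best = max(itertools.product([0, 1], repeat=n_periods * n_assets * bits),
           key=lambda bs: objective(decode(bs)))
print("best holdings per period:\n", decode(best))
```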

    Pseudo-mathematics and financial charlatanism: the effects of backtest overfitting on out-of-sample performance

    Get PDF
    A backtest is a historical simulation of an algorithmic investment strategy. Among other things, it computes the series of profits and losses that such a strategy would have generated had that algorithm been run over that time period. Popular performance statistics, such as the Sharpe ratio or the Information ratio, are used to quantify the backtested strategy's return on risk. Investors typically study those backtest statistics and then allocate capital to the best performing scheme.
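    A brief sketch of the statistic the abstract refers to and of the selection effect the paper warns about: an annualised Sharpe ratio computed from a backtested daily P&L series, and the inflated value obtained by keeping only the best of many skill-free trials. The data are simulated placeholders, not results from the paper.

```python
# Hedged sketch: annualised Sharpe ratio from daily returns, plus a miniature
# illustration of backtest overfitting via selection of the best trial.
import numpy as np

rng = np.random.default_rng(42)
daily_pnl = rng.normal(loc=0.0004, scale=0.01, size=252)   # one year of daily returns (assumed)

def sharpe_ratio(returns, periods_per_year=252, risk_free=0.0):
    """Mean excess return over volatility, scaled to an annual figure."""
    excess = returns - risk_free / periods_per_year
    return np.sqrt(periods_per_year) * excess.mean() / excess.std(ddof=1)

print(f"annualised Sharpe ratio: {sharpe_ratio(daily_pnl):.2f}")

# Even with zero true edge, picking the best in-sample Sharpe ratio among many
# trials yields an impressive-looking number -- the paper's central warning.
n_trials = 100
best = max(sharpe_ratio(rng.normal(0.0, 0.01, 252)) for _ in range(n_trials))
print(f"best in-sample Sharpe over {n_trials} skill-free trials: {best:.2f}")
```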

    Connecting the Dots in Trustworthy Artificial Intelligence: From AI Principles, Ethics, and Key Requirements to Responsible AI Systems and Regulation

    Full text link
    Trustworthy Artificial Intelligence (AI) is based on seven technical requirements sustained over three main pillars that should be met throughout the system's entire life cycle: it should be (1) lawful, (2) ethical, and (3) robust, both from a technical and a social perspective. However, attaining truly trustworthy AI concerns a wider vision that comprises the trustworthiness of all processes and actors that are part of the system's life cycle, and considers the previous aspects from different lenses. A more holistic vision contemplates four essential axes: the global principles for ethical use and development of AI-based systems, a philosophical take on AI ethics, a risk-based approach to AI regulation, and the mentioned pillars and requirements. The seven requirements (human agency and oversight; robustness and safety; privacy and data governance; transparency; diversity, non-discrimination and fairness; societal and environmental wellbeing; and accountability) are analyzed from a triple perspective: What each requirement for trustworthy AI is, Why it is needed, and How each requirement can be implemented in practice. In addition, a practical approach to implementing trustworthy AI systems allows defining the responsibility of AI-based systems facing the law, through a given auditing process. A responsible AI system is therefore the notion we introduce in this work, and a concept of utmost necessity that can be realized through auditing processes, subject to the challenges posed by the use of regulatory sandboxes. Our multidisciplinary vision of trustworthy AI culminates in a debate on the diverging views published lately about the future of AI. Our reflections on this matter conclude that regulation is key to reaching a consensus among these views, and that trustworthy and responsible AI systems will be crucial for the present and future of our society.Comment: 30 pages, 5 figures, under second review

    Consistent patterns of common species across tropical tree communities

    Get PDF
    Trees structure the Earth’s most biodiverse ecosystem, tropical forests. The vast number of tree species presents a formidable challenge to understanding these forests, including their response to environmental change, as very little is known about most tropical tree species. A focus on the common species may circumvent this challenge. Here we investigate abundance patterns of common tree species using inventory data on 1,003,805 trees with trunk diameters of at least 10 cm across 1,568 locations1,2,3,4,5,6 in closed-canopy, structurally intact old-growth tropical forests in Africa, Amazonia and Southeast Asia. We estimate that 2.2%, 2.2% and 2.3% of species comprise 50% of the tropical trees in these regions, respectively. Extrapolating across all closed-canopy tropical forests, we estimate that just 1,053 species comprise half of Earth’s 800 billion tropical trees with trunk diameters of at least 10 cm. Despite differing biogeographic, climatic and anthropogenic histories7, we find notably consistent patterns of common species and species abundance distributions across the continents. This suggests that fundamental mechanisms of tree community assembly may apply to all tropical forests. Resampling analyses show that the most common species are likely to belong to a manageable list of known species, enabling targeted efforts to understand their ecology. Although they do not detract from the importance of rare species, our results open new opportunities to understand the world’s most diverse forests, including modelling their response to environmental change, by focusing on the common species that constitute the majority of their trees.
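    A minimal sketch of the headline calculation: the share of species needed to account for half of all stems, computed from a species-abundance table. The abundances below are simulated with a heavy-tailed toy distribution, not the inventory data used in the paper.

```python
# Hedged sketch: fraction of species that together account for 50% of all trees,
# given per-species abundance counts. Abundances here are simulated placeholders.
import numpy as np

rng = np.random.default_rng(1)
abundances = rng.zipf(a=1.7, size=5_000)             # toy species-abundance distribution (assumed)

counts = np.sort(abundances)[::-1]                   # most common species first
cum_share = np.cumsum(counts) / counts.sum()
n_common = int(np.searchsorted(cum_share, 0.5) + 1)  # species needed to reach 50% of stems

print(f"{n_common} of {counts.size} species "
      f"({100 * n_common / counts.size:.1f}%) account for half of all stems")
```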

    In-situ estimation of ice crystal properties at the South Pole using LED calibration data from the IceCube Neutrino Observatory

    Get PDF
    The IceCube Neutrino Observatory instruments about 1 km3 of deep, glacial ice at the geographic South Pole using 5160 photomultipliers to detect Cherenkov light emitted by charged relativistic particles. An unexpected light propagation effect observed by the experiment is an anisotropic attenuation, which is aligned with the local flow direction of the ice. Birefringent light propagation has been examined as a possible explanation for this effect. The predictions of a first-principles birefringence model developed for this purpose, in particular curved light trajectories resulting from asymmetric diffusion, provide a qualitatively good match to the main features of the data. This in turn allows us to deduce ice crystal properties. Since the wavelength of the detected light is short compared to the crystal size, these crystal properties include not only the crystal orientation fabric, but also the average crystal size and shape, as a function of depth. By adding small empirical corrections to this first-principles model, a quantitatively accurate description of the optical properties of the IceCube glacial ice is obtained. In this paper, we present the experimental signature of ice optical anisotropy observed in IceCube LED calibration data, the theory and parametrization of the birefringence effect, the fitting procedures of these parameterizations to experimental data, as well as the inferred crystal properties.
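    A simplified illustration of the anisotropy signature described above, not IceCube's birefringence model or fitting code: the effective attenuation is modelled as a cosine modulation of azimuth about an assumed flow axis and fitted to simulated calibration-like data. The model form, parameter values and noise level are all assumptions made for illustration.

```python
# Hedged sketch: fit an azimuth-dependent effective attenuation, strongest along
# an assumed ice-flow axis, to simulated LED-calibration-style data.
import numpy as np
from scipy.optimize import curve_fit

def attenuation_model(phi, mean_att, amplitude, flow_axis):
    """Effective attenuation length vs. azimuth, modulated along the flow axis."""
    return mean_att * (1.0 + amplitude * np.cos(2.0 * (phi - flow_axis)))

rng = np.random.default_rng(3)
phi = np.linspace(0.0, 2.0 * np.pi, 72)                        # emitter-receiver azimuths
truth = attenuation_model(phi, 25.0, 0.08, np.deg2rad(130.0))  # metres; toy values (assumed)
observed = truth + rng.normal(0.0, 0.3, phi.size)              # simulated measurement scatter

params, _ = curve_fit(attenuation_model, phi, observed, p0=[20.0, 0.05, 2.0])
mean_att, amplitude, flow_axis = params
print(f"fitted flow-axis direction: {np.rad2deg(flow_axis) % 180:.1f} deg, "
      f"modulation amplitude: {amplitude:.3f}")
```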

    Empirically supported psychological treatments for adults: A selective review

    Get PDF
    Background: Psychological treatments have shown their efficacy, effectiveness, and efficiency in dealing with mental disorders. However, considering the scientific knowledge generated in recent years, no up-to-date review is available in Spanish of which psychological treatments have empirical support. The main goal was to carry out a selective review of the main empirically supported psychological treatments for mental disorders in adults. Method: Levels of evidence and degrees of recommendation were collected based on the criteria proposed by the Spanish National Health System (Clinical Practice Guidelines) for different psychological disorders. Results: The results indicate that psychological treatments have empirical support for addressing a wide range of psychological disorders. The degree of empirical support ranges from low to high depending on the psychological disorder analysed. The review points to certain fields of intervention that need further investigation. Conclusions: Based on this selective review, psychology professionals will have access to rigorous, up-to-date information that allows them to make informed decisions when implementing empirically supported psychotherapeutic procedures, tailored to the characteristics of the people who seek help.

    Multiple Scenario Generation of Subsurface Models: Consistent Integration of Information from Geophysical and Geological Data through Combination of Probabilistic Inverse Problem Theory and Geostatistics

    Get PDF
    Neutrinos with energies above 10^17 eV are detectable with the Surface Detector Array of the Pierre Auger Observatory. The identification is efficiently performed for neutrinos of all flavors interacting in the atmosphere at large zenith angles, as well as for Earth-skimming τ neutrinos with nearly tangential trajectories relative to the Earth. No neutrino candidates were found in ∼14.7 years of data taken up to 31 August 2018. This leads to restrictive upper bounds on their flux. The 90% C.L. single-flavor limit to the diffuse flux of ultra-high-energy neutrinos with an E_ν^-2 spectrum in the energy range 1.0 × 10^17 eV to 2.5 × 10^19 eV is E^2 dN_ν/dE_ν < 4.4 × 10^-9 GeV cm^-2 s^-1 sr^-1, placing strong constraints on several models of neutrino production at EeV energies and on the properties of the sources of ultra-high-energy cosmic rays.
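    A small arithmetic sketch, not taken from the paper: for an E^-2 spectrum, the quoted differential limit k = E^2 dN/dE can be turned into an integral flux over the analysed band via ∫ k E^-2 dE = k (1/E1 - 1/E2). Only the band edges and the quoted limit are used; the conversion itself is an illustration.

```python
# Hedged sketch: integral flux limit implied by the quoted E^-2 differential limit.
k = 4.4e-9               # GeV cm^-2 s^-1 sr^-1 (the 90% C.L. limit quoted above)
E1, E2 = 1.0e8, 2.5e10   # band edges in GeV (10^17 eV and 2.5 x 10^19 eV)

# Integrate dN/dE = k / E^2 over [E1, E2]: result is k * (1/E1 - 1/E2).
integral_flux = k * (1.0 / E1 - 1.0 / E2)   # neutrinos cm^-2 s^-1 sr^-1
print(f"integral flux limit over the band: {integral_flux:.2e} cm^-2 s^-1 sr^-1")
```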