
    Classification methods for Hilbert data based on surrogate density

    An unsupervised and a supervised classification approach for Hilbert random curves are studied. Both rest on a surrogate of the probability density, defined in a distribution-free mixture context from an asymptotic factorization of the small-ball probability. This surrogate density is estimated by a kernel approach applied to the principal components of the data. The focus is on illustrating the classification algorithms and their computational implications, with particular attention to the tuning of the parameters involved. Some asymptotic results are sketched. Applications to simulated and real datasets show how the proposed methods work. Comment: 33 pages, 11 figures, 6 tables
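The kernel-on-principal-components idea behind the surrogate density can be sketched as follows. This is a minimal illustration, not the paper's method: the function names, the Gaussian kernel, the fixed bandwidth and the number of components are all illustrative choices, whereas the paper devotes particular attention to tuning them.

```python
import numpy as np

def pca_scores(curves, n_components=2):
    """Project discretized curves (rows = curves, columns = grid points)
    onto their leading principal components."""
    X = curves - curves.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T

def kernel_density(scores, points, bandwidth=0.5):
    """Gaussian kernel estimate of the surrogate density at `points`,
    built from the principal-component scores of the sample."""
    d = scores.shape[1]
    diffs = points[:, None, :] - scores[None, :, :]        # (m, n, d)
    sq = (diffs ** 2).sum(axis=-1) / bandwidth ** 2
    norm = (2 * np.pi) ** (d / 2) * bandwidth ** d * len(scores)
    return np.exp(-0.5 * sq).sum(axis=1) / norm

def classify(train_curves, labels, new_curves, bandwidth=0.5):
    """Supervised rule: assign each new curve to the class whose
    estimated surrogate density is largest at its score."""
    scores = pca_scores(np.vstack([train_curves, new_curves]))
    train_s = scores[: len(train_curves)]
    new_s = scores[len(train_curves):]
    classes = np.unique(labels)
    dens = np.stack([kernel_density(train_s[labels == c], new_s, bandwidth)
                     for c in classes])
    return classes[dens.argmax(axis=0)]
```

An unsupervised variant would instead look for modes or mixture components of the same kernel estimate over the scores.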

    A decomposition theorem for fuzzy set-valued random variables and a characterization of fuzzy random translation

    Let X be a fuzzy set-valued random variable (FRV), and let H_X be the family of all fuzzy sets B for which the Hukuhara difference X ⊖_H B exists P-almost surely. In this paper, we prove that X can be decomposed as X(ω) = C ⊕ Y(ω) for P-almost every ω ∈ Ω, where C is the unique deterministic fuzzy set that minimizes E[d_2(X,B)^2] as B varies in H_X, and Y is a centered FRV (i.e., its generalized Steiner point is the origin). This decomposition allows us to characterize every FRV translation (i.e., X(ω) = M ⊕ 1_{ξ(ω)} for some deterministic fuzzy convex set M and some random element ξ of the underlying Banach space). In particular, X is an FRV translation if and only if the Aumann expectation E[X] is equal to C up to a translation. Examples, such as the Gaussian case, are provided. Comment: 12 pages, 1 figure. v2: minor revision. v3: minor revision; references, affiliation and acknowledgments added. Submitted version

    On the AGN radio luminosity distribution and the black hole fundamental plane

    We have studied the dependence of the AGN nuclear radio (1.4 GHz) luminosity on both the AGN 2-10 keV X-ray luminosity and the host-galaxy K-band luminosity. A complete sample of 1268 X-ray selected AGN (both type 1 and type 2) has been used, the largest catalogue of AGN belonging to statistically well defined samples for which radio, X-ray and K-band information exists. At variance with previous studies, radio upper limits have been taken into account statistically, using a Bayesian Maximum Likelihood fitting method. A good fit is obtained by assuming a plane in the 3D L_R-L_X-L_K space, namely logL_R = xi_X logL_X + xi_K logL_K + xi_0, with a ~1 dex wide (1 sigma) spread in radio luminosity. As already shown, no evidence of bimodality in the radio luminosity distribution was found, and therefore any definition of radio loudness in AGN is arbitrary. Using scaling relations between the BH mass and the host-galaxy K-band luminosity, we have also derived a new estimate of the BH fundamental plane (in the L_5GHz-L_X-M_BH space). Our analysis shows that previous measures of the BH fundamental plane are biased by ~0.8 dex in favor of the most luminous radio sources. Therefore, many AGN studies, where the BH fundamental plane is used to investigate how AGN regulate their radiative and mechanical luminosity as a function of the accretion rate, as well as many AGN/galaxy co-evolution models, where radio feedback is computed using the AGN fundamental plane, should revise their conclusions. Comment: Submitted to MNRAS. Revised version after minor referee comments. 12 pages, 12 figures
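The plane logL_R = xi_X logL_X + xi_K logL_K + xi_0 can be fitted on detections by ordinary least squares, as in this sketch. Note the hedge: this toy fit ignores the radio upper limits, whereas the paper's central point is that the censored sources must be folded in through a Bayesian Maximum Likelihood method; the function name and return values are illustrative.

```python
import numpy as np

def fit_radio_plane(logLX, logLK, logLR):
    """Least-squares fit of logL_R = xi_X*logL_X + xi_K*logL_K + xi_0.
    Detections only: the Bayesian ML treatment of radio upper limits
    used in the paper is NOT reproduced here."""
    A = np.column_stack([logLX, logLK, np.ones_like(logLX)])
    coef, *_ = np.linalg.lstsq(A, logLR, rcond=None)
    xi_X, xi_K, xi_0 = coef
    spread = (logLR - A @ coef).std()   # scatter around the plane, in dex
    return xi_X, xi_K, xi_0, spread
```

Dropping the upper limits biases such a fit toward the radio-brightest sources, which is precisely the ~0.8 dex effect the abstract describes for earlier fundamental-plane measures.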

    A Note on Fuzzy Set--Valued Brownian Motion

    In this paper, we prove that a fuzzy set-valued Brownian motion B_t, as defined in [1], can be handled via an R^d-valued Wiener process b_t, in the sense that B_t = 1_{b_t}; i.e., B_t is actually the indicator function of a Wiener process.

    Describing the Concentration of Income Populations by Functional Principal Component Analysis on Lorenz Curves

    Lorenz curves are widely used in economic studies (inequality, poverty, differentiation, etc.). From a modeling point of view, such curves can be seen as constrained functional data for which functional principal component analysis (FPCA) could be defined. Although statistically consistent, performing FPCA on the original data can lead to an analysis that is suboptimal from both a mathematical and an interpretative point of view. In fact, the family of Lorenz curves lacks very basic (e.g., vectorial) structures and hence must be treated with ad hoc methods. This work aims to provide a rigorous mathematical framework, via an embedding approach, to define a coherent FPCA for Lorenz curves. This approach is used to explore a functional dataset from the Bank of Italy income survey.
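One way to realize the embedding idea is sketched below: map each discretized Lorenz curve to an unconstrained space (here via its increments, which are positive shares summing to one, followed by a centered log-ratio transform) and run ordinary discretized FPCA there. This particular embedding is an assumption for illustration; the paper's actual construction may differ.

```python
import numpy as np

def lorenz_to_clr(L):
    """Map discretized Lorenz curves (rows; values on an even grid
    including the endpoints L(0)=0 and L(1)=1) to an unconstrained
    space: increments, then centered log-ratio. Illustrative embedding."""
    inc = np.diff(L, axis=1)                     # positive, rows sum to 1
    logi = np.log(inc)
    return logi - logi.mean(axis=1, keepdims=True)

def fpca(Z, n_components=2):
    """Discretized FPCA: SVD of the centered sample matrix gives the
    scores, the principal functions, and the explained-variance ratios."""
    Zc = Z - Z.mean(axis=0)
    _, s, Vt = np.linalg.svd(Zc, full_matrices=False)
    scores = Zc @ Vt[:n_components].T
    var_ratio = s[:n_components] ** 2 / (s ** 2).sum()
    return scores, Vt[:n_components], var_ratio
```

Because the transformed curves live in a plain vector space, the usual FPCA machinery (and its consistency) applies; results can then be mapped back through the inverse transform.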

    How is the Driver's Workload Influenced by the Road Environment?

    This paper focuses on the study of the driver's workload while driving on a rural two-lane road with different traffic flows. The aim of the research is to identify a parameter representative of the driving effort that is sufficiently sensitive to the external factors disturbing the regular driving activity. To this end, the authors used an instrumented vehicle to monitor some physiological parameters of the driver (such as eye movements and galvanic skin resistance), relating their values to the road context. The results confirm that knowledge of the workload is useful for improving road safety only if it is related to the external context, such as road geometry, traffic and visibility. Only in this way can road administrators derive the information needed to plan and direct accurate and productive upgrade operations.

    Analysis of different visual strategies of ‘isolated vehicle’ and ‘disturbed vehicle’

    This paper analyses the driver’s visual behaviour under the two conditions of ‘isolated vehicle’ and ‘disturbed vehicle’. While the meaning of the former is clear, the latter condition considers the influence on driving behaviour of the various objects that can be encountered along the road. These can be classified into static objects (signage, stationary vehicles at the roadside, etc.) and dynamic objects (cars, motorcycles, bicycles). The aim of this paper is to propose a proper analysis of the driver’s visual behaviour. In particular, the authors examined the quality of the visual information acquired from the entire road environment, which is useful for detecting any critical safety condition. To guarantee a thorough examination of the various possible behaviours, the authors combined the test outcomes with other variables related to the road geometry and with the dynamic variables involved while driving. As expected, the results confirmed better performance for the ‘isolated vehicle’ condition on a rural two-lane road with different traffic flows. Moreover, in analysing the various scenarios under the disturbed condition, the proposed indices allow the authors to quantitatively describe the influence on the visual field and the effects on visual behaviour, supporting a critical analysis of the road characteristics. Potential applications of these results may contribute to improving the choice of the best maintenance strategies for a road, selecting the optimal signage location, defining forecasting models for driving behaviour and developing useful instruments for intelligent transportation systems.

    Modeling functional data: a test procedure

    The paper deals with a test procedure for assessing the compatibility of observed data with a reference model, using an estimate of the volumetric part in the small-ball probability factorization, which plays the role of a real complexity index. As a preliminary by-product, we establish some asymptotics for a new estimator of the complexity index. A suitable test statistic is derived and, drawing on U-statistics theory, its asymptotic null distribution is obtained. A study of the level and power of the test for finite sample sizes, together with a comparison with a competitor, is carried out by Monte Carlo simulations. The test procedure is then applied to a financial time series.

    A robust MPC approach for the rebalancing of mobility on demand systems

    A control-oriented model for mobility-on-demand systems is proposed here. The system is first described through dynamical stochastic state-space equations, which are then suitably simplified to obtain a control-oriented model, on which two control strategies based on Model Predictive Control are designed. The first strategy aims at keeping the expected number of vehicles parked in stations within prescribed bounds; the second strategy specifically accounts for stochastic fluctuations around the expected value. The model includes the possibility of weighting the control effort, leading to control solutions that may trade off efficiency and cost. The models and control strategies are validated on a dataset of logged trips of ToBike, the bike-sharing system of the city of Turin, Italy.
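The first strategy's idea (keep the expected fill level of each station inside a band while spending little control effort) can be sketched as a single, drastically simplified receding-horizon step. This is not the paper's MPC: the horizon is one step, the greedy redistribution rule replaces a proper optimization, and joint feasibility of the bounds is assumed; all names are illustrative.

```python
import numpy as np

def rebalance_step(x, forecast_flow, lo, hi):
    """One simplified receding-horizon step: choose relocations u
    (summing to zero, since vehicles are only moved between stations)
    so that the one-step-ahead expected fill x + flow + u stays in
    [lo, hi], correcting each station as little as possible."""
    pred = x + forecast_flow                 # expected fill with no control
    u = np.clip(pred, lo, hi) - pred         # minimal per-station correction
    # Conservation: relocations must sum to zero. Spread the residual
    # imbalance over stations that still have slack toward the bounds.
    imbalance = u.sum()
    slack_dn = (pred + u) - lo               # how much each station can give
    slack_up = hi - (pred + u)               # how much each station can take
    if imbalance > 0:                        # added more than removed
        u -= imbalance * slack_dn / slack_dn.sum()
    elif imbalance < 0:                      # removed more than added
        u -= imbalance * slack_up / slack_up.sum()
    return u
```

A full MPC would repeat this over a multi-step horizon with the stochastic demand model, and the paper's second strategy additionally tightens the bounds to hedge against fluctuations around the expected value.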

    The space density of Compton-thick AGN at z~0.8 in the zCOSMOS-Bright Survey

    The obscured accretion phase of BH growth is a key ingredient in many models linking AGN activity with the evolution of the host galaxy. At present, a complete census of obscured AGN is still missing. The purpose of this work is to assess the reliability of the [NeV] emission line at 3426 A for picking up obscured AGN up to z~1, by assuming that [NeV] is a reliable proxy of the intrinsic AGN luminosity and using moderately deep X-ray data to characterize the amount of obscuration. A sample of 69 narrow-line (Type 2) AGN at z=0.65-1.20 was selected from the 20k-zCOSMOS Bright galaxy sample on the basis of the presence of [NeV] emission. The X-ray properties of these galaxies were then derived using the Chandra-COSMOS coverage of the field; the X-ray-to-[NeV] flux ratio, coupled with X-ray spectral and stacking analyses, was then used to infer whether Compton-thin or Compton-thick absorption was present in these sources. The [NeV] luminosity function was then computed to estimate the space density of Compton-thick (CT) AGN at z~0.8. Twenty-three sources were detected by Chandra, and their properties are consistent with moderate obscuration (on average, ~a few 10^{22} cm^-2). The X-ray properties of the remaining 46 X-ray undetected Type 2 AGN were derived using X-ray stacking analysis. Current data indicate that a fraction as high as ~40% of the present sample is likely to be CT. The space density of CT AGN with logL_2-10keV > 43.5 at z=0.83 is (9.1+/-2.1) x 10^{-6} Mpc^{-3}, in good agreement with both XRB model expectations and the previously measured space density for objects in a similar redshift and luminosity range. We regard our selection technique for CT AGN as clean but not complete, since even mild extinction in the NLR can suppress the [NeV] emission. Therefore, our estimate of their space density should be considered a lower limit. Comment: 10 pages, 7 figures, 2 tables, A&A, in press