27 research outputs found

    A comparison of the CAR and DAGAR spatial random effects models with an application to diabetics rate estimation in Belgium

    When hierarchically modelling an epidemiological phenomenon on a finite collection of sites in space, a latent spatial effect must be included to capture the correlation structure that links the phenomenon to the territory. In this work, we compare two autoregressive spatial models that can serve this purpose: the classical CAR model and the more recent DAGAR model. Unlike the former, the latter has a desirable property: its ρ parameter can be naturally interpreted as the average neighbor pair correlation, and this parameter can be estimated directly when the effect is modelled with a DAGAR rather than a CAR structure. As an application, we model the diabetics rate in Belgium in 2014 and show that these models adequately predict the response variable when no covariates are available.
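
    The two specifications being compared have standard forms in the spatial statistics literature; the following is a hedged sketch in assumed notation, not taken from the paper itself. With W the binary adjacency matrix of the sites, D the diagonal matrix of neighbor counts, and τ a precision scale:

    ```latex
    \begin{align*}
      \text{CAR:}\quad   & \mathbf{w} \sim N\big(\mathbf{0},\; \tau^{-1}(D - \rho W)^{-1}\big),
                           \qquad \rho \in (0, 1),\\
      \text{DAGAR:}\quad & \mathbf{w} \sim N\big(\mathbf{0},\; Q(\rho)^{-1}\big),
                           \qquad Q(\rho) = (I - B)^{\top} F\, (I - B),
    \end{align*}
    ```

    where B collects directed-neighbor autoregression coefficients and F is a diagonal conditional-precision matrix, both depending on ρ. It is this directed-acyclic-graph construction that makes the DAGAR ρ interpretable as the average neighbor pair correlation, whereas the CAR ρ has no such direct reading.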

    A Statistical Approach to the Alignment of fMRI Data

    Multi-subject functional Magnetic Resonance Imaging (fMRI) studies are critical: anatomical and functional structure varies across subjects, so image alignment is necessary. We define a probabilistic model to describe functional alignment. Imposing a prior distribution, such as the matrix von Mises-Fisher distribution, on the orthogonal transformation parameter embeds anatomical information in the estimation of the parameters, i.e., it penalizes the combination of spatially distant voxels. Real applications show an improvement in classification and in the interpretability of the results compared to various functional alignment methods.
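
    The unregularized core of functional alignment with an orthogonal transformation is the orthogonal Procrustes problem, which has a closed-form SVD solution. The sketch below is a minimal illustration of that core only — the thesis's von Mises-Fisher prior and anatomical penalty are omitted, and all names and shapes are hypothetical:

    ```python
    import numpy as np

    def procrustes_align(X, Y):
        # Orthogonal Procrustes: the R with R.T @ R = I minimizing ||X @ R - Y||_F
        # has the closed form R = U @ Vt, where U, S, Vt = svd(X.T @ Y).
        U, _, Vt = np.linalg.svd(X.T @ Y)
        return U @ Vt

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))                   # subject 1: time points x voxels
    Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))   # hypothetical "true" rotation
    Y = X @ Q                                            # subject 2: rotated responses
    R = procrustes_align(X, Y)
    print(np.allclose(X @ R, Y))                         # the alignment is recovered
    ```

    A prior on R, as in the thesis, would shrink this unconstrained solution toward anatomically plausible transformations instead of accepting any orthogonal matrix.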

    Smoothing and Ordering in Discriminant Analysis

    This thesis addresses the question of how to achieve reliable estimation of the posterior probability function in discriminant analysis, both for continuous and for ordered discrete feature variables. In the latter case we are also concerned with the estimation of a posterior which, regarded as a function of the feature variables, is ordered with respect to one or more independent variables.
    Chapter 1 introduces the discrimination problem, establishes notation and describes the possible approaches. Methods of density estimation for use in discriminant analysis are described, including the kernel method, as are some more direct approaches to discrimination and classification. Some comparative studies and their conclusions are reviewed. Means of assessing the performance of a discriminant rule are described, with emphasis on measures of reliability rather than separation. The final section mentions briefly the important problem of variable selection, although this is not addressed elsewhere in the thesis.
    Chapter 2 addresses the problem of choosing smoothing parameters in kernel density estimation with continuous variables when the estimates are to be used for discrimination. It is natural to suspect that the optimal degree of smoothing for marginal density estimates may not be the one that produces an optimal density ratio or posterior probability function when two such estimates are combined. A simulation study confirms that some popular methods for choosing the smoothing parameter can produce an estimated density ratio which is poor in terms of mean square error. Some alternatives are proposed based on direct assessment measures of reliability, not of the marginal estimates but of the predicted probabilities. These are compared to the marginal approaches. To a more limited extent, the optimal (minimum mean square error) kernel method is compared to an optimal spline estimate of the density ratio. Both the marginal and direct methods are then applied to a real data set and the resulting estimates compared with a spline estimate.
    Chapter 3 discusses ordered variables, from qualitative orderings to grouped continuous variables, the ways in which ordering can affect a data set, and suitable models in each case. Particular emphasis is given to discrete kernel estimators and isotonic regression techniques. Some problems in applying existing algorithms for the latter are described and suggestions made for overcoming them.
    Chapter 4 applies ordered kernels and isotonic regression to 1- and 2-dimensional problems using the data of Titterington et al. (1981), concluding that the kernel methods are unable to recover the type of ordering manifested by the data and that a diagnostic approach is required. The results are compared in the univariate case to those of Chapter 2, Section 2.6, which used continuous kernels. The use of isotonic regression is then compared with two logistic models and an independence model using the same data set but with three variables. Suggestions are made for further smoothing of the isotonic estimator, two of which are implemented.
    Finally, Chapter 5 draws some conclusions and makes suggestions for further work. In particular, isotonic splines may be worthy of investigation.
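
    The construction at the heart of Chapter 2 — a posterior probability built from two separately smoothed class densities, whose quality depends on the smoothing parameters — can be sketched minimally as follows. This is an illustration under assumed Gaussian kernels with hypothetical bandwidths, not code or data from the thesis:

    ```python
    import numpy as np

    def kde(train, x, h):
        # 1-D Gaussian kernel density estimate with bandwidth h.
        u = (x[:, None] - train[None, :]) / h
        return np.exp(-0.5 * u**2).sum(axis=1) / (len(train) * h * np.sqrt(2 * np.pi))

    def posterior(x, x0, x1, h0, h1, prior1=0.5):
        # Posterior P(class 1 | x) from the ratio of the two smoothed class densities;
        # h0 and h1 are the smoothing parameters whose choice the thesis studies.
        f0, f1 = kde(x0, x, h0), kde(x1, x, h1)
        return prior1 * f1 / (prior1 * f1 + (1 - prior1) * f0)

    rng = np.random.default_rng(1)
    x0 = rng.normal(0.0, 1.0, 200)   # class 0 training sample
    x1 = rng.normal(2.0, 1.0, 200)   # class 1 training sample
    p = posterior(np.array([-2.0, 1.0, 4.0]), x0, x1, h0=0.4, h1=0.4)
    # p rises from near 0 to near 1 across the class boundary at x = 1
    ```

    Bandwidths h0 and h1 chosen to make each marginal estimate look good need not make this ratio accurate, which motivates the thesis's direct reliability-based selection.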

    Uncertainty in Artificial Intelligence: Proceedings of the Thirty-Fourth Conference


    Risk Management for the Future

    A large part of the academic literature, the business literature, and real-life practice rests on the assumption that uncertainty and risk do not exist. We all know this is not true; yet a whole variety of methods, tools and practices are not attuned to the fact that the future is uncertain and that risks are all around us. Moreover, despite risk management entering the agenda some decades ago, it has introduced risks of its own, as illustrated by the financial crisis. This book goes beyond risk management as it is practised today and discusses what needs to be improved further. The book also offers some cases.

    Behavioral microfoundations of retail credit markets: a theoretical and experimental approximation

    The recent financial crisis has renewed interest in the role that credit, particularly bank credit, plays in amplifying the economic cycle. This thesis focuses on the credit supply side to study the informational efficiency of bank-based financial systems when granting credit to the economy. After a review of the main areas of the literature devoted to this question, we adopt behavioral economics and finance as the conceptual framework for our research. We first discuss the limits of applying the classic paradigm of financial markets, the efficient market hypothesis, to bank-based systems, and offer an alternative three-step approach grounded in the behavioral literature. We then carry out an experimental study to test the first step: a business simulation game designed to replicate the basic setting in which a bank sets its credit policies. The results are tested against the participants' profiles in terms of overconfidence and prospect theory, to determine whether these behavioral biases might explain different credit policies across banks. Finally, we offer a theoretical model to analyze the second and third steps. Assuming some banks are run by overly optimistic managers, the model shows how rational banks would herd to follow their biased competitors, and describes the limits of arbitrage in the industry that would prevent informational efficiency from being restored.

    Proceedings of the 7th Sound and Music Computing Conference

    Proceedings of SMC2010, the 7th Sound and Music Computing Conference, July 21-24, 2010.