14,997 research outputs found

    Reviews

    Get PDF
    John Bowden and Ference Marton, The University of Learning: Beyond Quality and Competence in Higher Education, London: Kogan Page, 1998. ISBN: 0–7494–2292–0. Hardback, x + 310 pages, £35.00

    Exact Fermi coordinates for a class of spacetimes

    Full text link
    We find exact Fermi coordinates for timelike geodesic observers for a class of spacetimes that includes anti-de Sitter spacetime, de Sitter spacetime, the constant density interior Schwarzschild spacetime with positive, zero, and negative cosmological constant, and the Einstein static universe. Maximal charts for Fermi coordinates are discussed. Comment: 15 pages
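    For background (the standard defining property of Fermi coordinates, not a result of this paper): along the observer's timelike geodesic \gamma, the metric is Minkowskian and its first derivatives vanish,

        g_{\mu\nu}\big|_{\gamma} = \eta_{\mu\nu}, \qquad \partial_{\lambda} g_{\mu\nu}\big|_{\gamma} = 0,

    so the Christoffel symbols vanish on \gamma and curvature effects enter only at second order in the distance from the geodesic.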

    Insolvency Experience, Risk-Based Capital, and Prompt Corrective Action in Property-Liability Insurance

    Get PDF
    In December 1992, the National Association of Insurance Commissioners (NAIC) adopted a life-health insurer risk-based capital (RBC) formula and model law that became effective with the 1993 annual statement filed in March 1994. In principle, well-designed RBC requirements can help achieve an efficient reduction in the expected costs of insolvencies. They can provide incentives for insurers to operate safely in cases where market incentives are weak due to government-mandated guarantees of insurer obligations or information asymmetries regarding solvency between insurers and buyers. RBC requirements also may facilitate or encourage prompt corrective action by solvency regulators, by helping regulators to identify weak insurers and giving regulators legal authority to intervene when capital falls below specified levels. RBC requirements may force regulators to act in a more timely manner when confronted with external pressure to delay action. However, RBC requirements have a number of potential limitations. Unavoidable imperfections in any meaningful RBC system will likely distort some insurer decisions in undesirable and unintended ways. RBC requirements by themselves will do little or nothing to help regulators determine when an insurer's reported capital (surplus) is overstated due to understatement of liabilities or overstatement of assets. A well-designed RBC system should minimize the costs associated with misclassification of insurers: it should identify a high proportion of troubled companies early enough to permit regulators to take prompt corrective action, while identifying as troubled only a minimal proportion of financially sound insurers.

    This study analyzes data on solvent and insolvent property-liability insurers to determine whether modifications to the NAIC's RBC formula can improve its ability to predict the firms that subsequently fail, without substantially increasing the proportion of surviving insurers that are incorrectly predicted to fail. It uses logistic regression models to investigate whether changes in the weights of the major components of the RBC formula, and the incorporation of information on company size and organizational form, improve the tradeoff between Type I error rates (the percentage of insurers that later failed that are incorrectly predicted not to fail) and Type II error rates (the percentage of surviving insurers that are incorrectly predicted to fail). The data analyzed were for 1989-91, for firms that subsequently failed and for firms that survived through the first nine months of 1993.

    The authors reach four main conclusions. First, fewer than half of the companies that later failed had RBC ratios within the proposed ranges for regulatory and company action. Second, total and component RBC ratios generally are significantly different for failed and surviving firms based on univariate tests. Third, estimation of multiple logistic regression models of insolvency risk indicates that allowing the weights of the RBC components to vary, and including firm size and organizational form variables, generally produces a material improvement in the tradeoff between sample Type I and Type II error rates. Fourth, the RBC models are noticeably less successful in predicting large-firm insolvencies than smaller ones.

    Regarding the estimated weights in the logistic regression models, a major conclusion is that the reserve component of the NAIC risk-based capital formula, which accounts for half of industry risk-based capital, has virtually no predictive power in any of the tests conducted. Given the high costs associated with large failures and the inferior performance of the models in predicting large insolvencies, a higher payoff in terms of reduced insolvency costs is likely to be achieved by developing models that perform better for large firms.
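    A minimal sketch of this style of analysis, assuming synthetic data and scikit-learn (the component names, coefficients, and threshold below are illustrative assumptions, not the study's data or code):

        # Hypothetical sketch: logistic insolvency model with freely
        # re-weighted RBC components plus size and organizational form.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 2000
        # Four illustrative RBC components (e.g. asset, credit, reserve, premium risk).
        rbc = rng.normal(size=(n, 4))
        size = rng.normal(loc=18.0, scale=1.5, size=(n, 1))     # hypothetical log assets
        mutual = rng.integers(0, 2, size=(n, 1)).astype(float)  # organizational form dummy
        X = np.hstack([rbc, size, mutual])

        # Synthetic failure indicator standing in for subsequent insolvency.
        logits = 0.8 * rbc[:, 0] + 0.5 * rbc[:, 3] - 0.3 * size[:, 0] + 4.0
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        p_fail = model.predict_proba(X)[:, 1]

        # Type I error: failed insurers predicted not to fail.
        # Type II error: surviving insurers predicted to fail.
        threshold = 0.5
        type1 = np.mean(p_fail[y == 1] < threshold)
        type2 = np.mean(p_fail[y == 0] >= threshold)
        print(f"Type I: {type1:.2%}, Type II: {type2:.2%}")

    Varying the threshold traces out the Type I/Type II tradeoff that the study evaluates.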

    Stimulus-invariant processing and spectrotemporal reverse correlation in primary auditory cortex

    Full text link
    The spectrotemporal receptive field (STRF) provides a versatile and integrated spectral and temporal functional characterization of single cells in primary auditory cortex (AI). In this paper, we explore the origin of, and relationship between, different ways of measuring and analyzing an STRF. We demonstrate that STRFs measured using a spectrotemporally diverse array of broadband stimuli -- such as dynamic ripples, spectrotemporally white noise, and temporally orthogonal ripple combinations (TORCs) -- are very similar, confirming earlier findings that the STRF is a robust linear descriptor of the cell. We also present a new deterministic analysis framework that employs the Fourier series to describe the spectrotemporal modulations contained in the stimuli and responses. Additional insights into the STRF measurements, including the nature and interpretation of measurement errors, are presented using the Fourier transform, coupled to singular-value decomposition (SVD), and variability analyses including the bootstrap. The results promote the utility of the STRF as a core functional descriptor of neurons in AI. Comment: 42 pages, 8 figures; to appear in Journal of Computational Neuroscience
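    As a generic illustration of spectrotemporal reverse correlation (a simplified sketch with synthetic data, not the paper's TORC measurement pipeline), an STRF estimate can be formed as a spike-triggered average of the stimulus spectrogram segments preceding each spike and then denoised with SVD:

        # Generic reverse-correlation STRF sketch; stimulus, spike
        # generation, and dimensions below are illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)
        n_freq, n_time, n_lags = 32, 5000, 20

        # Synthetic broadband stimulus spectrogram (frequency x time).
        S = rng.normal(size=(n_freq, n_time))

        # Ground-truth separable STRF used to generate the response.
        f_prof = np.exp(-0.5 * ((np.arange(n_freq) - 16) / 3.0) ** 2)
        t_prof = np.sin(np.linspace(0.0, np.pi, n_lags))
        strf_true = np.outer(f_prof, t_prof)

        # Linear drive through the STRF, then Poisson-like spiking.
        drive = np.array([np.sum(strf_true * S[:, t - n_lags:t])
                          for t in range(n_lags, n_time)])
        spikes = rng.poisson(0.05 * np.clip(drive, 0.0, None))

        # Reverse correlation: spike-triggered average of preceding segments.
        sta = np.zeros((n_freq, n_lags))
        for i, t in enumerate(range(n_lags, n_time)):
            sta += spikes[i] * S[:, t - n_lags:t]
        sta /= max(spikes.sum(), 1)

        # SVD denoising: keep the leading (separable) component as the estimate.
        U, s, Vt = np.linalg.svd(sta)
        strf_est = s[0] * np.outer(U[:, 0], Vt[0])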

    Relative velocities for radial motion in expanding Robertson-Walker spacetimes

    Full text link
    The expansion of space, and other geometric properties of cosmological models, can be studied using geometrically defined notions of relative velocity. In this paper, we consider test particles undergoing radial motion relative to comoving (geodesic) observers in Robertson-Walker cosmologies whose scale factors are increasing functions of cosmological time. Analytical and numerical comparisons of the Fermi, kinematic, astrometric, and spectroscopic relative velocities of test particles are given under general circumstances. Examples include recessional comoving test particles in the de Sitter universe, the radiation-dominated universe, and the matter-dominated universe. Three distinct coordinate charts, each with a different notion of simultaneity, are employed in the calculations. It is shown that the astrometric relative velocity of a radially receding test particle cannot be superluminal in any expanding Robertson-Walker spacetime. However, necessary and sufficient conditions are given for the existence of superluminal Fermi speeds, and it is shown how the four concepts of relative velocity determine geometric properties of the spacetime. Comment: 27 pages, 6 figures
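    For reference (standard Robertson-Walker background, not a result of the paper), the metric and the familiar recession speed of a comoving particle at proper distance d(t) = a(t)\chi are

        ds^2 = -dt^2 + a^2(t)\left[\frac{dr^2}{1-kr^2} + r^2\, d\Omega^2\right], \qquad v_{\mathrm{rec}}(t) = \dot{a}(t)\,\chi = H(t)\,d(t),

    which can exceed the speed of light for sufficiently distant particles; the Fermi, kinematic, astrometric, and spectroscopic velocities compared in the paper are geometrically defined alternatives to this coordinate-dependent notion.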

    Lichens, a unique forage resource threatened by air pollution

    Get PDF
    Lichens are the primary winter forage for most mainland caribou and reindeer herds in North America and for the majority of domestic and wild reindeer in Siberia and northern Europe, collectively totaling in excess of 5 million animals. Lichens represent a unique forage resource throughout much of the circumpolar North that cannot effectively be replaced by vascular plants. Lichens are particularly sensitive to the effects of air pollution. The increased pace of exploitation and processing of mineral and petroleum resources throughout the circumpolar North, with the associated introduction of pollution products into the atmosphere, has already resulted in losses of lichens and reduced lichen productivity in extensive areas adjacent to large metallurgical complexes in the Taimyr of Siberia, on the Kola Peninsula, and in adjacent parts of Finland. Losses of terricolous lichens in the Taimyr from pollution generated by the Norilsk metallurgical complex have been nearly complete within a 300 000 ha area closest to the pollution source, and damage and reduced growth extend over an area in excess of 600 000 ha. The Arctic also is a sink for atmospheric pollution generated in the heavily industrialized north temperate regions of the world. Assessment of the effects on lichens of this global-scale increase in air pollution is difficult because of the lack of representative controls.

    Deep mixture of linear mixed models for complex longitudinal data

    Full text link
    Mixtures of linear mixed models are widely used for modelling longitudinal data for which observation times differ between subjects. In typical applications, temporal trends are described using a basis expansion, with basis coefficients treated as random effects varying by subject. Additional random effects can describe variation between mixture components, or other known sources of variation in complex experimental designs. A key advantage of these models is that they provide a natural mechanism for clustering, which can be helpful for interpretation in many applications. Current versions of mixtures of linear mixed models are not specifically designed for the case where there are many observations per subject and a complex temporal trend, which requires a large number of basis functions to capture. In this case, the subject-specific basis coefficients form a high-dimensional random effects vector, for which the covariance matrix is hard to specify and estimate, especially if it varies between mixture components. To address this issue, we consider the use of recently developed deep mixture of factor analyzers models as the prior for the random effects. The resulting deep mixture of linear mixed models is well-suited to high-dimensional settings, and we describe an efficient variational inference approach to posterior computation. The efficacy of the method is demonstrated on both real and simulated data.
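    A schematic statement of the model class (generic mixture-of-linear-mixed-models notation, assumed for illustration rather than taken from the paper): observation j on subject i is modelled as

        y_{ij} = B(t_{ij})^{\top} \beta_i + \varepsilon_{ij}, \qquad \varepsilon_{ij} \sim N(0, \sigma^2),

    where B(\cdot) is a basis expansion over time and \beta_i is the subject-specific vector of basis coefficients. When the temporal trend is complex, \beta_i is high-dimensional, and the proposal is to replace the usual Gaussian random-effects prior on \beta_i with a deep mixture of factor analyzers prior, e.g. \beta_i = \mu_{z_i} + \Lambda_{z_i}\eta_i + e_i, with the latent factors \eta_i themselves following a mixture of factor analyzers.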

    Variational Inference and Sparsity in High-Dimensional Deep Gaussian Mixture Models

    Full text link
    Gaussian mixture models are a popular tool for model-based clustering, and mixtures of factor analyzers are Gaussian mixture models having a parsimonious factor covariance structure for the mixture components. There are several recent extensions of mixtures of factor analyzers to deep mixtures, where the Gaussian model for the latent factors is replaced by a mixture of factor analyzers. This construction can be iterated to obtain a model with many layers. These deep models are challenging to fit, and we consider Bayesian inference using sparsity priors to further regularize the estimation. A scalable natural gradient variational inference algorithm is developed for fitting the model, and we suggest computationally efficient approaches to architecture choice using overfitted mixtures, in which unnecessary components drop out during estimation. In a number of simulated examples and two real examples, we demonstrate the versatility of our approach for high-dimensional problems and show that the use of sparsity-inducing priors can be helpful for obtaining improved clustering results.
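    As a generative illustration of the layered construction (a minimal sketch with invented dimensions and parameters, not the authors' estimation code), samples can be drawn from a two-layer deep mixture of factor analyzers as follows:

        # Sample from a two-layer deep mixture of factor analyzers:
        # layer 2 draws latent factors from a mixture of factor analyzers,
        # layer 1 maps them to observations through component-specific
        # loadings. All dimensions and parameters are illustrative.
        import numpy as np

        rng = np.random.default_rng(2)
        p, q1, q2 = 50, 10, 3    # observed dim; layer-1 and layer-2 factor dims
        K1, K2 = 4, 2            # mixture components per layer

        # Random layer parameters, purely for the sketch.
        mu1 = rng.normal(size=(K1, p))
        L1 = 0.5 * rng.normal(size=(K1, p, q1))
        mu2 = rng.normal(size=(K2, q1))
        L2 = 0.5 * rng.normal(size=(K2, q1, q2))

        def sample(m):
            # Layer 2: latent factors drawn from a mixture of factor analyzers.
            k2 = rng.integers(0, K2, size=m)
            z = rng.normal(size=(m, q2))
            eta = (mu2[k2] + np.einsum('mij,mj->mi', L2[k2], z)
                   + 0.1 * rng.normal(size=(m, q1)))
            # Layer 1: observations generated from the latent factors.
            k1 = rng.integers(0, K1, size=m)
            x = (mu1[k1] + np.einsum('mij,mj->mi', L1[k1], eta)
                 + 0.1 * rng.normal(size=(m, p)))
            return x, k1, k2

        X, k1, k2 = sample(1000)
        print(X.shape)  # (1000, 50)

    Each path through the per-layer component indicators (k1, k2) induces a cluster, which is what makes overfitted versions of the architecture useful for model selection.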