88,083 research outputs found

    A decision-theoretic approach for segmental classification

    Full text link
    This paper is concerned with statistical methods for the segmental classification of linear sequence data, where the task is to segment and classify the data according to an underlying hidden discrete state sequence. Such analysis is commonplace in the empirical sciences, including genomics, finance and speech processing. In particular, we are interested in answering the following question: given data $y$ and a statistical model $\pi(x, y)$ of the hidden states $x$, what should we report as the prediction $\hat{x}$ under the posterior distribution $\pi(x \mid y)$? That is, how should you make a prediction of the underlying states? We demonstrate that traditional approaches, such as reporting the most probable state sequence or the most probable set of marginal predictions, can give undesirable classification artefacts and offer limited control over the properties of the prediction. We propose a decision-theoretic approach using a novel class of Markov loss functions and report $\hat{x}$ via the principle of minimum expected loss (maximum expected utility). We demonstrate that the sequence of minimum expected loss under the Markov loss function can be enumerated exactly using dynamic programming methods, and that it offers flexibility and performance improvements over existing techniques. The result is generic and applicable to any probabilistic model on a sequence, such as hidden Markov models, change point or product partition models. Comment: Published in at http://dx.doi.org/10.1214/13-AOAS657 the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
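
    As a rough sketch of how such a decoder can work, the code below enumerates a minimum-expected-loss sequence by dynamic programming for one plausible pairwise ("Markov") loss, assuming the posterior unary and pairwise marginals have already been computed (e.g. by forward-backward). The function name, array layout, and the exact form of the loss are illustrative assumptions, not the paper's definitions.

```python
# Hypothetical sketch of minimum-expected-loss decoding under a pairwise
# ("Markov") loss. Assumes posterior marginals are available; names and
# the loss decomposition are our assumptions, not the paper's.
import numpy as np

def min_expected_loss_path(unary_post, pair_post, unary_loss, pair_loss):
    """unary_post: (T, K) posterior marginals p_t(x_t = k)
       pair_post:  (T-1, K, K) pairwise marginals p(x_t = j, x_{t+1} = k)
       unary_loss: (K, K) cost of reporting a when the truth is k
       pair_loss:  (K, K, K, K) cost of reporting transition (a, b)
                   when the true transition is (j, k)."""
    T, K = unary_post.shape
    # Expected unary loss of reporting label a at time t: eu[t, a]
    eu = unary_post @ unary_loss
    # Expected pairwise loss of reporting (a, b) across step t: ep[t, a, b]
    ep = np.einsum('tjk,jkab->tab', pair_post, pair_loss)
    V = eu[0].copy()                    # best expected loss ending in each label
    back = np.zeros((T, K), dtype=int)  # argmin predecessors for traceback
    for t in range(1, T):
        scores = V[:, None] + ep[t - 1] + eu[t][None, :]  # indexed [a, b]
        back[t] = scores.argmin(axis=0)
        V = scores.min(axis=0)
    # Trace back the minimising sequence from the best final label
    path = np.empty(T, dtype=int)
    path[-1] = V.argmin()
    for t in range(T - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path
```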

    Insurance Policies: The Grandparents of Contractual Black Holes

    Get PDF
    In their recent article, The Black Hole Problem in Commercial Boilerplate, Professors Stephen Choi, Mitu Gulati, and Robert Scott identify a phenomenon found in standardized contracts that they describe as “contractual black holes.” The concept of black holes comes from theoretical physics. Under the original hypothesis, the gravitational pull of a black hole is so strong that once light or information is pulled past an event horizon into a black hole, it cannot escape. In recent years, the theory has been reformulated, and now the hypothesis is that some information can escape, but it is so degraded that it is virtually useless. In their article, Choi, Gulati, and Scott apply the black hole concept to certain standardized contractual boilerplate provisions. Although the focus of their article is the contractual black hole nature of pari passu clauses used in sovereign debt contracts, Choi, Gulati, and Scott note that “[s]tandard insurance contracts appear to be another area with the potential for terms that have lost meaning.” They are correct that insurance policies are an area in which contractual black holes would appear quite likely to develop. In this essay, to test the hypothesis that insurance policies potentially are, or contain, contractual black holes, four policy provisions found in commercial insurance policies are considered: 1) “Sue and Labor” Clauses, 2) “Ensuing Loss” Clauses, 3) “Non-Cumulation” Clauses, and 4) the “Sudden and Accidental” Pollution Exclusion. An examination of these provisions demonstrates that some policy provisions have become contractual black holes, some provisions are only apparent contractual black holes, and other provisions on their way to becoming contractual black holes were saved before their original meaning crossed the event horizon.

    Does IT Spending Matter on Hospital Financial Performance and Quality?

    Get PDF
    This research explored the impacts of IT spending on hospital financial performance and hospital quality. We developed two research hypotheses accordingly. The first hypothesis was that IT spending would be positively related to hospital financial performance, and the second was that hospitals with higher IT spending would have better quality metrics. We used the 2017 American Hospital Association Survey data and the HCAHPS dataset from the Medicare website. We tested three hospital financial measures and three quality measures. We employed t-tests and ANOVA models to test the hypotheses. Results were inconclusive for both hypotheses: evidence showed statistical significance on only two out of seven tests.
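
    The study's actual grouping variables and data are not reproduced here, but the general shape of the reported analysis looks like the sketch below: a two-sample t-test and a one-way ANOVA on a hospital metric split by IT-spending group. All data are simulated and all names are hypothetical.

```python
# Illustrative only: simulated data standing in for the hospital metrics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
low_it = rng.normal(loc=0.050, scale=0.02, size=120)   # e.g. operating margin
high_it = rng.normal(loc=0.055, scale=0.02, size=120)

# H1-style test: financial metric compared across two IT-spending groups
t_stat, p_val = stats.ttest_ind(high_it, low_it, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_val:.3f}")

# H2-style test: quality metric compared across, say, three IT-spending tiers
tiers = [rng.normal(3.5 + 0.1 * i, 0.5, size=80) for i in range(3)]
f_stat, p_anova = stats.f_oneway(*tiers)
print(f"F = {f_stat:.3f}, p = {p_anova:.3f}")
```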

    Chris Brown: Out of control mess or grossly misunderstood Artist?

    Get PDF
    In today’s pop culture world, celebrities are seen as perfect individuals with grand houses, cars, and entourages. When Chris Brown came on the scene in the mid-2000s, he was a teenage heartthrob who could do no wrong. That all changed when he brutally beat fellow music superstar and then-girlfriend Rihanna in 2009. Brown’s media persona came crashing down, along with seemingly everything else in his life. However, in a situation where many artists’ careers would collapse and never recover, Brown has surged back almost to the heights he reached prior to 2009. How did this happen? What role does the music industry play in this? The norms and tendencies of the popular music industry are examined to determine the external factors that both hindered and helped Brown’s changes in reputation (and, by extension, record sales) over time.

    Young module multiplicities and classifying the indecomposable Young permutation modules

    Full text link
    We study the multiplicities of Young modules as direct summands of permutation modules on cosets of Young subgroups. Such multiplicities have become known as the p-Kostka numbers. We classify the indecomposable Young permutation modules, and, applying the Brauer construction for p-permutation modules, we give some new reductions for p-Kostka numbers. In particular we prove that p-Kostka numbers are preserved under multiplying partitions by p, and strengthen a known reduction given by Henke, corresponding to adding multiples of a p-power to the first row of a partition. Comment: 22 pages
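
    One way to read the multiplication-by-p reduction in symbols (notation ours, not necessarily the paper's): writing $[M^\mu : Y^\lambda]$ for the multiplicity of the Young module $Y^\lambda$ as a direct summand of the Young permutation module $M^\mu$ (the p-Kostka number), and $p\lambda$ for the partition obtained by multiplying every part of $\lambda$ by $p$, the statement that p-Kostka numbers are preserved under multiplying partitions by p would be:

```latex
% One possible symbolic reading of the stated reduction (notation ours):
\[
  [M^{p\mu} : Y^{p\lambda}] \;=\; [M^{\mu} : Y^{\lambda}].
\]
```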

    MS-089: Yarnell Collection

    Full text link
    The Yarnell Collection consists of correspondence received by Orpha Yarnell during World War II from both of her sons, Clyde and Glenn. Clyde served with the 493rd Quartermaster Depot, and his letters describe his time overseas, his training at Camp Harahan, and his stay at Camp Stoneman. Glenn served with the 186th Engineer Combat Battalion, writing from Fort Jackson, Camp Forrest, and New Guinea.

    Dictating Aesthetic and Political Legitimacy through Golden Age Theater: Fuente Ovejuna at the Teatro Español, Directed by Cayetano Luca de Tena (1944)

    Full text link
    Emboldened by their success in the Spanish Civil War (1936–1939), Nationalist ideologues sought to revitalize the stagnant Spanish theater and promote values associated with the newly formed authoritarian regime. The memory and restaging of seventeenth-century comedias became a crucial part of this project that focused particularly on Lope de Vega's Fuente Ovejuna, a history play that dramatizes a village's fifteenth-century rebellion against a tyrannical overlord. The definitive performance of Fuente Ovejuna during the early years of Franco's dictatorship, a production directed by Cayetano Luca de Tena at the Teatro Español in 1944, represented the culmination of the right's struggle to regenerate the theater. By adopting a fascist aesthetic and reinforcing the regime's political legitimacy through history, Luca de Tena's production captured its contemporary moment and signaled a possible solution to the theatrical crisis, one that blended historiography, aesthetics, and politics.

    Semiautomated Skeletonization of the Pulmonary Arterial Tree in Micro-CT Images

    Get PDF
    We present a simple and robust approach that utilizes planar images at different angular rotations, combined with unfiltered back-projection, to locate the central axes of the pulmonary arterial tree. Two three-dimensional points are selected interactively by the user. The computer calculates a sub-volume unfiltered back-projection orthogonal to the vector connecting the two points and centered on the first point. Because more x-rays are absorbed at the thickest portion of the vessel, the darkest pixel in the unfiltered back-projection is assumed to be the center of the vessel. The computer replaces the first point with this newly calculated point. A second back-projection is calculated around the original point, orthogonal to a vector connecting the newly calculated first point and the user-determined second point. The darkest pixel within this reconstruction is determined, and the computer replaces the second point with its XYZ coordinates. Following a vector based on a moving average of previously determined three-dimensional points along the vessel's axis, the computer continues this skeletonization process until stopped by the user. The computer estimates the vessel diameter along the set of previously determined points using a method similar to the full-width half-max algorithm. On all subsequent vessels, the process works the same way, except that at each point the distances between the current point and all previously determined points along different vessels are computed. If a distance is less than the previously estimated diameter, the vessels are assumed to branch. This user/computer interaction continues until the vascular tree has been skeletonized.
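
    A minimal sketch of two of the steps described above: taking the darkest pixel of a sub-volume unfiltered back-projection as the vessel center, and the branch test against previously skeletonized points. Function names and data layout are assumptions for illustration, not the authors' code.

```python
import numpy as np

def vessel_center(backprojection: np.ndarray) -> tuple:
    """Darkest pixel of the reconstruction, taken as the vessel center:
    the thickest part of the vessel absorbs the most x-rays, so it
    reconstructs darkest in an unfiltered back-projection."""
    return np.unravel_index(np.argmin(backprojection), backprojection.shape)

def crosses_existing_vessel(point, prior_points, prior_diameters) -> bool:
    """Branch test: if the current point lies closer to a previously
    skeletonized point than that vessel's estimated diameter, the two
    vessels are assumed to branch."""
    dists = np.linalg.norm(np.asarray(prior_points) - np.asarray(point), axis=1)
    return bool(np.any(dists < np.asarray(prior_diameters)))
```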
