
    An archival case study: revisiting the life and political economy of Lauchlin Currie

    This paper forms part of a wider project to show the significance of archival material on distinguished economists, in this case Lauchlin Currie (1902-93), who studied and taught at Harvard before entering government service at the US Treasury and Federal Reserve Board as the intellectual leader of Roosevelt's New Deal, 1934-39, as FDR's White House economic adviser in peace and war, 1939-45, and as a post-war development economist. It discusses the uses made of the written and oral material available when the author was writing his intellectual biography of Currie (Duke University Press, 1990) while Currie was still alive, and the significance of the material that has come to light after Currie's death.

    Giving voters what they want? Party orientation perceptions and preferences in the British electorate

    Some of the most important propositions in the political marketing literature hinge on assumptions about the electorate. In particular, voters are presumed to react in different ways to different orientations or postures. Yet there are theoretical reasons for questioning some of these assumptions, and certainly they have seldom been empirically tested. Here, we focus on one prominent example of political marketing research: Lees-Marshment's orientations model. We investigate how the public reacts to product and market orientation, whether they see a trade-off between the two (a point in dispute among political marketing scholars), and whether partisans differ from non-partisan voters by being more inclined to value product over market orientation. Evidence from two mass sample surveys of the British public (both conducted online by YouGov) demonstrates important heterogeneity within the electorate, casts doubt on the core assumptions underlying some political marketing arguments, and raises broader questions about what voters are looking for in a party.

    Representing complex data using localized principal components with application to astronomical data

    Often the relation between the variables constituting a multivariate data space might be characterized by one or more of the terms "nonlinear", "branched", "disconnected", "bent", "curved", "heterogeneous" or, more generally, "complex". In these cases, simple principal component analysis (PCA) as a tool for dimension reduction can fail badly. Of the many alternative approaches proposed so far, local approximations of PCA are among the most promising. This paper gives a short review of localized versions of PCA, focusing on local principal curves and local partitioning algorithms. Furthermore, we discuss projections other than the local principal components. When performing local dimension reduction for regression or classification problems, it is important to focus not only on the manifold structure of the covariates but also on the response variable(s). Local principal components only achieve the former, whereas localized regression approaches concentrate on the latter. Local projection directions derived from the partial least squares (PLS) algorithm offer an interesting trade-off between these two objectives. We apply these methods to several real data sets. In particular, we consider simulated astrophysical data from the future Galactic survey mission Gaia.

    (25 pages. In "Principal Manifolds for Data Visualization and Dimension Reduction", A. Gorban, B. Kegl, D. Wunsch, and A. Zinovyev (eds), Lecture Notes in Computational Science and Engineering, Springer, 2007, pp. 180-204.)
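The local-partitioning idea in this abstract can be sketched in a few lines: partition the data (here with plain k-means) and fit a one-dimensional PCA within each partition. This is an illustrative toy under assumed data, not the local principal curves algorithm the paper reviews; all names are mine.

```python
import numpy as np

def local_pca(X, n_clusters=3, n_iter=20, seed=0):
    """Partition X with k-means, then fit a 1-D PCA in each cluster.

    Returns cluster labels and one unit-length principal direction
    per cluster.  A minimal sketch of the 'local partitioning'
    approach; real localized-PCA methods are considerably more refined.
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):  # plain Lloyd iterations
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    directions = []
    for k in range(n_clusters):
        Xk = X[labels == k]
        if len(Xk) == 0:  # degenerate cluster: fall back to first axis
            directions.append(np.eye(X.shape[1])[0])
            continue
        Xk = Xk - Xk.mean(axis=0)
        # leading eigenvector of the local scatter matrix = local PC1
        _, vecs = np.linalg.eigh(Xk.T @ Xk)
        directions.append(vecs[:, -1])
    return labels, np.array(directions)

# a 'branched' toy set: two line segments meeting at an angle
t = np.linspace(0, 1, 100)
X = np.vstack([np.c_[t, t], np.c_[t + 1, 1 - t]])
labels, dirs = local_pca(X, n_clusters=2)
```

On this branched toy set a single global PC1 would average the two segments, whereas each local direction follows its own branch.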

    Antipsychotic dose escalation as a trigger for Neuroleptic Malignant Syndrome (NMS): literature review and case series report

    Background: "Neuroleptic malignant syndrome" (NMS) is a potentially fatal idiosyncratic reaction to any medication which affects the central dopaminergic system. Between 0.5% and 1% of patients exposed to antipsychotics develop the condition. Mortality rates may be as high as 55%, and many risk factors have been reported. Although rapid escalation of antipsychotic dose is thought to be an important risk factor, to date it has not been the focus of a published case series or scientifically defined.

    Aims: To identify cases of NMS and review risk factors for its development, with a particular focus on rapid dose escalation in the 30 days prior to onset.

    Methodology: A review of the literature on rapid dose escalation was undertaken and a pragmatic definition of "rapid dose escalation" was made. NMS cases were defined using DSM-IV criteria and systematically identified within a secondary care mental health service. A ratio of titration rate was calculated for each NMS patient, and "rapid escalators" and "non-rapid escalators" were compared.

    Results: 13 cases of NMS were identified. A progressive increase in mean dose over the 30 days prior to the confirmed episode of NMS was observed (241.7 mg/day during days 1-15 rising to 346.9 mg/day during days 16-30), and the mean ratio of dose escalation for NMS patients was 1.4. Rapid dose escalation was seen in 5/13 cases, and non-rapid escalators had a markedly higher daily cumulative antipsychotic dose compared to rapid escalators.

    Conclusions: Rapid dose escalation occurred in less than half of this case series (n=5, 38.5%), although there is currently no consensus on the precise definition of rapid dose escalation. Cumulative antipsychotic dose, alongside other known risk factors, may also be important in the development of NMS.
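The titration-rate ratio in the methodology can be illustrated with a small calculation: mean daily dose over days 16-30 divided by mean daily dose over days 1-15. This is a sketch of one plausible reading; the series' pragmatic definition of "rapid dose escalation" is not reproduced in the abstract, and the function name is mine.

```python
def escalation_ratio(daily_doses_mg):
    """Ratio of mean daily antipsychotic dose in days 16-30 to days 1-15.

    `daily_doses_mg` is a 30-element sequence covering the month before
    NMS onset.  Illustrative only.
    """
    first, second = daily_doses_mg[:15], daily_doses_mg[15:30]
    return (sum(second) / 15) / (sum(first) / 15)

# mean doses matching the figures reported above (241.7 -> 346.9 mg/day)
doses = [241.7] * 15 + [346.9] * 15
print(round(escalation_ratio(doses), 1))  # -> 1.4
```

With the reported mean doses this reproduces the paper's mean escalation ratio of 1.4.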

    Mobile Phone Radiation Induces Reactive Oxygen Species Production and DNA Damage in Human Spermatozoa In Vitro

    Background: In recent times there has been some controversy over the impact of electromagnetic radiation on human health. The significance of mobile phone radiation for male reproduction is a key element of this debate, since several studies have suggested a relationship between mobile phone use and semen quality. The potential mechanisms involved have not been established; however, human spermatozoa are known to be particularly vulnerable to oxidative stress by virtue of the abundant availability of substrates for free radical attack and the lack of cytoplasmic space to accommodate antioxidant enzymes. Moreover, the induction of oxidative stress in these cells not only perturbs their capacity for fertilization but also contributes to sperm DNA damage. The latter has, in turn, been linked with poor fertility, an increased incidence of miscarriage, and morbidity in the offspring, including childhood cancer. In light of these associations, we have analyzed the influence of RF-EMR on the cell biology of human spermatozoa in vitro. Principal Findings: Purified human spermatozoa were exposed to radio-frequency electromagnetic radiation (RF-EMR) tuned to 1.8 GHz and covering a range of specific absorption rates (SAR) from 0.4 W/kg to 27.5 W/kg. In step with increasing SAR, motility and vitality were significantly reduced after RF-EMR exposure, while the mitochondrial generation of reactive oxygen species and DNA fragmentation were significantly elevated (P<0.001). Furthermore, we also observed highly significant relationships between SAR, the oxidative DNA damage biomarker 8-OH-dG, and DNA fragmentation after RF-EMR exposure. Conclusions: RF-EMR in both the power density and frequency range of mobile phones enhances mitochondrial reactive oxygen species generation by human spermatozoa, decreasing the motility and vitality of these cells while stimulating DNA base adduct formation and, ultimately, DNA fragmentation. These findings have clear implications for the safety of extensive mobile phone use by males of reproductive age, potentially affecting both their fertility and the health and wellbeing of their offspring.

    Challenging Methods and Results Obtained from User-Generated Content in Barcelona’s Public Open Spaces

    User-generated content (UGC) provides useful resources for academics, technicians and policymakers to obtain and analyse results in order to improve the lives of individuals in urban settings. User-generated content comes from people who voluntarily contribute data, information or media that then appears in a way which can be viewed by others, usually on the Web. However, to date little is known about how the complex methodologies used to obtain results are affected by methodological errors, personal data protection constraints, and the reliability of outcomes. Several strands of research have examined big-data methods to give planners and economists a better understanding of social groups. In this chapter, using UGC from tweets of users located in Barcelona, we present different research experiments. Data collection is based on the REST API, while the analysis and representation of UGC follow different processing routes and provide a plurality of information. The first objective is to study the results at different geographical scales: Barcelona's Metropolitan Area and two Public Open Spaces (POS) in Barcelona, Enric Granados Street and the area around the Fòrum de les Cultures, during similar days in two periods of time, January 2015 and January 2017. The second objective is to better understand how different types of POS Twitter users draw urban patterns. The origin-destination patterns generated illustrate new social behaviours oriented towards multifunctional uses. This chapter aims to inform the use of UGC analysis for planning purposes and to help increase quality of life.
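A minimal sketch of how origin-destination patterns can be derived from already-collected geotagged tweets: group each user's tweets chronologically and count consecutive moves between zones. The tuple layout and zone names are assumptions for illustration, not the chapter's actual pipeline.

```python
from collections import Counter

def origin_destination_pairs(tweets):
    """Derive origin-destination (O-D) flows from geotagged tweets.

    `tweets` is a list of (user_id, timestamp, zone) tuples, where
    `zone` is a coarse spatial unit (e.g. a POS or district).  For
    each user, two consecutive tweets in different zones count as
    one O-D movement.  A simplified sketch only.
    """
    by_user = {}
    for user, ts, zone in sorted(tweets, key=lambda t: (t[0], t[1])):
        by_user.setdefault(user, []).append(zone)
    od = Counter()
    for zones in by_user.values():
        for a, b in zip(zones, zones[1:]):
            if a != b:  # staying in the same zone is not a movement
                od[(a, b)] += 1
    return od

# toy data using the two POS studied in the chapter
tweets = [
    ("u1", 1, "Enric Granados"), ("u1", 2, "Fòrum"),
    ("u2", 1, "Fòrum"), ("u2", 3, "Fòrum"), ("u2", 5, "Enric Granados"),
]
flows = origin_destination_pairs(tweets)
print(flows[("Enric Granados", "Fòrum")])  # -> 1
```

Aggregating such counts over many users is one simple way to surface the movement patterns between public open spaces that the chapter analyses.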

    Differential expression analysis with global network adjustment

    Background: Large-scale chromosomal deletions or other non-specific perturbations of the transcriptome can alter the expression of hundreds or thousands of genes, and it is of biological interest to understand which genes are most profoundly affected. We present a method for predicting a gene's expression as a function of other genes, thereby accounting for the effect of transcriptional regulation that confounds the identification of genes differentially expressed relative to a regulatory network. The challenge in constructing such models is that the number of possible regulator transcripts within a global network is on the order of thousands, while the number of biological samples is typically on the order of 10. Nevertheless, there are large gene expression databases that can be used to construct networks that could be helpful in modeling transcriptional regulation in smaller experiments.

    Results: We demonstrate a type of penalized regression model that can be estimated from large gene expression databases and then applied to smaller experiments. The ridge parameter is selected by minimizing the cross-validation error of the predictions in the independent out-sample. This tends to increase model stability and leads to a much greater degree of parameter shrinkage, but the resulting biased estimation is mitigated by a second round of regression. Nevertheless, the proposed computationally efficient "over-shrinkage" method outperforms previously used LASSO-based techniques. In two independent datasets, we find that the median proportion of explained variability in expression is approximately 25%, and this results in a substantial increase in the signal-to-noise ratio, allowing more powerful inferences on differential gene expression and leading to biologically intuitive findings. We also show that a large proportion of gene dependencies are conditional on the biological state, a finding that would be impossible with standard differential expression methods.

    Conclusions: By adjusting for the effects of the global network on individual genes, both the sensitivity and reliability of differential expression measures are greatly improved.
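The two-stage "over-shrinkage" idea can be sketched as a heavily penalized ridge fit followed by a second regression that rescales the shrunken prediction. This is a simplified illustration on simulated data; the paper's estimator, its cross-validated ridge-parameter selection, and the network construction are more involved, and all names here are mine.

```python
import numpy as np

def overshrinkage_fit(X, y, lam):
    """Two-stage sketch: a heavily penalized ridge fit, then a second
    ordinary regression of y on the ridge prediction to undo the bias
    in scale introduced by the strong shrinkage."""
    n, p = X.shape
    # ridge solution: (X'X + lam*I)^{-1} X'y
    beta = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    yhat = X @ beta
    # second round: one-parameter rescaling of the shrunken fit
    c = (yhat @ y) / (yhat @ yhat)
    return c * beta

# simulated 'small experiment': 200 samples, 50 candidate regulators
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))
beta_true = np.zeros(50)
beta_true[:5] = 2.0          # only 5 genes truly predictive
y = X @ beta_true + rng.normal(size=200)
beta = overshrinkage_fit(X, y, lam=500.0)
```

The deliberately large ridge parameter mimics the heavy shrinkage described above; the second-round coefficient `c` stretches the fit back toward the observed expression values without refitting all 50 parameters.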

    A sense of embodiment is reflected in people's signature size

    BACKGROUND: The size of a person's signature may reveal implicit information about how the self is perceived, although this has not been closely examined. METHODS/RESULTS: We conducted three experiments to test whether increases in signature size can be induced. Specifically, the aim of these experiments was to test whether changes in signature size reflect a person's current implicit sense of embodiment. Experiment 1 showed that an implicit affect task (positive subliminal evaluative conditioning) led to increases in signature size relative to an affectively neutral task, showing that implicit affective cues alter signature size. Experiments 2 and 3 demonstrated increases in signature size following experiential self-focus on sensory and affective stimuli, relative to both conceptual self-focus and external (non-self) focus, in both healthy participants and patients with anorexia nervosa, a disorder associated with disturbed self-evaluation and a sense of disembodiment. In all three experiments, increases in signature size were unrelated to changes in self-reported mood and were larger than variations unrelated to the manipulation. CONCLUSIONS: Together, these findings suggest that a person's sense of embodiment is reflected in their signature size.

    Forecasting Player Behavioral Data and Simulating in-Game Events

    Understanding player behavior is fundamental in game data science. Video games evolve as players interact with them, so being able to foresee player experience would help ensure successful game development. In particular, game developers need to evaluate beforehand the impact of in-game events, and simulation-based optimization of these events is crucial to increase player engagement and maximize monetization. We present an experimental analysis of several methods to forecast game-related variables, with two main aims: to obtain accurate predictions of in-app purchases and playtime in an operational production environment, and to perform simulations of in-game events in order to maximize sales and playtime. Our ultimate purpose is to take a step towards the data-driven development of games. The results suggest that, even though traditional approaches such as ARIMA still perform better, the outcomes of state-of-the-art techniques like deep learning are promising. Deep learning emerges as a well-suited general model that could be used to forecast a variety of time series with different dynamic behaviors.
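As a baseline from the ARIMA family mentioned above, even a plain AR(p) model fit by least squares can forecast a periodic playtime-like series. This sketch omits differencing and moving-average terms, the toy series is invented, and the function name is mine; it is not the paper's forecasting pipeline.

```python
import numpy as np

def ar_forecast(series, p=2, horizon=7):
    """Fit an AR(p) model by least squares and forecast `horizon` steps.

    A minimal stand-in for ARIMA baselines: regress x[t] on an
    intercept and its p previous values, then forecast recursively.
    """
    x = np.asarray(series, dtype=float)
    # design matrix of lagged values: column k holds lag k+1
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    hist = list(x)
    out = []
    for _ in range(horizon):
        lags = hist[-1:-p - 1:-1]               # most recent p values
        nxt = coef[0] + float(np.dot(coef[1:], lags))
        out.append(nxt)
        hist.append(nxt)                        # recursive multi-step
    return np.array(out)

# toy playtime series with a weekly (period-7) rhythm
t = np.arange(60)
series = 100 + 10 * np.sin(2 * np.pi * t / 7)
pred = ar_forecast(series, p=2, horizon=7)
```

An AR(2) with intercept represents a sinusoid exactly, so the one-week-ahead forecast here tracks the next cycle closely; real purchase and playtime series are far noisier, which is where the deep models discussed above become attractive.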