
    High-resolution modeling of typhoon morakot (2009): Vortex rossby waves and their role in extreme precipitation over Taiwan

    A high-resolution nonhydrostatic numerical model, the Advanced Regional Prediction System (ARPS), was used to simulate Typhoon Morakot (2009) as it made landfall over Taiwan, producing record rainfall totals. In particular, the mesoscale structure of the typhoon was investigated, emphasizing its associated deep convection, the development of inner rainbands near the center, and the resultant intense rainfall over western Taiwan. Simulations at 15- and 3-km grid spacing revealed that, following the decay of the initial inner eyewall, a new, much larger eyewall developed as the typhoon made landfall over Taiwan. Relatively large-amplitude wave structures developed in the outer eyewall and are identified as vortex Rossby waves (VRWs), based on the wave characteristics and their similarity to VRWs identified in previous studies. Moderate to strong vertical shear over the typhoon system produced a persistent wavenumber-1 (WN1) asymmetric structure during the landfall period, with upward motion and deep convection in the downshear and downshear-left sides, consistent with earlier studies. This strong asymmetry masks the effects of WN1 VRWs. WN2 and WN3 VRWs apparently are associated with the development of deep convective bands in Morakot's southwestern quadrant. This occurs as the waves move cyclonically into the downshear side of the cyclone. Although the typhoon track and topographic enhancement contribute most to the record-breaking rainfall totals, the location of the convective bands, and their interaction with the mountainous terrain of Taiwan, also affect the rainfall distribution. Quantitatively, the 3-km ARPS rainfall forecasts are superior to those obtained from coarser-resolution models. © 2013 American Meteorological Society
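    Wavenumber components like the WN1–WN3 asymmetries discussed above are conventionally isolated by a Fourier decomposition of a field around rings of constant radius about the storm center. The following is a minimal sketch of that standard decomposition (not the paper's own code): the field values and amplitudes are synthetic, and the function name is our own.

```python
import numpy as np

def wavenumber_amplitudes(field_ring, max_wn=3):
    """Decompose a field sampled at uniform azimuths around one radius ring
    into azimuthal wavenumber amplitudes via an FFT."""
    n = field_ring.size
    spec = np.fft.rfft(field_ring) / n
    # factor 2 recovers the amplitude of each real cosine component
    return {k: 2.0 * np.abs(spec[k]) for k in range(1, max_wn + 1)}

# synthetic ring: symmetric mean plus WN1 and WN2 asymmetries
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
w = 0.5 + 2.0 * np.cos(theta) + 0.8 * np.cos(2 * theta - 1.0)
amps = wavenumber_amplitudes(w)
```

Applied radius by radius to, say, vertical velocity, this yields the radial profile of each wavenumber's amplitude, from which wave propagation can be tracked.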

    The relationship between problem gambling, excessive gaming, psychological distress and spending on loot boxes in Aotearoa New Zealand, Australia, and the United States-A cross-national survey

    Loot boxes are digital containers of randomised rewards available in many video games. Due to similarities between some loot boxes and traditional forms of gambling, concerns regarding the relationship between spending on loot boxes in video games and symptoms of problematic gambling have been expressed by policy makers and the general public. We present the first investigation of these concerns in large cross-sectional, cross-national samples from three countries (Aotearoa New Zealand, Australia, and the United States). A sample of 1,049 participants was recruited through Qualtrics' Survey Targeting service from a broad cross-section of the population in Australia (n = 339), Aotearoa New Zealand (n = 323), and the United States (n = 387). Participants answered a survey assessing problem gambling and problem gaming symptomatology, and how much they spent on loot boxes per month. On average, individuals with problem gambling issues spent approximately $13 USD per month more on loot boxes than those with no such symptoms. Loot box spending was also associated with both positive and negative moods, albeit with small effect sizes. Analyses showed both interactions and correlations between problematic gambling and problematic gaming symptoms, indicating both some commonality in the mechanisms underlying, and independent contributions made by, these proposed diagnostic criteria. These results provide context for dialogues regarding how best to reduce the impacts of loot box spending among those with problematic gambling symptoms.
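    The core comparison reported above, a difference in mean monthly spend between symptom groups, can be sketched with a standardized effect size alongside the raw difference. The spend values below are hypothetical illustrations, not the study's data.

```python
import statistics

def mean_diff_and_cohens_d(group_a, group_b):
    """Difference in group means plus Cohen's d (pooled-SD effect size)."""
    ma, mb = statistics.fmean(group_a), statistics.fmean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    na, nb = len(group_a), len(group_b)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return ma - mb, (ma - mb) / pooled_sd

# hypothetical monthly loot-box spends in USD, for illustration only
problem_gambling = [20, 35, 0, 50, 15, 40, 25, 10]
no_symptoms = [5, 0, 10, 0, 15, 5, 0, 10]
diff, d = mean_diff_and_cohens_d(problem_gambling, no_symptoms)
```

In practice spend data are heavily skewed and zero-inflated, so published analyses typically add regression models rather than a raw two-group contrast.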

    'Looking Downward Thence’: D. G. Rossetti’s ‘The Blessed Damozel’ in Astronomical Focus

    This is the author accepted manuscript. The final version is available from Project MUSE via the DOI in this record.

    Who Watches the Watchmen? An Appraisal of Benchmarks for Multiple Sequence Alignment

    Multiple sequence alignment (MSA) is a fundamental and ubiquitous technique in bioinformatics used to infer related residues among biological sequences. Alignment accuracy is therefore crucial to a vast range of analyses, often in ways that are difficult to assess within those analyses. To compare the performance of different aligners and to help detect systematic errors in alignments, a number of benchmarking strategies have been pursued. Here we present an overview of the main strategies, based on simulation, consistency, protein structure, and phylogeny, and discuss their different advantages and associated risks. We outline a set of desirable characteristics for effective benchmarking and evaluate each strategy in light of them. We conclude that there is currently no universally applicable means of benchmarking MSA, and that developers and users of alignment tools should base their choice of benchmark on the context of application, with a keen awareness of the assumptions underlying each benchmarking strategy.
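    Reference-based benchmarks of the kind surveyed here typically score a test alignment by how many residue pairs it aligns identically to a trusted reference; the sum-of-pairs (SP) score is the standard such metric. A minimal sketch, with hypothetical toy sequences:

```python
def aligned_pairs(alignment):
    """Set of (seq_i, res_i, seq_j, res_j) residue pairs placed in the
    same column. `alignment` is a list of equal-length gapped strings
    with '-' marking gaps."""
    counters = [0] * len(alignment)  # ungapped residue index per sequence
    pairs = set()
    for col in range(len(alignment[0])):
        placed = []
        for s, seq in enumerate(alignment):
            if seq[col] != '-':
                placed.append((s, counters[s]))
                counters[s] += 1
        for a in range(len(placed)):
            for b in range(a + 1, len(placed)):
                pairs.add(placed[a] + placed[b])
    return pairs

def sum_of_pairs_score(test, reference):
    """Fraction of the reference's residue pairs recovered by the test."""
    ref = aligned_pairs(reference)
    return len(aligned_pairs(test) & ref) / len(ref)

ref = ["AC-GT", "ACAGT"]
test_aln = ["ACG-T", "ACAGT"]  # shifts G relative to the reference
```

Here `sum_of_pairs_score(test_aln, ref)` is 0.75: three of the reference's four aligned pairs survive the shifted gap placement.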

    Observational study of the association of first insulin type in uncontrolled type 2 diabetes with macrovascular and microvascular disease

    Aims: To compare the risk of vascular disease, HbA1c change and weight change between first prescribed insulins in people with type 2 diabetes.

    Methods: People included in the THIN United Kingdom primary care record database who began insulin (2000–2007) after poor control on oral glucose-lowering agents (OGLDs) were grouped by the number of OGLDs in their treatment regimen immediately before starting insulin (n = 3,485). Within each OGLD group, Cox regression compared macrovascular (all-cause mortality, myocardial infarction, acute coronary syndrome and stroke) and microvascular disease (peripheral neuropathy, nephropathy and retinopathy) between insulin types (basal, pre-mix or Neutral Protamine Hagedorn, NPH), while ANCOVAs compared haemoglobin A1c (HbA1c) and weight change.

    Results: Mean follow-up was 3.6 years. Rates of incident macrovascular events were similar when basal insulin was compared with pre-mix or NPH; adjusted hazard ratios versus basal were pre-mix 1.08 (95% CI 0.73, 1.59) and NPH 1.00 (0.63, 1.58) after two OGLDs, and pre-mix 0.97 (0.46, 2.02) and NPH 0.77 (0.32, 1.86) after three OGLDs. An increased risk of microvascular disease in NPH versus basal after three OGLDs, adjusted hazard ratio 1.87 (1.04, 3.36), was not seen after two agents or in comparisons of basal and pre-mix. At one year, after two OGLDs, weight increase was less with basal than with pre-mix. After three OGLDs, mean HbA1c had decreased less with basal than with pre-mix or NPH at 6–8 and 9–11 months, and than with pre-mix at 12–14 months.

    Conclusion: We found no difference in the risk of macrovascular events between first insulins in the medium term when started during poor glycaemic control. The increased risk of microvascular events with NPH warrants further study. In certain groups, first use of basal insulin was associated with less weight gain and less decrease in HbA1c compared with other insulins.
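    The adjusted hazard ratios above come from Cox regression; as a simplified, unadjusted stand-in, the same kind of contrast can be sketched as a crude incidence-rate ratio per person-year of follow-up. The event counts and person-years below are hypothetical, chosen only to illustrate the arithmetic.

```python
def incidence_rate_ratio(events_a, person_years_a, events_b, person_years_b):
    """Crude rate ratio between two exposure groups: an unadjusted
    approximation to a Cox hazard ratio, valid only without confounding."""
    rate_a = events_a / person_years_a
    rate_b = events_b / person_years_b
    return rate_a / rate_b

# hypothetical counts: pre-mix starters (a) vs basal starters (b)
irr = incidence_rate_ratio(events_a=54, person_years_a=1800,
                           events_b=50, person_years_b=1800)
```

A real analysis like the one described would instead fit a proportional-hazards model to adjust for covariates such as age, baseline HbA1c and comorbidity.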

    Spread Supersymmetry

    In the multiverse the scale of SUSY breaking, \tilde{m} = F_X/M_*, may scan, and environmental constraints on the dark matter density may exclude a large range of \tilde{m}, from the reheating temperature after inflation down to values that yield an LSP mass of order a TeV. After selection effects, the distribution for \tilde{m} may prefer larger values. A single environmental constraint from dark matter can then lead to multi-component dark matter, including both axions and the LSP, giving a TeV-scale LSP lighter than the corresponding value for single-component LSP dark matter. If SUSY breaking is mediated to the SM sector at order X^* X, only squarks, sleptons and one Higgs doublet acquire masses of order \tilde{m}. The gravitino mass is lighter by a factor of M_*/M_Pl, and the gaugino masses are suppressed by a further loop factor. This Spread SUSY spectrum has two versions: in one the Higgsino masses are generated from supergravity, giving a wino LSP, and in the other radiatively, giving a Higgsino LSP. The environmental restriction on dark matter fixes the LSP mass to the TeV domain, so that the squark and slepton masses are of order 10^3 TeV and 10^6 TeV in these two schemes. We study the spectrum, dark matter and collider signals of these two versions of Spread SUSY. The Higgs is SM-like and lighter than 145 GeV; monochromatic photons in cosmic rays arise from dark matter annihilations in the halo; exotic short charged tracks occur at the LHC, at least for the wino LSP; and there are the eventual possibilities of direct detection of dark matter and detailed exploration of the TeV-scale states at a future linear collider. Gauge coupling unification is as in minimal SUSY theories. If SUSY breaking is mediated at order X, a much less hierarchical spectrum results, similar to that of the MSSM but with the superpartner masses 1–2 orders of magnitude larger than in natural theories.

    Reconstructing North Atlantic marine climate variability using an absolutely-dated sclerochronological network

    This is the final version of the article, available from Elsevier via the DOI in this record.

    Reconstructing regional- to hemispheric-scale climate variability requires the application of spatially representative and climatically sensitive proxy archives. Large spatial networks of dendrochronologies have facilitated the reconstruction of atmospheric variability and inferred variability in the Atlantic Ocean system. However, the marine environment has hitherto lacked a direct application of the spatial network approach because of the small number of individual absolutely-dated marine archives. In this study we present the first analyses of a network of absolutely-dated, annually-resolved growth-increment-width chronologies from the marine bivalves Glycymeris glycymeris and Arctica islandica. The network contains eight chronologies spanning > 500 km along the western British continental shelf, from the southern Irish Sea to northwest Scotland. Correlation analysis of the individual chronologies against a suite of climate indices, including the Atlantic Multidecadal Oscillation (AMO), Central England surface air temperature (CET), northeast Atlantic sea surface temperatures (SSTs) and the winter North Atlantic Oscillation (wNAO), demonstrates that, despite the large geographical distances between sites and the heterogeneous nature of the marine environment, the increment-width variability in these series contains an element of coherence, likely driven by a common response to changing environmental forcing. A nested principal component analysis (PCA) was used to construct five composite series, which explain between 31% and 74% of the variance across the individual chronologies. Linear regression analyses indicate that the composite series explain up to 41% of the variance in northeast Atlantic SSTs over the calibration period (1975–2000). Calibration-verification statistics (reduction of error [RE] and coefficient of efficiency [CE]) indicate that the composite series contain significant skill at reconstructing multidecadal northeast Atlantic SST variability over the past two centuries (1805–2010). These data suggest that composite series derived from sclerochronology networks can facilitate the robust reconstruction of marine climate over past centuries to millennia, providing invaluable baseline records of natural oceanographic variability.

    This work was supported financially by the NERC-funded Climate of the Last Millennium project (CLAM; project No. NE/N001176/1) and the Marie Curie Framework Partnership Annually Resolved Archives of Marine Climate Change (ARAMACC; project No. FP7 604802). The authors would like to thank the three anonymous reviewers for their constructive comments during the peer review process.
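    The composite-series step, extracting a shared signal from several site chronologies and reporting the variance it explains, can be sketched as a PCA on the correlation matrix of a years-by-sites data matrix. The chronology values below are synthetic (a common signal plus site noise), not the study's data.

```python
import numpy as np

def leading_pc(chronologies):
    """First principal component of a (years x sites) matrix of
    chronologies, plus the fraction of total variance it explains."""
    z = (chronologies - chronologies.mean(axis=0)) / chronologies.std(axis=0)
    corr = np.corrcoef(z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)   # ascending eigenvalues
    lead = eigvecs[:, -1]                     # vector for largest eigenvalue
    explained = eigvals[-1] / eigvals.sum()
    return z @ lead, explained

rng = np.random.default_rng(0)
common = rng.standard_normal(200)             # shared "climate" signal
sites = common[:, None] + 0.8 * rng.standard_normal((200, 6))
pc1, frac = leading_pc(sites)
```

With six sites sharing one forcing signal, the leading component recovers that signal and explains well over the 1/6 share expected of uncorrelated series; a nested PCA repeats this over successively shorter, better-replicated periods.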

    An adaptive prefix-assignment technique for symmetry reduction

    This paper presents a technique for symmetry reduction that adaptively assigns a prefix of the variables in a system of constraints so that the generated prefix-assignments are pairwise nonisomorphic under the action of the symmetry group of the system. The technique is based on McKay's canonical extension framework [J. Algorithms 26 (1998), no. 2, 306–324]. Key features of the technique are (i) adaptability: the prefix sequence can be user-prescribed and truncated for compatibility with the group of symmetries; (ii) parallelizability: prefix-assignments can be processed in parallel independently of each other; (iii) versatility: the method is applicable whenever the group of symmetries can be concisely represented as the automorphism group of a vertex-colored graph; and (iv) implementability: the method can be implemented relying on a canonical labeling map for vertex-colored graphs as the only nontrivial subroutine. To demonstrate the practical applicability of our technique, we have prepared an experimental open-source implementation and carried out a set of experiments that demonstrate its ability to reduce symmetry on hard instances. Furthermore, we demonstrate that the implementation parallelizes effectively across compute clusters with multiple nodes via a message-passing interface.
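    The underlying idea, keeping one canonical representative per isomorphism class of partial assignments, can be illustrated on a toy instance far simpler than the paper's framework: binary assignments of a fixed length under cyclic rotation, with "lexicographically least rotation" playing the role of the canonical labeling map. This is our own illustration, not the paper's algorithm.

```python
from itertools import product

def rotations(t):
    """All cyclic rotations of tuple t (the toy symmetry group)."""
    return [t[i:] + t[:i] for i in range(len(t))]

def canonical(t):
    """Canonical representative of t's orbit: its lexicographically
    least rotation (the role played by canonical labeling)."""
    return min(rotations(t))

def nonisomorphic_assignments(n):
    """One binary assignment of length n per orbit under rotation."""
    return sorted({canonical(bits) for bits in product((0, 1), repeat=n)})

reps = nonisomorphic_assignments(4)
```

For n = 4 this keeps 6 representatives out of 16 raw assignments; in the paper's setting the group acts on a general constraint system and each surviving prefix-assignment can be extended independently, which is what makes the search parallelizable.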