555 research outputs found

    A Randomized Exchange Algorithm for Computing Optimal Approximate Designs of Experiments

    We propose a class of subspace ascent methods for computing optimal approximate designs that covers both existing and new, more efficient algorithms. Within this class, we construct a simple, randomized exchange algorithm (REX). Numerical comparisons suggest that the performance of REX is comparable or superior to that of state-of-the-art methods across a broad range of problem structures and sizes. We focus on the most commonly used criterion of D-optimality, which also has applications beyond experimental design, such as the construction of the minimum-volume ellipsoid containing a given set of data points. For D-optimality, we prove that the proposed algorithm converges to the optimum. We also provide formulas for the optimal exchange of weights under the criterion of A-optimality; these formulas enable one to use REX for computing A-optimal and I-optimal designs. Comment: 23 pages, 2 figures.
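    For context, the D-criterion referred to is the log-determinant of the information matrix of a weighted design, and exchange methods move weight between candidate points guided by the so-called variance function. The Python sketch below, on toy data, illustrates those two quantities and a single greedy exchange step; it is an illustration of the general vertex-exchange idea, not the REX algorithm itself, and all names and sizes are made up.

```python
import numpy as np

def d_criterion(X, w):
    """log det of the information matrix M(w) = sum_i w_i x_i x_i^T."""
    M = X.T @ (w[:, None] * X)
    sign, logdet = np.linalg.slogdet(M)
    return logdet if sign > 0 else -np.inf

# Toy design space: 5 candidate points in R^2, uniform initial design.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))
w = np.full(5, 0.2)

# Variance function d_i = x_i^T M^{-1} x_i: the quantity that drives
# exchange steps (weight flows toward points with large d_i).
Minv = np.linalg.inv(X.T @ (w[:, None] * X))
d = np.einsum('ij,jk,ik->i', X, Minv, X)

# One greedy exchange of a small amount of weight; REX instead draws
# the exchanged pair at random among promising candidates.
i, j = int(np.argmax(d)), int(np.argmin(d))
step = min(0.05, w[j])
w[i] += step
w[j] -= step
```

    The weights remain a probability distribution over the candidate points after the exchange, which is the invariant any exchange method must preserve.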

    Using the stochastic Galerkin method as a predictive tool during an epidemic

    The ability to accurately predict the course of an epidemic is extremely important. This article looks at an influenza outbreak that spread through a small boarding school. Predictions are made on multiple days throughout the epidemic using the stochastic Galerkin method to consider a range of plausible values for the parameters. These predictions are then compared to known data points. Predictions made before the peak of the epidemic had much larger variances than predictions made after the peak.

    References:
    B. M. Chen-Charpentier, J. C. Cortes, J. V. Romero, and M. D. Rosello. Some recommendations for applying gPC (generalized polynomial chaos) to modeling: An analysis through the Airy random differential equation. Applied Mathematics and Computation, 219(9):4208–4218, 2013. doi:10.1016/j.amc.2012.11.007
    B. M. Chen-Charpentier and D. Stanescu. Epidemic models with random coefficients. Mathematical and Computer Modelling, 52:1004–1010, 2010. doi:10.1016/j.mcm.2010.01.014
    D. B. Harman and P. R. Johnston. Applying the stochastic Galerkin method to epidemic models with individualised parameter distributions. In Proceedings of the 12th Biennial Engineering Mathematics and Applications Conference, EMAC-2015, volume 57 of ANZIAM J., pages C160–C176, August 2016. doi:10.21914/anziamj.v57i0.10394
    D. B. Harman and P. R. Johnston. Applying the stochastic Galerkin method to epidemic models with uncertainty in the parameters. Mathematical Biosciences, 277:25–37, 2016. doi:10.1016/j.mbs.2016.03.012
    D. B. Harman and P. R. Johnston. Boarding house: find border. 2019. doi:10.6084/m9.figshare.7699844.v1
    D. B. Harman and P. R. Johnston. SIR uniform equations. February 2019. doi:10.6084/m9.figshare.7692392.v1
    H. W. Hethcote. The mathematics of infectious diseases. SIAM Review, 42(4):599–653, 2000. doi:10.1137/S0036144500371907
    R. I. Hickson and M. G. Roberts. How population heterogeneity in susceptibility and infectivity influences epidemic dynamics. Journal of Theoretical Biology, 350:70–80, 2014. doi:10.1016/j.jtbi.2014.01.014
    W. O. Kermack and A. G. McKendrick. A contribution to the mathematical theory of epidemics. Proceedings of the Royal Society of London. Series A, 115(772):700–721, August 1927. doi:10.1098/rspa.1927.0118
    M. G. Roberts. A two-strain epidemic model with uncertainty in the interaction. The ANZIAM Journal, 54:108–115, October 2012. doi:10.1017/S1446181112000326
    M. G. Roberts. Epidemic models with uncertainty in the reproduction number. Journal of Mathematical Biology, 66(7):1463–1474, 2013. doi:10.1007/s00285-012-0540-y
    F. Santonja and B. Chen-Charpentier. Uncertainty quantification in simulations of epidemics using polynomial chaos. Computational and Mathematical Methods in Medicine, 2012:742086, 2012. doi:10.1155/2012/742086
    Communicable Disease Surveillance Centre (Public Health Laboratory Service) and Communicable Diseases (Scotland) Unit. Influenza in a boarding school. BMJ, 1(6112):587, 1978. doi:10.1136/bmj.1.6112.586
    G. Strang. Linear Algebra and Its Applications. Thomson, Brooks/Cole, 2006.
    D. Xiu. Numerical Methods for Stochastic Computations: A Spectral Method Approach. Princeton University Press, 2010.
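    For readers unfamiliar with the underlying model: the boarding-school outbreak is the classic test case for the Kermack–McKendrick SIR equations. The sketch below integrates the deterministic equations with forward Euler; the school size of 763 is the commonly quoted figure for this dataset, while beta and gamma are illustrative guesses, not the paper's fitted values. The paper propagates parameter distributions through these equations via the stochastic Galerkin method, which this sketch does not attempt.

```python
def sir(beta, gamma, N=763.0, I0=1.0, days=14, dt=0.01):
    """Forward-Euler integration of the deterministic SIR equations.

    beta and gamma are illustrative guesses, not the paper's fitted
    or sampled values.
    """
    S, I, R = N - I0, I0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * S * I / N * dt   # S -> I transitions
        new_rec = gamma * I * dt          # I -> R transitions
        S -= new_inf
        I += new_inf - new_rec
        R += new_rec
    return S, I, R

S, I, R = sir(beta=1.7, gamma=0.45)
```

    The stochastic Galerkin approach would replace the scalar beta and gamma with polynomial-chaos expansions, so that prediction variances (large before the peak, small after, as the article reports) come out of the same forward integration.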

    Shader Demonstration Using OpenSceneGraph and QT Libraries

    The aim of this work is to introduce the reader to programming the vertex and fragment processors. Programs for these processors are called vertex and fragment shaders. They may be written in various shading languages designed for the purpose (HLSL, Cg, ...), but this work discusses the OpenGL Shading Language (GLSL). Several advanced rendering techniques are demonstrated: the Phong and Blinn-Phong reflection models, Lambertian illumination, and Gouraud shading. The work then describes the OpenSceneGraph library, as a library built on OpenGL, and its integration with the QT library for building user interfaces. The result is a cross-platform application demonstrating the combination of the QT and OpenSceneGraph libraries, with an integrated tutorial describing the whole implementation process. The necessary theoretical background is included as well.
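    The illumination models listed above reduce to a few vector operations per fragment. Below is a dependency-free Python transcription of Lambert diffuse plus Phong specular for a single white light; the thesis itself implements these in GLSL, and the function names and shininess default here are arbitrary.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(normal, light_dir, view_dir, shininess=32.0):
    """Lambert diffuse + Phong specular intensity for one white light."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diffuse = max(dot(n, l), 0.0)
    # Reflect the light vector about the normal: r = 2(n.l)n - l.
    r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    specular = max(dot(r, v), 0.0) ** shininess
    return diffuse + specular
```

    In GLSL the same computation collapses to built-ins (`normalize`, `dot`, `reflect`, `pow`); the Blinn-Phong variant replaces the reflected vector with the half-vector between light and view directions.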

    Magnetic-Moment Fragmentation and Monopole Crystallization

    The Coulomb phase, with its dipolar correlations and pinch-point-scattering patterns, is central to discussions of geometrically frustrated systems, from water ice to binary and mixed-valence alloys, as well as numerous examples of frustrated magnets. The emergent Coulomb phase of lattice-based systems has been associated with divergence-free fields and the absence of long-range order. Here, we go beyond this paradigm, demonstrating that a Coulomb phase can emerge naturally as a persistent fluctuating background in an otherwise ordered system. To explain this behavior, we introduce the concept of the fragmentation of the field of magnetic moments into two parts, one giving rise to a magnetic monopole crystal, the other a magnetic fluid with all the characteristics of an emergent Coulomb phase. Our theory is backed up by numerical simulations, and we discuss its importance with regard to the interpretation of a number of experimental results.
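    The moment fragmentation invoked here is, at its core, a Helmholtz decomposition of the lattice magnetisation field; in generic notation (not necessarily the paper's):

```latex
\mathbf{M} \;=\; \mathbf{M}_{\mathrm{d}} \;+\; \mathbf{M}_{\mathrm{m}},
\qquad
\nabla \cdot \mathbf{M}_{\mathrm{d}} = 0,
\qquad
\nabla \cdot \mathbf{M}_{\mathrm{m}} = \rho_{\mathrm{m}},
```

    where the divergence-free part carries the Coulomb-phase fluctuations and pinch points, while the divergence-full part, sourced by the monopole charge density, orders into the monopole crystal.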

    Psychiatric Evaluation of the Agitated Patient: Consensus Statement of the American Association for Emergency Psychiatry Project BETA Psychiatric Evaluation Workgroup

    It is difficult to fully assess an agitated patient, and the complete psychiatric evaluation usually cannot be completed until the patient is calm enough to participate in a psychiatric interview. Nonetheless, emergency clinicians must perform an initial mental status screening to begin this process as soon as the agitated patient presents to an emergency setting. For this reason, the psychiatric evaluation of the agitated patient can be thought of as a 2-step process. First, a brief evaluation must be aimed at determining the most likely cause of agitation, so as to guide preliminary interventions to calm the patient. Once the patient is calmed, a more extensive psychiatric assessment can be completed. The goal of the emergency assessment of the psychiatric patient is not necessarily to obtain a definitive diagnosis. Rather, ascertaining a differential diagnosis, determining safety, and developing an appropriate treatment and disposition plan are the goals of the assessment. This article summarizes which components of the psychiatric assessment can and should be done at the time the agitated patient presents to the emergency setting. The complete psychiatric evaluation of the patient whose agitation has been treated successfully is beyond the scope of this article and Project BETA (Best practices in Evaluation and Treatment of Agitation), but it is outlined briefly to give the reader an understanding of what a full psychiatric assessment would entail. Other issues related to the assessment of the agitated patient in the emergency setting are also discussed.

    Forest distribution and site quality in southern Lower Michigan, USA

    ABSTRACT

    Aim: The primary objectives of this research were to determine whether current forest patches in southern Lower Michigan are a proportionate sample of the forest types present in the pre-settlement cover and, if not, to establish the degree to which certain types are over- or under-represented in the contemporary landscape. This determination is useful not only because any conservation policy designed to restore the present forest to pre-settlement biodiversity through preservation of existing stands requires an accurate understanding of the degree to which these stands in sum mirror past forest diversity, but also because it fills a gap in the existing ecological literature.

    Location: The research was conducted within four counties in southern Lower Michigan, USA (Ionia, Livingston, Tuscola and Van Buren).

    Methods: Soil survey data were used to characterize the range of site quality across the study area and the areal extent of each quality category. The geographic locations of all current forest patches in each county were then determined from land use maps and overlaid on the site quality classification, yielding the observed distribution of forest relative to site quality. The expected areal extent of forest within each category of site quality was determined by assuming a random distribution and multiplying the total area of forestland by the proportion of the landscape within each category; in other words, the total forestland was divided among the landscape types in proportion to how well represented those types were. The observed and expected distributions were then compared in terms of both absolute and normalized difference.
    Results: Overall results indicate that the categories of site quality that support a large proportion of the present-day forest patches are generally composed of agriculturally inferior soil and are over-represented with forest. Surviving or reforested tracts are concentrated on inferior types of habitat.

    Main conclusions: Results suggest that the present-day forest patches may not be a proportionate sample of the primeval forest. Rather, they are concentrated on agriculturally inferior (coarse-textured, steeply sloped, or poorly drained) types of habitat. Unless these stands are for some unknown reason compositionally richer than their pre-settlement counterparts, these results suggest that the existing forest resource in southern Lower Michigan is an inferior (biased) sample of the primeval cover. Furthermore, because the forest types associated with the most heavily developed agricultural sites have apparently suffered the greatest loss of habitat, species more characteristic of these types may have experienced a greater decline in overall importance across the landscape. This study suggests that policy aimed at increasing the potential biodiversity of

    Keywords: Pre-settlement forest, biodiversity, Michigan

    P. R. Scull and J. R. Harman, Journal of Biogeography, 31, 1503-1514, © 2004 Blackwell Publishing Ltd

    Land cover disturbance associated with the arrival of European settlers has left the Midwestern United States with a severely altered forest ecosystem. The result today is a pattern of altered forest fragments, and plant scientists confront the challenge of determining how and to what degree they resemble the pre-settlement forest. Such a question is significant not only because of academic goals arising out of our need to know about our own environment, but also because conservation policy intended to restore forest biodiversity to pre-settlement levels needs to be informed by solid empirical understanding. Two distinct kinds of questions about the relationship between present and former forests arise, one of which has dominated this area of research. It concerns the composition of the pre-settlement forests and the degree to which present forest remnants resemble them; approaching it requires inventories not only of present forest resources but also of the prehistoric composition, which in turn requires methods that reliably reconstruct the forests of the past. One way this reconstruction has been attempted is through inference based on fossil pollen deposits, since the pioneering work of Von Post (1917). Another way the composition of the historic forests can be estimated is through the use of 'witness trees' from land survey records.

    A second, less frequently asked set of questions about the degree of match between contemporary forests and their pre-settlement counterparts concerns the proportion of different forest types across the contemporary landscape versus the historic condition, and this relationship is the focus of our paper. Questions of this type are relevant to the issue of biodiversity on the total landscape (rather than in particular stands) and arise because evidence suggests that the decisions as to which forests early European settlers removed for agricultural pursuit were systematic rather than random. That is, because some kinds of habitats were (and remain) preferred for farming over others, the kinds most frequently cleared may also be the last to be abandoned and reforested (as farming in parts of the Midwest has recently declined). From the standpoint of ecological interpretation of present-day forests, such a selective process of clearing would be important if it affected the degree to which extant forests in aggregate mirror the pre-settlement condition. As intuitive as such an outcome seems, we are aware of no formal scientific treatment of the evidence for such systematic bias in the land-clearing process and/or its impact on the contemporary forest resource.

    This paper is our attempt to address this deficiency by determining the degree to which the current forest remnants in a portion of the Midwest (southern Lower Michigan) represent the pre-settlement forest, that is, whether they form a proportionate sample of those forests, by asking whether there is a linkage between sites unattractive for farming and the forest composition on those sites. If such a linkage could be established, then the resulting pattern of (both remnant and reforested) woodlots would be distributed non-randomly across all possible sites on a landscape, with a clear preference for agriculturally inferior sites. We seek to determine whether such a pattern is objectively verifiable on the southern Michigan landscape by determining whether various categories of landscape types support a larger or smaller proportion of the total forest area today than they did prior to European settlement. We think this question is significant because an answer to it might inform conservation strategies geared to enhancing total landscape (not stand) biodiversity. Obviously, the ideal restored (or preserved) forest resource (in terms of biodiversity) is the one that most closely approximates the proportions of types present in the pre-settlement distribution; one that either over- or under-represents the original mix of types would be an inauthentic reproduction. Unless we know both what proportion of the areal coverage of the original forest consisted of which type and to what degree the present cover is a proportionate sample of the various types in that original, programmes designed to promote the indiscriminate preservation of extant forest stands may end up contributing to a contemporary forest cover skewed quite far from the original in terms of the proportion of its component types.
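    The expected-versus-observed computation described in the Methods is simple to state concretely. A sketch with made-up site-quality categories and areas (the paper's actual categories come from county soil surveys):

```python
# Expected forest area per site-quality category under a random
# distribution: total forest area times the category's share of the
# landscape. Categories and areas (km^2) are invented for illustration.
landscape = {"coarse": 400.0, "loamy": 900.0, "poorly_drained": 200.0}
observed_forest = {"coarse": 120.0, "loamy": 60.0, "poorly_drained": 70.0}

total_area = sum(landscape.values())
total_forest = sum(observed_forest.values())

expected = {k: total_forest * a / total_area for k, a in landscape.items()}
absolute_diff = {k: observed_forest[k] - expected[k] for k in landscape}
normalized_diff = {k: absolute_diff[k] / expected[k] for k in landscape}
```

    With these invented numbers the agriculturally inferior categories come out over-represented (positive differences) and the prime loamy soils under-represented, which is the pattern the paper reports for the real data.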

    Conditional citizens? welfare rights and responsibilities in the late 1990s

    In Britain the relationship between welfare rights and responsibilities has undergone change. A new welfare 'consensus' has been built that emphasizes a citizenship centred on notions of duty rather than rights. This has allowed the state to reduce its role as a provider of welfare and also to defend a position in which the welfare rights of some citizens are increasingly conditional on those individuals meeting compulsory responsibilities or duties. This concentration on individual responsibility/duty has undermined the welfare rights of some of the poorest members of society. Three levels of debate are considered within the article: academic, political and 'grassroots'. The latter is included in an attempt to allow some 'bottom-up' views into what is largely a debate dominated by social scientists and politicians.

    Cyber-Physical Systems for Smart Water Networks: A Review

    There is a growing demand to equip Smart Water Networks (SWNs) with advanced sensing and computation capabilities in order to detect anomalies and apply autonomous event-triggered control. Cyber-Physical Systems (CPSs) have emerged as an important research area capable of intelligently sensing the state of an SWN and reacting autonomously in scenarios of unexpected crisis development. Through computational algorithms, CPSs can integrate the physical components of an SWN, such as sensors and actuators, and provide technological frameworks for data analytics, pertinent decision making, and control. The development of CPSs for SWNs requires the collaboration of diverse scientific disciplines such as civil and hydraulic engineering, electronics, environmental science, computer science, optimization, communication, and control theory. For efficient and successful deployment of CPSs in SWNs, there is a need for a common methodology, in terms of design approaches, that can involve these various disciplines. This paper reviews the state of the art, challenges, and opportunities for CPSs that could be explored to design the intelligent sensing, communication, and control capabilities of CPSs for SWNs. In addition, we look at the challenges and solutions in developing a computational framework from the perspectives of machine learning, optimization, and control theory for SWNs.
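    The event-triggered pattern the review describes can be reduced to a skeleton: monitor a sensor stream and emit a control action only when an anomaly test fires. The threshold, units, and readings below are invented for illustration and do not come from the paper.

```python
# Event-triggered control skeleton for a water-network sensor stream.
# The limit and readings are illustrative, not from any real SWN.

PRESSURE_LIMIT = 5.0  # bar; treat readings above this as anomalous

def check_and_act(readings, limit=PRESSURE_LIMIT):
    """Return the time indices at which a control action would fire."""
    events = []
    for t, pressure in enumerate(readings):
        if pressure > limit:      # anomaly detected
            events.append(t)      # trigger actuator (e.g. close a valve)
    return events

triggered = check_and_act([4.2, 4.8, 5.6, 4.9, 6.1])
```

    A real deployment would replace the fixed threshold with a learned detector and route the events through the decision-making layer the paper surveys, but the control loop keeps this shape: sense, test, act only on events.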

    Incremental bounded model checking for embedded software

    Program analysis is on the brink of mainstream usage in embedded systems development. Formal verification of behavioural requirements, finding runtime errors and test case generation are some of the most common applications of automated verification tools based on bounded model checking (BMC). Existing industrial tools for embedded software use an off-the-shelf bounded model checker and apply it iteratively to verify the program with an increasing number of unwindings. This approach unnecessarily wastes time repeating work that has already been done and fails to exploit the power of incremental SAT solving. This article reports on the extension of the software model checker CBMC to support incremental BMC and its successful integration with the industrial embedded software verification tool BTC EMBEDDED TESTER. We present an extensive evaluation over large industrial embedded programs, mainly from the automotive industry. We show that incremental BMC cuts runtimes by one order of magnitude in comparison to the standard non-incremental approach, enabling the application of formal verification to large and complex embedded software. We furthermore report promising results on analysing programs with arbitrary loop structure using incremental BMC, demonstrating its applicability and potential to verify general software beyond the embedded domain.
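    The incremental unwinding loop at the heart of this approach can be illustrated without a SAT solver: unwind a toy transition system one bound at a time, carrying the previous bound's work forward. In the sketch below the explicit-state frontier stands in for the solver state (learnt clauses) an incremental SAT solver would retain between bounds; CBMC itself encodes the unwinding as a SAT formula, and the toy system here is invented.

```python
# Toy bounded model checking: unwind a transition system one step at a
# time, reusing the frontier from the previous bound instead of
# re-exploring from scratch (the analogue of incremental SAT solving).

def step(state):
    """Transition relation of a toy counter mod 7."""
    return {(state + 3) % 7}

def bmc(init, bad, max_bound):
    """Return the smallest bound at which a bad state is reachable, else None."""
    frontier = {init}
    seen = set(frontier)
    for k in range(max_bound + 1):
        if frontier & bad:
            return k
        # Incremental unwinding: extend by one step, keep previous work.
        frontier = {s2 for s in frontier for s2 in step(s)} - seen
        seen |= frontier
    return None
```

    The non-incremental scheme the article compares against would instead rebuild and solve the whole bound-k instance for every k, repeating the work for bounds 0..k-1 each time.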
