
    To what extent do existing nature-based lesson plans for outdoor learning experiences in the Great Lakes area align to a nature-based sense of place framework?

    This paper explores the question: To what extent do existing nature-based lesson plans for outdoor learning experiences in the Great Lakes region align to a nature-based sense of place framework? Sense of place includes place meaning and place attachment, which must be developed through experience. Chapter Two explores nature-based experiences, with examples from environmental education that help illustrate sense of place development in action. The literature review further investigates current understandings of sense of place and current best practices for developing outdoor, nature-based environmental experiences. The Sense of Place framework developed by Mathews et al. (2020) was used to guide the creation of a rubric to rate how closely the lessons align to the construct of sense of place. Chapter Three describes the scope of the project in terms of timeline, audience, and assessment. Twenty nature-based lesson plans from six organizations based in the Great Lakes region were rated using the rubric to identify how closely the lessons align to the framework for developing a sense of place. A key outcome of this project is a white paper that serves as a resource calling for the use of the rating rubric as a practical tool for the Sense of Place theoretical framework. The conclusions include descriptions and analyses. The paper ends with the limitations of this work and a call for the completion of a reliability and validity study on the rating rubric.

    Implementing a Program Checker for Linked Lists

    Coordinated Science Laboratory was formerly known as Control Systems Laboratory.
    National Science Foundation / CCR-931569

    Use of novel postmortem sample types for detection of African swine fever virus infection after natural consumption

    Master of Science, Department of Diagnostic Medicine/Pathobiology, Megan Niederwerder.
    African swine fever virus (ASFV) causes a disease recognized by the World Organization for Animal Health (WOAH). Virulent strains can cause 100% mortality in pigs, and with no licensed vaccine or therapeutic currently available, rapid identification, biocontainment, and culling are critical. First discovered in Africa in the 1920s, ASFV has spread rapidly through the continent and into other countries. Since its first introduction into China in 2018, ASFV has rapidly spread across that country and into additional countries, such as Germany in 2020. In 2021, the virus was detected in the Dominican Republic; because of that country's geographical proximity to the United States, the U.S. is on heightened alert. The United States is an ASFV-naïve country and relies on mitigation strategies and trade restrictions to maintain that status. Current postmortem confirmatory testing for ASFV requires spleen, tonsil, gastrohepatic lymph node, renal lymph node, and inguinal lymph node. These samples are not often collected on farms and may be difficult to collect during passive surveillance due to decomposition of a wild boar carcass. It is critical to have validated, consistent practices that can be applied to a wide variety of circumstances. Validating more sample types for the detection of ASFV will improve rapid and reliable detection while also reducing the further environmental contamination posed by opening a carcass. This study sought to identify novel sample matrices, equivalent to the postmortem spleen sample, for the detection of ASFV Georgia/07 in pigs that orally consumed ASFV-inoculated media. In our experiment, 7- to 8-week-old male pigs (n=10) orally consumed media containing 10⁴ TCID50/ml (tissue culture infectious dose) of ASFV Georgia/07, alongside controls (n=2) that received sterile, non-infectious media. After presentation of clinical signs between 5 and 7 dpi, pigs were humanely euthanized and a variety of tissues were immediately collected. This study compares log10 starting quantity (SQ) copy numbers of ASFV Georgia/07 generated from real-time PCR, to assess the quantity of ASFV DNA present in swabs (preputial, spleen, muscle, peritoneal fluid, conjunctiva), lymph nodes (mesenteric, gastrohepatic, inguinal, popliteal, submandibular, tracheobronchial, retropharyngeal, sternal), fluids (ocular, urine, feces), and tissues (spleen, tonsil, conjunctiva, muscle, ear notch, tail notch, bone marrow, diaphragm), against the gold-standard postmortem sample of spleen. After collection, samples were processed and stored immediately at -80 °C until DNA extraction and PCR were performed. Samples were evaluated using a paired t-test (p ≤ 0.05) and were individually compared not only to spleen but also to other samples with similar mean SQ values and variance for the quantity of ASFV DNA present. These multiple comparisons will provide additional information for field veterinarians, hunters, and slaughterhouse staff to select an accessible and available tissue with confidence that it is comparable in SQ to the spleen or another reliable matrix for the early detection of ASFV.
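    A minimal sketch of the paired comparison described above: a paired t-test (α = 0.05) between log10 starting-quantity values from a candidate matrix and spleen. The numeric values and the choice of tonsil as the candidate matrix are hypothetical, for illustration only.

```python
# Hedged sketch: paired comparison of qPCR log10 starting-quantity (SQ) values
# between a candidate sample matrix and spleen. All values are hypothetical.
import numpy as np
from scipy import stats

# One paired measurement per pig (n = 10): spleen vs. a candidate matrix (tonsil).
spleen_log_sq = np.array([6.8, 7.1, 6.5, 7.0, 6.9, 7.2, 6.7, 6.6, 7.0, 6.8])
tonsil_log_sq = np.array([6.5, 7.0, 6.3, 6.9, 6.6, 7.1, 6.4, 6.5, 6.8, 6.7])

# Paired t-test; a p-value above 0.05 gives no evidence that the candidate
# matrix yields a different mean log10 SQ than spleen.
t_stat, p_value = stats.ttest_rel(spleen_log_sq, tonsil_log_sq)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```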

    Machine Cryptography and Modern Cryptanalysis


    Algorithms for Performance, Dependability, and Performability Evaluation using Stochastic Activity Networks

    Modeling tools and technologies are important for aerospace development. At the University of Illinois, we have worked on advancing the state of the art in modeling with Markov reward models in two important areas: reducing the memory necessary to numerically solve systems represented as stochastic activity networks and other stochastic Petri net extensions, while still obtaining solutions in a reasonable amount of time, and finding numerically stable and memory-efficient methods to solve for the reward accumulated during a finite mission time. A long-standing problem when modeling with high-level formalisms such as stochastic activity networks is the so-called state-space explosion, where the number of states increases exponentially with the size of the high-level model. Thus, the corresponding Markov model becomes prohibitively large, and solution is constrained by the size of primary memory. To reduce the memory necessary to numerically solve complex systems, we propose new methods that can tolerate such large state spaces and that do not require any special structure in the model (as many other techniques do). First, we develop methods that generate rows and columns of the state transition-rate matrix on the fly, eliminating the need to explicitly store the matrix at all. Next, we introduce a new iterative solution method, called modified adaptive Gauss-Seidel, that exhibits locality in its use of data from the state transition-rate matrix, permitting us to cache portions of the matrix and hence reduce the solution time. Finally, we develop a new memory- and computationally efficient technique for Gauss-Seidel-based solvers that avoids the need to generate rows of A in order to solve Ax = b. This is a significant performance improvement for on-the-fly methods as well as other recent solution techniques based on Kronecker operators. Taken together, these new results show that one can solve very large models without any special structure.
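    As a rough illustration of Gauss-Seidel solution of Ax = b with on-the-fly row generation, the sketch below pulls each matrix row from a callback instead of a stored matrix. The callback interface and the small test system are assumptions for illustration, not the solver described in the abstract.

```python
# Hedged sketch: a Gauss-Seidel solver whose matrix rows are produced on demand
# by a generator function rather than stored explicitly.
import numpy as np

def gauss_seidel(row_of, b, n, iters=200, tol=1e-10):
    """Solve A x = b, where row_of(i) returns (column indices, values) of row i."""
    x = np.zeros(n)
    for _ in range(iters):
        max_delta = 0.0
        for i in range(n):
            idx, vals = row_of(i)          # row generated on the fly
            diag, acc = 0.0, b[i]
            for j, a_ij in zip(idx, vals):
                if j == i:
                    diag = a_ij
                else:
                    acc -= a_ij * x[j]     # uses entries already updated this sweep
            new_xi = acc / diag
            max_delta = max(max_delta, abs(new_xi - x[i]))
            x[i] = new_xi
        if max_delta < tol:
            break
    return x

# Tiny illustrative system (diagonally dominant, so Gauss-Seidel converges).
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([1.0, 2.0, 3.0])
x = gauss_seidel(lambda i: (range(3), A[i]), b, 3)
print(x, A @ x)
```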

    Explicit Model Checking of Very Large MDP using Partitioning and Secondary Storage

    The applicability of model checking is hindered by the state space explosion problem in combination with limited amounts of main memory. To extend its reach, the large available capacities of secondary storage such as hard disks can be exploited. Due to the specific performance characteristics of secondary storage technologies, specialised algorithms are required. In this paper, we present a technique to use secondary storage for probabilistic model checking of Markov decision processes. It combines state space exploration based on partitioning with a block-iterative variant of value iteration over the same partitions for the analysis of probabilistic reachability and expected-reward properties. A sparse matrix-like representation is used to store partitions on secondary storage in a compact format. All file accesses are sequential, and compression can be used without affecting runtime. The technique has been implemented within the Modest Toolset. We evaluate its performance on several benchmark models of up to 3.5 billion states. In the analysis of time-bounded properties on real-time models, our method neutralises the state space explosion induced by the time bound in its entirety.
    Comment: The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-24953-7_1
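    The block-iterative variant of value iteration mentioned above ultimately computes reachability probabilities; a minimal in-memory sketch of plain value iteration for maximal reachability in a toy MDP is shown below. The transition structure is hypothetical, and the paper's disk-based partitioning and compact sparse storage are not modelled.

```python
# Hedged sketch: value iteration for maximal reachability probabilities in a
# small MDP. mdp[s] is a list of actions; each action is a list of
# (probability, successor) pairs. The model itself is an illustrative assumption.
mdp = {
    0: [[(0.5, 1), (0.5, 2)], [(1.0, 0)]],
    1: [[(1.0, 3)]],
    2: [[(0.9, 3), (0.1, 0)]],
    3: [[(1.0, 3)]],                      # goal state, absorbing
}
goal = {3}

value = {s: (1.0 if s in goal else 0.0) for s in mdp}
for _ in range(100):                      # up to 100 sweeps, stopping early on convergence
    new = {}
    for s, actions in mdp.items():
        if s in goal:
            new[s] = 1.0
            continue
        # maximise the expected probability of eventually reaching the goal
        new[s] = max(sum(p * value[t] for p, t in act) for act in actions)
    converged = max(abs(new[s] - value[s]) for s in mdp) < 1e-12
    value = new
    if converged:
        break
print(value)
```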

    A fluid analysis framework for a Markovian process algebra

    Markovian process algebras, such as PEPA and stochastic π-calculus, bring a powerful compositional approach to the performance modelling of complex systems. However, the models generated by process algebras, as with other interleaving formalisms, are susceptible to the state space explosion problem. Models with only a modest number of process algebra terms can easily generate so many states that they are all but intractable to traditional solution techniques. Previous work aimed at addressing this problem has presented a fluid-flow approximation allowing the analysis of systems which would otherwise be inaccessible. To achieve this, systems of ordinary differential equations describing the fluid flow of the stochastic process algebra model are generated informally. In this paper, we show formally that, for a large class of models, this fluid-flow analysis can be directly derived from the stochastic process algebra model as an approximation to the mean number of each component type within the model. The nature of the fluid approximation is derived and characterised by direct comparison with the Chapman–Kolmogorov equations underlying the Markov model. Furthermore, we compare the fluid approximation with the exact solution using stochastic simulation, and we are able to demonstrate that it is a very accurate approximation in many cases. For the first time, we also show how to extend these techniques naturally to generate systems of differential equations approximating higher-order moments of model component counts. These are important performance characteristics for estimating, for instance, the variance of the component counts, which is necessary if we are to understand how precise the fluid-flow calculation is in a given modelling situation.
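    A minimal sketch of the fluid-flow idea described above: integrating ODEs that approximate the mean counts of each component type, here for a hypothetical two-state client cycle. The model and its rates are illustrative assumptions, not a PEPA-derived system.

```python
# Hedged sketch: fluid (mean-field) ODEs for the counts of component types in a
# toy model where N clients cycle Think --r1--> Wait --r2--> Think.
import numpy as np
from scipy.integrate import solve_ivp

r1, r2 = 1.0, 0.5          # Think -> Wait and Wait -> Think rates (assumed)
N = 100                    # total number of client components

def fluid_odes(t, y):
    think, wait = y
    d_think = r2 * wait - r1 * think
    d_wait = r1 * think - r2 * wait
    return [d_think, d_wait]

sol = solve_ivp(fluid_odes, (0.0, 10.0), [N, 0.0], dense_output=True)
print(sol.y[:, -1])        # approximate mean component counts at t = 10
```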

    An open system transportation security sensor network: field trial experiences

    Cargo shipments are subject to hijacking, theft, or tampering. Furthermore, cargo shipments are at risk of being used to transport contraband, potentially resulting in fines to shippers. The Transportation Security Sensor Network (TSSN), which is based on open software systems and Service Oriented Architecture (SOA) principles, has been developed to mitigate these risks. Using commercial off-the-shelf (COTS) hardware, the TSSN is able to detect events and report those relevant to appropriate decision makers. However, field testing is required to validate the system architecture and to determine whether the system can provide timely event notification. Field experiments were conducted to assess the TSSN's suitability for monitoring rail-borne cargo. Log files were collected from these experiments and postprocessed. We present the TSSN architecture and the results of the field experiments, including the time taken to report events using the TSSN and the interaction between the various components of the TSSN. These results show that the TSSN architecture can be used to monitor rail-borne cargo.
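    As a rough illustration of the log postprocessing mentioned above, the sketch below computes the time taken to report an event from timestamped log records. The log format and event markers are assumptions for illustration, not the TSSN's actual log layout.

```python
# Hedged sketch: deriving event-reporting latency from timestamped log lines.
# The "detected"/"reported" markers and ISO timestamps are assumed formats.
from datetime import datetime

log_lines = [
    "evt-17 detected 2010-06-01T12:00:01.250",
    "evt-17 reported 2010-06-01T12:00:03.900",
]

times = {}
for line in log_lines:
    event_id, stage, stamp = line.split()
    times.setdefault(event_id, {})[stage] = datetime.fromisoformat(stamp)

for event_id, stages in times.items():
    latency = (stages["reported"] - stages["detected"]).total_seconds()
    print(f"{event_id}: reported after {latency:.3f} s")
```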

    Location-Aware Quality of Service Measurements for Service-Level Agreements

    We add specifications of location-aware measurements to performance models in a compositional fashion, promoting precision in performance measurement design. Using immediate actions to send control signals between measurement components, we are able to obtain more accurate measurements from our stochastic models without disturbing their structure. A software tool processes both the model and the measurement specifications to give response time distributions and quantiles, an essential calculation in determining satisfaction of service-level agreements (SLAs).
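    A minimal sketch of the quantile calculation mentioned above: estimating a response-time quantile and checking it against an SLA bound. The samples and the SLA target are hypothetical; the tool described in the abstract derives the distribution from the stochastic model rather than from samples.

```python
# Hedged sketch: checking a hypothetical SLA ("95% of responses within 2 s")
# against an empirical response-time distribution. All numbers are illustrative.
import numpy as np

response_times = np.random.exponential(scale=0.8, size=10_000)  # hypothetical samples

sla_quantile, sla_bound = 0.95, 2.0
q95 = np.quantile(response_times, sla_quantile)
satisfied = q95 <= sla_bound
print(f"95th percentile response time: {q95:.2f} s -> SLA {'met' if satisfied else 'violated'}")
```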