
    Towards Visual Analytics for Teachers’ Dynamic Diagnostic Pedagogical Decision-Making

    The focus of this paper is to delineate and discuss design considerations for supporting teachers' dynamic diagnostic decision-making in classrooms of the 21st century. Based on the Next Generation Teaching Education and Learning for Life (NEXT-TELL) European Commission integrated project, we envision classrooms of the 21st century to (a) incorporate 1:1 computing, (b) provide computational as well as methodological support for teachers to design, deploy and assess learning activities and (c) immerse students in rich, personalized and varied learning activities in information ecologies, resulting in high-performance, high-density, high-bandwidth, and data-rich classrooms. In contrast to existing research in educational data mining and learning analytics, our vision is to employ visual analytics techniques and tools to support teachers' dynamic diagnostic pedagogical decision-making in real time and in actual classrooms. The primary benefit of our vision is that learning analytics becomes an integral part of the teaching profession, so that teachers can provide timely, meaningful, and actionable formative assessments of ongoing learning activities in situ. Integrating emerging developments in visual analytics and the established methodological approach of design-based research (DBR) in the learning sciences, we introduce a new method called Teaching Analytics and explore a triadic model of teaching analytics (TMTA). TMTA adapts and extends the Pair Analytics method in visual analytics, which in turn was inspired by the pair programming model of the extreme programming paradigm. Our preliminary vision of TMTA consists of a collocated collaborative triad of a Teaching Expert (TE), a Visual Analytics Expert (VAE), and a Design-Based Research Expert (DBRE) analyzing, interpreting and acting upon real-time data generated by students' learning activities, using a range of visual analytics tools. We propose an implementation of TMTA using open learner models (OLM) and conclude with an outline of future work.

    Local and non-local measures of acceleration in cosmology

    Current cosmological observations, when interpreted within the framework of a homogeneous and isotropic Friedmann-Lemaitre-Robertson-Walker (FLRW) model, strongly suggest that the Universe is entering a period of accelerating expansion. This is often taken to mean that the expansion of space itself is accelerating. In a general spacetime, however, this is not necessarily true. We attempt to clarify this point by considering a handful of local and non-local measures of acceleration in a variety of inhomogeneous cosmological models. Each of the chosen measures corresponds to a theoretical or observational procedure that has previously been used to study acceleration in cosmology, and all measures reduce to the same quantity in the limit of exact spatial homogeneity and isotropy. In statistically homogeneous and isotropic spacetimes, we find that the acceleration inferred from observations of the distance-redshift relation is closely related to the acceleration of the spatially averaged universe, but does not necessarily bear any resemblance to the average of the local acceleration of spacetime itself. For inhomogeneous spacetimes that do not display statistical homogeneity and isotropy, however, we find little correlation between the acceleration inferred from observations and the acceleration of the averaged spacetime. This shows that observations made in an inhomogeneous universe can imply acceleration without the existence of dark energy.
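
    As a hedged illustration of the FLRW limit mentioned above (our notation, not necessarily the authors'): the natural single quantity that such measures reduce to is the familiar deceleration parameter of the scale factor a(t):

```latex
% Deceleration parameter of an FLRW model with scale factor a(t);
% accelerating expansion in this limit means \ddot{a} > 0, i.e. q < 0.
q \equiv -\frac{\ddot{a}\,a}{\dot{a}^{2}}
```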

    On polymorphic logical gates in sub-excitable chemical medium

    In a sub-excitable light-sensitive Belousov-Zhabotinsky chemical medium, an asymmetric disturbance causes the formation of localized traveling wave-fragments. Under the right conditions these wave-fragments can conserve their shape and velocity vectors for extended time periods. The size and life span of a fragment depend on the illumination level of the medium. When two or more wave-fragments collide, they annihilate or merge into a new wave-fragment. In computer simulations based on the Oregonator model we demonstrate that the outcomes of inter-fragment collisions can be controlled by varying the illumination level applied to the medium. We interpret these wave-fragments as values of Boolean variables and design collision-based polymorphic logical gates. The gate implements the XNOR operation at low illumination and acts as a NOR gate at high illumination. Since NOR is a universal gate, this demonstrates that a simulated light-sensitive BZ medium exhibits computational universality.
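
    A minimal sketch of the polymorphism described above (the numeric threshold and function names are our assumptions, not the paper's model): the same gate structure yields XNOR below an illumination threshold and NOR above it:

```python
# Sketch of a polymorphic collision-based gate: one structure computes
# XNOR at low illumination and NOR at high illumination. The numeric
# threshold is an illustrative assumption, not a value from the paper.
LOW_HIGH_THRESHOLD = 0.5  # hypothetical normalized illumination level

def polymorphic_gate(x: bool, y: bool, illumination: float) -> bool:
    """Output of the gate for Boolean inputs x, y at a given illumination."""
    if illumination < LOW_HIGH_THRESHOLD:
        return x == y          # XNOR: true when the inputs agree
    return not (x or y)        # NOR: true only when both inputs are false

# Print the truth table in both regimes.
for level, regime in [(0.2, "low illumination (XNOR)"),
                      (0.8, "high illumination (NOR)")]:
    print(regime)
    for x in (False, True):
        for y in (False, True):
            print(f"  {int(x)} {int(y)} -> {int(polymorphic_gate(x, y, level))}")
```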

    Triage, decision-making and follow-up of patients referred to a UK forensic service: validation of the DUNDRUM toolkit

    BACKGROUND: Forensic medium secure services in the UK are a scarce but essential resource, providing care for those in the criminal justice system with severe mental disorder. Appropriate allocation of beds to those most in need is essential to ensure efficient use of this resource. To improve decision-making processes in a UK forensic service, an admissions panel utilized the DUNDRUM 1 & 2 (D1 & D2) triage instruments. METHODS: Demographic, diagnostic and clinical information on a prospective sample of referrals to a UK adult forensic service was gathered (n = 195). D1 and D2 measures were scored by a panel of clinical managers, who considered referral information and clinician opinion in reaching their ratings; those not admitted were also followed up. RESULTS: Within the sample, D1 ratings were predictive of decisions to admit (AUC = .79) and also differentiated between levels of security (F(4) = 16.54, p < .001). Non-admission was not significantly associated with increased risk of offending at follow-up. Items relating to self-harm and institutional behaviour did not show a predictive relationship with the panel decision to admit. CONCLUSIONS: Use of a structured professional judgement tool showing good predictive validity has improved the transparency of decisions and appears to be associated with more efficient use of resources, without increased risk to the public.
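
    As a hedged illustration of the headline statistic (all data invented, not from the study), the predictive validity of triage ratings for admission decisions can be quantified as an area under the ROC curve, e.g. with scikit-learn:

```python
# Hypothetical D1 triage scores vs. binary panel admission decisions.
# All values are invented for illustration; the study reports AUC = .79.
from sklearn.metrics import roc_auc_score

d1_scores = [30, 27, 25, 33, 23, 22, 18, 26, 16, 20]  # D1 ratings (invented)
admitted  = [ 1,  1,  1,  1,  1,  0,  0,  0,  0,  0]  # 1 = admitted

print(f"AUC for D1 ratings predicting admission: "
      f"{roc_auc_score(admitted, d1_scores):.2f}")
```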

    Photographic identification of individuals of a free-ranging, small terrestrial vertebrate

    Recognition of individuals within an animal population is central to a range of estimates about population structure and dynamics. However, traditional methods of distinguishing individuals, by some form of physical marking, often rely on capture and handling, which may affect aspects of normal behavior. Photographic identification has been used as a less-invasive alternative, but limitations in both manual and computer-automated recognition of individuals are particularly problematic for smaller taxa (<500 g). In this study, we explored the use of photographic identification for individuals of a free-ranging, small terrestrial reptile using (a) independent observers, and (b) automated matching with the Interactive Individual Identification System (I3S Pattern) computer algorithm. We tested the technique on individuals of an Australian skink in the Egernia group, Slater’s skink Liopholis slateri, whose natural history and varied scale markings make it a potentially suitable candidate for photo-identification. From ‘photographic captures’ of skink head profiles, we designed a multi-choice key based on alternate character states and tested the abilities of observers, with or without experience in wildlife survey, to identify individuals using categorized test photos. We also used the I3S Pattern algorithm to match the same set of test photos against a database of 30 individuals. Experienced observers identified a significantly higher proportion of photos correctly (74%) than those with no experience (63%), while the I3S software correctly matched 67% of images as the first-ranked match and 83% within the top five ranks. This study is one of the first to investigate photo-identification with a free-ranging small vertebrate. The method demonstrated here has the potential to be applied to the developing field of camera-trap wildlife surveys and thus a wide range of survey and monitoring applications.
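
    A minimal sketch of the rank-based evaluation described above (all data and identifiers invented): given each test photo's ranked candidate list from a matcher such as I3S Pattern, count how often the true individual appears at rank 1 and within the top five:

```python
# Rank-1 and top-5 matching accuracy for photo identification.
# `ranked` maps each test photo to the matcher's ranked candidate IDs;
# `truth` gives the real individual. All data here are invented.
def rank_accuracy(ranked, truth, k):
    hits = sum(truth[p] in cands[:k] for p, cands in ranked.items())
    return hits / len(ranked)

ranked = {
    "photo_a": ["skink_07", "skink_01", "skink_13", "skink_22", "skink_05"],
    "photo_b": ["skink_02", "skink_19", "skink_07", "skink_30", "skink_11"],
    "photo_c": ["skink_21", "skink_04", "skink_09", "skink_16", "skink_19"],
}
truth = {"photo_a": "skink_07", "photo_b": "skink_19", "photo_c": "skink_25"}

print(f"rank-1 accuracy: {rank_accuracy(ranked, truth, 1):.0%}")
print(f"top-5 accuracy:  {rank_accuracy(ranked, truth, 5):.0%}")
```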

    Outlier ensembles: A robust method for damage detection and unsupervised feature extraction from high-dimensional data

    Outlier ensembles are shown to provide a robust method for damage detection and dimension reduction via a wholly unsupervised framework. Most interestingly, when utilised for feature extraction, the proposed heuristic defines features that enable near-equivalent classification performance (95.85%) compared to the features found in previous work through a supervised technique, specifically a genetic algorithm (97.39%). This is significant for practical applications of structural health monitoring, where labelled data are rarely available during data mining. Ensemble analysis is applied to practical examples of problematic engineering data; two case studies are presented in this work. Case study I illustrates how outlier ensembles can be used to expose outliers hidden within a dataset. Case study II demonstrates how ensembles can be utilised as a tool for robust outlier analysis and feature extraction in a noisy, high-dimensional feature space.
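
    A minimal sketch of the general idea, under our own assumptions rather than the paper's specific heuristic: score each point in many random feature subspaces and aggregate the scores, so that outliers confined to a few dimensions of a high-dimensional feature space are exposed:

```python
# Feature-bagging outlier ensemble (illustrative, not the paper's exact
# heuristic): z-score each point in random subspaces, average the scores.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))   # synthetic high-dimensional data
X[0, :5] += 6.0                  # hide an outlier in a few dimensions

def ensemble_outlier_scores(X, n_members=50, subspace_dim=5):
    n, d = X.shape
    scores = np.zeros(n)
    for _ in range(n_members):
        cols = rng.choice(d, size=subspace_dim, replace=False)
        Z = (X[:, cols] - X[:, cols].mean(axis=0)) / X[:, cols].std(axis=0)
        scores += np.linalg.norm(Z, axis=1)  # distance from the bulk
    return scores / n_members

scores = ensemble_outlier_scores(X)
print("most anomalous point:", int(np.argmax(scores)))  # expect index 0
```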

    Towards the probabilistic analysis of small bowel capsule endoscopy features to predict severity of duodenal histology in patients with villous atrophy

    Small bowel capsule endoscopy (SBCE) can be complementary to histological assessment of celiac disease (CD) and serology-negative villous atrophy (SNVA). Determining the severity of disease on SBCE using statistical machine learning methods can be useful in the follow-up of patients, and SBCE can play an additional role in differentiating between CD and SNVA. De-identified SBCEs of patients with CD and SNVA were included. Probabilistic analysis of features on SBCE was used to predict the severity of duodenal histology and to distinguish between CD and SNVA. Patients with higher Marsh scores were more likely to have a positive SBCE and a continuous distribution of macroscopic features of disease than those with lower Marsh scores. The same pattern was also true for patients with CD when compared to patients with SNVA. The validation accuracy when predicting the severity of Marsh scores and when distinguishing between CD and SNVA was 69.1% in both cases. When the proportions of each SBCE class group within the dataset were included in the classification model to distinguish between the two pathologies, the validation accuracy increased to 75.3%. The findings of this work suggest that, by using features of CD and SNVA on SBCE, predictions can be made of the type of pathology and the severity of disease.
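
    A hedged illustration of why including class proportions can help (invented numbers, not study data): in a probabilistic classifier the posterior combines the feature likelihoods with the prevalence of each class, so the dataset's class proportions reweight borderline predictions:

```python
# Bayes' rule with class priors (invented numbers, not study data):
# including the class proportions P(class) reweights the likelihoods.
likelihood = {"CD": 0.30, "SNVA": 0.20}  # P(features | class), invented
prior      = {"CD": 0.70, "SNVA": 0.30}  # class proportions in the dataset

evidence = sum(likelihood[c] * prior[c] for c in likelihood)
posterior = {c: likelihood[c] * prior[c] / evidence for c in likelihood}
print(posterior)  # the prior pulls borderline cases toward the common class
```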

    The isotropic blackbody CMB as evidence for a homogeneous universe

    The question of whether the Universe is spatially homogeneous and isotropic on the largest scales is of fundamental importance to cosmology, but has not yet been answered decisively. Surprisingly, neither an isotropic primary CMB nor combined observations of luminosity distances and galaxy number counts are sufficient to establish such a result. The inclusion of the Sunyaev-Zel'dovich effect in CMB observations, however, dramatically improves this situation. We show that even a solitary observer who sees an isotropic blackbody CMB can conclude that the universe is homogeneous and isotropic in their causal past when the Sunyaev-Zel'dovich effect is present. Critically, however, the CMB must either be viewed for an extended period of time, or CMB photons that have scattered more than once must be detected. This result provides a theoretical underpinning for testing the Cosmological Principle with observations of the CMB alone.

    Size of Outbreaks Near the Epidemic Threshold

    The spread of infectious diseases near the epidemic threshold is investigated. Scaling laws for the size and the duration of outbreaks originating from a single infected individual in a large susceptible population are obtained. The maximal size of an outbreak n_* scales as N^{2/3}, with N the population size. This scaling law implies that the average outbreak size scales as N^{1/3}. Moreover, the maximal and the average duration of an outbreak grow as t_* ~ N^{1/3} and ⟨t⟩ ~ ln N, respectively.
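
    A rough Monte Carlo sketch of these scaling laws under our own assumptions (a discrete-generation SIR process at threshold, not necessarily the authors' exact model): average outbreak sizes from a single seed should grow roughly as N^{1/3}:

```python
# Monte Carlo sketch of outbreak sizes at the epidemic threshold (R0 = 1),
# using a discrete-generation SIR process; our own assumptions throughout.
import numpy as np

rng = np.random.default_rng(1)

def outbreak_size(N):
    susceptible, infective, size = N - 1, 1, 1
    while infective > 0 and susceptible > 0:
        contacts = rng.poisson(1.0, size=infective).sum()  # R0 = 1
        new = rng.binomial(contacts, susceptible / N)      # contacts hitting susceptibles
        new = min(new, susceptible)
        susceptible -= new
        size += new
        infective = new
    return size

for N in (10**3, 10**4, 10**5):
    sizes = [outbreak_size(N) for _ in range(2000)]
    print(f"N = {N:>6}: mean outbreak size = {np.mean(sizes):7.1f}"
          f"  (N^(1/3) = {N ** (1 / 3):5.1f})")
```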

    Medication Reconciliation of Medically-Complex Emergency Department Patients by Second-Year Professional Pharmacy Students

    Background: There is a high potential for medication discrepancies to occur during patient care transitions, so health professionals must find ways to reduce these and improve patient care, such as with medication reconciliation. This intervention is used to identify a patient’s most accurate medication list by comparing the medical record to another list obtained from the patient, hospital, or other provider. Pharmacists have a major role in this process because of their medicinal expertise, but pharmacist time is expensive, so using pharmacy students may be more cost-effective. Research has examined fourth-year professional pharmacy students (P4s) performing medication reconciliation during their advanced pharmacy practice experiences; however, no research currently exists that explores the efficacy of second-year (P2) students in this role. Objectives: The primary objective of this study is to determine the effect of P2 students performing medication reconciliation for high-risk patients undergoing care transitions within the emergency department, compared to the efforts of P4s as described in the literature. The secondary objective is to determine the impact on 30-day readmission rates. Methodology: This is a historical-controlled, prospective, observational study. Data collection will occur August 2015 through April 2016 at Miami Valley Hospital in Dayton, Ohio. A sample size of at least 34 subjects is required to obtain statistical significance. Subjects will be selected by purposive sampling based on inclusion and exclusion criteria. P2 students will perform medication reconciliation and complete a reporting form providing information on medication discrepancies, interventions, and readmission dates. Analysis: Researchers will use descriptive statistics, such as mean, mode, and standard deviation, to report each set of data based on normality. One-sample t-tests will also be used to compare P2 data with pre-existing P4 data found in the literature.
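
    A minimal sketch of the planned analysis (all values invented): a one-sample t-test comparing P2 discrepancy counts against a fixed reference mean taken from published P4 data, using SciPy:

```python
# One-sample t-test: P2 medication-discrepancy counts (invented) vs. a
# published P4 reference mean (also invented for this sketch).
from scipy import stats

p2_discrepancies = [3, 5, 2, 4, 6, 3, 4, 5, 2, 7]  # per patient, invented
p4_reference_mean = 3.2                            # from literature, invented

t_stat, p_value = stats.ttest_1samp(p2_discrepancies, p4_reference_mean)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```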