1,731 research outputs found

    Differentiated Instruction at Work. Reinforcing the art of classroom observation through the creation of a checklist for beginning and pre-service teachers

    Professional experience is viewed as integral to shaping philosophy and acquiring skills in classroom teaching. Classrooms are complex places, with educators implementing differentiated strategies to cater for student diversity. Pre-service teachers who observe these lessons often miss the intuitive practices, as there is much to absorb during a typical observation session. Equipping them with a checklist enhances this experience, giving them intentional guidelines for observation. The current study used a qualitative approach to gain an understanding of the specific dynamics that shape a pre-service teacher's professional experience. The intersection of the data and the literature led to the creation of a checklist for use by beginning and pre-service teachers. The checklist may be used by teacher educators as an instrument to assist with the preparation of teachers, as it could help with homing in on key elements of observing classroom practice and differentiated strategies.

    Causal connectivity of evolved neural networks during behavior

    To show how causal interactions in neural dynamics are modulated by behavior, it is valuable to analyze these interactions without perturbing or lesioning the neural mechanism. This paper proposes a method, based on a graph-theoretic extension of vector autoregressive modeling and 'Granger causality', for characterizing causal interactions generated within intact neural mechanisms. This method, called 'causal connectivity analysis', is illustrated via model neural networks optimized for controlling target fixation in a simulated head-eye system, in which the structure of the environment can be experimentally varied. Causal connectivity analysis of this model yields novel insights into neural mechanisms underlying sensorimotor coordination. In contrast to networks supporting comparatively simple behavior, networks supporting rich adaptive behavior show a higher density of causal interactions, as well as a stronger causal flow from sensory inputs to motor outputs. They also show different arrangements of 'causal sources' and 'causal sinks': nodes that differentially affect, or are affected by, the remainder of the network. Finally, analysis of causal connectivity can predict the functional consequences of network lesions. These results suggest that causal connectivity analysis may have useful applications in the analysis of neural dynamics.
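    The Granger-causality machinery underlying this analysis can be sketched with plain least squares. The following is a minimal illustration, not the paper's implementation: lag-1 autoregressions on a hypothetical pair of coupled series in which y drives x.

```python
import numpy as np

def granger_f(x, y, lag=1):
    """Granger causality from y to x: log ratio of residual variances of
    the restricted model (x's own past) vs the full model (x's and y's past)."""
    n = len(x)
    x_past = np.column_stack([x[lag - k - 1:n - k - 1] for k in range(lag)])
    y_past = np.column_stack([y[lag - k - 1:n - k - 1] for k in range(lag)])
    target = x[lag:]

    def rss(design):
        design = np.column_stack([np.ones(len(target)), design])
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        return resid @ resid

    return np.log(rss(x_past) / rss(np.hstack([x_past, y_past])))

# Hypothetical coupled pair: y drives x with a one-step delay
rng = np.random.default_rng(0)
y = rng.normal(size=2000)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.normal()

print(granger_f(x, y))  # clearly positive: y Granger-causes x
print(granger_f(y, x))  # near zero: no causal flow back from x to y
```

    A causal connectivity graph would be obtained by thresholding such pairwise (or conditional) statistics over all node pairs.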

    Belief-propagation algorithm and the Ising model on networks with arbitrary distributions of motifs

    We generalize the belief-propagation algorithm to sparse random networks with arbitrary distributions of motifs (triangles, loops, etc.). Each vertex in these networks belongs to a given set of motifs (a generalization of the configuration model). These networks can be treated as sparse uncorrelated hypergraphs in which hyperedges represent motifs. Here a hypergraph is a generalization of a graph, where a hyperedge can connect any number of vertices. These uncorrelated hypergraphs are tree-like (hypertrees), which crucially simplifies the problem and allows us to apply the belief-propagation algorithm to these loopy networks with arbitrary motifs. As natural examples, we consider motifs in the form of finite loops and cliques. We apply the belief-propagation algorithm to the ferromagnetic Ising model on the resulting random networks. We obtain an exact solution of this model on networks with finite loops or cliques as motifs. We find an exact critical temperature of the ferromagnetic phase transition and demonstrate that as the clustering coefficient and the loop size increase, the critical temperature rises above that of ordinary tree-like complex networks. Our solution also gives the birth point of the giant connected component in these loopy networks.
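    For the ordinary locally tree-like case that this work generalizes, belief propagation for the ferromagnetic Ising model reduces to a single cavity-field update. A minimal sketch for a random k-regular network (the k = 3 example and symbols are illustrative, not taken from the paper), whose critical temperature is Tc = J / atanh(1/(k-1)):

```python
import numpy as np

def bp_magnetization(beta, J=1.0, k=3, iters=5000):
    """Iterate the (symmetric) BP cavity-field update for the Ising model on
    a tree-like random k-regular network, starting from a tiny
    symmetry-breaking seed, and return the magnetization."""
    h = 1e-6
    for _ in range(iters):
        u = np.arctanh(np.tanh(beta * J) * np.tanh(beta * h)) / beta
        h = (k - 1) * u           # cavity field sums k-1 incoming messages
    return np.tanh(beta * k * u)  # full field sums all k neighbours

# Tc = J / atanh(1/(k-1)) ~= 1.82 for k = 3, J = 1
print(bp_magnetization(beta=1.0))  # T = 1.0 < Tc: spontaneous magnetization
print(bp_magnetization(beta=0.4))  # T = 2.5 > Tc: magnetization vanishes
```

    The paper's contribution replaces the single-edge message here with messages on hyperedges (motifs), which shifts this critical point.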

    Granger causality and transfer entropy are equivalent for Gaussian variables

    Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction in a similarly wide field. While it has been recognized that the two concepts must be related, the exact relationship has until now not been formally described. Here we show that for Gaussian variables, Granger causality and transfer entropy are entirely equivalent, thus bridging autoregressive and information-theoretic approaches to data-driven causal inference. (In review, Phys. Rev. Lett.)
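    The equivalence (F = 2·TE for Gaussians) is easy to check numerically by computing the two quantities via different routes: transfer entropy from Gaussian entropies of empirical covariances, and Granger causality from regression residual variances. A sketch with a hypothetical lag-1 bivariate process:

```python
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy 0.5 * log((2*pi*e)^d * det cov) of a Gaussian."""
    cov = np.atleast_2d(cov)
    d = cov.shape[0]
    return 0.5 * (d * np.log(2 * np.pi * np.e) + np.linalg.slogdet(cov)[1])

# Hypothetical lag-1 process in which y drives x
rng = np.random.default_rng(1)
n = 20000
y = rng.normal(size=n)
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + 0.5 * y[t - 1] + e[t]

# Columns: (X_t, X_{t-1}, Y_{t-1})
Z = np.column_stack([x[1:], x[:-1], y[:-1]])
S = np.cov(Z, rowvar=False)

# Transfer entropy y -> x from Gaussian entropies:
# TE = H(X_t, Xp) - H(Xp) - H(X_t, Xp, Yp) + H(Xp, Yp)
te = (gaussian_entropy(S[:2, :2]) - gaussian_entropy(S[1, 1])
      - gaussian_entropy(S) + gaussian_entropy(S[1:, 1:]))

# Granger causality y -> x as a log ratio of regression residual variances
def resid_var(target, design):
    design = np.column_stack([np.ones(len(target)), design])
    beta, *_ = np.linalg.lstsq(design, target, rcond=None)
    r = target - design @ beta
    return r @ r / len(r)

gc = np.log(resid_var(x[1:], x[:-1]) / resid_var(x[1:], Z[:, 1:]))
print(gc, 2 * te)  # the two agree: F = 2 * TE for Gaussian variables
```

    Both routes reduce to the same ratio of conditional variances, so agreement here is exact up to floating-point error, not merely approximate.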

    Cleaning sky survey databases using Hough Transform and Renewal String approaches

    Large astronomical databases obtained from sky surveys such as the SuperCOSMOS Sky Survey (SSS) invariably suffer from spurious records caused by artefactual effects of the telescope, satellites and junk objects in orbit around Earth, and physical defects on the photographic plate or CCD. Though relatively small in number, these spurious records present a significant problem in many situations, where they can become a large proportion of the records potentially of interest to a given astronomer. Accurate and robust techniques are needed for locating and flagging such spurious objects, and we are undertaking a programme investigating the use of machine learning techniques in this context. In this paper we focus on the four most common causes of unwanted records in the SSS: satellite or aeroplane tracks; scratches, fibres and other linear phenomena introduced to the plate; circular halos around bright stars due to internal reflections within the telescope; and diffraction spikes near bright stars. Appropriate techniques are developed for the detection of each of these. The methods are applied to the SSS data to develop a dataset of spurious object detections, along with confidence measures, which can allow these unwanted data to be removed from consideration. These methods are general and can be adapted to other astronomical survey data. (Accepted for MNRAS. A version of this paper including the figures can be downloaded from http://www.anc.ed.ac.uk/~amos/publications.html and more details on this project can be found at http://www.anc.ed.ac.uk/~amos/sattrackres.htm)
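    Track detection of this kind typically rests on the Hough transform: each foreground pixel votes for every line rho = x·cos(theta) + y·sin(theta) passing through it, and a peak in the (theta, rho) accumulator marks a linear feature. A minimal sketch on synthetic data, not the paper's pipeline:

```python
import numpy as np

def hough_peak(points, shape, n_theta=180):
    """Minimal Hough transform: each foreground pixel (y, x) votes for all
    lines rho = x*cos(theta) + y*sin(theta) through it; the accumulator
    peak gives the dominant line's parameters and its vote count."""
    h, w = shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, 2 * diag + 1), dtype=int)
    for y, x in points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[np.arange(n_theta), rhos + diag] += 1
    i, j = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[i], j - diag, acc[i, j]

# Synthetic "track": the horizontal line y = 20 across a 100x100 image
pts = [(20, x) for x in range(100)]
theta, rho, votes = hough_peak(pts, (100, 100))
print(theta, rho, votes)  # theta ~= pi/2, rho = 20, all 100 pixels vote
```

    A real survey pipeline would threshold the accumulator, mask the detected track, and repeat for further lines.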

    A Theory of Cheap Control in Embodied Systems

    We present a framework for designing cheap control architectures for embodied agents. Our derivation is guided by the classical problem of universal approximation, whereby we explore the possibility of exploiting the agent's embodiment for a new and more efficient universal approximation of behaviors generated by sensorimotor control. This embodied universal approximation is compared with the classical non-embodied universal approximation. To exemplify our approach, we present a detailed quantitative case study for policy models defined in terms of conditional restricted Boltzmann machines. In contrast to non-embodied universal approximation, which requires an exponential number of parameters, in the embodied setting we are able to generate all possible behaviors with a drastically smaller model, thus obtaining cheap universal approximation. We test and corroborate the theory experimentally with a six-legged walking machine. The experiments show that the sufficient controller complexity predicted by our theory is tight, which means that the theory has direct practical implications. Keywords: cheap design, embodiment, sensorimotor loop, universal approximation, conditional restricted Boltzmann machines.

    Probabilities, causation, and logic programming in conditional reasoning: reply to Stenning and Van Lambalgen (2016)

    Oaksford and Chater (2014, Thinking and Reasoning, 20, 269–295) critiqued the logic programming (LP) approach to nonmonotonicity and proposed that a Bayesian probabilistic approach to conditional reasoning provided a more empirically adequate theory. The current paper is a reply to Stenning and van Lambalgen's rejoinder to this earlier paper, entitled 'Logic programming, probability, and two-system accounts of reasoning: a rejoinder to Oaksford and Chater' (2016) in Thinking and Reasoning. It is argued that causation is basic in human cognition and that explaining how abnormality lists are created in LP requires causal models. Each specific rejoinder to the original critique is then addressed. While many areas of agreement are identified, with respect to the key differences it is concluded that the current evidence favours the Bayesian approach, at least for the moment.

    Discovering a junction tree behind a Markov network by a greedy algorithm

    In an earlier paper we introduced a special kind of k-width junction tree, called the k-th order t-cherry junction tree, in order to approximate a joint probability distribution. The approximation is best when the Kullback-Leibler divergence between the true joint probability distribution and the approximating one is minimal. Finding the best approximating k-width junction tree is NP-complete for k > 2. In our earlier paper we also proved that the best approximating k-width junction tree can be embedded into a k-th order t-cherry junction tree. We introduce a greedy algorithm that yields very good approximations in reasonable computing time. In this paper we prove that if the underlying Markov network fulfils certain requirements, then our greedy algorithm finds the true probability distribution or its best approximation in the family of k-th order t-cherry tree probability distributions. Our algorithm uses only the k-th order marginal probability distributions as input. We compare the results of the greedy algorithm proposed in this paper with those of the greedy algorithm proposed by Malvestuto in 1991. (Presented at VOCAL 2010 in Veszprém, Hungary.)
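    For k = 2 this kind of approximation reduces to the classical Chow-Liu construction: a greedy maximum-weight spanning tree over pairwise mutual information minimizes the KL divergence among tree-structured distributions. A minimal sketch of that special case (illustrative only, not the paper's t-cherry algorithm):

```python
import numpy as np
from itertools import combinations

def mutual_info(x, y):
    """Plug-in mutual information (nats) of two discrete samples."""
    mi = 0.0
    for a in np.unique(x):
        for b in np.unique(y):
            pab = np.mean((x == a) & (y == b))
            if pab > 0:
                mi += pab * np.log(pab / (np.mean(x == a) * np.mean(y == b)))
    return mi

def chow_liu(data):
    """Kruskal-style greedy maximum-weight spanning tree with pairwise
    mutual information as edge weights (the k = 2 special case)."""
    d = data.shape[1]
    edges = sorted(((mutual_info(data[:, i], data[:, j]), i, j)
                    for i, j in combinations(range(d), 2)), reverse=True)
    parent = list(range(d))

    def find(u):  # union-find with path halving
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # keep only edges that do not close a loop
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Chain-structured binary data: x0 -> x1 -> x2 (10% flip noise per link)
rng = np.random.default_rng(2)
n = 5000
x0 = rng.integers(0, 2, n)
x1 = np.where(rng.random(n) < 0.9, x0, 1 - x0)
x2 = np.where(rng.random(n) < 0.9, x1, 1 - x1)
data = np.column_stack([x0, x1, x2])
print(sorted(chow_liu(data)))  # recovers the chain: [(0, 1), (1, 2)]
```

    The paper's algorithm generalizes this greedy step from edges (pairs) to k-element clusters, using k-th order marginals as input.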

    Screening for left ventricular hypertrophy in patients with type 2 diabetes mellitus in the community

    Background: Left ventricular hypertrophy (LVH) is a strong predictor of cardiovascular disease and is common among patients with type 2 diabetes. However, no systematic screening for LVH is currently recommended for patients with type 2 diabetes. The purpose of this study was to determine whether NT-proBNP was superior to 12-lead electrocardiography (ECG) for detection of LVH in patients with type 2 diabetes.
    Methods: Prospective cross-sectional study comparing the diagnostic accuracy of ECG and NT-proBNP for the detection of LVH among patients with type 2 diabetes. Inclusion criteria included having been diagnosed for > 5 years and/or being on treatment for type 2 diabetes; patients with stage 3/4 chronic kidney disease and known cardiovascular disease were excluded. ECG LVH was defined by either the Sokolow-Lyon or the Cornell voltage criteria. NT-proBNP level was measured using the Roche Diagnostics Elecsys assay. Left ventricular mass was assessed from echocardiography. Receiver operating characteristic curve analysis was carried out and the area under the curve (AUC) was calculated.
    Results: 294 patients with type 2 diabetes were recruited: mean age 58 (SD 11) years, BP 134/81 ± 18/11 mmHg, HbA1c 7.3 ± 1.5%. LVH was present in 164 patients (56%). In a logistic regression model, age, gender, BMI and a history of hypertension were important determinants of LVH (p < 0.05). Only 5 patients with LVH were detected by either ECG voltage criterion. The AUC for NT-proBNP in detecting LVH was 0.68.
    Conclusions: LVH was highly prevalent in asymptomatic patients with type 2 diabetes. ECG was an inadequate test to identify LVH, and while NT-proBNP was superior to ECG, it remained unsuitable for detecting LVH. Thus, there remains a need for a screening tool to detect LVH in primary care patients with type 2 diabetes to enhance risk stratification and management.
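    The AUC statistic reported above has a direct probabilistic reading via the Mann-Whitney statistic: it is the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch with made-up scores, not the study's data:

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """ROC area via the Mann-Whitney statistic: probability that a random
    positive outscores a random negative, counting ties as one half."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return (pos > neg).mean() + 0.5 * (pos == neg).mean()

print(auc([3.0, 2.0, 4.0], [1.0, 2.0]))  # 5 wins + 1 tie of 6 pairs -> 11/12
```

    On this reading, an AUC of 0.68 means NT-proBNP ranks a patient with LVH above one without only about two-thirds of the time.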