Supervisory Practices of Three Female Principals in the Era of No Child Left Behind
The purpose of this study was to describe the status of teacher supervision and evaluation in the era of No Child Left Behind (NCLB) as experienced by three female elementary principals and twelve female elementary teachers in a suburban school district in Western Pennsylvania. The study compared findings from the literature in the areas of supervision and evaluation, leadership, communication style, power orientation, and ethic of care with the beliefs and realities of present practice. The literature cited focused on the ways female principals enact the role of instructional leader when supervising and evaluating teachers. The study took the form of a case study in order to provide a detailed description of a single school district in Western Pennsylvania. Three elementary schools, each headed by a female principal, were studied in the district. Interview questions were constructed based on the research questions. Each interview was transcribed, and content analysis was employed to identify commonalities in the data. Common themes were identified for each research question based on the responses of the principals and teachers. The study revealed profound consistency between the literature and the information reported by the three elementary principals and twelve female elementary teachers in the areas of supervision and evaluation, leadership, communication style, power orientation, and ethic of care. The study also revealed potential conflicts between the beliefs of the principals and the NCLB legislation, and the effects of NCLB on the practices of the principals and teachers.
School-university partnerships: fulfilling the potential. Summary Report: October 2014
Seismic data clustering management system
This is the abstract of the paper given at the conference. Copyright © 2011 The Authors.
Over recent years, seismic images have played an increasingly vital role in the study of earthquakes. The large volume of seismic data that has accumulated has created the need for sophisticated systems to manage this kind of data. Seismic interpretation can play a much more active role in the evaluation of large volumes of data by providing, at an early stage, vital information about the framework of potential producing levels [1]. This work presents a novel method to manage and analyse seismic data. The data is first turned into clustering maps using clustering techniques [2] [3] [4] [5] [6], so that it can be analysed on the platform. These clustering maps can then be analysed through the user-friendly interface of Seismic 1, which is based on the .Net framework architecture [7]. This permits the application to be ported to any Windows-based computer, as well as to many Linux-based environments using the Mono project functionality [8], so the application can run using No-Touch Deployment [7]. The platform supports two ways of processing seismic data. Firstly, a fast multifunctional version of the classical region-growing segmentation algorithm [9], [10] is applied to areas of interest, permitting their precise definition and labelling. This algorithm is also assigned to automatically allocate new earthquakes to a particular cluster based upon the centre of gravity of the existing clusters, or to create a new cluster if all centres of gravity lie beyond a user-defined upper threshold. Secondly, a visual technique is used to record the behaviour of a cluster of earthquakes in a designated area. In this way, the system functions as a dynamic temporal simulator which depicts sequences of earthquakes on a map [11].
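The cluster-assignment rule described above (join the nearest existing cluster's centre of gravity, or open a new cluster when every centre lies beyond a user-defined threshold) can be sketched as follows. This is an illustrative interpretation only: the function name, the planar-distance criterion, and the event representation are assumptions, not the authors' exact implementation.

```python
import math

def assign_event(event, clusters, threshold):
    """Assign an earthquake event (x, y) to the cluster whose centre of
    gravity is nearest, or start a new cluster if every centre lies
    beyond the user-defined threshold.  Sketch only: the distance
    criterion and data layout are assumptions, not the paper's rule."""
    best_idx, best_dist = None, float("inf")
    for i, cluster in enumerate(clusters):
        # centre of gravity of the cluster's member events
        cx = sum(e[0] for e in cluster) / len(cluster)
        cy = sum(e[1] for e in cluster) / len(cluster)
        d = math.hypot(event[0] - cx, event[1] - cy)
        if d < best_dist:
            best_idx, best_dist = i, d
    if best_idx is not None and best_dist <= threshold:
        clusters[best_idx].append(event)   # join the nearest cluster
    else:
        clusters.append([event])           # open a new cluster
    return clusters
```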
A new approach to analysing HST spatial scans: the transmission spectrum of HD 209458 b
The Wide Field Camera 3 (WFC3) on the Hubble Space Telescope (HST) is currently one of the most widely used instruments for observing exoplanetary atmospheres, especially with the spatial scanning technique. An increasing number of exoplanets have been studied with this technique, as it enables the observation of bright targets without saturating the sensitive detectors. In this work we present a new pipeline for analyzing data obtained with the spatial scanning technique, starting from the raw data provided by the instrument. In addition to commonly used correction techniques, we take into account the geometric distortions of the instrument, whose impact may become important when combined with the scanning process. Our approach can improve the photometric precision for existing data and also push the limits of the spatial scanning technique further, as it allows the analysis of even longer spatial scans. As an application of our method and pipeline, we present the results of a reanalysis of the spatially scanned transit spectrum of HD 209458 b. We calculate the transit depth per wavelength channel with an average relative uncertainty of 40 ppm. We interpret the final spectrum with T-Rex, our fully Bayesian spectral retrieval code, which confirms the presence of water vapor and clouds in the atmosphere of HD 209458 b. The narrow wavelength range limits our ability to disentangle the degeneracies between the fitted atmospheric parameters; additional data over a broader spectral range are needed to address this issue.
Comment: 13 pages, 15 figures, 7 tables, Accepted for publication in Ap
Neural Insights into the Relation between Language and Communication
The human capacity to communicate has been hypothesized to be causally dependent upon language. Intuitively, this seems plausible, since most communication relies on language. Moreover, intention recognition abilities (a necessary prerequisite for communication) and language development seem to co-develop. Here we review evidence from neuroimaging as well as from neuropsychology to evaluate the relationship between communicative and linguistic abilities. Our review indicates that communicative abilities are best considered as neurally distinct from language abilities. This conclusion is based upon evidence showing that humans rely on different cortical systems when designing a communicative message for someone else than when performing core linguistic tasks, as well as upon observations of individuals with severe language loss after extensive lesions to the language system, who are still able to perform tasks involving intention understanding.
Formulaic Language in People with Probable Alzheimer's Disease: A Frequency-Based Approach
BACKGROUND: Language change can be a valuable biological marker of overall cognitive change in Alzheimer's disease (AD) and other forms of dementia. Previous reports have described increased use of language formulas in AD, i.e., combinations likely processed in a holistic manner. Words that commonly occur together are more likely to become a formula. OBJECTIVE: To determine whether frequency of co-occurrence, as one indicator of formulaic language, can distinguish people with probable AD from controls, and whether these variables are sensitive to time post-symptom onset. METHODS: We developed the Frequency in Language Analysis Tool (FLAT), which indicates degrees of formulaicity in an individual language sample. The FLAT accomplishes this by comparing individual language samples to co-occurrence data from the British National Corpus (BNC). Our analysis also included more conventional language variables in order to assess the novel contributions of the FLAT. We analyzed data from the Pitt Corpus, which is part of DementiaBank. RESULTS: Both conventional and co-occurrence variables were able to distinguish the AD and control groups. According to the co-occurrence data, people with probable AD produced more formulaic language than controls. Only co-occurrence variables correlated with disease progression. DISCUSSION: Frequency of word co-occurrence is one indicator of formulaicity and a valuable contribution to characterizing language change in AD.
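The frequency-based idea behind the FLAT (score a sample by how often its word pairs co-occur in a reference corpus) can be sketched as below. This is a toy illustration, not the tool itself: the function name is hypothetical, and `bigram_freq` stands in for co-occurrence counts drawn from a corpus such as the BNC.

```python
def formulaicity_score(tokens, bigram_freq):
    """Score a language sample by the mean reference-corpus frequency
    of its adjacent word pairs: higher scores suggest more formulaic
    (frequently co-occurring) language.  Toy sketch of the
    frequency-based approach, not the FLAT's actual metric."""
    pairs = list(zip(tokens, tokens[1:]))
    if not pairs:
        return 0.0
    return sum(bigram_freq.get(p, 0) for p in pairs) / len(pairs)
```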
Teaching schools evaluation. Research Brief
This Research Brief reports the findings from a two-year study (2013-15) into the work of teaching schools and their alliances, commissioned by the National College for Teaching and Leadership (NCTL). The broad aim of the study was to investigate the effectiveness and impact of teaching schools on improvement, and to identify the quality and scope of external support required to enhance these. This was achieved by combining qualitative and quantitative data collection and analysis derived from three research activities: case studies of 26 teaching school alliances (TSAs), a national survey of the first three cohorts of 345 TSAs, and secondary research and analysis of national performance and inspection results.
3D freeform surfaces from planar sketches using neural networks
A novel intelligent approach to 3D freeform surface reconstruction from planar sketches is proposed. A multilayer perceptron (MLP) neural network is employed to induce 3D freeform surfaces from planar freehand curves. Planar curves were used to represent the boundaries of a freeform surface patch. The curves were varied iteratively and sampled to produce training data to train and test the neural network. The obtained results demonstrate that the network successfully learned the inverse-projection map and correctly inferred the respective surfaces from fresh curves.
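The input/output setup described above (flattened samples of the planar boundary curves in, a grid of surface heights out) can be sketched with a single-hidden-layer MLP forward pass. All dimensions, names, and the random weights below are illustrative assumptions; in the study the weights would be learned from the iteratively varied curve/surface training pairs.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer perceptron forward pass of the kind trained in
    the study to map sampled planar boundary curves to a 3D surface
    patch.  Weights are placeholders here, not trained values."""
    h = np.tanh(x @ W1 + b1)   # hidden-layer activation
    return h @ W2 + b2         # linear output: surface heights

# Illustrative dimensions (assumptions, not the paper's):
# 4 boundary curves x 16 sample points x 2 coords -> 128 inputs,
# predicting a 10 x 10 grid of surface heights (100 outputs).
n_in, n_hidden, n_out = 128, 64, 100
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out)); b2 = np.zeros(n_out)

curves = rng.normal(size=n_in)                       # flattened sampled curves
surface = mlp_forward(curves, W1, b1, W2, b2).reshape(10, 10)
```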