Policy Barriers to School Improvement: What's Real and What's Imagined?
Some of the most promising reforms are happening where school leaders are thinking differently about how to get the strongest student outcomes from the limited resources available. But even principals who use their autonomy to aggressively reallocate resources say that persistent district, state, and federal barriers prohibit them from doing more.

What are these barriers? What do they block principals from doing? Is there a way around them?

CRPE researchers probed these questions with principals in three states (NH, CT, MD). These principals cited numerous district, state, and federal barriers standing in the way of school improvement. The barriers, 128 in all, fell into three categories: 1) barriers to instructional innovations, 2) barriers to allocating resources differently, and 3) barriers to improving teacher quality.

Upon investigation, researchers found that principals have far more authority than they think. Only 31% of the barriers cited were "real" -- immovable statutes, policies, or managerial directives that bring the threat of real consequences if broken.

The report recommends educating principals on the authority they already possess, to help them find workarounds to onerous rules. The report also outlines a number of specific state and district policy changes to grant schools the autonomy they need to improve student outcomes.
Spectral Methods for Numerical Relativity. The Initial Data Problem
Numerical relativity has traditionally been pursued via finite differencing.
Here we explore pseudospectral collocation (PSC) as an alternative to finite
differencing, focusing particularly on the solution of the Hamiltonian
constraint (an elliptic partial differential equation) for a black hole
spacetime with angular momentum and for a black hole spacetime superposed with
gravitational radiation. In PSC, an approximate solution, generally expressed
as a sum over a set of orthogonal basis functions (e.g., Chebyshev
polynomials), is substituted into the exact system of equations and the
residual minimized. For systems with analytic solutions the approximate
solutions converge upon the exact solution exponentially as the number of basis
functions is increased. Consequently, PSC has a high computational efficiency:
for solutions of even modest accuracy we find that PSC is substantially more
efficient, as measured by either execution time or memory required, than finite
differencing; furthermore, these savings increase rapidly with increasing
accuracy. The solution provided by PSC is an analytic function given
everywhere; consequently, no interpolation operators need to be defined to
determine the function values at intermediate points and no special
arrangements need to be made to evaluate the solution or its derivatives on the
boundaries. Since the practice of numerical relativity by finite differencing
has been, and continues to be, hampered by both high computational resource
demands and the difficulty of formulating acceptable finite difference
alternatives to the analytic boundary conditions, PSC should be further pursued
as an alternative way of formulating the computational problem of finding
numerical solutions to the field equations of general relativity.
Comment: 15 pages, 5 figures, revtex, submitted to PR
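To make the collocation procedure concrete, below is a minimal one-dimensional sketch; it is not the authors' code (which solves the Hamiltonian constraint in three dimensions), just an illustration under simple assumptions. It solves the toy Dirichlet problem u'' = e^x on [-1, 1] by building a Chebyshev differentiation matrix on the Gauss-Lobatto points and forcing the residual to vanish at those points, then prints the error as the number of basis functions grows.

```python
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix on the N+1 Gauss-Lobatto points
    x_j = cos(pi*j/N) (the classic construction from Trefethen's
    'Spectral Methods in MATLAB')."""
    if N == 0:
        return np.zeros((1, 1)), np.ones(1)
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))  # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                      # diagonal via row sums
    return D, x

# Solve u'' = exp(x) with u(-1) = u(1) = 0; the exact solution is
# u(x) = exp(x) - x*sinh(1) - cosh(1).
for N in (4, 8, 12, 16):
    D, x = cheb(N)
    D2 = (D @ D)[1:-1, 1:-1]        # drop rows/cols for the boundary points
    u = np.zeros(N + 1)
    u[1:-1] = np.linalg.solve(D2, np.exp(x[1:-1]))
    exact = np.exp(x) - x * np.sinh(1.0) - np.cosh(1.0)
    print(f"N = {N:3d}   max error = {np.max(np.abs(u - exact)):.2e}")
```

The error drops by orders of magnitude each time N is increased by a few points, which is the exponential (spectral) convergence the abstract describes, and the reason PSC reaches a given accuracy with far fewer degrees of freedom than finite differencing.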
Determination of the absorption length of CO2, Nd:YAG and high power diode laser radiation for a selected grouting material
The laser beam absorption lengths of CO2, Nd:YAG and a high power diode laser (HPDL) radiation for a newly developed SiO2/Al2O3-based tile grout have been determined through the application of the Beer-Lambert law. The findings revealed marked differences in the absorption lengths despite the material having similar beam absorption coefficients for the lasers. The absorption lengths for the SiO2/Al2O3-based tile grout for CO2, Nd:YAG and HPDL radiation were calculated as being 23211 m, 1934 m and 1838 m respectively. Moreover, this method of laser beam absorption length determination, which has hitherto been used predominantly with lasers operated in the pulsed mode, is shown to be valid for use with lasers operated in the continuous wave (CW) mode, depending upon the material being treated.
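For reference, the Beer-Lambert law states that intensity decays with depth as I(z) = I0 exp(-z/L), where L is the absorption length (the reciprocal of the absorption coefficient). The snippet below is a generic illustration of extracting L from transmission measurements via a log-linear fit; the depths and intensity ratios are synthetic example values, not the paper's data.

```python
import numpy as np

def absorption_length(depths, transmission):
    """Fit the Beer-Lambert law I/I0 = exp(-z/L) to transmission ratios
    measured at several depths z, returning the absorption length L."""
    # Linearise: ln(I/I0) = -z/L, then least-squares fit for the slope.
    slope, _intercept = np.polyfit(depths, np.log(transmission), 1)
    return -1.0 / slope

# Illustrative synthetic measurements: deeper samples transmit less light.
z = np.array([0.5e-3, 1.0e-3, 1.5e-3, 2.0e-3])   # sample depths in metres
noise = 1 + 0.01 * np.random.default_rng(1).standard_normal(4)
t = np.exp(-z / 1.9e-3) * noise                  # simulated I/I0 ratios
print(f"estimated absorption length: {absorption_length(z, t):.2e} m")
```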
Parallel resampling in the particle filter
Modern parallel computing devices, such as the graphics processing unit
(GPU), have gained significant traction in scientific and statistical
computing. They are particularly well-suited to data-parallel algorithms such
as the particle filter, or more generally Sequential Monte Carlo (SMC), which
are increasingly used in statistical inference. SMC methods carry a set of
weighted particles through repeated propagation, weighting and resampling
steps. The propagation and weighting steps are straightforward to parallelise,
as they require only independent operations on each particle. The resampling
step is more difficult, as standard schemes require a collective operation,
such as a sum, across particle weights. Focusing on this resampling step, we
analyse two alternative schemes that do not involve a collective operation
(Metropolis and rejection resamplers), and compare them to standard schemes
(multinomial, stratified and systematic resamplers). We find that, in certain
circumstances, the alternative resamplers can perform significantly faster on a
GPU, and to a lesser extent on a CPU, than the standard approaches. Moreover,
in single precision, the standard approaches are numerically biased for upwards
of hundreds of thousands of particles, while the alternatives are not. This is
particularly important given greater single- than double-precision throughput
on modern devices, and the consequent temptation to use single precision with a
greater number of particles. Finally, we provide auxiliary functions useful for
implementation, such as for the permutation of ancestry vectors to enable
in-place propagation.
Comment: 21 pages, 6 figures
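The contrast at the heart of the paper can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's GPU implementation (the function names and the fixed chain length are illustrative choices, and the rejection variant, which needs an upper bound on the weights, is omitted): systematic resampling requires a cumulative sum over all weights, a collective operation, while the Metropolis resampler selects each ancestor using only pairwise weight ratios, so every particle can proceed independently.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Standard systematic resampler: needs a cumulative sum over ALL
    weights -- the collective operation that is hard to parallelise."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights) / np.sum(weights)   # collective step
    return np.searchsorted(cumulative, positions)

def metropolis_resample(weights, rng, steps=32):
    """Metropolis resampler: each ancestor is chosen by a short independent
    Markov chain over weight ratios, with no collective operation. The
    chain length `steps` trades bias against run time (the paper analyses
    how to choose it); 32 here is an arbitrary example value."""
    n = len(weights)
    ancestors = np.empty(n, dtype=int)
    for i in range(n):              # on a GPU, one thread per particle
        k = i
        for _ in range(steps):
            j = rng.integers(n)
            # accept j with probability min(1, w_j / w_k)
            if rng.random() * weights[k] < weights[j]:
                k = j
        ancestors[i] = k
    return ancestors

rng = np.random.default_rng(0)
w = rng.random(1_000)
print(systematic_resample(w, rng)[:10])
print(metropolis_resample(w, rng)[:10])
```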
Joint bidding, information pooling, and the performance of petroleum lease auctions / BEBR No.889
Includes bibliographical references (p. 22)
Collider Searches for Long-Lived Particles Beyond the Standard Model
Experimental tests of the Standard Model of particle physics (SM) find
excellent agreement with its predictions. Since the original formation of the
SM, experiments have provided little guidance regarding the explanations of
phenomena outside the SM, such as the baryon asymmetry and dark matter. Nor
have the aesthetic and theoretical problems of the SM been resolved, despite
years of searching for physics beyond the Standard Model (BSM) at particle
colliders. Some BSM particles can be produced at colliders yet evade
discovery if the reconstruction and analysis procedures are not matched to
the characteristics of the particle. One example is particles with long lifetimes.
As interest in searches for such long-lived particles (LLPs) grows rapidly, a
review of the topic is presented in this article. The broad range of
theoretical motivations for LLPs and the experimental strategies and methods
employed to search for them are described. Results from decades of LLP searches
are reviewed, as are opportunities for the next generation of searches at both
existing and future experiments.
Comment: 79 pages, 36 figures, submitted to Progress in Particle and Nuclear Physics
Lemons on the Web: A Signalling Approach to the Problem of Trust in Internet Commerce
Asymmetric information is at the heart of situations involving trust. In the case of B2C Internet commerce, the information asymmetry typically relates to the difficulty consumers have in distinguishing between "trustworthy" and "untrustworthy" Web merchants. The impasse can be resolved by the use of signals by trustworthy Web merchants to differentiate themselves from untrustworthy ones. Using an experimental design in which subjects are exposed to a series of purchase choices, we investigate three possible signals (an unconditional money-back guarantee, branding, and a privacy statement) and test their efficacy. Our empirical results confirm the predictions suggested by signalling theory.
Keywords: trust (social behaviour), consumer behaviour
Chicago Music City
Chicago Music City compares the strength and vitality of music industries and scenes across the United States. Sociologists, urban planners, and real-estate developers point to quality of life and availability of cultural amenities as important indicators of the health and future success of urban areas. Economic impact studies show the importance of music to local economies. This publication compares Chicago's musical strength with the 50 largest metropolitan areas in the U.S., focusing on 11 comparison cities: Chicago and its demographic peers, New York and Los Angeles, and eight other cities with strong musical reputations -- Atlanta, Austin, Boston, Las Vegas, Memphis, Nashville, New Orleans and Seattle