Secondary literacy across the curriculum: Challenges and possibilities
This paper discusses the challenges and possibilities attendant upon successfully implementing literacy-across-the-curriculum initiatives, or "school language policies" as they have come to be known, particularly at the secondary or high school level. It provides a theoretical background to these issues, exploring previous academic discussions of school language policies, and highlights key areas of concern as well as opportunity with respect to school implementation of such policies. As such, it provides a necessary conceptual background to the subsequent papers in this special issue, which focus upon the Secondary Schools' Literacy Initiative (SSLI), a New Zealand-funded programme that aims to establish cross-curricular language and literacy policies in secondary schools.
Eigenvalue Separation in Some Random Matrix Models
The eigenvalue density for members of the Gaussian orthogonal and unitary ensembles follows the Wigner semicircle law. If the Gaussian entries are all shifted by a constant amount c/√(2N), where N is the size of the matrix, then in the large-N limit a single eigenvalue separates from the support of the Wigner semicircle provided c > 1. In this study, using an asymptotic analysis of the secular equation for the eigenvalue condition, we compare this effect to analogous effects occurring in general-variance Wishart matrices and matrices from the shifted-mean chiral ensemble. We undertake an analogous comparative study of eigenvalue separation properties when the size of the matrices is fixed and c goes to infinity, as well as higher-rank analogues of this setting. This is done using exact expressions for eigenvalue probability densities in terms of generalized hypergeometric functions, and using the interpretation of the latter as a Green function in the Dyson Brownian motion model. For the shifted-mean Gaussian unitary ensemble and its analogues, an alternative approach is to use exact expressions for the correlation functions in terms of classical orthogonal polynomials and associated multiple generalizations. By using these exact expressions to compute and plot the eigenvalue density, illustrations of the various eigenvalue separation effects are obtained.
Comment: 25 pages, 9 figures included
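The basic separation effect is easy to check numerically. The sketch below (assuming NumPy, and a GOE-like normalization with off-diagonal variance 1/2 so that the threshold sits at c = 1) shifts every Gaussian entry by c/√(2N) and compares the top two eigenvalues; it is an illustration of the phenomenon, not the paper's asymptotic analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def shifted_goe_eigs(N, c):
    """Top two eigenvalues of an N x N GOE-like matrix whose Gaussian
    entries are all shifted by the constant c / sqrt(2N)."""
    A = rng.normal(size=(N, N))
    H = (A + A.T) / 2.0            # symmetric; off-diagonal variance 1/2
    H = H + c / np.sqrt(2.0 * N)   # constant shift of every entry
    eigs = np.sort(np.linalg.eigvalsh(H))
    return eigs[-1], eigs[-2]

# Below the threshold (c < 1) the largest eigenvalue stays near the
# semicircle edge ~ sqrt(2N); above it (c > 1) an outlier separates.
for c in (0.5, 3.0):
    lmax, l2 = shifted_goe_eigs(400, c)
    print(f"c = {c}: largest = {lmax:.2f}, second largest = {l2:.2f}")
```

In this normalization the semicircle edge for N = 400 lies near √800 ≈ 28.3, and for c = 3 the separated eigenvalue should appear well beyond it, while for c = 0.5 the top two eigenvalues remain close together.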
Traffic Network Optimum Principle - Minimum Probability of Congestion Occurrence
We introduce an optimum principle for a vehicular traffic network with road bottlenecks. This network breakdown minimization (BM) principle states that the network optimum is reached when link flow rates are assigned in the network in such a way that the probability of spontaneous traffic breakdown at one of the network bottlenecks during a given observation time reaches its minimum possible value. Based on numerical simulations with a stochastic three-phase traffic flow model, we show that, in comparison to the well-known Wardrop principles, applying the BM principle permits considerably greater network inflow rates at which no traffic breakdown occurs and, therefore, free flow remains in the whole network.
Comment: 22 pages, 6 figures
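The BM principle can be illustrated with a deliberately simplified example (not the authors' stochastic three-phase model): suppose each of two alternative links has a breakdown probability that grows with its flow rate, here taken as a logistic curve purely for illustration, and assign the total inflow so that the probability of a breakdown anywhere in the network is minimal.

```python
import numpy as np

def p_breakdown(q, q_scale=1800.0, width=150.0):
    """Toy probability that a bottleneck breaks down during the
    observation time, as a function of its link flow q (veh/h).
    The logistic form and parameters are illustrative assumptions."""
    return 1.0 / (1.0 + np.exp(-(q - q_scale) / width))

def bm_split(total_flow, steps=2001):
    """Assign total_flow over two identical links so that the probability
    that at least one bottleneck breaks down is minimal (the BM idea),
    by a simple grid search over the split."""
    q1 = np.linspace(0.0, total_flow, steps)
    q2 = total_flow - q1
    p_any = 1.0 - (1.0 - p_breakdown(q1)) * (1.0 - p_breakdown(q2))
    i = int(np.argmin(p_any))
    return q1[i], q2[i], float(p_any[i])

q1, q2, p = bm_split(3000.0)
print(f"BM assignment: q1 = {q1:.0f}, q2 = {q2:.0f}, P(breakdown) = {p:.3f}")
```

With identical links the BM assignment splits the demand evenly; with asymmetric `p_breakdown` curves the grid search shifts flow toward the more robust link, which is the behavior the principle formalizes.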
Systems, interactions and macrotheory
A significant proportion of early HCI research was guided by one very clear vision: that the existing theory base in psychology and cognitive science could be developed to yield engineering tools for use in the interdisciplinary context of HCI design. While interface technologies and heuristic methods for behavioral evaluation have rapidly advanced in both capability and breadth of application, progress toward deeper theory has been modest, and some now believe it to be unnecessary. A case is presented for developing new forms of theory, based around generic "systems of interactors." An overlapping, layered structure of macro- and microtheories could then serve an explanatory role, and could also bind together contributions from the different disciplines. Novel routes to formalizing and applying such theories provide a host of interesting and tractable problems for future basic research in HCI.
On the Prior Sensitivity of Thompson Sampling
The empirically successful Thompson Sampling algorithm for stochastic bandits has drawn much interest in understanding its theoretical properties. One important benefit of the algorithm is that it allows domain knowledge to be conveniently encoded as a prior distribution to balance exploration and exploitation more effectively. While it is generally believed that the algorithm's regret is low (high) when the prior is good (bad), little is known about the exact dependence. In this paper, we fully characterize the algorithm's worst-case dependence of regret on the choice of prior, focusing on a special yet representative case. These results also provide insights into the general sensitivity of the algorithm to the choice of priors. In particular, in terms of the prior probability mass of the true reward-generating model, we prove regret upper bounds for the bad- and good-prior cases, as well as matching lower bounds. Our proofs rely on the discovery of a fundamental property of Thompson Sampling and make heavy use of martingale theory, both of which appear novel in the literature, to the best of our knowledge.
Comment: Appears in the 27th International Conference on Algorithmic Learning Theory (ALT), 201
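The qualitative effect of the prior can be seen in a minimal Bernoulli bandit sketch (a far simpler setting than the paper's analysis, with hypothetical Beta priors standing in for domain knowledge): a prior that favours the wrong arm must first be overcome by data, and regret accumulates in the meantime.

```python
import numpy as np

rng = np.random.default_rng(1)

def thompson_bernoulli(true_means, priors, horizon):
    """Bernoulli Thompson Sampling with independent Beta priors.
    priors[k] = (alpha, beta) for arm k; returns cumulative regret."""
    params = [list(p) for p in priors]
    best = max(true_means)
    regret = 0.0
    for _ in range(horizon):
        # Sample a mean estimate per arm from its posterior, play argmax.
        samples = [rng.beta(a, b) for a, b in params]
        k = int(np.argmax(samples))
        reward = int(rng.random() < true_means[k])
        params[k][0] += reward          # posterior update: successes
        params[k][1] += 1 - reward      # posterior update: failures
        regret += best - true_means[k]
    return regret

means = [0.7, 0.4]
good = thompson_bernoulli(means, [(30, 10), (10, 30)], 2000)   # prior agrees
bad = thompson_bernoulli(means, [(10, 30), (30, 10)], 2000)    # prior inverted
print(f"regret with good prior: {good:.1f}, with bad prior: {bad:.1f}")
```

The inverted prior concentrates posterior mass on the inferior arm, so the algorithm keeps playing it until enough evidence accumulates; the gap between the two regret figures is the prior sensitivity the paper quantifies in its worst-case bounds.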
The open cluster NGC 6520 and the nearby dark molecular cloud Barnard 86
Wide-field BVI photometry and CO(1-0) observations are presented in the region of the open cluster NGC 6520 and the dark molecular cloud Barnard 86. From the analysis of the optical data we find that the cluster is rather compact, with a radius of 1.0±0.5 arcmin, smaller than previous estimates. The cluster age is 150±50 Myr and the reddening is E(B-V) = 0.42±0.10. The distance from the Sun is estimated to be 1900±100 pc, larger than previous estimates. We finally derive basic properties of the dark nebula Barnard 86 on the assumption that it lies at the same distance as the cluster.
Comment: 21 pages, 8 eps figures (a few degraded in resolution), accepted for publication in the Astronomical Journal
A Study of Refraction-Diffraction and Sea Wave Propagation Transformation in the Waters off Tapak Paderi Beach, Bengkulu City
Tapak Paderi Beach in Bengkulu City is a beach whose function has shifted from a port area to a tourist beach area, owing in part to a lack of information about sea wave propagation in its waters. The aim of this study is to obtain information on wave propagation transformation, in particular wave refraction and diffraction, in the Tapak Paderi waters along a set of observation transects. Data processing consisted of converting ten years of wind data into per-season wave data and running a wave-propagation simulation model with BOUSS2D from the Surface-water Modeling System 11 package. Three days of field measurements in the Tapak Paderi waters gave a significant wave height of 1.6 m and a significant period of 7.75 s. Wind conversion with the SMB method produced a summary of per-season wave data for the ten-year span. The highest significant waves in the Tapak Paderi waters occur during the East monsoon and the second transitional season, with significant heights of 2.23 m and 2.0 m and significant periods of 7.12 s and 7.0 s, respectively. Waves in the Tapak Paderi waters arrive from two dominant directions: the northwest fetch group and the southern fetch group. Analysis of the model simulation along the observation transects shows that the waves undergo refraction along transect 1; diffraction behind the Patasembilan shoal (Gosong Patasembilan) along transect 2; diffraction behind the smaller Patasembilan shoal together with wave convergence at the Tapak Paderi headland along transect 3; and convergence at the headland with divergence in the bay along transect 4. The lowest wave height, 0.29 m, occurs during the second transitional season at the observation point behind the tip of the Tapak Paderi coastal structure.
Rethinking the patient: using Burden of Treatment Theory to understand the changing dynamics of illness
Background: In this article we outline Burden of Treatment Theory, a new model of the relationship between sick people, their social networks, and healthcare services. Health services face the challenge of growing populations with long-term and life-limiting conditions. They have responded by delegating to sick people and their networks routine work aimed at managing symptoms and at retarding, and sometimes preventing, disease progression. This is the new proactive work of patient-hood for which patients are increasingly accountable: founded on ideas about self-care, self-empowerment, and self-actualization, and on new technologies and treatment modalities which can be shifted from the clinic into the community. These place new demands on sick people, which they may experience as burdens of treatment.
Discussion: As the burdens accumulate, some patients are overwhelmed, and the likely consequences are poor healthcare outcomes for individual patients, increasing strain on caregivers, and rising demand for and costs of healthcare services. In the face of these challenges we need to better understand the resources that patients draw upon as they respond to the demands of both burdens of illness and burdens of treatment, and the ways that those resources interact with healthcare utilization.
Summary: Burden of Treatment Theory is oriented to understanding how capacity for action interacts with the work that stems from healthcare. It is a structural model that focuses on the work that patients and their networks do, and it thus helps us understand variations in healthcare utilization and adherence in different healthcare settings and clinical contexts.
The Clumping Transition in Niche Competition: a Robust Critical Phenomenon
We show analytically and numerically that the appearance of lumps and gaps in the distribution of n competing species along a niche axis is a robust phenomenon whenever the finiteness of the niche space is taken into account. In this case, depending on whether the niche width σ of the species is above or below a threshold σ_c, which for large n coincides with 2/n, there are two different regimes. For σ > σ_c the lumpy pattern emerges directly from the dominant eigenvector of the competition matrix because its corresponding eigenvalue becomes negative. For σ < σ_c the lumpy pattern disappears. Furthermore, this clumping transition exhibits critical slowing down as σ_c is approached from above. We also find that the number of lumps of species vs. σ displays a stair-step structure. The positions of these steps are distributed according to a power law. It is thus straightforward to predict the number of groups that can be packed along a niche axis, and it coincides with field measurements for a wide range of the model parameters.
Comment: 16 pages, 7 figures;
http://iopscience.iop.org/1742-5468/2010/05/P0500
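The negative-eigenvalue mechanism is easy to probe numerically. The sketch below is an illustrative simplification (n species evenly spaced on a periodic unit niche axis, Gaussian competition kernel; the paper's setting need not be periodic): it shows the smallest eigenvalue of the competition matrix turning negative once the niche width σ is well above the ~2/n scale quoted above.

```python
import numpy as np

def competition_matrix(n, sigma):
    """Gaussian competition kernel between n species evenly spaced on a
    periodic niche axis of unit length (illustrative assumption)."""
    x = np.arange(n) / n
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, 1.0 - d)          # periodic niche distance
    return np.exp(-((d / sigma) ** 2))

def min_eig(n, sigma):
    """Smallest eigenvalue of the competition matrix."""
    return float(np.linalg.eigvalsh(competition_matrix(n, sigma)).min())

# Narrow niches: all eigenvalues positive, no clumping.
# Wide niches (sigma well above ~2/n): a negative eigenvalue appears,
# and the corresponding eigenvector sets the lumpy pattern.
print(f"narrow: {min_eig(40, 0.02):+.3f}  wide: {min_eig(40, 0.5):+.3f}")
```

Plotting the eigenvector of the most negative eigenvalue for the wide-niche case would show the alternating (lumpy) structure along the axis that the abstract describes.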