4,412 research outputs found
Reinventing discovery learning: a field-wide research program
Whereas some educational designers believe that students should learn new concepts through explorative problem solving within dedicated environments that constrain key parameters of their search and then support their progressive appropriation of empowering disciplinary forms, others are critical of the ultimate efficacy of this discovery-based pedagogical philosophy, citing an inherent structural challenge of students constructing historically achieved conceptual structures from their ingenuous notions. This special issue presents six educational research projects that, while adhering to principles of discovery-based learning, are motivated by complementary philosophical stances and theoretical constructs. The editorial introduction frames the set of projects as collectively exemplifying the viability and breadth of discovery-based learning, even as these projects: (a) put to work a span of design heuristics, such as productive failure, surfacing implicit know-how, playing epistemic games, problem posing, or participatory simulation activities; (b) vary in their target content and skills, including building electric circuits, solving algebra problems, driving safely in traffic jams, and performing martial-arts maneuvers; and (c) employ different media, such as interactive computer-based modules for constructing models of scientific phenomena or mathematical problem situations, networked classroom collective "video games," and intercorporeal master–student training practices. The authors of these papers consider the potential generativity of their design heuristics across domains and contexts
Data-Oblivious Stream Productivity
We are concerned with demonstrating productivity of specifications of
infinite streams of data, based on orthogonal rewrite rules. In general, this
property is undecidable, but for restricted formats computable sufficient
conditions can be obtained. The usual analysis disregards the identity of data,
thus leading to approaches that we call data-oblivious. We present a method
that is provably optimal among all such data-oblivious approaches. This means
that in order to improve on the algorithm in this paper one has to proceed in a
data-aware fashion
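As a loose illustration of the notions involved (not the orthogonal-rewrite-rule formalism of the paper), the following Python sketch writes a few stream definitions as generators; a data-oblivious analysis inspects only how many elements each definition consumes and produces per step, never the data values themselves.

```python
# Illustrative only: stream "specifications" written as Python generators,
# not the orthogonal rewrite rules treated in the paper.
from itertools import count, islice

def zip_streams(s, t):
    """zip(x:s, t) -> x : zip(t, s) -- alternates elements of two streams."""
    while True:
        yield next(s)        # produces one element per step: productive
        s, t = t, s

def evens(s):
    """even(x:y:s) -> x : even(s) -- drops every second element."""
    while True:
        x = next(s); next(s) # consumes two elements ...
        yield x              # ... to produce one: still productive

def bad(s):
    """bad(x:s) -> bad(s) -- consumes forever, never produces: not productive."""
    while True:
        next(s)
    yield                    # unreachable: no element is ever delivered

# A data-oblivious analysis tracks only these consume/produce counts
# (1 in / 1 out, 2 in / 1 out, 1 in / 0 out), never the values themselves.
print(list(islice(evens(count()), 5)))   # [0, 2, 4, 6, 8]
```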
Comparison of analgesic effects and patient tolerability of nabilone and dihydrocodeine for chronic neuropathic pain: randomised, crossover, double blind study
Objective: To compare the analgesic efficacy and side effects of the synthetic cannabinoid nabilone with those of the weak opioid dihydrocodeine for chronic neuropathic pain.
Design: Randomised, double blind, crossover trial of 14 weeks' duration comparing dihydrocodeine and nabilone.
Setting: Outpatient units of three hospitals in the United Kingdom.
Participants: 96 patients with chronic neuropathic pain, aged 23-84 years.
Main outcome measures: The primary outcome was the difference between nabilone and dihydrocodeine in pain, as measured by the mean visual analogue score computed over the last 2 weeks of each treatment period. Secondary outcomes were changes in mood, quality of life, sleep, and psychometric function. Side effects were measured by a questionnaire.
Intervention: Patients received a maximum daily dose of 240 mg dihydrocodeine or 2 mg nabilone at the end of each escalating treatment period of 6 weeks. Treatment periods were separated by a 2 week washout period.
Results: Mean baseline visual analogue score was 69.6 mm (range 29.4-95.2) on a 0-100 mm scale. 73 patients were included in the available case analysis and 64 patients in the per protocol analysis. The mean score was 6.0 mm higher for nabilone than for dihydrocodeine (95% confidence interval 1.4 to 10.5) in the available case analysis and 5.6 mm (10.3 to 0.8) in the per protocol analysis. Side effects were more frequent with nabilone.
Conclusion: Dihydrocodeine provided better pain relief than the synthetic cannabinoid nabilone and had slightly fewer side effects, although no major adverse events occurred for either drug
Does clinical management improve outcomes following self-Harm? Results from the multicentre study of self-harm in England
Background
Evidence to guide clinical management of self-harm is sparse, trials have recruited selected samples, and psychological treatments that are suggested in guidelines may not be available in routine practice.
Aims
To examine how the management that patients receive in hospital relates to subsequent outcome.
Methods
We identified episodes of self-harm presenting to three UK centres (Derby, Manchester, Oxford) over a 10 year period (2000 to 2009). We used established data collection systems to investigate the relationship between four aspects of management (psychosocial assessment, medical admission, psychiatric admission, referral for specialist mental health follow up) and repetition of self-harm within 12 months, adjusted for differences in baseline demographic and clinical characteristics.
Results
35,938 individuals presented with self-harm during the study period. In two of the three centres, receiving a psychosocial assessment was associated with a 40% lower risk of repetition, hazard ratios (95% CIs): Centre A 0.99 (0.90–1.09); Centre B 0.59 (0.48–0.74); Centre C 0.59 (0.52–0.68). There was little indication that the apparent protective effects were mediated through referral and follow up arrangements. The association between psychosocial assessment and a reduced risk of repetition appeared to be least evident in those from the most deprived areas.
Conclusion
These findings add to the growing body of evidence that thorough assessment is central to the management of self-harm, but further work is needed to elucidate the possible mechanisms and explore the effects in different clinical subgroups
Exact resultants for corner-cut unmixed multivariate polynomial systems using the dixon formulation
Structural conditions on the support of a multivariate polynomial system are developed for which the Dixon-based resultant methods compute exact resultants. For cases when this cannot be done, an upper bound on the degree of the extraneous factor in the projection operator can be determined a priori, thus resulting in quick identification of the extraneous factor in the projection operator. (For the bivariate case, the degree of the extraneous factor in a projection operator can be determined a priori.) The concepts of a corner-cut support and almost corner-cut support of an unmixed polynomial system are introduced. For generic unmixed polynomial systems with corner-cut and almost corner-cut supports, the Dixon-based methods can be used to compute their resultants exactly. These structural conditions on supports are based on analyzing how such supports differ from box supports of n-degree systems for which the Dixon formulation is known to compute the resultants exactly. Such an analysis also gives a sharper bound on the complexity of resultant computation using the Dixon formulation in terms of the support and the mixed volume of the Newton polytope of the support. These results are a direct generalization of the authors' results on bivariate systems, including the results of Zhang and Goldman as well as of Chionh for generic unmixed bivariate polynomial systems with corner-cut supports
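For readers unfamiliar with resultants, the hedged sketch below uses sympy's generic (Sylvester-style) resultant on an arbitrary toy bivariate system to show what eliminating a variable achieves; it does not implement the Dixon formulation or the corner-cut analysis of the paper, and the example polynomials are not taken from it.

```python
# Illustrative sketch: eliminating y from a small bivariate system with a
# Sylvester-style resultant in sympy. The paper's Dixon formulation is a
# different, more compact construction for the multivariate unmixed case.
from sympy import symbols, resultant, factor

x, y = symbols('x y')

# Two arbitrary example polynomials (not from the paper).
f = x**2 + y**2 - 1          # unit circle
g = x - y                    # line y = x

# The resultant w.r.t. y vanishes exactly at the x-values where f and g
# share a common root in y.
R = resultant(f, g, y)
print(factor(R))             # 2*x**2 - 1  ->  common solutions at x = ±1/sqrt(2)
```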
Generating all polynomial invariants in simple loops
This paper presents a method for automatically generating all polynomial invariants in simple loops. It is first shown that the set of polynomials serving as loop invariants has the algebraic structure of an ideal. Based on this connection, a fixpoint procedure using operations on ideals and Gröbner basis constructions is proposed for finding all polynomial invariants. Most importantly, it is proved that the procedure terminates in at most m+1 iterations, where m is the number of program variables. The proof relies on showing that the irreducible components of the varieties associated with the ideals generated by the procedure either remain the same or increase their dimension at every iteration of the fixpoint procedure. This yields a correct and complete algorithm for inferring conjunctions of polynomial equalities as invariants. The method has been implemented in Maple using the Groebner package. The implementation has been used to automatically discover non-trivial invariants for several examples to illustrate the power of the technique
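A minimal sketch of the underlying idea, using sympy and a hypothetical toy loop that is not taken from the paper: a polynomial equality is an inductive invariant when it holds in the initial state and its image under the loop update lies in the ideal it generates, which can be checked by Gröbner-basis reduction.

```python
# Minimal sketch (not the paper's full fixpoint procedure): checking that a
# candidate polynomial equality is an inductive invariant of a simple loop
# via ideal membership, using Groebner bases in sympy.
from sympy import symbols, groebner, expand

x, y = symbols('x y')

# Hypothetical example loop (not from the paper):
#   x, y = 0, 0
#   while ...: x, y = x + 1, y + 2*x + 1
p = y - x**2                      # candidate invariant: y - x^2 = 0

# Initiation: p vanishes on the initial state (x, y) = (0, 0).
assert p.subs({x: 0, y: 0}) == 0

# Consecution: p after one loop iteration must lie in the ideal <p>.
p_next = expand(p.subs({x: x + 1, y: y + 2*x + 1}, simultaneous=True))
G = groebner([p], x, y, order='lex')
_, remainder = G.reduce(p_next)
assert remainder == 0             # p_next reduces to 0 mod <p>: invariant preserved
print("y - x**2 = 0 is an inductive invariant of the toy loop")
```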
Capturing Hiproofs in HOL Light
Hierarchical proof trees (hiproofs for short) add structure to ordinary proof
trees, by allowing portions of trees to be hierarchically nested. The
additional structure can be used to abstract away from details, or to label
particular portions to explain their purpose. In this paper we present two
complementary methods for capturing hiproofs in HOL Light, along with a tool to
produce web-based visualisations. The first method uses tactic recording, by
modifying tactics to record their arguments and construct a hierarchical tree;
this allows a tactic proof script to be modified. The second method uses proof
recording, which extends the HOL Light kernel to record hierarchical proof trees
alongside theorems. This method is less invasive, but requires care to manage
the size of the recorded objects. We have implemented both methods, resulting
in two systems: Tactician and HipCam
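As a rough illustration of what hierarchical nesting adds over an ordinary proof tree, here is a toy Python data structure; it is not the HOL Light implementation described in the paper, and the tactic names are only borrowed as labels.

```python
# Illustrative only: a toy hierarchical proof tree ("hiproof"-like) structure,
# where labelled boxes nest portions of an ordinary proof tree.
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Atomic:
    tactic: str                      # a single primitive proof step

@dataclass
class Box:
    label: str                       # names the purpose of a nested sub-proof
    children: List["Node"] = field(default_factory=list)

Node = Union[Atomic, Box]

def render(node: Node, depth: int = 0) -> str:
    """Indent nested boxes so the hierarchy is visible; leaves show tactics."""
    pad = "  " * depth
    if isinstance(node, Atomic):
        return f"{pad}{node.tactic}"
    body = "\n".join(render(c, depth + 1) for c in node.children)
    return f"{pad}[{node.label}]\n{body}"

# Hypothetical proof skeleton:
proof = Box("main goal", [
    Box("rewrite phase", [Atomic("REWRITE_TAC"), Atomic("SIMP_TAC")]),
    Atomic("ARITH_TAC"),
])
print(render(proof))
```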
Antibody localization in horse, rabbit, and goat antilymphocyte sera
The localization of antibodies was studied in rabbit, goat, and horse ALS raised by weekly immunization with canine or human spleen cells for 4 to 12 weeks. A combination of analytic techniques was used including column chromatography, electrophoresis, immunoelectrophoresis, determination of protein concentration, and measurement of antibody titers. In the rabbit and goat ALS, virtually all of the leukoagglutinins and lymphocytotoxins were in the easily separable IgG; accidentally induced thromboagglutinins were in the same location. In the rabbit, hemagglutinins were found in both the IgG and IgM, whereas in the goat these were almost exclusively in the IgM. The antiwhite cell antibodies were most widely distributed in the horse. The cytotoxins were primarily in the IgG, but the leukoagglutinins were most heavily concentrated in the T-equine globulin which consists mostly of IgA. By differential ammonium sulfate precipitation of a horse antidog lymphocyte serum, fractions were prepared that were rich in IgG and IgA. Both were able to delay the rejection of canine renal homografts, the IgA-rich preparation to a somewhat greater degree. The findings in this study have been discussed in relation to the refining techniques that have been used for the production of globulin from heterologous ALS
An Information-Theoretic Solution to Parameter Setting
In this paper, we point out a possible way by which the child could obtain the target values of the word order parameters for her language. The essential idea is an entropy-based statistical analysis of the input stream
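A toy sketch of the kind of quantity such an analysis might track, assuming a hypothetical sample of utterances already classified by surface order; this is not the paper's actual procedure.

```python
# Toy illustration (not the paper's procedure): estimating the entropy of
# simple word-order patterns in a sample of input, as one ingredient an
# entropy-based learner might use when setting an order parameter.
from collections import Counter
from math import log2

# Hypothetical tagged utterances: each classified as verb-object (VO)
# or object-verb (OV) surface order.
observed = ["VO", "VO", "OV", "VO", "VO", "VO", "OV", "VO"]

counts = Counter(observed)
total = sum(counts.values())
entropy = -sum((n / total) * log2(n / total) for n in counts.values())

print(counts)                            # Counter({'VO': 6, 'OV': 2})
print(f"H(order) = {entropy:.3f} bits")  # low entropy -> strong evidence for one setting
```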
Rectification by charging -- the physics of contact-induced current asymmetry in molecular conductors
We outline the qualitatively different physics behind charging-induced
current asymmetries in molecular conductors operating in the weakly interacting
self-consistent field (SCF) and the strongly interacting Coulomb Blockade (CB)
regimes. A conductance asymmetry arises in SCF because of the unequal
mean-field potentials that shift a closed-shell conducting level differently
for positive and negative bias. A very different current asymmetry arises for
CB due to the unequal number of open-shell excitation channels at opposite bias
voltages. The CB regime, dominated by single charge effects, typically requires
a computationally demanding many-electron or Fock space description. However,
our analysis of molecular Coulomb Blockade measurements reveals that many novel
signatures can be explained using a simpler orthodox model that involves an
incoherent sum of Fock space excitations and hence treats the molecule as a
metallic dot or an island. This also reduces the complexity of the Fock space
description by including only the various charge configurations, thus partially
setting aside the details of the electronic structure while retaining the
essence of the single-charge nature of the transport process. We finally point
out, however, that the inclusion of electronic structure, and hence of
well-resolved Fock space excitations, is crucial in some notable examples
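A toy numerical sketch of the SCF mechanism described above, with made-up parameters rather than anything from the paper: a single broadened level whose position follows an unequal division of the applied bias carries different currents at +V and -V.

```python
# Toy sketch (not the paper's model or parameters): a single molecular level
# shifted unequally by the bias produces a current asymmetry in the
# self-consistent-field picture. All numbers below are invented.
import numpy as np

kT     = 0.025        # eV, thermal energy
eps0   = 0.30         # eV, level position at zero bias
gL, gR = 0.01, 0.01   # eV, lead couplings (broadening)
eta    = 0.7          # unequal voltage division: level tracks one contact more

def fermi(E, mu):
    return 1.0 / (1.0 + np.exp((E - mu) / kT))

def current(V, n_grid=4000):
    """Landauer-style current through one Lorentzian level shifted by the bias."""
    muL, muR = +V / 2, -V / 2
    eps = eps0 + eta * muR + (1 - eta) * muL          # mean-field-like level shift
    E = np.linspace(-1.5, 1.5, n_grid)
    dE = E[1] - E[0]
    T = gL * gR / ((E - eps)**2 + ((gL + gR) / 2)**2) # Lorentzian transmission
    return np.sum(T * (fermi(E, muL) - fermi(E, muR))) * dE

for V in (0.8, -0.8):
    print(f"V = {V:+.1f} V  ->  I (arb. units) = {current(V):+.4f}")
# |I(+V)| != |I(-V)|: the unequal potential shift alone rectifies the current.
```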
- …