A pragmatic cluster randomized controlled trial of an educational intervention for GPs in the assessment and management of depression
Background. General practitioners (GPs) can be provided with effective training in the skills to manage depression. However, it remains uncertain whether such training achieves health gain for their patients.
Method. The study aimed to measure the health gain from training GPs in skills for the assessment and management of depression. The study design was a cluster randomized controlled trial. GP participants were assessed for recognition of psychological disorders, attitudes to depression, prescribing patterns, and experience of psychiatry and communication skills training. They were then randomized to receive training at baseline or at the end of the study. Patients selected by GPs were assessed at baseline, 3 and 12 months. The primary outcome was depression status, measured by the HAM-D. Secondary outcomes were psychiatric symptoms (GHQ-12), quality of life (SF-36), satisfaction with consultations, and health service use and costs.
Results. Thirty-eight GPs were recruited and 36 (95%) completed the study. They selected 318 patients, of whom 189 (59%) were successfully recruited. At 3 months there were no significant differences between intervention and control patients on the HAM-D, GHQ-12 or SF-36. At 12 months there was a positive training effect in two domains of the SF-36, but no differences in HAM-D, GHQ-12 or health care costs. Patients rated trained GPs as somewhat better at listening and understanding, but not on the other aspects of satisfaction.
Conclusions. Although training programmes may improve GPs' skills in managing depression, this does not appear to translate into health gain for depressed patients or the health service.
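The abstract does not specify the statistical model, but a cluster randomized comparison of this kind is commonly analysed with a mixed-effects model that treats patients as nested within GPs. The sketch below is illustrative only; the column names (hamd_3m, baseline_hamd, arm, gp_id) and the data file are hypothetical, not the authors' dataset or analysis.

```python
# Minimal sketch, not the trial's actual analysis: mixed-effects model for the
# 3-month HAM-D comparison, with a random intercept per GP to account for
# clustering of patients within practices.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trial_data.csv")  # hypothetical patient-level file

model = smf.mixedlm("hamd_3m ~ arm + baseline_hamd", data=df, groups=df["gp_id"])
result = model.fit()
print(result.summary())  # the 'arm' coefficient estimates the training effect
```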
Evidence for the reliability and validity, and some support for the practical utility of the two-factor Consideration of Future Consequences Scale-14
Researchers have proposed 1-factor, 2-factor, and bifactor solutions to the 12-item Consideration of Future Consequences Scale (CFCS-12). In order to overcome some measurement problems and to create a robust and conceptually useful two-factor scale, the CFCS-12 was recently modified to include two new items, becoming the CFCS-14. Using a university sample, we tested four competing models for the CFCS-14: (a) a 12-item unidimensional model, (b) a model fitted for two uncorrelated factors (CFC-Immediate and CFC-Future), (c) a model fitted for two correlated factors (CFC-I and CFC-F), and (d) a bifactor model. Results suggested that the addition of the two new items has strengthened the viability of a two-factor solution for the CFCS-14. Results of linear regression models suggest that the CFC-F factor is redundant. Further studies using alcohol and mental health indicators are required to test this redundancy.
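As an illustration of what a two-factor scoring of the CFCS-14 involves, the sketch below computes subscale scores, their internal consistency, and the inter-factor correlation. The item-to-factor split and the data file are placeholders, not the published scoring key or the study's data.

```python
# Illustrative sketch only: score the two hypothesised CFCS-14 factors and check
# reliability and the CFC-I/CFC-F correlation. The 7/7 item split is a placeholder.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

responses = pd.read_csv("cfcs14_responses.csv")  # hypothetical item-level data
cfc_i = responses[[f"item{i}" for i in range(1, 8)]]    # placeholder CFC-Immediate items
cfc_f = responses[[f"item{i}" for i in range(8, 15)]]   # placeholder CFC-Future items

print(f"alpha CFC-I = {cronbach_alpha(cfc_i):.2f}")
print(f"alpha CFC-F = {cronbach_alpha(cfc_f):.2f}")
print(f"r(CFC-I, CFC-F) = {cfc_i.mean(axis=1).corr(cfc_f.mean(axis=1)):.2f}")
```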
Measuring time perspective in adolescents: can you get the right answer by asking the wrong questions?
Time perspective continues to evolve as a psychological construct. The extant literature suggests that higher future orientation and lower present orientation are associated with better developmental outcomes, but also that issues remain with the measurement of the construct. Recently, a 25-item version of the Zimbardo Time Perspective Inventory (ZTPI-25) was suggested for use based on high internal consistency estimates and good discriminant validity of scores in a sample of Italian adolescents. However, the genesis of this scale is uncertain. The present study examined the factorial validity, reliability, and concurrent validity of ZTPI-25 scores in Slovenian, American, and British adolescents. Results revealed satisfactory concurrent validity based on correlations with measures used in the development of the full ZTPI. However, internal consistency and factorial validity of scores were unsatisfactory. The present study questions the use of the ZTPI-25 with adolescents in the context of conceptual and measurement issues more broadly.
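A factorial-validity check of this kind can be approximated with an exploratory five-factor fit and an inspection of the loadings; weak or cross-loading items flag the problems the paper reports. This is an assumed workflow, not the study's analysis, and the data file and column names are hypothetical.

```python
# Minimal sketch: fit a five-factor exploratory model to ZTPI-25 item responses
# and inspect whether items load on their intended subscales.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = pd.read_csv("ztpi25_items.csv")  # 25 item columns, one row per adolescent
fa = FactorAnalysis(n_components=5, random_state=0)
fa.fit(items.values)

# Rows = factors, columns = items; small or scattered loadings indicate poor
# factorial validity of the short form.
loadings = pd.DataFrame(fa.components_, columns=items.columns)
print(loadings.round(2))
```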
Nonperturbative renormalization group in a light-front three-dimensional real scalar model
The three-dimensional real scalar model, in which the Z_2 symmetry spontaneously breaks, is renormalized in a nonperturbative manner based on the Tamm-Dancoff truncation of the Fock space. A critical line is calculated by diagonalizing the Hamiltonian regularized with basis functions. The dependence of the critical line on the marginal coupling is weak. In the broken phase the canonical Hamiltonian is tachyonic, so the field is shifted by a constant. The shifted value is determined as a function of the running mass and coupling so that the mass of the ground state vanishes.
Comment: 23 pages, LaTeX, 6 Postscript figures, uses revTeX and epsbox.sty. A slight revision of statements made, some references added, typos corrected
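The procedure of diagonalizing a truncated Hamiltonian in a finite basis and locating the point where the ground-state mass gap vanishes can be illustrated with a toy model. The Hamiltonian below is a stand-in, not the paper's light-front Hamiltonian, and the parameters are arbitrary.

```python
# Toy illustration: diagonalize a small truncated Hamiltonian and scan the bare
# mass-squared until the lowest eigenvalue crosses zero, locating a point on a
# "critical line" for a fixed coupling.
import numpy as np

def toy_hamiltonian(m2: float, g: float, n_basis: int = 20) -> np.ndarray:
    """Illustrative truncated Hamiltonian: diagonal free part plus nearest-state mixing."""
    n = np.arange(n_basis)
    h = np.diag(m2 + n)                                        # free basis-state energies
    h = h + g * (np.eye(n_basis, k=1) + np.eye(n_basis, k=-1)) # coupling-dependent mixing
    return h

def ground_state_gap(m2: float, g: float) -> float:
    return float(np.linalg.eigvalsh(toy_hamiltonian(m2, g))[0])

g = 0.5
m2_grid = np.linspace(-2.0, 2.0, 401)
gaps = np.array([ground_state_gap(m2, g) for m2 in m2_grid])
m2_critical = m2_grid[np.argmin(np.abs(gaps))]  # where the gap vanishes
print(f"toy critical mass-squared at g={g}: {m2_critical:.3f}")
```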
A new mechanism for negative refraction and focusing using selective diffraction from surface corrugation
Refraction at a smooth interface is accompanied by momentum transfer normal to the interface. We show that corrugating an initially smooth, totally reflecting, non-metallic interface provides a momentum kick parallel to the surface, which can be used to refract light negatively or positively. This new mechanism of negative refraction is demonstrated by visible light and microwave experiments on grisms (grating-prisms). Single-beam all-angle negative refraction is achieved by incorporating a surface grating on a flat multilayered material. This negative refraction mechanism is used to create a new optical device, a grating lens. A plano-concave grating lens is demonstrated to focus plane microwaves to a point image. These results show that customized surface engineering can be used to achieve negative refraction even though the bulk material has a positive refractive index. The surface periodicity provides a tunable parameter to control beam propagation, leading to novel optical and microwave devices.
Comment: 6 pages, 7 figures in RevTeX format
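The parallel momentum kick from the corrugation is captured by the standard transmission grating equation, n_out·sin(theta_m) = n_in·sin(theta_i) + m·lambda/d. The short check below uses illustrative numbers (not the experimental parameters) to show how a first-order beam can emerge on the negative side of the normal while both bulk indices stay positive.

```python
# Back-of-envelope check of the grating mechanism with illustrative parameters.
import numpy as np

def diffraction_angle_deg(theta_i_deg, wavelength, period, m, n_in=1.5, n_out=1.0):
    """Angle of diffraction order m leaving the corrugated interface, or None if evanescent."""
    s = (n_in * np.sin(np.radians(theta_i_deg)) + m * wavelength / period) / n_out
    return None if abs(s) > 1 else float(np.degrees(np.arcsin(s)))

# m = 0 refracts positively; m = -1 emerges at a negative angle (negative refraction).
for m in (0, -1):
    angle = diffraction_angle_deg(theta_i_deg=30.0, wavelength=3.0e-2, period=3.0e-2, m=m)
    print(f"order {m}: {angle}")
```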
Dangerous implications of a minimum length in quantum gravity
The existence of a minimum length and a generalization of the Heisenberg uncertainty principle seem to be two fundamental ingredients required in any consistent theory of quantum gravity. In this letter we show that they would predict dangerous processes which are phenomenologically unacceptable. For example, long-lived virtual super-Planck-mass black holes may lead to rapid proton decay. Possible solutions of this puzzle are briefly discussed.
Comment: 5 pages, no figure. v3: refereed version
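For reference, a commonly quoted form of the generalized uncertainty principle implies a minimum length of order the Planck length; the letter's precise conventions may differ, so the relation below is a standard form rather than the one used in the paper (beta is a dimensionless parameter of order unity).

```latex
% Standard GUP form (assumed convention) and the resulting minimum length
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left[1 + \beta\,\frac{\ell_{\mathrm{Pl}}^{2}}{\hbar^{2}}\,(\Delta p)^{2}\right]
\quad\Longrightarrow\quad
\Delta x_{\min} \;=\; \sqrt{\beta}\,\ell_{\mathrm{Pl}}
```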
Optical and Infrared Spectroscopy
Contains research objectives and reports on two research projects. Joint Services Electronics Programs (U.S. Army, U.S. Navy, and U.S. Air Force) under Contract DA 36-039-AMC-03200(E); U.S. Air Force (ESD Contract AF19(628)-6066); Sloan Fund for Basic Research (M.I.T. Grant)
Making mentoring work: The need for rewiring epistemology
To help produce expert coaches at both participation and performance levels, a number of governing bodies have established coach mentoring systems. In light of the limited literature on coach mentoring, as well as the risks of superficial treatment by coach education systems, this paper critically discusses the role of the mentor in coach development, the nature of the mentor-mentee relationship and, most specifically, how expertise in the mentee may best be developed. If mentors are to be effective in developing expert coaches, we argue that a focus on personal epistemology is required. On this basis, we present a framework that conceptualizes mentee development on this level as a step-by-step progression, rather than an unrealistic and unachievable leap toward expertise. Finally, we consider the resulting implications for practice and research with respect to one-on-one mentoring, communities of practice, and formal coach education.
Architectural mismatch tolerance
The integrity of complex software systems built from existing components is becoming more dependent on the integrity of the mechanisms used to interconnect these components and, in particular, on the ability of these mechanisms to cope with architectural mismatches that might exist between components. There is a need to detect and handle (i.e. to tolerate) architectural mismatches at runtime, because in the majority of practical situations it is impossible to localize and correct all such mismatches during development. When developing complex software systems, the problem is not only to identify the appropriate components, but also to make sure that these components are interconnected in a way that allows mismatches to be tolerated. The resulting architectural solution should be a system based on the existing components, which are independent in their nature but able to interact in well-understood ways. To find such a solution we apply general principles of fault tolerance to dealing with architectural mismatches.
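One way to picture this idea is a connector that wraps an existing component, checks at runtime the assumptions the integrator makes about its interface, and hands detected mismatches to a recovery strategy. The sketch below is illustrative of that general pattern, not the mechanism proposed in the paper; the class and check choices are assumptions.

```python
# Illustrative mismatch-tolerant connector: detect interface mismatches at
# runtime and delegate them to a recovery handler instead of failing outright.
from typing import Any, Callable

class ArchitecturalMismatch(Exception):
    """Raised when a component violates an assumption made about its interface."""

class TolerantConnector:
    def __init__(self, component: Any, on_mismatch: Callable[[Exception], Any]):
        self.component = component
        self.on_mismatch = on_mismatch  # recovery strategy: retry, adapt, degrade, ...

    def call(self, method: str, *args, **kwargs) -> Any:
        # Structural mismatch: the expected operation is missing entirely.
        if not hasattr(self.component, method):
            return self.on_mismatch(ArchitecturalMismatch(f"missing operation: {method}"))
        try:
            result = getattr(self.component, method)(*args, **kwargs)
        except TypeError as exc:  # e.g. signature mismatch between the components
            return self.on_mismatch(ArchitecturalMismatch(str(exc)))
        if result is None:  # example behavioural assumption: the operation returns a value
            return self.on_mismatch(ArchitecturalMismatch(f"{method} returned no value"))
        return result
```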
Context Dependence, MOPs, WHIMs and Procedures: Recanati and Kaplan on Cognitive Aspects in Semantics
After presenting Kripke's criticism of Frege's ideas on the context dependence of thoughts, I present two recent attempts to accommodate cognitive aspects of context-dependent expressions within a truth-conditional pragmatics or semantics: Recanati's non-descriptive modes of presentation (MOPs) and Kaplan's ways of having in mind (WHIMs). After analysing the two attempts and verifying which answers they should give to the problem discussed by Kripke, I suggest a possible interpretation of these attempts: to insert a procedural or algorithmic level into the semantic representations of indexicals. That a function may be computed by different procedures might suggest new possibilities for integrating contextual cognitive aspects into model-theoretic semantics.
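The closing suggestion can be illustrated with a toy example of my own (not the author's formalism): the character of "I" maps contexts to referents, and two distinct procedures can compute that same function, which is where a procedural level could carry the cognitive difference.

```python
# Same function from contexts to referents, computed by two different procedures.
from dataclasses import dataclass

@dataclass
class Context:
    agent: str
    time: str
    utterance_log: dict  # e.g. records which person produced the token

def referent_of_I_via_agent(c: Context) -> str:
    # Procedure 1: read the agent parameter of the context directly.
    return c.agent

def referent_of_I_via_token(c: Context) -> str:
    # Procedure 2: recover the producer of the token from the utterance record.
    return c.utterance_log["speaker"]

c = Context(agent="Anna", time="t1", utterance_log={"speaker": "Anna"})
assert referent_of_I_via_agent(c) == referent_of_I_via_token(c)  # same extension, different procedures
```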