
    Behavioral Inhibition as a Risk Factor for the Development of Childhood Anxiety Disorders: A Longitudinal Study

    This longitudinal study examined the additive and interactive effects of behavioral inhibition and a wide range of other vulnerability factors in the development of anxiety problems in youths. A sample of 261 children aged 5 to 8 years (124 behaviorally inhibited and 137 control children) was followed over a 3-year period. Assessments took place on three occasions to measure children's level of behavioral inhibition, anxiety disorder symptoms, other psychopathological symptoms, and a number of other vulnerability factors such as insecure attachment, negative parenting styles, adverse life events, and parental anxiety. Results obtained with structural equation modeling indicated that behavioral inhibition primarily acted as a specific risk factor for the development of social anxiety symptoms. Furthermore, the longitudinal model showed additive as well as interactive effects of various vulnerability factors on the development of anxiety symptoms: main effects of anxious rearing and parental trait anxiety were found, whereas behavioral inhibition and attachment had an interactive effect on anxiety symptomatology. Moreover, behavioral inhibition itself was also influenced by some of the vulnerability factors. These results support dynamic, multifactorial models of the etiology of child anxiety problems.
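
    The abstract's distinction between additive (main) effects and an interactive effect can be made concrete with a small regression analogue. This is not the authors' structural equation model; it is a minimal sketch on synthetic data, with hypothetical variable names standing in for the study's constructs, showing how main effects and a moderation (interaction) term are specified and estimated.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 261  # same sample size as the study; the data here are entirely synthetic

    # hypothetical variable names standing in for the study's constructs
    df = pd.DataFrame({
        "behavioral_inhibition": rng.normal(size=n),
        "insecure_attachment":   rng.normal(size=n),
        "anxious_rearing":       rng.normal(size=n),
        "parental_anxiety":      rng.normal(size=n),
    })
    # generate an outcome with two main effects and one interaction, plus noise
    df["anxiety_symptoms"] = (
        0.4 * df["anxious_rearing"] + 0.3 * df["parental_anxiety"]
        + 0.5 * df["behavioral_inhibition"] * df["insecure_attachment"]
        + rng.normal(size=n)
    )

    # main effects of rearing and parental anxiety, plus a BI x attachment interaction
    model = smf.ols(
        "anxiety_symptoms ~ behavioral_inhibition * insecure_attachment"
        " + anxious_rearing + parental_anxiety",
        data=df,
    ).fit()
    print(model.summary())
    ```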

    Contributions to the Power Spectrum of Cosmic Microwave Background from Fluctuations Caused by Clusters of Galaxies

    We estimate the contributions to the cosmic microwave background radiation (CMBR) power spectrum from the static and kinematic Sunyaev-Zel'dovich (SZ) effects, and from the moving cluster of galaxies (MCG) effect. We conclude, in agreement with other studies, that at sufficiently small scales secondary fluctuations caused by clusters provide important contributions to the CMBR. At $\ell \gtrsim 3000$, these secondary fluctuations become important relative to lensed primordial fluctuations. Gravitational lensing at small angular scales has been proposed as a way to break the "geometric degeneracy" in determining fundamental cosmological parameters. We show that this method requires the separation of the static SZ effect, whereas the kinematic SZ effect and the MCG effect are less important. The power spectrum of secondary fluctuations caused by clusters of galaxies, if separated from the spectrum of lensed primordial fluctuations, might provide an independent constraint on several important cosmological parameters. Comment: LaTeX, 41 pages and 10 figures. Accepted for publication in the Astrophysical Journal.

    The Abnormally Weighting Energy Hypothesis: the Missing Link between Dark Matter and Dark Energy

    We generalize tensor-scalar theories of gravitation by introducing an abnormally weighting type of energy. This theory of tensor-scalar anomalous gravity is based on a relaxation of the weak equivalence principle, which is now restricted to ordinary visible matter only. As a consequence, the convergence mechanism toward general relativity is modified and naturally produces cosmic acceleration as an inescapable gravitational feedback induced by the mass variation of some invisible sector. The cosmological implications of this new theoretical framework are studied. From the Hubble diagram cosmological test alone, this theory provides an estimate of the amount of baryons and dark matter in the Universe that is consistent with the independent cosmological tests of the Cosmic Microwave Background (CMB) and Big Bang Nucleosynthesis (BBN). Cosmic coincidence is achieved naturally from an equally natural assumption on the amplitude of the scalar coupling strength. Finally, from the fit to supernovae data, we derive a new intriguing relation between the space-time dependences of the gravitational coupling and the dark matter mass, providing an example of a crucial constraint on microphysics from cosmology. This hints at an enticing new symmetry between the visible and invisible sectors, namely that the scalar charges of visible and invisible matter are exactly opposite. Comment: 24 pages, 6 figures, new version with extended discussions and added references. Accepted for publication in JCAP (Sept. 2008).

    Inflationary attractor in Gauss-Bonnet brane cosmology

    The inflationary attractor properties of the canonical scalar field and the Born-Infeld field are investigated in the Randall-Sundrum II scenario with a Gauss-Bonnet term in the bulk action. We find that the inflationary attractor property always holds for both the canonical and Born-Infeld fields for any allowed non-negative Gauss-Bonnet coupling. We also briefly discuss the possibility of simultaneously explaining the suppressed lower multipoles and the running scalar spectral index in the scenario of Gauss-Bonnet brane inflation. Comment: 7 pages, no figures. An error in the discussion of the BI field corrected, conclusion corrected.

    Modelling Clock Synchronization in the Chess gMAC WSN Protocol

    We present a detailed timed automata model of the clock synchronization algorithm currently used in a wireless sensor network (WSN) developed by the Dutch company Chess. Using the Uppaal model checker, we establish that in certain cases a static, fully synchronized network may eventually become unsynchronized if the current algorithm is used, even in a setting with infinitesimal clock drifts.
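
    A toy simulation can illustrate the kind of question the Uppaal model addresses, though it is emphatically not the Chess gMAC algorithm: here, nodes in a line topology drift slightly each frame and average their phase with any neighbour whose offset is still within a guard time. Random simulation of such simplified schemes tends to look healthy, which is precisely why exhaustive model checking is used to expose the corner cases in the real protocol.

    ```python
    import numpy as np

    # Toy illustration only -- NOT the Chess gMAC algorithm. N nodes in a line
    # topology, each with a slightly different per-frame clock drift. Once per
    # frame a node averages its phase with the neighbours it can still hear;
    # hearing requires the mutual offset to stay below the guard time.
    N, frames, guard = 10, 10_000, 5.0
    rng = np.random.default_rng(1)
    drift = rng.uniform(-0.05, 0.05, size=N)   # tiny per-frame drift per node
    phase = np.zeros(N)                        # start fully synchronized

    worst_gap = 0.0
    for frame in range(frames):
        phase = phase + drift                  # clocks drift a little every frame
        new_phase = phase.copy()
        for i in range(N):
            audible = [j for j in (i - 1, i + 1)
                       if 0 <= j < N and abs(phase[j] - phase[i]) < guard]
            if audible:                        # correct towards audible neighbours
                new_phase[i] = np.mean([phase[i]] + [phase[j] for j in audible])
        phase = new_phase
        worst_gap = max(worst_gap, float(np.abs(np.diff(phase)).max()))

    print(f"largest neighbour offset seen in {frames} frames: {worst_gap:.2f}"
          f" (guard time {guard})")
    ```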

    Using Scenarios to Validate Requirements through the use of Eye-Tracking in Prototyping

    Research has shown that eliciting and capturing the correct behavior of systems reduces the number of defects a system contains. A requirements engineer models the functions of the system to gain a comprehensive understanding of the system in question. Engineers must verify the model for correctness, either by having another engineer review it or by building a prototype and validating it with a stakeholder. However, research has shown that this form of verification can be ineffective, because looking at an existing model can be suggestive and stunt the development of new ideas. This paper provides an automated technique that can be used as an unbiased review of use case scenarios. Using the prototype and a scenario, a stakeholder is guided through the use case scenario, indicating where they expect to find the next step while their eye movements are tracked. Analysis of the eye-tracking data can then be used to identify missing requirements, such as interaction steps that should have alternative sequences, or to detect problems with the flow of actions.
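
    A minimal sketch of the kind of analysis described above, using hypothetical data structures rather than the authors' tooling: each scenario step is paired with the screen region (area of interest, AOI) where the next interaction is expected, and steps where the stakeholder's gaze mostly landed elsewhere are flagged as candidates for missing or alternative requirements.

    ```python
    from dataclasses import dataclass

    @dataclass
    class AOI:
        """Rectangular area of interest on the prototype screen (hypothetical)."""
        name: str
        x: float
        y: float
        w: float
        h: float

        def contains(self, px: float, py: float) -> bool:
            return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def flag_mismatches(scenario, fixations_per_step, threshold=0.5):
        """scenario: list of (step description, expected AOI) pairs;
        fixations_per_step: per step, a list of gaze fixations (x, y, duration_ms)."""
        flagged = []
        for (step, aoi), fixations in zip(scenario, fixations_per_step):
            dwell_in_aoi = sum(d for x, y, d in fixations if aoi.contains(x, y))
            dwell_total = sum(d for _, _, d in fixations) or 1
            if dwell_in_aoi / dwell_total < threshold:
                flagged.append(step)           # gaze mostly elsewhere: investigate
        return flagged

    # toy usage
    login_btn = AOI("login", 400, 300, 120, 40)
    scenario = [("press the login button", login_btn)]
    fixations = [[(90, 80, 600), (95, 85, 450), (410, 310, 120)]]
    print(flag_mismatches(scenario, fixations))   # -> ['press the login button']
    ```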

    Genesis of Dark Energy: Dark Energy as Consequence of Release and Two-stage Tracking Cosmological Nuclear Energy

    Recent observations of Type Ia supernovae and the low measured density of matter, including dark matter ($\Omega_{m} = 0.3$), suggest that the present-day universe consists mainly of repulsive-gravity-type 'exotic matter' with negative pressure, often called 'dark energy' ($\Omega_{x} = 0.7$). But the nature of dark energy is mysterious, and the questions of why, how, where and when it arises are intriguing. In the present paper the authors attempt to answer these questions while making an effort to reveal the genesis of dark energy, and suggest that the cosmological nuclear binding energy liberated during primordial nucleosynthesis remains trapped for a long time and is then released, manifesting itself as dark energy in the universe. It is also explained why the dark-energy parameter is $w = -2/3$. Noting that $w = 1$ for stiff matter and $w = 1/3$ for radiation, $w = -2/3$ for dark energy because the $-1$ is due to a 'deficiency of stiff nuclear matter' and this binding energy is ultimately released as 'radiation' contributing $+1/3$, making $w = -1 + 1/3 = -2/3$. When dark energy is released free at $Z = 80$, $w = -2/3$; but at the present day, $Z = 0$, when the radiation strength has diminished to $\delta \to 0$, $w = -1 + \delta/3 \to -1$. This essentially resolves the dark-energy mystery of negative pressure and repulsive gravity. The proposed theory makes several estimates/predictions which agree reasonably well with astrophysical constraints and observations. Though there are many candidate theories, the proposed model presents an entirely new approach (cosmological nuclear energy) as a possible candidate for dark energy. Comment: 17 pages, 4 figures, minor corrections.
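
    The equation-of-state bookkeeping in the abstract can be written compactly; the following simply restates the abstract's own arithmetic, with $\delta(z)$ denoting the strength of the released radiation contribution:

    ```latex
    % w arithmetic as stated in the abstract: a deficiency of stiff matter
    % (w = +1) contributes -1, and the released radiation (w = +1/3)
    % contributes +delta/3.
    \begin{align}
      w(z) &= -1 + \tfrac{1}{3}\,\delta(z), \\
      w(z \simeq 80) &= -1 + \tfrac{1}{3} = -\tfrac{2}{3}
          \qquad (\delta \simeq 1,\ \text{dark energy released free}), \\
      w(z = 0) &\to -1 \qquad (\delta \to 0\ \text{today}).
    \end{align}
    ```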

    Overestimating Outcome Rates: Statistical Estimation When Reliability Is Suboptimal

    Objective. To demonstrate how failure to account for measurement error in an outcome (dependent) variable can lead to significant estimation errors, and to illustrate ways to recognize and avoid these errors. Data Sources. Medical literature and simulation models. Study Design/Data Collection. Systematic review of the published and unpublished epidemiological literature on the rate of preventable hospital deaths, and statistical simulation of potential estimation errors based on data from these studies. Principal Findings. Most estimates of the rate of preventable deaths in U.S. hospitals rely upon classifying cases using one to three physician reviewers (implicit review). Because this method has low to moderate reliability, estimates based on statistical methods that do not account for error in the measurement of a “preventable death” can result in significant overestimation. For example, relying on a majority-rule rating with three reviewers per case (reliability ∼0.45 for the average of three reviewers) can result in a 50–100 percent overestimation compared with an estimate based upon a reliably measured outcome (e.g., by using 50 reviewers per case). However, there are statistical methods that account for measurement error and can produce much more accurate estimates of outcome rates without requiring a large number of measurements per case. Conclusion. The statistical principles discussed in this case study are critically important whenever one seeks to estimate the proportion of cases belonging to specific categories (such as estimating how many patients have inadequate blood pressure control or identifying high-cost or low-quality physicians). When the true outcome rate is low (<20 percent), using an outcome measure that has low-to-moderate reliability will generally result in substantially overestimating the proportion of the population having the outcome unless statistical methods that adjust for measurement error are used. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/74896/1/j.1475-6773.2006.00661.x.pd
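
    The overestimation mechanism and the measurement-error adjustment can be sketched in a short simulation. The individual-reviewer sensitivity and specificity below are illustrative assumptions, not figures from the paper; the adjustment applied is a Rogan-Gladen style correction using the effective accuracy of the 2-of-3 majority-rule rating.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_cases   = 200_000
    true_rate = 0.05                  # true proportion of preventable deaths (<20%)
    sens, spec = 0.75, 0.85           # single-reviewer accuracy (assumed, illustrative)

    truth = rng.random(n_cases) < true_rate
    # three independent implicit reviews per case, combined by majority rule
    votes = np.where(truth[:, None],
                     rng.random((n_cases, 3)) < sens,         # detections of true cases
                     rng.random((n_cases, 3)) < (1 - spec))   # false positives
    majority = votes.sum(axis=1) >= 2

    naive_estimate = majority.mean()  # ignores measurement error -> inflated

    # effective sensitivity/specificity of the 2-of-3 majority rating
    maj_sens = sens**3 + 3 * sens**2 * (1 - sens)
    maj_spec = spec**3 + 3 * spec**2 * (1 - spec)
    # Rogan-Gladen style correction for outcome misclassification
    corrected = (naive_estimate + maj_spec - 1) / (maj_sens + maj_spec - 1)

    print(f"true rate       {true_rate:.3f}")
    print(f"naive estimate  {naive_estimate:.3f}")   # roughly double the true rate here
    print(f"corrected       {corrected:.3f}")        # close to the true rate
    ```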

    Cosmological Dynamics of Phantom Field

    We study the general features of the dynamics of the phantom field in the cosmological context. In the case of an inverse hyperbolic cosine potential, we demonstrate that the phantom field can successfully drive the observed current accelerated expansion of the universe with equation-of-state parameter $w_{\phi} < -1$. The de Sitter universe turns out to be the late-time attractor of the model. The main features of the dynamics are independent of the initial conditions and the parameters of the model. The model fits the supernova data very well, allowing for $-2.4 < w_{\phi} < -1$ at the 95% confidence level. Comment: Typos corrected. Some clarifications and references added. To appear in Physical Review
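
    A short numerical sketch of the dynamics described above, under stated assumptions: units with $8\pi G/3 = 1$, an inverse hyperbolic cosine potential $V(\phi) = V_0/\cosh(\alpha\phi)$, and illustrative initial conditions. Because the phantom field has a reversed-sign kinetic term, the sign of the potential-gradient term in its equation of motion is flipped relative to an ordinary scalar; the field climbs to the top of the potential with $w_{\phi} < -1$ along the way and settles into the de Sitter state with $w_{\phi} \to -1$.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # toy units: 8*pi*G/3 = 1; illustrative parameters and initial conditions
    V0, alpha = 1.0, 1.0
    V  = lambda phi: V0 / np.cosh(alpha * phi)
    dV = lambda phi: -V0 * alpha * np.tanh(alpha * phi) / np.cosh(alpha * phi)

    def rhs(t, y):
        phi, phidot, rho_m = y
        rho_phi = -0.5 * phidot**2 + V(phi)       # phantom: negative kinetic energy
        H = np.sqrt(rho_m + rho_phi)              # Friedmann equation in these units
        return [phidot,
                -3.0 * H * phidot + dV(phi),      # phantom EoM: potential term sign flipped
                -3.0 * H * rho_m]                 # pressureless matter dilutes as a^-3

    sol = solve_ivp(rhs, (0.0, 50.0), [1.5, 0.0, 10.0], rtol=1e-8, max_step=0.05)
    phi, phidot, _ = sol.y
    w_phi = (-0.5 * phidot**2 - V(phi)) / (-0.5 * phidot**2 + V(phi))

    # w_phi stays below -1 while the field rolls, then settles at the top of the
    # potential (phi = 0), where the expansion becomes de Sitter (w_phi -> -1)
    print(f"min w_phi = {w_phi.min():.3f}, final w_phi = {w_phi[-1]:.3f}")
    ```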