Estimating Dynamic Models of Imperfect Competition
We describe a two-step algorithm for estimating dynamic games under the assumption that behavior is consistent with Markov Perfect Equilibrium. In the first step, the policy functions and the law of motion for the state variables are estimated. In the second step, the remaining structural parameters are estimated using the optimality conditions for equilibrium. The second-step estimator is a simple simulated minimum distance estimator. The algorithm applies to a broad class of models, including I.O. models with both discrete and continuous controls, such as the Ericson and Pakes (1995) model. We test the algorithm on a class of dynamic discrete choice models with normally distributed errors and a class of dynamic oligopoly models similar to that of Pakes and McGuire (1994).
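The second step lends itself to a compact illustration. Below is a minimal Python sketch of a simulated minimum distance estimator, assuming the first-step estimates are already summarized as target moments; the toy model, the choice of moments, and all names (simulate_moments, first_step_moments) are illustrative assumptions, not the paper's actual estimator.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
# Simulation draws are held fixed across theta so the objective is smooth in theta.
base_shocks = rng.normal(size=1_000)

def simulate_moments(theta):
    # Toy stand-in for forward-simulating the model under the first-step
    # policy/transition estimates and computing model-implied moments.
    states = theta[0] + np.exp(theta[1]) * base_shocks
    return np.array([states.mean(), states.var()])

# Moments implied by the first-step (reduced-form) estimates -- assumed given here.
first_step_moments = np.array([1.0, 2.0])
W = np.eye(2)  # weighting matrix; identity for simplicity

def smd_objective(theta):
    g = simulate_moments(theta) - first_step_moments
    return g @ W @ g  # quadratic distance between simulated and target moments

result = minimize(smd_objective, x0=np.zeros(2), method="Nelder-Mead")
print("estimated structural parameters:", result.x)
```

In a real application the simulator would forward-simulate industry paths under the estimated policies, and the moments would encode the equilibrium optimality conditions rather than a toy mean and variance.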
Early non-psychotic deviant behaviour as an endophenotypic marker in bipolar disorder, schizo-affective disorder and schizophrenia
Objective: To determine and compare the incidence of early non-psychotic deviant behaviour (i.e. under the age of ten) in Afrikaner patients with bipolar disorder, schizo-affective disorder and schizophrenia. Methods: Patients with bipolar disorder, schizo-affective disorder and schizophrenia were interviewed using a structured questionnaire probing for deviant childhood behaviour starting before the age of 10 years. Information from close family members was also obtained where possible. Seven areas of possible deviance were probed: social dysfunction, unprovoked aggression, extreme anxiety, chronic sadness, extreme odd behaviours, attention impairment and learning difficulties. Demographic data included age, marital status, gender and years of formal education. The following clinical features were also recorded: age of onset of illness and suicide attempts. Results: A total of 74 patients diagnosed with bipolar disorder, 43 with schizo-affective disorder and 80 with schizophrenia were interviewed. Early deviant behaviour was significantly more prevalent in schizophrenia (65%) and schizo-affective disorder (60.5%) than in the bipolar group (21.6%). Deviant childhood behaviour was grouped into three clusters: a social functioning impairment cluster (social isolation, aggression, extreme odd behaviour), a mood/anxiety cluster (extreme fears, chronic sadness) and a cognitive impairment cluster (attention impairment, learning disability). Bipolar patients showed significantly less social functioning and cognitive impairment than patients with schizo-affective disorder and schizophrenia. Conclusion: Our findings suggest that early deviant behaviour may be an endophenotypic marker in schizophrenia and schizo-affective disorder. Keywords: early non-psychotic deviant behaviour, endophenotype, bipolar disorder, schizo-affective disorder, schizophrenia. South African Psychiatry Review Vol. 8(4) 2005: 153-15
Bicarbonate-responsive “soluble” adenylyl cyclase defines a nuclear cAMP microdomain
Bicarbonate-responsive “soluble” adenylyl cyclase resides, in part, inside the mammalian cell nucleus where it stimulates the activity of nuclear protein kinase A to phosphorylate the cAMP response element binding protein (CREB). The existence of this complete and functional, nuclear-localized cAMP pathway establishes that cAMP signals in intracellular microdomains and identifies an alternate pathway leading to CREB activation
The Computational Complexity of Generating Random Fractals
In this paper we examine a number of models that generate random fractals. The models are studied using the tools of computational complexity theory from the perspective of parallel computation. Diffusion limited aggregation and several widely used algorithms for equilibrating the Ising model are shown to be highly sequential; it is unlikely they can be simulated efficiently in parallel. This is in contrast to Mandelbrot percolation, which can be simulated in constant parallel time. Our research helps shed light on the intrinsic complexity of these models relative to each other and to different growth processes that have recently been studied using complexity theory. In addition, the results may serve as a guide to simulation physics.
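For intuition about why Mandelbrot percolation parallelizes so well, here is a minimal sketch of the standard construction, assuming the usual parameters (subdivision factor b, retention probability p): a cell survives exactly when every one of its ancestors was retained, so each cell's fate is an independent conjunction of Bernoulli draws and all cells can in principle be decided simultaneously. The function name and default values are illustrative.

```python
import numpy as np

def mandelbrot_percolation(b=3, p=0.7, levels=4, seed=0):
    """Recursively subdivide a square into b x b cells, keeping each with probability p."""
    rng = np.random.default_rng(seed)
    grid = np.ones((1, 1), dtype=bool)
    for _ in range(levels):
        n = grid.shape[0] * b
        # Expand every cell into b x b children; a child survives only if its
        # parent survived and its own Bernoulli(p) draw succeeds.
        keep = rng.random((n, n)) < p
        grid = grid.repeat(b, axis=0).repeat(b, axis=1) & keep
    return grid

fractal = mandelbrot_percolation()
print(f"surviving fraction: {fractal.mean():.3f}")  # expectation is p**levels
```

The sequential loop here is only over levels; within a level every cell is independent, which is the structural property behind the constant-parallel-time result, in contrast to DLA, where each particle's attachment depends on the aggregate built so far.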
Regulatory Improvement Legislation: Risk Assessment, Cost-Benefit Analysis, and Judicial Review
As the number, cost, and complexity of federal regulations have grown over the past twenty years, there has been growing interest in the use of analytic tools such as risk assessment and cost-benefit analysis to improve the regulatory process. The application of these tools to public health, safety, and environmental problems has become commonplace in the peer-reviewed scientific and medical literatures. Recent studies prepared by Resources for the Future, the American Enterprise Institute, the Brookings Institution, and the Harvard Center for Risk Analysis have demonstrated how formal analyses can and often do help government agencies achieve more protection against hazards at less cost than would otherwise occur. Although analytic tools hold great promise, their use by federal agencies is neither consistent nor rigorous.
The 103rd, 104th, 105th and 106th Congresses demonstrated sustained interest in the passage of comprehensive legislation governing the employment of these tools in the federal regulatory process. While legislative proposals on this issue have attracted significant bipartisan interest, and recent amendments to particular enabling statutes have incorporated some of these analytical requirements, no comprehensive legislation has been enacted into law since passage of the Administrative Procedure Act in 1946.
The inability to pass such legislation has been attributed to a variety of factors, but a common substantive concern has been uncertainty and controversy about how such legislation should address judicial review issues. For example, the judicial review portion of the Regulatory Improvement Act (S. 981), the 105th Congress's major legislative initiative, was criticized simultaneously as meaningless (for allegedly offering too few opportunities for petitioners to challenge poorly reasoned agency rules) and dangerous (as supposedly enabling petitioners to paralyze even well-reasoned agency rules). Thus, a significant obstacle to regulatory improvement legislation appears to be the conflicting opinions among legal scholars and practitioners about how judicial review issues should be addressed in such legislation. The Clinton Administration and the authors of S. 981 believe they have crafted a workable compromise, one that accommodates the need to bring more rigor and transparency to an agency's decisional processes without imposing excessive judicial review. Nevertheless, it is clear that their agreement on this subject, if included in future legislative deliberations, will be scrutinized and contested.
Recognizing the importance of the judicial review issue to this and, indeed, any effort to improve the regulatory process, the Center for Risk Analysis at the Harvard School of Public Health convened an invitational Workshop of accomplished legal practitioners and scholars to discuss how judicial review should be handled in legislation of this kind. The full-day Workshop was conducted in Washington, D.C. on December 17, 1998. Its purpose was to discuss principles, experiences, and insights that might inform future public debate about how judicial review should be addressed in legislative proposals that entail use of risk assessment and/or cost-benefit analysis in agency decision-making (whether the proposals are comprehensive or agency-specific).
In order to provide the Workshop a practical focus, participants analyzed the provisions of S. 981 (as modified at the request of the Clinton Administration). An exchange of letters between S. 981's chief sponsors and the Clinton Administration defining the terms of the agreement was examined as well. This Report highlights the themes of the Workshop discussion and offers some specific commentary on how proposed legislation (including but not limited to S. 981) could be improved in future legislative deliberations
Differentiating amyloid beta spread in autosomal dominant and sporadic Alzheimer's disease
Amyloid-beta deposition is one of the hallmark pathologies in both sporadic Alzheimer's disease and autosomal-dominant Alzheimer's disease, the latter of which is caused by mutations in genes involved in amyloid-beta processing. Despite amyloid-beta deposition being a centrepiece of both sporadic Alzheimer's disease and autosomal-dominant Alzheimer's disease, some differences between these Alzheimer's disease subtypes have been observed with respect to the spatial pattern of amyloid-beta. Previous work has shown that the spatial pattern of amyloid-beta in individuals spanning the sporadic Alzheimer's disease spectrum can be reproduced with high accuracy using an epidemic spreading model, which simulates the diffusion of amyloid-beta across neuronal connections and is constrained by individual rates of amyloid-beta production and clearance. However, it has not been investigated whether amyloid-beta deposition in the rarer autosomal-dominant Alzheimer's disease can be modelled in the same way, and if so, how congruent the spreading patterns of amyloid-beta are across sporadic and autosomal-dominant Alzheimer's disease. We leverage the epidemic spreading model as a data-driven approach to probe individual-level variation in the spreading patterns of amyloid-beta across three large-scale imaging datasets (two sporadic Alzheimer's disease, one autosomal-dominant Alzheimer's disease). We applied the epidemic spreading model separately to the Alzheimer's Disease Neuroimaging Initiative (n = 737), the Open Access Series of Imaging Studies (n = 510) and the Dominantly Inherited Alzheimer Network (n = 249), the latter two of which were processed using an identical pipeline. We assessed inter- and intra-individual model performance in each dataset separately and further identified the most likely subject-specific epicentre of amyloid-beta spread. Using epicentres defined in previous work on sporadic Alzheimer's disease, the epidemic spreading model provided moderate prediction of the regional pattern of amyloid-beta deposition across all three datasets. We further find that, whilst the most likely epicentre for most amyloid-beta-positive subjects overlaps with the default mode network, 13% of autosomal-dominant Alzheimer's disease individuals were best characterized by a striatal origin of amyloid-beta spread. These subjects were also distinguished by being younger than autosomal-dominant Alzheimer's disease subjects with a default mode network amyloid-beta origin, despite having a similar estimated age of symptom onset. Together, our results suggest that most autosomal-dominant Alzheimer's disease patients express amyloid-beta spreading patterns similar to those of sporadic Alzheimer's disease, but that there may be a subset of autosomal-dominant Alzheimer's disease patients with a separate, striatal phenotype
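As a rough illustration of the class of model the abstract describes, the sketch below implements a generic epidemic-style spreading process on a toy connectome in Python, with diffusion along connections modulated by regional production and clearance rates; the equations, parameter values, and names (simulate_spread, beta) are assumptions for illustration, not the published epidemic spreading model.

```python
import numpy as np

def simulate_spread(C, epicentre, production, clearance, beta=0.5,
                    dt=0.01, steps=5000):
    """C: (n, n) symmetric connectivity matrix; epicentre: index of the seed region."""
    n = C.shape[0]
    x = np.zeros(n)              # regional amyloid-beta burden, kept in [0, 1]
    x[epicentre] = 0.1           # seed the epicentre with a small deposit
    for _ in range(steps):
        incoming = C @ x                     # pathology arriving via connections
        growth = beta * incoming * (1 - x)   # infection-like, saturating uptake
        x += dt * (production * growth - clearance * x)
        x = np.clip(x, 0.0, 1.0)
    return x

# Toy 4-region example with a hub region (index 0) as the epicentre.
C = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
pattern = simulate_spread(C, epicentre=0,
                          production=np.full(4, 1.0),
                          clearance=np.full(4, 0.3))
print("final regional burden:", np.round(pattern, 3))
```

Fitting such a model per subject amounts to choosing the epicentre and the production/clearance rates that best reproduce the observed regional deposition pattern, which is how subject-specific epicentres like the striatal subgroup above can be identified.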
Genomics of Loa loa, a Wolbachia-free filarial parasite of humans
Loa loa, the African eyeworm, is a major filarial pathogen of humans. Unlike most filariae, Loa loa does not contain the obligate intracellular Wolbachia endosymbiont. We describe the 91.4 Mb genome of Loa loa, and the genome of the related filarial parasite Wuchereria bancrofti, and predict 14,907 Loa loa genes based on microfilarial RNA sequencing. By comparing these genomes to that of another filarial parasite, Brugia malayi, and to several other nematode genomes, we demonstrate synteny among filariae but not with non-parasitic nematodes. The Loa loa genome encodes many immunologically relevant genes, as well as protein kinases targeted by drugs currently approved for humans. Despite lacking Wolbachia, Loa loa shows no new metabolic synthesis or transport capabilities compared to other filariae. These results suggest that the role played by Wolbachia in filarial biology is more subtle than previously thought and reveal marked differences between parasitic and non-parasitic nematodes
The “Soluble” Adenylyl Cyclase in Sperm Mediates Multiple Signaling Events Required for Fertilization
Mammalian fertilization is dependent upon a series of bicarbonate-induced, cAMP-dependent processes sperm undergo as they “capacitate,” i.e., acquire the ability to fertilize eggs. Male mice lacking the bicarbonate- and calcium-responsive soluble adenylyl cyclase (sAC), the predominant source of cAMP in male germ cells, are infertile, as the sperm are immotile. Membrane-permeable cAMP analogs are reported to rescue the motility defect, but we now show that these “rescued” null sperm were not hyperactivated, displayed flagellar angulation, and remained unable to fertilize eggs in vitro. These deficits uncover a requirement for sAC during spermatogenesis and/or epididymal maturation and reveal limitations inherent in studying sAC function using knockout mice. To circumvent this restriction, we identified a specific sAC inhibitor that allowed temporal control over sAC activity. This inhibitor revealed that capacitation comprises separable events: induction of protein tyrosine phosphorylation and motility are sAC dependent, while acrosomal exocytosis is not dependent on sAC