Time-dependent response of a zonally averaged ocean–atmosphere–sea ice model to Milankovitch forcing
Author Posting. © The Author(s), 2010. This is the author's version of the work. It is posted here by permission of Springer-Verlag for personal use, not for redistribution. The definitive version was published in Climate Dynamics 6 (2010): 763-779, doi:10.1007/s00382-010-0790-6.

An ocean–atmosphere–sea ice model is developed to explore the time-dependent
response of climate to Milankovitch forcing for the time interval 5-3 Myr BP. The ocean
component is a zonally averaged model of the circulation in five basins (Arctic, Atlantic,
Indian, Pacific, and Southern Oceans). The atmospheric component is a one-dimensional
(latitudinal) energy balance model, and the sea-ice component is a thermodynamic model.
Two numerical experiments are conducted. The first experiment does not include sea ice
and the Arctic Ocean; the second experiment does. Results from the two experiments are
used to investigate (i) the response of annual mean surface air and ocean temperatures to
Milankovitch forcing, and (ii) the role of sea ice in this response.
In both experiments, the response of air temperature is dominated by obliquity cycles
at most latitudes. On the other hand, the response of ocean temperature varies with latitude
and depth. Deep water formed between 45°N and 65°N in the Atlantic Ocean mainly responds
to precession. In contrast, deep water formed south of 60°S responds to obliquity when sea
ice is not included. Sea ice acts as a time-integrator of summer insolation changes such that
annual mean sea-ice conditions mainly respond to obliquity. Thus, in the presence of sea
ice, air temperature changes over the sea ice are amplified, and temperature changes in deep
water of southern origin are suppressed since water below sea ice is kept near the freezing
point.

This work was supported by an NSERC Discovery Grant awarded to L.A.M. We also thank GEC3 for a Network Grant.
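The one-dimensional (latitudinal) energy balance atmosphere described above can be sketched in a few lines. The following is a generic Budyko/Sellers-type illustration with textbook parameter values (solar constant, albedo, linearised longwave coefficients, diffusivity, and heat capacity are all assumptions of this sketch), not the model or parameters used in the study:

```python
import numpy as np

# Budyko/Sellers-type 1-D energy balance model on a sine-of-latitude grid.
n = 18
x = np.linspace(-0.95, 0.95, n)       # sine of latitude (area-uniform bands)
dx = x[1] - x[0]
S0 = 1361.0                           # solar constant (W m^-2)
S = (S0 / 4) * (1 - 0.482 * (3 * x**2 - 1) / 2)   # annual-mean insolation
alpha = 0.30                          # fixed planetary albedo
A, B = 210.0, 2.0                     # linearised outgoing longwave: A + B*T (T in deg C)
D = 0.55                              # meridional diffusivity (W m^-2 K^-1)
C = 4.0e8                             # column heat capacity (J m^-2 K^-1)

T = np.zeros(n)                       # surface temperature (deg C)
dt = 10 * 86400.0                     # 10-day explicit time step
for _ in range(2500):                 # ~70 model years, enough to equilibrate
    xm = 0.5 * (x[1:] + x[:-1])
    flux = D * (1 - xm**2) * np.diff(T) / dx      # diffusive heat transport
    conv = np.zeros(n)
    conv[:-1] += flux / dx            # flux convergence; no flux through poles
    conv[1:] -= flux / dx
    T = T + (dt / C) * (S * (1 - alpha) - (A + B * T) + conv)

print(round(T.mean(), 1), round(T[n // 2] - T[0], 1))  # global mean, equator-pole contrast
```

With these illustrative values the model equilibrates to a global mean near 14 °C with a warm equator and cold poles; Milankovitch forcing would enter through a time-dependent insolation profile S.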
Cluster randomised trials in the medical literature: two bibliometric surveys
Background: Several reviews of published cluster randomised trials have reported that about half did not take clustering into account in the analysis, which was thus incorrect and potentially misleading. In this paper I ask whether cluster randomised trials are increasing in both number and quality of reporting.

Methods: Computer search for papers on cluster randomised trials since 1980, and a hand search of trial reports published in selected volumes of the British Medical Journal over 20 years.

Results: There has been a large increase in recent years in the numbers of methodological papers and of trial reports using the term 'cluster random', with about equal numbers of each type of paper. The British Medical Journal contained more such reports than any other journal, and showed a corresponding increase over time in the number of trials where subjects were randomised in clusters. In 2003, all reports showed awareness of the need to allow for clustering in the analysis; in 1993 and before, clustering was ignored in most such trials.

Conclusion: Cluster trials are becoming more frequent and reporting is of higher quality. Perhaps statistician pressure works.
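Why ignoring clustering makes an analysis "incorrect and potentially misleading" can be made concrete with the standard design effect. The numbers below are hypothetical illustrations, not figures from the surveyed trials:

```python
import math

def design_effect(cluster_size: float, icc: float) -> float:
    """DEFF = 1 + (m - 1) * ICC for clusters of equal size m."""
    return 1.0 + (cluster_size - 1.0) * icc

# Hypothetical trial: values of m and ICC are assumptions for illustration.
n_patients = 400                      # patients per arm
m = 20                                # patients per cluster
icc = 0.05                            # intra-cluster correlation coefficient
deff = design_effect(m, icc)
n_effective = n_patients / deff       # information content after clustering
se_inflation = math.sqrt(deff)        # factor by which naive SEs are too small
print(round(deff, 2), round(n_effective), round(se_inflation, 2))  # → 1.95 205 1.4
```

Even a modest intra-cluster correlation of 0.05 nearly halves the effective sample size here, so an analysis that treats the 400 patients as independent understates the standard errors by about 40%.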
Detection of emphysema progression in alpha 1-antitrypsin deficiency using CT densitometry; Methodological advances
Background: Computed tomography (CT) densitometry is a potential tool for detecting the progression of emphysema, but the optimum methodology is uncertain. The level of inspiration affects reproducibility, but the ability to adjust for this variable is facilitated by whole-lung scanning methods. However, emphysema is frequently localised to sub-regions of the lung, and targeted densitometric sampling may be more informative than whole-lung assessment.

Methods: Emphysema progression over a 2-year interval was assessed in 71 patients (alpha 1-antitrypsin deficiency with PiZ phenotype) with CT densitometry, using the 15th percentile point (Perc15) and the voxel indices at -950 and -910 Hounsfield units (VI -950 and VI -910), on the whole lung, limited single slices, and apical, central and basal thirds. The relationship between whole-lung densitometric progression (ΔCT) and change in CT-derived lung volume (ΔCT_Vol) was characterised, and adjustment for lung volume using statistical modelling was evaluated.

Results: CT densitometric progression was statistically significant for all methods. ΔCT correlated with ΔCT_Vol, and linear regression indicated that nearly one half of lung density loss was secondary to apparent hyperinflation. The most accurate measure was obtained using a random coefficient model to adjust for lung volume, and the greatest progression was detected by targeted sampling of the middle third of the lung.

Conclusion: Progressive hyperinflation may contribute significantly to loss of lung density, but volume effects and absolute tissue loss can be identified by statistical modelling. Targeted sampling of the middle lung region using Perc15 appears to be the most robust measure of emphysema progression.
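The two densitometric indices named in the Methods are simple summaries of the lung-density histogram. A minimal sketch on synthetic data (the Gaussian density distribution and its parameters are illustrative assumptions, not values from the study):

```python
import numpy as np

# Synthetic lung-density histogram: an illustrative Gaussian, not patient data.
rng = np.random.default_rng(0)
hu = rng.normal(loc=-870.0, scale=40.0, size=100_000)   # voxel values in HU

perc15 = np.percentile(hu, 15)        # Perc15: 15th percentile point (HU)
vi_950 = 100.0 * np.mean(hu < -950)   # VI -950: % of voxels below -950 HU
vi_910 = 100.0 * np.mean(hu < -910)   # VI -910: % of voxels below -910 HU
print(round(perc15, 1), round(vi_950, 2), round(vi_910, 2))
```

Emphysema progression appears as Perc15 drifting to more negative values and the voxel indices rising, which is why both are candidates for longitudinal monitoring.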
Dimension-specific attention directs learning and listening on auditory training tasks
The relative contributions of bottom-up versus top-down sensory inputs to auditory learning are not well established. In our experiment, listeners were instructed to perform either a frequency discrimination (FD) task ("FD-train group") or an intensity discrimination (ID) task ("ID-train group") during training on a set of physically identical tones that were impossible to discriminate consistently above chance, allowing us to vary top-down attention whilst keeping bottom-up inputs fixed. A third, control group did not receive any training. Only the FD-train group improved on a FD probe following training, whereas all groups improved on ID following training. However, only the ID-train group also showed changes in performance accuracy as a function of interval with training on the ID task. These findings suggest that top-down, dimension-specific attention can direct auditory learning, even when this learning is not reflected in conventional performance measures of threshold change
A systematic review of biomarkers for disease progression in Parkinson's disease
Spatial and topological organization of DNA chains induced by gene co-localization
Transcriptional activity has been shown to relate to the organization of
chromosomes in the eukaryotic nucleus and in the bacterial nucleoid. In
particular, highly transcribed genes, RNA polymerases and transcription factors
gather into discrete spatial foci called transcription factories. However, the
mechanisms underlying the formation of these foci and the resulting topological
order of the chromosome remain to be elucidated. Here we consider a
thermodynamic framework based on a worm-like chain model of chromosomes where
sparse designated sites along the DNA are able to interact whenever they are
spatially close by. This is motivated by recurrent evidence that there exist
physical interactions between genes that operate together. Three important
results come out of this simple framework. First, the resulting formation of
transcription foci can be viewed as a micro-phase separation of the interacting
sites from the rest of the DNA. In this respect, a thermodynamic analysis
suggests that transcription factors are appropriate candidates for mediating the
physical interactions between genes. Next, numerical simulations of the polymer
reveal a rich variety of phases that are associated with different topological
orderings, each providing a way to increase the local concentrations of the
interacting sites. Finally, the numerical results show that both
one-dimensional clustering and periodic location of the binding sites along the
DNA, which have been observed in several organisms, make the spatial
co-localization of multiple families of genes particularly efficient.

Comment: Figures and Supplementary Material freely available on http://dx.doi.org/10.1371/journal.pcbi.100067
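The interaction rule described above, in which designated sites gain energy whenever they are spatially close, can be stated concretely. The following is my own minimal sketch of such an energy function (function name, parameters, and values are assumptions for illustration, not the authors' code):

```python
import numpy as np

# Energy of a chain whose sparse "designated" (sticky) sites gain -eps for
# every pair that is spatially close; minimising it drives the micro-phase
# separation of those sites from the rest of the chain.
def interaction_energy(positions, sticky_idx, eps=1.0, r_contact=1.0):
    """E = -eps * (number of pairs of designated sites closer than r_contact)."""
    pts = np.asarray(positions)[sticky_idx]
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    n_pairs = int(np.sum((d > 0) & (d < r_contact))) // 2   # count each pair once
    return -eps * n_pairs

straight = np.zeros((20, 3))
straight[:, 0] = np.arange(20)        # fully extended chain: no sticky contacts
E0 = interaction_energy(straight, [0, 5, 10, 15])
```

Any conformation that folds the designated sites into a single focus lowers E, so a Metropolis simulation of a chain with this energy clusters them at low enough temperature, which is one way to picture the transcription-factory phases reported above.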
Comparison of 5-year progression of retinitis pigmentosa involving the posterior pole among siblings by means of SD-OCT: a retrospective study
The blockchain technology promises to transform finance, money and even governments. However, analyses of blockchain applicability and robustness typically focus on isolated systems whose actors contribute mainly by running the consensus algorithm. Here, we highlight the importance of considering trustless platforms within the broader ecosystem that includes social and communication networks. As an example, we analyse the flash-crash observed on 21st June 2017 in the Ethereum platform and show that a major phenomenon of social coordination led to a catastrophic cascade of events across several interconnected systems. We propose the concept of "emergent centralisation" to describe situations where a single system becomes critically important for the functioning of the whole ecosystem, and argue that such situations are likely to become more and more frequent in interconnected socio-technical systems. We anticipate that the systemic approach we propose will have implications for future assessments of trustless systems and call for the attention of policy-makers on the fragility of our interconnected and rapidly changing world.
Evaluation of the scale, causes and costs of waste medicines. Report of DH funded national project.