Estimating the birth prevalence and pregnancy outcomes of congenital malformations worldwide
Congenital anomaly registries have two main surveillance aims: first, to define the baseline epidemiology of important congenital anomalies to facilitate programme, policy and resource planning; and second, to identify clusters of cases and any other epidemiological changes that could give early warning of environmental or infectious hazards. However, setting up a sustainable registry and surveillance system is resource-intensive, requiring national infrastructure for recording all cases and diagnostic facilities to identify those malformations that are not externally visible. Consequently, not all countries have yet established robust surveillance systems. For these countries, methods are needed to generate prevalence estimates for these disorders that can act as a starting point for assessing disease burden and service implications. Here, we describe how registry data from high-income settings can be used to generate reference rates that can serve as provisional estimates for countries with little or no observational data on non-syndromic congenital malformations.
The Economic Resource Receipt of New Mothers
U.S. federal policies do not provide a universal social safety net of economic support for women during pregnancy or the immediate postpartum period but assume that employment and/or marriage will protect families from poverty. Yet even mothers with considerable human and marital capital may experience disruptions in employment, earnings, and family socioeconomic status postbirth. We use the National Survey of Families and Households to examine the economic resources that mothers with children ages 2 and younger receive postbirth, including employment, spouses, extended family and social network support, and public assistance. Results show that many new mothers receive resources postbirth. Marriage or postbirth employment does not protect new mothers and their families from poverty, but education, race, and the receipt of economic supports from social networks do.
Calibration of myocardial T2 and T1 against iron concentration.
BACKGROUND: The assessment of myocardial iron using T2* cardiovascular magnetic resonance (CMR) has been validated and calibrated, and is in clinical use. However, there is very limited data assessing the relaxation parameters T1 and T2 for measurement of human myocardial iron.
METHODS: Twelve hearts were examined from transfusion-dependent patients: 11 with end-stage heart failure, either following death (n=7) or cardiac transplantation (n=4), and 1 heart from a patient who died from a stroke with no cardiac iron loading. Ex-vivo R1 and R2 measurements (R1=1/T1 and R2=1/T2) at 1.5 Tesla were compared with myocardial iron concentration measured using inductively coupled plasma atomic emission spectroscopy.
RESULTS: From a single myocardial slice in formalin which was repeatedly examined, a modest decrease in T2 was observed with time, from mean (± SD) 23.7 ± 0.93 ms at baseline (13 days after death and formalin fixation) to 18.5 ± 1.41 ms at day 566 (p<0.001). Raw T2 values were therefore adjusted to correct for this fall over time. Myocardial R2 was correlated with iron concentration [Fe] (R² = 0.566, p<0.001), but the correlation was stronger between LnR2 and Ln[Fe] (R² = 0.790, p<0.001). The relation between T2 (ms) and myocardial iron (mg/g dry weight) was [Fe] = 5081·(T2)^(-2.22). Analysis of T1 proved challenging with a dichotomous distribution of T1, with very short T1 (mean 72.3 ± 25.8 ms) that was independent of iron concentration in all hearts stored in formalin for greater than 12 months. In the remaining hearts stored for <10 weeks prior to scanning, LnR1 and iron concentration were correlated but with marked scatter (R² = 0.517, p<0.001). A linear relationship was present between T1 and T2 in the hearts stored for a short period (R² = 0.657, p<0.001).
CONCLUSION: Myocardial T2 correlates well with myocardial iron concentration, which raises the possibility that T2 may provide additive information to T2* for patients with myocardial siderosis. However, ex-vivo T1 measurements are less reliable due to the severe chemical effects of formalin on T1 shortening, and therefore T1 calibration may only be practical from in-vivo human studies.
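The fitted power-law calibration above can be applied directly. A minimal sketch, assuming the study's relation [Fe] = 5081·(T2)^(-2.22) with T2 in ms and iron in mg/g dry weight; the function name is illustrative, not from the paper:

```python
def iron_from_t2(t2_ms: float) -> float:
    """Estimate myocardial iron (mg/g dry weight) from ex-vivo T2 (ms),
    using the study's fitted relation [Fe] = 5081 * T2^(-2.22).
    Illustrative helper only; valid within the calibrated T2 range."""
    if t2_ms <= 0:
        raise ValueError("T2 must be positive")
    return 5081.0 * t2_ms ** -2.22

# Shorter T2 implies heavier iron loading: a T2 of ~20 ms maps to
# roughly 6.6 mg/g, while ~10 ms maps to a substantially higher value.
```

Because the fit was linear between LnR2 and Ln[Fe], the relation is a power law, so the estimate is very sensitive to small T2 errors at short T2.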
Methods to estimate access to care and the effect of interventions on the outcomes of congenital disorders
In the absence of intervention, early-onset congenital disorders lead to pregnancy loss, early death, or disability. Currently, a lack of epidemiological data from many settings limits understanding of the burden of these conditions, thus impeding health planning, policy-making, and commensurate resource allocation. The Modell Global Database of Congenital Disorders (MGDb) seeks to meet this need by combining general biological principles with observational and demographic data to generate estimates of the burden of congenital disorders. A range of interventions along the life course can modify adverse outcomes associated with congenital disorders. Hence, access to and quality of services available for the prevention and care of congenital disorders affect both their birth prevalence and the outcomes for affected individuals. This information is therefore important for enabling burden estimates in settings with limited observational data, but is lacking from many settings. This paper, the third in this special issue on methods used in the MGDb for estimating the global burden of congenital disorders, describes key interventions that affect the outcomes of congenital disorders and the methods used to estimate their coverage where empirical data are not available.
The efficacy of iron chelator regimes in reducing cardiac and hepatic iron in patients with thalassaemia major: a clinical observational study
BACKGROUND: Available iron chelation regimes in thalassaemia may achieve different changes in cardiac and hepatic iron as assessed by MR. The aim of this study was to assess the efficacy of four available iron chelator regimes in 232 thalassaemia major patients by assessing the rate of change in repeated measurements of cardiac and hepatic MR.
RESULTS: For the heart, deferiprone and the combination of deferiprone and deferoxamine significantly reduced cardiac iron at all levels of iron loading. Because patients had been on deferasirox for a shorter time, a second analysis ("initial interval analysis") assessed the change between the first two recorded MR results for both cardiac and hepatic iron (minimum interval 12 months). Combination therapy achieved the most rapid fall in cardiac iron load at all levels, and deferiprone alone was significantly effective with moderate and mild iron load. In the liver, deferasirox effected significant falls in iron load, and combination therapy resulted in the most rapid decline.
CONCLUSION: With knowledge of the efficacy of the different available regimes and of the specific iron load in the heart and the liver, appropriate tailoring of chelation therapy should allow clearance of iron. Combination therapy is best at reducing both cardiac and hepatic iron, while monotherapy with deferiprone or deferasirox is effective in the heart and liver respectively. The outcomes of this study may help physicians choose the chelation regimen to prescribe according to the levels of iron load found in the heart and liver by MR.
From Household Size to the Life Course
Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/66696/2/10.1177_000276427702100207.pd
Comparison of 3 T and 1.5 T for T2* magnetic resonance of tissue iron.
BACKGROUND: T2* magnetic resonance of tissue iron concentration has improved the outcome of transfusion-dependent anaemia patients. Clinical evaluation is performed at 1.5 T, but scanners operating at 3 T are increasing in number. There is a paucity of data on the relative merits of iron quantification at 3 T vs 1.5 T. METHODS: A total of 104 transfusion-dependent anaemia patients and 20 normal volunteers were prospectively recruited to undergo cardiac and liver T2* assessment at both 1.5 T and 3 T. Intra-observer, inter-observer and inter-study reproducibility analyses were performed on 20 randomly selected patients for cardiac and liver T2*. RESULTS: Association between heart and liver T2* at 1.5 T and 3 T was non-linear with good fit (R² = 0.954, p < 0.001 for heart white-blood (WB) imaging; R² = 0.931, p < 0.001 for heart black-blood (BB) imaging; R² = 0.993, p < 0.001 for liver imaging). R2* approximately doubled between 1.5 T and 3 T with linear fits for both heart and liver (94, 94 and 105 % respectively). Coefficients of variation for intra- and inter-observer reproducibility, as well as inter-study reproducibility, tended to be worse at 3 T (3.5 to 6.5 %) than at 1.5 T (1.4 to 5.7 %) for both heart and liver T2*. Artefact scores for the heart were significantly worse with the 3 T BB sequence (median 4, IQR 2-5) compared with the 1.5 T BB sequence (median 4, IQR 3-5; p = 0.007). CONCLUSION: Heart and liver T2* and R2* at 3 T show close association with 1.5 T values, but there were more artefacts at 3 T and trends to lower reproducibility, causing difficulty in quantifying low T2* values with high tissue iron. Therefore T2* imaging at 1.5 T remains the gold standard for clinical practice. However, in centres where only 3 T is available, equivalent values at 1.5 T may be approximated by halving the 3 T tissue R2* with subsequent conversion to T2*.
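The conversion described in the conclusion can be sketched numerically. A minimal example, assuming the abstract's approximation that R2* (= 1/T2*) roughly doubles from 1.5 T to 3 T; the function name is illustrative:

```python
def t2star_1p5t_from_3t(t2star_3t_ms: float) -> float:
    """Approximate the equivalent 1.5 T T2* (ms) from a 3 T measurement.
    Per the study, tissue R2* roughly doubles from 1.5 T to 3 T, so we
    halve the 3 T R2* and convert back to T2*. Illustrative only; the
    fitted scaling factors were 94-105 %, not exactly 100 %."""
    if t2star_3t_ms <= 0:
        raise ValueError("T2* must be positive")
    r2star_3t = 1.0 / t2star_3t_ms   # R2* at 3 T, in 1/ms
    r2star_1p5t = r2star_3t / 2.0    # halve R2* per the approximation
    return 1.0 / r2star_1p5t         # back to T2* (= 2 x the 3 T T2*)
```

Note that halving R2* is algebraically the same as doubling T2*, e.g. a measured 3 T T2* of 10 ms corresponds to an approximate 1.5 T T2* of 20 ms.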
Interstellar MHD Turbulence and Star Formation
This chapter reviews the nature of turbulence in the Galactic interstellar medium (ISM) and its connections to the star formation (SF) process. The ISM is turbulent, magnetized, self-gravitating, and is subject to heating and cooling processes that control its thermodynamic behavior. The turbulence in the warm and hot ionized components of the ISM appears to be trans- or subsonic, and thus to behave nearly incompressibly. However, the neutral warm and cold components are highly compressible, as a consequence of both thermal instability in the atomic gas and of moderately-to-strongly supersonic motions in the roughly isothermal cold atomic and molecular components. Within this context, we discuss: i) the production and statistical distribution of turbulent density fluctuations in both isothermal and polytropic media; ii) the nature of the clumps produced by thermal instability, noting that, contrary to classical ideas, they in general accrete mass from their environment; iii) the density-magnetic field correlation (or lack thereof) in turbulent density fluctuations, as a consequence of the superposition of the different wave modes in the turbulent flow; iv) the evolution of the mass-to-magnetic flux ratio (MFR) in density fluctuations as they are built up by dynamic compressions; v) the formation of cold, dense clouds aided by thermal instability; vi) the expectation that star-forming molecular clouds are likely to be undergoing global gravitational contraction, rather than being near equilibrium; and vii) the regulation of the star formation rate (SFR) in such gravitationally contracting clouds by stellar feedback which, rather than keeping the clouds from collapsing, evaporates and disperses them while they collapse.
Comment: 43 pages. Invited chapter for the book "Magnetic Fields in Diffuse Media", edited by Elisabete de Gouveia dal Pino and Alex Lazarian. Revised as per referee's recommendation.
Patients with primary immunodeficiencies are a reservoir of poliovirus and a risk to polio eradication
Immunodeficiency-associated vaccine-derived polioviruses (iVDPVs) have been isolated from primary immunodeficiency (PID) patients exposed to oral poliovirus vaccine (OPV). Patients may excrete poliovirus strains for months or years; the excreted viruses are frequently highly divergent from the parental OPV and have been shown to be as neurovirulent as wild virus. Thus, these patients represent a potential reservoir for transmission of neurovirulent polioviruses in the post-eradication era. In support of WHO recommendations to better estimate the prevalence of poliovirus excreters among PIDs and characterize the genetic evolution of these strains, 635 patients, including 570 with primary antibody deficiencies and 65 with combined immunodeficiencies, were studied from 13 OPV-using countries. Two stool samples were collected over 4 days and tested for enterovirus, and the poliovirus-positive samples were sequenced. Thirteen patients (2%) excreted polioviruses, most for less than 2 months following identification of infection. Five (0.8%) were classified as iVDPVs (only in combined immunodeficiencies and mostly poliovirus serotype 2). Non-polio enteroviruses were detected in 30 patients (4.7%). Patients with combined immunodeficiencies had an increased risk of delayed poliovirus clearance compared to those with primary antibody deficiencies. Usually, iVDPV was detected in subjects with combined immunodeficiencies within a short period after OPV exposure, most for less than 6 months. Surveillance for poliovirus excretion among PID patients should be reinforced until polio eradication is certified and the use of OPV is stopped. Survival rates among PID patients are improving in low- and middle-income countries, and iVDPV excreters are identified more frequently. Antivirals or enhanced immunotherapies presently in development represent the only potential means to manage the treatment of prolonged excreters and the risk they present to the polio endgame.
Keywords: Poliovirus eradication, Immunodeficiency-associated vaccine-derived polioviruses, Oral poliovirus vaccine, Humoral immunodeficiency, Combined immunodeficiency, Primary immunodeficiency