3,617 research outputs found
DNMTs are required for delayed genome instability caused by radiation
This is an open-access article licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License. The article may be redistributed, reproduced, and reused for non-commercial purposes, provided the original source is properly cited. Copyright © 2012 Landes Bioscience. The ability of ionizing radiation to initiate genomic instability has been harnessed in the clinic, where the localized delivery of controlled doses of radiation is used to induce cell death in tumor cells. Though radiotherapy is very effective, tumor relapse can occur in vivo, and its appearance has been attributed to the radio-resistance of cells with stem cell-like features. The molecular mechanisms underlying these phenomena are unclear, but there is evidence suggesting an inverse correlation between radiation-induced genomic instability and global hypomethylation. To further investigate the relationship between DNA hypomethylation, radiosensitivity, and genomic stability in stem-like cells, we studied mouse embryonic stem cells containing differing levels of DNA methylation due to the presence or absence of DNA methyltransferases (DNMTs). Unexpectedly, we found that global levels of methylation do not determine radiosensitivity. In particular, radiation-induced delayed genomic instability was observed at the Hprt gene locus only in wild-type cells. Furthermore, absence of Dnmt1 resulted in a 10-fold increase in the de novo Hprt mutation rate, which was unaltered by radiation. Our data indicate that functional DNMTs are required for radiation-induced genomic instability and that individual DNMTs play distinct roles in genome stability. We propose that DNMTs may contribute to the acquisition of radio-resistance in stem-like cells. This study is funded by NOTE, BBSRC and the Royal Society Dorothy Hodgkin Research Fellowship
LV Mass Assessed by Echocardiography and CMR, Cardiovascular Outcomes, and Medical Practice
The authors investigated 3 important areas related to the clinical use of left ventricular mass (LVM): accuracy of assessments by echocardiography and cardiac magnetic resonance (CMR), the ability to predict cardiovascular outcomes, and the comparative value of different indexing methods. The recommended formula for echocardiographic estimation of LVM uses linear measurements and is based on the assumption of the left ventricle (LV) as a prolate ellipsoid of revolution. CMR permits a modeling of the LV free of cardiac geometric assumptions or acoustic window dependency, showing better accuracy and reproducibility. However, echocardiography has lower cost, easier availability, and better tolerability. From the MEDLINE database, 26 longitudinal echocardiographic studies and 5 CMR studies investigating LVM or LV hypertrophy as predictors of death or major cardiovascular outcomes were identified. LVM and LV hypertrophy were reliable cardiovascular risk predictors using both modalities. However, no study directly compared the methods for the ability to predict events, agreement in hypertrophy classification, or performance in cardiovascular risk reclassification. Indexing LVM to body surface area was the earliest normalization process used, but it seems to underestimate the prevalence of hypertrophy in obese and overweight subjects. Dividing LVM by height to the allometric power of 1.7 or 2.7 is the most promising normalization method in terms of practicality and usefulness from a clinical and scientific standpoint for scaling myocardial mass to body size. The measurement of LVM, calculation of LVM index, and classification for LV hypertrophy should be standardized by scientific societies across measurement techniques and adopted by clinicians in risk stratification and therapeutic decision making
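The linear-measurement estimate and the allometric indexing described above can be sketched numerically. This is a minimal illustration, assuming the standard ASE "cube" formula with the Devereux correction (which embodies the prolate-ellipsoid assumption the abstract mentions); the measurement values are made up for the example and carry no clinical meaning.

```python
# Sketch: echocardiographic LV mass via the ASE cube formula (Devereux
# correction) and allometric indexing to height^2.7. Input values below are
# illustrative only.

def lv_mass_grams(ivsd_cm: float, lvidd_cm: float, pwtd_cm: float) -> float:
    """LV mass from end-diastolic linear measurements (prolate-ellipsoid model).

    ivsd_cm: interventricular septal thickness, lvidd_cm: LV internal
    diameter, pwtd_cm: posterior wall thickness, all in cm at end-diastole.
    """
    cube = (ivsd_cm + lvidd_cm + pwtd_cm) ** 3 - lvidd_cm ** 3
    return 0.8 * (1.04 * cube) + 0.6  # 1.04 g/mL is myocardial density

def lvm_index_height(lvm_g: float, height_m: float, power: float = 2.7) -> float:
    """Allometric indexing: LVM / height^power (power 1.7 or 2.7 in the text)."""
    return lvm_g / height_m ** power

lvm = lv_mass_grams(ivsd_cm=1.0, lvidd_cm=4.8, pwtd_cm=1.0)
print(round(lvm, 1), "g")                          # -> 170.2 g
print(round(lvm_index_height(lvm, 1.70), 1), "g/m^2.7")  # -> 40.6 g/m^2.7
```

Indexing to height rather than body surface area avoids the underestimation of hypertrophy in obese subjects that the review describes, since body surface area itself grows with adiposity.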
Left ventricle mass by cardiac magnetic resonance and echocardiography: the multi-ethnic study of atherosclerosis
Understanding evidence: a statewide survey to explore evidence-informed public health decision-making in a local government setting
Background: The value placed on types of evidence within decision-making contexts is highly dependent on individuals, the organizations in which they work, and the systems and sectors they operate in. Decision-making processes, too, are highly contextual. Understanding the values placed on evidence and the processes guiding decision-making is crucial to designing strategies to support evidence-informed decision-making (EIDM). This paper describes how evidence is used to inform local government (LG) public health decisions. Methods: The study used mixed methods, including a cross-sectional survey and interviews. The Evidence-Informed Decision-Making Tool (EvIDenT) survey was designed to assess three key domains likely to impact on EIDM: access, confidence, and organizational culture. Other elements included the usefulness and influence of sources of evidence (people/groups and resources), skills, and barriers and facilitators to EIDM. Forty-five LGs from Victoria, Australia agreed to participate in the survey, and up to four people from each organization were invited to complete it (n = 175). To further explore definitions of evidence and generate experiential data on EIDM practice, key informant interviews were conducted with a range of LG employees working in areas relevant to public health. Results: In total, 135 responses were received (75% response rate) and 13 interviews were conducted. Analysis revealed varying levels of access, confidence, and organizational culture to support EIDM. Significant relationships were found between the domains: confidence, culture, and access to research evidence. Some forms of evidence (e.g. community views) appeared to be used more commonly, at the expense of others (e.g. research evidence). Overall, a mixture of evidence (but more internal than external evidence) was influential in public health decision-making in councils. 
By comparison, a mixture of evidence (but more external than internal evidence) was deemed to be useful in public health decision-making.Conclusions: This study makes an important contribution to understanding how evidence is used within the public health LG context
Boron Isotope Effect in Superconducting MgB₂
We report the preparation method of, and boron isotope effect for, MgB₂, a new binary intermetallic superconductor with a remarkably high superconducting transition temperature, Tc(¹⁰B) = 40.2 K. Measurements of both temperature-dependent magnetization and specific heat reveal a 1.0 K shift in Tc between Mg¹⁰B₂ and Mg¹¹B₂. Whereas such a high transition temperature might imply exotic coupling mechanisms, the boron isotope effect in MgB₂ is consistent with the material being a phonon-mediated BCS superconductor. Comment: One figure and related discussion added
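The reported 1.0 K shift translates directly into a partial isotope-effect exponent, α_B = -d ln Tc / d ln M_B. A back-of-envelope estimate, assuming Tc(¹⁰B) = 40.2 K and Tc(¹¹B) = 39.2 K from the abstract's numbers and standard isotope masses (the paper's exact inputs may differ slightly):

```python
import math

# Rough estimate of the boron isotope-effect exponent
#   alpha_B = -d ln Tc / d ln M_B
# from the 1.0 K shift between Mg(10)B2 and Mg(11)B2 quoted in the abstract.
m10, m11 = 10.013, 11.009    # atomic masses of 10B and 11B (u)
tc10, tc11 = 40.2, 39.2      # transition temperatures (K), per the abstract

alpha_B = -math.log(tc11 / tc10) / math.log(m11 / m10)
print(round(alpha_B, 2))  # -> 0.27
```

A simple-element BCS superconductor has a total exponent near 0.5; a partial exponent of roughly 0.27 for boron alone is still sizeable, which is why the shift supports phonon-mediated pairing.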
J/ψ Production, Polarization and Color Fluctuations
The hard contributions to the heavy quarkonium-nucleon cross sections are calculated based on the QCD factorization theorem and the nonrelativistic quarkonium model. We evaluate the nonperturbative part of these cross sections, which dominates at CERN Super Proton Synchrotron (SPS) energies and becomes a correction at CERN Large Hadron Collider (LHC) energies. J/ψ production at the CERN SPS is well described by hard QCD when the larger absorption cross sections of the charmonium states predicted by QCD are taken into account. We also predict the polarization of these states. The expansion of small wave packets is discussed. Comment: 13 pages, REVTeX, 1 table, 2 PostScript figures, corrected some typos
Exploring AI Futures Through Role Play
We present an innovative methodology for studying and teaching the impacts of AI through a role play game. The game serves two primary purposes: 1) training AI developers and AI policy professionals to reflect on and prepare for future social and ethical challenges related to AI, and 2) exploring possible futures involving AI technology development, deployment, social impacts, and governance. While the game currently focuses on the interrelations between short-, mid- and long-term impacts of AI, it has the potential to be adapted for a broad range of scenarios, exploring issues of AI policy research in greater depth and affording training within organizations. The game presented here has undergone two years of development and has been tested through over 30 events involving between 3 and 70 participants. The game is under active development, but preliminary findings suggest that role play is a promising methodology both for exploring AI futures and for training individuals and organizations to think about, and reflect on, the impacts of AI and the strategic mistakes that can be avoided today. Comment: Accepted to AIE
The AM Canum Venaticorum binary SDSS J173047.59+554518.5
The AM Canum Venaticorum (AM CVn) binaries are a rare group of hydrogen-deficient, ultrashort period, mass-transferring white dwarf binaries and are possible progenitors of Type Ia supernovae. We present time-resolved spectroscopy of the recently discovered AM CVn binary SDSS J173047.59+554518.5. The average spectrum shows strong double-peaked helium emission lines, as well as a variety of metal lines, including neon; this is the second detection of neon in an AM CVn binary, after the much brighter system GP Com. We detect no calcium in the accretion disc, a puzzling feature that has been noted in many of the longer period AM CVn binaries. We measure an orbital period, from the radial velocities of the emission lines, of 35.2 ± 0.2 min, confirming the ultracompact binary nature of the system. The emission lines seen in SDSS J1730 are very narrow, although double-peaked, implying a low-inclination, face-on accretion disc; using the measured velocities of the line peaks, we estimate i ≤ 11°. This low inclination makes SDSS J1730 an excellent system for the identification of emission lines
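Measuring an orbital period from emission-line radial velocities, as done above, amounts to fitting a sinusoid at trial periods and keeping the best fit. A minimal sketch on synthetic data (the 35.2 min period matches the abstract, but the times, velocities, and noise level below are invented for illustration, not the SDSS J1730 data):

```python
import numpy as np

# Sketch: recover an orbital period from radial velocities by least-squares
# sine fitting over a grid of trial periods. Synthetic data with a 35.2 min
# period; amplitude and noise values are arbitrary.
rng = np.random.default_rng(0)
p_true = 35.2                               # minutes
t = np.sort(rng.uniform(0, 300, 60))        # observation times (min)
v = 50 * np.sin(2 * np.pi * t / p_true) + rng.normal(0, 5, t.size)

def chi2(period):
    # At fixed period, v = a*sin(wt) + b*cos(wt) + c is linear in (a, b, c),
    # so solve it by linear least squares and return the residual sum of squares.
    w = 2 * np.pi / period
    A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    resid = v - A @ np.linalg.lstsq(A, v, rcond=None)[0]
    return np.sum(resid ** 2)

periods = np.arange(20.0, 60.0, 0.01)
best = periods[np.argmin([chi2(p) for p in periods])]
print(f"best-fit period: {best:.1f} min")
```

In practice the quoted ±0.2 min uncertainty would come from the curvature of the chi-squared minimum (or a bootstrap), and aliases from the nightly sampling pattern must be checked before a grid minimum is trusted.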
Global Warming: Forecasts by Scientists versus Scientific Forecasts
In 2007, the Intergovernmental Panel on Climate Change's Working Group One, a panel of experts established by the World Meteorological Organization and the United Nations Environment Programme, issued its Fourth Assessment Report. The Report included predictions of dramatic increases in average world temperatures over the next 92 years and serious harm resulting from the predicted temperature increases. Using forecasting principles as our guide, we asked: Are these forecasts a good basis for developing public policy? Our answer is "no". To provide forecasts of climate change that are useful for policy-making, one would need to forecast (1) global temperature, (2) the effects of any temperature changes, and (3) the effects of feasible alternative policies. Proper forecasts of all three are necessary for rational policy-making. The IPCC WG1 Report was regarded as providing the most credible long-term forecasts of global average temperatures by 31 of the 51 scientists and others involved in forecasting climate change who responded to our survey. We found no references in the 1056-page Report to the primary sources of information on forecasting methods, despite the fact that these are conveniently available in books, articles, and websites. We audited the forecasting processes described in Chapter 8 of the IPCC's WG1 Report to assess the extent to which they complied with forecasting principles. We found enough information to make judgments on 89 out of a total of 140 forecasting principles. The forecasting procedures that were described violated 72 principles. Many of the violations were, by themselves, critical. The forecasts in the Report were not the outcome of scientific procedures. In effect, they were the opinions of scientists transformed by mathematics and obscured by complex writing. Research on forecasting has shown that experts' predictions are not useful in situations involving uncertainty and complexity. 
We have been unable to identify any scientific forecasts of global warming. Claims that the Earth will get warmer have no more credence than saying that it will get colder