Achieving Greater Homeland Security: Who Should Pay, and How?
The terrorist attacks of September 11, 2001 in the United States broadened the public sector's role in providing "protective goods and services" to include homeland security in addition to national security and public safety (e.g., police and fire protection). Although the federal government is acknowledged to have a clear responsibility for taking the lead in shaping homeland security policies, providing greater homeland security involves significant participation by state and local governments and the private sector, in addition to the national government. This paper briefly summarizes why private markets are likely to under-invest in homeland security, leading to a need for public action. It provides an overview of the range of budgetary and non-budgetary tools used by the United States (U.S.) federal government to "finance" the provision of homeland security, and of the budgetary and non-budgetary costs of these federal actions. The paper also identifies and discusses some of the principal challenges in ensuring that federal homeland security dollars are "well-spent." While the analysis focuses on the United States, the tools of government employed are applicable to all sovereign nations.
Summer 1978
Nature of the Organic Coating on Sand Grains of Nonwettable Golf Greens (page 3); Massachusetts Turfgrass Winter Injury (page 6); Investigation and Treatment of Localized Dry Spots on Sand Golf Greens (page 8); A Recreation Renaissance (page 15); UMass Turfgrass Research Fund (page 20)
The Hydrodynamics Laboratory of the California Institute of Technology
This paper presents a description of the Hydrodynamics Laboratory and its principal pieces of equipment that have been developed during the last five years. The field of investigations to be undertaken by the Laboratory is presented in general terms.
Electrophysiological effects of 5-hydroxytryptamine on isolated human atrial myocytes, and the influence of chronic beta-adrenoceptor blockade
<b>1.</b> 5-Hydroxytryptamine (5-HT) has been postulated to play a proarrhythmic role in the human atria via stimulation of 5-HT<sub>4</sub> receptors.
<b>2.</b> The aims of this study were to examine the effects of 5-HT on the L-type Ca<sup>2+</sup> current (<i>I</i><sub>CaL</sub>), action potential duration (APD), effective refractory period (ERP) and arrhythmic activity in human atrial cells, and to assess the effects of prior treatment with β-adrenoceptor antagonists.
<b>3.</b> Isolated myocytes, from the right atrial appendage of 27 consenting patients undergoing cardiac surgery who were in sinus rhythm, were studied using the whole-cell perforated patch-clamp technique at 37°C.
<b>4.</b> 5-HT (1 nM–10 μM) caused a concentration-dependent increase in <i>I</i><sub>CaL</sub>, which was potentiated in cells from β-blocked (maximum response to 5-HT, E<sub>max</sub>=299±12% increase above control) compared to non-β-blocked patients (E<sub>max</sub>=220±6%, P<0.05), but with no change in either the potency (log EC<sub>50</sub>: -7.09±0.07 vs -7.26±0.06) or Hill coefficient (<i>n</i><sub>H</sub>: 1.5±0.6 vs 1.5±0.3) of the 5-HT concentration-response curve.
<b>5.</b> 5-HT (10 μM) produced a greater increase in the APD at 50% repolarisation (APD<sub>50</sub>) in cells from β-blocked patients (of 37±10 ms, i.e. 589±197%) vs non-β-blocked patients (of 10±4 ms, i.e. 157±54%; P<0.05). Both the APD<sub>90</sub> and the ERP were unaffected by 5-HT.
<b>6.</b> Arrhythmic activity was observed in response to 5-HT in five of 17 cells (29%) studied from β-blocked, compared to zero of 16 cells from the non-β-blocked patients (P<0.05).
<b>7.</b> In summary, the 5-HT-induced increase in calcium current was associated with a prolonged early plateau phase of repolarisation, but not late repolarisation or refractoriness, and the enhancement of these effects by chronic β-adrenoceptor blockade was associated with arrhythmic potential.
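The concentration-response parameters reported above (Emax, log EC50, Hill coefficient nH) define a standard Hill curve. As an illustrative sketch only, not the authors' analysis code, the fitted curve for the β-blocked group can be evaluated like this (parameter values taken from point 4 above):

```python
def hill_response(conc_m, e_max, log_ec50, n_h):
    """Percentage increase in I_CaL over control, from the Hill equation.

    conc_m   : agonist concentration in molar
    e_max    : maximal response (% increase above control)
    log_ec50 : log10 of the half-maximal concentration (molar)
    n_h      : Hill coefficient
    """
    ec50 = 10.0 ** log_ec50
    return e_max * conc_m ** n_h / (ec50 ** n_h + conc_m ** n_h)

# Beta-blocked group: Emax = 299%, log EC50 = -7.09, nH = 1.5
for conc in (1e-9, 1e-8, 1e-7, 1e-6, 1e-5):
    print(f"{conc:.0e} M -> {hill_response(conc, 299.0, -7.09, 1.5):.0f}% above control")
```

At the top of the tested range (10 μM), the curve is essentially saturated, consistent with the reported Emax.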
Inferences Drawn from a Risk Assessment Compared Directly with a Randomized Trial of a Home Drinking Water Intervention
Risk assessments and intervention trials have been used by the U.S. Environmental Protection Agency to estimate drinking water health risks. Seldom are both methods used concurrently. Between 2001 and 2003, illness data from a trial were collected simultaneously with exposure data, providing a unique opportunity to compare direct risk estimates of waterborne disease from the intervention trial with indirect estimates from a risk assessment. Comparing the group with water treatment (active) with that without water treatment (sham), the estimated annual attributable disease rate (cases per 10,000 persons per year) from the trial provided no evidence of a significantly elevated drinking water risk [attributable risk = −365 cases/year, sham minus active; 95% confidence interval (CI), −2,555 to 1,825]. The predicted mean disease rate per 10,000 person-years from the risk assessment was 13.9 (2.5, 97.5 percentiles: 1.6, 37.7) assuming 4 log removal of viruses by disinfection and 5.5 (2.5, 97.5 percentiles: 1.4, 19.2) assuming 6 log removal. Risk assessments are important under conditions of low risk, when estimates are difficult to attain from trials. In particular, this assessment pointed toward the importance of attaining site-specific treatment data and the clear need for a better understanding of viral removal by disinfection. Trials provide direct risk estimates, and the upper confidence limits, even if not statistically significant, are informative about possible upper estimates of likely risk. These differences suggest that conclusions about waterborne disease risk may be strengthened by the joint use of these two approaches.
Antarctic Water Tracks: Microbial Community Responses to Variation in Soil Moisture, pH, and Salinity
Ice-free soils in the McMurdo Dry Valleys select for taxa able to cope with challenging environmental conditions, including extreme chemical water activity gradients, freeze-thaw cycling, desiccation, and solar radiation regimes. The low biotic complexity of Dry Valley soils makes them well suited to investigate environmental and spatial influences on bacterial community structure. Water tracks are annually wetted habitats in the cold-arid soils of Antarctica that form briefly each summer with moisture sourced from snow melt, ground ice thaw, and atmospheric deposition via deliquescence and vapor flow into brines. Compared to neighboring arid soils, water tracks are highly saline and relatively moist habitats. They represent a considerable area (∼5–10 km²) of the Dry Valley terrestrial ecosystem, an area that is expected to increase with ongoing climate change. The goal of this study was to determine how variation in the environmental conditions of water tracks influences the composition and diversity of microbial communities. We found significant differences in microbial community composition between on- and off-water track samples, and across two distinct locations. Of the tested environmental variables, soil salinity was the best predictor of community composition, with members of the Bacteroidetes phylum being relatively more abundant at higher salinities and the Actinobacteria phylum showing the opposite pattern. There was also a significant, inverse relationship between salinity and bacterial diversity. Our results suggest water track formation significantly alters dry soil microbial communities, likely influencing subsequent ecosystem functioning. We highlight how Dry Valley water tracks could be a useful model system for understanding the potential habitability of transiently wetted environments found on the surface of Mars.
Emergency Department Blood Pressure Treatment and Outcomes in Adults Presenting with Severe Hypertension
Background: Patients who present to the emergency department (ED) with severe hypertension, defined as a systolic blood pressure (SBP) ≥180 millimeters of mercury (mm Hg) or a diastolic blood pressure (DBP) ≥120 mm Hg without evidence of acute end-organ damage, are often deemed high risk and treated acutely in the ED. However, there is a dearth of evidence from large studies with long-term follow-up for the assessment of major adverse cardiovascular events (MACE). We conducted the largest study to date of patients presenting with severe hypertension to identify predictors of MACE and to examine whether blood pressure at discharge is associated with heightened risk.
Methods: We enrolled ED patients with an SBP of 180–220 mm Hg but without signs of end-organ damage and followed them for one year. The primary outcome was MACE within one year of discharge. Secondarily, we performed a propensity-matched analysis to test whether an SBP ≤160 mm Hg at discharge was associated with reduced MACE at 30 days.
Results: A total of 12,044 patients were enrolled, of whom 1,865 (15.5%) experienced MACE within one year. Older age, male gender, history of cardiovascular disease, cerebrovascular disease, diabetes, smoking, presentation with chest pain, altered mental status, dyspnea, treatment with intravenous or oral hydralazine, and oral metoprolol were independent predictors of one-year MACE. Additionally, discharge with an SBP ≤160 mm Hg was not associated with 30-day MACE-free survival after propensity matching (hazard ratio 0.99, 95% confidence interval 0.78–1.25, P = 0.92).
Conclusion: One-year MACE was relatively common in our cohort of ED patients with severe hypertension without acute end-organ damage. However, discharge blood pressure was not associated with 30-day or one-year MACE, suggesting that BP reduction in and of itself is not beneficial in such patients.
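The propensity-matched comparison described in the Methods pairs each "treated" patient (discharged with SBP ≤160 mm Hg) to a control with a similar estimated probability of treatment given their covariates. A minimal sketch of that workflow on synthetic data (hypothetical covariates and effect sizes, not the study's actual model or cohort):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cohort: age and diabetes influence both "treatment"
# (discharge SBP <= 160) and the 30-day MACE outcome.
n = 2000
age = rng.normal(60, 10, n)
diabetes = rng.binomial(1, 0.3, n)
x = np.column_stack([np.ones(n), (age - 60) / 10, diabetes])
treat = rng.binomial(1, 1 / (1 + np.exp(-(0.2 + 0.3 * x[:, 1] + 0.4 * diabetes))))
mace = rng.binomial(1, 1 / (1 + np.exp(-(-2.5 + 0.5 * x[:, 1] + 0.6 * diabetes))))

# Fit a logistic propensity model P(treat | covariates) by gradient ascent.
w = np.zeros(3)
for _ in range(500):
    p = 1 / (1 + np.exp(-x @ w))
    w += 0.1 * x.T @ (treat - p) / n
prop = 1 / (1 + np.exp(-x @ w))

# 1:1 nearest-neighbour matching on the propensity score, without replacement.
treated = np.where(treat == 1)[0]
controls = list(np.where(treat == 0)[0])
k = min(len(treated), len(controls))
pairs = []
for i in treated[:k]:
    j = min(controls, key=lambda c: abs(prop[c] - prop[i]))
    pairs.append((i, j))
    controls.remove(j)

# Compare outcome rates within the matched sample.
rate_t = np.mean([mace[i] for i, _ in pairs])
rate_c = np.mean([mace[j] for _, j in pairs])
print(f"matched 30-day MACE: treated {rate_t:.3f} vs control {rate_c:.3f}")
```

The published analysis reports the matched comparison as a hazard ratio from time-to-event data rather than a simple rate difference; this sketch only illustrates the matching step itself.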
Muting, not fragmentation, of functional brain networks under general anesthesia
Changes in resting-state functional connectivity (rs-FC) under general anesthesia have been widely studied with the goal of identifying neural signatures of consciousness. This work has commonly revealed an apparent fragmentation of whole-brain network structure during unconsciousness, which has been interpreted as reflecting a breakdown in connectivity and a disruption of the brain's ability to integrate information. Here we show, by studying rs-FC under varying depths of isoflurane-induced anesthesia in nonhuman primates, that this apparent fragmentation, rather than reflecting an actual change in network structure, can be simply explained as the result of a global reduction in FC. Specifically, by comparing the actual FC data to surrogate data sets that we derived to test competing hypotheses of how FC changes as a function of dose, we found that increases in whole-brain modularity and in the number of network communities, considered hallmarks of fragmentation, are artifacts of constructing FC networks by thresholding based on correlation magnitude. Taken together, our findings suggest that deepening levels of unconsciousness are instead associated with the increasingly muted expression of functional networks, an observation that constrains current interpretations of how anesthesia-induced FC changes map onto existing neurobiological theories of consciousness.
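The thresholding artifact described above is easy to reproduce on a toy example: if a correlation matrix is scaled down globally (structure unchanged) but the binarization threshold is held fixed, the resulting graph appears to shatter into many components. A minimal sketch, not the paper's surrogate-data analysis, with an invented two-module FC matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy FC matrix: two modules of 20 nodes with strong within-module
# correlation (~0.5) plus weak off-module noise.
n = 40
fc = rng.normal(0, 0.05, (n, n))
fc[:20, :20] += 0.5
fc[20:, 20:] += 0.5
fc = (fc + fc.T) / 2
np.fill_diagonal(fc, 1.0)

def components_after_threshold(mat, thr):
    """Binarize at |r| >= thr and count connected components (union-find)."""
    adj = (np.abs(mat) >= thr) & ~np.eye(len(mat), dtype=bool)
    parent = list(range(len(mat)))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for i, j in zip(*np.where(adj)):
        parent[find(i)] = find(j)
    return len({find(i) for i in range(len(mat))})

# Model "anesthesia" as a purely global muting of FC: same network, scaled down.
awake = fc
anesthetized = 0.4 * fc

thr = 0.3
print("awake components:", components_after_threshold(awake, thr))
print("anesthetized components:", components_after_threshold(anesthetized, thr))
```

The muted matrix has exactly the same community structure, yet the fixed-magnitude threshold reports far more components, which is the kind of spurious "fragmentation" the abstract attributes to the thresholding step.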