Understanding the Costs and Benefits of Deepwater Oil Drilling Regulation
The purpose of this paper is to provide a conceptual framework for understanding how analysis of costs and benefits might be incorporated into an assessment of regulatory policies affecting deepwater drilling. We begin by providing a framework for analyzing the life-cycle impacts of oil drilling and its alternatives, including onshore drilling and importing oil from abroad. We then provide background estimates of the different sources of oil supplied in the United States, look at how other oil supply sources might respond to regulations on deepwater drilling, and consider the economic costs of these regulations. After providing a comprehensive description of the potential costs and benefits from various types of drilling—including, when possible, estimates of the magnitude of these benefits and costs—we discuss the extent to which these costs and benefits may already be taken into account (or reinforced) through the legal, regulatory, and tax systems and through market mechanisms. We conclude by presenting a framework and simple example of how a cost–benefit analysis might be used to inform regulation of deepwater drilling, and sum up the policy implications of our work.
Keywords: catastrophic oil spill, cost-benefit analysis, government regulation, liability
Finding the signal in the noise: Could social media be utilized for early hospital notification of multiple casualty events?
Introduction: Delayed notification and lack of early information hinder timely hospital-based activations in large-scale multiple casualty events. We hypothesized that Twitter real-time data would produce a unique and reproducible signal within minutes of multiple casualty events, and we investigated the timing of the signal compared with other hospital disaster notification mechanisms. Methods: Using disaster-specific search terms, all relevant tweets from the event to 7 days post-event were analyzed for 5 recent US-based multiple casualty events (Boston Bombing [BB], SF Plane Crash [SF], Napa Earthquake [NE], Sandy Hook [SH], and Marysville Shooting [MV]). Quantitative and qualitative analyses of tweet utilization were compared across events. Results: Over 3.8 million tweets were analyzed (SH 1.8 m, BB 1.1 m, SF 430 k, MV 250 k, NE 205 k). Peak tweets per minute ranged from 209 to 3,326. The mean followers per tweeter ranged from 3,382 to 9,992 across events. Retweets were tweeted a mean of 82 to 564 times per event. Tweets occurred very rapidly for all events (<2 min) and reached 1% of the total event-specific tweets within a median of 13 minutes of the first 911 calls. A 200 tweets/min threshold was reached fastest with NE (2 min), BB (7 min), and SF (18 min). If this threshold were utilized as a signaling mechanism to place local hospitals on standby for possible large-scale events, in all case studies this signal would have preceded patient arrival. Importantly, this signaling threshold would also have preceded traditional disaster notification mechanisms in SF and NE, and would have occurred simultaneously with them in BB and MV. Conclusions: Social media data are a powerful, predictable, and potentially important resource for optimizing disaster response. Further investigation is warranted to assess the utility of prospective signaling thresholds for hospital-based activation.
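The 200 tweets/min standby threshold described above can be sketched as a simple time-bucketing routine. A minimal illustration, assuming tweet timestamps are available as `datetime` objects; the function name and threshold default are ours, not the authors':

```python
from collections import Counter
from datetime import datetime, timedelta

def minutes_to_threshold(tweet_times, threshold=200):
    """Return the number of whole minutes from the first tweet until the
    per-minute tweet count first reaches `threshold`, or None if it never does."""
    t0 = min(tweet_times)
    # Bucket each timestamp into whole minutes elapsed since the first tweet.
    per_minute = Counter(int((t - t0).total_seconds() // 60) for t in tweet_times)
    for minute in sorted(per_minute):
        if per_minute[minute] >= threshold:
            return minute
    return None
```

In practice such a detector would run over a streaming, keyword-filtered feed and alert hospital disaster coordinators as soon as the threshold is crossed.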
Social networks : the future for health care delivery
With the rapid growth of online social networking for health, health care systems are experiencing an inescapable increase in complexity. This is not necessarily a drawback; self-organising, adaptive networks could become central to future health care delivery. This paper considers whether social networks composed of patients and their social circles can compete with, or complement, professional networks in assembling health-related information of value for improving health and health care. Using the framework of analysis of a two-sided network – patients and providers – with multiple platforms for interaction, we argue that the structure and dynamics of such a network have implications for future health care. Patients are using social networking to access and contribute health information. Among those living with chronic illness and disability and engaging with social networks, there is considerable expertise in assessing, combining and exploiting information. Social networking is providing a new landscape for patients to assemble health information, relatively free from the constraints of traditional health care. However, health information from social networks currently complements traditional sources rather than substituting for them. Networking among health care provider organisations is enabling greater exploitation of health information for health care planning. The platforms of interaction are also changing. Patient-doctor encounters are now more permeable to influence from social networks and professional networks. Diffuse and temporary platforms of interaction enable discourse between patients and professionals, and include platforms controlled by patients. We argue that social networking has the potential to change patterns of health inequalities and access to health care, alter the stability of health care provision and lead to a reformulation of the role of health professionals.
Further research is needed to understand how network structure combined with its dynamics will affect the flow of information and potentially the allocation of health care resources.
Population Health Impact and Cost-Effectiveness of Tuberculosis Diagnosis with Xpert MTB/RIF: A Dynamic Simulation and Economic Evaluation
Background: The Xpert MTB/RIF test enables rapid detection of tuberculosis (TB) and rifampicin resistance. The World Health Organization recommends Xpert for initial diagnosis in individuals suspected of having multidrug-resistant TB (MDR-TB) or HIV-associated TB, and many countries are moving quickly toward adopting Xpert. As roll-out proceeds, it is essential to understand the potential health impact and cost-effectiveness of diagnostic strategies based on Xpert. Methods and findings: We evaluated potential health and economic consequences of implementing Xpert in five southern African countries—Botswana, Lesotho, Namibia, South Africa, and Swaziland—where drug resistance and TB-HIV coinfection are prevalent. Using a calibrated, dynamic mathematical model, we compared the status quo diagnostic algorithm, emphasizing sputum smear, against an algorithm incorporating Xpert for initial diagnosis. Results were projected over 10- and 20-y time periods starting from 2012. Compared to status quo, implementation of Xpert would avert 132,000 (95% CI: 55,000–284,000) TB cases and 182,000 (97,000–302,000) TB deaths in southern Africa over the 10 y following introduction, and would reduce prevalence by 28% (14%–40%) by 2022, with more modest reductions in incidence. Health system costs are projected to increase substantially with Xpert, at US$959 (633–1,485) per disability-adjusted life-year averted over 10 y. Across countries, cost-effectiveness ratios ranged up to US$1,257 (767–2,276) in Botswana. Assessing outcomes over a 10-y period focuses on the near-term consequences of Xpert adoption, but the cost-effectiveness results are conservative, with cost-effectiveness ratios assessed over a 20-y time horizon approximately 20% lower than the 10-y values. Conclusions: Introduction of Xpert could substantially change TB morbidity and mortality through improved case-finding and treatment, with more limited impact on long-term transmission dynamics.
Despite extant uncertainty about TB natural history and intervention impact in southern Africa, adoption of Xpert evidently offers reasonable value for its cost, based on conventional benchmarks for cost-effectiveness. However, the additional financial burden would be substantial, including significant increases in costs for treating HIV and MDR-TB. Given the fundamental influence of HIV on TB dynamics and intervention costs, care should be taken when interpreting the results of this analysis outside of settings with high HIV prevalence.
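The cost-effectiveness ratios quoted above are incremental ratios: the extra health-system cost of the Xpert-based algorithm divided by the DALYs it averts relative to the status quo. A minimal sketch of that arithmetic; the numbers in the example are hypothetical and not taken from the study:

```python
def icer(cost_new, cost_base, dalys_new, dalys_base):
    """Incremental cost-effectiveness ratio: additional cost per
    disability-adjusted life-year (DALY) averted by the new strategy."""
    extra_cost = cost_new - cost_base
    dalys_averted = dalys_base - dalys_new  # new strategy should accrue fewer DALYs
    if dalys_averted <= 0:
        raise ValueError("no DALYs averted; ICER is undefined")
    return extra_cost / dalys_averted

# Hypothetical illustration: $0.5M of extra spending averting 1,000 DALYs
# yields an ICER of $500 per DALY averted.
# icer(1_500_000, 1_000_000, 4_000, 5_000) -> 500.0
```

The ratio is then compared against a benchmark (e.g. a multiple of per-capita GDP) to judge whether the intervention offers reasonable value.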
Turning points: the personal and professional circumstances that lead academics to become middle managers
In the current higher education climate, there is a growing perception that the pressures associated with being an academic middle manager outweigh the perceived rewards of the position. This article investigates the personal and professional circumstances that lead academics to become middle managers by drawing on data from life history interviews undertaken with 17 male and female department heads from a range of disciplines, in a post-1992 UK university. The data suggest that experiencing conflict between personal and professional identities, manifested through different socialization experiences over time, can lead to a ‘turning point’ and a decision that affects a person’s career trajectory. Although the results of this study cannot be generalized, the findings may help other individuals and institutions move towards a firmer understanding of the academic who becomes head of department—in relation to theory, practice and research.
Lipid and metabolite profiles of human brain tumors by desorption electrospray ionization-MS
Examination of tissue sections using desorption electrospray ionization (DESI)-MS revealed phospholipid-derived signals that differ between gray matter, white matter, gliomas, meningiomas, and pituitary tumors, allowing their ready discrimination by multivariate statistics. A set of lower mass signals, some corresponding to oncometabolites, including 2-hydroxyglutaric acid and N-acetyl-aspartic acid, was also observed in the DESI mass spectra, and these data further assisted in discrimination between brain parenchyma and gliomas. The combined information from the lipid and metabolite MS profiles recorded by DESI-MS and explored using multivariate statistics allowed successful differentiation of gray matter (n = 223), white matter (n = 66), gliomas (n = 158), meningiomas (n = 111), and pituitary tumors (n = 154) from 58 patients. A linear discriminant model used to distinguish brain parenchyma and gliomas yielded an overall sensitivity of 97.4% and a specificity of 98.5%. Furthermore, a discriminant model was created for tumor types (i.e., glioma, meningioma, and pituitary), which were discriminated with an overall sensitivity of 99.4% and a specificity of 99.7%. Unsupervised multivariate statistics were used to explore the chemical differences between anatomical regions of brain parenchyma and secondary infiltration. Infiltration of gliomas into normal tissue can be detected by DESI-MS. One hurdle to implementation of DESI-MS intraoperatively is the need for tissue freezing and sectioning, which we address by analyzing smeared biopsy tissue. Tissue smears are shown to give the same chemical information as tissue sections, eliminating the need for sectioning before MS analysis. These results lay the foundation for implementation of intraoperative DESI-MS evaluation of tissue smears for rapid diagnosis.
Intraoperative assessment of tumor margins during glioma resection by desorption electrospray ionization-mass spectrometry
Gliomas infiltrate into surrounding healthy brain tissue. Microsurgical resection aims for maximal tumor resection while minimizing morbidity. Surgical margins are defined based on the surgeon’s experience, visual observation, and neuronavigation. Surgical margin assessment is rarely undertaken intraoperatively due to time constraints and unreliability of such evaluation. Routine, pathologic intraoperative examination provides no molecular information. Molecular measurements using mass spectrometry can be made rapidly on tissue during surgery to identify tissue types, estimate tumor infiltration, and recognize the presence of prognostic mutations by monitoring oncometabolites and phospholipids. This intraoperative study demonstrates the power of mass spectrometry in assessing diagnostic and prognostic information on discrete surgeon-defined points along the resection margins to improve tumor resection, even in regions without MRI contrast enhancement. Intraoperative desorption electrospray ionization-mass spectrometry (DESI-MS) is used to characterize tissue smears by comparison with a library of DESI mass spectra of pathologically determined tissue types. Measurements are performed in the operating room within 3 min. These mass spectra provide direct information on tumor infiltration into white or gray brain matter based on N-acetylaspartate (NAA) and on membrane-derived complex lipids. The mass spectra also indicate the isocitrate dehydrogenase mutation status of the tumor via detection of 2-hydroxyglutarate, currently assessed postoperatively on biopsied tissue using immunohistochemistry. Intraoperative DESI-MS measurements made at surgeon-defined positions enable assessment of relevant disease state of tissue within the tumor mass and examination of the resection cavity walls for residual tumor.
Results for 73 biopsies from 10 surgical resection cases show that DESI-MS allows detection of glioma and estimation of high tumor cell percentage (TCP) at surgical margins with 93% sensitivity and 83% specificity. TCP measurements from NAA are corroborated by indirect measurements based on lipid profiles. Notably, high percentages (>50%) of unresected tumor were found in one-half of the margin biopsy smears, even in cases where postoperative MRI suggested gross total tumor resection. Unresected tumor causes recurrence and malignant progression, as observed within a year in one case examined in this study. These results corroborate the utility of DESI-MS in assessing surgical margins for maximal safe tumor resection. Intraoperative DESI-MS analysis of tissue smears, ex vivo, can be inserted into the current surgical workflow with no alterations. The data underscore the complexity of glioma infiltration.
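The 93% sensitivity and 83% specificity reported above follow the standard confusion-matrix definitions over the margin biopsies. A minimal sketch of that computation, with binary labels (1 = tumor-positive by pathology, 0 = negative) and arbitrary illustrative data, not the study's:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP),
    where 1 marks tumor-positive samples and 0 marks negatives."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)  # correctly flagged tumor
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)  # missed tumor
    tn = sum(1 for t, p in pairs if t == 0 and p == 0)  # correctly cleared
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)  # false alarm
    return tp / (tp + fn), tn / (tn + fp)
```

For margin assessment, sensitivity (not missing residual tumor) is typically weighted more heavily than specificity, which matches the 93%/83% trade-off reported.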
The preparation and properties of 14C-carboxamido-methylated subunits from A2/1957 influenza neuraminidase
A2/1957 influenza neuraminidase (mucopolysaccharide N-acetylneuraminylhydrolase, EC 3.2.1.18) was purified 15-fold from a recombinant virus, with about 25% overall yield of enzymic activity. Neuraminidase contained glucosamine, and a high proportion of serine and threonine. The partial specific volume was 0.713 cm3/g. Reduced neuraminidase was isotopically labeled in vitro by reaction with iodo[14C]-acetamide. When carboxamidomethylated in the absence of urea, enzymically inactive labeled material was obtained with a maximum size similar to native neuraminidase. When carboxamidomethylated in the presence of 6 M urea, labeled, dissociated subunits were obtained that did not associate or regain enzymic activity on removal of urea. The molecular weight of dissociated subunits was determined by sedimentation-diffusion methods as 50 000-54 000, and by sodium dodecyl sulfate-acrylamide gel electrophoresis as about 50 000. Thus native neuraminidase (mol. wt. about 200 000) is probably a tetramer. Neuraminidase contained about 21 cysteine residues per subunit. These appear to be present as disulfide bonds in the native enzyme.
ConXsense - Automated Context Classification for Context-Aware Access Control
We present ConXsense, the first framework for context-aware access control on mobile devices based on context classification. Previous context-aware access control systems often require users to laboriously specify detailed policies or they rely on pre-defined policies not adequately reflecting the true preferences of users. We present the design and implementation of a context-aware framework that uses a probabilistic approach to overcome these deficiencies. The framework utilizes context sensing and machine learning to automatically classify contexts according to their security and privacy-related properties. We apply the framework to two important smartphone-related use cases: protection against device misuse using a dynamic device lock and protection against sensory malware. We ground our analysis on a sociological survey examining the perceptions and concerns of users related to contextual smartphone security and analyze the effectiveness of our approach with real-world context data. We also demonstrate the integration of our framework with the FlaskDroid architecture for fine-grained access control enforcement on the Android platform.
Comment: Recipient of the Best Paper Award
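As a toy illustration of classifying contexts by their sensed security-related properties, the sketch below assigns a context to the nearest labeled centroid. The feature names and profile values are invented for illustration and are not ConXsense's actual model, which learns probabilistic classifiers from real sensor data:

```python
def classify_context(features, profiles):
    """Assign `features` (dict of sensed values in [0, 1]) to the label
    whose centroid is nearest in Euclidean distance -- a crude stand-in
    for a learned probabilistic context classifier."""
    def dist(a, b):
        return sum((a[k] - b[k]) ** 2 for k in a) ** 0.5
    return min(profiles, key=lambda label: dist(features, profiles[label]))

# Invented features: fraction of familiar Bluetooth devices nearby, and
# confidence that GPS places the device at the user's home.
profiles = {
    "private": {"familiar_devices": 0.9, "at_home": 1.0},
    "public": {"familiar_devices": 0.1, "at_home": 0.0},
}
```

A context classified as "public" could then trigger a stricter policy, such as enabling the dynamic device lock or blocking sensor access for untrusted apps.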
Sensitivity to grid resolution in the ability of a chemical transport model to simulate observed oxidant chemistry under high-isoprene conditions
Formation of ozone and organic aerosol in continental atmospheres depends on whether isoprene emitted by vegetation is oxidized by the high-NOx pathway (where peroxy radicals react with NO) or by low-NOx pathways (where peroxy radicals react by alternate channels, mostly with HO2). We used mixed layer observations from the SEAC4RS aircraft campaign over the Southeast US to test the ability of the GEOS-Chem chemical transport model at different grid resolutions (0.25° × 0.3125°, 2° × 2.5°, 4° × 5°) to simulate this chemistry under high-isoprene, variable-NOx conditions. Observations of isoprene and NOx over the Southeast US show a negative correlation, reflecting the spatial segregation of emissions; this negative correlation is captured in the model at 0.25° × 0.3125° resolution but not at coarser resolutions. As a result, less isoprene oxidation takes place by the high-NOx pathway in the model at 0.25° × 0.3125° resolution (54 %) than at coarser resolution (59 %). The cumulative probability distribution functions (CDFs) of NOx, isoprene, and ozone concentrations show little difference across model resolutions and good agreement with observations, while formaldehyde is overestimated at coarse resolution because excessive isoprene oxidation takes place by the high-NOx pathway with high formaldehyde yield. The good agreement of simulated and observed concentration variances implies that smaller-scale non-linearities (urban and power plant plumes) are not important on the regional scale. Correlations of simulated vs. observed concentrations do not improve with grid resolution because finer modes of variability are intrinsically more difficult to capture. Higher model resolution leads to decreased conversion of NOx to organic nitrates and increased conversion to nitric acid, with total reactive nitrogen oxides (NOy) changing little across model resolutions. Model concentrations in the lower free troposphere are also insensitive to grid resolution. 
The overall low sensitivity of modeled concentrations to grid resolution implies that coarse resolution is adequate when modeling continental boundary layer chemistry for global applications.