
    Project Portfolio Management: An Investigation of One Air Force Product Center

    Over the last two decades, reducing product development times in the DoD has been the focus of many committees, commissions, and research efforts. Despite the implementation of numerous recommendations, the DoD still struggles with long acquisition cycle times. This research is part of the Air Force Cycle Time Reduction Research Program (CTRRP), which grew out of the Cycle Time Reduction Action Plan, developed in 1998. This research focuses on the portfolio management (project selection and resource allocation) part of the CTRRP. The purpose of this research effort was to investigate the use of portfolio management within the Air Force. Specifically, this thesis sought to assess how portfolio management is used in Air Force acquisition and to compare the Air Force's practices to commercial best practices. A comprehensive review of commercial portfolio management literature was conducted. To identify Air Force practices, semi-structured interviews were conducted at one Air Force product center. Personnel in positions most likely to use portfolio management, or to have knowledge of its use, were interviewed at the center, wing, and direct reporting group levels. The research found that top-performing commercial firms with an effective portfolio management process focus primarily on project selection activities at the front end of the development process, while the Air Force focuses primarily on program execution activities at the back end of the process. Recommendations to make portfolio management more effective in the Air Force are discussed.

    Chemical Abundances of Seven Irregular and Three Tidal Dwarf Galaxies in the M81 Group

    We have derived nebular abundances for 10 dwarf galaxies belonging to the M81 Group, including several galaxies which do not have abundances previously reported in the literature. For each galaxy, multiple H II regions were observed with GMOS-N at the Gemini Observatory in order to determine abundances of several elements (oxygen, nitrogen, sulfur, neon, and argon). For seven galaxies, at least one H II region had a detection of the temperature-sensitive [O III] λ4363 line, allowing a "direct" determination of the oxygen abundance. No abundance gradients were detected in the targeted galaxies, and the observed oxygen abundances are typically in agreement with the well-known metallicity-luminosity relation. However, three candidate "tidal dwarf" galaxies lie well off this relation: UGC 5336, Garland, and KDG 61. The nature of these systems suggests that UGC 5336 and Garland are indeed recently formed systems, whereas KDG 61 is most likely a dwarf spheroidal galaxy which lies along the same line of sight as the M81 tidal debris field. We propose that these H II regions formed from previously enriched gas which was stripped from nearby massive galaxies (e.g., NGC 3077 and M81) during a recent tidal interaction. Comment: 37 pages, 10 figures, accepted for publication in ApJ. Slit positions in Table 2 have been updated.
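
    The "direct" abundance determination mentioned above exploits the temperature sensitivity of the [O III] λ4363 auroral line relative to the strong λ4959 and λ5007 nebular lines. As a rough illustration only (not the paper's actual pipeline), the sketch below applies the low-density textbook relation between the [O III] line ratio and the electron temperature to hypothetical, made-up line intensities.

    ```python
    import math

    def electron_temperature(i4959, i5007, i4363):
        """Estimate T_e (K) in the O++ zone from the [O III] line ratio.

        Uses the low-density textbook approximation
            (I(4959) + I(5007)) / I(4363) ~ 7.90 * exp(3.29e4 / T_e),
        i.e. T_e ~ 3.29e4 / ln(R / 7.90). Real analyses solve the full
        collisional-equilibrium problem and propagate uncertainties.
        """
        ratio = (i4959 + i5007) / i4363
        return 3.29e4 / math.log(ratio / 7.90)

    # Hypothetical dereddened intensities on a scale of H-beta = 100
    t_e = electron_temperature(i4959=150.0, i5007=450.0, i4363=2.5)
    print(f"T_e ~ {t_e:.0f} K")
    ```

    The estimated electron temperature then sets the emissivities used to convert the strong-line fluxes into ionic and total oxygen abundances; without a λ4363 detection, only indirect "strong-line" calibrations are available, which is why the seven galaxies with detections are singled out.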

    Primary and secondary defences of squid to cruising and ambush fish predators: variable tactics and their survival value

    Author Posting. © The Author(s), 2010. This is the author's version of the work. It is posted here by permission of Elsevier B.V. for personal use, not for redistribution. The definitive version was published in Animal Behaviour 81 (2011): 585-594, doi:10.1016/j.anbehav.2010.12.002. Longfin squid (Loligo pealeii) were exposed to two predators, bluefish (Pomatomus saltatrix) and summer flounder (Paralichthys dentatus), representing cruising and ambush foraging tactics, respectively. During 35 trials, 86 predator–prey interactions were evaluated between bluefish and squid, and in 29 trials, 92 interactions were assessed between flounder and squid. With bluefish, squid predominantly used stay tactics (68.6%, 59/86) as initial responses. The most common stay response was to drop to the bottom, show a disruptive body pattern, and remain motionless. In 37.0% (34/92) of interactions with flounder, squid did not detect predators camouflaging on the bottom and showed no reaction prior to being attacked. Squid that did react used flee tactics more often as initial responses (43.5%, 40/92), including flight with or without inking. When all defence behaviours were considered concurrently, flight was identified as the strongest predictor of squid survival during interactions with each predator. Squid that used flight at any time during an attack sequence had high probabilities of survival with bluefish (65%, 20/31) and flounder (51%, 18/35). The most important deimatic/protean behaviour used by squid was inking. Inking caused bluefish to startle (deimatic) and abandon attacks (probability of survival = 61%, 11/18) and caused flounder to misdirect (protean) attacks towards ink plumes rather than towards squid (probability of survival = 56%, 14/25). These are the first published laboratory experiments to evaluate the survival value of antipredator behaviours in a cephalopod. Results demonstrate that squid vary their defence tactics in response to different predators and that the effectiveness of antipredator behaviours is contingent upon the behavioural characteristics of the predator encountered. This study was funded by the Woods Hole Oceanographic Institution Sea Grant Program, the Massachusetts Marine Fisheries Institute, the University of Massachusetts and the Five College Coastal and Marine Sciences Program. R. T. Hanlon acknowledges partial support from ONR grant N000140610202 and the Sholley Foundation.
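
    The finding that flight was "the strongest predictor of squid survival" when all behaviours were considered concurrently implies a multivariable analysis of per-interaction outcomes. A hedged sketch of that style of analysis (not the authors' code or data) is a logistic regression of survival on binary behaviour indicators, shown below with simulated, purely illustrative records.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical per-interaction records: binary indicators for the
    # behaviours a squid showed and whether it survived the attack.
    rng = np.random.default_rng(0)
    n = 90
    flight = rng.integers(0, 2, n)
    inking = rng.integers(0, 2, n)
    stay = rng.integers(0, 2, n)
    # Simulated survival, made more likely when flight is used (assumed effect)
    p = 1 / (1 + np.exp(-(-0.8 + 1.5 * flight + 0.6 * inking + 0.2 * stay)))
    survived = rng.binomial(1, p)

    # Fit survival ~ flight + inking + stay; the largest coefficient plays
    # the role of the "strongest predictor" comparison in the abstract.
    X = sm.add_constant(np.column_stack([flight, inking, stay]))
    model = sm.Logit(survived, X).fit(disp=0)
    print(model.params)
    ```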

    Modeling Routes of Chronic Wasting Disease Transmission: Environmental Prion Persistence Promotes Deer Population Decline and Extinction

    Chronic wasting disease (CWD) is a fatal disease of deer, elk, and moose transmitted through direct, animal-to-animal contact and indirectly, via environmental contamination. Considerable attention has been paid to modeling direct transmission, but despite the fact that CWD prions can remain infectious in the environment for years, relatively little information exists about the potential effects of indirect transmission on CWD dynamics. In the present study, we use simulation models to demonstrate how indirect transmission and the duration of environmental prion persistence may affect epidemics of CWD and populations of North American deer. Existing data from Colorado, Wyoming, and Wisconsin's CWD epidemics were used to define plausible short-term outcomes and associated parameter spaces. Resulting long-term outcomes range from relatively low disease prevalence and limited host-population decline to host-population collapse and extinction. Our models suggest that disease prevalence and the severity of population decline are driven by the duration that prions remain infectious in the environment. Despite relatively low epidemic growth rates, the basic reproductive number, R0, may be much larger than expected under the direct-transmission paradigm because the infectious period can vastly exceed the host's life span. High prion persistence is expected to lead to an increasing environmental pool of prions during the early phases (i.e., approximately the first 50 years) of the epidemic. As a consequence, over this period of time, disease dynamics will become more heavily influenced by indirect transmission, which may explain some of the observed regional differences in age- and sex-specific disease patterns. This suggests that management interventions, such as culling or vaccination, will become increasingly less effective as CWD epidemics progress.
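
    As a rough illustration of the modeling approach described above (direct transmission plus an environmental prion reservoir), the sketch below integrates a minimal susceptible-infected-environment model. All parameter names and values are assumptions chosen for illustration, not the study's fitted values.

    ```python
    def simulate_cwd(years=100, dt=0.01,
                     birth=0.1, death=0.1, disease_death=0.5,
                     beta_direct=0.5, beta_env=0.3,
                     shedding=1.0, prion_decay=0.1):
        """Toy deer CWD model with an environmental prion pool E.

        S and I are susceptible and infected deer densities; E is the
        environmental infectivity. Smaller prion_decay means prions
        persist longer in the environment. Illustrative only.
        """
        S, I, E = 1.0, 0.01, 0.0
        for _ in range(int(years / dt)):
            N = S + I
            infection = (beta_direct * I / N + beta_env * E) * S
            dS = birth * N - death * S - infection
            dI = infection - (death + disease_death) * I
            dE = shedding * I - prion_decay * E
            S, I, E = S + dS * dt, I + dI * dt, E + dE * dt
        return S, I, E

    # Longer environmental persistence (smaller decay rate) pushes this toy
    # population toward deeper decline, mirroring the qualitative result above.
    for decay in (1.0, 0.1, 0.02):
        S, I, E = simulate_cwd(prion_decay=decay)
        print(f"prion_decay={decay}: S={S:.3f}, I={I:.3f}, E={E:.2f}")
    ```

    Because the environmental term does not scale with deer density, transmission in this sketch can remain high even as the population shrinks, which is the intuition behind continued decline and possible extinction under long prion persistence.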

    Physiological roles for ecto-5’-nucleotidase (CD73)

    Nucleotides and nucleosides influence nearly every aspect of physiology and pathophysiology. Extracellular nucleotides are metabolized through regulated phosphohydrolysis by a series of ecto-nucleotidases. The formation of extracellular adenosine from adenosine 5’-monophosphate is accomplished primarily through ecto-5’-nucleotidase (CD73), a glycosyl phosphatidylinositol-linked membrane protein found on the surface of a variety of cell types. Recent in vivo studies implicating CD73 in a number of tissue-protective mechanisms have provided new insight into its regulation and function and have generated considerable interest. Here, we review the contributions of CD73 to cell and tissue stress responses, with a particular emphasis on physiologic responses to regulated CD73 expression and function, as well as new findings utilizing Cd73-deficient animals.

    Dynamic purine signaling and metabolism during neutrophil–endothelial interactions

    During episodes of hypoxia and inflammation, polymorphonuclear leukocytes (PMN) move into underlying tissues by initially passing between the endothelial cells that line the inner surface of blood vessels (transendothelial migration, TEM). TEM creates the potential for disturbances in the vascular barrier, with concomitant loss of fluid into the extravascular space and resultant edema. Recent studies have demonstrated a crucial role for nucleotide metabolism and nucleoside signaling during inflammation. These studies have implicated multiple adenine nucleotides as endogenous tissue-protective mechanisms in vivo. Here, we review the functional components of the vascular barrier, identify strategies for increasing nucleotide generation and nucleoside signaling, and discuss potential therapeutic targets to regulate the vascular barrier during inflammation.

    Proceedings of the 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC) 2015: advancing efficient methodologies through community partnerships and team science

    It is well documented that the majority of adults, children and families in need of evidence-based behavioral health interventions do not receive them [1, 2] and that few robust empirically supported methods for implementing evidence-based practices (EBPs) exist. The Society for Implementation Research Collaboration (SIRC) represents a burgeoning effort to advance the innovation and rigor of implementation research and is uniquely focused on bringing together researchers and stakeholders committed to evaluating the implementation of complex evidence-based behavioral health interventions. Through its diverse activities and membership, SIRC aims to foster the promise of implementation research to better serve the behavioral health needs of the population by identifying rigorous, relevant, and efficient strategies that successfully transfer scientific evidence to clinical knowledge for use in real-world settings [3]. SIRC began as a National Institute of Mental Health (NIMH)-funded conference series in 2010 (previously titled the “Seattle Implementation Research Conference”; $150,000 USD for 3 conferences in 2011, 2013, and 2015) with the recognition that there were multiple researchers and stakeholders working in parallel on innovative implementation science projects in behavioral health, but that formal channels for communicating and collaborating with one another were relatively unavailable. There was a significant need for a forum within which implementation researchers and stakeholders could learn from one another, refine approaches to science and practice, and develop an implementation research agenda using common measures, methods, and research principles to improve both the frequency and quality with which behavioral health treatment implementation is evaluated. SIRC’s membership growth is a testament to this identified need, with more than 1000 members from 2011 to the present. SIRC’s primary objectives are to: (1) foster communication and collaboration across diverse groups, including implementation researchers, intermediaries, as well as community stakeholders (SIRC uses the term “EBP champions” for these groups) – and to do so across multiple career levels (e.g., students, early career faculty, established investigators); and (2) enhance and disseminate rigorous measures and methodologies for implementing EBPs and evaluating EBP implementation efforts. These objectives are well aligned with Glasgow and colleagues’ [4] five core tenets deemed critical for advancing implementation science: collaboration, efficiency and speed, rigor and relevance, improved capacity, and cumulative knowledge. SIRC advances these objectives and tenets through in-person conferences, which bring together multidisciplinary implementation researchers and those implementing evidence-based behavioral health interventions in the community to share their work and create professional connections and collaborations.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients were 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for the ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for the ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that the ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
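
    The primary analysis was a bayesian cumulative logistic (proportional-odds) model, with odds ratios above 1 indicating benefit and posterior probabilities of harm reported for each arm. As a rough back-of-the-envelope check (not the trial's actual model), the snippet below approximates the posterior of the log odds ratio as normal, using only the published point estimates and 95% credible intervals, and recovers probabilities of harm close to those reported.

    ```python
    import math
    from statistics import NormalDist

    def prob_harm(or_point, ci_low, ci_high):
        """Approximate P(OR < 1) assuming a normal posterior on log(OR).

        The standard deviation is recovered from the width of the 95%
        credible interval on the log scale; this ignores any skew in the
        real posterior distribution.
        """
        mu = math.log(or_point)
        sd = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
        return NormalDist(mu, sd).cdf(math.log(1.0))

    # Published estimates: ACE inhibitor OR 0.77 (0.58-1.06), ARB OR 0.76 (0.56-1.05)
    print(f"ACE inhibitor P(harm) ~ {prob_harm(0.77, 0.58, 1.06):.2f}")
    print(f"ARB           P(harm) ~ {prob_harm(0.76, 0.56, 1.05):.2f}")
    ```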

    Understanding reduced rotavirus vaccine efficacy in low socio-economic settings.

    INTRODUCTION: Rotavirus vaccine efficacy ranges from >90% in high socio-economic settings (SES) to 50% in low SES. With the imminent introduction of rotavirus vaccine in low-SES countries, understanding the reasons for reduced efficacy in these settings could identify strategies to improve vaccine performance. METHODS: We developed a mathematical model to predict rotavirus vaccine efficacy in high, middle and low SES based on setting-specific data on incidence, the protection conferred by natural infection and the immune response to vaccination. We then examined the factors affecting efficacy. RESULTS: Vaccination was predicted to prevent 93%, 86% and 51% of severe rotavirus gastroenteritis in high, middle and low SES, respectively. The model also predicted that vaccines are most effective against severe disease and that efficacy declines with age in low but not high SES. Reduced immunogenicity of vaccination and reduced protection conferred by natural infection are the main factors that compromise efficacy in low SES. DISCUSSION: The continued risk of severe disease in non-primary natural infections in low SES is a key factor underpinning the reduced efficacy of rotavirus vaccines. Predicted efficacy was remarkably consistent with observed clinical trial results from different SES, validating the model. The phenomenon of reduced vaccine efficacy can be predicted from intrinsic immunological and epidemiological factors of low-SES populations. Modifying aspects of the vaccine (e.g. improving immunogenicity in low SES) and of the vaccination program (e.g. additional doses) may bring improvements.
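
    As a rough, hedged illustration of how a model of this kind can turn setting-specific inputs into a predicted efficacy against severe disease, the toy cohort simulation below compares severe cases between vaccinated and unvaccinated children; every parameter value is an illustrative assumption, not one of the study's fitted inputs.

    ```python
    import numpy as np

    def predicted_efficacy(force_of_infection, vaccine_take, n=100_000,
                           years=5, severe_risk=(0.15, 0.05, 0.01, 0.0),
                           seed=1):
        """Toy cohort model of rotavirus vaccine efficacy against severe disease.

        Each prior exposure (natural infection or a vaccine dose that
        immunologically 'takes') moves a child down the severe_risk schedule.
        A lower vaccine_take stands in for the reduced immunogenicity seen in
        low-SES settings. All numbers are illustrative assumptions.
        """
        rng = np.random.default_rng(seed)
        severe = []
        for vaccinated in (False, True):
            exposures = np.zeros(n, dtype=int)
            if vaccinated:
                exposures += rng.binomial(2, vaccine_take, n)  # two-dose course
            cases = 0
            for _ in range(years):
                infected = rng.random(n) < force_of_infection
                risk = np.array(severe_risk)[np.minimum(exposures, 3)]
                cases += np.sum(infected & (rng.random(n) < risk))
                exposures += infected
            severe.append(cases)
        return 1 - severe[1] / severe[0]

    # Hypothetical settings: higher exposure and lower vaccine take in low SES
    print("high SES:", round(predicted_efficacy(0.2, 0.95), 2))
    print("low SES: ", round(predicted_efficacy(0.5, 0.60), 2))
    ```

    In this toy setup, efficacy comes out lower in the low-SES scenario both because fewer vaccine doses "take" and because unvaccinated children acquire natural immunity quickly under the higher force of infection, narrowing the gap between arms; the actual model additionally accounts for the reduced protection conferred by natural infection in low SES.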