151 research outputs found

    Managing Fault Management Development

    As the complexity of space missions grows, development of Fault Management (FM) capabilities is an increasingly common driver for significant cost overruns late in the development cycle. FM issues and the resulting cost overruns are rarely caused by a lack of technology, but rather by a lack of planning and emphasis by project management. A recent NASA FM Workshop brought together FM practitioners from a broad spectrum of institutions, mission types, and functional roles to identify the drivers underlying FM overruns and recommend solutions. They identified a number of areas in which increased program and project management focus can be used to control FM development cost growth. These include up-front planning for FM as a distinct engineering discipline; managing different, conflicting, and changing institutional goals and risk postures; ensuring the necessary resources for a disciplined, coordinated approach to end-to-end fault management engineering; and monitoring FM coordination across all mission systems

    Infection and Autoimmunity

    Forensic economics suggests over 2 percent of US-sold firearms are intended for trafficking south of the border

    In recent years, Mexico has been wracked by drug-related violence, which has often been carried out with US-manufactured guns. But how many of the weapons used in Mexico originate in the US? In new research, John H. Patterson and Topher L. McDougal estimate that over 200,000 guns from the US found their way to Mexico each year between 2010 and 2012, around 2 percent of US-sold firearms. They write that the flow of American guns into Mexico and Central America is fuelling violence in those countries, which in turn has caused an upsurge in asylum seekers arriving at the US border

    LAND TITLE TRANSFER: A REGRESSION

    Land is the basic asset of society. Its ownership affords the security upon which our complex credit structure rests. Certainty as to ownership is essential to the continued peace of each landowner or farm owner. So Professor Powell grounds his study of land title registration— the Torrens system —deep in concern for the public welfare. He could have grounded it deeper. Today our accepted social goals include something more than peace. Public opinion is mobilizing behind maximum utilization for the benefit of all classes. Our governments—federal, state, and municipal—are committed to a program of reconstructing our cities and rehousing at least a third of the nation. Humanitarian sentiment, in the guise inter alia of land-purchase programs, has even begun to extend to the pitifully insecure one-half of our farm population. City planning, rural rehabilitation, metropolitan communities, and garden cities; public subsidies, government financing, graded-tax plans, zoning, and eminent domain—all these are in the headlines and in the air. It takes no prophet to foresee that fundamental reforms in land utilization are hot upon us. Yet for the achievement of such reforms without payment of undue and continued tribute to private monopolies and without fruitless bother and delay—perhaps even if they are to be achieved at all—major changes must be effected in our antiquated, pre-commerce system of land transfer. Cheap, expeditious, and secure methods must be designed, if they are not already available, to replace the present complicated and dilatory methods which, while costly to the individual and burdensome to the public, afford no adequate security of title. Streamlined need cannot long endure horse-and-buggy obstacles to the liquidity of land. It is an ancient query, but its relevance grows: why should not a lot or a farm be as easily acquired and as securely held as a ship or a share of stock or an automobile?

    α-Conotoxin Decontamination Protocol Evaluation: What Works and What Doesn’t

    Nine publicly available biosafety protocols for safely handling conotoxin peptides were tested to evaluate their decontamination efficacy. Circular dichroism (CD) spectroscopy and mass spectrometry (MS) were used to assess the effect of each chemical treatment on the secondary and primary structure of α-CTx MII (L10V, E11A). Of the nine decontamination methods tested, treatment with a 1% (m/v) solution of the enzymatic detergent Contrex™ EZ resulted in a 76.8% decrease in α-helical content as assessed by the mean residue ellipticity at 222 nm, and partial peptide digestion was demonstrated using high performance liquid chromatography mass spectrometry (HPLC-MS). Additionally, treatment with 6% (m/v) sodium hypochlorite resulted in an 80.5% decrease in α-helical content and complete digestion of the peptide. The Contrex™ EZ treatment was repeated with three additional α-conotoxins (α-CTxs), α-CTxs LvIA, ImI and PeIA, which verified that the decontamination method was reasonably robust. These results support the use of either 1% Contrex™ EZ solution or 6% sodium hypochlorite in biosafety protocols for the decontamination of α-CTxs in research laboratories.

    Results from the NASA Spacecraft Fault Management Workshop: Cost Drivers for Deep Space Missions

    Fault Management, the detection of and response to in-flight anomalies, is a critical aspect of deep-space missions. Fault management capabilities are commonly distributed across flight and ground subsystems, impacting hardware, software, and mission operations designs. The National Aeronautics and Space Administration (NASA) Discovery & New Frontiers (D&NF) Program Office at Marshall Space Flight Center (MSFC) recently studied cost overruns and schedule delays for five missions. The goal was to identify the underlying causes for the overruns and delays, and to develop practical mitigations to assist the D&NF projects in identifying potential risks and controlling the associated impacts to proposed mission costs and schedules. The study found that four out of the five missions studied had significant overruns due to underestimating the complexity and support requirements for fault management. As a result of this and other recent experiences, the NASA Science Mission Directorate (SMD) Planetary Science Division (PSD) commissioned a workshop to bring together invited participants across government, industry, and academia to assess the state of the art in fault management practice and research, identify current and potential issues, and make recommendations for addressing these issues. The workshop was held in New Orleans in April of 2008. The workshop concluded that fault management is not being limited by technology, but rather by a lack of emphasis and discipline in both the engineering and programmatic dimensions. Some of the areas cited in the findings include different, conflicting, and changing institutional goals and risk postures; unclear ownership of end-to-end fault management engineering; inadequate understanding of the impact of mission-level requirements on fault management complexity; and practices, processes, and tools that have not kept pace with the increasing complexity of mission requirements and spacecraft systems. 
This paper summarizes the findings and recommendations from that workshop, particularly as fault management development issues affect operations and the development of operations capabilities.

    Overcoming Molehills and Mountains Implementing a New Program

    This slide presentation reviews some of the challenges and accomplishments of implementing a new program. The purpose of the presentation is to: (1) share the challenges that were encountered formulating a new program concurrently with formulating and implementing new spacecraft development projects: (a) immature mission concepts put on the fast track; (b) the need to reconcile ambitious objectives with cost and budget reality; (c) changes of major stakeholders; (d) timing, timing, timing; (e) changing ground rules, assumptions, and risk tolerance; and (f) the role of centers; (2) share the successes to date despite the challenges; and (3) demonstrate how interdependencies between the program, projects, the NASA HQ environment, and external political forces affect the process, and how expectations must be managed while dealing with external factors and great change.

    Native V. californicum Alkaloid Combinations Induce Differential Inhibition of Sonic Hedgehog Signaling

    Veratrum californicum is a rich source of steroidal alkaloids such as cyclopamine, a known inhibitor of the Hedgehog (Hh) signaling pathway. Here we provide a detailed analysis of the alkaloid composition of V. californicum by plant part through quantitative analysis of cyclopamine, veratramine, muldamine and isorubijervine in the leaf, stem and root/rhizome of the plant. To determine whether additional alkaloids in the extracts contribute to Hh signaling inhibition, the concentrations of these four alkaloids present in extracts were replicated using commercially available standards, followed by comparison of extracts to alkaloid standard mixtures for inhibition of Hh signaling using Shh-Light II cells. Alkaloid combinations enhanced Hh signaling pathway antagonism compared to cyclopamine alone, and significant differences were observed in the Hh pathway inhibition between the stem and root/rhizome extracts and their corresponding alkaloid standard mixtures, indicating that additional alkaloids present in these extracts are capable of inhibiting Hh signaling

    BED Estimates of HIV Incidence: Resolving the Differences, Making Things Simpler

    Objective: Develop a simple method for optimal estimation of HIV incidence using the BED capture enzyme immunoassay. Design: Use existing BED data to estimate mean recency duration, false recency rates and HIV incidence with reference to a fixed time period, T. Methods: Compare BED and cohort estimates of incidence referring to identical time frames. Generalize this approach to suggest a method for estimating HIV incidence from any cross-sectional survey. Results: Follow-up and BED analyses of the same, initially HIV negative, cases followed over the same set time period T, produce estimates of the same HIV incidence, permitting the estimation of the BED mean recency period for cases who have been HIV positive for less than T. Follow-up of HIV positive cases over T, similarly, provides estimates of the false-recent rate appropriate for T. Knowledge of these two parameters for a given population allows the estimation of HIV incidence during T by applying the BED method to samples from cross-sectional surveys. An algorithm is derived for providing these estimates, adjusted for the false-recent rate. The resulting estimator is identical to one derived independently using a more formal mathematical analysis. Adjustments improve the accuracy of HIV incidence estimates. Negative incidence estimates result from the use of inappropriate estimates of the false-recent rate and/or from sampling error, not from any error in the adjustment procedure
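    The adjustment described in this abstract can be sketched numerically. The abstract does not give its formula, so the sketch below assumes the general form of a false-recent-rate-adjusted cross-sectional recency estimator common in this literature; the function name, parameter names, and all numbers are hypothetical illustrations, not the authors' exact estimator:

    ```python
    def bed_incidence(n_neg, n_pos, n_recent, mdri_years, frr, big_t_years=2.0):
        """Illustrative FRR-adjusted cross-sectional incidence estimate.

        n_neg       -- HIV-negative respondents in the survey
        n_pos       -- HIV-positive respondents
        n_recent    -- positives classified 'recent' by the BED assay
        mdri_years  -- mean duration of recent infection (recency period), years
        frr         -- false-recent rate among long-term infections
        big_t_years -- cut-off T defining 'recent', years
        """
        # Subtract the expected number of false-recent classifications
        # from the observed recent count.
        adjusted_recent = n_recent - frr * n_pos
        # Person-time at risk is approximated by the negatives times the
        # FRR-corrected mean recency duration.
        return adjusted_recent / (n_neg * (mdri_years - frr * big_t_years))
    ```

    Note that if the assumed false-recent rate overstates the true one, the adjusted numerator can go negative, which matches the abstract's point that negative incidence estimates arise from an inappropriate false-recent rate or sampling error rather than from the adjustment itself.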