
    Space Transportation System Availability Relationships to Life Cycle Cost

    Future space transportation architectures and designs must be affordable. Consequently, their Life Cycle Cost (LCC) must be controlled. For the LCC to be controlled, it is necessary to identify all the requirements and elements of the architecture at the beginning of the concept phase. Controlling LCC requires the establishment of the major operational cost drivers. Two of these major cost drivers are reliability and maintainability; in other words, the system's availability (responsiveness). Potential reasons that may drive the inherent availability requirement are the need to control the number of unique parts and the spare parts required to support the transportation system's operation. For more typical space transportation systems used to place satellites in space, the productivity of the system will drive the launch cost. This system productivity is the resultant output of the system availability. Availability is equal to the mean uptime divided by the sum of the mean uptime and the mean downtime. Since many operational factors cannot be projected early in the definition phase, the focus will be on inherent availability, which is equal to the mean time between failures (MTBF) divided by the sum of the MTBF and the mean time to repair (MTTR). The MTBF is a function of reliability, or the expected frequency of failures. When the system experiences failures, the result is added operational flow time, parts consumption, and increased labor, with an impact on responsiveness resulting in increased LCC. The other component of availability is the MTTR, or maintainability: in other words, how accessible is the failed hardware that requires replacement, and what operational functions are required before and after change-out to make the system operable. This paper will describe how the MTTR can be equated to additional labor, additional operational flow time, and additional structural access capability, all of which drive up the LCC.
A methodology will be presented that provides decision makers with the understanding necessary to place constraints on the design definition. This methodology for the major drivers will determine the inherent availability, safety, reliability, maintainability, and life cycle cost of the fielded system. It focuses on the achievement of an affordable, responsive space transportation system. The intent of this paper is not only to provide visibility of the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability, but also to provide the capability to bound the variables, thus providing the insight required to control the system's engineering solution. An example of this visibility is the need to integrate similar discipline functions to allow control of the total parts count of the space transportation system. Also, selecting a reliability requirement will place a constraint on parts count to achieve a given inherent availability requirement, or require accepting a larger parts count with the resulting higher individual part reliability requirements. This paper will provide an understanding of the relationship of mean repair time (mean downtime) to maintainability (accessibility for repair), to mean time between failures (reliability of hardware), and to the system's inherent availability.
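The availability relationship the abstract states (uptime over uptime plus downtime, and for inherent availability MTBF over MTBF plus MTTR) is simple enough to sketch directly. The MTBF and MTTR figures below are illustrative placeholders, not values from the paper:

```python
def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability A_i = MTBF / (MTBF + MTTR), per the
    abstract's definition (mean uptime over uptime plus downtime)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Illustrative placeholder values only (not from the paper):
mtbf = 500.0  # mean time between failures, hours
mttr = 20.0   # mean time to repair, hours
a_i = inherent_availability(mtbf, mttr)
print(f"inherent availability = {a_i:.4f}")
```

Under these placeholder numbers, halving the MTTR from 20 to 10 hours raises A_i from about 0.962 to about 0.980, which is how the paper's argument links maintainability (accessibility for repair) directly to responsiveness and LCC.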

    Concepts for Life Cycle Cost Control Required to Achieve Space Transportation Affordability and Sustainability

    Cost control must be implemented through the establishment of requirements and controlled continually by managing to these requirements. Cost control of the non-recurring side of life cycle cost has traditionally been implemented in both commercial and government programs. The government uses the budget process to implement this control. The commercial approach is to use a similar process of allocating the non-recurring cost to major elements of the program. This type of control generally manages through a work breakdown structure (WBS) by defining the major elements of the program. If cost control is to be applied across the entire program life cycle cost (LCC), the approach must be addressed very differently. A functional breakdown structure (FBS) is defined and recommended. Use of an FBS provides the visibility to allow the choice of an integrated solution, reducing the cost of providing many different elements of like function. The different functional solutions that drive the hardware logistics, quantity of documentation, operational labor, reliability and maintainability balance, and total integration of the entire system from DDT&E through the life of the program must be fully defined and compared, and final decisions made among these competing solutions. The major drivers of recurring cost have been identified and are presented and discussed. The LCC requirements must be established and flowed down to provide control of LCC. This LCC control will require a structured, rigid process similar to the one traditionally used to control weight/performance for space transportation systems throughout the entire program. It has been demonstrated over the last 30 years that without a firm requirement and methodically structured cost control, it is unlikely that affordable and sustainable space transportation system LCC will be achieved.

    Proposal for a standard problem for micromagnetic simulations including spin-transfer torque

    The spin-transfer torque between itinerant electrons and the magnetization in a ferromagnet is of fundamental interest for the applied physics community. To investigate the spin-transfer torque, powerful simulation tools are mandatory. We propose a micromagnetic standard problem including the spin-transfer torque that can be used for the validation and falsification of micromagnetic simulation tools. The work is based on the micromagnetic model extended by the spin-transfer torque in continuously varying magnetizations, as proposed by Zhang and Li. The standard problem geometry is a permalloy cuboid of 100 nm edge length and 10 nm thickness, which contains a Landau pattern with a vortex in the center of the structure. A spin-polarized dc current density of 10¹² A/m² flows laterally through the cuboid and moves the vortex core to a new steady-state position. We show that the new vortex-core position is a sensitive measure of the correctness of micromagnetic simulators that include the spin-transfer torque. The suitability of the proposed problem as a standard problem is tested by numerical results from four different finite-difference and finite-element-based simulation tools.
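For orientation, the Zhang-Li extension the abstract refers to is commonly written as a Landau-Lifshitz-Gilbert equation with two extra current-driven terms. Sign and prefactor conventions vary between implementations, so the form below is a sketch of the usual notation rather than the paper's exact statement:

```latex
\frac{\partial \mathbf{m}}{\partial t}
  = -\gamma\, \mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
    + \alpha\, \mathbf{m}\times\frac{\partial \mathbf{m}}{\partial t}
    - (\mathbf{u}\cdot\nabla)\mathbf{m}
    + \beta\, \mathbf{m}\times\left[(\mathbf{u}\cdot\nabla)\mathbf{m}\right],
\qquad
\mathbf{u} = \frac{P \mu_B}{e M_s\,(1+\beta^2)}\, \mathbf{j}_e
```

Here m is the unit magnetization, H_eff the effective field, α the Gilbert damping, β the degree of non-adiabaticity, P the spin polarization, and j_e the current density (10¹² A/m² in the proposed problem).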

    Alcohol consumption and the risk of morbidity and mortality for different stroke types - a systematic review and meta-analysis

    Background: Observational studies have suggested a complex relationship between alcohol consumption and stroke, dependent on sex, type of stroke and outcome (morbidity vs. mortality). We undertook a systematic review and a meta-analysis of studies assessing the association between levels of average alcohol consumption and relative risks of ischemic and hemorrhagic strokes separately by sex and outcome. This meta-analysis is the first to explicitly separate morbidity and mortality of alcohol-attributable stroke and thus has implications for public health and prevention. Methods: Using Medical Subject Headings (alcohol drinking, ethanol, cerebrovascular accident, cerebrovascular disorders, and intracranial embolism and thrombosis, and the key word stroke), a literature search of the MEDLINE, EMBASE, CINAHL, CABS, WHOlist, SIGLE, ETOH, and Web of Science databases from 1980 to June 2009 was performed, followed by manual searches of the bibliographies of key retrieved articles. Twenty-six observational studies (cohort or case-control) with ischemic or hemorrhagic strokes were included in which the relative risk, odds ratio, or hazard ratio of stroke associated with alcohol consumption was reported; alcohol consumption was quantified; and lifetime abstention (manually estimated where data for current abstainers were given) was used as the reference group. Two reviewers independently extracted the information on study design, participant characteristics, level of alcohol consumption, stroke outcome, control for potential confounding factors, risk estimates, and key criteria of study quality using a standardized protocol. Results: The dose-response relationship for hemorrhagic stroke showed monotonically increasing risk with increasing consumption, whereas ischemic stroke showed a curvilinear relationship, with a protective effect of alcohol at low to moderate consumption and increased risk at higher exposure. For more than 3 drinks on average per day, women in general had higher risks than men, and the risks for mortality were higher than the risks for morbidity. Conclusions: These results indicate that heavy alcohol consumption increases the relative risk of any stroke, while light or moderate alcohol consumption may be protective against ischemic stroke. Preventive measures that should be initiated are discussed.
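As a generic illustration of the pooling step such a meta-analysis performs (not code from the paper), inverse-variance weighting combines per-study relative risks on the log scale; the study numbers below are made up:

```python
import math

def pool_fixed_effect(rrs, cis, z=1.96):
    """Fixed-effect inverse-variance pooling of relative risks.

    rrs: per-study point estimates; cis: matching (lower, upper) 95% CIs.
    Standard errors are recovered from the CI width on the log scale.
    Returns (pooled RR, lower 95% bound, upper 95% bound).
    """
    log_rr = [math.log(r) for r in rrs]
    se = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    weights = [1.0 / s ** 2 for s in se]
    pooled = sum(w * l for w, l in zip(weights, log_rr)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled),
            math.exp(pooled - z * se_pooled),
            math.exp(pooled + z * se_pooled))

# Made-up example studies (ischemic stroke, moderate consumption):
rr, lo, hi = pool_fixed_effect([0.82, 0.90, 0.75],
                               [(0.65, 1.03), (0.70, 1.16), (0.60, 0.94)])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A real meta-analysis of this kind would typically also test heterogeneity and fall back to a random-effects model; the fixed-effect version above only shows the weighting mechanics.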

    Identification of unique neoantigen qualities in long-term survivors of pancreatic cancer

    Pancreatic ductal adenocarcinoma is a lethal cancer with fewer than 7% of patients surviving past 5 years. T-cell immunity has been linked to the exceptional outcome of the few long-term survivors [1, 2], yet the relevant antigens remain unknown. Here we use genetic, immunohistochemical and transcriptional immunoprofiling, computational biophysics, and functional assays to identify T-cell antigens in long-term survivors of pancreatic cancer. Using whole-exome sequencing and in silico neoantigen prediction, we found that tumours with both the highest neoantigen number and the most abundant CD8+ T-cell infiltrates, but neither alone, stratified patients with the longest survival. Investigating the specific neoantigen qualities promoting T-cell activation in long-term survivors, we discovered that these individuals were enriched in neoantigen qualities defined by a fitness model, and in neoantigens in the tumour antigen MUC16 (also known as CA125). A neoantigen quality fitness model conferring greater immunogenicity to neoantigens with differential presentation and homology to infectious disease-derived peptides identified long-term survivors in two independent datasets, whereas a neoantigen quantity model ascribing greater immunogenicity to increasing neoantigen number alone did not. We detected intratumoural and lasting circulating T-cell reactivity to both high-quality and MUC16 neoantigens in long-term survivors of pancreatic cancer, including clones with specificity to both high-quality neoantigens and predicted cross-reactive microbial epitopes, consistent with neoantigen molecular mimicry. Notably, we observed selective loss of high-quality and MUC16 neoantigenic clones on metastatic progression, suggesting neoantigen immunoediting. Our results identify neoantigens with unique qualities as T-cell targets in pancreatic ductal adenocarcinoma. More broadly, we identify neoantigen quality as a biomarker for immunogenic tumours that may guide the application of immunotherapies.
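The quality model's two ingredients as described above ("differential presentation" and homology to infectious-disease-derived peptides) can be caricatured in a few lines. The function below is a hypothetical sketch of that kind of two-factor score, with placeholder logistic parameters; it is not the paper's fitted model:

```python
import math

def neoantigen_quality(kd_wildtype_nm: float, kd_mutant_nm: float,
                       homology_score: float,
                       midpoint: float = 26.0, steepness: float = 1.0) -> float:
    """Hypothetical sketch of a two-factor neoantigen quality score.

    amplitude  : 'differential presentation' -- how much better the mutant
                 peptide binds MHC than its wild-type counterpart (ratio of
                 dissociation constants; higher = more mutant-specific).
    recognition: probability-like term from sequence homology to known
                 infectious-disease epitopes (logistic in an alignment
                 score; midpoint/steepness are placeholders, not fitted).
    """
    amplitude = kd_wildtype_nm / kd_mutant_nm
    recognition = 1.0 / (1.0 + math.exp(-steepness * (homology_score - midpoint)))
    return amplitude * recognition
```

Under this sketch a neoantigen scores highly only when both factors are large, mirroring the abstract's finding that a quantity-only model (neoantigen number alone) did not stratify survival.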

    Antiinflammatory Therapy with Canakinumab for Atherosclerotic Disease

    Background: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet, the inflammatory hypothesis of atherothrombosis has remained unproved. Methods: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. Results: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P = 0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P = 0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P = 0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and for the secondary end point that additionally included hospitalization for unstable angina that led to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P = 0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P = 0.31). Conclusions: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering. (Funded by Novartis; CANTOS ClinicalTrials.gov number, NCT01327846.)
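A quick sanity check one can run on the abstract's own numbers: the crude incidence-rate ratios versus placebo land close to the adjusted hazard ratios reported (0.93, 0.85, 0.86), as expected when event rates are low and follow-up is similar across arms:

```python
# Incidence rates (events per 100 person-years) as reported in the abstract.
placebo_rate = 4.50
canakinumab_rates = {"50 mg": 4.11, "150 mg": 3.86, "300 mg": 3.90}

crude_ratios = {dose: rate / placebo_rate
                for dose, rate in canakinumab_rates.items()}
for dose, ratio in crude_ratios.items():
    print(f"{dose}: crude rate ratio {ratio:.2f} vs. placebo")
```

These unadjusted ratios (about 0.91, 0.86, and 0.87) differ slightly from the published Cox hazard ratios, which additionally account for censoring and covariates.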