
    Flexible Session Management in a Distributed Environment

    Many secure communication libraries used by distributed systems, such as SSL, TLS, and Kerberos, fail to make a clear distinction between the authentication, session, and communication layers. In this paper we introduce CEDAR, the secure communication library used by the Condor High Throughput Computing software, and present the advantages to a distributed computing system resulting from CEDAR's separation of these layers. Regardless of the authentication method used, CEDAR establishes a secure session key, which has the flexibility to be used for multiple capabilities. We demonstrate how a layered approach to security sessions can avoid the round-trips and latency inherent in network authentication. The creation of a distinct session management layer allows for optimizations to improve scalability by way of delegating sessions to other components in the system. This session delegation creates a chain of trust that reduces the overhead of establishing secure connections and enables centralized enforcement of system-wide security policies. Additionally, secure channels based upon UDP datagrams are often overlooked by existing libraries; we show how CEDAR's structure accommodates this as well. As an example of the utility of this work, we show how the use of delegated security sessions and other techniques inherent in CEDAR's architecture enables US CMS to meet its scalability requirements in deploying Condor over large-scale, wide-area grid systems.
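
    The session-reuse and delegation ideas described above can be made concrete with a short sketch. The names (`SessionCache`, `get_or_establish`, `delegate`) are hypothetical and this is not CEDAR's actual API; it only illustrates how caching a negotiated session key avoids repeated authentication round-trips, and how handing an established session to another component forms a chain of trust rooted in the original handshake.

```python
import os
import time

# Illustrative sketch of a session layer decoupled from authentication
# (hypothetical names; not CEDAR's actual API). After one authenticated
# handshake, the negotiated session key is cached and reused, so later
# exchanges skip the authentication round-trips.

class Session:
    def __init__(self, peer, key, lifetime=3600):
        self.peer = peer
        self.key = key                    # symmetric session key
        self.expires = time.time() + lifetime

    def valid(self):
        return time.time() < self.expires

class SessionCache:
    def __init__(self):
        self._sessions = {}

    def get_or_establish(self, peer, authenticate):
        """Reuse a cached session if possible; otherwise run the
        (expensive) authentication exchange once and cache the result."""
        s = self._sessions.get(peer)
        if s is None or not s.valid():
            key = authenticate(peer)      # e.g. a Kerberos/TLS handshake
            s = Session(peer, key)
            self._sessions[peer] = s
        return s

    def delegate(self, peer, other_cache):
        """Hand an established session to another component so it can
        talk to `peer` securely without re-authenticating."""
        other_cache._sessions[peer] = self._sessions[peer]

# Usage: the second call and the delegated component both avoid a handshake.
cache = SessionCache()
s1 = cache.get_or_establish("schedd.example.org", lambda p: os.urandom(32))
s2 = cache.get_or_establish("schedd.example.org", lambda p: os.urandom(32))
assert s1 is s2

worker = SessionCache()
cache.delegate("schedd.example.org", worker)
```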

    Associations between branched chain amino acid intake and biomarkers of adiposity and cardiometabolic health independent of genetic factors: a twin study

    Background: Conflicting data exist on the impact of dietary and circulating levels of branched chain amino acids (BCAA) on cardiometabolic health, and it is unclear to what extent these relations are mediated by genetics.  Methods: In a cross-sectional study of 1997 female twins we examined associations between BCAA intake, measured using food frequency questionnaires, and a range of markers of cardiometabolic health, including DXA-measured body fat, blood pressure, HOMA-IR, high-sensitivity C-reactive protein (hs-CRP) and lipids. We also measured plasma concentrations of BCAA and known metabolites of amino acid metabolism using untargeted mass spectrometry. Using a within-twin design, multivariable analyses were used to compare the associations between BCAA intake and endpoints of cardiometabolic health, independently of genetic confounding.  Results: Higher BCAA intake was significantly associated with lower HOMA-IR (-0.1, P-trend 0.02), insulin (-0.5 ”U/mL, P-trend 0.03), hs-CRP (-0.3 mg/L, P-trend 0.01), systolic blood pressure (-2.3 mm Hg, P-trend 0.01) and waist-to-height ratio (-0.01, P-trend 0.04), comparing extreme quintiles of intake. These associations persisted in within-pair analysis for monozygotic twins for insulin resistance (P<0.01), inflammation (P=0.03), and blood pressure (P=0.04), suggesting independence from genetic confounding. There was no association between BCAA intake and plasma concentrations, although two metabolites previously associated with obesity were inversely associated with BCAA intake (alpha-hydroxyisovalerate and trans-4-hydroxyproline).  Conclusions: Higher intakes of BCAA were associated, independently of genetics, with lower insulin resistance, inflammation, blood pressure and adiposity-related metabolites. The BCAA intakes associated with our findings are easily achievable in the habitual diet.  Abbreviations: BCAA, branched chain amino acids; DBP, diastolic blood pressure; DZ, dizygotic; FFQ, food frequency questionnaire; HDL-C, high density lipoprotein cholesterol; hs-CRP, high sensitivity C-reactive protein; MZ, monozygotic; SBP, systolic blood pressure; T2DM, type 2 diabetes; WHtR, waist to height ratio.
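
    The within-pair (co-twin control) design used above can be illustrated with simulated data; this is only a sketch of the idea, not the study's analysis code, and all numbers are made up. Because monozygotic co-twins share their genome, differencing intake and outcome within pairs cancels the shared genetic term, leaving an estimate of the intake effect free of genetic confounding.

```python
import random

# Simulated co-twin control analysis: regress within-pair outcome
# differences on within-pair intake differences (hypothetical data).
random.seed(1)

def simulate_mz_pair():
    g = random.gauss(0, 1)                 # genetic effect shared by the pair
    intakes, outcomes = [], []
    for _ in range(2):                     # two co-twins
        intake = random.gauss(0, 1)
        # outcome depends on genetics AND (negatively) on intake
        outcome = 0.8 * g - 0.3 * intake + random.gauss(0, 0.5)
        intakes.append(intake)
        outcomes.append(outcome)
    # within-pair differences: the shared term g cancels out
    return intakes[0] - intakes[1], outcomes[0] - outcomes[1]

pairs = [simulate_mz_pair() for _ in range(1000)]
dx = [p[0] for p in pairs]
dy = [p[1] for p in pairs]

# ordinary least-squares slope through the origin on the differences
slope = sum(x * y for x, y in zip(dx, dy)) / sum(x * x for x in dx)
print(f"within-pair slope ~ {slope:.2f} (true intake effect = -0.30)")
```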

    The decreased molar ratio of phytate:zinc improved zinc nutriture in South Koreans for the past 30 years (1969-1998)

    For the assessment of representative and longitudinal Zn nutriture in South Koreans, Zn, phytate and Ca intakes were determined using food consumption data from four survey years, taken from the Korean National Nutrition Survey Report (KNNSR) at roughly 10-year intervals during 1969-1998. The nutrient intake data are presented for large city and rural areas. Zn intake of South Koreans in both large city and rural areas was low during 1969-1988 (4.5-5.6 mg/d) and thereafter increased to 7.4 mg/d (91% of the Estimated Average Requirement for Koreans, EAR = 8.1 mg/d) and 6.7 mg/d (74% EAR) in 1998 in large city and rural areas, respectively. In 1969, Zn intake was unexpectedly higher in rural areas due to higher grain consumption, but it declined through 1988 before rising again in 1998. Food sources for Zn have shifted from plants to a variety of animal products. Phytate intake of South Koreans during 1969-1978 was high, mainly due to the consumption of grains and soy products, which are the major phytate sources, but had decreased by 1998. The molar ratio of phytate:Zn and the millimolar ratio of phytate×Ca:Zn decreased along with phytate intake, implying higher zinc bioavailability. The results suggest that Zn nutriture in South Koreans has improved through increased dietary Zn intake and a decreased molar ratio of phytate:Zn in both large city and rural areas.
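
    Both ratios above follow from simple molar arithmetic. A worked example with hypothetical daily intakes (the molar masses are standard reference values; the mg/d figures are made up for illustration):

```python
# Molar masses (g/mol): phytic acid ~660, Zn ~65.4, Ca ~40.1.
M_PHYTATE, M_ZN, M_CA = 660.0, 65.4, 40.1

def phytate_zn_ratio(phytate_mg, zn_mg):
    """Molar ratio phytate:Zn; values above ~15 are commonly taken to
    indicate poor Zn bioavailability."""
    return (phytate_mg / M_PHYTATE) / (zn_mg / M_ZN)

def phytate_ca_zn_ratio(phytate_mg, ca_mg, zn_mg):
    """Millimolar ratio (phytate x Ca):Zn, reflecting the potentiating
    effect of calcium on phytate's inhibition of Zn absorption."""
    phytate_mmol = phytate_mg / M_PHYTATE
    ca_mmol = ca_mg / M_CA
    zn_mmol = zn_mg / M_ZN
    return phytate_mmol * ca_mmol / zn_mmol

# Hypothetical intakes: 1500 mg phytate, 500 mg Ca, 5 mg Zn per day.
print(phytate_zn_ratio(1500, 5))          # ~29.7: high, poor availability
print(phytate_ca_zn_ratio(1500, 500, 5))  # combined mmol-based ratio
```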

    Soil type influences crop mineral composition in Malawi

    Food supply and composition data can be combined to estimate micronutrient intakes and deficiency risks among populations. These estimates can be improved by using local crop composition data that capture environmental influences, including soil type. This study aimed to provide spatially resolved crop composition data for Malawi, where such information is currently limited. Six hundred and fifty-two plant samples, representing 97 edible food items, were collected from more than 150 sites in Malawi between 2011 and 2013. Samples were analysed by ICP-MS for up to 58 elements, including the essential minerals calcium (Ca), copper (Cu), iron (Fe), magnesium (Mg), selenium (Se) and zinc (Zn). Maize grain Ca, Cu, Fe, Mg, Se and Zn concentrations were greater in plants grown on calcareous soils than in those from the more widespread low-pH soils. Leafy vegetables from calcareous soils had elevated leaf Ca, Cu, Fe and Se concentrations, but lower Zn concentrations. Several foods were found to accumulate high levels of Se, including the leaves of Moringa, a crop not previously reported in East African food composition data sets. New estimates of national dietary mineral supplies were obtained for non-calcareous and calcareous soils. High risks of Ca (100%), Se (100%) and Zn (57%) dietary deficiencies are likely on non-calcareous soils. Deficiency risks on calcareous soils are high for Ca (97%), but lower for Se (34%) and Zn (31%). Risks of Cu, Fe and Mg deficiencies appear to be low on the basis of dietary supply levels.
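
    The deficiency-risk percentages above rest on comparing estimated dietary supplies with requirements. A minimal sketch of an EAR cut-point style calculation, using made-up supply figures and an illustrative EAR rather than the paper's data:

```python
import math

def deficiency_risk(mean_supply, sd_supply, ear):
    """Fraction of a normally distributed per-capita supply that falls
    below the Estimated Average Requirement (normal CDF via erf)."""
    z = (ear - mean_supply) / sd_supply
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical Zn supply on non-calcareous soils: mean 8 mg/d, SD 2 mg/d,
# against an illustrative EAR of 10 mg/d.
print(f"Zn deficiency risk: {deficiency_risk(8, 2, 10):.0%}")  # ~84%
```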

    A transatlantic perspective on 20 emerging issues in biological engineering

    Advances in biological engineering are likely to have substantial impacts on global society. To explore these potential impacts we ran a horizon scanning exercise to capture a range of perspectives on the opportunities and risks presented by biological engineering. We first identified 70 potential issues, and then used an iterative process to prioritise 20 issues that we considered to be emerging, to have potential global impact, and to be relatively unknown outside the field of biological engineering. The issues identified may be of interest to researchers, businesses and policy makers in sectors such as health, energy, agriculture and the environment.

    Evidence for models of diagnostic service provision in the community: literature mapping exercise and focused rapid reviews

    Background Current NHS policy favours the expansion of diagnostic testing services in community and primary care settings. Objectives Our objectives were to identify current models of community diagnostic services in the UK and internationally and to assess the evidence for quality, safety and clinical effectiveness of such services. We were also interested in whether or not there is any evidence to support a broader range of diagnostic tests being provided in the community. Review methods We performed an initial broad literature mapping exercise to assess the quantity and nature of the published research evidence. The results were used to inform selection of three areas for investigation in more detail. We chose to perform focused reviews on the logistics of diagnostic modalities in primary care (because the relevant issues differ widely between different types of test); diagnostic ultrasound (a key diagnostic technology affected by developments in equipment); and a diagnostic pathway (assessment of breathlessness) typically delivered wholly or partly in primary care/community settings. Databases and other sources searched, and search dates, were decided individually for each review. Quantitative and qualitative systematic reviews and primary studies of any design were eligible for inclusion. Results We identified seven main models of service that are delivered in primary care/community settings, in most cases with the possible involvement of community/primary care staff. Not all of these models are relevant to all types of diagnostic test. Overall, the evidence base for community- and primary care-based diagnostic services was limited, with very few controlled studies comparing different models of service. We found evidence from different settings that these services can reduce referrals to secondary care and allow more patients to be managed in primary care, but the quality of the research was generally poor. Evidence on the quality (including diagnostic accuracy and appropriateness of test ordering) and safety of such services was mixed. Conclusions In the absence of clear evidence of superior clinical effectiveness and cost-effectiveness, the expansion of community-based services appears to be driven by other factors. These include policies to encourage moving services out of hospitals; the promise of reduced waiting times for diagnosis; the availability of a wider range of suitable tests and/or cheaper, more user-friendly equipment; and the ability of commercial providers to bid for NHS contracts. However, service development also faces a number of barriers, including issues related to staffing, training, governance and quality control. Limitations We have not attempted to cover all types of diagnostic technology in equal depth. Time and staff resources constrained our ability to carry out review processes in duplicate. Research in this field is limited by the difficulty of obtaining, from publicly available sources, up-to-date information about what models of service are commissioned, where and from which providers. Future work There is a need for research to compare the outcomes of different service models using robust study designs. Comparisons of ‘true’ community-based services with secondary care-based open-access services and rapid access clinics would be particularly valuable. There are specific needs for economic evaluations and for studies that incorporate effects on the wider health system. There appears to be no easy way of identifying what services are being commissioned from whom and keeping up with local evaluations of new services, suggesting a need to improve the availability of information in this area. Funding The National Institute for Health Research Health Services and Delivery Research programme.

    Biomarkers of Nutrition for Development (BOND)—Iron Review

    This is the fifth in the series of reviews developed as part of the Biomarkers of Nutrition for Development (BOND) program. The BOND Iron Expert Panel (I-EP) reviewed the extant knowledge regarding iron biology, public health implications, and the relative usefulness of currently available biomarkers of iron status from deficiency to overload. Approaches to assessing intake, including bioavailability, are also covered. The report also covers technical and laboratory considerations for the use of available biomarkers of iron status, and concludes with a description of research priorities along with a brief discussion of new biomarkers with potential for use across the spectrum of activities related to the study of iron in human health. The I-EP concluded that current iron biomarkers are reliable for accurately assessing many aspects of iron nutrition. However, a clear distinction is made between the relative strengths of biomarkers to assess hematological consequences of iron deficiency versus other putative functional outcomes, particularly the relationship between maternal and fetal iron status during pregnancy, birth outcomes, and infant cognitive, motor and emotional development. The I-EP also highlighted the importance of considering the confounding effects of inflammation and infection on the interpretation of iron biomarker results, as well as the impact of life stage. Finally, alternative approaches to the evaluation of the risk for nutritional iron overload at the population level are presented, because the currently designated upper limits for the biomarker generally employed (serum ferritin) may not differentiate between true iron overload and the effects of subclinical inflammation.
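
    The inflammation caveat can be illustrated with a toy classifier; the cutoffs below are assumptions chosen for illustration, not the I-EP's recommendations. Ferritin is an acute-phase protein, so an apparently normal value during inflammation can mask iron deficiency, which is why the threshold is often interpreted differently when an inflammation marker such as CRP is raised.

```python
# Simplified sketch of inflammation-aware ferritin interpretation.
# All thresholds here are illustrative assumptions, not clinical guidance.

def iron_status(ferritin_ug_l, crp_mg_l,
                cutoff=15.0, inflamed_cutoff=30.0, crp_limit=5.0):
    inflamed = crp_mg_l > crp_limit
    threshold = inflamed_cutoff if inflamed else cutoff
    if ferritin_ug_l < threshold:
        return "possible iron deficiency" + (" (inflammation present)" if inflamed else "")
    return "iron replete by ferritin" + (" (interpret cautiously: inflammation)" if inflamed else "")

print(iron_status(ferritin_ug_l=20, crp_mg_l=1))   # replete at the base cutoff
print(iron_status(ferritin_ug_l=20, crp_mg_l=12))  # flagged once CRP is raised
```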

    Accurate diagnosis of latent tuberculosis in children, people who are immunocompromised or at risk from immunosuppression and recent arrivals from countries with a high incidence of tuberculosis: systematic review and economic evaluation


    Systematic Review of Potential Health Risks Posed by Pharmaceutical, Occupational and Consumer Exposures to Metallic and Nanoscale Aluminum, Aluminum Oxides, Aluminum Hydroxide and Its Soluble Salts

    Aluminum (Al) is a ubiquitous substance encountered both naturally (as the third most abundant element) and intentionally (used in water, foods, pharmaceuticals, and vaccines); it is also present in ambient and occupational airborne particulates. Existing data underscore the importance of Al physical and chemical forms in relation to its uptake, accumulation, and systemic bioavailability. The present review represents a systematic examination of the peer-reviewed literature on the adverse health effects of Al materials published since a previous critical evaluation compiled by Krewski et al. (2007). Challenges encountered in carrying out the present review reflected the experimental use of different physical and chemical Al forms, different routes of administration, and different target organs in relation to the magnitude, frequency, and duration of exposure. Wide variations in diet can result in Al intakes that are often higher than the World Health Organization provisional tolerable weekly intake (PTWI), which is based on studies with Al citrate. Comparing daily dietary Al exposures on the basis of “total Al” assumes that gastrointestinal bioavailability for all dietary Al forms is equivalent to that for Al citrate, an approach that requires validation. Current occupational exposure limits (OELs) for identical Al substances vary as much as 15-fold. The toxicity of different Al forms depends in large measure on their physical behavior and relative solubility in water. The toxicity of soluble Al forms depends upon the delivered dose of Al³⁺ to target tissues. Trivalent Al reacts with water to produce bidentate superoxide coordination spheres [Al(O₂)(H₂O)₄]²⁺ and [Al(H₂O)₆]³⁺ that, after complexation with O₂•−, generate Al superoxides [Al(O₂•)(H₂O)₅]²⁺. Semireduced AlO₂• radicals deplete mitochondrial Fe and promote generation of H₂O₂, O₂•− and OH•. Thus, it is the Al³⁺-induced formation of oxygen radicals that accounts for the oxidative damage that leads to intrinsic apoptosis. In contrast, the toxicity of the insoluble Al oxides depends primarily on their behavior as particulates. Aluminum has been held responsible for human morbidity and mortality, but there is no consistent and convincing evidence to associate the Al found in food and drinking water at the doses and chemical forms presently consumed by people living in North America and Western Europe with increased risk for Alzheimer's disease (AD). Neither is there clear evidence to show that use of Al-containing underarm antiperspirants or cosmetics increases the risk of AD or breast cancer. Metallic Al, its oxides, and common Al salts have not been shown to be either genotoxic or carcinogenic. Aluminum exposures during neonatal and pediatric parenteral nutrition (PN) can impair bone mineralization and delay neurological development. Adverse effects of vaccines with Al adjuvants have occurred; however, recent controlled trials found that the immunologic response to certain vaccines with Al adjuvants was no greater, and in some cases less, than that after identical vaccination without Al adjuvants. The scientific literature on the adverse health effects of Al is extensive. Health risk assessments for Al must take into account individual co-factors (e.g., age, renal function, diet, gastric pH). Conclusions from the current review point to the need for refinement of the PTWI, reduction of Al contamination in PN solutions, justification for routine addition of Al to vaccines, and harmonization of OELs for Al substances.
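
    The PTWI comparison above reduces to simple arithmetic. A back-of-envelope sketch, assuming the JECFA PTWI of 2 mg/kg body weight per week and a hypothetical daily intake (both figures are illustrative assumptions, not values from the review):

```python
# Compare a daily dietary Al intake against a provisional tolerable
# weekly intake (PTWI). The PTWI value and intake are assumptions.
PTWI_MG_PER_KG_WEEK = 2.0      # assumed JECFA PTWI for Al

def fraction_of_ptwi(daily_intake_mg, body_weight_kg):
    weekly_intake = daily_intake_mg * 7
    allowance = PTWI_MG_PER_KG_WEEK * body_weight_kg
    return weekly_intake / allowance

# A hypothetical 70 kg adult ingesting 25 mg Al/day from food:
print(f"{fraction_of_ptwi(25, 70):.0%} of the PTWI")  # 175/140 = 125%
```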

    Testing Standards for Sporicides

    Sporicidal products are of considerable importance in healthcare environments because products are required that can deal with contamination by Clostridium difficile spores. Sporicidal testing standards that validate claims of sporicidal activity are an important tool in the evaluation of commercial sporicides, and within Europe a number of such standards are in common use. However, the extent to which these standards reflect the practical application of sporicides in healthcare settings is limited, since they employ long contact times (≄30 min) and do not involve surface contamination. Alternative international standards are available which employ contaminated carriers rather than spore suspensions, and the Organisation for Economic Co-operation and Development is currently developing a unified set of standards which are more realistic in their design than the currently available European standards. This paper reviews the currently available testing standards for sporicides, highlighting the key procedural differences between them and the extent to which they reflect the practical application of sporicidal products. Some of the common problems and errors associated with the application of the European sporicidal standard methods are also highlighted and discussed. Finally, gaps in the currently available testing standards are identified and discussed.
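
    The efficacy claims such standards validate come down to a log10 reduction in viable spores over the contact time. A minimal sketch of that calculation, with illustrative colony counts and an assumed 3-log pass criterion (not values taken from any specific standard):

```python
import math

# Decimal (log10) reduction in viable spores, as used in suspension tests.
def log_reduction(initial_cfu_per_ml, surviving_cfu_per_ml):
    return math.log10(initial_cfu_per_ml) - math.log10(surviving_cfu_per_ml)

lr = log_reduction(initial_cfu_per_ml=1e7, surviving_cfu_per_ml=5e3)
print(f"log reduction = {lr:.2f}")        # ~3.30
print("pass" if lr >= 3 else "fail")      # against an assumed 3-log criterion
```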
    • 
