
    E-HR and employee self-service in a British public sector organisation: an exploratory analysis

    The purpose of the paper is to present empirical research on the use of an Employee Self-Service (ESS) system in a British public sector work organization, and issues associated with its introduction. A case study approach, detailing interviews with managers and employees, is used. Content analysis and selective coding are employed to analyze the data. Results indicate that management and employee perceptions of using ESS differ, and challenges arise in ESS implementation, including: the relevant HR role; cultural/emotional adjustments needed from staff; and those associated with appropriate organizational development/change and project organization/management. The limitations of this study include interviews with a small number of staff, which limits the generalizability of results. Practical implications and recommendations are provided, namely that HR staff need to build a coherent approach regarding ESS implementation. The value of the paper is that it represents new empirical data on ESS in practice, and a critical appraisal of it from remote home workers. Moreover, it contributes to research by contrasting its findings with the prescriptive/descriptive consultancy-led literature.

    Disentangling Climate and Disturbance Effects on Regional Vegetation Greening Trends

    Productivity of northern latitude forests is an important driver of the terrestrial carbon cycle and is already responding to climate change. Studies of the satellite-derived Normalized Difference Vegetation Index (NDVI) for northern latitudes indicate recent changes in plant productivity. These detected greening and browning trends are often attributed to a lengthening of the growing season from warming temperatures. Yet, disturbance-recovery dynamics are strong drivers of productivity and can mask direct effects of climate change. Here, we analyze 1-km resolution NDVI data from 1989 to 2014 for the northern latitude forests of the Greater Yellowstone Ecosystem (GYE) for changes in plant productivity to address the following questions: (1) To what degree has greening taken place in the GYE over the past three decades? and (2) What is the relative importance of disturbance and climate in explaining NDVI trends? We found that the spatial extents of statistically significant productivity trends were limited to local greening and browning areas. Disturbance history, predominantly fire disturbance, was a major driver of these detected NDVI trends. After accounting for fire-, insect-, and human-caused disturbances, increasing productivity trends remained. Productivity of northern latitude forests is generally considered temperature-limited; yet, we found that precipitation was a key driver of greening in the GYE.
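
    The trend detection described above (a per-pixel regression of NDVI against year, keeping only statistically significant slopes) can be sketched as follows. This is a minimal illustration assuming annual NDVI composites and a p < 0.05 cutoff; the synthetic array, its shape, and the threshold are our own placeholders, not the authors' actual pipeline.

```python
# Minimal per-pixel NDVI trend sketch: fit an OLS slope per pixel and
# keep only trends significant at p < 0.05. Data here are random.
import numpy as np
from scipy import stats

years = np.arange(1989, 2015)                  # 26 annual time steps
ndvi = np.random.rand(len(years), 100, 100)    # placeholder (time, y, x) stack

greening = np.zeros(ndvi.shape[1:], dtype=bool)
browning = np.zeros(ndvi.shape[1:], dtype=bool)
for i in range(ndvi.shape[1]):
    for j in range(ndvi.shape[2]):
        slope, _, _, p, _ = stats.linregress(years, ndvi[:, i, j])
        if p < 0.05:                           # significant trend only
            greening[i, j] = slope > 0
            browning[i, j] = slope < 0

print(f"greening: {greening.mean():.1%} of pixels, "
      f"browning: {browning.mean():.1%}")
```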

    The effect of 14 weeks of vitamin D3 supplementation on antimicrobial peptides and proteins in athletes

    Heavy training is associated with increased respiratory infection risk, and antimicrobial proteins are important in defence against oral and respiratory tract infections. We examined the effect of 14 weeks of vitamin D3 supplementation (5000 IU/day) on the resting plasma cathelicidin concentration and the salivary secretion rates of secretory immunoglobulin A (SIgA), cathelicidin, lactoferrin and lysozyme in athletes during a winter training period. Blood and saliva were obtained at the start of the study from 39 healthy men who were randomly allocated to vitamin D3 supplement or placebo. Blood samples were also collected at the end of the study; saliva samples were collected after 7 and 14 weeks. Plasma total 25(OH)D concentration increased by 130% in the vitamin D3 group and decreased by 43% in the placebo group (both P=0.001). The percentage change of plasma cathelicidin concentration in the vitamin D3 group was higher than in the placebo group (P=0.025). The saliva SIgA and cathelicidin secretion rates increased over time only in the vitamin D3 group (both P=0.03). A daily 5000 IU vitamin D3 supplement has a beneficial effect in up-regulating the expression of SIgA and cathelicidin in athletes during a winter training period, which could improve resistance to respiratory infections.
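
    As a small aside on the arithmetic, figures such as the +130% and -43% changes quoted above are percent changes from baseline. The sketch below reproduces them with invented concentration values; the numbers are not the study's measurements.

```python
# Percent change from baseline, as behind the "+130%" / "-43%" figures
# above. The 25(OH)D concentrations here are hypothetical.
def pct_change(baseline: float, followup: float) -> float:
    return 100.0 * (followup - baseline) / baseline

vit_d = pct_change(50.0, 115.0)    # invented values, nmol/L
placebo = pct_change(50.0, 28.5)
print(f"vitamin D3 group: {vit_d:+.0f}%, placebo: {placebo:+.0f}%")  # +130%, -43%
```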

    Anabranching and maximum flow efficiency in Magela Creek, northern Australia

    Anabranching is the prevailing river pattern found along alluvial tracts of the world's largest rivers. Hydraulic geometry and bed material discharge are compared between single-channel and anabranching reaches at up to 4 times bank-full discharge in Magela Creek, northern Australia. The anabranching channels exhibit greater sediment transporting capacity per unit available stream power, i.e., maximum flow efficiency (MFE). Simple flume experiments corroborate our field results, showing the flow efficiency gains associated with anabranching, and highlight the prospect of a dominant anabranch, a feature found in many anabranching rivers. These results demonstrate that anabranching can constitute a stable river pattern in dynamic equilibrium under circumstances in which a continuous single channel would be unable to maintain sediment conveyance. We propose the existence of a flow efficiency continuum that embraces dynamic equilibrium and disequilibrium (vertically accreting) anabranching rivers.
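
    For readers unfamiliar with the normalising quantity here, specific stream power is omega = rho * g * Q * S / w (power per unit bed area). The sketch below evaluates it for a single channel versus the same discharge split among narrower anabranches; all discharge, slope and width values are made up for illustration, not Magela Creek measurements.

```python
# Illustrative specific stream power: omega = rho * g * Q * S / w.
# Values below are invented, purely to show the geometry effect.
RHO, G = 1000.0, 9.81          # water density (kg/m^3), gravity (m/s^2)

def specific_stream_power(Q, S, w):
    """Stream power per unit bed area, in W/m^2."""
    return RHO * G * Q * S / w

Q, S = 100.0, 0.0005           # bank-full discharge (m^3/s), energy slope
single = specific_stream_power(Q, S, w=60.0)          # one wide channel
anabranch = specific_stream_power(Q / 3, S, w=12.0)   # one of three narrower channels

print(f"single channel: {single:.2f} W/m^2, each anabranch: {anabranch:.2f} W/m^2")
```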

    A Randomised, Blinded, Placebo-Controlled, Dose Escalation Study of the Tolerability and Efficacy of Filgrastim for Haemopoietic Stem Cell Mobilisation in Patients With Severe Active Rheumatoid Arthritis

    Autologous haemopoietic stem cell transplantation (HSCT) represents a potential therapy for severe rheumatoid arthritis (RA). As a prelude to clinical trials, the safety and efficacy of haemopoietic stem cell (HSC) mobilisation required investigation, as colony-stimulating factors (CSFs) have been reported to flare RA. A double-blind, randomised, placebo-controlled dose escalation study was performed. Two cohorts of eight patients fulfilling strict eligibility criteria for severe active RA (median age 40 years, range 24-60 years; median disease duration 10.5 years, range 2-18 years) received filgrastim (r-Hu-methionyl granulocyte colony-stimulating factor, G-CSF) at 5 and 10 microg/kg/day, randomised in a 5:3 ratio with placebo. Patients were unblinded on the fifth day of treatment and those randomised to filgrastim underwent cell harvesting (leukapheresis) daily until 2 × 10^6/kg CD34+ cells (haemopoietic stem and progenitor cells) were obtained. Patients were assessed by clinical and laboratory parameters before, during and after filgrastim administration. RA flare was defined as an increase of 30% or more in two of the following parameters: tender joint count, swollen joint count or pain score. Efficacy was assessed by quantitation of CD34+ cells and CFU-GM. One patient in the 5 microg/kg/day group and two patients in the 10 microg/kg/day group fulfilled criteria for RA flare, although this did not preclude successful stem cell collection. Median changes in swollen and tender joint counts were not supportive of filgrastim consistently causing exacerbation of disease, but administration of filgrastim at 10 microg/kg/day was associated with rises in median C-reactive protein and median rheumatoid factor compared with placebo. Other adverse events were those well recognised for filgrastim and included bone pain (80%) and increases in alkaline phosphatase (four-fold) and lactate dehydrogenase (two-fold). With respect to efficacy, filgrastim at 10 microg/kg/day was more efficient, with all patients (n = 5) achieving target CD34+ cell counts with a single leukapheresis (median = 2.8, range = 2.3-4.8 × 10^6/kg; median CFU-GM = 22.1, range = 4.2-102.9 × 10^4/kg), whereas 1-3 leukaphereses were necessary to achieve the target yield using 5 microg/kg/day. We conclude that filgrastim may be administered to patients with severe active RA for effective stem cell mobilisation. Flare of RA occurs in a minority of patients and is more likely with 10 than 5 microg/kg/day. However, on balance, 10 microg/kg/day remains the dose of choice in view of more efficient CD34+ cell mobilisation.
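
    The flare definition above is a simple decision rule (a rise of 30% or more in at least two of three parameters), and can be written directly as code. The field names and example values below are hypothetical; only the rule itself comes from the abstract.

```python
# Sketch of the stated flare rule: >=30% increase in at least two of
# tender joint count, swollen joint count, or pain score.
def ra_flare(baseline: dict, current: dict) -> bool:
    params = ("tender_joints", "swollen_joints", "pain_score")
    worsened = sum(
        baseline[p] > 0 and current[p] >= 1.3 * baseline[p]
        for p in params
    )
    return worsened >= 2

print(ra_flare({"tender_joints": 10, "swollen_joints": 8, "pain_score": 50},
               {"tender_joints": 14, "swollen_joints": 11, "pain_score": 52}))  # True
```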

    Analysis of lesion localisation at colonoscopy: outcomes from a multi-centre U.K. study

    Background: Colonoscopy is currently the gold standard for detection of colorectal lesions, but may be limited in anatomically localising lesions. This audit aimed to determine the accuracy of colonoscopic lesion localisation, any subsequent changes in surgical management, and any potentially influencing factors. Methods: Patients undergoing colonoscopy prior to elective curative surgery for colorectal lesion/s were included from 8 registered U.K. sites (2012–2014). Three sets of data were recorded: patient factors (age, sex, BMI, screener vs. symptomatic, previous abdominal surgery); colonoscopy factors (caecal intubation, scope guide used, colonoscopist accreditation); and imaging modality. Lesion localisation was standardised, with intra-operative location taken as the gold standard. Changes to surgical management were recorded. Results: 364 cases were included; the majority of lesions were colonic, solitary, malignant and in symptomatic referrals. 82% of patients had their lesion/s correctly located at colonoscopy. Pre-operative CT visualised lesion/s in only 73% of cases, with a lower rate in screening patients (64 vs. 77%; p = 0.008). 5.2% of cases incorrectly located at colonoscopy underwent altered surgical management, including conversion to open surgery. Univariate analysis found colonoscopy accreditation, scope guide use, incomplete colonoscopy and previous abdominal surgery significantly influenced lesion localisation. On multivariate analysis, caecal intubation and scope guide use remained significant (HR 0.35, 95% CI 0.20–0.60, and HR 0.47, 95% CI 0.25–0.88, respectively). Conclusion: Lesion localisation at colonoscopy is incorrect in 18% of cases, leading to potentially significant alterations in surgical management. As part of accreditation, colonoscopists need lesion localisation training and awareness of when inaccuracies can occur.
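
    A multivariable model of the kind reported above (binary outcome, exponentiated effect estimates with 95% confidence intervals) could be fitted as sketched below. The data frame, column names, and random data are invented for illustration and do not reproduce the audit's dataset or exact modelling choices.

```python
# Hedged sketch: logistic model of correct lesion localisation with
# two binary predictors; data are random placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "correct_localisation": rng.integers(0, 2, 364),
    "caecal_intubation": rng.integers(0, 2, 364),
    "scope_guide": rng.integers(0, 2, 364),
})

model = smf.logit("correct_localisation ~ caecal_intubation + scope_guide",
                  data=df).fit(disp=False)
# Exponentiated coefficients give effect estimates with 95% CIs
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```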

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
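
    The point about probabilistic multiplication of sub-factors can be made concrete with a small Monte Carlo sketch: two 10-fold sub-factors are combined under two different distributional assumptions, and the upper percentile of the combined factor differs accordingly. The distribution parameters below are arbitrary illustrations, not values from the paper.

```python
# Combined uncertainty factor under two distributional assumptions for
# the toxicokinetic and toxicodynamic sub-factors; parameters arbitrary.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Each row is one sub-factor; the product combines the two components.
lognormal = rng.lognormal(mean=np.log(10) / 2, sigma=0.4, size=(2, n)).prod(axis=0)
uniform = rng.uniform(1, 10, size=(2, n)).prod(axis=0)

for name, combined in [("lognormal", lognormal), ("uniform", uniform)]:
    print(f"{name}: median={np.median(combined):.1f}, "
          f"95th percentile={np.percentile(combined, 95):.1f}")
```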

    International Dateline: The Art of Acquiring CD-ROM Technology


    Risk factors for high anti-HHV-8 antibody titers (≥1:51,200) in black, HIV-1 negative South African cancer patients: a case control study

    Background: Infection with human herpesvirus 8 (HHV-8) is the necessary causal agent in the development of Kaposi's sarcoma (KS). Infection with HIV-1, male gender and older age all increase the risk for KS. However, the geographic distributions of HHV-8 and KS, both prior to the HIV/AIDS epidemic and with HIV/AIDS, suggest the presence of an additional co-factor in the development of KS. Methods: Between January 1994 and October 1997, we interviewed 2576 black in-patients with cancer in Johannesburg and Soweto, South Africa. Blood was tested for antibodies against HIV-1 and HHV-8, and the study was restricted to 2191 HIV-1 negative patients. Antibodies against the latent nuclear antigen of HHV-8, encoded by orf73, were detected with an indirect immunofluorescence assay. We examined the relationship between high anti-HHV-8 antibody titers (≥1:51,200) and sociodemographic and behavioral factors using unconditional logistic regression models. Variables that were significant at p = 0.10 were included in the multivariate analysis. Results: Of the 2191 HIV-1 negative patients who did not have Kaposi's sarcoma, 854 (39.0%) were positive for antibodies against HHV-8 according to the immunofluorescent assay. Among those seropositive for HHV-8, 530 (62.1%) had low titers (1:200), 227 (26.6%) had medium titers (1:51,200) and 97 (11.4%) had the highest titers (1:204,800). Among the 2191 HIV-1 negative patients, the prevalence of high anti-HHV-8 antibody titers (≥1:51,200) was independently associated with increasing age (p-trend = 0.04), having a marital status of separated or divorced (p = 0.003), using wood, coal or charcoal as fuel for cooking 20 years ago instead of electricity (p = 0.02) and consuming traditional maize beer more than once a week (p = 0.02; p-trend for increasing consumption = 0.05), although this may be due to chance given the large number of predictors considered in this analysis. Conclusions: Among HIV-negative subjects, patients with high anti-HHV-8 antibody titers are characterized by older age. Other associations that may be factors in the development of high anti-HHV-8 titers include exposure to poverty or a low socioeconomic status environment and consumption of traditional maize beer. The relationship between these variables and high anti-HHV-8 titers requires further, prospective study.
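
    A note on the titer cut-offs quoted above: 1:200, 1:51,200 and 1:204,800 lie on a doubling dilution series (200 × 2^8 = 51,200; 200 × 2^10 = 204,800). The sketch below generates that series and applies the study's ≥1:51,200 cut-off; the classify helper is our own illustration, not the study's code.

```python
# Doubling dilution series starting at 1:200, matching the titers in
# the abstract (200 * 2**8 = 51,200; 200 * 2**10 = 204,800).
HIGH_TITER = 51_200   # study cutoff for a "high" anti-HHV-8 titer

dilutions = [200 * 2**k for k in range(11)]
print(dilutions)      # [200, 400, ..., 102400, 204800]

def classify(titer: int) -> str:
    return "high (>=1:51,200)" if titer >= HIGH_TITER else "low/medium"

print(classify(204_800))   # high (>=1:51,200)
```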