1,511 research outputs found

    Role of tax knowledge and skills: What are the graduate skills required by small to medium accounting firms

    Small and medium accounting (SMA) firms can account for approximately 40 per cent of graduate recruitment in Australia. Does the context of obtaining employment with an SMA firm require graduates to have certain knowledge and skills? This article reports the findings of a study into the technical and generic skills required by graduates commencing employment within an Australian SMA firm. The findings suggest that, together with financial statement and reporting, tax knowledge is highly valued for graduates joining an SMA firm. In terms of tax, this also includes the ability to use tax software. Also, the generic skills of communication, teamwork and ethics are highly regarded. This raises the question as to whether current university degrees are providing adequate technical and generic skill development for those graduates seeking employment with an SMA firm.

    North American carbon dioxide sources and sinks: magnitude, attribution, and uncertainty

    North America is both a source and sink of atmospheric carbon dioxide (CO2). Continental sources - such as fossil-fuel combustion in the US and deforestation in Mexico - and sinks - including most ecosystems, and particularly secondary forests - add and remove CO2 from the atmosphere, respectively. Photosynthesis converts CO2 into carbon as biomass, which is stored in vegetation, soils, and wood products. However, ecosystem sinks compensate for only ~35% of the continent's fossil-fuel-based CO2 emissions; North America therefore represents a net CO2 source. Estimates of the magnitude of ecosystem sinks, although confounded by uncertainty arising from individual inventory- and model-based approaches, have improved through the use of a combined approach. Front Ecol Environ 2012; 10(10): 512-519, doi:10.1890/12006

    Scottish and Newcastle antiemetic pre-treatment for paracetamol poisoning study (SNAP)

    BACKGROUND: Paracetamol (acetaminophen) poisoning remains the commonest cause of acute liver injury in Europe and North America. The intravenous (IV) N-acetylcysteine (NAC) regimen introduced in the 1970s has continued effectively unchanged. This involves 3 different infusion regimens (dose and time) lasting over 20 hours. The same weight-related dose of NAC is used irrespective of paracetamol dose. Complications include frequent nausea and vomiting, anaphylactoid reactions and dosing errors. We designed a randomised controlled study investigating the efficacy of antiemetic pre-treatment (ondansetron) using standard NAC and a modified, shorter, regimen. METHODS/DESIGN: We designed a double-blind trial using a 2 × 2 factorial design involving four parallel groups. Pre-treatment with ondansetron 4 mg IV was compared against placebo on nausea and vomiting following the standard (20.25 h) regimen or a novel 12 h NAC regimen in paracetamol poisoning. Each regimen delivered 300 mg/kg bodyweight NAC. Randomisation was stratified on paracetamol dose, perceived risk factors, and time to presentation. The primary outcome was the incidence of nausea and vomiting following NAC. In addition, the frequency of anaphylactoid reactions and end-of-treatment liver function were documented. Where clinically necessary, further doses of NAC were administered as per standard UK protocols at the end of the first antidote course. DISCUSSION: This study is primarily designed to test the efficacy of prophylactic anti-emetic therapy with ondansetron, but it is also the first attempt to formally examine new methods of administering IV NAC in paracetamol overdose. We anticipate, from volunteer studies, that nausea and vomiting will be less frequent with the new NAC regimen. In addition, as the anaphylactoid response appears related to plasma concentrations of both NAC and paracetamol, anaphylactoid reactions should be less likely.
This study is not powered to assess the relative efficacy of the two NAC regimens; however, it will give useful information to power future studies. As the first formal randomised clinical trial in this patient group in over 30 years, this study will also provide information to support further studies in paracetamol overdose, particularly when linked with modern novel biomarkers of liver damage in patients at different toxicity risk. TRIAL REGISTRATION: EudraCT number 2009-017800-10, ClinicalTrials.gov Identifier NCT0105027
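The 2 × 2 factorial design described above can be sketched as a crossing of the two treatment factors. A minimal illustration (the group labels are paraphrased from the abstract, not taken from the trial protocol):

```python
from itertools import product

# Two factors, two levels each: antiemetic pre-treatment and NAC regimen.
pretreatments = ["ondansetron 4 mg IV", "placebo"]
nac_regimens = ["standard 20.25 h NAC", "modified 12 h NAC"]

# Crossing the factors yields the four parallel groups of the 2 x 2 design.
arms = list(product(pretreatments, nac_regimens))
for arm in arms:
    print(arm)
```

Each participant is randomised to exactly one of the four arms, which lets the trial estimate the effect of each factor while sharing participants across comparisons.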

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we expose that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment. Oak Foundation
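As a worked illustration of how the default factor of 100 is applied, the tolerable human dose is conventionally derived by dividing the animal no-observed-adverse-effect level (NOAEL) by the product of a 10-fold interspecies sub-factor and a 10-fold intraspecies sub-factor. A minimal sketch, with a hypothetical NOAEL value:

```python
def tolerable_dose(noael_mg_per_kg, interspecies=10.0, intraspecies=10.0):
    """Divide the NOAEL by the product of the default sub-factors (10 x 10 = 100)."""
    return noael_mg_per_kg / (interspecies * intraspecies)

# Hypothetical NOAEL of 50 mg/kg bodyweight/day -> tolerable dose of 0.5 mg/kg/day.
print(tolerable_dose(50.0))
```

The article's point is that this multiplication of sub-factors reflects adequate rather than worst-case assumptions, so the resulting tolerable dose cannot be assumed to absorb additional mixture effects.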

    Strong interface-induced spin-orbit coupling in graphene on WS2

    Interfacial interactions allow the electronic properties of graphene to be modified, as recently demonstrated by the appearance of satellite Dirac cones in the band structure of graphene on hexagonal boron nitride (hBN) substrates. Ongoing research strives to explore interfacial interactions in a broader class of materials in order to engineer targeted electronic properties. Here we show that at an interface with a tungsten disulfide (WS2) substrate, the strength of the spin-orbit interaction (SOI) in graphene is very strongly enhanced. The induced SOI leads to a pronounced low-temperature weak anti-localization (WAL) effect, from which we determine the spin-relaxation time. We find that the spin-relaxation time in graphene is two to three orders of magnitude smaller on WS2 than on SiO2 or hBN, and that it is comparable to the intervalley scattering time. To interpret our findings we have performed first-principles electronic structure calculations, which both confirm that carriers in graphene-on-WS2 experience a strong SOI and allow us to extract a spin-dependent low-energy effective Hamiltonian. Our analysis further shows that the use of WS2 substrates opens a possible new route to access topological states of matter in graphene-based systems. Comment: Originally submitted version in compliance with editorial guidelines. Final version with expanded discussion of the relation between theory and experiments to be published in Nature Communications.

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √ s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb^-1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with p_T > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.

    Being there: a preliminary study examining the role of presence in Internet Gaming Disorder

    Internet Gaming Disorder (IGD) has been introduced as an emerging mental health condition requiring further study. Associations between IGD and gaming presence (i.e., absorption in the virtual environment) have been implied. The aim of the present study was twofold: (a) to evaluate the extent to which presence contributes to IGD severity and (b) to examine longitudinal differences in IGD according to the initial level of presence experienced. The participants, comprising 125 emerging adults aged 18 to 29 years, completed either (i) three face-to-face assessments (one month apart, over three months) or (ii) a cross-sectional online assessment. IGD was assessed with the nine-item IGD Scale Short Form and presence was assessed using the Presence Questionnaire. Regression and latent growth modelling analyses were conducted. Findings demonstrated that the level of gaming presence related to IGD severity but not to linear change in severity over a three-month period. The study shows that emerging adults who play internet games may be at a high risk of IGD given a more salient sense of being present within the gaming environment. Clinical implications considering prevention and intervention initiatives are discussed.

    Enriching Peptide Libraries for Binding Affinity and Specificity Through Computationally Directed Library Design

    Peptide reagents with high affinity or specificity for their target protein interaction partner are of utility for many important applications. Optimization of peptide binding by screening large libraries is a proven and powerful approach. Libraries designed to be enriched in peptide sequences that are predicted to have desired affinity or specificity characteristics are more likely to yield success than random mutagenesis. We present a library optimization method in which the choice of amino acids to encode at each peptide position can be guided by available experimental data or structure-based predictions. We discuss how to use analysis of predicted library performance to inform rounds of library design. Finally, we include protocols for more complex library design procedures that consider the chemical diversity of the amino acids at each peptide position and optimize a library score based on a user-specified input model. National Institute of General Medical Sciences (U.S.) (Award R01 GM110048)

    Angular and Current-Target Correlations in Deep Inelastic Scattering at HERA

    Correlations between charged particles in deep inelastic ep scattering have been studied in the Breit frame with the ZEUS detector at HERA using an integrated luminosity of 6.4 pb^-1. Short-range correlations are analysed in terms of the angular separation between current-region particles within a cone centred around the virtual photon axis. Long-range correlations between the current and target regions have also been measured. The data support predictions for the scaling behaviour of the angular correlations at high Q2 and for anti-correlations between the current and target regions over a large range in Q2 and in the Bjorken scaling variable x. Analytic QCD calculations and Monte Carlo models correctly describe the trends of the data at high Q2, but show quantitative discrepancies. The data show differences between the correlations in deep inelastic scattering and e+e- annihilation. Comment: 26 pages including 10 figures (submitted to Eur. J. Phys. C)