    Effects of zinc and fluoride on the remineralisation of artificial carious lesions under simulated plaque fluid conditions.

    The aim was to study the effects of zinc (Zn) and fluoride (F) on remineralisation at plaque fluid concentrations. Artificial carious lesions were created in 2 acid-gel demineralising systems (initially infinitely undersaturated and partially saturated with respect to enamel, respectively), giving lesions with different mineral distribution characteristics (high and low R values, respectively) but similar integrated mineral loss values. Lesions of both types were assigned to 1 of 4 groups and remineralised for 5 days at 37°C. Zn and F were added, at plaque fluid concentrations measured 1 h after application, to give 4 treatments: 231 μmol/l Zn, 10.5 μmol/l F, combined Zn/F, and an unmodified control solution (non-F/non-Zn). Subsequently, remineralisation was measured using microradiography. High-R lesions were analysed for calcium, phosphorus, F and Zn using electron probe micro-analysis. All lesions underwent statistically significant remineralisation. For low-R lesions, remineralisation was in the order F(a) < non-F/non-Zn(a) < Zn(a, b) < Zn/F(b), and for high-R lesions F(a) < non-F/non-Zn(b) < Zn(b) < Zn/F(c) (treatments sharing a letter are not significantly different at p < 0.05). Qualitatively, remineralisation occurred throughout the lesion in the non-F/non-Zn and Zn groups, predominantly at the surface zone in the F group, and within the lesion body in the Zn/F group. Electron probe micro-analysis revealed relatively large amounts of Zn in the outer lesion regions (Zn and Zn/F groups). F was abundant not only at the surface (F group) but also in the lesion body (Zn/F group). Calcium:phosphate ratios were similar to that of hydroxyapatite in all groups. To conclude, under static remineralising conditions simulating plaque fluid, the Zn/F treatment gave significantly greater remineralisation than the F treatment, possibly because Zn in the Zn/F group maintained greater surface zone porosity than F alone, facilitating greater lesion body remineralisation.

    A review of information flow diagrammatic models for product-service systems

    A product-service system (PSS) is a combination of products and services that creates value for both customers and manufacturers. Modelling a PSS based on function orientation offers a useful way to distinguish system inputs and outputs with regard to how data are consumed and information is used, i.e. information flow. This article presents a review of diagrammatic information flow tools, which are designed to describe a system through its functions. The origin, concept and applications of these tools are investigated, followed by an analysis of information flow modelling with regard to key PSS properties. A case study of selective laser melting technology implemented as a PSS is then used to show the application of information flow modelling for PSS design. A discussion of the usefulness of the tools in modelling the key elements of PSS, together with possible future research directions, is also presented.

    Ratio of the Isolated Photon Cross Sections at √s = 630 and 1800 GeV

    The inclusive cross section for production of isolated photons has been measured in p̄p collisions at √s = 630 GeV with the DØ detector at the Fermilab Tevatron Collider. The photons span a transverse energy (E_T) range from 7 to 49 GeV and have pseudorapidity |η| < 2.5. This measurement is combined with the previous DØ result at √s = 1800 GeV to form a ratio of the cross sections. Comparison of next-to-leading-order QCD with the measured cross section at 630 GeV and with the ratio of cross sections shows satisfactory agreement over most of the E_T range. Published in Phys. Rev. Lett. 87, 251805 (2001).
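    For context, such ratios are conventionally formed at the same scaled transverse energy at the two beam energies, since many theoretical and experimental uncertainties cancel in the quotient. A minimal sketch of that construction, using the standard x_T scaling variable (a conventional choice, assumed here rather than quoted from the abstract):

        % Scaled transverse energy and the cross-section ratio
        % (conventional form; the x_T notation is an assumption,
        % not quoted from the abstract above).
        x_T = \frac{2 E_T}{\sqrt{s}}, \qquad
        R(x_T) = \frac{\sigma(x_T,\ \sqrt{s} = 630~\mathrm{GeV})}{\sigma(x_T,\ \sqrt{s} = 1800~\mathrm{GeV})}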

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb−1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered, with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
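    As a purely illustrative sketch, the cut-based selection described above (one energetic jet, a lepton veto, and nine inclusive E_T^miss thresholds) can be expressed in a few lines; the event fields and the intermediate threshold values below are assumptions made for the illustration, not the ATLAS analysis code.

        # Illustrative sketch of the cut-based signal-region counting
        # described in the abstract. Event fields and the intermediate
        # E_T^miss thresholds are assumptions, not ATLAS analysis code.
        from dataclasses import dataclass

        @dataclass
        class Event:
            leading_jet_pt: float  # transverse momentum of leading jet, GeV
            met: float             # missing transverse momentum, GeV
            n_leptons: int         # reconstructed leptons in the event

        # Nine inclusive signal regions with rising E_T^miss thresholds;
        # only the 150 and 700 GeV endpoints are quoted in the abstract.
        MET_THRESHOLDS = [150, 200, 250, 300, 350, 400, 500, 600, 700]

        def signal_region_counts(events):
            """Count events passing each inclusive signal region."""
            counts = {cut: 0 for cut in MET_THRESHOLDS}
            for ev in events:
                # Baseline selection: one energetic jet and a lepton veto.
                if ev.leading_jet_pt <= 120 or ev.n_leptons != 0:
                    continue
                for cut in MET_THRESHOLDS:
                    if ev.met > cut:
                        counts[cut] += 1
            return counts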

    Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at √s = 7 TeV with the ATLAS detector

    The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of √s = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb−1. The b-jets are identified using either a lifetime-based method, in which secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method, in which the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bb̄-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bb̄-dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta. Final version published in the European Physical Journal.

    Multiple populations in globular clusters. Lessons learned from the Milky Way globular clusters

    Recent progress in studies of globular clusters has shown that they are not simple stellar populations, but are rather made up of multiple generations. Evidence stems from both photometry and spectroscopy. A new paradigm is thus arising for the formation of massive star clusters, which includes several episodes of star formation. While this provides an explanation for several features of globular clusters, including the second-parameter problem, it also opens new perspectives on the relation between globular clusters and the halo of our Galaxy, and by extension all populations with a high specific frequency of globular clusters, such as, e.g., giant elliptical galaxies. We review progress in this area, focusing on the most recent studies. Several points remain to be properly understood, in particular the nature of the polluters producing the abundance pattern in the clusters and the typical timescale of the process, the range of cluster masses over which this phenomenon operates, and the relation between globular clusters and other satellites of our Galaxy. In press, The Astronomy and Astrophysics Review.

    Observation of associated near-side and away-side long-range correlations in √s_NN = 5.02 TeV proton-lead collisions with the ATLAS detector

    Two-particle correlations in relative azimuthal angle (Δϕ) and pseudorapidity (Δη) are measured in √s_NN = 5.02 TeV p+Pb collisions using the ATLAS detector at the LHC. The measurements are performed using approximately 1 μb−1 of data as a function of transverse momentum (pT) and of the transverse energy (Σ E_T^Pb) summed over 3.1 < η < 4.9 in the direction of the Pb beam. The correlation function, constructed from charged particles, exhibits a long-range (2 < |Δη| < 5) “near-side” (Δϕ ∼ 0) correlation that grows rapidly with increasing Σ E_T^Pb. A long-range “away-side” (Δϕ ∼ π) correlation, obtained by subtracting the expected contributions from recoiling dijets and other sources estimated using events with small Σ E_T^Pb, is found to match the near-side correlation in magnitude, shape (in Δη and Δϕ) and Σ E_T^Pb dependence. The resultant Δϕ correlation is approximately symmetric about π/2, and is consistent with a dominant cos 2Δϕ modulation for all Σ E_T^Pb ranges and particle pT.
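    For orientation, long-range azimuthal correlations of this kind are conventionally characterised by a Fourier decomposition; the parameterisation below is the standard convention in such analyses rather than an equation quoted from the abstract, and a dominant n = 2 term corresponds to the cos 2Δϕ modulation reported above.

        % Standard Fourier parameterisation of the two-particle
        % azimuthal correlation (conventional form, assumed here);
        % a dominant c_2 coefficient gives the cos(2 Δϕ) modulation.
        C(\Delta\phi) \propto 1 + 2 \sum_{n} c_n \cos(n\,\Delta\phi)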

    Validity of willingness to pay measures under preference uncertainty

    This paper is part of the project ACCEPT, which is funded by the German Federal Ministry for Education and Research (grant number 01LA1112A). The publication of this article was funded by the Open Access fund of the Leibniz Association. All data are available on the project homepage (https://www.ifw-kiel.de/forschung/umwelt/projekte/accept) and from Figshare (https://dx.doi.org/10.6084/m9.figshare.3113050.v1). Recent studies in the marketing literature have developed a new method for eliciting willingness to pay (WTP) with an open-ended elicitation format: the Range-WTP method. In contrast to the traditional approach of eliciting WTP as a single value (Point-WTP), Range-WTP explicitly allows for preference uncertainty in responses. The aim of this paper is to apply Range-WTP to the domain of contingent valuation and to test its theoretical validity and robustness in comparison to Point-WTP. Using data from two novel large-scale surveys on the perception of solar radiation management (SRM), a little-known technique for counteracting climate change, we compare the performance of both methods in the field. In addition to the theoretical validity (i.e. the degree to which WTP values are consistent with theoretical expectations), we analyse the test-retest reliability and the stability of our results over time. Our evidence suggests that the Range-WTP method clearly outperforms the Point-WTP method.

    Search for direct pair production of the top squark in all-hadronic final states in proton-proton collisions at √s = 8 TeV with the ATLAS detector

    The results of a search for direct pair production of the scalar partner to the top quark, using an integrated luminosity of 20.1 fb−1 of proton-proton collision data at √s = 8 TeV recorded with the ATLAS detector at the LHC, are reported. The top squark is assumed to decay via t̃ → tχ̃⁰₁ or t̃ → bχ̃±₁ → bW(*)χ̃⁰₁, where χ̃⁰₁ (χ̃±₁) denotes the lightest neutralino (chargino) in supersymmetric models. The search targets a fully hadronic final state in events with four or more jets and large missing transverse momentum. No significant excess over the Standard Model background prediction is observed, and exclusion limits are reported in terms of the top squark and neutralino masses and as a function of the branching fraction of t̃ → tχ̃⁰₁. For a branching fraction of 100%, top squark masses in the range 270–645 GeV are excluded for χ̃⁰₁ masses below 30 GeV. For a branching fraction of 50% to either t̃ → tχ̃⁰₁ or t̃ → bχ̃±₁, and assuming the χ̃±₁ mass to be twice the χ̃⁰₁ mass, top squark masses in the range 250–550 GeV are excluded for χ̃⁰₁ masses below 60 GeV.

    The computational therapeutic: exploring Weizenbaum's ELIZA as a history of the present

    This paper explores the history of ELIZA, a computer programme approximating a Rogerian therapist, developed by Joseph Weizenbaum at MIT in the mid-1960s as an early AI experiment. ELIZA’s reception provoked Weizenbaum to re-appraise the relationship between ‘computer power and human reason’ and to attack the ‘powerful delusional thinking’ about computers and their intelligence that he understood to be widespread among the general public and also amongst experts. The root issue for Weizenbaum was whether human thought could be ‘entirely computable’ (reducible to logical formalism). This also provoked him to reconsider the nature of machine intelligence and to question the instantiation of its logics in the social world, which would come to operate, he said, as a ‘slow acting poison’. Exploring Weizenbaum’s 20th-century apostasy in the light of ELIZA illustrates ways in which contemporary anxieties and debates over machine smartness connect to earlier formations. In particular, this article argues that it is in its designation as a computational therapist that ELIZA is most significant today. ELIZA points towards a form of human–machine relationship now pervasive, a precursor of the ‘machinic therapeutic’ condition we find ourselves in, and thus speaks very directly to questions concerning modulation, autonomy, and the new behaviorism that are currently arising.
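    To make the mechanism concrete: ELIZA produced its responses by keyword spotting and pronoun-reflecting templates, with no model of meaning behind them. The fragment below is a minimal sketch of that pattern-reflection idea in Python; it is not Weizenbaum’s original implementation (written in MAD-SLIP and driven by a much richer rule script), and the rules shown are invented for illustration.

        # Minimal sketch of ELIZA-style keyword matching and pronoun
        # reflection (illustrative only; not Weizenbaum's original
        # MAD-SLIP program, which used a far richer script of rules).
        import re

        REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

        RULES = [
            (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
            (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
            (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
        ]

        def reflect(fragment: str) -> str:
            """Swap first- and second-person words so the echo reads naturally."""
            return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

        def respond(utterance: str) -> str:
            """Return a therapist-style reply to a single user utterance."""
            for pattern, template in RULES:
                match = pattern.search(utterance)
                if match:
                    return template.format(reflect(match.group(1)))
            return "Please go on."  # non-committal default, Rogerian style

        # Example: respond("I feel trapped by my job")
        # -> "Why do you feel trapped by your job?"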