292 research outputs found

    Investigation of the erosive potential of sour novelty sweets

    Provides background on the link between acidic beverages and dental erosion. Discusses the potential risk of developing dental erosion from frequent consumption of novelty sweets. Provides information which could be used by dental personnel in counselling patients who consume novelty sweets or are at risk of developing dental erosion. Abstract Background The expansion of the novelty sweets market in the UK has major potential public health implications for children and young adults, as these sweets may cause dental erosion. Objective To investigate the erosive potential of novelty sweets in terms of their physicochemical properties and the amount of enamel loss they cause. Subjects and methods The pH of a variety of novelty sweets was tested in vitro using a pH meter, and the neutralisable acidity was assessed by titrating the sweets against 0.1M NaOH. The viscosity of the novelty sweets was measured using a rotational viscometer. The wettability of enamel by each sweet was measured using a dynamic contact angle analyser. Enamel loss was assessed using contact profilometry. Results The pH ranged from 1.8–3.2, and the neutralisable acidity ranged from 9–201 ml of 0.1M NaOH. The viscosity of the novelty sweets sold in liquid form ranged from 2–594 mPa s. Surface enamel erosion ranged from 1.95–15.77 μm and from 2.5–17.6 μm with and without immersion in saliva for 1 hour before immersion in the acidic solution, respectively. Subsurface enamel loss ranged from 0.75 to 2.3 μm following ultrasonication at 0 min of acidic attack and from 0.23 to 0.85 μm at 60 min of acidic attack while immersed in saliva. The contact angle between the enamel surface and four of the sweets was smaller than the angle formed between orange juice and enamel, indicating greater wettability of the enamel.
Conclusion The pH values are lower than the critical value for enamel erosion (5.5); together with the high neutralisable acidity and high sugar content, this strongly suggests that these sweets may cause a clinically significant amount of dental erosion. In addition, the degree of wettability of enamel by a solution is an important factor in determining the enamel loss caused by acidic solutions. Immediate tooth brushing would cause further enamel loss as a result of the mechanical removal of softened enamel; however, it has been suggested that the advice to postpone brushing after an erosive attack should be reconsidered.
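The two headline measurements in this abstract, pH relative to the critical value of 5.5 and neutralisable acidity from titration against 0.1M NaOH, follow simple arithmetic that can be sketched as below. This is an illustrative sketch only; the function names are hypothetical and not taken from the study, and the titration conversion assumes a simple one-to-one neutralisation.

```python
CRITICAL_PH = 5.5  # critical pH for enamel demineralisation, as cited in the abstract

def is_potentially_erosive(ph):
    """A solution below the critical pH favours enamel dissolution."""
    return ph < CRITICAL_PH

def neutralisable_acidity_mmol(titrant_ml, naoh_molarity=0.1):
    """Acid neutralised, in mmol, from the NaOH titration volume.

    ml x mol/L = mmol of NaOH consumed, taken here as equal to the
    mmol of titratable acid (one-to-one neutralisation assumed).
    """
    return titrant_ml * naoh_molarity

# The study's extremes: a pH 1.8 sweet requiring 201 ml of 0.1M NaOH
print(is_potentially_erosive(1.8))        # True
print(neutralisable_acidity_mmol(201.0))  # ~20.1 mmol
```

The wide 9–201 ml titration range reported above translates into roughly a twentyfold spread in acid content between the least and most acidic sweets, which is why pH alone understates the erosive potential.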

    The prevalence of obesity and the knowledge, attitude and practice of healthy lifestyle among the adult population in Kampung Banyuk, Kampung Kerto and Kampung Langup

    Background Obesity has become a great public health concern and preventive measures need to be taken. Objective The objective of this research is to determine the prevalence of obesity and the knowledge, attitude and practices (KAP) towards a healthy lifestyle among the residents of Kampung Banyuk, Kampung Kerto and Kampung Langup. Methods A cross-sectional study was conducted among 126 randomly selected villagers aged 18 years and above from the three selected villages. They were interviewed using a questionnaire and their body mass index (BMI) was calculated. Results It was found that more than half of the respondents were obese. Among the respondents, for the healthy lifestyle component, the levels of good knowledge, attitude and practice were 69.2%, 46.8% and 60.3% respectively. For the obesity component, the levels of good KAP were 60.3%, 54% and 54.8% respectively. Among the obese respondents, the level of knowledge of and attitude towards healthy lifestyle and obesity was better. Obese respondents had better obesity-preventive practice, while the non-obese respondents had better practice of a healthy lifestyle. The only significant correlation noted was between knowledge and practice on obesity, albeit a negative one. Conclusion The levels of knowledge, attitude and practice on healthy lifestyle among the obese respondents are encouraging, but more effort in the preventive practice of obesity should be made to reduce the prevalence.
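The BMI referred to in this abstract is the standard ratio of weight in kilograms to the square of height in metres. A minimal sketch of the calculation and classification is given below; the paper does not state its obesity cut-off, so the ≥ 30 kg/m² threshold is assumed from the conventional WHO classification, and the function names are illustrative.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def is_obese(weight_kg, height_m, cutoff=30.0):
    """Assumed WHO convention: BMI >= 30 kg/m^2 is classed as obese."""
    return bmi(weight_kg, height_m) >= cutoff

print(round(bmi(81.0, 1.80), 1))  # 25.0
print(is_obese(95.0, 1.65))       # True
```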

    An effectiveness analysis of healthcare systems using a systems theoretic approach

    Abstract Background The use of accreditation and of quality measurement and reporting to improve healthcare quality and patient safety has been widespread across many countries. A review of the literature reveals no association between the accreditation system and the quality measurement and reporting systems, even when hospital compliance with these systems is satisfactory. Improvement of health care outcomes needs to be based on an appreciation of the whole system that contributes to those outcomes. The research literature currently lacks an appropriate analysis and is fragmented among activities. This paper aims to propose an integrated research model of these two systems and to demonstrate the usefulness of the resulting model for strategic research planning. Methods/design To achieve these aims, a systematic integration of the healthcare accreditation and quality measurement/reporting systems is structured hierarchically. A holistic systems relationship model of the administration segment is developed to act as an investigation framework. A literature-based empirical study is used to validate the proposed relationships derived from the model. Australian experiences are used as evidence for the system effectiveness analysis and as the design base for an adaptive-control study proposal to show the usefulness of the system model for guiding strategic research. Results Three basic relationships were revealed and validated from the research literature. The systemic weaknesses of the accreditation system and the quality measurement/reporting system were examined from a system-flow perspective. The approach provides a systems-thinking structure to assist the design of quality improvement strategies. The proposed model discovers a fourth, implicit relationship: a feedback between quality performance reporting components and the choice of accreditation components that is likely to play an important role in health care outcomes. An example involving accreditation surveyors is developed that provides a systematic search for improving the impact of accreditation on quality of care, and hence on the accreditation/performance correlation. Conclusion There is clear value in developing a theoretical systems approach to achieving quality in health care. The introduction of the systematic surveyor-based search for improvements creates an adaptive-control system to optimise health care quality. It is hoped that these outcomes will stimulate further research into the development of strategic planning using a systems theoretic approach for the improvement of quality in health care.

    25th RCOphth Congress, President's Session paper:25 years of progress in medical retina

    The quarter century since the foundation of the Royal College of Ophthalmologists has coincided with immense change in the subspecialty of medical retina, which has moved from being the province of a few dedicated enthusiasts to being an integral, core part of ophthalmology in every eye department. In age-related macular degeneration, there has been a move away from targeted, destructive laser therapy dependent on fluorescein angiography, to intravitreal injection of anti-growth-factor agents, largely guided by optical coherence tomography. As a result of these changes, ophthalmologists have witnessed a marked improvement in visual outcomes for their patients with wet age-related macular degeneration (AMD), while at the same time developing and enacting entirely novel ways of delivering care. In the field of diabetic retinopathy, this period also saw advances in laser technology and a move away from highly destructive laser photocoagulation towards gentler retinal laser treatments. The introduction of intravitreal therapies, both steroids and anti-growth-factor agents, has further advanced the treatment of diabetic macular oedema. This era has also seen the introduction in the United Kingdom of a coordinated national diabetic retinopathy screening programme, which offers increasing hope that the burden of blindness from diabetic eye disease can be lessened. Exciting future advances in retinal imaging, genetics, and pharmacology will allow us to further improve outcomes for our patients; for ophthalmologists specialising in medical retina, the future looks very exciting but increasingly busy.

    The P2X1 receptor and platelet function

    Extracellular nucleotides are ubiquitous signalling molecules, acting via the P2 class of surface receptors. Platelets express three P2 receptor subtypes: the ADP-dependent P2Y1 and P2Y12 G-protein-coupled receptors and the ATP-gated P2X1 non-selective cation channel. Platelet P2X1 receptors can generate significant increases in intracellular Ca2+, leading to shape change, movement of secretory granules and low levels of αIIbβ3 integrin activation. P2X1 can also synergise with several other receptors to amplify signalling and functional events in the platelet. In particular, activation of P2X1 receptors by ATP released from dense granules amplifies the aggregation responses to low levels of the major agonists, collagen and thrombin. In vivo studies using transgenic murine models show that P2X1 receptors amplify localised thrombosis following damage to small arteries and arterioles, and also contribute to thromboembolism induced by intravenous co-injection of collagen and adrenaline. In vitro, under flow conditions, P2X1 receptors contribute more to aggregate formation on collagen-coated surfaces as the shear rate increases, which may explain their greater contribution to localised thrombosis in arterioles than in venules in in vivo models. Since shear increases substantially near sites of stenosis, anti-P2X1 therapy represents a potential means of reducing thrombotic events at atherosclerotic plaques.

    Measurement and interpretation of same-sign W boson pair production in association with two jets in pp collisions at √s = 13 TeV with the ATLAS detector

    This paper presents the measurement of fiducial and differential cross sections for both the inclusive and electroweak production of a same-sign W-boson pair in association with two jets (W±W±jj) using 139 fb−1 of proton-proton collision data recorded at a centre-of-mass energy of √s = 13 TeV by the ATLAS detector at the Large Hadron Collider. The analysis is performed by selecting two same-charge leptons, electrons or muons, and at least two jets with large invariant mass and a large rapidity difference. The measured fiducial cross sections for electroweak and inclusive W±W±jj production are 2.92 ± 0.22 (stat.) ± 0.19 (syst.) fb and 3.38 ± 0.22 (stat.) ± 0.19 (syst.) fb, respectively, in agreement with Standard Model predictions. The measurements are used to constrain anomalous quartic gauge couplings by extracting 95% confidence level intervals on dimension-8 operators. A search for doubly charged Higgs bosons H±± that are produced in vector-boson fusion processes and decay into a same-sign W boson pair is performed. The largest deviation from the Standard Model occurs for an H±± mass near 450 GeV, with a global significance of 2.5 standard deviations.
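The "2.5 standard deviations" quoted above can be translated into a tail probability with the standard Gaussian relation p = 1 − Φ(z) = ½ erfc(z/√2). The sketch below is the textbook one-sided conversion, not a calculation taken from the paper, and it ignores the look-elsewhere correction already folded into the quoted global significance.

```python
import math

def one_sided_p_value(z_sigma):
    """Gaussian tail probability: p = 1 - Phi(z) = 0.5 * erfc(z / sqrt(2))."""
    return 0.5 * math.erfc(z_sigma / math.sqrt(2.0))

# 2.5 sigma (the paper's largest deviation) vs the 5 sigma discovery convention
print(one_sided_p_value(2.5))  # ~6.2e-3
print(one_sided_p_value(5.0))  # ~2.9e-7
```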

    Search for pair production of squarks or gluinos decaying via sleptons or weak bosons in final states with two same-sign or three leptons with the ATLAS detector

    A search for pair production of squarks or gluinos decaying via sleptons or weak bosons is reported. The search targets a final state with exactly two leptons with same-sign electric charge, or at least three leptons without any charge requirement. The analysed data set corresponds to an integrated luminosity of 139 fb−1 of proton-proton collisions collected at a centre-of-mass energy of 13 TeV with the ATLAS detector at the LHC. Multiple signal regions are defined, targeting several SUSY simplified models yielding the desired final states. A single control region is used to constrain the normalisation of the WZ + jets background. No significant excess of events over the Standard Model expectation is observed. The results are interpreted in the context of several supersymmetric models featuring R-parity conservation or R-parity violation, yielding exclusion limits surpassing those from previous searches. In models considering gluino (squark) pair production, gluino (squark) masses up to 2.2 (1.7) TeV are excluded at the 95% confidence level.

    Combination of searches for heavy spin-1 resonances using 139 fb−1 of proton-proton collision data at √s = 13 TeV with the ATLAS detector

    A combination of searches for new heavy spin-1 resonances decaying into different pairings of W, Z, or Higgs bosons, as well as directly into leptons or quarks, is presented. The data sample used corresponds to 139 fb−1 of proton-proton collisions at √s = 13 TeV collected during 2015–2018 with the ATLAS detector at the CERN Large Hadron Collider. Analyses selecting quark pairs (qq, bb, tt, and tb) or third-generation leptons (τν and ττ) are included in such a combination for the first time. A simplified model predicting a spin-1 heavy vector-boson triplet is used. Cross-section limits are set at the 95% confidence level and are compared with predictions for the benchmark model. These limits are also expressed in terms of constraints on couplings of the heavy vector-boson triplet to quarks, leptons, and the Higgs boson. The complementarity of the various analyses increases the sensitivity to new physics, and the resulting constraints are stronger than those from any individual analysis considered. The data exclude a heavy vector-boson triplet with mass below 5.8 TeV in a weakly coupled scenario, below 4.4 TeV in a strongly coupled scenario, and up to 1.5 TeV in the case of production via vector-boson fusion.

    Software Performance of the ATLAS Track Reconstruction for LHC Run 3

    Charged-particle reconstruction in the presence of many simultaneous proton–proton (pp) collisions in the LHC is a challenging task for the ATLAS experiment's reconstruction software due to the combinatorial complexity. This paper describes the major changes made to adapt the software to reconstruct high-activity collisions with an average of 50 or more simultaneous pp interactions per bunch crossing (pile-up) promptly using the available computing resources. The performance of the key components of the track reconstruction chain and its dependence on pile-up are evaluated, and the improvement achieved compared to the previous software version is quantified. For events with an average of 60 pp collisions per bunch crossing, the updated track reconstruction is twice as fast as the previous version, without significant reduction in reconstruction efficiency and while reducing the rate of combinatorial fake tracks by more than a factor of two.