A new framework for sign language alphabet hand posture recognition using geometrical features through artificial neural network (part 1)
Hand pose tracking is essential in sign languages. Automatic recognition of performed hand signs facilitates a number of applications, especially by enabling people with speech impairment to communicate with others. This framework, called ASLNN, proposes a new hand posture recognition technique for the American Sign Language alphabet based on a neural network that operates on geometrical features extracted from the hand. A user's hand is captured by a three-dimensional depth-based sensor camera, and the hand is then segmented according to depth-analysis features. The proposed system, named depth-based geometrical sign language recognition (DGSLR), adopts a simpler hand segmentation approach that can be reused in other segmentation applications. The proposed geometrical feature extraction framework improves recognition accuracy because the features are invariant to hand orientation, in contrast to the discrete cosine transform and moment invariants. The findings across iterations demonstrate that combining the extracted features improved accuracy rates. An artificial neural network is then used to derive the desired outcomes. ASLNN is proficient at hand posture recognition and achieves accuracy of up to 96.78%, which will be discussed in a follow-up paper by the same authors in this journal.
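The abstract's key claim is that geometrical features are unchanged under hand orientation, unlike DCT or moment-based descriptors. A minimal sketch of that idea, under the assumption (not stated in the abstract) that the features are normalized pairwise distances between hand landmarks; the landmark coordinates and function names here are illustrative, not the authors' actual pipeline:

```python
import numpy as np

def geometric_features(landmarks):
    """Rotation- and scale-invariant features from 2D hand landmarks:
    all pairwise distances, normalized by the largest distance."""
    pts = np.asarray(landmarks, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    feats = d[np.triu_indices(n, k=1)]   # upper triangle: each pair once
    return feats / feats.max()           # normalization gives scale invariance

def rotate(points, theta):
    """Rotate 2D points by angle theta (radians) about the origin."""
    c, s = np.cos(theta), np.sin(theta)
    return np.asarray(points, dtype=float) @ np.array([[c, -s], [s, c]]).T

hand = [(0, 0), (1, 3), (2, 4), (3, 4), (4, 3)]  # toy fingertip coordinates
f0 = geometric_features(hand)
f1 = geometric_features(rotate(hand, 0.7))
print(np.allclose(f0, f1))  # True: distances are unchanged by rotation
```

Because pairwise distances are preserved by rotation and translation, the feature vector is identical however the hand is oriented in the image plane, which is the property the abstract credits for the improved accuracy.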
Measurement of the nuclear modification factor for muons from charm and bottom hadrons in Pb+Pb collisions at 5.02 TeV with the ATLAS detector
Heavy-flavour hadron production provides information about the transport properties and microscopic structure of the quark-gluon plasma created in ultra-relativistic heavy-ion collisions. A measurement of the muons from semileptonic decays of charm and bottom hadrons produced in Pb+Pb and pp collisions at a nucleon-nucleon centre-of-mass energy of 5.02 TeV with the ATLAS detector at the Large Hadron Collider is presented. The Pb+Pb data were collected in 2015 and 2018 with sampled integrated luminosities of 208 μb⁻¹ and 38 μb⁻¹, respectively, and pp data with a sampled integrated luminosity of 1.17 pb⁻¹ were collected in 2017. Muons from heavy-flavour semileptonic decays are separated from the light-flavour hadronic background using the momentum imbalance between the inner-detector and muon-spectrometer measurements, and muons originating from charm and bottom decays are further separated via the muon track's transverse impact parameter. Differential yields in Pb+Pb collisions and differential cross sections in pp collisions for such muons are measured as a function of muon transverse momentum from 4 GeV to 30 GeV in the pseudorapidity interval |η| < 2. Nuclear modification factors for charm and bottom muons are presented as a function of muon transverse momentum in intervals of Pb+Pb collision centrality. The bottom-muon results are the most precise measurement of b-quark nuclear modification at low transverse momentum, where reconstruction of B hadrons is challenging. The measured nuclear modification factors quantify a significant suppression of the yields of muons from decays of charm and bottom hadrons, with stronger effects for muons from charm hadron decays.
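For readers outside heavy-ion physics, the central observable here, the nuclear modification factor, has a standard textbook definition (this formula is standard background, not taken from the abstract):

```latex
R_{AA}(p_T) \;=\; \frac{1}{\langle T_{AA} \rangle}\,
\frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}{\mathrm{d}\sigma_{pp}/\mathrm{d}p_T}
```

where \(\mathrm{d}N_{AA}/\mathrm{d}p_T\) is the per-event yield in Pb+Pb collisions, \(\mathrm{d}\sigma_{pp}/\mathrm{d}p_T\) is the cross section in pp collisions, and \(\langle T_{AA} \rangle\) is the mean nuclear thickness function for the centrality interval. Values of \(R_{AA} < 1\), as reported here, indicate suppression relative to scaled pp expectations.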
A search for an unexpected asymmetry in the production of e+μ− and e−μ+ pairs in proton-proton collisions recorded by the ATLAS detector at √s = 13 TeV
This search, of a type not previously performed at ATLAS, uses a comparison of the production cross sections for e⁺μ⁻ and e⁻μ⁺ pairs to constrain physics processes beyond the Standard Model. It uses 139 fb⁻¹ of proton-proton collision data recorded at √s = 13 TeV at the LHC. Targeting sources of new physics which prefer final states containing e⁺μ⁻ and e⁻μ⁺, the search contains two broad signal regions which are used to provide model-independent constraints on the ratio of cross sections at the 2% level. The search also has two special selections targeting supersymmetric models and leptoquark signatures. Observations using one of these selections are able to exclude, at 95% confidence level, singly produced smuons with masses up to 640 GeV in a model in which the only other light sparticle is a neutralino, when the R-parity-violating coupling λ′_231 is close to unity. Observations using the other selection exclude scalar leptoquarks with masses below 1880 GeV when g_1R^eu = g_1R^μc = 1, at 95% confidence level. The limit on the coupling reduces to g_1R^eu = g_1R^μc = 0.46 for a mass of 1420 GeV.
General practitioners' deprescribing decisions in older adults with polypharmacy: a case vignette study in 31 countries
BACKGROUND: General practitioners (GPs) should regularly review patients' medications and, if necessary, deprescribe, as inappropriate polypharmacy may harm patients' health. However, deprescribing can be challenging for physicians. This study investigates GPs' deprescribing decisions in 31 countries. METHODS: In this case vignette study, GPs were invited to participate in an online survey containing three clinical cases of oldest-old multimorbid patients with potentially inappropriate polypharmacy. Patients differed in terms of dependency in activities of daily living (ADL) and were presented with and without a history of cardiovascular disease (CVD). For each case, we asked GPs whether they would deprescribe in their usual practice. We calculated the proportions of GPs who reported they would deprescribe and performed a multilevel logistic regression to examine the association of history of CVD and level of dependency with GPs' deprescribing decisions. RESULTS: Of 3,175 invited GPs, 54% responded (N = 1,706). The mean age was 50 years and 60% of respondents were female. Despite differences across GP characteristics, such as age (with older GPs being more likely to take deprescribing decisions), and across countries, overall more than 80% of GPs reported they would deprescribe at least one medication in oldest-old patients (> 80 years) with polypharmacy, irrespective of history of CVD. The odds of deprescribing were higher in patients with a higher level of dependency in ADL (OR = 1.50, 95% CI 1.25 to 1.80) and absence of CVD (OR = 3.04, 95% CI 2.58 to 3.57). INTERPRETATION: The majority of GPs in this study were willing to deprescribe one or more medications in oldest-old multimorbid patients with polypharmacy. Willingness was higher in patients with increased dependency in ADL and lower in patients with CVD.
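The odds ratios reported above are multiplicative effects on odds, not on probabilities, which is easy to misread. A small sketch of the conversion; the 50% baseline probability here is a hypothetical illustration, not a figure from the study:

```python
def apply_odds_ratio(p_baseline, odds_ratio):
    """Probability after multiplying the baseline odds by an odds ratio."""
    odds = p_baseline / (1 - p_baseline)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# Hypothetical baseline: a 50% chance of deprescribing for a patient WITH CVD.
# OR = 3.04 is the study's reported odds ratio for absence of CVD.
p = apply_odds_ratio(0.50, 3.04)
print(round(p, 3))  # 0.752
```

So an OR of 3.04 moves a 50% probability to about 75%, not to "three times as likely"; the distinction matters whenever baseline probabilities are far from small.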
Protocol for validation of the Global Scales for Early Development (GSED) for children under 3 years of age in seven countries
Introduction: Children’s early development is affected by caregiving experiences, with lifelong health and well-being implications. Governments and civil societies need population-based measures to monitor children’s early development and ensure that children receive the care needed to thrive. To this end, the WHO developed the Global Scales for Early Development (GSED) to measure children’s early development up to 3 years of age. The GSED includes three measures for population and programmatic level measurement: (1) short form (SF) (caregiver report), (2) long form (LF) (direct administration) and (3) psychosocial form (PF) (caregiver report). The primary aim of this protocol is to validate the GSED SF and LF. Secondary aims are to create preliminary reference scores for the GSED SF and LF, validate an adaptive testing algorithm and assess the feasibility and preliminary validity of the GSED PF. Methods and analysis: We will conduct the validation in seven countries (Bangladesh, Brazil, Côte d’Ivoire, Pakistan, The Netherlands, People's Republic of China, United Republic of Tanzania), varying in geography, language, culture and income, through a 1-year prospective design combining cross-sectional and longitudinal methods with 1248 children per site, stratified by age and sex. The GSED generates an innovative common metric (Developmental Score: D-score) using the Rasch model and a Development for Age Z-score (DAZ). We will evaluate six psychometric properties of the GSED SF and LF: concurrent validity, predictive validity at 6 months, convergent and discriminant validity, and test–retest and inter-rater reliability. We will evaluate measurement invariance by comparing differential item functioning and differential test functioning across sites. Ethics and dissemination: This study has received ethical approval from the WHO (protocol GSED validation 004583 20.04.2020) and approval in each site. Study results will be disseminated through webinars and publications from WHO, international organisations, academic journals and conference proceedings. Registration details: Registered on the Open Science Framework (https://osf.io/) on 19 November 2021 (DOI 10.17605/OSF.IO/KX5T7; identifier: osf-registrations-kx5t7-v1).
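The D-score metric mentioned above is built on the Rasch model, whose core is a one-parameter logistic item response function. A minimal sketch of that standard model (the function name and example values are illustrative, not part of the GSED protocol):

```python
import math

def rasch_probability(ability, difficulty):
    """Rasch model: probability a child with the given ability (e.g. D-score
    on a logit scale) passes an item of the given difficulty."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# When ability equals item difficulty, the pass probability is exactly 0.5.
print(rasch_probability(0.0, 0.0))  # 0.5

# Higher ability relative to the same item means a higher pass probability.
print(rasch_probability(2.0, 0.0) > rasch_probability(0.0, 0.0))  # True
```

Because the model depends only on the difference between ability and difficulty, items administered in different sites and languages can be placed on one common scale, which is what makes a cross-country D-score metric possible.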
Opening practice: Supporting Reproducibility and Critical Spatial Data Science
This paper reflects on a number of trends towards a more open and reproducible approach to geographic and spatial data science in recent years. In particular, it considers trends towards Big Data and the impacts these are having on spatial data analysis and modelling. It identifies a turn in academia towards coding as a core analytic tool, and away from proprietary software tools offering ‘black boxes’ in which the internal workings of the analysis are not revealed. It argues that such closed-form software is problematic and considers a number of ways in which issues identified in spatial data analysis, such as the modifiable areal unit problem (MAUP), can be overlooked when working with closed tools, leading to problems of interpretation and possibly to inappropriate actions and policies based on them. In addition, the paper considers the role that reproducible and open spatial science may play in addressing the issues raised. It highlights the dangers of failing to account for the geographical properties of data, now that all data are spatial (they are collected somewhere), and the problems of a desire for n = all observations in data science, and it identifies the need for a critical approach: one in which openness, transparency, sharing and reproducibility provide a mantra for defensible and robust spatial data science.
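The MAUP concern raised above is easy to demonstrate on synthetic data: the strength of an association between two variables changes when point observations are aggregated into areal units. A minimal sketch, assuming a simple shared zone-level signal plus independent point-level noise (the zoning scheme and parameters are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_points, n_zones = 1000, 10
zones = np.repeat(np.arange(n_zones), n_points // n_zones)

# Each zone contributes a shared signal; each point adds independent noise.
zone_signal = rng.normal(size=n_zones)
x = zone_signal[zones] + rng.normal(size=n_points)
y = zone_signal[zones] + rng.normal(size=n_points)

# Point-level correlation, as an individual-level analysis would see it.
corr_points = np.corrcoef(x, y)[0, 1]

# Zone-level correlation after aggregating to areal means, as a
# census-style analysis of the same data would see it.
xm = np.array([x[zones == z].mean() for z in range(n_zones)])
ym = np.array([y[zones == z].mean() for z in range(n_zones)])
corr_zones = np.corrcoef(xm, ym)[0, 1]

print(round(corr_points, 2), round(corr_zones, 2))
```

Averaging within zones suppresses the point-level noise but keeps the shared signal, so the aggregated correlation is inflated relative to the point-level one; a closed tool that reports only the zonal statistic hides exactly this sensitivity to the chosen units.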