266 research outputs found

    The effect of tourists’ technology adoption propensity on the acceptance of smart tourism apps

    Smart tourism apps (STA) are becoming popular as tourists increasingly rely on mobile devices to explore destinations during their trips. The adoption of STA is therefore crucial to the development of smart tourism. Extant literature mainly focuses on the application of different technology acceptance models. This study explores the impact of tourists’ attitudes toward technology on their intention to use STA. The technology adoption propensity (TAP) scale was used to measure the technology readiness of tourists. Data were collected with a structured questionnaire: respondents were asked to study the introduction of a STA similar to those displayed in an app store and then complete the questionnaire. A total of 355 valid questionnaires were collected, and the data were analysed using the partial least squares (PLS) method. Since TAP is a multi-dimensional scale, a second-order analysis was performed. The TAP measures indicate that tourists generally believe technology changes and improves their daily lives, making their lives easier. However, technology is a double-edged sword that brings some adverse effects while improving tourists’ living standards. The path analysis shows that all the hypotheses proposed in this study are supported: the TAP of tourists has a positive influence on usage intention, with trust and curiosity as two partial mediators. TAP has a stronger influence on tourists’ curiosity than on their trust, and curiosity has a stronger effect on tourists’ intention to use STA than trust does. Tourists with higher TAP show increased curiosity about STA, which prompts them to try, understand, and continue using them. The higher the tourists’ trust in a STA, the more willing they are to choose and use it
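The mediation structure described above (TAP → trust/curiosity → usage intention) can be illustrated with a simple regression-based sketch. This is not the authors' PLS estimation: it uses ordinary least squares on synthetic data, and every coefficient, noise scale, and variable below is a hypothetical assumption chosen only to mirror the reported ordering of effects.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 355  # matches the sample size reported in the abstract

# Hypothetical synthetic data following the described path structure:
# TAP -> trust, TAP -> curiosity, and all three -> usage intention,
# with curiosity paths assumed stronger than trust paths.
tap = rng.normal(size=n)
trust = 0.4 * tap + rng.normal(scale=0.8, size=n)
curiosity = 0.6 * tap + rng.normal(scale=0.8, size=n)
intention = (0.2 * tap + 0.3 * trust + 0.5 * curiosity
             + rng.normal(scale=0.5, size=n))

def ols(y, *xs):
    """Least-squares slopes of y on predictors xs (with intercept)."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]  # drop the intercept

a_trust, = ols(trust, tap)        # path TAP -> trust
a_cur, = ols(curiosity, tap)      # path TAP -> curiosity
direct, b_trust, b_cur = ols(intention, tap, trust, curiosity)

# Indirect (mediated) effects, as in a classic mediation decomposition:
ind_trust = a_trust * b_trust
ind_cur = a_cur * b_cur
```

With these assumed coefficients, the recovered indirect effect through curiosity exceeds the one through trust, and a positive direct effect remains, which is the pattern of partial mediation the abstract reports.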

    Cost-benefit analysis of introducing next-generation sequencing (metagenomic) pathogen testing in the setting of pyrexia of unknown origin

    Pyrexia of unknown origin (PUO) is defined as a temperature of >38.3°C lasting for >3 weeks for which no cause can be found despite appropriate investigation. Existing protocols for the work-up of PUO can be extensive and costly, motivating the application of recent advances in molecular diagnostics to pathogen testing. Many reports have described the analytical methods and performance of metagenomic pathogen testing in clinical samples, but its economics has been less well studied. This study pragmatically evaluates the feasibility of introducing metagenomic testing in this setting by assessing the relative cost of clinically relevant strategies employing this investigative tool under various cost and performance scenarios, using Singapore as a demonstration case, and by identifying the price and performance benchmarks that would need to be achieved for metagenomic testing to be considered financially viable relative to the current diagnostic standard. This study has some important limitations: we examined only the impact of introducing the metagenomic test on the overall diagnostic cost, excluded costs associated with hospitalisation, made assumptions about the performance of the routine diagnostic tests and the cost of the metagenomic test, and did not model further work-up after positive pathogen detection by the metagenomic test. These assumptions were necessary to keep the model within reasonable limits, and the simplified presentation lends itself to illustrating the key insights of the paper. In general, we find that the use of metagenomic testing as a second-line investigation is effectively dominated, and that its use at first line would typically require higher rates of detection or lower cost than currently available in order to be justifiable purely as a cost-saving measure. We conclude that current conditions do not warrant a widespread rush to deploy metagenomic testing to resolve any and all diagnostic uncertainty; rather, it is a front-line technology that should be used in specific contexts, as a supplement to rather than a replacement for careful clinical judgement
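The strategy comparison described above can be sketched as a minimal expected-cost model: a second-line test is paid for only when the first-line test finds nothing. The costs and detection yields below are illustrative placeholders, not the Singapore figures or performance assumptions from the study.

```python
def expected_cost(first_cost, first_yield, second_cost):
    """Expected diagnostic cost per patient when the second-line test
    is run only if the first-line test detects no pathogen."""
    return first_cost + (1 - first_yield) * second_cost

# Hypothetical inputs (made up for illustration):
c_routine, p_routine = 1000.0, 0.60  # routine PUO work-up: cost, yield
c_meta, p_meta = 2000.0, 0.50        # metagenomic test: cost, yield

meta_first = expected_cost(c_meta, p_meta, c_routine)     # meta, then routine
meta_second = expected_cost(c_routine, p_routine, c_meta)  # routine, then meta
```

Under these placeholder numbers, a metagenomic-first strategy costs 2500 per patient in expectation versus 1800 for metagenomic-second, and either exceeds routine work-up alone; lowering `c_meta` or raising `p_meta` shifts the break-even point, which is the kind of price/performance benchmarking the study performs.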

    A compendium of multi-omics data illuminating host responses to lethal human virus infections

    Human infections caused by viral pathogens trigger a complex gamut of host responses that limit disease, resolve infection, generate immunity, and contribute to severe disease or death. Here, we present experimental methods and multi-omics data capture approaches representing the global host response to infection generated from 45 individual experiments involving human viruses from the Orthomyxoviridae, Filoviridae, Flaviviridae, and Coronaviridae families. Analogous experimental designs were implemented across human or mouse host model systems, longitudinal samples were collected over defined time courses, and global multi-omics data (transcriptomics, proteomics, metabolomics, and lipidomics) were acquired by microarray, RNA sequencing, or mass spectrometry analyses. For comparison, we have included transcriptomics datasets from cells treated with type I and type II human interferon. Raw multi-omics data and metadata were deposited in public repositories, and we provide a central location linking the raw data with experimental metadata and ready-to-use, quality-controlled, statistically processed multi-omics datasets not previously available in any public repository. This compendium of infection-induced host response data is available for reuse and will be useful to those endeavouring to understand viral disease pathophysiology and network biology

    Bio-analytical Assay Methods used in Therapeutic Drug Monitoring of Antiretroviral Drugs: A Review


    Measurement of the tt̄ production cross-section using eμ events with b-tagged jets in pp collisions at √s = 13 TeV with the ATLAS detector

    This paper describes a measurement of the inclusive top quark pair production cross-section (σtt̄) with a data sample of 3.2 fb−1 of proton–proton collisions at a centre-of-mass energy of √s = 13 TeV, collected in 2015 by the ATLAS detector at the LHC. This measurement uses events with an opposite-charge electron–muon pair in the final state. Jets containing b-quarks are tagged using an algorithm based on track impact parameters and reconstructed secondary vertices. The numbers of events with exactly one and exactly two b-tagged jets are counted and used to determine simultaneously σtt̄ and the efficiency to reconstruct and b-tag a jet from a top quark decay, thereby minimising the associated systematic uncertainties. The cross-section is measured to be: σtt̄ = 818 ± 8 (stat) ± 27 (syst) ± 19 (lumi) ± 12 (beam) pb, where the four uncertainties arise from data statistics, experimental and theoretical systematic effects, the integrated luminosity and the LHC beam energy, giving a total relative uncertainty of 4.4%. The result is consistent with theoretical QCD calculations at next-to-next-to-leading order. A fiducial measurement corresponding to the experimental acceptance of the leptons is also presented
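The simultaneous extraction of the cross-section and the b-tagging efficiency from the one- and two-tag event counts can be sketched with background-free counting equations of the form N1 = 2Lσε_eμ ε_b(1 − C_b ε_b) and N2 = Lσε_eμ C_b ε_b². This is a simplification: the actual analysis includes background terms, and all numerical inputs below are made up purely for a round-trip consistency check.

```python
def solve_counting(n1, n2, lumi, eps_emu, c_b):
    """Solve the background-free counting equations
        N1 = 2 * L * sigma * eps_emu * eps_b * (1 - c_b * eps_b)
        N2 =     L * sigma * eps_emu * c_b * eps_b**2
    for (sigma, eps_b), given the one- and two-tag counts n1, n2,
    integrated luminosity, eمu preselection efficiency, and tagging
    correlation coefficient c_b."""
    r = n2 / n1                      # ratio isolates eps_b
    eps_b = 2 * r / (c_b * (1 + 2 * r))
    sigma = n2 / (lumi * eps_emu * c_b * eps_b ** 2)
    return sigma, eps_b

# Round-trip check with invented inputs: sigma = 818 pb, eps_b = 0.5,
# eps_emu = 0.008, L = 3200 /pb, c_b = 1.0 produce these event counts:
sigma, eps_b = solve_counting(10470.4, 5235.2, 3200.0, 0.008, 1.0)
```

The ratio N2/N1 depends only on ε_b, which is why the two counts suffice to fix both unknowns and why the method decouples the cross-section from the b-tagging systematic uncertainty.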

    Search for TeV-scale gravity signatures in high-mass final states with leptons and jets with the ATLAS detector at √s = 13 TeV

    A search for physics beyond the Standard Model, in final states with at least one high transverse momentum charged lepton (electron or muon) and two additional high transverse momentum leptons or jets, is performed using 3.2 fb−1 of proton–proton collision data recorded by the ATLAS detector at the Large Hadron Collider in 2015 at √s = 13 TeV. The upper end of the distribution of the scalar sum of the transverse momenta of leptons and jets is sensitive to the production of high-mass objects. No excess of events beyond Standard Model predictions is observed. Exclusion limits are set for models of microscopic black holes with two to six extra dimensions

    The performance of the jet trigger for the ATLAS detector during 2011 data taking

    The performance of the jet trigger for the ATLAS detector at the LHC during the 2011 data-taking period is described. During 2011 the LHC provided proton–proton collisions at a centre-of-mass energy of 7 TeV and heavy-ion collisions at a nucleon–nucleon centre-of-mass energy of 2.76 TeV. The ATLAS trigger is a three-level system designed to reduce the event rate from the nominal maximum bunch-crossing rate of 40 MHz to the approximately 400 Hz that can be written to offline storage. The ATLAS jet trigger is the primary means for the online selection of events containing jets. Events are accepted by the trigger if they contain one or more jets above a given transverse energy threshold. During 2011 data taking the jet trigger was fully efficient for jets with transverse energy above 25 GeV for triggers seeded randomly at Level 1. For triggers that require a jet to be identified at each of the three trigger levels, full efficiency is reached for offline jets with transverse energy above 60 GeV. Jets reconstructed in the final trigger level and corresponding to offline jets with transverse energy greater than 60 GeV are reconstructed with a transverse energy resolution, with respect to offline jets, of better than 4% in the central region and better than 2.5% in the forward direction
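A trigger turn-on curve of the kind summarised above, rising from zero to full efficiency as offline jet transverse energy increases past the trigger threshold, is commonly modelled with an error function. The threshold and resolution width below are illustrative assumptions, not fitted ATLAS values.

```python
import math

def turnon(et, threshold, width):
    """Error-function model of trigger efficiency versus offline jet
    transverse energy (GeV): 0 well below threshold, 1 well above,
    with the rise smeared by the energy resolution `width`."""
    return 0.5 * (1.0 + math.erf((et - threshold) / (math.sqrt(2.0) * width)))

# Hypothetical parameters for a trigger that plateaus near 60 GeV offline ET:
eff_25 = turnon(25.0, 45.0, 8.0)  # far below threshold: near zero
eff_60 = turnon(60.0, 45.0, 8.0)  # on the plateau: near one
```

The plateau position (here near 60 GeV) sits above the nominal threshold because the online energy estimate is smeared relative to the offline reconstruction, which is exactly the offset the abstract describes between the 25 GeV and 60 GeV full-efficiency points.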