Building the BRIDGE: closing the gap on digital exclusion
Access to products, services and government is increasingly reliant on people being able to use information and communications technologies, from computers to mobile phones. Whilst there are many obvious benefits for those already familiar with the technology, those who lack the skills or inclination to interact through it can be excluded, and this exclusion may eventually harden into permanent disadvantage. According to the UK government these groups within society can be very large, with 70% of over-65s reported as never having used the Internet (www.statistics.gov.uk, 2008).
As companies grow in scale and design products and services for global rather than local markets, it becomes harder to track these partially excluded groups. This is reported as a growing 'psychic distance' between the designers of technologies and their prospective users, with a risk that those excluded from the market today will become effectively invisible to the designers of future products. Such users' requirements then no longer inform the design process, creating a digital divide that is socially constructed rather than economically constrained. This is good for neither society nor business: exclusion may alienate potential customers as well as prevent businesses from identifying and engaging with latent demand for their products and services.
This project aims to build a 'Bridge' from the needs of technologically excluded users to the capabilities of suppliers of products and services. This will be achieved through exploration of users' expectations, desires and needs and by building design guidelines to help address them. The project will extrapolate the results of this work to wider markets.
Realising these goals requires a combination of qualitative research methods, to deliver a detailed picture of user needs, and quantitative methods, to map those needs onto the data that large global corporations typically hold about their current customers and markets.
User needs identified through qualitative methods need to be related to behavioural characteristics observed through data analysis and modelling of demand within global markets. This element of the project builds on direct engagement with industry, both with designers and their existing customers, as well as with the organisational processes and data that relate one to the other and inform the designer's view of their users.
Through direct engagement with users, designers and producers, BRIDGE will contribute to the design of new products, services and interfaces. As design improves and becomes more socially inclusive, better and more sustainable relationships can be established with consumers. This knowledge can be used to identify opportunities for expansion within global markets for UK industry and hence has the potential to benefit individuals, society and the economy overall.
The Beta Generalized Exponential Distribution
We introduce the beta generalized exponential distribution, which includes the beta exponential and generalized exponential distributions as special cases. We provide a comprehensive mathematical treatment of this distribution. We derive the moment generating function and the rth moment, thus generalizing some results in the literature. Expressions for the density, moment generating function and rth moment of the order statistics are also obtained. We discuss estimation of the parameters by maximum likelihood and provide the information matrix. In one application to a real data set we observe that this model is quite flexible and can be used effectively in analyzing positive data in place of the beta exponential and generalized exponential distributions.
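The distribution described above follows from the standard beta-G construction applied to the generalized exponential CDF F(x) = (1 − e^(−λx))^α. A minimal numerical sketch of the resulting density, assuming that construction; the function and parameter names (`a`, `b`, `alpha`, `lam`) are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import beta as beta_fn

def bge_pdf(x, a, b, alpha, lam):
    """Density of the beta generalized exponential distribution via the
    beta-G construction: g(x) = f(x) * F(x)**(a-1) * (1-F(x))**(b-1) / B(a, b),
    where F(x) = (1 - exp(-lam*x))**alpha is the generalized exponential CDF."""
    u = 1.0 - np.exp(-lam * x)  # exponential CDF at x, so F(x) = u**alpha
    return (alpha * lam / beta_fn(a, b)) * np.exp(-lam * x) \
        * u ** (alpha * a - 1.0) * (1.0 - u ** alpha) ** (b - 1.0)

# Sanity check: the density should integrate to 1 over (0, inf).
total, _ = quad(bge_pdf, 0.0, np.inf, args=(2.0, 1.5, 1.2, 0.8))
```

Setting a = b = 1 recovers the generalized exponential density, and alpha = 1 the beta exponential, matching the special cases named in the abstract.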
A bivariate extension of the Hosking and Wallis goodness-of-fit measure for regional distributions
This study presents a bivariate extension of the goodness-of-fit measure for regional frequency distributions developed by Hosking and Wallis [1993] for use with the method of L-moments. Utilising the approximate joint normal distribution of the regional L-skewness and L-kurtosis, a graphical representation of the confidence region on the L-moment diagram can be constructed as an ellipsoid. Candidate distributions can then be accepted where the corresponding theoretical relationship between L-skewness and L-kurtosis intersects the confidence region, with the chosen distribution being the one that minimises the Mahalanobis distance measure. Based on a set of Monte Carlo simulations, it is demonstrated that the new bivariate measure generally selects the true population distribution more frequently than the original method. Results are presented to show that the new measure remains robust when applied to regions where the level of inter-site correlation is at a level found in real-world regions. Finally, the method is applied to two case studies involving annual maximum peak flow data from Italian and British catchments to identify suitable regional frequency distributions.
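As a rough illustration of the selection rule the abstract describes, the squared Mahalanobis distance of the regional (L-skewness, L-kurtosis) point from a candidate distribution's theoretical pair can be compared against a chi-squared quantile with 2 degrees of freedom, which defines the elliptical confidence region of the approximate bivariate normal. The covariance matrix and all numeric values below are illustrative placeholders, not values from the paper:

```python
import numpy as np
from scipy.stats import chi2

def bivariate_gof(t3_reg, t4_reg, tau3, tau4, cov, alpha=0.10):
    """Squared Mahalanobis distance between the regional L-moment ratios
    (t3_reg, t4_reg) and a candidate distribution's theoretical pair
    (tau3, tau4). The candidate is accepted when the point lies inside
    the (1 - alpha) confidence ellipse, i.e. when the distance is below
    the chi-squared quantile with 2 degrees of freedom."""
    d = np.array([t3_reg - tau3, t4_reg - tau4])
    d2 = float(d @ np.linalg.solve(cov, d))  # squared Mahalanobis distance
    return d2, d2 <= chi2.ppf(1.0 - alpha, df=2)

# Illustrative covariance of the regional (t3, t4) pair; in practice it
# would be estimated, e.g. by Monte Carlo simulation for the region.
cov = np.array([[0.004, 0.002],
                [0.002, 0.003]])
d2, accepted = bivariate_gof(0.21, 0.16, 0.20, 0.15, cov)
```

When several candidates are accepted, the rule described above picks the one with the smallest distance.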
What do emergency physicians in charge do? A qualitative observational study
INTRODUCTION: The emergency physician in charge role has developed in many large EDs to assist with patient flow. We aimed to describe and classify the problem-solving actions that this role requires. METHODS: We interviewed senior emergency physicians and performed iterative, qualitative observations, using continuous reflective inquiry, in a single centre. We reviewed and classified these approaches by consensus. RESULTS: Nine different problem-solving approaches were identified: deflecting, front loading, placing, plucking, flooding, targeting, chasing, guiding and juggling. These are useful for training and for developing our understanding of how to manage an ED. CONCLUSIONS: Emergency physicians in charge have a number of problem-solving approaches that can be readily defined. We have described and categorised these. These results are potentially useful for developing decision support software.
This work was funded by Cambridge University Health Partners and Cambridge University Hospitals Foundation Trust. The research was also part funded by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care East of England (CLAHRC EoE) at Cambridge and Peterborough NHS Foundation Trust. The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.
A genome-wide study of Hardy–Weinberg equilibrium with next generation sequence data
Statistical tests for Hardy–Weinberg equilibrium have been an important tool for detecting genotyping errors in the past, and remain important in the quality control of next generation sequence data. In this paper, we analyze complete chromosomes of the 1000 genomes project by using exact test procedures for autosomal and X-chromosomal variants. We find that the rate of disequilibrium largely exceeds what might be expected by chance alone for all chromosomes. Observed disequilibrium is, in about 60% of the cases, due to heterozygote excess. We suggest that most excess disequilibrium can be explained by sequencing problems, and hypothesize mechanisms that can explain exceptional heterozygosities. We report higher rates of disequilibrium for the MHC region on chromosome 6, regions flanking centromeres and p-arms of acrocentric chromosomes. We also detected long-range haplotypes and areas with incidental high disequilibrium. We report disequilibrium to be related to read depth, with variants having extreme read depths being more likely to be out of equilibrium. Disequilibrium rates were found to be 11 times higher in segmental duplications and simple tandem repeat regions. The variants with significant disequilibrium are seen to be concentrated in these areas. For next generation sequence data, Hardy–Weinberg disequilibrium seems to be a major indicator for copy number variation.
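The exact test procedures mentioned above assess whether observed genotype counts are consistent with Hardy–Weinberg proportions by conditioning on the allele counts. A compact sketch for a single autosomal biallelic variant, in the spirit of the standard two-sided exact test (not the authors' own code):

```python
from math import exp, lgamma, log

def hwe_exact_p(n_hom_a, n_het, n_hom_b):
    """Two-sided exact Hardy-Weinberg test for one biallelic variant:
    conditional on the allele counts, sum the probabilities of every
    heterozygote count that is no more likely than the observed one."""
    n = n_hom_a + n_het + n_hom_b
    na = 2 * n_hom_a + n_het                  # copies of allele A
    nb = 2 * n - na                           # copies of allele B

    def log_prob(h):                          # log P(h heterozygotes | n, na)
        return (lgamma(n + 1) - lgamma((na - h) // 2 + 1) - lgamma(h + 1)
                - lgamma((nb - h) // 2 + 1) + h * log(2.0)
                + lgamma(na + 1) + lgamma(nb + 1) - lgamma(2 * n + 1))

    # Possible heterozygote counts share the parity of the allele count na.
    probs = [exp(log_prob(h)) for h in range(na % 2, min(na, nb) + 1, 2)]
    p_obs = exp(log_prob(n_het))
    return min(1.0, sum(p for p in probs if p <= p_obs * (1 + 1e-12)))

p_balanced = hwe_exact_p(25, 50, 25)  # counts at exact HW proportions -> p = 1
p_all_het = hwe_exact_p(0, 100, 0)    # extreme heterozygote excess -> tiny p
```

In genome-wide quality control such a p-value would be computed per variant and flagged below some significance threshold, as the abstract's disequilibrium rates are.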
Agreements About Extra-Dyadic Sex in Gay Men's Relationships: Exploring Differences in Relationship Quality by Agreement Type and Rule-Breaking Behavior
Combined linkage and association analysis of classical Hodgkin lymphoma
Exploratory Analysis of Mutations in Circulating Tumour DNA as Biomarkers of Treatment Response for Patients with Relapsed High-Grade Serous Ovarian Carcinoma: A Retrospective Study
Circulating tumour DNA (ctDNA) carrying tumour-specific sequence alterations may provide a minimally invasive means to dynamically assess tumour burden and response to treatment in cancer patients. Somatic mutations are a defining feature of high-grade serous ovarian carcinoma (HGSOC). We tested whether these mutations could be used as personalised markers to monitor tumour burden and early changes as a predictor of response and time to progression (TTP).
We performed a retrospective analysis of serial plasma samples collected during routine clinical visits from 40 patients with HGSOC undergoing heterogeneous standard of care treatment. Patient-specific assays were developed for 31 unique mutations identified in formalin-fixed paraffin-embedded tumour DNA from these patients. These assays were used to quantify ctDNA in 318 plasma samples using microfluidic digital PCR. The mutant allele fraction (TP53MAF) was compared to serum CA-125, the current gold-standard response marker for HGSOC in blood, as well as to disease volume on computed tomography scans by volumetric analysis. Changes after one cycle of treatment were compared with TTP. The median TP53MAF prior to treatment in 51 relapsed treatment courses was 8% (interquartile range [IQR] 1.2%-22%) compared to 0.7% (IQR 0.3%-2.0%) for seven untreated newly diagnosed stage IIIC/IV patients. TP53MAF correlated with volumetric measurements (Pearson r = 0.59), and for disease volumes above 32 cm³, ctDNA was detected at ≥20 amplifiable copies per millilitre of plasma. In 49 treatment courses for relapsed disease, pre-treatment TP53MAF concentration, but not CA-125, was associated with TTP. Response to chemotherapy was seen earlier with ctDNA, with a median time to nadir of 37 d (IQR 28-54) compared with a median time to nadir of 84 d (IQR 42-116) for CA-125. In 32 relapsed treatment courses evaluable for response after one cycle of chemotherapy, a decrease in TP53MAF of >60% was an independent predictor of TTP in multivariable analysis (hazard ratio 0.22, 95% CI 0.07-0.67, p = 0.008). Conversely, a decrease in TP53MAF of ≤60% was associated with poor response and identified cases with TTP < 6 mo with 71% sensitivity (95% CI 42%-92%) and 88% specificity (95% CI 64%-99%). Specificity was improved when patients with recent drainage of ascites were excluded. Ascites drainage led to a reduction of TP53MAF concentration.
The limitations of this study include retrospective design, small sample size, and heterogeneity of treatment within the cohort.
In this retrospective study, we demonstrated that ctDNA is correlated with volume of disease at the start of treatment in women with HGSOC and that a decrease of ≤60% in TP53MAF after one cycle of chemotherapy was associated with shorter TTP. These results provide evidence that ctDNA has the potential to be a highly specific early molecular response marker in HGSOC and warrants further investigation in larger cohorts receiving uniform treatment.
This work was supported by Cancer Research UK Grant numbers: A15601 (JDB), A11906 (NR), A20240 (NR), A18072 (JDB). JDB was supported by the National Institute for Health Research Cambridge Biomedical Research Centre. CAP was supported in part by the Academy of Medical Sciences, the Wellcome Trust, British Heart Foundation and Arthritis Research UK.
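The 60% decision rule reported in this abstract is simple enough to state as code. A minimal sketch of the classifier, assuming MAF values expressed as fractions; the function name, labels and error handling are illustrative, not the authors' implementation:

```python
def classify_molecular_response(maf_pre, maf_post, threshold=0.60):
    """Apply the abstract's rule: a fall in TP53 mutant allele fraction
    (TP53MAF) of more than 60% after one cycle of chemotherapy was an
    independent predictor of longer time to progression, while a fall
    of 60% or less was associated with poor response."""
    if maf_pre <= 0:
        raise ValueError("pre-treatment MAF must be positive")
    fractional_decrease = (maf_pre - maf_post) / maf_pre
    return "likely responder" if fractional_decrease > threshold else "poor response"

label = classify_molecular_response(0.08, 0.01)  # an 87.5% fall in TP53MAF
```

As the abstract notes, such a rule would need adjustment for confounders such as recent ascites drainage, which itself lowers TP53MAF.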