Clinicians' experiences of using the MCA (2005) with people with intellectual disabilities
Section A is a narrative synthesis of the empirical literature on professionals' knowledge of the
MCA and how they apply it when working with people with intellectual disabilities (ID).
Eleven papers were identified for inclusion in this review. Four themes, with subthemes, were
identified: 'processes involved', 'working with complexity', 'knowledge gaps and variability'
and 'assessor needs'. Methodological strengths and weaknesses are also considered. Findings
are discussed in relation to clinical implications and recommendations for future research are
outlined.
Section B is an empirical study using Interpretative Phenomenological Analysis to explore the
experiences of clinicians using the MCA (2005) with people with ID to assess capacity to
consent to sex. Eight clinicians, who had each completed between 2 and 40-50 (mode=2) MCA
assessments regarding consent to sex, were interviewed. Three superordinate themes, with subthemes, are
outlined and discussed in relation to the existing literature. Limitations, clinical implications
and areas of future research are considered.
The influence of blockchains and internet of things on global value chain
Copyright © 2022 The Authors. Despite the increasingly widespread deployment of the internet of things (IoT) in the global value chain (GVC), several challenges might lead to a lack of trust among value chain partners: technical challenges (i.e., confidentiality, authenticity, and privacy) and security challenges (i.e., counterfeiting, physical tampering, and data theft). In this study, we argue that blockchain technology (BT), when combined with the IoT ecosystem, will strengthen the GVC and enhance value creation and capture among value chain partners. We therefore examine the impact of BT combined with the IoT ecosystem and how it can be utilized to enhance value creation and capture among value chain partners. We collected data through an online survey completed by 265 U.K. Agri-food retailers, and analyzed the data using structural equation modeling. Our findings reveal that BT, when combined with the IoT ecosystem, enhances the GVC by improving IoT scalability, security, and traceability. Moreover, the combination of BT and IoT strengthens the GVC and creates more value for value chain partners, which serves as a competitive advantage. Finally, our research outlines the theoretical and practical contributions of combining BT and the IoT ecosystem.
Data-to-text generation with neural planning
In this thesis, we consider the task of data-to-text generation, which takes non-linguistic
structures as input and produces textual output. The inputs can take the form of
database tables, spreadsheets, charts, and so on. The main application of data-to-text
generation is to present information in a textual format which makes it accessible to
a layperson who may otherwise find it problematic to understand numerical figures.
The task can also automate routine document generation jobs, thus improving human
efficiency. We focus on generating long-form text, i.e., documents with multiple paragraphs. Recent approaches to data-to-text generation have adopted the very successful
encoder-decoder architecture or its variants. These models generate fluent (but often
imprecise) text and perform quite poorly at selecting appropriate content and ordering
it coherently. This thesis focuses on overcoming these issues by integrating content
planning with neural models. We hypothesize data-to-text generation will benefit from
explicit planning, which manifests itself in (a) micro planning, (b) latent entity planning, and (c) macro planning. Throughout this thesis, we assume the inputs to our
generator are tables (with records) in the sports domain, and the outputs are summaries
describing what happened in the game (e.g., who won/lost, ..., scored, etc.).
We first describe our work on integrating fine-grained or micro plans with data-to-text generation. As part of this, we generate a micro plan highlighting which records
should be mentioned and in which order, and then generate the document while taking
the micro plan into account.
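The two-stage micro-planning pipeline described above can be sketched as a minimal Python example. The record structure, the scoring function, and the template realizer below are invented stand-ins for the thesis's learned neural components, shown only to make the select-then-order-then-generate flow concrete:

```python
# Minimal sketch of micro planning: select the records to mention,
# order them, then realize text from the resulting plan.
# Records, scores, and templates are illustrative, not the real model.

def micro_plan(records, score, k=3):
    """Select the k highest-scoring records, ordered by score."""
    ranked = sorted(records, key=score, reverse=True)
    return ranked[:k]

def realize(plan):
    """Realize each planned record with a trivial template."""
    return " ".join(f"{r['entity']} {r['type']} {r['value']}." for r in plan)

records = [
    {"entity": "Hawks", "type": "scored", "value": 104},
    {"entity": "Bulls", "type": "scored", "value": 98},
    {"entity": "Hawks", "type": "rebounds", "value": 41},
]
plan = micro_plan(records, score=lambda r: r["value"], k=2)
print(realize(plan))  # → Hawks scored 104. Bulls scored 98.
```

In the actual neural setting, both the record scorer and the realizer would be trained jointly, with the generated document conditioned on the plan.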
We then show how data-to-text generation can benefit from higher level latent entity planning. Here, we make use of entity-specific representations which are dynamically updated. The text is generated conditioned on entity representations and the
records corresponding to the entities by using hierarchical attention at each time step.
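The hierarchical attention just described can be illustrated with a small NumPy sketch: attention is computed first over entity representations and then over each entity's records, and the final context is the entity-weighted combination of record contexts. The shapes and the single shared query vector are simplifying assumptions, not the thesis's exact architecture:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_attention(query, entity_reps, records_per_entity):
    """Attend over entities, then over each entity's records;
    return the entity-weighted sum of per-entity record contexts."""
    ent_weights = softmax(entity_reps @ query)      # (n_entities,)
    contexts = []
    for recs in records_per_entity:                 # recs: (n_records, d)
        rec_weights = softmax(recs @ query)         # (n_records,)
        contexts.append(rec_weights @ recs)         # (d,)
    return ent_weights @ np.stack(contexts)         # (d,)

rng = np.random.default_rng(0)
d = 4
query = rng.normal(size=d)
entity_reps = rng.normal(size=(2, d))               # two entities
records = [rng.normal(size=(3, d)), rng.normal(size=(2, d))]
context = hierarchical_attention(query, entity_reps, records)
print(context.shape)  # (4,)
```

At each decoding step the decoder state would play the role of `query`, and the entity representations would be updated dynamically as generation proceeds.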
We then combine planning with the high level organization of entities, events, and
their interactions. Such coarse-grained macro plans are learnt from data and given
as input to the generator. Finally, we present work on making macro plans latent
while incrementally generating a document paragraph by paragraph. We infer latent
plans sequentially with a structured variational model while interleaving the steps of
planning and generation. Text is generated by conditioning on previous variational
decisions and previously generated text.
Overall our results show that planning makes data-to-text generation more interpretable, improves the factuality and coherence of the generated documents, and reduces redundancy in the output document.
Unraveling the effect of sex on human genetic architecture
Sex is arguably the most important differentiating characteristic in most mammalian
species, separating populations into different groups, with varying behaviors, morphologies,
and physiologies based on their complement of sex chromosomes, amongst other factors. In
humans, despite males and females sharing nearly identical genomes, there are differences
between the sexes in complex traits and in the risk of a wide array of diseases. Sex provides
the genome with a distinct hormonal milieu, differential gene expression, and environmental
pressures arising from societal gender roles. This raises the possibility of observing
gene by sex (GxS) interactions between the sexes that may contribute to some of the
phenotypic differences observed. In recent years, there has been growing evidence of GxS,
with common genetic variation showing different effects in males and females. These
studies have, however, been limited with regard to the number of traits studied and/or
statistical power. Understanding sex differences in genetic architecture is of great
importance as this could lead to improved understanding of potential differences in
underlying biological pathways and disease etiology between the sexes and in turn help
inform personalised treatments and precision medicine.
In this thesis we provide insights into both the scope and mechanism of GxS across the
genome of circa 450,000 individuals of European ancestry and 530 complex traits in the UK
Biobank. We found small yet widespread differences in genetic architecture across traits
through the calculation of sex-specific heritability, genetic correlations, and sex-stratified
genome-wide association studies (GWAS). We further investigated whether sex-agnostic
(non-stratified) efforts could potentially be missing information of interest, including sex-specific trait-relevant loci and increased phenotype prediction accuracies. Finally, we
studied the potential functional role of sex differences in genetic architecture through sex
biased expression quantitative trait loci (eQTL) and gene-level analyses.
Overall, this study marks a broad examination of the genetics of sex differences. Our findings
parallel previous reports, suggesting the presence of sexual genetic heterogeneity across
complex traits of generally modest magnitude. Furthermore, our results suggest the need to
consider sex-stratified analyses in future studies in order to shed light on possible sex-specific molecular mechanisms.
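A standard way to test for the GxS heterogeneity described above is to compare the sex-stratified GWAS effect estimates for a variant with a two-sample z-test. The sketch below uses this common approach; the summary statistics are invented for illustration and do not come from the thesis:

```python
import math

def gxs_heterogeneity_z(beta_m, se_m, beta_f, se_f):
    """Z-statistic testing whether male and female effect sizes differ,
    assuming the two stratified estimates are independent."""
    return (beta_m - beta_f) / math.sqrt(se_m**2 + se_f**2)

def two_sided_p(z):
    """Two-sided p-value under a standard normal, via the error function."""
    return math.erfc(abs(z) / math.sqrt(2))

# Invented sex-stratified summary statistics for a single variant
z = gxs_heterogeneity_z(beta_m=0.12, se_m=0.02, beta_f=0.05, se_f=0.02)
p = two_sided_p(z)
print(round(z, 3), round(p, 4))
```

In practice such tests are run genome-wide, so the resulting p-values must be corrected for multiple testing, and correlated male/female samples (e.g., relatives) would violate the independence assumption made here.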
Reliable Decision-Making with Imprecise Models
The rapid growth in the deployment of autonomous systems across various sectors has generated considerable interest in how these systems can operate reliably in large, stochastic, and unstructured environments. Despite recent advances in artificial intelligence and machine learning, it is challenging to assure that autonomous systems will operate reliably in the open world. One of the causes of unreliable behavior is the impreciseness of the model used for decision-making. Due to the practical challenges in data collection and precise model specification, autonomous systems often operate based on models that do not represent all the details in the environment. Even if the system has access to a comprehensive decision-making model that accounts for all the details in the environment and all possible scenarios the agent may encounter, it may be intractable to solve this complex model optimally. Consequently, this complex, high-fidelity model may be simplified to accelerate planning, introducing imprecision. Reasoning with such imprecise models affects the reliability of autonomous systems. A system's actions may sometimes produce unexpected, undesirable consequences, which are often identified after deployment. How can we design autonomous systems that can operate reliably in the presence of uncertainty and model imprecision?
This dissertation presents solutions to address three classes of model imprecision in a Markov decision process, along with an analysis of the conditions under which bounded performance can be guaranteed. First, an adaptive outcome selection approach is introduced to devise risk-aware reduced models of the environment that efficiently balance the trade-off between model simplicity and fidelity, to accelerate planning in resource-constrained settings. Second, a framework that extends the stochastic shortest path framework to problems with imperfect information about the goal state during planning is introduced, along with two solution approaches to solve this problem. Finally, two complementary solution approaches are presented to minimize the negative side effects of agent actions. The techniques presented in this dissertation enable an autonomous system to detect and mitigate undesirable behavior, without redesigning the model entirely.
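The dissertation's setting, planning with a (possibly simplified) Markov decision process model, can be grounded with a minimal value-iteration sketch. The states, actions, transition probabilities, and rewards below are a hypothetical toy model, not anything from the dissertation; the point is only to show the planning loop that a reduced model feeds into:

```python
# Value iteration on a tiny MDP, as a stand-in for planning with a
# simplified (reduced) model. All states, actions, and rewards invented.

def value_iteration(states, actions, P, R, gamma=0.95, tol=1e-8):
    """P[s][a] is a list of (prob, next_state); R[s][a] is the reward."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

states = ["safe", "risky", "goal"]
actions = ["go", "wait"]
P = {
    "safe":  {"go": [(0.9, "goal"), (0.1, "risky")], "wait": [(1.0, "safe")]},
    "risky": {"go": [(0.5, "goal"), (0.5, "risky")], "wait": [(1.0, "safe")]},
    "goal":  {"go": [(1.0, "goal")], "wait": [(1.0, "goal")]},
}
R = {
    "safe":  {"go": 0.0, "wait": -0.1},
    "risky": {"go": -1.0, "wait": -0.2},
    "goal":  {"go": 0.0, "wait": 0.0},
}
V = value_iteration(states, actions, P, R)
print(V["safe"] > V["risky"])  # → True
```

A reduced model in the dissertation's sense would shrink `P` (e.g., drop low-probability outcomes) before this loop runs, trading fidelity for planning speed, which is exactly where the analyzed imprecision enters.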
A productive response to legacy system petrification
Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system progressively fall further and further behind their evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change-resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process - termed Productive Migration - that homes in on the specific causes of petrification within each particular legacy system and provides guidance upon how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system, and lead us to suitable antidote productive patterns via which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets.
Explainable CNN with fuzzy tree regularization for respiratory sound analysis
National Key R&D Program of China under Grant 2020YFA0908700; National Natural Science Foundation of China under Grants 62073225, 62072315, 61836005 and 62006157; Natural Science Foundation of Guangdong Province-Outstanding Youth Program under Grant 2019B151502018; Guangdong "Pearl River Talent Recruitment Program" under Grant 2019ZT08X603; Shenzhen Science and Technology Program under Grant JCYJ20210324093808021; Shenzhen Science and Technology Innovation Commission under Grant R2020A045
- …