
    The feasibility of a randomised controlled trial of Dyadic Developmental Psychotherapy

    Background: Maltreated children have significant and complex problems which clinicians find difficult to diagnose and treat. Previous US pilot work suggests that Dyadic Developmental Psychotherapy (DDP) may be effective; however, rigorous evidence from a randomised controlled trial (RCT) is lacking. The purpose of this study is to establish the feasibility of an RCT of DDP by exploring the ways that DDP is operating across different UK sites and the impacts of current practice on the potential set-up of an RCT. Methods: Qualitative methods (interviews, focus groups and teleconferences) were used to explore trial feasibility with therapists and service managers from teams implementing both DDP and possible control interventions. Data were analysed thematically and related to various aspects of trial design. Results: DDP was commonly regarded as having a particular congruence with the complexity of maltreatment-associated problems, and a common operating model of DDP was evident across sites. A single control therapy was harder to establish, however; the control is likely to be a set of non-specific, context-dependent interventions offered within mainstream Child and Adolescent Mental Health Services (CAMHS). Because a ‘gold standard’ Treatment as Usual (TAU) does not currently exist, randomisation between DDP and TAU (CAMHS) looks feasible and ethical. The nature of family change during DDP was regarded as multi-faceted, non-linear and relationship-based. Assessment tools need to be carefully considered in terms of their ability to capture change in both individual child and family-based functioning. Conclusions: An RCT of DDP is feasible and timely. This study has demonstrated widespread interest, support and engagement regarding an RCT, and permissions have been gained from sites that have shown readiness to participate. As maltreated children are among the most vulnerable in society, and as there are currently no treatments with RCT evidence, such a trial would be a major advance in the field.

    Protocol for the economic evaluation of a complex intervention to improve the mental health of maltreated infants and children in foster care in the UK (The BeST? services trial)

    Introduction: Children who have experienced abuse and neglect are at increased risk of mental and physical health problems throughout life. This places an enormous burden on individuals, families and society in terms of health services, education, social care and judiciary sectors. Evidence suggests that early intervention can mitigate the negative consequences of child maltreatment, exerting long-term positive effects on the health of maltreated children entering foster care. However, evidence on cost-effectiveness of such complex interventions is limited. This protocol describes the first economic evaluation of its kind in the UK. Methods and analysis: An economic evaluation alongside the Best Services Trial (BeST?) has been prospectively designed to identify, measure and value key resource and outcome impacts arising from the New Orleans intervention model (NIM) (an infant mental health service) compared with case management (CM) (enhanced social work services as usual). A within-trial economic evaluation and long-term model from a National Health Service/Personal Social Service and a broader societal perspective will be undertaken alongside the National Institute for Health Research (NIHR)–Public Health Research Unit (PHRU)-funded randomised multicentre BeST? trial. BeST? aims to evaluate NIM compared with CM for maltreated children entering foster care in a UK context. Collection of the Paediatric Quality of Life Inventory (PedsQL) and the recent mapping of PedsQL to EuroQol-5-Dimensions (EQ-5D) will facilitate the estimation of quality-adjusted life years specific to the infant population for a cost–utility analysis. Other effectiveness outcomes will be incorporated into a cost-effectiveness analysis (CEA) and cost-consequences analysis (CCA). A long-term economic model and multiple economic evaluation frameworks will provide decision-makers with a comprehensive, multiperspective guide regarding cost-effectiveness of NIM.
The long-term population health economic model will be developed to synthesise trial data with routine linked data and key government sector parameters informed by literature. Methods guidance for population health economic evaluation will be adopted (lifetime horizon, 1.5% discount rate for costs and benefits, CCA framework, multisector perspective). Ethics and dissemination: Ethics approval was obtained from the West of Scotland Ethics Committee. Results of the main trial and economic evaluation will be submitted for publication in a peer-reviewed journal as well as published in the peer-reviewed NIHR journals library (Public Health Research Programme). Trial registration number: NCT02653716; Pre-results.

    Using Automated Reasoning Systems on Molecular Computing

    This paper focuses on the interplay between automated reasoning systems (theoretical and formal devices for studying the correctness of programs) and DNA computing (practical devices that manipulate DNA strands to solve classically hard problems with laboratory techniques). To illustrate this work, we have proven in the PVS proof checker the correctness of a program, in a sticker-based model of DNA computation, that solves the pairwise disjoint families problem. We also introduce a formalization of Floyd–Hoare logic for imperative programs.
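
The sticker model mentioned above represents DNA memory complexes as fixed-length bit strings held in "tubes", manipulated by a small set of operations (merge, separate, set, clear). As a rough illustration of the kind of program such a formalization reasons about, here is a toy Python simulation of those operations; this is an informal sketch, independent of the paper's actual PVS development:

```python
# Toy simulation of the sticker model of DNA computation: memory
# complexes are fixed-length bit strings, tubes are sets of them.
# Illustrative only; not the representation used in the PVS proofs.

def separate(tube, i):
    """Split a tube into complexes with bit i on and those with it off."""
    on = {c for c in tube if c[i] == "1"}
    return on, tube - on

def set_bit(tube, i):
    """Turn bit i on in every complex of the tube."""
    return {c[:i] + "1" + c[i + 1:] for c in tube}

def clear_bit(tube, i):
    """Turn bit i off in every complex of the tube."""
    return {c[:i] + "0" + c[i + 1:] for c in tube}

def merge(t1, t2):
    """Pour two tubes together."""
    return t1 | t2
```

Verifying a sticker program then amounts to proving, in Floyd–Hoare style, that a sequence of such tube operations takes any input satisfying the precondition to an output satisfying the postcondition.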

    Information Presentation

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. The major areas of work, or subtasks, within this DRP are: 1) Displays, 2) Controls, 3) Electronic Procedures and Fault Management, and 4) Human Performance Modeling. This DRP is a collaborative effort between researchers at Johnson Space Center and Ames Research Center.

    Fast Searching in Packed Strings

    Given strings P and Q, the (exact) string matching problem is to find all positions of substrings in Q matching P. The classical Knuth-Morris-Pratt algorithm [SIAM J. Comput., 1977] solves the string matching problem in linear time, which is optimal if we can only read one character at a time. However, most strings are stored in a computer in a packed representation with several characters in a single word, giving us the opportunity to read multiple characters simultaneously. In this paper we study the worst-case complexity of string matching on strings given in packed representation. Let m ≤ n be the lengths of P and Q, respectively, and let σ denote the size of the alphabet. On a standard unit-cost word-RAM with logarithmic word size we present an algorithm using time O(n/log_σ n + m + occ), where occ is the number of occurrences of P in Q. For m = o(n) this improves the O(n) bound of the Knuth-Morris-Pratt algorithm. Furthermore, if m = O(n/log_σ n) our algorithm is optimal, since any algorithm must spend at least Ω((n + m) log σ / log n + occ) = Ω(n/log_σ n + occ) time to read the input and report all occurrences. The result is obtained by a novel automaton construction based on the Knuth-Morris-Pratt algorithm combined with a new compact representation of subautomata allowing an optimal tabulation-based simulation. Comment: To appear in Journal of Discrete Algorithms. Special Issue on CPM 200
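
For reference, the single-character-at-a-time baseline that the packed algorithm improves on is classical Knuth-Morris-Pratt; a standard textbook implementation (not the paper's packed-word automaton construction) looks like this:

```python
def kmp_search(P, Q):
    """Classical Knuth-Morris-Pratt: all start positions of P in Q,
    in O(n + m) time, reading one character at a time."""
    m, n = len(P), len(Q)
    # Failure function: fail[i] = length of the longest proper border
    # (prefix that is also a suffix) of P[:i+1].
    fail = [0] * m
    k = 0
    for i in range(1, m):
        while k > 0 and P[i] != P[k]:
            k = fail[k - 1]
        if P[i] == P[k]:
            k += 1
        fail[i] = k
    # Scan Q; this loop is the O(n) part that the packed algorithm
    # reduces to O(n / log_sigma n) by reading several characters
    # per machine word.
    occ, k = [], 0
    for j in range(n):
        while k > 0 and Q[j] != P[k]:
            k = fail[k - 1]
        if Q[j] == P[k]:
            k += 1
        if k == m:
            occ.append(j - m + 1)  # start position of a match
            k = fail[k - 1]
    return occ
```

For example, `kmp_search("aba", "ababa")` reports the overlapping occurrences at positions 0 and 2.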

    Building the Minimal Automaton of A*X in Linear Time, When X Is of Bounded Cardinality

    We present an algorithm for constructing the minimal automaton recognizing A*X, where the pattern X is a set of m non-empty words (m a fixed integer) over a finite alphabet A whose sum of lengths is n. This algorithm, inspired by Brzozowski's minimization algorithm, uses sparse lists to achieve a linear time complexity with respect to n.
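
A non-minimal automaton recognizing A*X (the strings with a suffix in X) can be built with the classical Aho-Corasick construction: a trie of X whose transitions are completed via failure links. The paper's contribution is obtaining the minimal automaton directly in linear time; the following Python sketch shows only the unminimized construction, for illustration:

```python
from collections import deque

def aho_corasick(X, alphabet):
    """Build a complete DFA recognizing A*X (strings with a suffix in X)
    via the Aho-Corasick construction. Not minimal in general."""
    goto = [{}]          # per-state transition tables
    accepting = [False]  # state accepts iff a word of X ends there
    for word in X:       # step 1: build the trie of X
        s = 0
        for c in word:
            if c not in goto[s]:
                goto.append({})
                accepting.append(False)
                goto[s][c] = len(goto) - 1
            s = goto[s][c]
        accepting[s] = True
    # Step 2: complete the transitions with failure links, in BFS
    # order so that goto[fail[s]] is already complete when needed.
    fail = [0] * len(goto)
    queue = deque()
    for c in alphabet:
        if c in goto[0]:
            queue.append(goto[0][c])
        else:
            goto[0][c] = 0
    while queue:
        s = queue.popleft()
        for c in alphabet:
            if c in goto[s]:
                t = goto[s][c]
                fail[t] = goto[fail[s]][c]
                accepting[t] = accepting[t] or accepting[fail[t]]
                queue.append(t)
            else:
                goto[s][c] = goto[fail[s]][c]
    return goto, accepting

def accepts(goto, accepting, s):
    """Run the DFA: is s in A*X?"""
    q = 0
    for c in s:
        q = goto[q][c]
    return accepting[q]
```

This automaton can have distinct but equivalent states; Brzozowski-style minimization, which the paper performs in O(n) using sparse lists, would merge them.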

    Formal Reasoning about Efficient Data Structures: A Case Study in ACL2

    We describe in this paper the formal verification, using the ACL2 system, of a syntactic unification algorithm where terms are represented as directed acyclic graphs (dags) and these graphs are stored in a single-threaded object (stobj). The use of stobjs allows destructive operations on data (thus improving the performance of the algorithm) while maintaining the applicative semantics of ACL2. We intend to show how ACL2 provides an environment where execution of algorithms with efficient data structures, and formal reasoning about them, can be carried out. Ministerio de Ciencia y Tecnología TIC2000-1368-C03-0
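
The verified algorithm is first-order syntactic unification. A plain applicative Python sketch of the standard algorithm (terms as nested tuples and variables as strings; not the dag/stobj representation whose efficiency is the paper's point) might read:

```python
def unify(t1, t2, subst=None):
    """Most general unifier of two first-order terms, or None on failure.
    Variables are strings like '?x'; compound terms and constants are
    tuples (function_symbol, arg1, ..., argk)."""
    if subst is None:
        subst = {}

    def walk(t):
        # Chase variable bindings to a representative term.
        while isinstance(t, str) and t in subst:
            t = subst[t]
        return t

    def occurs(v, t):
        # Occur check: does variable v appear inside term t?
        t = walk(t)
        if t == v:
            return True
        return not isinstance(t, str) and any(occurs(v, a) for a in t[1:])

    t1, t2 = walk(t1), walk(t2)
    if t1 == t2:
        return subst
    if isinstance(t1, str):
        if occurs(t1, t2):
            return None          # occur-check failure, e.g. ?x vs f(?x)
        subst[t1] = t2
        return subst
    if isinstance(t2, str):
        if occurs(t2, t1):
            return None
        subst[t2] = t1
        return subst
    if t1[0] != t2[0] or len(t1) != len(t2):
        return None              # function-symbol or arity clash
    for a, b in zip(t1[1:], t2[1:]):
        if unify(a, b, subst) is None:
            return None
    return subst
```

With shared subterms, such a recursion can take exponential time; representing terms as dags in a stobj, as the paper does, is what makes the algorithm efficient while keeping the logic applicative.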

    Using CLIPS to Detect Network Intrusions

    We describe how to build a network intrusion detection sensor by slightly modifying NASA's CLIPS source code to introduce some new features. An overview of the system is presented, emphasizing the strategies used to inter-operate between the packet capture engine, written in C, and CLIPS. Some extensions were developed to handle timestamps, multiple-string pattern matching and certainty factors. Several Snort functions and plugins were adapted and used for packet decoding and preprocessing. A rule translator was also built to reuse most of Snort's attack signatures. Despite some performance drawbacks, results show that CLIPS can be used for real-time network intrusion detection under certain conditions. Several attack signatures written as CLIPS rules are shown in the appendix. By mixing CLIPS with Snort features, it was possible to bring flexibility and expressiveness to network intrusion detection.
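
To give a flavor of the signature matching involved, here is a hypothetical, heavily simplified rule format loosely modelled on Snort signatures (protocol, destination port, payload content), checked in Python rather than CLIPS; it is not the paper's actual translator output:

```python
# Hypothetical simplified signatures, loosely modelled on Snort rules.
# Illustrative only; real Snort rules carry many more options.
SIGNATURES = [
    {"name": "web-cgi phf access", "proto": "tcp", "dport": 80,
     "content": b"/cgi-bin/phf"},
    {"name": "ftp root login attempt", "proto": "tcp", "dport": 21,
     "content": b"USER root"},
]

def match_signatures(packet, signatures=SIGNATURES):
    """Return the names of all signatures a decoded packet matches.
    `packet` is a dict with 'proto', 'dport' and a bytes 'payload'."""
    return [sig["name"] for sig in signatures
            if packet["proto"] == sig["proto"]
            and packet["dport"] == sig["dport"]
            and sig["content"] in packet["payload"]]
```

In the system described above, rules of this kind are instead expressed as CLIPS productions, so the inference engine's pattern matcher performs the test against facts asserted by the packet capture engine.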

    Extracting Spatial Information from Noise Measurements of Multi-Spatial-Mode Quantum States

    We show that it is possible to use the spatial quantum correlations present in twin beams to extract information about the shape of a mask in the path of one of the beams. The scheme, based on noise measurements through homodyne detection, is useful in the regime where the number of photons is low enough that direct detection with a photodiode is difficult but high enough that photon counting is not an option. We find that under some conditions the use of quantum states of light leads to an enhancement of the sensitivity in the estimation of the shape of the mask over what can be achieved with a classical state with equivalent properties (mean photon flux and noise properties). In addition, we show that the level of enhancement that is obtained is a result of the quantum correlations and cannot be explained with only classical correlations.