
    Missed opportunities: Module design to meet the learning and access needs of practitioners - A work based learning pilot in the rehabilitation setting

    It is with great pleasure that this report is presented as the result of an exciting project that truly exemplified partnership working. For a Higher Education Institution to come together with an NHS organisation to negotiate and tailor an education initiative in direct response to the needs of both the organisation and its staff is a very positive direction of travel. The project has been possible through the enthusiasm and commitment of its partners, their contribution of resources including time and funding, and the support of others who have played a part in enabling it to happen. The willingness of the students taking part in the pilot module should be recognised, as much of what we have learnt from the process and its evaluation will more directly benefit future students than the participating students themselves. As with any pilot, there are risks; where challenges were not foreseen, they have been addressed along the way, flexibly and promptly. Although a relatively small project, it has generated considerable interest, both from others exploring work based learning approaches and from potential students across the health care professions who want to take part in future courses. On behalf of the Project Team, I hope you find the report useful and encourage you to make contact if you require further information, wish to explore work based learning opportunities (uni-disciplinary or multi-professional) here at the University, or would like to discuss research or evaluation.

    Decremental All-Pairs ALL Shortest Paths and Betweenness Centrality

    We consider the all pairs all shortest paths (APASP) problem, which maintains the shortest-path DAG rooted at every vertex in a directed graph G = (V, E) with positive edge weights. For this problem we present a decremental algorithm (that supports the deletion of a vertex, or weight increases on edges incident to a vertex). Our algorithm runs in amortized O((\nu^*)^2 \log n) time per update, where n = |V| and \nu^* bounds the number of edges that lie on shortest paths through any given vertex. Our APASP algorithm can be used for the decremental computation of betweenness centrality (BC), a graph parameter that is widely used in the analysis of large complex networks. No nontrivial decremental algorithm for either problem was known prior to our work. Our method is a generalization of the decremental algorithm of Demetrescu and Italiano [DI04] for unique shortest paths, and for graphs with \nu^* = O(n) we match the bound in [DI04]. Thus, for graphs with a constant number of shortest paths between any pair of vertices, our algorithm maintains APASP and BC scores in amortized time O(n^2 \log n) under decremental updates, regardless of the number of edges in the graph. Comment: An extended abstract of this paper will appear in Proc. ISAAC 201
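    For reference, the following sketch (Python with the networkx library; an illustrative baseline, not the authors' algorithm) recomputes betweenness centrality from scratch after every vertex deletion. This is exactly the naive behaviour that the amortized O((\nu^*)^2 \log n) update time avoids.

        # Naive decremental baseline: recompute betweenness centrality from
        # scratch after each vertex deletion. The paper's algorithm instead
        # maintains the all-pairs all-shortest-paths DAGs under updates.
        import networkx as nx

        def decremental_bc_baseline(G, deletions):
            """Yield a betweenness-centrality dict after each vertex deletion."""
            H = G.copy()
            for v in deletions:
                H.remove_node(v)  # the decremental update
                yield nx.betweenness_centrality(H, weight="weight")  # full recompute

        G = nx.DiGraph()
        G.add_weighted_edges_from([(0, 1, 1.0), (1, 2, 2.0), (0, 2, 4.0), (2, 3, 1.0)])
        for bc in decremental_bc_baseline(G, deletions=[1]):
            print(bc)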

    COCO_TS Dataset: Pixel-level Annotations Based on Weak Supervision for Scene Text Segmentation

    The absence of large-scale datasets with pixel-level supervision is a significant obstacle to the training of deep convolutional networks for scene text segmentation. For this reason, synthetic data generation is normally employed to enlarge the training dataset. Nonetheless, synthetic data cannot reproduce the complexity and variability of natural images. In this paper, a weakly supervised learning approach is used to reduce the shift between training on real and synthetic data. Pixel-level supervision is generated for a text detection dataset, i.e. one where only bounding-box annotations are available. In particular, the COCO-Text-Segmentation (COCO_TS) dataset, which provides pixel-level supervision for the COCO-Text dataset, is created and released. The generated annotations are used to train a deep convolutional neural network for semantic segmentation. Experiments show that the proposed dataset can be used instead of synthetic data, allowing us to use only a fraction of the training samples while significantly improving performance.
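    As a rough illustration of the weak supervision setting, the sketch below (an assumption for illustration only; the COCO_TS annotations are actually produced by a weakly supervised segmentation network, not by box filling) turns bounding-box annotations into a coarse pixel-level mask.

        # Illustrative only: derive a coarse pixel-level text mask from bounding
        # boxes. COCO_TS itself refines such weak cues with a trained network.
        import numpy as np

        def boxes_to_mask(height, width, boxes):
            """boxes: iterable of (x_min, y_min, x_max, y_max) pixel coordinates."""
            mask = np.zeros((height, width), dtype=np.uint8)
            for x_min, y_min, x_max, y_max in boxes:
                mask[y_min:y_max, x_min:x_max] = 1  # mark box interiors as "text"
            return mask

        coarse_mask = boxes_to_mask(100, 200, [(10, 20, 60, 40), (120, 50, 180, 80)])
        print(int(coarse_mask.sum()), "pixels marked as text")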

    Tightness for a stochastic Allen-Cahn equation

    We study an Allen-Cahn equation perturbed by a multiplicative stochastic noise which is white in time and correlated in space. Formally, this equation approximates a stochastically forced mean curvature flow. We derive uniform energy bounds and prove tightness of solutions in the sharp interface limit, and show convergence to phase-indicator functions. Comment: 27 pages, final version to appear in "Stochastic Partial Differential Equations: Analysis and Computations". In Version 4, Proposition 6.3 is new. It replaces and simplifies the old propositions 6.4-6.
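    For orientation, a schematic form of the equation being described is (the precise noise scaling and potential used in the paper are not reproduced here):

        du_\varepsilon = \big( \Delta u_\varepsilon - \varepsilon^{-2} F'(u_\varepsilon) \big)\, dt + \sigma(u_\varepsilon)\, dW,

    where F is a double-well potential (for example F(u) = \tfrac{1}{4}(1 - u^2)^2), \varepsilon > 0 is the interface-width parameter sent to zero in the sharp interface limit, and W is a Wiener process that is white in time and correlated in space.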

    From LCF to Isabelle/HOL

    Interactive theorem provers have developed dramatically over the past four decades, from primitive beginnings to today's powerful systems. Here, we focus on Isabelle/HOL and its distinctive strengths. They include automatic proof search, borrowing techniques from the world of first-order theorem proving, but also the automatic search for counterexamples. They include a highly readable structured language of proofs and a unique interactive development environment for editing live proof documents. Everything rests on the foundation conceived by Robin Milner for Edinburgh LCF: a proof kernel, using abstract types to ensure soundness and eliminate the need to store proofs. Compared with the research prototypes of the 1970s, Isabelle is a practical and versatile tool. It is used by system designers, mathematicians and many others.

    An integrated system and framework for development of medical applications and products based on medical imaging data

    Cranial defects caused by bone tumors or traffic accidents are treated by cranioplasty techniques. Cranioplasty implants are required to protect the underlying brain, correct major aesthetic deformities, or both. With the rapid development of computer graphics, medical image processing (MIP) and manufacturing technologies in recent decades, personalised cranioplasty implants can nowadays be designed and made to improve the quality of cranial defect treatments. However, software tools for MIP and 3D modelling of implants are expensive, and they normally require high technical skills. In particular, the process of designing and developing personalised cranioplasty implants normally requires a multidisciplinary team, including experts in MIP, 3D design and modelling, and Biomedical Engineering; this leads to challenges and difficulties for technology transfer and implementation in hospitals. This research aims to develop cost-effective solutions and tools for the design and modelling of personalised cranioplasty implants, to simplify that design and modelling process, and to reduce the time it requires. In this way, surgeons and engineers can conveniently and easily design personalised cranioplasty implants without the need to use complex MIP and CAD tools; as a result, the cost of implants will be minimised.

    Supermassive black holes do not correlate with dark matter halos of galaxies

    Supermassive black holes have been detected in all galaxies that contain bulge components when the galaxies observed were close enough so that the searches were feasible. Together with the observation that bigger black holes live in bigger bulges, this has led to the belief that black hole growth and bulge formation regulate each other. That is, black holes and bulges "coevolve". Therefore, reports of a similar correlation between black holes and the dark matter halos in which visible galaxies are embedded have profound implications. Dark matter is likely to be nonbaryonic, so these reports suggest that unknown, exotic physics controls black hole growth. Here we show, based in part on recent measurements of bulgeless galaxies, that there is almost no correlation between dark matter and parameters that measure black holes unless the galaxy also contains a bulge. We conclude that black holes do not correlate directly with dark matter. They do not correlate with galaxy disks, either. Therefore black holes coevolve only with bulges. This simplifies the puzzle of their coevolution by focusing attention on purely baryonic processes in the galaxy mergers that make bulges. Comment: 12 pages, 9 Postscript figures, 1 table; published in Nature (20 January 2011)

    Dynamic Adaptation on Non-Stationary Visual Domains

    Domain adaptation aims to learn models on a supervised source domain that perform well on an unsupervised target domain. Prior work has examined domain adaptation in the context of stationary domain shifts, i.e. static datasets. However, with large-scale or dynamic data sources, data from a defined domain is not usually available all at once. For instance, in a streaming data scenario, dataset statistics effectively become a function of time. We introduce a framework for adaptation over non-stationary distribution shifts applicable to large-scale and streaming data scenarios. The model is adapted sequentially over incoming unsupervised streaming data batches. This enables improvements over several batches without the need for any additional annotated data. To demonstrate the effectiveness of our proposed framework, we modify associative domain adaptation to work well on source and target data batches with unequal class distributions. We apply our method to several adaptation benchmark datasets for classification and show improved classifier accuracy, not only for the currently adapted batch, but also when applied to future stream batches. Furthermore, we show the applicability of our associative learning modifications to semantic segmentation, where we achieve competitive results.
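    A minimal sketch of this sequential adaptation loop, assuming PyTorch and using a plain prediction-entropy term as a stand-in for the (modified) associative loss actually used in the paper:

        # Sequential adaptation over streaming, unlabeled target batches.
        # The entropy term is a placeholder; the paper modifies associative
        # domain adaptation to handle unequal class distributions.
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

        source_x = torch.randn(128, 32)            # toy labeled source data
        source_y = torch.randint(0, 10, (128,))

        def target_stream(num_batches=5, batch_size=64):
            for _ in range(num_batches):           # unlabeled batches arriving over time
                yield torch.randn(batch_size, 32)

        for target_x in target_stream():
            sup_loss = F.cross_entropy(model(source_x), source_y)
            probs = F.softmax(model(target_x), dim=1)
            ent_loss = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
            loss = sup_loss + 0.1 * ent_loss       # adapt batch by batch
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()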

    The Parameterized Complexity of Centrality Improvement in Networks

    The centrality of a vertex v in a network intuitively captures how important v is for communication in the network. The task of improving the centrality of a vertex has many applications, as a higher centrality often implies a larger impact on the network or lower transportation or administration costs. In this work we study the parameterized complexity of the NP-complete problems Closeness Improvement and Betweenness Improvement, in which we ask to improve a given vertex's closeness or betweenness centrality by a given amount through adding a given number of edges to the network. Herein, the closeness of a vertex v is the sum of the multiplicative inverses of the distances of the other vertices to v, and the betweenness of v sums, over each pair of vertices, the fraction of shortest paths between them that pass through v. Unfortunately, for the natural parameter "number of edges to add" we obtain hardness results, even in rather restricted cases. On the positive side, we also give an island of tractability for the parameter measuring the vertex deletion distance to cluster graphs.
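    In symbols, with d(u, v) the shortest-path distance, \sigma_{st} the number of shortest s-t paths and \sigma_{st}(v) the number of those passing through v, the two measures are

        closeness(v) = \sum_{u \neq v} \frac{1}{d(u, v)}, \qquad betweenness(v) = \sum_{s \neq v \neq t} \frac{\sigma_{st}(v)}{\sigma_{st}}.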

    Machine-Checked Proofs For Realizability Checking Algorithms

    Virtual integration techniques focus on building architectural models of systems that can be analyzed early in the design cycle to try to lower cost, reduce risk, and improve the quality of complex embedded systems. Given appropriate architectural descriptions, assume/guarantee contracts, and compositional reasoning rules, these techniques can be used to prove important safety properties about the architecture prior to system construction. For these proofs to be meaningful, each leaf-level component contract must be realizable; i.e., it must be possible to construct a component such that, for any input allowed by the contract assumptions, there is some output value the component can produce that satisfies the contract guarantees. We have recently proposed (in [1]) a contract-based realizability checking algorithm for assume/guarantee contracts over infinite theories supported by SMT solvers, such as linear integer/real arithmetic and uninterpreted functions. In that work, we used an SMT solver and an algorithm similar to k-induction to establish the realizability of a contract, and justified our approach via a hand proof. Given the central importance of realizability to our virtual integration approach, we wanted additional confidence that our approach was sound. This paper describes a complete formalization of the approach in the Coq proof and specification language. During formalization, we found several small mistakes and missing assumptions in our reasoning. Although these did not compromise the correctness of the algorithm used in the checking tools, they point to the value of machine-checked formalization. In addition, we believe this is the first machine-checked formalization of a realizability algorithm. Comment: 14 pages, 1 figure
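    Read as a single-step simplification of the prose definition above (the definition in [1] is stated over infinite input/output traces and the component's history), a contract with assumption A and guarantee G over inputs i and outputs o is realizable when

        \forall i.\; A(i) \rightarrow \exists o.\; G(i, o).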