60 research outputs found

    Advanced Simulation and Computing FY12-13 Implementation Plan, Volume 2, Revision 0.5

    Full text link

    The design and evaluation of discrete wearable medical devices for vital signs monitoring

    Get PDF
    The observation, recording and appraisal of an individual's vital signs, namely temperature, heart rate, blood pressure, respiratory rate and blood oxygen saturation (SpO2), are key components in the assessment of their health and wellbeing. Measurements provide valuable diagnostic data, facilitating clinical diagnosis, management and monitoring. Respiratory rate sensing is perhaps the most under-utilised of all the vital signs, being routinely assessed by observation or estimated algorithmically from respiratory-induced beat-to-beat variation in heart rate. Moreover, there is an unmet need for wearable devices that can measure all or most of the vital signs. This project therefore aims to a) develop a device that can measure respiratory rate and b) develop a wearable device that can measure all or most of the vital signs. An accelerometer-based clavicular respiratory motion sensor was developed and compared with a similar thoracic motion sensor, using exhalatory flow as the reference. Pilot study results established that the clavicle sensor accurately tracked the reference in monitoring respiratory rate and outperformed the thoracic device. An Ear-worn Patient Monitoring System (EPMS) was also developed, providing a discrete telemonitoring device capable of rapidly measuring tympanic temperature, heart rate, SpO2 and activity level. The results of a comparative pilot study against reference instruments revealed that heart rate matched the reference for accuracy, while temperature under-read (by < 1°C) and SpO2 was inconsistent, with poor correlation. In conclusion, both of the prototype devices require further development. The respiratory sensor would benefit from product engineering and larger-scale testing to fully exploit the technology, but could find use in both hospital and community-based monitoring. The EPMS has potential for clinical and community use, having demonstrated its capability of rapidly capturing and wirelessly transmitting vital signs readings. Further development is nevertheless required to improve the thermometer probe and resolve outstanding issues with SpO2 readings.
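    The abstract notes that respiratory rate is often estimated algorithmically from respiratory-induced beat-to-beat variation in heart rate (respiratory sinus arrhythmia). As a minimal illustrative sketch of that general idea, not the thesis's own algorithm: resample the RR-interval series onto a uniform grid and take the dominant spectral peak in a typical respiratory band. The function name, sampling rate, and band limits below are all assumptions.

```python
import numpy as np
from scipy.signal import periodogram

def respiratory_rate_from_rr(rr_s, fs=4.0, band=(0.1, 0.5)):
    """Estimate respiratory rate (breaths/min) from RR intervals in seconds.

    Illustrative sketch of RSA-based estimation only: interpolate the
    irregularly sampled RR series onto a uniform grid, then pick the
    dominant spectral peak in a typical respiratory band (0.1-0.5 Hz).
    """
    t = np.cumsum(rr_s)                      # beat times (s)
    grid = np.arange(t[0], t[-1], 1.0 / fs)  # uniform resampling grid
    rr_uniform = np.interp(grid, t, rr_s)    # tachogram on the uniform grid
    rr_uniform -= rr_uniform.mean()          # remove DC before the spectrum
    freqs, power = periodogram(rr_uniform, fs=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    f_resp = freqs[mask][np.argmax(power[mask])]
    return 60.0 * f_resp                     # Hz -> breaths per minute

# Synthetic check: ~60 bpm heart rate modulated at 0.25 Hz (15 breaths/min)
beats = np.arange(240)
rr = 1.0 + 0.05 * np.sin(2 * np.pi * 0.25 * np.cumsum(np.ones_like(beats)))
print(round(respiratory_rate_from_rr(rr), 1))  # ~15.0
```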

    Computing Competencies for Undergraduate Data Science Curricula: ACM Data Science Task Force

    Get PDF
    At the August 2017 ACM Education Council meeting, a task force was formed to explore a process to add to the broad, interdisciplinary conversation on data science, with an articulation of the role of computing discipline-specific contributions to this emerging field. Specifically, the task force would seek to define what the computing/computational contributions are to this new field, and provide guidance on computing-specific competencies in data science for departments offering such programs of study at the undergraduate level. There are many stakeholders in the discussion of data science – these include colleges and universities that (hope to) offer data science programs, employers who hope to hire a workforce with knowledge and experience in data science, as well as individuals and professional societies representing the fields of computing, statistics, machine learning, computational biology, computational social sciences, digital humanities, and others. There is a shared desire to form a broad interdisciplinary definition of data science and to develop curriculum guidance for degree programs in data science. This volume builds upon the important work of other groups who have published guidelines for data science education. There is a need to acknowledge the definition and description of the individual contributions to this interdisciplinary field. For instance, those interested in the business context for these concepts generally use the term “analytics”; in some cases, the abbreviation DSA appears, meaning Data Science and Analytics. This volume is the third draft articulation of computing-focused competencies for data science. It recognizes the inherent interdisciplinarity of data science and situates computing-specific competencies within the broader interdisciplinary space

    An Investigation into quality assurance of the Open Source Software Development model

    Get PDF
    A thesis submitted in partial fulfilment of the requirements of the University of Wolverhampton for the degree of Doctor of Philosophy. The Open Source Software Development (OSSD) model has launched products in rapid succession and with high quality, without following the traditional quality practices of accepted software development models (Raymond 1999). Some OSSD projects challenge established quality assurance approaches, claiming to be successful through techniques partly contrary to standard software development. However, empirical studies of quality assurance practices for Open Source Software (OSS) are rare (Glass 2001). Therefore, further research is required to evaluate the quality assurance processes and methods within the OSSD model. The aim of this research is to improve the understanding of quality assurance practices under the OSSD model. The OSSD model is characterised by a collaborative, distributed development approach with public communication, free participation, free entry to the project for newcomers and unlimited access to the source code. The research examines applied quality assurance practices from a process view rather than from a product view. The research follows ideographic and nomothetic methodologies and adopts an anti-positivist epistemological approach. An empirical study of applied quality assurance practices in OSS projects is conducted through literature research. The survey research method is used to gain empirical evidence about applied practices. The findings are used to validate the theoretical knowledge and to obtain further expertise about practical approaches. The findings contribute to the development of a quality assurance framework for standard OSSD approaches. The result is an appropriate quality model with metrics that support the requirements of the OSSD model. An ideographic approach with case studies is used to extend the body of knowledge and to assess the feasibility and applicability of the quality assurance framework. In conclusion, the study provides further understanding of the applied quality assurance processes under the OSSD model and shows how a quality assurance framework can support the development processes with guidelines and measurements.
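    The abstract refers to a quality model with metrics suited to OSSD but does not name the metrics themselves. As a purely illustrative sketch of the kind of process-level measurement such a framework might include, the snippet below computes a defect-fix rate and median resolution time over hypothetical issue-tracker records (the record format and both metrics are assumptions, not the thesis's model):

```python
from datetime import datetime
from statistics import median

# Hypothetical issue-tracker records; the thesis's actual metrics are not
# specified in the abstract, so this is a generic process-metric sketch.
issues = [
    {"opened": "2023-01-02", "closed": "2023-01-09"},
    {"opened": "2023-01-05", "closed": "2023-02-01"},
    {"opened": "2023-01-20", "closed": None},  # still open
]

def days(a, b):
    """Whole days between two ISO dates."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(b, fmt) - datetime.strptime(a, fmt)).days

closed = [i for i in issues if i["closed"]]
resolution_days = [days(i["opened"], i["closed"]) for i in closed]

print("fix rate:", len(closed) / len(issues))              # share of defects closed
print("median resolution (days):", median(resolution_days))  # 17.0 here
```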

    Ranking-based approaches for localizing faults

    Get PDF

    Reining in the Functional Verification of Complex Processor Designs with Automation, Prioritization, and Approximation

    Full text link
    Our quest for faster and more efficient computing devices has led us to processor designs of enormous complexity. As a result, functional verification, which is the process of ascertaining the correctness of a processor design, takes up a lion's share of the time and cost spent on making processors. Unfortunately, functional verification is only a best-effort process that cannot completely guarantee the correctness of a design, often resulting in defective products that may have devastating consequences. Functional verification, as practiced today, is unable to cope with the complexity of current and future processor designs. In this dissertation, we identify extensive automation as the essential step towards scalable functional verification of complex processor designs. Moreover, recognizing that a complete guarantee of design correctness is impossible, we argue for systematic prioritization and prudent approximation to realize fast and far-reaching functional verification solutions. We partition the functional verification effort into three major activities: planning and test generation, test execution and bug detection, and bug diagnosis. Employing a perspective we refer to as the automation, prioritization, and approximation (APA) approach, we develop solutions that tackle challenges across these three major activities. In pursuit of efficient planning and test generation for modern systems-on-chips, we develop an automated process for identifying high-priority design aspects for verification. In addition, we enable the creation of compact test programs, which, in our experiments, were up to 11 times smaller than what would otherwise be available at the beginning of the verification effort. To tackle challenges in test execution and bug detection, we develop a group of solutions that enable the deployment of automatic and robust mechanisms for catching design flaws during high-speed functional verification. By trading accuracy for speed, these solutions allow us to unleash functional verification platforms that are over three orders of magnitude faster than traditional platforms, unearthing design flaws that are otherwise impossible to reach. Finally, we address challenges in bug diagnosis through a solution that fully automates the process of pinpointing flawed design components after detecting an error. Our solution, which identifies flawed design units with over 70% accuracy, eliminates weeks of diagnosis effort for every detected error. (PhD dissertation, Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies; https://deepblue.lib.umich.edu/bitstream/2027.42/137057/1/birukw_1.pd)
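    The dissertation reports test programs up to 11 times smaller, but the abstract does not describe how compaction is done. A standard baseline for this kind of result is greedy coverage-based test selection (repeatedly keep the test that covers the most still-uncovered points); the sketch below illustrates only that baseline, with hypothetical tests and coverage data, and should not be read as the dissertation's technique.

```python
def greedy_compact(tests):
    """Greedy set-cover style test compaction (illustrative baseline only).

    tests: dict mapping test name -> set of coverage points it hits.
    Returns a small subset of tests preserving total coverage.
    """
    remaining = set().union(*tests.values())  # everything still to be covered
    selected = []
    while remaining:
        # Pick the test covering the most still-uncovered points.
        best = max(tests, key=lambda t: len(tests[t] & remaining))
        if not tests[best] & remaining:
            break                             # leftover tests add nothing
        selected.append(best)
        remaining -= tests[best]
    return selected

suite = {
    "t1": {"a", "b", "c"},
    "t2": {"b", "c"},
    "t3": {"c", "d"},
    "t4": {"d"},
}
print(greedy_compact(suite))  # e.g. ['t1', 't3'] covers {a, b, c, d}
```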

    A Novel Malware Target Recognition Architecture for Enhanced Cyberspace Situation Awareness

    Get PDF
    The rapid transition of critical business processes to computer networks potentially exposes organizations to digital theft or corruption by advanced competitors. One tool used for these tasks is malware, because it circumvents legitimate authentication mechanisms. Malware is an epidemic problem for organizations of all types. This research proposes and evaluates a novel Malware Target Recognition (MaTR) architecture for malware detection and for identification of propagation methods and payloads, to enhance situation awareness in tactical scenarios using non-instruction-based, static heuristic features. MaTR achieves 99.92% detection accuracy on known malware with false positive and false negative rates of 8.73e-4 and 8.03e-4, respectively. MaTR outperforms leading static heuristic methods with a statistically significant 1% improvement in detection accuracy and 85% and 94% reductions in false positive and false negative rates, respectively. Against a set of publicly unknown malware, MaTR's detection accuracy is 98.56%, a 65% performance improvement over the combined effectiveness of three commercial antivirus products.
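    The quoted figures are internally consistent under a roughly balanced evaluation set: with false positive and false negative rates near 8.7e-4 and 8.0e-4, overall accuracy comes out at about 99.92%. A quick sketch of that arithmetic, with hypothetical class sizes chosen only to approximate the quoted rates:

```python
# Hypothetical confusion-matrix counts chosen to roughly reproduce the
# quoted MaTR rates; the paper's actual sample sizes are not given here.
benign, malware = 100_000, 100_000
fp = round(8.73e-4 * benign)   # benign files flagged as malware -> 87
fn = round(8.03e-4 * malware)  # malware files missed            -> 80

fpr = fp / benign
fnr = fn / malware
accuracy = (benign - fp + malware - fn) / (benign + malware)
print(f"FPR={fpr:.2e}  FNR={fnr:.2e}  accuracy={accuracy:.2%}")
# -> FPR=8.70e-04  FNR=8.00e-04  accuracy=99.92%
```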

    IMPROVING THE RIGOR OF THE LATENT PRINT EXAMINATION PROCESS

    Get PDF
    This PhD thesis is a synthesis of a portfolio of interrelated previously published work that was conducted to improve the rigor, standardization, transparency, and quantifiability of the latent print examination process. The core of the work relates to the development, adoption, and implications of the Extended Feature Set (EFS). EFS is a formal international standard (incorporated in ANSI/NIST-ITL) that defines a method of characterizing the information content of friction ridge impressions, allowing latent print examiners to unambiguously document the bases of their determinations during examination. EFS is the enabling technology that has made all of the other elements of this portfolio of work possible: evaluations of the accuracy and reliability of latent print examiners’ determinations, evaluations of the reliability of examiners’ feature markup, evaluations of examiners’ assessments of sufficiency, evaluations of latent print quality, development of quality and distortion metrics, evaluations of AFIS accuracy, and the development of training materials to assist in improving the uniformity of examiners’ annotations of the features and attributes of friction ridge impressions. The thesis summarizes these previous publications, as well as discussing their implications and possible future research and tools that could leverage this body of work.
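    EFS standardizes how the information content of friction ridge impressions is characterized. As a deliberately simplified, hypothetical illustration of the kind of feature record examiners annotate (this is not the actual ANSI/NIST-ITL field layout, which is far richer), a single minutia might carry a position, a ridge direction, and a type:

```python
from dataclasses import dataclass
from enum import Enum

class MinutiaType(Enum):
    RIDGE_ENDING = "ending"
    BIFURCATION = "bifurcation"

@dataclass
class Minutia:
    """Hypothetical, simplified feature record for illustration only;
    the real EFS / ANSI/NIST-ITL encoding differs in detail."""
    x: int            # position, image units
    y: int
    theta_deg: float  # ridge direction
    kind: MinutiaType

markup = [Minutia(120, 340, 87.5, MinutiaType.BIFURCATION),
          Minutia(98, 310, 12.0, MinutiaType.RIDGE_ENDING)]
print(len(markup), "annotated features")
```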

    Using population biobanks to understand complex traits, rare diseases, and their shared genetic architecture

    Get PDF
    The study of the role of genetic variability in common traits has led to a growing number of studies aimed at representing whole populations. These studies gather multiple layers of information on healthy and non-healthy individuals at large scales, constituting what are known as population biobanks. In this thesis I took advantage of the potential of these population biobanks to measure the influence of genetic variation on common and rare traits, exploring the mechanisms behind them through their interactions with conditions, physiological measurements, and habits in the general, healthy population. First, I used the Lifelines cohort, with genetic information from the Dutch population. Here, my colleagues and I explored traits with different levels of genetic influence: we uncovered associations of both blood type and dairy consumption with human gut microbiome function and composition, and we identified a protective factor for a rare type of cardiomyopathy with potential use for diagnosis. Additionally, within a global collaboration across worldwide biobanks totaling > 2 million individuals, we demonstrated the robustness of the connections between genetic variation and 14 different diseases across the populations. We also provided methodological guidance for combining the effects of genetic variation to calculate disease risk in studies that include biobanks with populations of different ethnic backgrounds. Overall, my PhD research contributed to identifying and validating which factors are relevant for potential clinical applications, and provided guidelines to be used in future genetic studies on common traits and diseases at a global scale.
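    The abstract mentions methodological guidance for combining the effects of genetic variation into a disease-risk estimate; the standard construct for this is a polygenic risk score, a dosage-weighted sum of per-variant effect sizes. A minimal sketch under that assumption (all variants, effect sizes, and dosages below are hypothetical):

```python
import numpy as np

# Hypothetical per-variant effect sizes (log odds ratios) from a GWAS
# and per-individual allele dosages (0, 1, or 2 copies of the risk allele).
effect_sizes = np.array([0.12, -0.05, 0.30, 0.08])  # beta per variant
dosages = np.array([
    [2, 1, 0, 1],   # individual 1
    [0, 2, 1, 0],   # individual 2
])

# A polygenic risk score is the dosage-weighted sum of effect sizes.
prs = dosages @ effect_sizes
print(prs)  # one score per individual: [0.27, 0.2]
```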

    Faculty Publications and Creative Works 2003

    Get PDF
    Faculty Publications & Creative Works is an annual compendium of scholarly and creative activities of University of New Mexico faculty during the noted calendar year. It serves to illustrate the robust and active intellectual pursuits conducted by the faculty in support of teaching and research at UNM
