
    Use of soft computing and numerical analysis in design, analysis and management of pavement systems

    Pavement engineering comprises several components, including pavement management, pavement analysis and design, and pavement materials. Historically, the field of pavement management has focused on monitoring post-construction condition, timing preventive maintenance and rehabilitation treatments, and performing economic analysis of alternatives. The field of pavement analysis and design, on the other hand, has dealt with optimizing the pavement structure; with an optimum structure, a pavement system is expected to survive its service life under given traffic and climate conditions. Pavement materials have likewise been improved to achieve longer-lasting, lower-maintenance pavement systems. A data-driven, comprehensive approach that considers all of these aspects together could be a future direction for advancing pavement engineering practice. To that end, this study proposes a data-driven and efficient pavement design, analysis and management concept. To serve as elements of this concept, several models were developed: pavement structural response models, pavement performance prediction models, and pavement remaining service life (RSL) models. First, to enable faster three-dimensional finite element (3D-FE) computation of design stresses, artificial neural network (ANN)-based surrogate computational models of pavement structural response were developed. These models estimate the top-down bending stress in rigid airport pavements close to that computed by 3D-FE analysis, in a fraction of the time. Second, longitudinal cracking mechanisms of widened jointed plain concrete pavements (JPCP) were demonstrated and their longitudinal cracking potential was evaluated using numerical analysis. Third, the Federal Aviation Administration’s (FAA) current rigid airfield pavement design methodology was evaluated in detail to identify research gaps and remaining needs with respect to cracking failure models, and recommendations were made on how the current methodology could be improved to accommodate both top-down and bottom-up cracking failure modes. Fourth, a detailed step-by-step methodology for developing a framework for pavement performance and RSL prediction models was explained using real pavement performance data from the Iowa Department of Transportation (DOT)’s Pavement Management Information System (PMIS) database.
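    As a rough illustration of the surrogate-modeling idea, the sketch below trains a small neural-network regressor on synthetic input-output pairs standing in for 3D-FE runs. The feature set, training data and stress function are all hypothetical, not the study's; it shows only how a fitted surrogate replaces the expensive FE solve at prediction time.

```python
# Minimal sketch of an ANN surrogate for 3D-FE pavement response.
# Feature names, data and the response function are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical training data: each row stands in for one 3D-FE run
# [slab thickness (in), modulus of subgrade reaction k (pci), gear load (kips)]
X = rng.uniform([10, 100, 30], [20, 500, 60], size=(500, 3))
# Placeholder response standing in for the FE-computed top-down bending stress (psi)
y = 2000 / X[:, 0] ** 2 * X[:, 2] - 0.5 * np.log(X[:, 1]) * 10

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
)
surrogate.fit(X, y)

# Once trained, evaluation takes microseconds instead of a full FE solve.
print(surrogate.predict([[14.0, 300.0, 45.0]]))
```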

    Integrated Framework for Data Quality and Security Evaluation on Mobile Devices

    Data quality (DQ) is an important concept used in the design and deployment of information, data management, decision making, and engineering systems, with multiple applications already available for solving specific problems. Unfortunately, conventional approaches to DQ evaluation commonly pay insufficient attention to, or even ignore, the security and privacy of the evaluated data. In this research, we develop a framework for the DQ evaluation of sensor-originated data acquired from smartphones that incorporates security and privacy aspects into the DQ evaluation pipeline. The framework supports selecting DQ metrics and implementing their calculus by integrating diverse sensor data quality and security metrics. It employs a knowledge graph to facilitate its adaptation in new application development and to enable knowledge accumulation. Privacy evaluation is demonstrated by the detection of novel and sophisticated attacks on data privacy, using colluded-applications attack recognition as an example. We develop multiple calculi for DQ and security evaluation, such as a hierarchical fuzzy-rules expert system, neural networks, and an algebraic function. Case studies that demonstrate the framework's performance in solving real-life tasks are presented, and the achieved results are analyzed. These case studies confirm the framework's capability to perform comprehensive DQ evaluations. The framework development produced multiple products and tools, such as datasets and Android OS applications. The datasets include a knowledge base of sensors embedded in modern mobile devices and their quality analysis, recordings of smartphones' technological signals during normal usage, and attacks on users' privacy. These datasets are publicly available and can be used for future research in the field of data quality and security. We also released, under an open-source license, a set of Android OS tools for data quality and security evaluation.
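    As one hedged illustration of how a fuzzy-rules calculus of this kind can score a DQ dimension, the sketch below evaluates a single Mamdani-style rule over two illustrative membership functions. The metrics, membership shapes and the rule itself are assumptions, not the framework's actual knowledge base.

```python
# Minimal sketch of a fuzzy-rules DQ calculus for one dimension.
# Membership functions and the rule are illustrative, not the framework's.
def mu_fresh(age_s: float) -> float:
    """Membership in 'fresh': 1 below 1 s of age, linearly down to 0 at 10 s."""
    return max(0.0, min(1.0, (10.0 - age_s) / 9.0))

def mu_accurate(err: float) -> float:
    """Membership in 'accurate': 1 at zero error, 0 at error >= 0.2."""
    return max(0.0, min(1.0, (0.2 - err) / 0.2))

def reading_quality(age_s: float, err: float) -> float:
    # Rule: IF reading is fresh AND accurate THEN quality is high.
    # Mamdani-style min for AND; the rule's firing strength is the score.
    return min(mu_fresh(age_s), mu_accurate(err))

print(reading_quality(age_s=3.0, err=0.05))  # 0.75 = min(0.778, 0.75)
```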

    Medical Data Architecture Platform and Recommended Requirements for a Medical Data System for Exploration Missions

    The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) element in minimizing or reducing the risk of adverse health outcomes and decrements in performance due to in-flight medical conditions on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: we do not have the capability to comprehensively process medically relevant information to support medical operations during exploration missions. This gap identifies that current in-flight medical data management combines data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, seamless management of such data will enable a more medically autonomous crew than the current paradigm of medical data management on the International Space Station. ExMC has recognized that, in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight into medical conditions. This medical data architecture will provide the functionality needed to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products derived from the third MDA prototype development will directly inform exploration medical system requirements for Level of Care IV in Gateway missions. In fiscal year 2019, the MDA project developed Test Bed 3, the third in a series of prototypes, which featured integrations with cognition tool data, ultrasound image analytics, and core Flight Software (cFS). Maintaining a layered architecture design, the framework implemented a plug-in, modular approach to integrating these external data sources. An early version of the MDA Test Bed 3 software was deployed and operated in a simulated analog environment as part of the Next Space Technologies for Exploration Partnerships (NextSTEP) Gateway tests of multiple habitat prototypes. In addition, the MDA team participated in the Gateway Test and Verification Demonstration, where the MDA cFS applications were integrated with Gateway-in-a-Box software to send and receive medically relevant data over a simulated vehicle network. This software demonstration was given to ExMC and Gateway Program stakeholders at the NASA Johnson Space Center Integrated Power, Avionics and Software (iPAS) facility. The integrated prototypes also served as a vehicle to provide Level 5 requirements for the Crew Health and Performance Habitat Data System for Gateway missions (Medical Level of Care IV). In the upcoming fiscal year, the MDA project will continue to provide systems engineering and vertical prototypes to refine requirements for medical Level of Care IV and inform requirements for Level of Care V.
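    A minimal sketch of the plug-in, modular integration pattern the abstract describes might look like the following; all class and source names here are hypothetical stand-ins, not MDA code. The point is that the core collection loop never changes as new external data sources are registered.

```python
# Hypothetical plug-in registry for integrating external medical data
# sources behind a stable core; names are illustrative only.
from abc import ABC, abstractmethod
from typing import Dict, List

class DataSourcePlugin(ABC):
    name: str

    @abstractmethod
    def poll(self) -> dict:
        """Fetch the latest records from the external source."""

class PluginRegistry:
    def __init__(self) -> None:
        self._plugins: Dict[str, DataSourcePlugin] = {}

    def register(self, plugin: DataSourcePlugin) -> None:
        self._plugins[plugin.name] = plugin

    def collect_all(self) -> List[dict]:
        # The core loop stays unchanged as new sources are plugged in.
        return [p.poll() for p in self._plugins.values()]

class UltrasoundAnalyticsPlugin(DataSourcePlugin):
    name = "ultrasound"
    def poll(self) -> dict:
        return {"source": self.name, "payload": "image-analytics results"}

registry = PluginRegistry()
registry.register(UltrasoundAnalyticsPlugin())
print(registry.collect_all())
```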

    Clinical Decision Support Systems (CDSS) for preventive management of COPD patients

    Background: The use of information and communication technologies to manage chronic diseases allows the application of integrated care pathways and the optimization and standardization of care processes. Decision support tools can assist adherence to best-practice medicine at critical decision points during the execution of a care pathway.
    Objectives: To design, develop and assess a clinical decision support system (CDSS) offering a suite of services for the early detection and assessment of chronic obstructive pulmonary disease (COPD), which can be easily integrated into a healthcare provider's workflow.
    Methods: The software architecture model for the CDSS, an interoperable clinical-knowledge representation, and an inference engine were designed and implemented to form a base CDSS framework. The CDSS functionalities were iteratively developed through requirement-adjustment/development/validation cycles using enterprise-grade software-engineering methodologies and technologies. Within each cycle, clinical-knowledge acquisition was performed by a health-informatics engineer and a clinical-expert team.
    Results: A suite of decision-support web services for (i) COPD early detection and diagnosis, (ii) spirometry quality-control support, and (iii) patient stratification was deployed online in a secured environment. The CDSS diagnostic performance was assessed on a validation set of 323 cases, achieving 90% specificity and 96% sensitivity. The web services were integrated into existing health information system platforms.
    Conclusions: Specialized decision support can be offered as a complementary service to existing policies of integrated care for chronic-disease management. The CDSS was able to issue recommendations with a high degree of accuracy to support COPD case-finding. Integration into healthcare providers' workflows can be achieved seamlessly through a modular design and a service-oriented architecture that connects to existing health information systems.
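    To make the decision-support idea concrete, here is a minimal, hypothetical rule for COPD case-finding based on the standard GOLD fixed-ratio spirometry criterion (FEV1/FVC < 0.70). The paper's actual inference engine and knowledge base are far richer and are not reproduced here; this shows only the shape of one such rule.

```python
# Minimal rule-based sketch of COPD case-finding support. The single rule
# below is the well-known GOLD fixed-ratio spirometry criterion; it is not
# the CDSS's actual knowledge base.
from dataclasses import dataclass

@dataclass
class Spirometry:
    fev1: float  # forced expiratory volume in 1 s (litres), post-bronchodilator
    fvc: float   # forced vital capacity (litres)

def copd_flag(test: Spirometry) -> bool:
    """Flag probable airflow obstruction when FEV1/FVC < 0.70."""
    return test.fev1 / test.fvc < 0.70

print(copd_flag(Spirometry(fev1=1.8, fvc=3.2)))  # True: ratio 0.56 < 0.70
```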

    Improving Emergency Department Patient Flow Through Near Real-Time Analytics

    This dissertation research investigates opportunities for developing effective decision support models that exploit near real-time (NRT) information to enhance operational intelligence within hospital Emergency Departments (EDs). Approaching the problem from a systems engineering perspective, the study proposes a novel decision support framework for streamlining ED patient flow that employs machine learning, statistical, and operations research methods to facilitate its operationalization. ED crowding has become the subject of significant public and academic attention, and it is known to cause a number of adverse outcomes for patients, ED staff and hospital revenues. Despite many efforts to investigate the causes, consequences and interventions for ED overcrowding over the past two decades, scientific knowledge remains limited with regard to strategies and pragmatic approaches that actually improve patient flow in EDs. Motivated by these gaps in research, we develop a near real-time triage decision support system to reduce ED boarding and improve ED patient flow. The proposed system is a novel variant of the newsvendor modeling framework that integrates patient admission probability prediction into a proactive ward-bed reservation system to improve the effectiveness of bed coordination efforts and reduce boarding times for ED patients, along with the resulting costs. Specifically, we propose a cost-sensitive bed reservation policy that recommends optimal bed reservation times for patients during triage. The policy relies on classifiers that estimate the probability that an ED patient will be admitted, using the patient information collected at triage or shortly after. The policy is cost-sensitive in that it accounts for the costs of misclassifying admission predictions as well as the costs of incorrectly selecting the reservation time. To achieve the objective of this work, we also addressed two secondary objectives: first, developing models to predict the admission likelihood and target admission wards of ED patients; second, developing models to estimate the length of stay (LOS) of ED patients. For the first secondary objective, we develop an algorithm that incorporates feature selection into a state-of-the-art, powerful probabilistic Bayesian classification method: the multi-class relevance vector machine (mRVM). For the second, we investigated the performance of hazard rate models (in particular, the non-parametric Cox proportional hazards model, parametric hazard rate models, and artificial neural networks for modeling the hazard rate) to estimate ED LOS, using information available at triage or shortly after as covariates. The proposed models are tested using extensive historical data from several U.S. Department of Veterans Affairs Medical Centers (VAMCs) in the Midwest. A case study using historical data from a VAMC demonstrates that applying the proposed framework leads to significant savings associated with reduced boarding times, in particular for smaller wards with high levels of utilization. For theory, our primary contribution is the development of a cost-sensitive ward-bed reservation model that effectively accounts for various costs and uncertainties. This work also contributes an integrated feature selection method for classification by developing and validating the mathematical derivation for feature selection during mRVM learning. Another contribution stems from investigating how much ED LOS estimation can be improved by incorporating information on ED orderable item lists. Overall, this work is a successful application of mixed methods from operations research, machine learning and statistics to the important domain of health care system efficiency improvement.
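    The core of such a cost-sensitive policy can be illustrated with the classic newsvendor critical-fractile threshold: reserve a bed when the predicted admission probability outweighs the ratio of the idle-reservation cost to the total of both error costs. The sketch below is a simplified reading of that idea with illustrative costs; the dissertation's policy additionally optimizes the reservation time itself, which is omitted here.

```python
# Sketch of a cost-sensitive reservation rule in the newsvendor spirit.
# Costs are illustrative; the actual policy also chooses *when* to reserve.
def should_reserve(p_admit: float, c_over: float, c_under: float) -> bool:
    """
    p_admit : classifier's admission probability for the ED patient
    c_over  : cost of reserving a bed that goes unused (false positive)
    c_under : cost of boarding an admitted patient with no bed (false negative)

    Reserve when the expected boarding cost exceeds the expected idle-bed
    cost: p_admit * c_under > (1 - p_admit) * c_over, i.e. when
    p_admit > c_over / (c_over + c_under).
    """
    return p_admit > c_over / (c_over + c_under)

# A patient with a 40% admission probability still triggers a reservation
# here because boarding is costed at 4x an idle reservation.
print(should_reserve(p_admit=0.40, c_over=1.0, c_under=4.0))  # True: 0.40 > 0.20
```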

    A holistic method for improving software product and process quality

    The concept of quality in general is elusive and multi-faceted, and it is perceived differently by different stakeholders. Quality is difficult to define and extremely difficult to measure. Deficient software systems regularly result in failures, which often lead to significant financial losses but, more importantly, to loss of human lives. Such systems need to be either scrapped and replaced by new ones, or corrected and improved through maintenance. One of the most serious challenges is how to deal with legacy systems which, even when not failing, inevitably require upgrades, maintenance and improvement because of malfunctioning or changing requirements, or because of changing technologies, languages or platforms. In such cases, the dilemma is whether to develop solutions from scratch or to re-engineer the legacy system. This research addresses this dilemma and seeks to establish a rigorous method for deriving indicators which, together with management criteria, can help decide whether restructuring of a legacy system is advisable. At the same time, as the software engineering community has been moving from corrective to preventive methods, concentrating on both product quality improvement and process quality improvement has become imperative. This research combines Product Quality Improvement, primarily through the re-engineering of legacy systems, with Process Improvement methods, models and practices, and uses a holistic approach to study the interplay of Product and Process Improvement. The re-engineering factor rho, a composite metric, was proposed and validated. The design and execution of formal experiments tested hypotheses on the relationship between internal (code-based) and external (behavioural) metrics. In addition to proving the hypotheses, the insights gained on logistics challenges resulted in the development of a framework for the design and execution of controlled experiments in Software Engineering. The next part of the research resulted in the development of the novel, generic and hence customisable Quality Model GEQUAMO, which observes the principle of orthogonality and combines a top-down analysis for the identification, classification and visualisation of software quality characteristics with a bottom-up method for measurement and evaluation. GEQUAMO II addressed weaknesses identified during various GEQUAMO implementations and expert validation by academics and practitioners. Further work on Process Improvement investigated Process Maturity and its relationship to Knowledge Sharing, and resulted in the development of the I5P Visualisation Framework for Performance Estimation through the Alignment of Process Maturity and Knowledge Sharing. I5P was used in industry and was validated by experts from academia and industry. Using the principles that guided the creation of the GEQUAMO model, the CoFeD visualisation framework was developed for the comparative quality evaluation and selection of methods, tools, models and other software artifacts. CoFeD is particularly useful because the selection of wrong methods, tools or even personnel is detrimental to the survival and success of projects and organisations, and even of individuals. Finally, throughout many years of researching and teaching Software Engineering, Information Systems and Methodologies, I observed ambiguities of terminology and the use of one term to mean different concepts, and of one concept expressed in different terms. These practices result in a lack of clarity.
    Thus my final contribution comes in my reflections on terminology disambiguation for the achievement of clarity, and in the development of a framework for achieving disambiguation of terms as a necessary step towards gaining maturity and justifying the use of the term “Engineering” 50 years after the term Software Engineering was coined. This research resulted in the creation of new knowledge in the form of novel indicators, models and frameworks which can aid quantification and decision making, primarily on the re-engineering of legacy code and on the management of process and its improvement. The thesis also contributes to the broader debate and understanding of problems relating to Software Quality, and establishes the need for a holistic approach to software quality improvement from both the product and the process perspectives.
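    The abstract does not define the re-engineering factor rho, so the sketch below is purely illustrative: a generic weighted composite of normalized code metrics of the kind such an indicator might aggregate. Every metric name, weight and threshold here is an assumption, not the thesis's definition.

```python
# Purely illustrative composite indicator in the spirit of a re-engineering
# factor; rho's actual definition is not given in the abstract, so all
# metrics, weights and the 0.5 threshold below are hypothetical.
def reengineering_indicator(metrics: dict, weights: dict) -> float:
    """Weighted mean of normalized code-quality metrics in [0, 1]; higher
    values suggest the legacy system is worth re-engineering rather than
    redeveloping from scratch."""
    return sum(metrics[m] * weights[m] for m in metrics) / sum(weights.values())

legacy_system = {
    "test_coverage":    0.40,  # fraction of code exercised by tests
    "doc_completeness": 0.55,  # share of modules with current documentation
    "modularity":       0.30,  # e.g., inverse of a normalized coupling score
}
weights = {"test_coverage": 2, "doc_completeness": 1, "modularity": 3}

score = reengineering_indicator(legacy_system, weights)
print(f"indicator: {score:.2f} -> "
      f"{'re-engineer' if score > 0.5 else 'consider redevelopment'}")
```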

    A framework for design engineering education in a global context

    This paper presents a framework for teaching design engineering in a global context, using innovative technologies to enable distributed teams to work together effectively across international and cultural boundaries. The DIDET Framework represents the findings of a 5-year project conducted by the University of Strathclyde, Stanford University and Olin College, which enhanced student learning opportunities by enabling students to partake in global, team-based design engineering projects, directly experiencing different cultural contexts and accessing a variety of digital information sources via a range of innovative technology. The use of innovative technology enabled the formalization of design knowledge within international student teams, as did the methods developed for students to store, share and reuse information. Coaching methods were used by teaching staff to support distributed teams, and evaluation of the relevant classes was carried out regularly to allow ongoing improvement of learning and teaching and to show improvements in student learning. Major findings of the 5-year project include the requirement to overcome technological, pedagogical and cultural issues for successful eLearning implementations. The DIDET Framework encapsulates all the conclusions relating to design engineering in a global context: each of the principles for effective distributed design learning is shown along with relevant findings and suggested metrics. The findings detailed in the paper were reached through a series of interventions in design engineering education at the collaborating institutions. Evaluation was carried out on an ongoing basis and fed back into project development, on both the pedagogical and the technological approaches.

    DIDET: Digital libraries for distributed, innovative design education and teamwork. Final project report

    The central goal of the DIDET Project was to enhance student learning opportunities by enabling students to partake in global, team-based design engineering projects, in which they directly experience different cultural contexts and access a variety of digital information sources via a range of appropriate technology. To achieve this overall goal, the project delivered on the following objectives:
    1. Teach engineering information retrieval, manipulation and archiving skills to students studying on engineering degree programs.
    2. Measure the use of those skills in design projects in all years of an undergraduate degree program.
    3. Measure the learning performance in engineering design courses affected by the provision of access to information that would otherwise have been difficult to access.
    4. Measure student learning performance in different cultural contexts that influence the use of alternative sources of information and varying forms of Information and Communications Technology.
    5. Develop and provide workshops for staff development.
    6. Use the measurement results to annually redesign course content and the digital libraries technology.
    The overall DIDET Project approach was to develop, implement, use and evaluate a testbed to improve the teaching and learning of students partaking in global team-based design projects. Digital libraries and virtual design studios were used to fundamentally change the way design engineering is taught at the collaborating institutions. This was done by implementing a digital library at the partner institutions to improve learning in the field of Design Engineering, and by developing a Global Team Design Project run as part of assessed classes at Strathclyde, Stanford and Olin. Evaluation was carried out on an ongoing basis and fed back into project development, both on the class teaching model and on the LauLima system developed at Strathclyde to support teaching and learning. Major findings include the requirement to overcome technological, pedagogical and cultural issues for successful eLearning implementations. A need for strong leadership has been identified, particularly to exploit the benefits of cross-discipline team working. One major project output still being developed is a DIDET Project Framework for Distributed Innovative Design, Education and Teamwork to encapsulate all project findings and outputs. The project achieved its goal of embedding major change in the teaching of Design Engineering, and Strathclyde's new Global Design class has been both successful and popular with students.