
    Process of designing robust, dependable, safe and secure software for medical devices: Point of care testing device as a case study

    This article has been made available through the Brunel Open Access Publishing Fund. Copyright © 2013 Sivanesan Tulasidas et al. This paper presents a holistic methodology for the design of medical device software, encompassing a new way of eliciting requirements, a system design process, a security design guideline, a cloud architecture design, a combinatorial testing process and agile project management. The paper uses point-of-care diagnostics as a case study, where the software and hardware must be robust and reliable to provide accurate diagnosis of diseases. As software and software-intensive systems become increasingly complex, failures can lead to significant property damage or damage to the environment. Within the medical diagnostic device software domain, such failures can result in misdiagnosis, leading to clinical complications and in some cases death. Software faults can arise from the interaction among the software, the hardware, third-party software and the operating environment. Unanticipated environmental changes and latent coding errors lead to operational faults despite the significant effort usually expended on the design, verification and validation of the software system. It is becoming increasingly apparent that different approaches are needed to guarantee that a complex software system meets all safety, security and reliability requirements, in addition to complying with standards such as IEC 62304. Many initiatives have been taken to develop safety- and security-critical systems, at different development phases and in different contexts, ranging from infrastructure design to device design, and different approaches are implemented to design error-free software for safety-critical systems. By adopting the strategies and processes presented in this paper, one can overcome the challenges in developing error-free software for medical devices (or safety-critical systems).
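The abstract names combinatorial testing but does not specify the algorithm. As an illustration only, not the authors' method, a minimal greedy sketch of 2-way (pairwise) test-suite generation, with hypothetical device parameters, could look like this:

```python
from itertools import combinations, product

def pairwise_suite(params):
    """Greedy construction of a test suite achieving 2-way (pairwise)
    combinatorial coverage: every value pair of every parameter pair
    appears in at least one test case. Illustrative sketch only."""
    names = list(params)
    # All (parameter, value) pair combinations that must be covered.
    uncovered = set()
    for a, b in combinations(names, 2):
        for va, vb in product(params[a], params[b]):
            uncovered.add(((a, va), (b, vb)))
    suite = []
    while uncovered:
        # Pick the full test case covering the most still-uncovered pairs.
        best, best_gain = None, -1
        for case in product(*(params[n] for n in names)):
            assignment = dict(zip(names, case))
            gain = sum(
                1 for (a, va), (b, vb) in uncovered
                if assignment[a] == va and assignment[b] == vb
            )
            if gain > best_gain:
                best, best_gain = assignment, gain
        suite.append(best)
        uncovered = {
            ((a, va), (b, vb)) for (a, va), (b, vb) in uncovered
            if not (best[a] == va and best[b] == vb)
        }
    return suite
```

For three hypothetical two-valued parameters (e.g. OS, sensor type, operating mode) the exhaustive suite has 8 cases, while a pairwise suite needs far fewer, which is the economy combinatorial testing offers for device software with many configuration parameters.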

    BIM adoption and implementation for architectural practices

    Severe issues with data acquisition and management arise during design creation and development due to complexity, uncertainty and ambiguity. BIM (Building Information Modelling) is a tool for a team-based lean design approach towards improved architectural practice across the supply chain. However, moving from a CAD (Computer Aided Design) approach to BIM represents a fundamental change for individual disciplines and the construction industry as a whole. Although BIM has been implemented by large practices, it is not widely used by SMEs (Small and Medium-Sized Enterprises). Purpose: This paper aims to present a systematic approach to BIM implementation for architectural SMEs at the organizational level. Design/Methodology/Approach: The research is undertaken through a KTP (Knowledge Transfer Partnership) project between the University of Salford and John McCall Architects (JMA), an SME based in Liverpool. The overall aim of the KTP is to develop lean design practice through BIM adoption. The BIM implementation approach takes a socio-technical view, which considers not only the implementation of technology but also the socio-cultural environment that provides the context for its implementation. Action-research-oriented qualitative and quantitative research is used for discovery, comparison and experimentation, as it provides "learning by doing". Findings: The strategic approach to BIM adoption incorporated people, process and technology equally and led to capacity building through improvements in process, technological infrastructure and upskilling of JMA staff to attain efficiency gains and competitive advantages. Originality/Value: This paper introduces a systematic approach to BIM adoption based on the action research philosophy and demonstrates a roadmap for BIM adoption at the operational level for SME companies.

    On the Experimental Evaluation of Vehicular Networks: Issues, Requirements and Methodology Applied to a Real Use Case

    One of the most challenging tasks in vehicular communications has been the experimental assessment of protocols and novel technologies. Researchers usually tend to simulate vehicular scenarios and/or partially validate new contributions in the area by using constrained testbeds and carrying out minor tests. In this line, the present work reviews the issues that pioneers in the area of vehicular communications and, in general, in telematics have to deal with if they want to perform a good evaluation campaign through real testing. The key needs for a good experimental evaluation are proper software tools for gathering test data, post-processing it and generating relevant figures of merit and, finally, properly presenting the most important results. For this reason, a key contribution of this paper is the presentation of an evaluation environment called AnaVANET, which covers these needs. Using this tool and a reference case study, a generic testing methodology is described and applied. In this way, the usage of the IPv6 protocol over a vehicle-to-vehicle routing protocol, with support for IETF-based network mobility, is tested while the main features of the AnaVANET system are presented. This work contributes to laying the foundations for a proper experimental evaluation of vehicular networks and will be useful for many researchers in the area. Comment: in EAI Endorsed Transactions on Industrial Networks and Intelligent Systems, 201
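The abstract does not describe AnaVANET's internals, but the kind of figure of merit it mentions can be sketched generically. The log format and function below are hypothetical, not AnaVANET's actual interface:

```python
def figures_of_merit(sent, received):
    """Compute two common vehicular-network figures of merit from
    per-packet timestamp logs: packet delivery ratio (PDR) and mean
    end-to-end delay. Inputs are hypothetical dicts mapping
    packet_id -> send_time and packet_id -> receive_time (seconds)."""
    delivered = [pid for pid in sent if pid in received]
    pdr = len(delivered) / len(sent) if sent else 0.0
    delays = [received[pid] - sent[pid] for pid in delivered]
    mean_delay = sum(delays) / len(delays) if delays else float("nan")
    return pdr, mean_delay
```

In a real campaign these logs would be gathered per hop and per node, so the post-processing tool must also correlate entries across vehicles before computing such metrics.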

    Use of discrete choice to obtain urban freight evaluation data

    The ex-ante evaluation of urban freight solutions is a complex task, due to the interplay of different stakeholder groups with different views and objectives. Multi-actor multi-criteria methods have been developed in response to this scenario, but determining the weights they require remains an unclear and controversial task. We propose discrete choice methods as a powerful tool for confronting these multi-faceted evaluation problems, since the resulting surveys are flexible and easy to respond to, and do not give away the final quantitative results. We have applied this methodology to the selection of urban freight solutions in the city of Seville, Spain, followed by the determination of the relative weights associated with different objectives, with both analyses carried out from the perspective of the carriers stakeholder group. Ministerio de Economía y Competitividad TEC2013-47286-C3-3-
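The abstract does not give the model specification; as a hedged illustration of the general machinery behind discrete choice analysis, a multinomial logit model turns estimated utilities into choice probabilities, and coefficient magnitudes can be normalised into relative weights. The functions and the normalisation choice below are illustrative assumptions, not the paper's method:

```python
import math

def logit_probabilities(utilities):
    """Multinomial logit: probability of choosing each alternative,
    given its deterministic utility (as estimated from survey data)."""
    m = max(utilities)                       # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

def relative_weights(betas):
    """One simple way to derive relative objective weights from the
    absolute values of estimated utility coefficients."""
    total = sum(abs(b) for b in betas)
    return [abs(b) / total for b in betas]
```

Under this model, equal utilities yield equal choice probabilities, and the weights sum to one, which is what a multi-actor multi-criteria evaluation would consume downstream.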

    Risk assessment in life-cycle costing for road asset management

    The Queensland Department of Main Roads, Australia, spends approximately A$1 billion annually on road infrastructure asset management. To manage road infrastructure effectively, road agencies first need to optimise the expenditure for data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs. Secondly, road agencies need to accurately predict the deterioration rates of infrastructure to reflect local conditions, so that budgets can be estimated accurately. Finally, the prediction of budgets for maintenance and rehabilitation must provide a certain degree of reliability. This paper presents the results of case studies using a probability-based method for an integrated approach: assessing the optimal cost of pavement strength data collection, calibrating deterioration prediction models to suit local conditions, and assessing risk-adjusted budget estimates for road maintenance and rehabilitation over the asset life-cycle. The probability concept opens the path to predicting life-cycle maintenance and rehabilitation budget estimates with a known probability of success (e.g. producing a budget estimate for a project life-cycle cost with a 5% probability of being exceeded). The paper also presents a conceptual decision-making framework in the form of risk mapping, in which the life-cycle budget/cost investment can be considered in conjunction with social, environmental and political issues.
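A budget "with a 5% probability of being exceeded" is simply an upper percentile of a simulated cost distribution. The Monte Carlo sketch below is a generic illustration under an assumed normal cost model, not the department's actual method; the component means and standard deviations are hypothetical:

```python
import random

def budget_at_risk(cost_components, prob_exceed=0.05, n_draws=20000, seed=42):
    """Monte Carlo life-cycle budget estimate. Each component is a
    (mean, stdev) pair of an assumed normally distributed cost; returns
    the budget that has the given probability of being exceeded
    (prob_exceed=0.05 -> 95th percentile of simulated totals)."""
    rng = random.Random(seed)
    totals = sorted(
        sum(max(0.0, rng.gauss(mu, sd)) for mu, sd in cost_components)
        for _ in range(n_draws)
    )
    idx = min(n_draws - 1, int((1.0 - prob_exceed) * n_draws))
    return totals[idx]
```

The same simulation, run at several exceedance probabilities, yields the curve of budget versus risk that a risk-mapping framework would place alongside social, environmental and political considerations.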

    Raising irrigation productivity and releasing water for intersectoral needs (RIPARWIN): RIPARWIN final technical report

    Keywords: River basins; Hydrology; River basin management; River basin development; Development projects; Water allocation; Irrigation water; Productivity; Irrigation management; Rice; Tanzania; Great Ruaha River Basin; Usangu River Basin

    Distributed-based massive processing of activity logs for efficient user modeling in a Virtual Campus

    This paper reports on a multi-fold approach to building user models based on the identification of navigation patterns in a virtual campus, allowing the campus' usability to be adapted to actual learners' needs and thus greatly stimulating the learning experience. However, user modeling in this context implies constant processing and analysis of user interaction data during long-term learning activities, which produces huge amounts of valuable data typically stored in server log files. Due to the large or very large size of the log files generated daily, massive processing is a foremost step in extracting useful information. To this end, this work first studies the viability of processing large log data files of a real virtual campus using different distributed infrastructures. More precisely, we study the time performance of massive processing of daily log files implemented following the master-slave paradigm and evaluated on Cluster Computing and PlanetLab platforms. The study reveals the complexity and challenges of massive processing in the big data era, such as the need to carefully tune the log file processing in terms of the chunk size of log data processed at slave nodes, as well as the processing bottleneck in truly geographically distributed infrastructures caused by the communication overhead between the master and slave nodes. Then, an application of the massive processing approach, resulting in log data processed and stored in a well-structured format, is presented. We show how to extract knowledge from the log data analysis using the WEKA framework for data mining purposes, demonstrating its usefulness in effectively building user models in terms of identifying interesting navigation patterns of online learners. The study is motivated and conducted in the context of the actual data logs of the Virtual Campus of the Open University of Catalonia. Peer Reviewed. Postprint (author's final draft)
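The master-slave, chunk-tuned processing described above can be sketched in miniature. This toy uses local worker threads in place of remote slave nodes, and the log line format and function names are assumptions for illustration, not the paper's actual pipeline:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def process_chunk(lines):
    """Slave task: count events per user in one chunk of log lines.
    Assumed (hypothetical) line format: 'timestamp user_id action'."""
    counts = Counter()
    for line in lines:
        parts = line.split()
        if len(parts) >= 3:
            counts[parts[1]] += 1
    return counts

def master(log_lines, n_slaves=4, chunk_size=1000):
    """Master: split the daily log into chunks, farm them out to slaves,
    and merge the partial results. chunk_size is the tuning knob the
    study highlights; in a distributed deployment, communication cost
    per chunk makes its choice non-trivial."""
    chunks = [log_lines[i:i + chunk_size]
              for i in range(0, len(log_lines), chunk_size)]
    with ThreadPoolExecutor(max_workers=n_slaves) as pool:
        partials = list(pool.map(process_chunk, chunks))
    total = Counter()
    for part in partials:
        total.update(part)
    return total
```

In the real setting the slaves are separate machines, so each chunk dispatch pays network latency; too-small chunks inflate that overhead while too-large chunks idle the slaves, which is exactly the trade-off the study reports.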
