20 research outputs found

    Implementing clinical practice guidelines for chronic obstructive pulmonary disease in an EHR system

    Get PDF
    The use of clinical practice guidelines to improve quality of care has been a widely discussed topic. Clinical practice guidelines (CPGs) aim to improve the health of patients by guiding individual care in clinical settings. CPGs bring potential benefits for patients by improving clinical decision making, improving efficiency, and enhancing patient care, while also optimizing financial value. Chronic conditions such as heart disease, stroke, and chronic obstructive pulmonary disease (COPD) plague the US healthcare system, imposing millions of dollars in healthcare-related costs. This paper demonstrates the integration of a CPG into an open-source EHR system to effectively manage COPD patients. The CPG is incorporated using the open web app standard, which allows it to be used with any web-browser-based EHR system, provided that data from the EHR system can be fed into the app. As a result, the CPG helps create a more effective and efficient decision-making process.
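    The paper's own implementation is not shown here, but the pattern it describes, a browser-based guideline app fed with EHR data, can be sketched. The TypeScript fragment below is a minimal, illustrative example assuming a simplified spirometry data shape and the widely used GOLD-style grading of airflow limitation; it is not the paper's actual code.

```typescript
// Minimal sketch of a browser-based CPG app consuming EHR data.
// The data shape and the GOLD-style severity thresholds below are
// illustrative assumptions, not the paper's implementation.

interface SpirometryResult {
  fev1PercentPredicted: number; // FEV1 as % of the predicted value
  fev1FvcRatio: number;         // post-bronchodilator FEV1/FVC
}

type CopdSeverity = "none" | "mild" | "moderate" | "severe" | "very severe";

// Classify airflow limitation following the widely used GOLD grading.
function classifyCopd(s: SpirometryResult): CopdSeverity {
  if (s.fev1FvcRatio >= 0.7) return "none"; // no obstructive pattern
  if (s.fev1PercentPredicted >= 80) return "mild";
  if (s.fev1PercentPredicted >= 50) return "moderate";
  if (s.fev1PercentPredicted >= 30) return "severe";
  return "very severe";
}

// The EHR feeds patient data into the app; the app returns guidance.
const patient: SpirometryResult = { fev1PercentPredicted: 45, fev1FvcRatio: 0.6 };
console.log(classifyCopd(patient)); // "severe"
```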

    Towards a Cyber-Physical Manufacturing Cloud through Operable Digital Twins and Virtual Production Lines

    Get PDF
    In the last decade, the paradigm of Cyber-Physical Systems (CPS) has integrated industrial manufacturing systems with Cloud Computing technologies for Cloud Manufacturing. Up to 2015, many CPS-based manufacturing systems collected real-time machining data to perform remote monitoring, prognostics and health management, and predictive maintenance. However, these CPS-integrated, network-ready machines were not directly connected to the elements of Cloud Manufacturing and required a human in the loop. Addressing this gap, in 2017 we introduced a new paradigm, the Cyber-Physical Manufacturing Cloud (CPMC), which bridges the gap between physical machines and the virtual space. CPMC virtualizes machine tools in the cloud through web services for direct monitoring and operation over the Internet. CPMC differs fundamentally from contemporary manufacturing paradigms. For instance, CPMC virtualizes machining tools in the cloud using remote services and establishes direct Internet-based communication, which is overlooked in existing Cloud Manufacturing systems. Another contemporary paradigm, cyber-physical production systems, enables networked access to machining tools, whereas CPMC virtualizes manufacturing resources in the cloud and monitors and operates them over the Internet. This dissertation defines the fundamental concepts of CPMC and expands its horizon into different aspects of cloud-based virtual manufacturing, such as Digital Twins and Virtual Production Lines. The Digital Twin (DT) is another concept, evolving since 2002, that creates as-is replicas of machining tools in cyber space. Up to 2018, many researchers proposed state-of-the-art DTs that focused only on monitoring production lifecycle management through simulations and data-driven analytics, but they overlooked executing manufacturing processes through DTs from the virtual space. This dissertation identifies that DTs can be made more productive if they engage directly in the execution of manufacturing operations besides monitoring. Towards this novel approach, this dissertation proposes a new operable DT model of CPMC that inherits the features of direct monitoring and operation from the cloud. This research envisages, and opens the door for, future manufacturing systems whose resources are developed as cloud-based DTs for remote and distributed manufacturing. The proposed concepts and visions of DTs have spawned the following fundamental research. In 2019, this dissertation proposed a novel concept of DT-based Virtual Production Lines (VPLs) in CPMC. It presents a design of a service-oriented architecture of DTs that virtualizes physical manufacturing resources in CPMC; the proposed DT architecture offers a more compact and integral service-oriented virtual representation of manufacturing resources. To re-configure a VPL, one requirement is to establish DT-to-DT collaborations in manufacturing clouds, mirroring concurrent resource-to-resource collaborations on shop floors. Satisfying these requirements, this research designs a novel framework to easily re-configure, monitor, and operate VPLs using the DTs of CPMC. CPMC publishes individual web services for machining tools, a traditional approach in the domain of service computing, but this approach overcrowds service registry databases. In 2020, this dissertation introduced OpenDT, a novel service publication and discovery approach that publishes DTs with collections of services. Experimental results show easier discovery and remote access of DTs while re-configuring VPLs. The research proposed in this dissertation has received numerous citations from both industry and academia, clearly demonstrating the impact of its contributions.
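    The OpenDT idea of publishing a DT together with its collection of services, rather than one registry record per service, can be pictured with a small sketch. The TypeScript below is illustrative only; the type names, registry API, and endpoints are assumptions, not the dissertation's implementation.

```typescript
// Sketch of the OpenDT approach described above: publish one registry
// entry per digital twin bundling its services, instead of one entry
// per service. All names and shapes here are illustrative assumptions.

interface TwinService {
  name: string;            // e.g. "monitor", "operate"
  endpoint: string;        // URL where the service is exposed
}

interface DigitalTwin {
  id: string;
  machine: string;         // physical machine the twin mirrors
  services: TwinService[]; // collection published as one unit
}

class TwinRegistry {
  private twins = new Map<string, DigitalTwin>();

  publish(twin: DigitalTwin): void {
    this.twins.set(twin.id, twin); // one record per twin, not per service
  }

  // Discover twins offering a given capability, e.g. "operate".
  discover(capability: string): DigitalTwin[] {
    return [...this.twins.values()].filter(t =>
      t.services.some(s => s.name === capability));
  }
}

const registry = new TwinRegistry();
registry.publish({
  id: "dt-mill-01",
  machine: "3-axis CNC mill",
  services: [
    { name: "monitor", endpoint: "https://example.org/dt-mill-01/monitor" },
    { name: "operate", endpoint: "https://example.org/dt-mill-01/operate" },
  ],
});
console.log(registry.discover("operate").map(t => t.id)); // ["dt-mill-01"]
```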

    Web-oriented Event Processing

    Get PDF
    How can the Web be made situation-aware? Event processing is a suitable technology for obtaining the necessary real-time results. The Web, however, has many users and many application domains. Thus, we developed multi-schema friendly data models that allow re-use and mixing across diverse users and application domains. Furthermore, our methods describe protocols to exchange events on the Web, algorithms to execute the event-processing language, and algorithms to calculate access rights.
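    One way to picture a multi-schema friendly event model is a small common envelope that names the payload's vocabulary explicitly, so events from different application domains can be mixed and re-used. The TypeScript sketch below is illustrative only; the envelope fields are assumptions, not the dissertation's actual data model.

```typescript
// Illustrative sketch of a multi-schema friendly event: a small common
// envelope plus a payload whose vocabulary is named explicitly, so
// consumers from different domains can mix and re-use event types.

interface WebEvent<T> {
  id: string;
  time: string;   // ISO 8601 timestamp
  schema: string; // URI identifying the payload vocabulary
  source: string; // URI of the emitting component
  payload: T;
}

// A payload from one application domain using the shared envelope.
const sensorEvent: WebEvent<{ celsius: number }> = {
  id: "evt-1",
  time: new Date().toISOString(),
  schema: "https://example.org/schemas/temperature",
  source: "https://example.org/sensors/42",
  payload: { celsius: 21.5 },
};

// A consumer dispatches on the schema URI rather than on a fixed format.
function route(evt: WebEvent<unknown>): void {
  if (evt.schema.endsWith("/temperature")) {
    const { celsius } = evt.payload as { celsius: number };
    console.log("temperature reading:", celsius);
  }
}
route(sensorEvent);
```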

    Collaborative coding in the cloud : providing a paradigm shift to the way software development is achieved in environments of the future

    Full text link
    University of Technology, Sydney. Faculty of Engineering and Information Technology. This research aims to address a number of challenges surrounding traditional software development practices, including the need for team transparency, synergy between project components, and developers who are weighed down by desktop-based environments. A cloud computing model is discussed, including a hypothesis on the platform required to solve many of these challenges. A number of previous research agendas are raised, including extensions to the jEdit and Eclipse IDEs, purpose-built collaborative platforms, and an IDE that operates on a mobile device. Two case studies, around Google Wave and Mozilla Bespin, examine how industry leaders are addressing these challenges. Through a qualitative survey, the needs of developers and perceptions of cloud computing are explored with a discrete range of industry professionals. A proposed model is provided, which borrows concepts traditionally found in social networking and applies them to a software development context, and highlights a number of recommendations for success. A research subset is then chosen to provide a technical implementation of a Google Wave agent aimed at assisting distributed teams with cross-communication and autonomous up-skilling. Finally, the research outcome answers the question of whether an IDE can be deployed within cloud-based architectures and be adopted by the software development community. Given the infancy of the platform, the research finds that immediate deployment of the proposed platform cannot be realized, and that researchers are dependent on platform maturity before successful deployment and adoption can be achieved. The overall research provides a number of future research directions, including reassessment of the philosophy proposed throughout this research, implementation of the proposed framework, and improvements focused on the communication and collaboration agent developed. The research addresses a number of areas required in the arenas of communication and collaboration among the software engineering community.

    Web Application Programming Interfaces (APIs): general-purpose standards, terms and European Commission initiatives

    Get PDF
    From their inception, digital technologies have had a huge impact on our everyday life. In both the private and the public sectors, they have contributed to, or at times driven, change in organisational structures, ways of working, and how products and services are shaped and shared. Governments and public administration units, driven by the digital evolution of information and communications technology (ICT), are evolving from traditional workflow-based public service provision to digital equivalents (e-government), with more innovative forms of government and administration seeking the engagement of citizens and the private sector to co-create final services through user-centric approaches. Application Programming Interfaces (APIs), one of the most relevant ICT solutions, have contributed to this notable shift in the adoption of technology, especially when used over the web. They have affected the global economy of the private sector and are contributing to the digital transformation of governments. To explore this in more detail, the European Commission recently started the APIs4DGov study. One of the outputs of the study is an analysis of the API technological landscape, including its related standards and technical specifications for general-purpose use. The goal of the analysis presented in this brief report is to support the definition of stable APIs for digital government services adopted by governments or single public administration units. Such adoption would avoid the need to develop ad hoc solutions that could have limited scalability or potential for reuse. Instead, the work suggests considering a number of existing standards provided by standardisation bodies or, at least, technical specifications written by well-recognised consortia, vendors or users. The report also aims to support API stakeholders in the identification and selection of such solutions. To do this, it first gives a series of definitions to help the reader understand some basic concepts, as well as related standards and technical specifications. Then, it presents the description and classification (by resource representation, security, usability, test, performance and licence) of the standards and technical specifications collected. A shortlist of these documents (based on their utilisation, maintenance and stability) is also proposed, together with a brief description of each of them. Finally, the report provides a useful glossary with definitions of the relevant terms collected so far within the APIs4DGov study. JRC.B.6 - Digital Economy.
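    The classification scheme the report describes, tagging each standard or technical specification by resource representation, security, usability, test, performance and licence, then shortlisting by utilisation, maintenance and stability, can be modelled as a simple record type. The TypeScript below is an illustrative sketch; the field names and sample entry are assumptions, not the report's data.

```typescript
// Sketch of the report's classification dimensions as a record type.
// Field names and the sample entry are illustrative assumptions.

interface ApiSpecification {
  name: string;
  publisher: string; // standardisation body, consortium, vendor or user group
  dimensions: {
    resourceRepresentation?: string;
    security?: string;
    usability?: string;
    test?: string;
    performance?: string;
    licence?: string;
  };
  // Shortlisting criteria named in the report.
  utilisation: "low" | "medium" | "high";
  maintained: boolean;
  stable: boolean;
}

const entry: ApiSpecification = {
  name: "OpenAPI Specification",
  publisher: "OpenAPI Initiative",
  dimensions: { resourceRepresentation: "JSON/YAML", licence: "Apache-2.0" },
  utilisation: "high",
  maintained: true,
  stable: true,
};

// A shortlist keeps only widely used, maintained, stable specifications.
const shortlist = [entry].filter(s => s.utilisation === "high" && s.maintained && s.stable);
console.log(shortlist.map(s => s.name)); // ["OpenAPI Specification"]
```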

    A Reference Architecture for Service Lifecycle Management – Construction and Application to Designing and Analyzing IT Support

    Get PDF
    Service-orientation and the underlying concept of service-oriented architectures are a means to successfully address the need for flexibility and interoperability of software applications, which in turn leads to improved IT support of business processes. With a growing level of diffusion, sophistication, and maturity, the number of services and interdependencies is gradually rising. This increasingly requires companies to implement a systematic management of services along their entire lifecycle. Service lifecycle management (SLM), i.e., the management of services from the initiating idea to their disposal, is becoming a crucial success factor. Not surprisingly, the academic and practice communities increasingly postulate comprehensive IT support for SLM to counteract the inherent complexity. The topic is still in its infancy, with no comprehensive models available that help evaluate and design IT support in SLM. This thesis presents a reference architecture for SLM and applies it to the evaluation and design of SLM IT support in companies. The artifact, which largely resulted from consortium research efforts, draws from an extensive analysis of existing SLM applications, case studies, focus group discussions, bilateral interviews, and existing literature. Formal procedure models and a configuration terminology allow adapting and applying the reference architecture to a company's individual setting. Corresponding usage examples prove its applicability and demonstrate the arising benefits within various SLM IT support design and evaluation tasks. A statistical analysis of the knowledge embodied within the reference data leads to novel, highly significant findings. For example, contemporary standard applications do not yet emphasize the lifecycle concept but rather tend to focus on small parts of the lifecycle, especially on service operation. This forces user companies into either a best-of-breed or a custom-development strategy if they are to implement integrated IT support for their SLM activities. SLM software vendors and internal software development units need to undergo a paradigm shift in order to better reflect the numerous interdependencies and increasing intertwining within services' lifecycles. The SLM architecture is a first step towards achieving this goal.

    Towards a unified methodology for supporting the integration of data sources for use in web applications

    Get PDF
    Organisations are making increasing use of web applications and web-based systems as an integral part of providing services. Examples include personalised dynamic user content on a website, social media plug-ins, and web-based mapping tools. For these types of applications to be fully functional and of maximum use to the user, they require the integration of data from multiple sources. The focus of this thesis is on improving this integration process, with an emphasis on web applications with multiple sources of data. Integration of data from multiple sources is problematic for many reasons. Current integration methods tend to be domain-specific and application-specific; they are often complex, have compatibility issues with different technologies, lack maturity, are difficult to re-use, and do not accommodate new and emerging models and integration technologies. Technologies to achieve integration, such as brokers and translators, do exist, but because of their domain specificity they cannot be used as a generic solution for achieving the integration outcomes required for successful web application development. It is because of these difficulties with integration, and the wide variety of integration approaches, that developers need assistance in selecting the integration approach most appropriate to their needs. This thesis proposes GIWeb, a unified top-down data integration methodology instantiated with a framework that will aid developers in their integration process. It acts as a conceptual structure to support the chosen technical approach, assisting in the integration of data sources to support web application builders. The thesis presents the rationale for the framework based on an examination of the range of applications, associated data sources, and the range of potential solutions. The framework is evaluated using four case studies.
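    The kind of integration task the thesis targets can be pictured with a small sketch: a web application pulling differently shaped records from two sources and mapping both into one unified model. The TypeScript below is hypothetical; the URLs and field names are invented for illustration, and GIWeb itself is a methodology, not this code.

```typescript
// Sketch of the integration problem: two differently shaped web sources
// mapped into one common model. URLs and field names are hypothetical.

interface PlaceRecord { // the application's unified model
  name: string;
  lat: number;
  lon: number;
}

// Source A: a mapping API using { title, coords: [lat, lon] }.
async function fromMapApi(url: string): Promise<PlaceRecord[]> {
  const rows: { title: string; coords: [number, number] }[] =
    await (await fetch(url)).json();
  return rows.map(r => ({ name: r.title, lat: r.coords[0], lon: r.coords[1] }));
}

// Source B: a social feed using { label, latitude, longitude }.
async function fromSocialFeed(url: string): Promise<PlaceRecord[]> {
  const rows: { label: string; latitude: number; longitude: number }[] =
    await (await fetch(url)).json();
  return rows.map(r => ({ name: r.label, lat: r.latitude, lon: r.longitude }));
}

// The web application consumes one merged, uniform list.
async function integrate(): Promise<PlaceRecord[]> {
  const [a, b] = await Promise.all([
    fromMapApi("https://example.org/map/places"),
    fromSocialFeed("https://example.org/feed/checkins"),
  ]);
  return [...a, ...b];
}
```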

    Open Infrared : enhancing environmental monitoring through accessible remote sensing, in Indonesia and beyond

    Get PDF
    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2013. Cataloged from the PDF version of the thesis. Includes bibliographical references. As the human landscape changes ever more rapidly, environmental change accelerates. Much environmental information is publicly available as infrared satellite data; however, for the general user, this information is difficult to obtain and even more difficult to interpret. With this in mind, my team and I launched OpenIR (Open Infrared), an ICT (Information Communication Technology) that provides geo-located IR (infrared) satellite data as on-demand map layers, automates environmental feature classification, experiments with flood risk mapping, and interfaces IR data with crowd- and citizen-maps. OpenIR's initial use case is emergency management and environmental monitoring in the economically developing and ecologically vulnerable archipelago of Indonesia, where we conducted initial usability tests in January 2013. By Arlene Ducao. S.M.
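    A standard example of turning infrared satellite bands into an interpretable map layer is the Normalized Difference Vegetation Index, NDVI = (NIR - Red) / (NIR + Red). The sketch below shows this common-practice computation; it is illustrative and not claimed to be OpenIR's actual classification pipeline.

```typescript
// A standard technique for deriving an environmental map layer from
// infrared satellite data: NDVI = (NIR - Red) / (NIR + Red).
// Common practice, not OpenIR's actual pipeline.

// Per-pixel reflectance bands; both arrays have the same length.
function ndvi(nir: Float64Array, red: Float64Array): Float64Array {
  const out = new Float64Array(nir.length);
  for (let i = 0; i < nir.length; i++) {
    const denom = nir[i] + red[i];
    out[i] = denom === 0 ? 0 : (nir[i] - red[i]) / denom; // range [-1, 1]
  }
  return out;
}

// Values near +1 indicate dense vegetation; near 0, bare soil or
// built-up areas; negative values typically indicate water.
const layer = ndvi(Float64Array.from([0.5, 0.3]), Float64Array.from([0.1, 0.25]));
console.log(Array.from(layer)); // [0.666..., 0.0909...]
```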

    Designing the Sakai Open Academic Environment: A distributed cognition account of the design of a large scale software system

    Get PDF
    Social accounts of technological change make the flexibility and openness of interpretations the starting point of an argument against technological determinism. They suggest that technological change unfolds in the semantic domain, but they focus on the social processes around the interpretations of new technologies and do not address the conceptual processes of change in interpretations. The dissertation presents an empirically grounded case study of the design process of an open-source online software platform, based on the framework of distributed cognition, to argue that the cognitive perspective is needed for understanding innovation in software, because it allows us to describe the reflexive and expansive contribution of conceptual processes to new software and the significance of professional epistemic practices in framing the direction of innovation. The framework of distributed cognition brings the social and cognitive perspectives together on account of its understanding of conceptual processes as distributed over time, among people, and between humans and artifacts. The dissertation argues that an evolving open-source software landscape became translated into the open-ended local design space of a new software project in a process of infrastructural implosion, and the design space prompted participants to outline and pursue epistemic strategies of sense-making and learning about the contexts of use. The result was a process of conceptual modeling, which yielded a conceptually novel user interface. Prototyping professional practices of user-centered design lent directionality to this conceptual process in terms of a focus on individual activities with the user interface. Social approaches to software design under the broad umbrella of human-centered computing have been seeking to inform design on the basis of empirical contributions about a social context. The analysis has shown that empirical engagement with the contexts of use followed from conceptual modeling, and concern about real-world contexts was aligned with the user-centered direction that design was taking. I also point out a social-technical gap in the design process in connection with the repeated performance challenges that the platform was facing, and describe the possibility of a social-technical imagination. Ph.D.