
    The Internet-of-Things Meets Business Process Management: Mutual Benefits and Challenges

    The Internet of Things (IoT) refers to a network of connected devices that collect and exchange data over the Internet. These things can be artificial or natural and interact as autonomous agents, forming a complex system. Business Process Management (BPM), in turn, was established to analyze, discover, design, implement, execute, monitor and evolve collaborative business processes within and across organizations. While the IoT and BPM have been regarded as separate topics in research and practice, we believe that the management of IoT applications will benefit strongly from BPM concepts, methods and technologies on the one hand, while on the other hand the IoT poses challenges that will require enhancements and extensions of the current state of the art in the BPM field. In this paper, we question to what extent these two paradigms can be combined and discuss the emerging challenges.
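
    To make the link between the two paradigms concrete, the following is a minimal, hypothetical Python sketch (not taken from the paper) in which a reading from an IoT device is treated as an event that advances an instance of a business process; the class names, the cold-chain scenario and the temperature threshold are illustrative assumptions.

```python
# Hypothetical sketch: an IoT sensor event driving a step in a BPM-style process.
# Names (SensorEvent, ColdChainProcess) and the threshold are illustrative only.
from dataclasses import dataclass, field


@dataclass
class SensorEvent:
    device_id: str
    reading: float  # e.g. temperature in degrees Celsius


@dataclass
class ColdChainProcess:
    """Toy process instance: ship goods, monitor temperature, escalate on breach."""
    state: str = "shipping"
    log: list = field(default_factory=list)

    def handle(self, event: SensorEvent, threshold: float = 8.0) -> None:
        # BPM view: the IoT event is correlated with a running process instance
        # and may trigger a transition (here: escalation on a temperature breach).
        self.log.append((event.device_id, event.reading))
        if self.state == "shipping" and event.reading > threshold:
            self.state = "escalated"


if __name__ == "__main__":
    process = ColdChainProcess()
    process.handle(SensorEvent(device_id="truck-42-sensor-1", reading=9.5))
    print(process.state)  # -> "escalated"
```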

    The Translation Evidence Mechanism. The Compact between Researcher and Clinician.

    Currently, producing best evidence is a concentrated effort by researchers. Researchers produce information and expect that clinicians will implement their advances in improving patient care. However, difficulties exist in maximizing cooperation and coordination between the producers, facilitators, and users (patients) of best evidence outcomes. The Translational Evidence Mechanism is introduced to overcome these difficulties by forming a compact between researcher, clinician and patient. With this compact, best evidence may become an integral part of private practice when uncertainties arise in patient health status, treatments, and therapies. The mechanism is composed of an organization, a central database, and a decision algorithm. Communication between the translational evidence organization, clinicians and patients is through the electronic chart. Through the chart, clinical inquiries are made, and patient data from provider assessments and practice cost schedules are collected, encrypted (HIPAA standards), and entered into the central database. Outputs are returned within a timeframe suited to private practice and patient flow. The output consists of a clinical practice guideline that responds to the clinical inquiry with decision, utility and cost data (based on the "average patient") for shared decision-making within informed consent. This shared decision-making allows patients to "game" treatment scenarios using personal choice inputs. Accompanying the clinical practice guideline is a decision analysis that explains the optimized clinical decision. The resultant clinical decision is returned to the central database using the clinical practice guideline. The result is subsequently used to update current best evidence, indicate the need for new evidence, and analyze the changes made in best evidence implementation. When updates in knowledge occur, these are transmitted to the provider as alerts or flags through patient charts and other communication modalities.
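
    As a rough illustration of the data flow the abstract describes (a clinical inquiry, de-identified patient data entering a central database, and a guideline with decision, utility and cost data coming back), here is a hypothetical Python sketch. The function names, field names and the hash-based pseudonymization are stand-ins chosen for this sketch, not the mechanism's actual HIPAA-compliant encryption or decision algorithm.

```python
# Hypothetical sketch of the translational-evidence data flow described above.
# All names and the SHA-256 pseudonymization are illustrative assumptions.
import hashlib
from dataclasses import dataclass


@dataclass
class ClinicalInquiry:
    patient_id: str
    question: str          # e.g. "best therapy for condition X"
    assessment: dict       # provider assessment data
    practice_costs: dict   # practice cost schedule


def deidentify(inquiry: ClinicalInquiry) -> dict:
    """Replace the patient identifier with a one-way pseudonym before storage."""
    pseudonym = hashlib.sha256(inquiry.patient_id.encode()).hexdigest()[:12]
    return {
        "pseudonym": pseudonym,
        "question": inquiry.question,
        "assessment": inquiry.assessment,
        "costs": inquiry.practice_costs,
    }


def central_database_lookup(record: dict) -> dict:
    """Toy stand-in for the central database and decision algorithm: returns a
    clinical practice guideline with decision, utility and cost data for the
    'average patient'."""
    return {
        "guideline": f"Guideline responding to: {record['question']}",
        "decision": "recommended option A",
        "utility": 0.72,          # illustrative utility value
        "expected_cost": 1250.0,  # illustrative cost value
    }


if __name__ == "__main__":
    inquiry = ClinicalInquiry("patient-001", "best therapy for condition X",
                              {"bp": "120/80"}, {"visit": 90.0})
    guideline = central_database_lookup(deidentify(inquiry))
    print(guideline["decision"])
```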

    Fife Workforce Modelling Study: Final Report

    No abstract available

    An Evaluation of the Sustainability of Global Tuna Stocks Relative to Marine Stewardship Council Criteria

    The Marine Stewardship Council (MSC) has established a program whereby a fishery may be certified as sustainable. The sustainability of a fishery is defined by MSC criteria embodied in three Principles, relating to the status of the stock, the ecosystem of which the stock is a member, and the fishery management system. Since many of these MSC criteria are comparable for global tuna stocks, the MSC scoring system was used to evaluate nineteen stocks of tropical and temperate tunas throughout the world and to evaluate the management systems of the Regional Fishery Management Organizations (RFMOs) associated with these stocks.
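
    To illustrate how an MSC-style evaluation of a stock might be aggregated, here is a hypothetical Python sketch. The thresholds used (an average of at least 80 per Principle and no individual indicator below 60) are commonly cited MSC pass conditions and are assumptions of this sketch, not the paper's exact scoring rules; the example scores are invented.

```python
# Hypothetical sketch of aggregating MSC-style scores for one tuna stock.
# Thresholds and example scores are illustrative assumptions.
from statistics import mean


def evaluate_stock(scores_by_principle: dict[str, list[float]]) -> dict:
    """Return per-Principle averages and an overall pass/fail flag."""
    averages = {p: mean(s) for p, s in scores_by_principle.items()}
    passes = all(avg >= 80 for avg in averages.values()) and all(
        score >= 60 for s in scores_by_principle.values() for score in s
    )
    return {"averages": averages, "passes": passes}


if __name__ == "__main__":
    # Illustrative indicator scores under the three MSC Principles:
    # stock status, ecosystem impacts, and the management system.
    stock = {
        "P1 stock status": [85, 90, 75],
        "P2 ecosystem": [80, 82, 78],
        "P3 management": [88, 84, 90],
    }
    print(evaluate_stock(stock))
```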

    Cyber security investigation for Raspberry Pi devices

    Big Data applications on the cloud are growing rapidly. When the cloud is attacked, the investigation relies on digital forensic evidence. This paper proposes data collection via Raspberry Pi devices in a healthcare setting. The significance of this work is that it could be expanded into a digital device array that takes big data security issues into account, with many potential impacts in the health area. Digital Forensics Science has been tagged as a reactive science by some, who believe that research and study in the field often arise from the need to respond to events that bring about the need for investigation; this work was carried out as proactive research that will add knowledge to the field of Digital Forensic Science. The Raspberry Pi is a cost-effective, pocket-sized computer that has gained global recognition since its development in 2008, with widespread usage of the device for different computing purposes. The Raspberry Pi can potentially serve as a cyber security device and relate to forensic investigation in the near future. This work used a systematic approach to study the structure and operation of the device and established the security issues that its widespread usage can pose in settings such as healthcare or smart cities. Furthermore, its evidential information will be useful in the event that the device becomes the subject of a digital forensic investigation in the foreseeable future. In healthcare systems, PII (personally identifiable information) is a very important issue. When a Raspberry Pi plays a processing role, its security is vital; consequently, digital forensic investigation of Raspberry Pi devices becomes necessary.
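
    As a minimal sketch of forensically oriented data collection on a Raspberry Pi of the kind the abstract envisages, the following Python example hashes each collected log file so its integrity can be verified later. The log paths and the manifest format are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of forensically sound log collection on a Raspberry Pi:
# each collected file is hashed so its integrity can be verified later.
# Paths and the manifest format are illustrative assumptions.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large logs do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def collect_evidence(sources: list[Path], manifest_path: Path) -> None:
    """Record file name, size and hash in a manifest for chain of custody."""
    manifest = {
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "items": [
            {"file": str(p), "size": p.stat().st_size, "sha256": sha256_of(p)}
            for p in sources if p.exists()
        ],
    }
    manifest_path.write_text(json.dumps(manifest, indent=2))


if __name__ == "__main__":
    collect_evidence([Path("/var/log/syslog"), Path("/var/log/auth.log")],
                     Path("evidence_manifest.json"))
```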

    Transdisciplinarity seen through Information, Communication, Computation, (Inter-)Action and Cognition

    Much as oil acted as a basic raw material and key driving force of industrial society, information acts as the raw material and principal mover of the knowledge society in knowledge production, propagation and application. New developments in information processing and information communication technologies allow increasingly complex and accurate descriptions, representations and models, which are often multi-parameter, multi-perspective, multi-level and multi-dimensional. This leads to the necessity of collaborative work between different domains with corresponding specialist competences, sciences and research traditions. We present several major transdisciplinary unification projects for information and knowledge, which proceed on the descriptive and logical levels as well as on the level of generative mechanisms. A parallel process of boundary crossing and transdisciplinary activity is going on in the applied domains. Technological artifacts are becoming increasingly complex, and their design is strongly user-centered, which brings in not only function and various technological qualities but also other aspects, including aesthetics, user experience, ethics, and sustainability with its social and environmental dimensions. When integrating knowledge from a variety of fields, with contributions from different groups of stakeholders, numerous challenges are met in establishing a common view and a common course of action. In this context, information is our environment, and informational ecology determines both epistemology and spaces for action. We present some insights into the current state of the art of transdisciplinary theory and practice in information studies and informatics. We depict different facets of transdisciplinarity as we see them from our different research fields, which include information studies, computability, human-computer interaction, multi-operating-systems environments and philosophy. Comment: Chapter in the forthcoming book Information Studies and the Quest for Transdisciplinarity (World Scientific), Mark Burgin and Wolfgang Hofkirchner, Editors.

    Architecture and Design of Medical Processor Units for Medical Networks

    This paper introduces analogical and deductive methodologies for the design of medical processor units (MPUs). From the study of the evolution of numerous earlier processors, we derive the basis for the architecture of MPUs. These specialized processors perform unique medical functions encoded as medical operational codes (mopcs). From a pragmatic perspective, MPUs function very much like CPUs. Both processors have unique operation codes that command the hardware to perform a distinct chain of sub-processes upon operands and generate a specific result unique to the opcode and the operand(s). In medical environments, the MPU decodes the mopcs, executes a series of medical sub-processes, and sends out secondary commands to the medical machine. Whereas operands in a typical computer system are numerical and logical entities, the operands in a medical machine are objects such as patients, blood samples, tissues, operating rooms, medical staff, medical bills, patient payments, etc. We follow the functional overlap between the two processors and evolve the design of medical computer systems and networks. Comment: 17 pages.
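
    To illustrate the CPU/MPU analogy described in the abstract, here is a hypothetical Python sketch in which a medical operational code (mopc) is decoded into an ordered chain of sub-processes executed on a medical operand, issuing secondary commands along the way. The mopc names, sub-process lists and operand fields are illustrative assumptions, not the paper's instruction set.

```python
# Hypothetical sketch of mopc decode-and-dispatch, mirroring how a CPU opcode
# triggers a fixed sequence of micro-operations. All names are illustrative.
from dataclasses import dataclass
from typing import Callable

# Each mopc maps to an ordered chain of medical sub-processes.
MOPC_TABLE: dict[str, list[str]] = {
    "ANALYZE_BLOOD_SAMPLE": ["verify_labels", "run_panel", "flag_anomalies"],
    "SCHEDULE_OPERATING_ROOM": ["check_availability", "reserve_room", "notify_staff"],
}


@dataclass
class MedicalOperand:
    kind: str   # e.g. "patient", "blood sample", "operating room"
    data: dict


def execute_mopc(mopc: str, operand: MedicalOperand,
                 emit_command: Callable[[str], None]) -> None:
    """Decode a mopc and issue secondary commands to the medical machine."""
    for subprocess_name in MOPC_TABLE.get(mopc, []):
        emit_command(f"{subprocess_name} on {operand.kind}")


if __name__ == "__main__":
    sample = MedicalOperand("blood sample", {"id": "S-17"})
    execute_mopc("ANALYZE_BLOOD_SAMPLE", sample, emit_command=print)
```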