19 research outputs found

    Systematic Approaches for Telemedicine and Data Coordination for COVID-19 in Baja California, Mexico

    Conference proceedings info: ICICT 2023: 2023 The 6th International Conference on Information and Computer Technologies, Raleigh, NC, United States, March 24-26, 2023, Pages 529-542
    We provide a model for the systematic implementation of telemedicine within a large COVID-19 evaluation center in Baja California, Mexico. Our model is based on human-centric design factors and cross-disciplinary collaborations for scalable, data-driven enablement of smartphone, cellular, and video teleconsultation technologies to link hospitals, clinics, and emergency medical services for point-of-care COVID-19 testing assessments and for subsequent treatment and quarantine decisions. A multidisciplinary team was rapidly created in cooperation with several institutions, including the Autonomous University of Baja California, the Ministry of Health, the Command, Communication and Computer Control Center of the Ministry of the State of Baja California (C4), colleges of medicine, and the College of Psychologists. Our objective is to provide information to the public, to evaluate COVID-19 in real time, and to track regional, municipal, and state-wide data in real time to inform supply chains and resource allocation in anticipation of a surge in COVID-19 cases.
    ICICT 2023: 2023 The 6th International Conference on Information and Computer Technologies https://doi.org/10.1007/978-981-99-3236-

    Big data use at an automotive manufacturer: a framework to address privacy concerns in Hadoop Technology.

    An automotive manufacturer can generate big data through accessible data points from internal and external Internet of Things (IoT) data sources connected to the production line. Big data analytics needs to be applied to these large and complex datasets to realise the associated opportunities, such as an improved manufacturing process, optimised supply chain management, competitive advantage, and business growth. To store, manage, and process the data, automotive manufacturers use Apache Hadoop, a cost-effective, scalable, and fault-tolerant technology. However, concerns have been raised regarding the privacy of big data in Apache Hadoop: a key challenge is its weak security model, which leaves data susceptible to unauthorised users. A breach in data privacy can make automotive manufacturers victims of theft of trade secrets and intellectual property by corporate spies, with negative consequences for company reputation, business competitiveness, and growth in the automotive market. This study investigated a solution for ensuring big data privacy when using Hadoop technology. The Selective Organisational Information Privacy and Security Violations Model (SOIPSVM) and the Capability Maturity Model (CMM) provided the theoretical base for the study. The researcher undertook a literature analysis and a qualitative study to understand and address the identified research problem. Primary data was collected from ten Information Technology (IT) specialists at a local automotive manufacturer, who took part in interview sessions that included the completion of a questionnaire. All questions were pre-determined and open-ended, and the participants' responses were recorded. Primary data was analysed inductively by identifying relevant themes and sub-themes.
    The literature analysis drew on academic journals, conference proceedings, websites, and books, which were critically discussed in the study. The findings indicated various measures the automotive manufacturer should implement to address the research problem. Critical success factors were derived from these measures, addressing significant data privacy issues in the use of Hadoop technology: controlling internal and external data sources; monitoring the value of big data for improving the manufacturing process and user behaviour; implementing user authentication; encrypting data; maintaining a disaster recovery and backup plan; enforcing authorisation and Access Control Lists (ACLs); conducting audits and regular reviews of user access to data; applying data masking to sensitive data and tokenization to secure data; building in-house infrastructure to store and analyse data; and installing regular security updates and updating passwords regularly. Each factor had a purpose that examined big data management, governance, and compliance in detail, and together the factors contributed to ensuring data privacy in the use of Hadoop technology. The factors were categorised into contextual and rule-and-regulatory conditions adopted from the SOIPSVM, and the identified conditions were then aligned to the five-level CMM. Each condition was expanded upon at various maturity levels to form a framework that addresses the main research problem. The framework is applied as an independent assessment of each critical success factor and provides a guide through the various maturity levels, with the purpose of addressing and overcoming big data privacy concerns in the use of Hadoop technology at a local automotive manufacturer.
    Thesis (MCom) (Information Systems) -- University of Fort Hare, 202
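    Two of the critical success factors above, data masking and tokenization, can be illustrated with a minimal sketch. This example is not from the study; all names (`mask`, `TokenVault`, the sample record fields) are hypothetical, and a production Hadoop deployment would use platform-native tooling rather than application code like this.

    ```python
    import secrets

    # Masking hides most of a sensitive value for display; tokenization replaces
    # it with an opaque token that maps back to the original only via a vault.

    def mask(value: str, visible: int = 4) -> str:
        """Mask all but the last `visible` characters of a value."""
        if len(value) <= visible:
            return "*" * len(value)
        return "*" * (len(value) - visible) + value[-visible:]

    class TokenVault:
        """Replace sensitive values with random tokens; originals stay in the vault."""
        def __init__(self) -> None:
            self._vault: dict[str, str] = {}

        def tokenize(self, value: str) -> str:
            token = "tok_" + secrets.token_hex(8)
            self._vault[token] = value
            return token

        def detokenize(self, token: str) -> str:
            return self._vault[token]

    record = {"vin": "1HGCM82633A004352", "supplier_id": "SUP-99871"}
    print(mask(record["vin"]))          # masks all but the last four characters
    vault = TokenVault()
    token = vault.tokenize(record["supplier_id"])
    assert vault.detokenize(token) == record["supplier_id"]
    ```

    The design difference matters for Hadoop pipelines: masked data is irreversibly safe to share with analysts, while tokenized data remains joinable across datasets and can be restored by authorised services holding the vault.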

    RV-Enabled Framework for Self-Adaptive Software

    Software systems keep increasing in scale and complexity, requiring ever more effort to design, build, test, and deploy. Systems are integrated from separately developed modules. Over the life of a system, individual modules may be updated, which may allow incompatibilities between modules to slip in. Consequently, many faults in a software system are discovered only after the system is built and deployed. Runtime verification (RV) is a collection of dynamic techniques for detecting faults in software systems. An executable monitor is constructed from a formally specified property of the system being checked (the target system) and is run over a stream of observations (events) to check whether the property is satisfied. Although existing tools can specify and monitor properties efficiently, it remains challenging to apply RV to large-scale real-world applications. From the perspective of monitoring requirements, we need a formalism that can describe both high- and low-level behaviors of the target system. The complexity of the target program also raises issues: for instance, it may contain a set of loosely coupled components that may be added or removed dynamically. Correspondingly, monitoring requirements are often defined over asynchronous observations that carry data whose domains scale up as the target system expands. How to conveniently specify such properties and generate monitors that check them efficiently is a challenge. Beyond detecting faults, self-adaptive software is desirable for tolerating faults or unexpected environment changes during execution. By equipping monitors with reflexive adaptation actions, runtime enforcement (RE) can be used to improve the robustness of the system. However, there is little work on analyzing possible interference between the implementation of adaptation actions and the target program.
    In this thesis, we present SMEDL, an RV framework with a specification language designed for high usability with respect to expressiveness, efficiency, and flexible deployment. A property specification is composed of a set of communicating monitors described as extended finite state machines (EFSMs). High-level properties can be straightforwardly transformed into SMEDL specifications, while actions can be specified on transitions to express low-level imperative behaviors. The deployment of monitors can be explicitly specified to support both centralized and distributed software. Based on a dynamically scalable monitor structure, we propose a novel method to efficiently check parametric properties that rely on the data that events carry. To tackle the challenges of monitoring timing properties in an asynchronous environment, we propose a conceptual monitor architecture that clearly separates the monitoring of time intervals from the rest of property checking. To support software adaptation, we extend the SMEDL framework to specify enforcement specifications, generate implementations, and instrument them into the target system. Interference between the adaptation implementation and the target system can be analyzed statically using Hoare logic. Instead of building a whole new proof for the target system globally, we present a method to generate local proof obligations for better scalability.
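    The monitor-over-an-event-stream idea described in this abstract can be sketched as a tiny extended finite state machine. This is an illustrative toy in Python, not SMEDL's actual specification language; the property ("every open must be matched by a close before the trace ends") and all names are assumptions of the example.

    ```python
    # Toy runtime-verification monitor as an extended finite state machine (EFSM):
    # a control state plus a data variable (open_count) extending the plain FSM.

    class Monitor:
        def __init__(self) -> None:
            self.state = "idle"       # control state of the EFSM
            self.open_count = 0       # data variable extending the FSM
            self.violated = False

        def step(self, event: str) -> None:
            """Consume one observation from the target system's event stream."""
            if self.violated:
                return
            if event == "open":
                self.state = "active"
                self.open_count += 1
            elif event == "close":
                if self.state != "active" or self.open_count == 0:
                    self.violated = True   # close without a matching open
                else:
                    self.open_count -= 1
                    if self.open_count == 0:
                        self.state = "idle"

        def verdict_at_end(self) -> bool:
            """True iff the property held over the whole finite trace."""
            return not self.violated and self.open_count == 0

    m = Monitor()
    for e in ["open", "open", "close", "close"]:
        m.step(e)
    print(m.verdict_at_end())  # prints True: every open was matched by a close
    ```

    In an RV framework such monitors are generated from declarative specifications and attached to the running system's instrumentation points, rather than written by hand as above.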

    From Simulation to Runtime Verification and Back: Connecting Single-Run Verification Techniques

    Modern safety-critical systems, such as aircraft and spacecraft, crucially depend on rigorous verification, from design time to runtime. Simulation is a highly developed, time-honored design-time verification technique, whereas runtime verification is a much younger outgrowth of modern complex systems that both enable embedding analysis on-board and require mission-time verification, e.g., for flight certification. While the attributes of simulation are well defined, the vocabulary of runtime verification is still being formed; both are active research areas needed to ensure safety and security. This invited paper explores the connections and differences between simulation and runtime verification and poses open research questions regarding how each might be used to advance past bottlenecks in the other. We unify their vocabulary, list their commonalities and contrasts, and examine how their artifacts may be connected to push the state of the art of what we can (safely) fly.

    16th SC@RUG 2019 proceedings 2018-2019

