
    Systematic specification of requirements for assembly process control system in the pharmaceutical industry

    Abstract. Pharmaceutical manufacturing is one of the most strictly regulated fields in the world. Manufacturers of pharmaceutical products are legally obliged to monitor the safety and quality of their products, and any defects or manufacturing errors affecting a product must be traceable for the sake of patient safety. Regulatory bodies have set strict demands for data integrity in manufacturing records. The main objective of this thesis is to evaluate whether the proposed supervisory control and data acquisition (SCADA) software can adhere to the prevailing regulatory framework. The evaluation focuses on the software's handling of electronic records and electronic signatures. Features such as user management, alarm and event management, reporting, and requirements set locally at the target company are investigated and compared against the prevailing data-integrity regulations. The results showed that the proposed software, when properly configured, complies with the prevailing regulations on electronic records and electronic signatures, and that it can meet the requirements set by the target company. Without sufficient expertise, however, the software can still be operated in ways that violate the current regulations. (Finnish title: Systemaattinen vaatimusmäärittely kokoonpanoprosessin ohjausjärjestelmälle lääketeollisuudessa.)
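    As a concrete illustration of the kind of data-integrity control regulators expect for electronic records and signatures, the hedged Python sketch below binds a user identity, an action, and a timestamp to a record with a keyed hash, so that any later alteration becomes detectable. This is not the evaluated SCADA product's implementation; the record layout, the key handling, and every name in it are assumptions made for illustration.

        # Hypothetical sketch: binding an electronic signature to an
        # electronic record in an append-only audit trail. Not the
        # evaluated product's code; key management is out of scope here.
        import hashlib
        import hmac
        import json
        from datetime import datetime, timezone

        SIGNING_KEY = b"example-secret"  # assumption: held by the system, per user in practice

        def signed_audit_record(user_id: str, action: str, payload: dict) -> dict:
            """Create an audit record whose signature covers who did what, and when."""
            record = {
                "user": user_id,
                "action": action,          # e.g. "batch.release" (hypothetical event name)
                "payload": payload,        # the electronic record content
                "timestamp": datetime.now(timezone.utc).isoformat(),
            }
            body = json.dumps(record, sort_keys=True).encode()
            record["signature"] = hmac.new(SIGNING_KEY, body, hashlib.sha256).hexdigest()
            return record

        def verify(record: dict) -> bool:
            """Recompute the keyed hash over the record body and compare signatures."""
            body = {k: v for k, v in record.items() if k != "signature"}
            message = json.dumps(body, sort_keys=True).encode()
            expected = hmac.new(SIGNING_KEY, message, hashlib.sha256).hexdigest()
            return hmac.compare_digest(expected, record["signature"])

    In a real system the signing key would live in the platform's user-management layer, so the signature also attests which authenticated user performed the action.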

    An investigation into the role of data governance in improving data quality: a case study of the Omani banking sector

    In the era of big data analytics, data is widely recognised as a valuable asset that can enable organisations to achieve their strategic objectives. Despite that, banks are still struggling to maintain high-quality data. Prior studies show that a data governance programme can play a critical role in improving data quality: it gives data quality professionals a holistic approach for formally defining the policies, procedures, and decision rights required to manage data quality in a more systematic manner. However, few empirical studies have been conducted in this area. The present paper therefore aims to close this gap by investigating the data quality problem in the Omani banking industry to understand how various data governance mechanisms can address it. The study adopted a qualitative case study design, with semi-structured interviews and document reviews used to collect data. A theoretical framework by Abraham et al. (2019) guided the collection and analysis of the data, and thematic analysis (TA) by Braun and Clarke was followed for data analysis. The findings suggest that the data governance mechanisms ‘performance measurement’, ‘compliance monitoring’ and ‘training’ have positively contributed to mitigating data quality issues in the Omani banking sector. Keywords: data quality, data governance, information governance, the banking industry

    Legal compliance by design (LCbD) and through design (LCtD): preliminary survey

    1st Workshop on Technologies for Regulatory Compliance, co-located with the 30th International Conference on Legal Knowledge and Information Systems (JURIX 2017). The purpose of this paper is twofold: (i) carrying out a preliminary survey of the literature and research projects on Compliance by Design (CbD); and (ii) clarifying the double process of (a) extending business management techniques to other regulatory fields, and (b) converging trends in legal theory, legal technology and Artificial Intelligence. The paper highlights the connections and differences we found across different domains and proposals. We distinguish three different policy-driven types of CbD: (i) business, (ii) regulatory, and (iii) legal. The recent deployment of ethical views, and the implementation of general principles of privacy and data protection, lead to the conclusion that, in order to appropriately define legal compliance, Compliance through Design (CtD) should be differentiated from CbD.

    DBKnot: A Transparent and Seamless, Pluggable Tamper Evident Database

    Database integrity is crucial to organizations that rely on databases of important data, yet such databases are vulnerable to internal fraud. Tampering by malicious insiders who hold high technical authorization to the infrastructure, or by accounts compromised by outsiders, is an important attack vector. This thesis addresses this challenge for a class of problems where data is append-only and immutable. Examples of operations where data does not change are a) financial institutions (banks, accounting systems, stock markets, etc.), b) registries and notary systems where important data is kept but never changed, and c) system logs that must be kept intact for performance and forensic inspection if needed. The approach targets implementation seamlessness, with little or no change required in existing systems. Transaction tracking for tamper detection is done by serially and cumulatively hashing transactions together (a hash chain), while an external time-stamper and signer signs the resulting linkages. This allows transactions to be tracked without any of the organization's data leaving its premises for a third party, which also reduces the performance impact of tracking. It is achieved by adding a tracking layer and embedding it inside the data workflow while keeping it as non-invasive as possible. DBKnot implements these features a) natively inside databases, or b) embedded inside Object Relational Mapping (ORM) frameworks, and c) outlines a direction for implementing it as a stand-alone microservice reverse proxy. A prototype ORM and database layer has been developed and tested for seamlessness of integration and ease of use. Additionally, different optimizations introducing pipelining parallelism into the hashing/signing process were tested to measure their impact on performance. Stock-market information was used for experimentation with DBKnot, and the initial results showed slightly less than a 100% increase in transaction time for the most basic, sequential, synchronous version of DBKnot. The signing and hashing overhead per record does not grow significantly as the amount of data increases. Several alternate optimizations to the design, validated by testing, resulted in a significant increase in performance.
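    The core mechanism lends itself to a short sketch. The Python below is a minimal, hypothetical illustration of the append-only hash chaining described above; the external time-stamping/signing service is stubbed out (in DBKnot it is a third party that signs digests without ever seeing the raw data), and none of this is DBKnot's actual code.

        # Hypothetical hash chain for append-only tamper evidence: each
        # link's hash folds in the previous hash, so editing any record
        # breaks every later link during verification.
        import hashlib

        class HashChain:
            def __init__(self):
                self.prev = b"\x00" * 32   # genesis value
                self.links = []            # list of (record, cumulative_hash)

            def append(self, record: bytes) -> bytes:
                h = hashlib.sha256(self.prev + record).digest()
                self.links.append((record, h))
                self.prev = h
                return h                   # digest to hand to the external signer

            def verify(self) -> bool:
                prev = b"\x00" * 32
                for record, stored in self.links:
                    if hashlib.sha256(prev + record).digest() != stored:
                        return False       # tampering detected at this link
                    prev = stored
                return True

        chain = HashChain()
        for row in (b"txn1", b"txn2", b"txn3"):
            chain.append(row)
        assert chain.verify()
        chain.links[1] = (b"tampered", chain.links[1][1])  # simulate an insider edit
        assert not chain.verify()

    The optimizations tested in the thesis pipeline these hashing and signing steps so they overlap rather than run strictly in sequence.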

    Leveraging socio-technical collaborations to support researchers at the University of Florida

    The “Developing socio-technical collaborations to promote good laboratory practice (GLP), responsible conduct of research (RCR), and research data management (RDM) at the University of Florida (UF)” proposal seeks to further support GLP, RCR, and RDM through infrastructure such as electronic research notebooks (ERNs), data repositories (general and domain-specific), and policies. Select ERNs, also known as electronic lab notebooks (ELNs), promote GLP. “The Principles of Good Laboratory Practice (GLP) is to ensure the quality and integrity of test data related to non-clinical safety studies” (OECD, n.d.). “Responsible conduct of research (RCR) is defined as ‘the practice of scientific investigation with integrity’” (UCSB, n.d.). “Research Data Management [RDM] is the care and maintenance of the data that is produced during the course of a research cycle. It is an integral part of the research process and helps to ensure that your data is properly organized, described, preserved, and shared” (DePaul University, 2019). A good ERN can support GLP, RCR, and RDM in research. GLP, RCR, and RDM require structural, relational, and transformative change: structural change includes policies, practices, and resource flows; relational change includes relationships, connections, and power dynamics; transformative change includes mental models (Kania, Kramer, & Senge, 2018). “Organizational change programmes often fail because they are too focused on one aspect of the system, commonly technology, and fail to analyze and understand the complex interdependencies that exist” (Leeds University Business School, n.d.). A sociotechnical systems theory approach to GLP, RCR, and RDM is one strategy for better understanding the complex interdependencies that exist in the conduct of research. This presentation explores select socio-technical RDM efforts to support researchers at UF. Research questions: 1. What constitutes responsible conduct of research (RCR)? 2. How can the use of ERNs promote transparency in the conduct of research? 3. How can the use of ERNs promote transparency in the reporting of research? 4. Does the use of ERNs promote responsible conduct of research (RCR)?

    Network and Database Security: Regulatory Compliance, Network, and Database Security - A Unified Process and Goal

    Database security has evolved; data security professionals have developed numerous techniques and approaches to assure data confidentiality, integrity, and availability. This paper shows that traditional database security, which focuses primarily on creating user accounts and managing user privileges on database objects, is not enough to protect data confidentiality, integrity, and availability. Drawing on a compilation of journals, articles, and classroom discussions, the paper focuses on unifying the process of securing data or information whether it is in use, in storage, or being transmitted. Promoting a change in database curriculum development trends may also play a role in helping secure databases. The paper takes the position that a conscientious effort to unify the database security process, which includes the database management system (DBMS) selection process, following regulatory compliance, analyzing and learning from the mistakes of others, implementing network security technologies, and securing the database, may prevent database breaches.
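    To make the unified view concrete, the hedged Python sketch below combines two of the layers named above: traditional privilege management on database objects and protection of data in transit via TLS. It assumes PostgreSQL accessed through psycopg2 with SSL enabled on the server; every host, database, role, and table name is hypothetical.

        # Hypothetical example: least-privilege accounts plus an encrypted
        # channel. Assumes PostgreSQL with SSL configured; psycopg2 passes
        # sslmode through to libpq.
        import psycopg2

        # Administrator side: the "traditional" control, privileges scoped
        # to specific objects for a dedicated application role.
        admin = psycopg2.connect(
            host="db.example.internal", dbname="orders",
            user="dba", password="example-only",
            sslmode="require",             # refuse unencrypted connections
        )
        with admin, admin.cursor() as cur:
            cur.execute("CREATE ROLE app_readonly LOGIN PASSWORD 'example-only'")
            cur.execute("GRANT SELECT ON orders_summary TO app_readonly")

        # Application side: the least-privilege role, also TLS-only, so data
        # is protected both at the object level and in transit.
        app = psycopg2.connect(
            host="db.example.internal", dbname="orders",
            user="app_readonly", password="example-only",
            sslmode="require",
        )

    As the paper argues, privilege scoping alone is insufficient; the point of the sketch is that it becomes one layer in a process that also covers the network path.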