11 research outputs found

    PARALIGN: rapid and sensitive sequence similarity searches powered by parallel computing technology

    Get PDF
    PARALIGN is a rapid and sensitive similarity search tool for the identification of distantly related sequences in both nucleotide and amino acid sequence databases. Two algorithms are implemented: accelerated Smith–Waterman and ParAlign. The ParAlign algorithm is similar to Smith–Waterman in sensitivity, while being as quick as BLAST for protein searches. The high speed is achieved by exploiting a form of parallel computing known as multimedia technology, which is available in modern processors but rarely used by other bioinformatics software. The software is also designed to run efficiently on computer clusters using the message-passing interface standard. A public search service powered by a large computer cluster has been set up and is freely available at , where the major public databases can be searched. The software can also be downloaded free of charge for academic use.
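    The core recurrence that PARALIGN accelerates is the Smith–Waterman dynamic programme. As a rough illustration only (not PARALIGN's implementation, which vectorises this computation with the processor's multimedia/SIMD instructions), a minimal scalar version of the local-alignment score might look like the sketch below; the scoring parameters are arbitrary.

```python
# Minimal Smith-Waterman local alignment score with a linear gap penalty.
# Illustrative sketch only: match/mismatch/gap values are arbitrary, and
# PARALIGN's real engine vectorises the recurrence with SIMD instructions.

def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0,
                          H[i - 1][j - 1] + s,   # diagonal: (mis)match
                          H[i - 1][j] + gap,     # gap in sequence b
                          H[i][j - 1] + gap)     # gap in sequence a
            best = max(best, H[i][j])
    return best

print(smith_waterman_score("HEAGAWGHEE", "PAWHEAE"))
```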

    A blockchain-based framework for trusted quality data sharing towards zero-defect manufacturing

    Get PDF
    There is a current wave of a new generation of digital solutions based on intelligent systems, hybrid digital twins and AI-driven optimization tools to assure quality in smart factories. Such digital solutions heavily depend on quality-related information within the supply chain business ecosystem to drive zero-waste value chains. To empower zero-waste value chain strategies with meaningful, reliable, and trustworthy data, there must be a solution for end-to-end industrial data traceability, trust, and security across multiple process chains or even inter-organizational supply chains. In this paper, we first present Product, Process, and Data quality services to drive zero-waste value chain strategies. Following this, we present the Trusted Framework (TF), which is a key enabler for the secure and effective sharing of quality-related information within the supply chain business ecosystem, and thus for quality optimization actions towards zero-defect manufacturing. The TF specification includes the data model and format of the Process/Product/Data (PPD) Quality Hallmark, the OpenAPI exposed to factory systems, and a comprehensive Identity Management layer for secure horizontal and vertical quality data integration. The PPD Hallmark and the TF already address some of the industrial needs for a trusted approach to sharing quality data between the different stakeholders of the production chain to empower zero-waste value chain strategies.
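    As a loose illustration of the traceability idea only: quality records for successive process steps can be made tamper-evident by chaining content hashes, in the spirit of a blockchain. The field names and the plain SHA-256 hash chain below are assumptions made for the sketch; the TF's actual PPD Quality Hallmark data model, OpenAPI and identity management layer are specified in the paper and not reproduced here.

```python
# Hypothetical sketch: chain quality records with content hashes so that
# tampering with an earlier record breaks the link to later ones.
# Field names and layout are invented; this is not the TF's data model.
import hashlib
import json

def hallmark(payload: dict, prev_hash: str = "") -> dict:
    record = {"payload": payload, "prev_hash": prev_hash}
    digest = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    record["hash"] = digest
    return record

h1 = hallmark({"process": "milling", "quality": "OK", "lot": "A-001"})
h2 = hallmark({"process": "assembly", "quality": "OK", "lot": "A-001"},
              prev_hash=h1["hash"])
print(h2["prev_hash"] == h1["hash"])  # True: the records are linked
```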

    A feasibility study in model based prediction of impact of changes on system quality

    No full text
    We propose a method, called PREDIQT, for model-based prediction of the impact of architecture design changes on system quality attributes. PREDIQT supports simultaneous analysis of several quality attributes and their trade-offs. This paper argues for the feasibility of the PREDIQT method based on a comprehensive industrial case study targeting a system for managing validation of electronic certificates and signatures worldwide. We first give an overview of the PREDIQT method, and then present an evaluation of the method in terms of a feasibility study and a thought experiment. The evaluation focuses on security and its trade-offs with the overall quality attributes of the target system. Commissioned by: the Research Council of Norway.
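    As a toy illustration of the kind of prediction involved (not PREDIQT's actual models or process): a quality-attribute estimate can be propagated bottom-up through a weighted dependency tree, and the impact of an architecture change can be "predicted" by re-evaluating the tree after altering a leaf. All node names, weights and values below are invented.

```python
# Toy sketch of weighted bottom-up propagation of a quality-attribute
# estimate over a dependency tree. Everything here is illustrative;
# PREDIQT's dependency views and process are described in the paper.

def fulfilment(node, tree):
    children = tree.get(node, [])
    if not children:                      # leaf: directly estimated value
        return leaf_values[node]
    # internal node: weighted sum of child estimates (weights sum to 1)
    return sum(w * fulfilment(child, tree) for child, w in children)

tree = {"security": [("authentication", 0.6), ("logging", 0.4)]}
leaf_values = {"authentication": 0.9, "logging": 0.7}

before = fulfilment("security", tree)
leaf_values["logging"] = 0.5              # simulate an architecture change
after = fulfilment("security", tree)
print(before, after)                      # predicted impact of the change
```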

    Software Engineering and AI for Data Quality in Cyber-Physical Systems

    No full text
    Cyber-physical systems (CPS) have been developed in many industrial sectors and application domains in which the quality requirements of the acquired data are a common factor. Data quality in CPS can deteriorate because of several factors, such as sensor faults and failures due to operating in harsh and uncertain environments. How can software engineering and artificial intelligence (AI) help manage and tame data quality issues in CPS? This is the question we aimed to investigate in the SEA4DQ workshop. Emerging trends in software engineering need to take data quality management seriously, as CPS are increasingly data-centric in their approach to acquiring and processing data along the edge-fog-cloud continuum. This workshop provided researchers and practitioners a forum for exchanging ideas, experiences, understanding of the problems, visions for the future, and promising solutions to the problems in data quality in CPS. Examples of topics include software/hardware architectures and frameworks for data quality management in CPS, and software engineering and AI to detect anomalies in CPS data or to repair erroneous CPS data. SEA4DQ 2021, which took place on August 24th, 2021, was a satellite event of the ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE) 2021. The workshop attracted 35 international participants and was exciting, with a great keynote and six excellent presentations, and it concluded on a high note with a panel discussion. SEA4DQ was motivated by the common research interests from the EU projects for Zero-Defects Manufacturing such as InterQ and Dat4.Zero. This work has received funding from the European Union's Horizon 2020 Research and Innovation programme under Grant Agreement No. 958357 (InterQ) and Grant Agreement No. 958363 (DAT4.Zero).

    Cumulative impact in proposed particularly valuable and vulnerable areas in Norwegian sea areas

    Get PDF
    In this work, an assessment has been made of the risk from cumulative impact, with associated uncertainty, for active sectors in Norwegian sea areas, for each of the sea areas the North Sea, the Norwegian Sea and the Barents Sea, and for each of the 19 proposed particularly valuable and vulnerable areas (SVOs). The results in this report should be regarded as a first and provisional version, as this is the very first time the framework has been applied to Norwegian sea areas. All information on the sectors' activity is taken from a representative period (2017-2019) and reflects current activity as closely as possible. We have chosen not to use more recent data, as the pandemic strongly affected some of the sectors. Acute spills and other accidents are not assessed in this report. Uncertainty related to exposure (in time and space) and to vulnerability is included in the assessments. There is large variation in the uncertainty, both sector-related and related to environmental value and/or pressure factor. This work builds on the report Særlig verdifulle og sårbare områder (SVO) i norske havområder – Miljøverdi og Miljøverdiers sårbarhet for påvirkninger. Here we apply the ODEMM (Options for Delivering Ecosystem-based Marine Management) framework to assess the risk of negative impact.
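    As a rough illustration of the kind of calculation involved (not the report's actual ODEMM scoring scales or uncertainty treatment): a relative cumulative-impact score can be formed by combining spatial and temporal exposure with vulnerability for each sector linkage and summing over linkages. The sectors and numbers below are invented.

```python
# Hedged sketch of an ODEMM-style cumulative risk score for one area:
# for each sector linkage, exposure (spatial x temporal overlap) is
# combined with vulnerability, then summed. Values are invented and the
# real framework uses categorical scales and explicit uncertainty.

linkages = [
    # (sector, spatial overlap, temporal overlap, vulnerability), all in [0, 1]
    ("fisheries", 0.8, 0.9, 0.6),
    ("shipping",  0.5, 1.0, 0.3),
    ("petroleum", 0.2, 0.7, 0.4),
]

def cumulative_risk(linkages):
    total = 0.0
    for sector, spatial, temporal, vulnerability in linkages:
        exposure = spatial * temporal
        total += exposure * vulnerability
    return total

print(cumulative_risk(linkages))  # relative cumulative impact for one SVO
```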