
    Application of Computational Intelligence in Visual Quality Optimization Watermarking and Coding Tools to Improve the Medical IoT Platforms Using ECC Cybersecurity Based CoAP Protocol

    To ensure copyright protection and authenticate ownership of media or entities, image watermarking techniques are utilized. This technique entails embedding hidden information about an owner in a specific entity so that any potential ownership dispute can be resolved. In recent years, several authors have proposed various approaches to watermarking. In computational intelligence contexts, however, there is not enough research comparing watermarking approaches. Soft computing techniques are now being applied to improve the performance of watermarking algorithms. This chapter investigates soft computing-based image watermarking for a medical IoT platform that aims to combat the spread of COVID-19 by allowing a large number of people to simultaneously and securely access their private data, such as photos and QR codes, in public places such as stadiums, supermarkets, and events with many participants. The platform is composed of QR code and RFID identification readers to verify the validity of a health pass, as well as an intelligent facial recognition system to verify the pass's owner. The proposed system uses artificial intelligence, psychovisual coding, the CoAP protocol, and security tools such as digital watermarking and ECC encryption to optimize the transmission of data captured from citizens wishing to access a given space in terms of execution time, bandwidth, storage space, energy, and memory consumption.
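
    As a rough illustration of the embedding step such a platform relies on, the sketch below hides a short owner identifier in the least significant bits of a grayscale image. The plain LSB scheme and the function names are illustrative assumptions for this listing, not the chapter's soft-computing method.

    # Minimal LSB image-watermarking sketch (an assumed baseline scheme,
    # not the chapter's soft-computing approach).
    import numpy as np

    def embed_watermark(image: np.ndarray, owner_id: str) -> np.ndarray:
        bits = [int(b) for ch in owner_id.encode() for b in f"{ch:08b}"]
        assert len(bits) <= image.size, "image too small for the payload"
        flat = image.flatten().copy()
        for i, bit in enumerate(bits):
            flat[i] = (flat[i] & 0xFE) | bit   # overwrite the least significant bit
        return flat.reshape(image.shape)

    def extract_watermark(image: np.ndarray, n_chars: int) -> str:
        bits = image.flatten()[: n_chars * 8] & 1
        return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                       for i in range(0, len(bits), 8))

    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    marked = embed_watermark(img, "owner42")
    assert extract_watermark(marked, 7) == "owner42"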

    Database NewSQL performance evaluation for big data in the public cloud

    For many years, relational databases have been the leading model for data storage, retrieval, and management. However, due to increasing needs for scalability and performance, alternative systems have emerged, notably NewSQL technology. NewSQL is a class of modern relational database management systems (RDBMS) that provide the scalable performance of NoSQL systems for online transaction processing (OLTP) read-write workloads while still maintaining the ACID guarantees of a traditional database system. In this research paper, the performance of a NewSQL database is evaluated and compared to a MySQL database, both running in the cloud, in order to measure the response time under different workload configurations.
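
    A response-time measurement of the kind the paper describes can be sketched as follows. sqlite3 stands in for the cloud MySQL and NewSQL endpoints, and the table, queries, and write ratios are assumptions for illustration; with a DB-API driver such as mysql-connector the measurement loop is unchanged.

    # Measure median per-operation response time under mixed read-write workloads.
    import sqlite3, statistics, time

    def run_workload(conn, n_ops: int, write_ratio: float) -> float:
        cur = conn.cursor()
        latencies = []
        for i in range(n_ops):
            start = time.perf_counter()
            if i % 100 < write_ratio * 100:      # write fraction of the workload
                cur.execute("INSERT OR REPLACE INTO kv (k, v) VALUES (?, ?)",
                            (i, "x" * 64))
            else:                                # read fraction
                cur.execute("SELECT v FROM kv WHERE k = ?", (i,))
                cur.fetchall()
            latencies.append(time.perf_counter() - start)
        conn.commit()
        return statistics.median(latencies)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE kv (k INTEGER PRIMARY KEY, v TEXT)")
    for ratio in (0.1, 0.5, 0.9):                # workload configurations
        print(f"write ratio {ratio}: {run_workload(conn, 1000, ratio):.6f} s")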

    Equivalence of three-dimensional spacetimes

    A solution to the equivalence problem in three-dimensional gravity is given, and a practically useful method to obtain a coordinate-invariant description of local geometry is presented. The method is a nontrivial adaptation of Karlhede's invariant classification of spacetimes of general relativity. The local geometry is completely determined by the curvature tensor and a finite number of its covariant derivatives in a frame where the components of the metric are constant. The results are presented in the framework of real two-component spinors in three-dimensional spacetimes, where the algebraic classifications of the Ricci and Cotton-York spinors are given and their isotropy groups and canonical forms are determined. As an application, Gödel-type spacetimes in three-dimensional general relativity are discussed. The conditions for local space and time homogeneity are derived, the equivalence of three-dimensional Gödel-type spacetimes is studied, and the results are compared with previous work on four-dimensional Gödel-type spacetimes.
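
    In broad strokes, a Karlhede-style classification compares the curvature and its covariant derivatives order by order; the schematic condition below is written in our own notation as a reminder of the general method, not a formula taken from the paper. One forms the sets
    \[
      R^{(q)} = \bigl\{\, R_{abcd},\ \nabla_{e_1} R_{abcd},\ \dots,\ \nabla_{e_1}\cdots\nabla_{e_q} R_{abcd} \,\bigr\},
    \]
    stopping at the first order $q$ at which $R^{(q+1)}$ contributes no new functionally independent component and the isotropy group stops shrinking; two metrics are then locally equivalent exactly when their sets $R^{(q)}$ can be matched by a frame transformation.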

    Why High-Performance Modelling and Simulation for Big Data Applications Matters

    Modelling and Simulation (M&S) offer adequate abstractions to manage the complexity of analysing big data in scientific and engineering domains. Unfortunately, big data problems are often not easily amenable to efficient and effective use of High Performance Computing (HPC) facilities and technologies. Furthermore, M&S communities typically lack the detailed expertise required to exploit the full potential of HPC solutions, while HPC specialists may not be fully aware of specific modelling and simulation requirements and applications. The COST Action IC1406 High-Performance Modelling and Simulation for Big Data Applications has created a strategic framework to foster interaction between M&S experts from various application domains on the one hand and HPC experts on the other, in order to develop effective solutions for big data applications. One of the tangible outcomes of the COST Action is a collection of case studies from various computing domains. Each case study brought together both HPC and M&S experts, testifying to the effective cross-pollination facilitated by the COST Action. In this introductory article, we argue why joining forces between the M&S and HPC communities is both timely in the big data era and crucial for success in many application domains. Moreover, we provide an overview of the state of the art in the various research areas concerned.

    Study and processing of noncommutative formal power series for the computation of the minimal representation of dynamical systems


    Analysis and processing aspects of data in big data applications


Macsyma computation of the local minimal realization of dynamical systems whose generating power series are finite

    We present a package of Macsyma programs allowing the manipulation of words and of noncommutative power series over a finite alphabet. Building on the work of M. Fliess and C. Reutenauer concerning the local realization of nonlinear dynamical systems, we present an algorithm for computing the local minimal realization of finite generating power series. We describe this algorithm in the computer algebra system Macsyma.
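
    As a rough illustration of the underlying idea, in our own terms rather than the paper's Macsyma code: for a formal power series with coefficients indexed by words, the dimension of a minimal realization equals the rank of its Hankel matrix (after Fliess), which the sketch below computes for a toy series over a two-letter alphabet.

    # Rank of the Hankel matrix of a noncommutative formal power series;
    # the rank gives the dimension of a minimal realization. Toy example.
    from itertools import product
    import numpy as np

    ALPHABET = "ab"

    def words(max_len):
        yield ""                                  # the empty word
        for n in range(1, max_len + 1):
            for w in product(ALPHABET, repeat=n):
                yield "".join(w)

    def coeff(word):
        return 2 ** word.count("a")               # example series: (S, w) = 2^{|w|_a}

    prefixes = list(words(3))
    suffixes = list(words(3))
    H = np.array([[coeff(u + v) for v in suffixes] for u in prefixes], dtype=float)
    print("minimal realization dimension = rank H =", np.linalg.matrix_rank(H))  # 1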

    Assessment of the focal hepatic lesions using diffusion tensor magnetic resonance imaging

    The goal is to assess the efficiency of the diffusion magnetic resonance imaging (dMRI) method in characterizing focal hepatic lesions (FHLs). Twenty-eight FHL patients were studied in the Radiology and Clinical Imaging Department of our University Hospital using a 1.5 Tesla MRI system between January 2010 and June 2011. Patients underwent hepatic MRI consisting of dynamic T1- and T2-weighted imaging. The dMRI was performed with b-values of 200 s/mm² and 600 s/mm². Forty-two lesions measuring more than 1 cm were studied, including the variation of the signal according to the b-value and the apparent diffusion coefficient (ADC). The diagnostic imaging reference was based on standard MRI data for typical lesions and on histology after surgical biopsy for atypical lesions. Thirty-eight lesions were assessed, including 13 benign lesions consisting of 1 focal nodular hyperplasia, 8 angiomas, and 4 cysts; the 25 malignant lesions included 11 hepatocellular carcinomas, 9 hepatic metastases, 1 cholangiocarcinoma, and 4 lymphomas. dMRI of soft lesions demonstrated a higher ADC of 2.26 ± 0.75 × 10⁻³ mm²/s, whereas solid lesions showed a lower ADC of 1.19 ± 0.33 × 10⁻³ mm²/s, a significant difference (P = 0.05). Discrete collections of values were noticed. These results were correlated with standard MRI and histological findings. A sensitivity of 93% and a specificity of 84% were found in the diagnosis of malignant tumors with an ADC threshold of 1.6 × 10⁻³ mm²/s. dMRI is an important method for characterizing FHLs; however, it should not be used as the sole criterion of hepatic lesion malignancy, and MRI, clinical, and biological data must be correlated. A significant difference was found between benign and solid malignant lesions, but without clear-cut threshold ADC values it is difficult to confirm an ADC threshold that differentiates the lesion classes.
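
    For context, the ADC values reported above follow from the signal decay between the two b-values under the usual mono-exponential model; the computation below uses illustrative signal intensities, not measurements from the study.

    # ADC from two b-values, assuming S(b) = S0 * exp(-b * ADC), so that
    # ADC = ln(S(b1) / S(b2)) / (b2 - b1).
    import math

    b1, b2 = 200.0, 600.0    # s/mm^2, the b-values used in the study
    s1, s2 = 820.0, 510.0    # example signal intensities (assumed)
    adc = math.log(s1 / s2) / (b2 - b1)
    print(f"ADC = {adc * 1e3:.2f} x 10^-3 mm^2/s")   # about 1.19, in the solid-lesion range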