
    Complex adaptive systems based data integration : theory and applications

    Data Definition Languages (DDLs) have been created and used to represent data in programming languages and in database dictionaries. This representation includes descriptions in the form of data fields and relations in the form of a hierarchy, with the common exception of relational databases where relations are flat. Network computing created an environment that enables relatively easy and inexpensive exchange of data. What followed was the creation of new DDLs claiming better support for automatic data integration. It is uncertain from the literature if any real progress has been made toward achieving an ideal state or limit condition of automatic data integration. This research asserts that difficulties in accomplishing integration are indicative of socio-cultural systems in general and are caused by some measurable attributes common in DDLs. This research's main contributions are: (1) a theory of data integration requirements to fully support automatic data integration from autonomous heterogeneous data sources; (2) the identification of measurable related abstract attributes (Variety, Tension, and Entropy); (3) the development of tools to measure them. The research uses a multi-theoretic lens to define and articulate these attributes and their measurements. The proposed theory is founded on the Law of Requisite Variety, Information Theory, Complex Adaptive Systems (CAS) theory, Sowa's Meaning Preservation framework and Zipf distributions of words and meanings. Using the theory, the attributes, and their measures, this research proposes a framework for objectively evaluating the suitability of any data definition language with respect to degrees of automatic data integration. This research uses thirteen data structures constructed with various DDLs from the 1960s to date. No DDL examined (and therefore no DDL similar to those examined) is designed to satisfy the Law of Requisite Variety.
No DDL examined is designed to support CAS evolutionary processes that could result in fully automated integration of heterogeneous data sources. There is no significant difference in measures of Variety, Tension, and Entropy among the DDLs investigated in this research. A direction for overcoming the common limitations discovered in this research is suggested and tested by proposing GlossoMote, a theoretical, mathematically sound description language that satisfies the data integration theory requirements. GlossoMote is not merely a new syntax; it is a drastic departure from existing DDL constructs. The feasibility of the approach is demonstrated with a small-scale experiment and evaluated using the proposed assessment framework and other means. The promising results warrant additional research to evaluate the commercial potential of GlossoMote's approach.
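    The Entropy attribute mentioned above can be made concrete: treating a DDL document as a stream of tokens, Shannon entropy over the token frequency distribution gives one such measure. A minimal sketch, assuming this standard information-theoretic formulation (the function name and the toy token set are illustrative, not the dissertation's actual measurement tool):

```python
from collections import Counter
from math import log2

def shannon_entropy(tokens):
    """Shannon entropy (bits per symbol) of a token stream's frequency distribution."""
    counts = Counter(tokens)
    total = sum(counts.values())
    # H = -sum(p_i * log2(p_i)) over all distinct tokens
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical tokens from a toy relational DDL fragment
ddl_tokens = ["CREATE", "TABLE", "id", "INT", "name", "VARCHAR", "id", "INT"]
h = shannon_entropy(ddl_tokens)  # higher h = more varied vocabulary
```

    A uniform two-symbol stream gives exactly 1 bit per symbol; a single repeated symbol gives 0, so the measure orders DDL samples by vocabulary spread.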

    Design and realization of a network security model

    The security of information is a key problem in the development of network technology. The basic requirements of information security include confidentiality, integrity, authentication, and non-repudiation. This paper proposes a network security model composed of a security system, secure connection and communication, and key management. The model carries out encryption, decryption, and signing, and ensures confidentiality, integrity, authentication, and non-repudiation. Finally, the paper analyses the merits of the model.

    Survey on XML encryption

    Every transaction on the Internet involves some kind of data, which can be transferred in various modes. Nowadays, XML is widely used for transferring and storing data, so there must be a mechanism to protect it. In most of the literature, the two most important techniques for securing XML data are XML Signature and XML Encryption. These techniques provide signing and encryption of XML data using cryptographic primitives, and their results are also represented in XML format. Both are worldwide standards released by the W3C. This thesis focuses on XML Encryption. In this study, W3C standards are used to encrypt sensitive XML data. JavaScript is used to implement the encryption, with Node.js as the runtime platform. The elapsed time for encryption and decryption is also measured. AES and Triple DES are used as symmetric algorithms for encrypting the XML data, and RSA is used to encrypt the symmetric key. The "xml-encryption" library is used for both encryption and decryption. A time analysis of encryption and decryption is also presented graphically.
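    The core idea of XML Encryption described above is that the ciphertext itself is wrapped in XML: an EncryptedData element carrying a base64 CipherValue, per the W3C schema. A minimal structural sketch, with a deliberately toy XOR keystream cipher standing in for AES/Triple DES (NOT secure, purely illustrative; the function names are hypothetical, and real implementations should use the "xml-encryption" library or equivalent):

```python
import base64
import hashlib
import xml.etree.ElementTree as ET

XENC = "http://www.w3.org/2001/04/xmlenc#"  # W3C XML Encryption namespace

def toy_keystream_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher (NOT secure) standing in for AES/Triple DES."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def encrypt_to_xml(plaintext: str, key: bytes) -> str:
    """Wrap the ciphertext in an EncryptedData/CipherData/CipherValue tree."""
    ct = toy_keystream_cipher(plaintext.encode(), key)
    enc = ET.Element(f"{{{XENC}}}EncryptedData")
    cipher_data = ET.SubElement(enc, f"{{{XENC}}}CipherData")
    cipher_value = ET.SubElement(cipher_data, f"{{{XENC}}}CipherValue")
    cipher_value.text = base64.b64encode(ct).decode()
    return ET.tostring(enc, encoding="unicode")

def decrypt_from_xml(xml_doc: str, key: bytes) -> str:
    """Extract the CipherValue and reverse the (symmetric) toy cipher."""
    node = ET.fromstring(xml_doc).find(f".//{{{XENC}}}CipherValue")
    return toy_keystream_cipher(base64.b64decode(node.text), key).decode()
```

    The point of the sketch is the round trip: the output of encryption is itself well-formed XML, which is what lets encrypted fragments be embedded inside larger XML documents.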

    FLBP: A Federated Learning-enabled and Blockchain-supported Privacy-Preserving of Electronic Patient Records for the Internet of Medical Things

    The evolution of computing paradigms and the Internet of Medical Things (IoMT) have transformed the healthcare sector, accompanied by an alarming rise in privacy issues around healthcare records. The rapid growth of medical data raises privacy and security concerns about protecting the confidentiality and integrity of the data in feature-loaded infrastructure and applications. Moreover, sharing a patient's medical records among hospitals raises security and interoperability issues. This article therefore proposes a Federated Learning-and-Blockchain-enabled framework to protect electronic medical records from unauthorized access, using a deep learning technique, an Artificial Neural Network (ANN), for a collaborative IoMT-Fog-Cloud environment. The ANN is used to identify insiders and intruders. An Elliptic Curve Digital Signature (ECDS) algorithm is adopted to devise a secure Blockchain-based validation method. A Blockchain-based Health Record Sharing (BHRS) scheme is implemented to counter malicious propagation. In addition, a Federated Learning (FL) approach is integrated into the Blockchain for scalable applications, forming a global model without the need to share and store raw data in the Cloud. Simulations show that the proposed model improves operational cost and communication (latency) overhead by 85.2% and 62.76%, respectively. The results showcase the utility and efficacy of the proposed model.
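    The FL component described above rests on federated averaging: each hospital trains locally, and only model weights, never raw patient records, are aggregated into the global model. A minimal framework-free sketch of that aggregation step (weighting each client by its local sample count is the standard FedAvg convention, an assumption here, not necessarily this paper's exact scheme):

```python
def federated_average(client_updates):
    """FedAvg aggregation: sample-weighted mean of client weight vectors.

    client_updates: list of (weights, n_samples) pairs, one per hospital/IoMT
    site. Raw records never leave the clients; only these vectors are shared.
    """
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    return [
        sum(w[i] * n for w, n in client_updates) / total
        for i in range(dim)
    ]

# Hypothetical updates from two hospitals: (local weights, local sample count)
updates = [([0.2, 0.4], 100), ([0.4, 0.8], 300)]
global_weights = federated_average(updates)
```

    The hospital with three times the data pulls the global weights three times as hard, which is what makes the aggregate behave like training on the pooled (but never centralized) dataset.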

    Performance Development for Securing the Data Sharing Services in Cloud Storage using Hybrid Encryption

    Information sharing now reaches large numbers of users, especially end clients. Organizations prefer well-known and cost-effective cloud-based services to share information with clients and partners, as well as with insider users. This kind of service improves data availability and I/O performance by creating and distributing replicas of the shared data. However, such a strategy increases storage and network resource usage. Organizations now have the option to outsource their massive data to the cloud without worrying about data size or memory capacity. However, moving confidential and sensitive data away from the trusted domain of the data owners and sharing it through the public cloud introduces various security and privacy risks. Moreover, the increasing amount of big data outsourced to the cloud increases the potential for breaches of the privacy and security of these data. Despite all the research that has been done in this area, big data storage security and privacy remain among the main concerns of organizations adopting cloud computing and big data technologies.

    Community Technology? Issues in Computer-Supported Work

    In this paper I wish to discuss a number of issues concerning work practices, especially communication and cooperation among people, and examine how we can use the computer as a tool and/or medium for supporting such group activities. The intent is not to substitute computer-mediated for face-to-face or other forms of communication, but rather to discover if there are additional possibilities that may be afforded us through use of computing technology. My emphasis is not with the technology per se, but with people, their needs and activities. My focus is on how we can augment human capabilities through use of the technology, rather than on how to simulate or replace labour processes with machines. I believe, along with Rosenbrock and many others, that our present-day utilization of information technology in work has tended to restrict, rather than expand, human potential. This is not due solely to the nature of the technology itself, although it is not a neutral element, but also to the organization of work around the technology, and the general socio-economic and political rationale within our society which develops these machines and industrial systems. The paper does not present a carefully compiled rationale for an alternative technology, or an argument for the construction of new "widgets", but consists of a number of observations, reviews of research, experiences with current technologies, and speculations about possible future uses of technology in promoting communication between people. The intent is to sharpen our understanding of everyday activities, and open up alternative paths for future design of support technology. Reaction in the form of supportive or negative examples of technology use in group settings is particularly welcome from readers.

    Blockchain-Based E-Certificate Verification and Validation Automation Architecture to Avoid Counterfeiting of Digital Assets in Order to Accelerate Digital Transformation

    The security and confidentiality of data are very important for institutions. Meanwhile, data fabrication and the falsification of official documents remain common. Validating the authenticity of documents such as certificates is a challenge for various parties, especially those who must make decisions based on a document's validity. Scan-based signatures on printed and digital documents are still relatively easy to counterfeit, yet difficult to distinguish from the original. The traditional approach is no longer reliable. Solving these problems requires data security techniques and fast, seamless online verification of the authenticity of printed documents and e-certificates. The objective of the study is to model the e-certificate verification process using blockchain with a proof-of-stake consensus method and MD5 hashing. The data or identity listed on the e-certificate is secured with an embedded digital signature in the form of a QR code and can be verified online. A combination of technologies capable of suppressing or eliminating the counterfeiting of digital assets will accelerate digital transformation across all spectrums of modern life. The resulting architectural model can be used as a starting point for implementing a blockchain-based e-certificate verification and validation automation system.
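    The verification step described above amounts to computing an MD5 digest over the certificate's fields and comparing it against the value anchored on-chain (and encoded in the QR code). A minimal sketch, assuming a simple canonical-JSON field layout that is illustrative, not the paper's schema (note that MD5 is a hash, not encryption, and is considered cryptographically weak today):

```python
import hashlib
import json

def certificate_fingerprint(cert: dict) -> str:
    """MD5 digest over the certificate's canonicalized fields.

    This is the value that would be embedded in the QR code and anchored
    on-chain at issuance; the field layout here is hypothetical.
    """
    canonical = json.dumps(cert, sort_keys=True).encode()
    return hashlib.md5(canonical).hexdigest()

def verify_certificate(cert: dict, on_chain_digest: str) -> bool:
    """A presented certificate is genuine iff its digest matches the chain."""
    return certificate_fingerprint(cert) == on_chain_digest

cert = {"holder": "Jane Doe", "degree": "BSc", "issued": "2021-07-01"}
digest = certificate_fingerprint(cert)           # stored on-chain at issuance
assert verify_certificate(cert, digest)          # genuine certificate passes
tampered = dict(cert, holder="John Doe")
assert not verify_certificate(tampered, digest)  # altered certificate fails
```

    Any single-character change to a field produces a different digest, which is what makes tampering with an issued e-certificate detectable against the immutable on-chain record.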

    Data retrieval based on the smart contract within the blockchain

    Blockchain technology appears to be an ideal solution for storing data in a transparent and decentralized manner. It also allows open access to data and makes it immutable. The technology has proved its usefulness in several industries so far; however, a distributed ledger does not work as a pure database, so problems occur in accessing data. Querying data in the blockchain leads to performance and bandwidth problems, primarily because the blockchain, unlike regular databases, has no native query language. The distributed nature of the blockchain is in this case an obstacle. In this paper, a safe and fast method is proposed to retrieve consistent data from the blockchain, based on a smart contract that is opened after the transaction procedures are completed. All nodes sign the proposed transaction by adding a special hash to each node, derived from the transaction information and the node's data. Upon completion of the transaction procedures, a smart contract is opened in which a QR code is placed, generated by converting the signatures in the transaction into a QR code. When the smart contract data is retrieved, the QR code of each transaction is used, and all node signatures and transaction data are extracted. The data is retrieved via the QR code generated for each transaction after it has been stored on the servers of all nodes participating in the system. A new method is proposed to generate a hash for each node in the system. The proposed method was tested in terms of time and complexity, the algorithm was statistically analyzed, and the results were successful.
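    The per-node signing step described above can be sketched as hashing the transaction payload together with each node's identifier; the list of resulting hashes is what would then be encoded into the smart contract's QR code, and retrieval recomputes and compares them. A minimal sketch (node IDs, the field separator, and the use of SHA-256 are assumptions for illustration, not the paper's exact algorithm):

```python
import hashlib

def node_signature(tx_data: str, node_id: str) -> str:
    """One node's special hash: transaction info combined with node data."""
    return hashlib.sha256(f"{tx_data}|{node_id}".encode()).hexdigest()

def sign_transaction(tx_data: str, node_ids: list) -> list:
    """Every participating node signs the proposed transaction."""
    return [node_signature(tx_data, nid) for nid in node_ids]

def verify_retrieval(tx_data: str, node_ids: list, signatures: list) -> bool:
    """On retrieval, recompute each node's hash and compare with the stored set."""
    return sign_transaction(tx_data, node_ids) == signatures

nodes = ["node-A", "node-B", "node-C"]
sigs = sign_transaction("payment:42", nodes)  # these would feed the QR code
assert verify_retrieval("payment:42", nodes, sigs)
assert not verify_retrieval("payment:43", nodes, sigs)
```

    Because each signature binds the transaction data to a specific node, a retrieved record is accepted only if it is consistent across every participating node's stored copy.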

    The IPTS Report No. 57, September 2001
