
    Generic adaptation framework for unifying adaptive web-based systems

    The Generic Adaptation Framework (GAF) research project first and foremost creates a common formal framework for describing current and future adaptive hypermedia systems (AHS) and adaptive web-based systems in general. It provides a commonly agreed upon taxonomy and a reference model that encompasses the most general architectures of the present and future, including conventional AHS and different types of personalization-enabling systems and applications such as recommender systems (RS), personalized web search, semantic-web-enabled applications used in personalized information delivery, adaptive e-Learning applications, and many more. At the same time, GAF tries to bring together two (seemingly non-intersecting) views on adaptation: the classical pre-authored type, with conventional domain and overlay user models, and data-driven adaptation, which draws on a set of data mining, machine learning, and information retrieval tools. To bring these research fields together we conducted a number of GAF compliance studies covering RS, AHS, and other applications combining adaptation, recommendation, and search. We also performed a number of case studies of real systems to prove the point and to carry out a detailed analysis and evaluation of the framework. Secondly, GAF introduces a number of new ideas in the field of adaptive hypermedia, such as the Generic Adaptation Process (GAP), which aligns with a layered (data-oriented) architecture and serves as a reference adaptation process; this also helps to understand the compliance features mentioned earlier. Besides that, GAF deals with important and novel aspects of adaptation-enabling and leveraging technologies such as provenance and versioning. The existence of such a reference basis should stimulate AHS research and enable researchers to demonstrate ideas for new adaptation methods much more quickly than if they had to start from scratch. GAF will thus help bootstrap any adaptive web-based system research, design, analysis, and evaluation.
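
    A minimal sketch of the classical, pre-authored adaptation style that GAF seeks to reconcile with data-driven techniques: a domain model of concepts with prerequisites, an overlay user model holding per-concept knowledge estimates, and a simple selection rule. The class and function names below are illustrative assumptions, not part of the GAF reference model.

        # Minimal sketch of classical overlay-based adaptation (illustrative only;
        # the names and rules below are not taken from the GAF reference model).
        from dataclasses import dataclass, field

        @dataclass
        class Concept:
            name: str
            prerequisites: list = field(default_factory=list)

        @dataclass
        class OverlayUserModel:
            # Per-concept knowledge estimate in [0, 1], the classic overlay idea.
            knowledge: dict = field(default_factory=dict)

            def knows(self, concept: Concept, threshold: float = 0.6) -> bool:
                return self.knowledge.get(concept.name, 0.0) >= threshold

        def recommend(concepts, user: OverlayUserModel):
            """Return concepts whose prerequisites are known but that are not yet mastered."""
            ready = []
            for c in concepts:
                if user.knows(c):
                    continue
                if all(user.knows(p) for p in c.prerequisites):
                    ready.append(c.name)
            return ready

        # Example: a tiny domain model with one prerequisite relation.
        html = Concept("html")
        css = Concept("css", prerequisites=[html])
        user = OverlayUserModel(knowledge={"html": 0.8, "css": 0.2})
        print(recommend([html, css], user))  # -> ['css']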

    The Estonian electronic ID card and its security challenges

    For more than 18 years, the Estonian electronic identity card (ID card) has provided a secure electronic identity for Estonian residents. Public-key cryptography and the private keys stored on the card enable Estonian ID card holders to access e-services, give legally binding digital signatures, and even cast an i-vote in national elections. This work provides a comprehensive study of the Estonian ID card and its security challenges. We introduce the Estonian ID card and its ecosystem by describing the involved parties and processes, the core electronic functionality of the ID card, related technical and legal concepts, and other related issues. We describe the smart card chip platforms used over the years and the identity document types that have been issued on these platforms. For each platform, we present a detailed analysis of the asymmetric cryptography functionality it provides, together with a description and security analysis of the remote update solutions offered for it. As yet another contribution of this work, we present a systematic study of the security incidents and similar issues the Estonian ID card has experienced over the years. We describe the technical nature of each issue, the mitigation measures applied, and the coverage the issue received in the media. In the course of this research, several previously unknown security issues were discovered and reported to the involved parties. The research is based on publicly available documentation, a collected database of ID card certificates in circulation, media coverage, direct communication with the involved parties, and our own analysis and experiments. https://www.ester.ee/record=b541416
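
    As a rough illustration of the card's public-key functionality (not the tooling used in this work), the sketch below verifies a signature against the public key embedded in an ID card certificate using the Python cryptography package. The file names are hypothetical, and the ECDSA/SHA-256 choice is an assumption; recent card platforms use elliptic-curve keys, while older ones used RSA.

        # Illustrative sketch: verify a signature against the public key found in an
        # ID card certificate. File names, curve, and hash are assumptions.
        from cryptography import x509
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import ec
        from cryptography.exceptions import InvalidSignature

        def verify_ecdsa(cert_pem: bytes, signed_data: bytes, signature: bytes) -> bool:
            cert = x509.load_pem_x509_certificate(cert_pem)
            public_key = cert.public_key()  # an elliptic-curve key is assumed here
            try:
                public_key.verify(signature, signed_data, ec.ECDSA(hashes.SHA256()))
                return True
            except InvalidSignature:
                return False

        # Usage with hypothetical files:
        # ok = verify_ecdsa(open("auth_cert.pem", "rb").read(),
        #                   open("challenge.bin", "rb").read(),
        #                   open("challenge.sig", "rb").read())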

    Electro-mechanical whole-heart digital twins: A fully coupled multi-physics approach

    Mathematical models of the human heart are evolving to become a cornerstone of precision medicine and support clinical decision making by providing a powerful tool to understand the mechanisms underlying pathophysiological conditions. In this study, we present a detailed mathematical description of a fully coupled multi-scale model of the human heart, including electrophysiology, mechanics, and a closed-loop model of circulation. State-of-the-art models based on human physiology are used to describe membrane kinetics, excitation-contraction coupling, and active tension generation in the atria and the ventricles. Furthermore, we highlight ways to adapt this framework to patient-specific measurements in order to build digital twins. The validity of the model is demonstrated through simulations on a personalized whole-heart geometry based on magnetic resonance imaging data of a healthy volunteer. Additionally, the fully coupled model was employed to evaluate the effects of a typical atrial ablation scar on the cardiovascular system. With this work, we provide an adaptable multi-scale model that allows comprehensive personalization from the ion channel to the organ level, enabling digital twin modeling.
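
    For orientation, a generic electro-mechanical coupling of the kind described above can be written (in standard notation, not necessarily the paper's exact formulation) as a monodomain reaction-diffusion model driving an active-stress term in the quasi-static mechanical balance:

        % Generic monodomain + active-stress coupling; standard notation,
        % not taken verbatim from the paper.
        \begin{align}
          \chi\Bigl(C_m \frac{\partial V_m}{\partial t} + I_{\mathrm{ion}}(V_m,\mathbf{w})\Bigr)
            &= \nabla \cdot \bigl(\boldsymbol{\sigma}_m \nabla V_m\bigr) + I_{\mathrm{stim}}, \\
          \frac{\mathrm{d}\mathbf{w}}{\mathrm{d}t} &= \mathbf{g}(V_m,\mathbf{w}), \\
          \nabla \cdot \boldsymbol{\sigma} &= \mathbf{0}, \qquad
          \boldsymbol{\sigma} = \boldsymbol{\sigma}_{\mathrm{passive}}
            + T_a(V_m,\lambda)\,\mathbf{f}\otimes\mathbf{f}
        \end{align}

    Here \(V_m\) is the transmembrane voltage, \(\mathbf{w}\) collects the membrane-model state variables, \(\boldsymbol{\sigma}_m\) is the monodomain conductivity tensor, \(T_a\) is the active tension acting along the current fibre direction \(\mathbf{f}\), and \(\lambda\) is the fibre stretch; the closed-loop circulation model enters through pressure boundary conditions on the cardiac cavities.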

    The integrity of digital technologies in the evolving characteristics of real-time enterprise architecture

    Advancements in interactive and responsive enterprises involve real-time access to information and to the capabilities of emerging technologies. Digital technologies (DTs) are emerging technologies that provide end-to-end business processes (BPs), engage a diversified set of real-time enterprise (RTE) participants, and institute interactive DT services. This thesis offers a selection of the author's work over the last decade that addresses real-time access to changing characteristics of information and the integration of DTs, both of which are critical for RTEs to run a competitive business and respond to a dynamic marketplace. The primary contributions of this work are listed below.
    • Performed an intense investigation to illustrate the challenges the RTE faces as DTs and the corresponding business operations advance.
    • Constituted a practical approach to continuously evolve RTEs and measure the impact of DTs by developing, instrumenting, and inferring the standardized RTE architecture and DTs.
    • Established the RTE operational governance framework and instituted it to provide structure, oversight responsibilities, features, and interdependencies of business operations.
    • Formulated the incremental risk (IR) modeling framework to identify and correlate the evolving risks of RTEs during the deployment of DT services.
    • Derived a DT service classification scheme based on BPs, BP activities, DT paradigms, RTE processes, and RTE policies.
    • Identified and assessed the evaluation paradigms of RTEs to measure the progress of the RTE architecture based on the DT service classifications.
    The starting point was the author's experience with evolving aspects of DTs that are disrupting industries and consequently impacting the sustainability of the RTE. The initial publications emphasized the innovative characteristics of DTs and the lack of standardization, indicating that the impact and adaptation of DTs are questionable for RTEs. The publications focus on developing different elements of the RTE architecture. Each published work concerns the creation of an RTE architecture framework fit for the purpose of business operations in association with DT services and their capabilities. The RTE operational governance framework and incremental risk methodology presented in subsequent publications ensure the continuous evolution of the RTE as DTs advance. Eventually, each publication presents evaluation paradigms based on the identified DT service classification scheme to measure the success of the RTE architecture or of its corresponding elements.
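
    Purely as an illustration of the classification dimensions listed above (the attribute names are assumptions, not the thesis' actual schema), a DT service classification record might look like the following sketch:

        # Illustrative only: a toy representation of a DT service classification record.
        # The attribute names mirror the dimensions listed in the abstract, not an
        # actual schema from the thesis.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class DTServiceClassification:
            service_name: str
            business_processes: List[str] = field(default_factory=list)
            bp_activities: List[str] = field(default_factory=list)
            dt_paradigms: List[str] = field(default_factory=list)  # e.g. IoT, analytics
            rte_processes: List[str] = field(default_factory=list)
            rte_policies: List[str] = field(default_factory=list)

            def coverage(self) -> int:
                """Crude score: how many classification dimensions are populated."""
                dims = (self.business_processes, self.bp_activities, self.dt_paradigms,
                        self.rte_processes, self.rte_policies)
                return sum(1 for d in dims if d)

        svc = DTServiceClassification(
            service_name="order-tracking",
            business_processes=["order-to-cash"],
            dt_paradigms=["IoT", "analytics"],
            rte_policies=["data-retention"])
        print(svc.coverage())  # -> 3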

    A service-orientated architecture for adaptive and collaborative e-learning systems

    This research proposes a new architecture for Adaptive Educational Hypermedia Systems (AEHS). Architecture in the context of this thesis refers to the components of the system and their communications and interactions. The proposed architecture addresses the limitations of AEHS regarding interoperability, reusability, openness, flexibility, and the limited tools for collaborative and social learning, and presents an integrated adaptive and collaborative Web-based learning environment. The new e-learning environment is implemented as a set of independent Web services within a service-oriented architecture (SOA). Moreover, it uses a modern Learning Management System (LMS) as the delivery service and user interface for this environment. This is a two-way solution, whereby adaptive learning is introduced via a widely adopted LMS, and the LMS itself is enriched with an external, yet integrated, adaptation layer. To test the relevance of the new architecture, practical experiments were undertaken. The interoperability, reusability, and openness tests revealed that users could easily switch between various LMSs to access the personalised lessons. In addition, the system was tested by students at the University of Nottingham as a revision guide for a Software Engineering module. This test showed that the system was robust; it automatically handled a large number of students and produced the desired adaptive content. However, regarding the use of the collaborative learning tools, the test showed low levels of usage.
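
    As a hedged sketch of the service-oriented split described above (not the thesis' implementation), the adaptation layer can be exposed as a small independent web service that an LMS calls to fetch a personalised page for a learner; the route, payload, and selection rule below are assumptions.

        # Illustrative sketch of an external adaptation service an LMS could call.
        # Endpoint name, payload shape, and the selection rule are assumptions.
        from flask import Flask, jsonify, request

        app = Flask(__name__)

        # Toy user model and content store; a real service would query persistent stores.
        USER_MODEL = {"alice": {"level": "beginner"}, "bob": {"level": "advanced"}}
        LESSONS = {
            "requirements": {"beginner": "intro-to-requirements.html",
                             "advanced": "formal-specification.html"},
        }

        @app.route("/adapt/<topic>")
        def adapt(topic):
            user = request.args.get("user", "")
            level = USER_MODEL.get(user, {}).get("level", "beginner")
            page = LESSONS.get(topic, {}).get(level)
            if page is None:
                return jsonify({"error": "no content for topic"}), 404
            return jsonify({"topic": topic, "user": user, "page": page})

        if __name__ == "__main__":
            # The LMS would call e.g. GET /adapt/requirements?user=alice
            app.run(port=5001)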

    Designing Incentives Enabled Decentralized User Data Sharing Framework

    Data sharing practices are much needed to strike a balance between user privacy, user experience, and profit. Different parties collect user data, for example companies offering apps, social networking sites, and others, whose primary motive is an enhanced business model while giving optimal services to end-users. However, the collection of user data is associated with serious privacy and security issues. The sharing platform also needs an effective incentive mechanism to realize transparent access to the user data while distributing fair incentives. The emerging literature on the topic includes decentralized data sharing approaches. However, there was no universal method to track who shared what, to whom, when, for what purpose, and under what condition in a verifiable manner until distributed ledger technologies emerged as the most effective means for designing a decentralized peer-to-peer network. This Ph.D. research takes an engineering approach to specifying the operations for designing incentive-enabled, user-controlled data-sharing platforms. The thesis presents a series of empirical studies and proposes a novel blockchain- and smart-contract-based DUDS (Decentralized User Data Sharing) framework conceptualizing user-controlled data sharing practices. The DUDS framework supports immutability, authenticity, enhanced security, and trusted records, and is a promising means to share user data in various domains, including data sharing among researchers, customer data in e-commerce, tourism applications, and more. The DUDS framework is evaluated via performance analyses and user studies. An extended Technology Acceptance Model and a Trust-Privacy-Security Model are used to evaluate the usability of the DUDS framework. The evaluation uncovers the role of different factors affecting user intention to adopt data-sharing platforms. The results point to guidelines and methods for embedding privacy, user transparency, control, and incentives into the design of a data-sharing framework from the start, providing a platform that users can trust to protect their data while allowing them to control it and share it in the ways they want.
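
    As a loose illustration of the verifiable "who shared what, to whom, when, for what purpose and under what condition" record-keeping that the framework targets (a toy stand-in, not the DUDS smart contracts), the sketch below hash-chains data-sharing events so that later tampering is detectable:

        # Illustrative sketch: an append-only, hash-chained log of data-sharing events.
        # This is a toy stand-in for the ledger/smart-contract layer, not the DUDS code.
        import hashlib
        import json
        import time

        class SharingLog:
            def __init__(self):
                self.entries = []  # each entry carries the hash of its predecessor

            def record(self, owner, recipient, data_ref, purpose, condition):
                prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
                body = {
                    "owner": owner, "recipient": recipient, "data_ref": data_ref,
                    "purpose": purpose, "condition": condition,
                    "timestamp": time.time(), "prev_hash": prev_hash,
                }
                body["hash"] = hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest()
                self.entries.append(body)
                return body["hash"]

            def verify(self) -> bool:
                """Recompute every hash; returns False if any entry was altered."""
                prev = "0" * 64
                for e in self.entries:
                    body = {k: v for k, v in e.items() if k != "hash"}
                    if e["prev_hash"] != prev:
                        return False
                    if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != e["hash"]:
                        return False
                    prev = e["hash"]
                return True

        log = SharingLog()
        log.record("alice", "research-lab", "steps-2023.csv", "health study", "expires-2025")
        print(log.verify())  # -> True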