    Short-Term History Based Authentication Using Smartphone Sensors and Apps

    Secret-QA is a security-based application. A security question is asked when the user fails to log in, or forgets his/her password and needs to reset it. A security question is a form of secondary authentication, but such questions can easily be guessed by an acquaintance, or exposed to strangers who might know our personal information: friends, relatives, or those who can access our personal information through public online tools. We can make these security questions more personalized by using the privilege of accessing our smartphone's sensors and apps, without affecting the user's privacy. Three types of questions are generated: Yes/No questions, multiple-choice questions, and WH questions. An Android app is created to frequently update the smartphone data, and the questions are asked in the web component using this updated data. The questions are generated by considering legacy app usage, GPS, the calendar, app details, etc. These have the highest memorability for users and high robustness against attacks.
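
    To make the question-generation step concrete, here is a minimal Python sketch of how the three question types might be derived from a snapshot of recent smartphone activity. The data layout and helper functions are hypothetical illustrations, not code from the paper.

        import random

        # Hypothetical snapshot of recent smartphone activity, in the shape an
        # Android companion app might upload. Field names are illustrative only.
        usage = {
            "apps_opened": ["WhatsApp", "Spotify", "Maps"],
            "last_call_contact": "Alice",
            "visited_places": ["Gym", "Cafe Luna"],
        }

        def yes_no_question(data):
            """Yes/No question about a single recent activity."""
            app = random.choice(data["apps_opened"])
            return f"Did you open {app} yesterday?", "Yes"

        def multiple_choice_question(data):
            """Multiple-choice question: the true place hidden among decoys."""
            answer = random.choice(data["visited_places"])
            options = [answer, "Airport", "Library"]
            random.shuffle(options)
            return f"Which of these places did you visit yesterday? {options}", answer

        def wh_question(data):
            """WH question drawn from the call log."""
            return "Who did you call most recently?", data["last_call_contact"]

        for generate in (yes_no_question, multiple_choice_question, wh_question):
            question, answer = generate(usage)
            print(question, "->", answer)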

    Understanding Persistent-Memory Related Issues in the Linux Kernel

    Persistent memory (PM) technologies have inspired a wide range of PM-based system optimizations. However, building correct PM-based systems is difficult due to the unique characteristics of PM hardware. To better understand the challenges as well as the opportunities to address them, this paper presents a comprehensive study of PM-related issues in the Linux kernel. By analyzing 1,553 PM-related kernel patches in-depth and conducting experiments on reproducibility and tool extension, we derive multiple insights in terms of PM patch categories, PM bug patterns, consequences, fix strategies, triggering conditions, and remedy solutions. We hope our results could contribute to the development of robust PM-based storage systems. Comment: ACM Transactions on Storage (TOS '23).
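
    As a rough illustration of this style of patch study, the Python sketch below sorts kernel commit messages into coarse categories by keyword matching. The category names, keywords, and sample messages are simplified assumptions; the paper's actual taxonomy comes from manual in-depth analysis.

        # Toy keyword-based classifier for PM-related kernel patch messages.
        # Category names and keywords are simplified assumptions; the paper's
        # actual taxonomy is derived from manual in-depth analysis of patches.
        CATEGORIES = {
            "bug fix": ["fix", "bug", "crash", "corruption"],
            "new feature": ["add", "support", "introduce"],
            "maintenance": ["cleanup", "refactor", "rename"],
        }

        def classify_patch(commit_message: str) -> str:
            text = commit_message.lower()
            for category, keywords in CATEGORIES.items():
                if any(keyword in text for keyword in keywords):
                    return category
            return "other"

        patches = [  # invented examples in the style of libnvdimm/dax commits
            "libnvdimm: fix namespace corruption after failed flush",
            "dax: add support for synchronous page faults",
            "nvdimm: cleanup of label parsing code",
        ]
        for message in patches:
            print(f"{classify_patch(message):12} <- {message}")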

    Modern computing: Vision and challenges

    Over the past six decades, the computing systems field has experienced significant transformations, profoundly impacting society with landmark developments such as the Internet and the commodification of computing. Underpinned by technological advancements, computer systems, far from being static, have been continuously evolving and adapting to cover multifaceted societal niches. This has led to new paradigms such as cloud, fog, edge computing, and the Internet of Things (IoT), which offer fresh economic and creative opportunities. Nevertheless, this rapid change poses complex research challenges, especially in maximizing potential and enhancing functionality. As such, to maintain an economical level of performance that meets ever-tighter requirements, one must understand the drivers of new model emergence and expansion, and how contemporary challenges differ from past ones. To that end, this article investigates and assesses the factors influencing the evolution of computing systems, covering established systems and architectures as well as newer developments, such as serverless computing, quantum computing, and on-device AI on edge devices. Trends emerge when one traces the technological trajectory, including the rapid obsolescence of frameworks due to business and technical constraints, a move towards specialized systems and models, and varying approaches to centralized and decentralized control. This comprehensive review of modern computing systems looks ahead to the future of research in the field, highlighting key challenges and emerging trends, and underscoring their importance in cost-effectively driving technological progress.

    Privacy-Aware and Reliable Complex Event Processing in the Internet of Things - Trust-Based and Flexible Execution of Event Processing Operators in Dynamic Distributed Environments

    The Internet of Things (IoT) promises to be an enhanced platform for supporting a heterogeneous range of context-aware applications in the fields of traffic monitoring, healthcare, and home automation, to name a few. The essence of the IoT is in the inter-networking of distributed information sources and the analysis of their data to understand the interactions between the physical objects, their users, and their environment. Complex Event Processing (CEP) is a cogent paradigm to infer higher-level information from atomic event streams (e.g., sensor data in the IoT). Using functional computing modules called operators (e.g., filters, aggregates, sequencers), CEP provides an efficient and low-latency processing environment. Support for privacy and mobility in context processing is gaining immense importance in the age of the IoT. However, new mobile communication paradigms - like Device-to-Device (D2D) communication - that are inherent to the IoT must be enhanced to support a privacy-aware and reliable execution of CEP operators on mobile devices. It is crucial to preserve the differing privacy constraints of mobile users, while allowing for flexible and collaborative processing. Distributed mobile environments are also susceptible to adversary attacks, given the lack of sufficient control over the processing environment. Lastly, ensuring reliable and accurate CEP becomes a serious challenge due to the resource-constrained and dynamic nature of the IoT. In this thesis, we design and implement a privacy-aware and reliable CEP system that supports distributed processing of context data, by flexibly adapting to the dynamic conditions of a D2D environment. To this end, the main contributions, which form the key components of the proposed system, are three-fold: 1) We develop a method to analyze the communication characteristics of the users and derive the type and strength of their relationships. By doing so, we utilize the behavioral aspects of user relationships to automatically derive differing privacy constraints of the individual users. 2) We employ the derived privacy constraints as trust relations between users to execute CEP operators on mobile devices in a privacy-aware manner. In turn, we develop a trust management model called TrustCEP that incorporates a robust trust recommendation scheme to prevent adversary attacks and allow for trust evolution. 3) Finally, to account for reliability, we propose FlexCEP, a fine-grained flexible approach for CEP operator migration, such that the CEP system adapts to the dynamic nature of the environment. By extracting intermediate operator state and by leveraging device mobility and instantaneous characteristics, FlexCEP provides a flexible CEP execution model under varying network conditions. Overall, with the help of thorough evaluations of the above three contributions, we show how the proposed distributed CEP system can satisfy the requirements established above for a privacy-aware and reliable IoT environment.
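
    To make the operator notion concrete, the following minimal Python sketch chains two CEP operators, a filter and a sliding-window aggregate, over a toy event stream. The event schema and thresholds are invented for illustration and do not reflect the thesis's implementation.

        from collections import deque

        # Toy event stream of (timestamp, sensor reading) tuples.
        events = [(t, 20 + (t * 7) % 15) for t in range(10)]

        def filter_op(stream, threshold=25):
            """Filter operator: pass only readings above the threshold."""
            for timestamp, value in stream:
                if value > threshold:
                    yield timestamp, value

        def window_avg_op(stream, size=3):
            """Aggregate operator: sliding-window average over readings."""
            window = deque(maxlen=size)
            for timestamp, value in stream:
                window.append(value)
                yield timestamp, sum(window) / len(window)

        # Chain the operators into a pipeline, as a CEP engine would.
        for timestamp, average in window_avg_op(filter_op(events)):
            print(f"t={timestamp}: windowed average = {average:.1f}")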

    Design Space Exploration and Resource Management of Multi/Many-Core Systems

    The increasing demand for processing a growing number of applications and related data on computing platforms has resulted in reliance on multi-/many-core chips, as they facilitate parallel processing. However, there is a desire for these platforms to be energy-efficient and reliable, and they need to perform secure computations in the interest of the whole community. This book provides perspectives on the aforementioned aspects from leading researchers, in terms of state-of-the-art contributions and upcoming trends.

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as a Final Publication of the COST Action IC1406 “High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)” project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential to predict and analyse natural and complex systems in science and engineering. As their level of abstraction rises to allow a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. On the other hand, High Performance Computing typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. A seamless interaction of High Performance Computing with Modelling and Simulation is then arguably required to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    Analyzing Granger causality in climate data with time series classification methods

    Attribution studies in climate science aim to scientifically ascertain the influence of climatic variations on natural or anthropogenic factors. Many of those studies adopt the concept of Granger causality to infer statistical cause-effect relationships, while utilizing traditional autoregressive models. In this article, we investigate the potential of state-of-the-art time series classification techniques to enhance causal inference in climate science. We conduct a comparative experimental study of different types of algorithms on a large test suite that comprises a unique collection of datasets from the area of climate-vegetation dynamics. The results indicate that specialized time series classification methods are able to improve existing inference procedures. Substantial differences are observed among the methods that were tested.
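
    For readers unfamiliar with the autoregressive baseline, the following Python sketch runs a standard bivariate Granger-causality test on synthetic data using statsmodels. The series are fabricated purely to demonstrate the procedure and are not results from the article.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        # Fabricated series: y depends on lagged x, so x should be found
        # to "Granger-cause" y. Coefficients are arbitrary illustrations.
        rng = np.random.default_rng(0)
        n = 500
        x = rng.normal(size=n)
        y = np.zeros(n)
        for t in range(1, n):
            y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + rng.normal(scale=0.5)

        # Column order matters: the test asks whether the second column
        # helps predict the first beyond the first column's own lags.
        data = np.column_stack([y, x])
        results = grangercausalitytests(data, maxlag=2)

        # A small p-value on the F-test rejects "x does not Granger-cause y".
        p_value = results[1][0]["ssr_ftest"][1]
        print(f"lag-1 F-test p-value: {p_value:.4g}")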