
    Web2Touch 2019: Semantic Technologies for Smart Information Sharing and Web Collaboration

    This foreword introduces a summary of the themes and papers of the Web2Touch (W2T) 2019 Track at the 28th IEEE WETICE Conference, held in Capri in June 2019. W2T 2019 includes ten full papers and one short paper, all addressing relevant issues in information sharing for collaboration, including big data analytics, knowledge engineering, linked open data, applications of smart Web technologies, and smart care. The papers form a portfolio of hot issues in research and applications of semantics and smart technologies (e.g., IoT, sensors, devices for tele-monitoring, and smart content management), touching on crucial topics such as big data analysis, knowledge representation, and smart enterprise management, among others. The track shows how cooperative technologies based on knowledge representation, intelligent tools, and enhanced Web engineering can improve collaborative work through smart service design and delivery, thus contributing to a radical change in the role of the semantic Web and its applications.

    Educational Process Mining based on Moodle courses: a review of literature

    With the prevalence of e-learning, it is important to analyze how students progress in this environment. These systems collect data about the students' learning path, and Process Mining (PM) can provide a detailed model of this path. Based on the analysis of ten Educational Process Mining (EPM) case studies involving Moodle event logs, this article contributes a literature review of EPM research. Beyond a theoretical introduction to PM and its implications for educational data, the review identifies which PM tools and techniques are used, as well as the challenges faced in practice. The technical options covered include software, process discovery algorithms, and representation models. These results provide a list of available options for future EPM endeavors, together with a list of issues to consider in future research involving Moodle.
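    At the core of the process discovery techniques the review surveys is the directly-follows relation extracted from an event log. The sketch below builds a directly-follows graph from a hypothetical Moodle-style log; the activity names and log layout are illustrative, not the reviewed studies' actual data.

```python
from collections import defaultdict

# Hypothetical Moodle-style event log: one (case_id, activity) pair per event,
# already ordered by timestamp within each case. All names are illustrative.
event_log = [
    ("student_1", "view_course"), ("student_1", "open_quiz"), ("student_1", "submit_quiz"),
    ("student_2", "view_course"), ("student_2", "read_forum"), ("student_2", "open_quiz"),
    ("student_2", "submit_quiz"),
]

def directly_follows(log):
    """Count how often activity b directly follows activity a within a case."""
    traces = defaultdict(list)
    for case, activity in log:
        traces[case].append(activity)
    dfg = defaultdict(int)
    for activities in traces.values():
        for a, b in zip(activities, activities[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)

dfg = directly_follows(event_log)
print(dfg[("open_quiz", "submit_quiz")])  # 2: both students take this step
```

    Dedicated PM tools build richer models (Petri nets, BPMN) on top of exactly this relation; the counts on the arcs are what a discovered process map displays.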

    Safeguarding Privacy Through Deep Learning Techniques

    Over the last few years, the need to meet minimum security and privacy requirements has grown. Both public and private companies have had to comply with increasingly stringent standards, such as the ISO 27000 family, and with the various laws governing the management of personal data. The huge amount of data to be managed has demanded a great effort from employees who, in the absence of automatic techniques, have had to work tirelessly to achieve certification objectives. Unfortunately, given the sensitive information contained in the relevant documentation, it is difficult if not impossible to obtain material for research and study purposes on which to experiment with new ideas and techniques aimed at automating these processes, for instance by exploiting ongoing work in the scientific community on ontologies and artificial intelligence for data management. To bypass this problem, we examine data from the medical domain, which, largely for reasons related to the health of individuals, has gradually become more freely accessible over time. This choice does not affect the generality of the proposed methods, which can be reapplied to any field in which privacy-sensitive information must be managed.

    Emergency department performance assessment using administrative data: A managerial framework

    Administrative data play an important role in performance monitoring of healthcare providers. Nonetheless, little attention has so far been given to emergency department (ED) evaluation. In addition, most existing research focuses on a single core ED function, such as treatment or triage, thus providing a limited picture of performance. The goal of this study is to harness the value of routinely produced records by proposing a framework for multidimensional performance evaluation of EDs, able to support internal decision stakeholders in managing operations. Starting with an overview of administrative data and the definition of the desired framework characteristics from the perspective of decision stakeholders, a review of the academic literature on ED performance measures and indicators is conducted. A performance measurement framework is designed using 224 ED performance metrics (measures and indicators) satisfying established selection criteria. Real-world feedback on the framework is obtained through expert interviews. Metrics in the proposed ED performance measurement framework are arranged along three dimensions: performance (quality of care, time-efficiency, throughput), analysis unit (physician, disease, etc.), and time period (quarter, year, etc.). The framework has been judged by the key ED decision stakeholders of a teaching hospital as "clear and intuitive", "useful for planning", and able to "reveal inefficiencies in care process" and "transform existing data into decision support information". Administrative data can be a new cornerstone for healthcare operations management. A framework of ED-specific indicators based on administrative data enables multidimensional performance assessment in a timely and cost-effective manner, an essential requirement for today's resource-constrained hospitals. Moreover, such a framework can support different stakeholders' decision making, as it allows the creation of customized metric sets for performance analysis at the desired granularity.
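    The framework's three axes amount to computing each metric grouped along a chosen combination of analysis unit and time period. A minimal sketch of that slicing over invented administrative records (the field names and metric are assumptions, not the paper's actual schema):

```python
from collections import defaultdict

# Illustrative administrative records: (physician, quarter, length_of_stay_minutes).
visits = [
    ("dr_rossi", "2023Q1", 180), ("dr_rossi", "2023Q1", 240),
    ("dr_rossi", "2023Q2", 120), ("dr_bianchi", "2023Q1", 300),
]

def mean_los(records, by):
    """Average length of stay grouped along chosen dimensions.

    `by` selects the indices of the grouping keys: 0 = physician (analysis
    unit), 1 = quarter (time period), mirroring the framework's axes.
    """
    sums, counts = defaultdict(int), defaultdict(int)
    for rec in records:
        key = tuple(rec[i] for i in by)
        sums[key] += rec[2]
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

print(mean_los(visits, by=(0, 1)))  # per physician and quarter
print(mean_los(visits, by=(0,)))    # per physician, all periods pooled
```

    Choosing a different `by` tuple is what "customized metric sets at the desired granularity" means in practice: the same raw records serve every slice.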

    Visualizing the outcome of dynamic analysis of Android malware with VizMal

    Malware detection techniques based on signature extraction require security analysts to manually inspect samples to find evidence of malicious behavior. This time-consuming task has received little attention from researchers and practitioners, as most of the effort goes into classifying an entire sample as malware or non-malware. There are no tools that support the analyst in identifying when the malicious behavior occurs within a given sample. In this paper we propose VizMal, a tool that visualizes the execution traces of Android applications and highlights which portions of the traces correspond to potentially malicious behavior. The aim of VizMal is twofold: assisting the malware analyst during the inspection of an application, and pushing the research community to organize and focus its effort on malicious behavior localization. VizMal can discern whether the application behavior during each second of execution is legitimate or malicious and show this information in a simple and understandable way. We validate VizMal experimentally and by means of a user study: the results are promising and confirm that the tool can be useful.
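    The per-second classification above lends itself to a timeline view. As a rough illustration of the idea (not VizMal's actual rendering, which the paper describes graphically), one label per second of execution can be drawn as a compact bar:

```python
# Sketch of a VizMal-style trace timeline: one boolean label per second of
# execution, True meaning the second was classified as malicious. The labels
# here are invented; the real tool derives them from dynamic-analysis traces.
def render_trace(labels):
    """Return an ASCII bar: '.' = legitimate second, 'M' = malicious second."""
    return "".join("M" if malicious else "." for malicious in labels)

labels = [False, False, True, True, False, True]
print(render_trace(labels))  # ..MM.M
```

    Even this crude view lets an analyst jump straight to the suspicious seconds of a trace instead of reading it end to end, which is the localization problem the paper targets.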

    Blockchain-Based Transaction Management in Smart Logistics: A Sawtooth Framework

    Perboli, Guido; Capocasale, Vittorio; Gotta, Danilo

    Zone-based verification of timed automata: extrapolations, simulations and what next?

    Timed automata were introduced by Rajeev Alur and David Dill in the early 90's. In the last decades, timed automata have become the de facto model for the verification of real-time systems. Algorithms for timed automata are based on a traversal of their state space using zones as a symbolic representation. Since the state space is infinite, termination relies on finite abstractions that yield a finite representation of the reachable states. The first solution for obtaining finite abstractions was based on extrapolations of zones, and has been implemented in the industry-strength tool Uppaal. A different approach based on simulations between zones has emerged in the last ten years, and has been implemented in the fully open source tool TChecker. The simulation-based approach has led to new efficient algorithms for reachability and liveness in timed automata, and has also been extended to richer models such as weighted timed automata and timed automata with diagonal constraints and updates. In this article, we survey the extrapolation and simulation techniques and discuss some open challenges for the future.
    Comment: Invited contribution at FORMATS'2
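    Zones are commonly stored as difference bound matrices (DBMs), where entry D[i][j] bounds the clock difference x_i - x_j and index 0 stands for the constant 0. Before zones can be compared or abstracted, a DBM is put in canonical form by an all-pairs shortest-path closure. A minimal sketch with plain integer bounds, ignoring the strict/non-strict distinction that real tools such as Uppaal and TChecker track:

```python
INF = float("inf")

def canonicalize(dbm):
    """Tighten every bound via Floyd-Warshall; return None if the zone is empty."""
    n = len(dbm)
    d = [row[:] for row in dbm]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    # A negative diagonal entry means the constraints are contradictory.
    if any(d[i][i] < 0 for i in range(n)):
        return None
    return d

# Zone over one clock x: row/col 0 is the reference clock 0.
# 0 - x <= -2 encodes x >= 2, and x - 0 <= 5 encodes x <= 5.
zone = [[0, -2], [5, 0]]
print(canonicalize(zone))
```

    Extrapolation and simulation both operate on zones in this canonical form: extrapolation coarsens entries beyond the maximal clock constants, while simulation-based algorithms compare canonical DBMs directly without modifying them.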

    Dimensional Data Design for Event Feedback Data Warehouse

    Data is an important asset and a fundamental requirement for building valuable information for organizations. The Association of Information Systems Students of Unsika (Himsika), a university organization, runs many events to develop students' academic and professional skills. Post-event evaluation is conducted through a feedback survey whose responses are stored in Google Sheets spreadsheet format. However, the current analysis process using spreadsheets lacks standardization, making it difficult to compare satisfaction rates over time and between events. Additionally, the lack of standardization leads to semi-structured data in the spreadsheets, with varying question formats and meanings. To address these limitations, implementing a centralized data warehouse is proposed as a solution. The data warehouse provides a structured and standardized approach to analyzing event feedback, enabling better comparisons and evaluation of management quality within Himsika. The research aims to design a data warehouse that supports multidimensional analysis; to simplify and optimize analytical queries, the data structure is standardized in the warehouse. The Four-Step Dimensional Design method is applied to the dimensional modeling of the data warehouse, consisting of four stages: selecting the business process, declaring the grain, identifying the dimensions, and identifying the facts. The design process resulted in four dimensions (events, dim_instances, dim_degree_programs, and dim_professions) and a fact table called fact_rates_by_responses. Overall, the proposed data warehouse and dimensional modeling approach aim to enhance the analysis and evaluation of Himsika's events.
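    The resulting star schema can be sketched directly in SQL. The following uses SQLite for a self-contained example; the column names and the sample query are assumptions beyond the table names the abstract lists:

```python
import sqlite3

# Minimal star schema sketch: four dimension tables around one fact table.
# Column names are illustrative; only the table names come from the design.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_events (event_id INTEGER PRIMARY KEY, event_name TEXT);
CREATE TABLE dim_instances (instance_id INTEGER PRIMARY KEY, instance_name TEXT);
CREATE TABLE dim_degree_programs (program_id INTEGER PRIMARY KEY, program_name TEXT);
CREATE TABLE dim_professions (profession_id INTEGER PRIMARY KEY, profession_name TEXT);
CREATE TABLE fact_rates_by_responses (
    event_id INTEGER REFERENCES dim_events,
    instance_id INTEGER REFERENCES dim_instances,
    program_id INTEGER REFERENCES dim_degree_programs,
    profession_id INTEGER REFERENCES dim_professions,
    satisfaction_rate REAL  -- the measured fact, one row per response
);
""")
conn.execute("INSERT INTO dim_events VALUES (1, 'Workshop')")
conn.execute("INSERT INTO fact_rates_by_responses VALUES (1, NULL, NULL, NULL, 4.5)")

# A typical analytical query: average satisfaction per event.
row = conn.execute("""
    SELECT e.event_name, AVG(f.satisfaction_rate)
    FROM fact_rates_by_responses f JOIN dim_events e USING (event_id)
    GROUP BY e.event_name
""").fetchone()
print(row)
```

    Grouping by any combination of the four dimensions then answers the comparisons (over time, between events, by degree program) that the spreadsheet process could not standardize.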