4,259 research outputs found

    Scalable discovery of hybrid process models in a cloud computing environment

    Get PDF
    Process descriptions are used to create products and deliver services. To improve processes and services, the first step is to learn a process model. Process discovery is a technique that can automatically extract process models from event logs. Although various discovery techniques have been proposed, they focus either on constructing formal models, which are very powerful but complex, or on creating informal models, which are intuitive but lack semantics. In this work, we introduce a novel method that returns hybrid process models to bridge this gap. Moreover, to cope with today’s big event logs, we propose an efficient method, called f-HMD, which aims at scalable hybrid model discovery in a cloud computing environment. We present the detailed implementation of our approach over the Spark framework, and our experimental results demonstrate that the proposed method is efficient and scalable.
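The first step of most discovery algorithms is extracting the directly-follows relation from the event log; the sketch below illustrates that step in plain Python. The log format and function name are illustrative assumptions, not the f-HMD interface or its Spark implementation.

```python
# Count how often activity a is immediately followed by activity b
# across all traces of an event log (toy, single-machine version).
from collections import Counter

def directly_follows(event_log):
    """Return a Counter mapping (a, b) to the frequency of a -> b."""
    counts = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

log = [["register", "check", "approve"],
       ["register", "check", "reject"],
       ["register", "approve"]]
df = directly_follows(log)  # df[("register", "check")] counts that pair
```

In a distributed setting of the kind the paper targets, each worker would compute such counts over a partition of the log and the partial counters would be summed, which is a natural fit for Spark's map/reduce primitives.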

    Technology assessment of advanced automation for space missions

    Get PDF
    Six general classes of technology requirements derived during the mission definition phase of the study were identified as having maximum importance and urgency: autonomous world-model-based information systems; learning and hypothesis formation; natural language and other man-machine communication; space manufacturing; teleoperators and robot systems; and computer science and technology.

    Big Data and the Internet of Things

    Full text link
    Advances in sensing and computing capabilities are making it possible to embed increasing computing power in small devices. This has enabled sensing devices not just to passively capture data at very high resolution but also to take sophisticated actions in response. Combined with advances in communication, this is resulting in an ecosystem of highly interconnected devices referred to as the Internet of Things (IoT). In conjunction, advances in machine learning have allowed building models on these ever-increasing amounts of data. Consequently, devices all the way from heavy assets such as aircraft engines to wearables such as health monitors can now not only generate massive amounts of data but can draw back on aggregate analytics to "improve" their performance over time. Big data analytics has been identified as a key enabler for the IoT. In this chapter, we discuss various avenues of the IoT where big data analytics either is already making a significant impact or is on the cusp of doing so. We also discuss social implications and areas of concern.
    Comment: 33 pages; draft of an upcoming book chapter in Japkowicz and Stefanowski (eds.), Big Data Analysis: New algorithms for a new society, Springer Series on Studies in Big Data, to appear.
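The feedback loop described above — devices drawing on aggregate analytics to improve their own behaviour — can be sketched minimally as a device comparing its reading to a fleet-wide baseline. The device names, readings, and 1-sigma threshold below are invented for illustration.

```python
# Toy sketch: flag devices whose reading deviates from the fleet
# aggregate by more than `threshold` sample standard deviations.
from statistics import mean, stdev

def flag_outliers(fleet_readings, threshold=1.0):
    """Map each device name to True if it deviates from the fleet mean."""
    mu = mean(fleet_readings.values())
    sigma = stdev(fleet_readings.values())
    return {dev: abs(r - mu) > threshold * sigma
            for dev, r in fleet_readings.items()}

readings = {"engine-1": 72.1, "engine-2": 71.8,
            "engine-3": 95.4, "engine-4": 72.5}
flags = flag_outliers(readings)  # only engine-3 stands out here
```

Real IoT pipelines compute such aggregates server-side over far larger fleets and push the resulting baselines back to the devices; the structure of the comparison is the same.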

    Security of Big Data in Healthcare Systems

    Get PDF
    As cyber-attacks have become more common and sophisticated, the need for a stable security framework has become essential. Information security requirements must be met by digital technologies utilized in the health care sector. Modern hospitals are becoming increasingly digital, and information and communication technology is becoming an increasingly significant element of the core business. This lays the groundwork for improved patient care quality. At the same time, the health sector's vulnerability to digital attacks and data breaches is growing, and so are the potential negative effects of security breaches. The Norwegian healthcare system is divided into different regions, each with its own set of processes and procedures. Because of this fragmentation, there are substantial communication issues between the many health regions and their systems, making transmitted data vulnerable to threat actors. A reorganization is required to effectively handle this issue and improve the security of healthcare systems. The research was conducted using a qualitative, problem-oriented, phenomenon-driven research approach applied to the Norwegian healthcare sector. In addition, interviews with security employees from the different health regions in Norway, as well as a document analysis of published papers, were conducted to gather empirical material for the master's thesis.

    Tortoise: Interactive System Configuration Repair

    Full text link
    System configuration languages provide powerful abstractions that simplify managing large-scale, networked systems. Thousands of organizations now use configuration languages, such as Puppet. However, specifications written in configuration languages can have bugs, and the shell remains the simplest way to debug a misconfigured system. Unfortunately, it is unsafe to use the shell to fix problems when a system configuration language is in use: a fix applied from the shell may cause the system to drift from the state specified by the configuration language. Thus, despite their advantages, configuration languages force system administrators to give up the simplicity and familiarity of the shell. This paper presents a synthesis-based technique that allows administrators to use configuration languages and the shell in harmony. Administrators can fix errors using the shell, and the technique automatically repairs the higher-level specification written in the configuration language. The approach (1) produces repairs that are consistent with the fix made using the shell; (2) produces repairs that are maintainable by minimizing edits made to the original specification; (3) ranks and presents multiple repairs when relevant; and (4) supports all shells the administrator may wish to use. We implement our technique for Puppet, a widely used system configuration language, and evaluate it on a suite of benchmarks under 42 repair scenarios. The top-ranked repair is selected by humans 76% of the time, and the human-equivalent repair is ranked 1.31 on average.
    Comment: published version in the proceedings of the IEEE/ACM International Conference on Automated Software Engineering (ASE) 2017.
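The "minimize edits to the original specification" ranking criterion can be illustrated with a toy sketch. The key-value model of a specification below is an assumption made for this example; it is not Puppet's resource model or Tortoise's actual synthesis procedure.

```python
# Rank candidate repaired specs by how few settings differ from the
# original spec, so the most maintainable repair comes first.
def rank_repairs(original_spec, candidates):
    """Order candidate specs by ascending edit count vs. the original."""
    def edit_count(candidate):
        keys = set(original_spec) | set(candidate)
        return sum(original_spec.get(k) != candidate.get(k) for k in keys)
    return sorted(candidates, key=edit_count)

spec = {"pkg": "nginx", "port": 80, "user": "www"}
# Two hypothetical repairs, both consistent with a shell fix that
# moved the service to port 8080:
candidates = [
    {"pkg": "nginx", "port": 8080, "user": "www"},        # one edit
    {"pkg": "nginx2", "port": 8080, "user": "www-data"},  # three edits
]
best = rank_repairs(spec, candidates)[0]  # the single-edit repair
```

Both candidates are consistent with the observed shell fix; preferring the smaller edit keeps the repaired specification closest to what the administrator originally wrote.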

    REA2: A unified formalisation of the Resource-Event-Agent Ontology

    Get PDF
    Through a proof of concept in SWI-Prolog, this paper demonstrates a business transaction model by which trading partners can derive their own, personal perspective from shared data. The demonstration is an innovative formalisation of the Resource-Event-Agent (REA) ontology, as it allows switching viewpoints in real time between one trading partner’s perspective and that of a trading partner with an opposing view (i.e. customer or supplier), or a trading-partner-independent perspective (e.g. trusted third party). The business transaction model is achieved by uniting REA with the Open-EDI Business Transaction Ontology (OeBTO). The resulting unified formalisation of the REA ontology (REA2) also highlights implications for the future development of a) enterprise information systems (EIS) in the cloud, b) social-media-based EIS, c) blockchain EIS, and d) EIS interoperability across business paradigms. EIS interoperability, such as between traditional EIS (which typically use a trading-partner perspective) and EIS for the collaborative economy (which typically use a trading-partner-independent perspective), is particularly highlighted, as it becomes much more transparent than previously.
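The core idea — deriving opposing trading-partner views from one shared, partner-independent record — can be sketched outside Prolog as well. The event fields and party names below are invented for illustration and do not reflect the paper's actual REA2 formalisation.

```python
# Derive a per-partner view from a single shared economic event:
# the provider sees a sale, the receiver sees a purchase, and any
# third party sees a neutral transfer.
def view(event, perspective):
    """Return the event as seen from the given perspective."""
    if perspective == event["provider"]:
        return {"role": "sale", "counterparty": event["receiver"],
                "resource": event["resource"]}
    if perspective == event["receiver"]:
        return {"role": "purchase", "counterparty": event["provider"],
                "resource": event["resource"]}
    return {"role": "transfer",
            "parties": (event["provider"], event["receiver"]),
            "resource": event["resource"]}

shared = {"provider": "ACME", "receiver": "BobCo",
          "resource": "100 widgets"}
sale = view(shared, "ACME")        # ACME's perspective: a sale
purchase = view(shared, "BobCo")   # BobCo's perspective: a purchase
```

Because every view is derived on demand from the one shared record, switching perspectives requires no data duplication — which is the transparency property the abstract emphasises.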

    Reinforcing Digital Trust for Cloud Manufacturing Through Data Provenance Using Ethereum Smart Contracts

    Get PDF
    Cloud Manufacturing (CMfg) is an advanced manufacturing model that caters to fast-paced agile requirements (Putnik, 2012). For manufacturing complex products that require extensive resources, manufacturers explore advanced manufacturing techniques like CMfg, as it becomes infeasible to achieve high standards through complete ownership of manufacturing artifacts (Kuan et al., 2011). CMfg, also known as Manufacturing as a Service (MaaS) and Cyber Manufacturing (NSF, 2020), addresses the shortcomings of traditional manufacturing by building a virtual cyber enterprise of geographically distributed entities that manufacture custom products through collaboration. With manufacturing venturing into cyberspace, digital trust issues concerning product quality, data, and intellectual property security become significant concerns (R. Li et al., 2019). This study establishes a trust mechanism through data provenance for ensuring digital trust between the various stakeholders involved in CMfg. A trust model with smart contracts built on the Ethereum blockchain implements data provenance in CMfg. The study covers three data provenance models using Ethereum smart contracts for establishing digital trust in CMfg: Product Provenance, Order Provenance, and Operational Provenance. Together, these provenance models address the most important questions regarding CMfg: what goes into the product, who manufactures it, who transports it, under what conditions it is manufactured, and whether regulatory constraints/requisites are met.
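The general shape of a provenance trail — append-only records in which each entry is bound to its predecessor by a hash — can be sketched in plain Python. This is only an illustration of the chaining idea, not Ethereum or Solidity code, and the record fields are invented for the example.

```python
# Hash-linked provenance records: tampering with any record breaks
# verification of every later link.
import hashlib
import json

def add_record(chain, data):
    """Append a record linked to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"data": data, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every link; return False on any inconsistency."""
    prev = "0" * 64
    for rec in chain:
        body = {"data": rec["data"], "prev": rec["prev"]}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != digest:
            return False
        prev = rec["hash"]
    return True

chain = []
add_record(chain, {"step": "machining", "site": "plant-A"})
add_record(chain, {"step": "assembly", "site": "plant-B"})
```

On a blockchain, consensus and smart-contract access control replace the local `verify` loop, but the property being bought is the same: no stakeholder can silently rewrite an earlier manufacturing or transport record.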

    Container description ontology for CaaS

    Full text link
    Besides the three classical cloud service models (IaaS, PaaS, and SaaS), container as a service (CaaS) has gained significant acceptance. It offers deployable applications without the performance overhead of traditional hypervisors. As the adoption of containers becomes increasingly widespread, tools to manage them across the infrastructure become a vital necessity. In this paper, we propose a conceptualisation of a domain ontology for container description, called CDO. CDO presents, in a detailed and uniform manner, the functional and non-functional capabilities of containers, Docker, and container orchestration systems. In addition, we provide a framework that aims at simplifying container management not only for users but also for cloud providers. This framework serves to populate CDO, helps users deploy their applications on a container orchestration system, and enhances interoperability between cloud providers by offering a migration service for deploying applications across different host platforms. Finally, the effectiveness of CDO is demonstrated through a real case study on the deployment of a micro-service application in a containerised environment under a set of functional and non-functional requirements.
    K. Boukadi, M. Rekik, J. Bernal Bernabe, J. Lloret (2020). Container description ontology for CaaS. International Journal of Web and Grid Services, 16(4):341-363. https://doi.org/10.1504/IJWGS.2020.110944
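The kind of matching such a container description ontology enables — checking an application's functional and non-functional requirements against described host platforms — can be sketched as follows. The field names and platform entries are invented for this example and are not CDO terms.

```python
# Select host platforms whose described capabilities satisfy an
# application's requirements (a simplified stand-in for ontology
# reasoning over container/orchestrator descriptions).
def satisfies(platform, req):
    """True if the platform meets the resource and orchestrator needs."""
    return (platform["cpu_cores"] >= req["cpu_cores"]
            and platform["memory_gb"] >= req["memory_gb"]
            and req["orchestrator"] in platform["orchestrators"])

platforms = [
    {"name": "edge-1", "cpu_cores": 2, "memory_gb": 4,
     "orchestrators": {"k3s"}},
    {"name": "cloud-1", "cpu_cores": 16, "memory_gb": 64,
     "orchestrators": {"kubernetes", "swarm"}},
]
req = {"cpu_cores": 4, "memory_gb": 8, "orchestrator": "kubernetes"}
matches = [p["name"] for p in platforms if satisfies(p, req)]
```

The migration service described in the abstract amounts to re-running this matching against a new provider's descriptions and redeploying where the requirements still hold.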