
    Semantic business process management: a vision towards using semantic web services for business process management

    Business process management (BPM) is the approach to managing the execution of IT-supported business operations from a business expert's view rather than from a technical perspective. However, the degree of mechanization in BPM is still very limited, creating inertia in the necessary evolution and dynamics of business processes, and BPM does not provide a truly unified view of the process space of an organization. We trace the problem of mechanization in BPM back to an ontological one, i.e., the lack of machine-accessible semantics, and argue that the modeling constructs of semantic Web services frameworks, especially WSMO, are a natural fit for creating such a representation. As a consequence, we propose to combine SWS and BPM into one consolidated technology, which we call semantic business process management (SBPM).

    SOA-enabled compliance management: Instrumenting, assessing, and analyzing service-based business processes

    Facilitating compliance management, that is, assisting a company's management in conforming to laws, regulations, standards, contracts, and policies, is a pressing but non-trivial task. The service-oriented architecture (SOA) has evolved traditional, manual business practices into modern, service-based IT practices that ease part of the problem: the systematic definition and execution of business processes. This, in turn, facilitates the online monitoring of system behaviors and the enforcement of allowed behaviors, all ingredients that can be used to assist compliance management on the fly during process execution. In this paper, instead of focusing on monitoring and runtime enforcement of rules or constraints, we strive for an alternative approach to compliance management in SOAs that aims at assessing and improving compliance. We propose two ingredients: (i) a model and tool to design compliant service-based processes and to instrument them in order to generate evidence of how they are executed, and (ii) a reporting and analysis suite to create awareness of a company's compliance state and to enable understanding of why and where compliance violations have occurred. Together, these ingredients result in an approach that is close to how the real stakeholders, compliance experts and auditors, actually assess the state of compliance in practice, and that is less intrusive than enforcing compliance. © 2013 Springer-Verlag London.
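    The instrumentation ingredient described above can be sketched minimally as a wrapper that records evidence for each executed activity; all names here (EvidenceLog, approve_invoice) are hypothetical illustrations, not the authors' tool.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class EvidenceLog:
    """Collects execution evidence for offline compliance assessment."""
    records: list = field(default_factory=list)

    def instrument(self, activity: str) -> Callable:
        """Wrap a process activity so each call is recorded as evidence."""
        def decorator(fn):
            def wrapper(*args, **kwargs):
                started = datetime.now(timezone.utc).isoformat()
                try:
                    result = fn(*args, **kwargs)
                    self.records.append({"activity": activity, "started": started, "outcome": "ok"})
                    return result
                except Exception as exc:
                    self.records.append({"activity": activity, "started": started, "outcome": f"error: {exc}"})
                    raise
            return wrapper
        return decorator

log = EvidenceLog()

@log.instrument("approve_invoice")
def approve_invoice(amount: float) -> bool:
    # Hypothetical policy: invoices above 10,000 need a second approver (not modeled).
    return amount <= 10_000

approve_invoice(250.0)
violations = [r for r in log.records if r["outcome"] != "ok"]
```

    A reporting suite would then aggregate such records offline, which is what makes the approach less intrusive than runtime enforcement.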

    A unified view of data-intensive flows in business intelligence systems: a survey

    Data-intensive flows are central processes in today’s business intelligence (BI) systems, deploying different technologies to deliver data, from a multitude of data sources, in user-preferred and analysis-ready formats. To meet the complex requirements of next-generation BI systems, we often need an effective combination of the traditionally batched extract-transform-load (ETL) processes that populate a data warehouse (DW) from integrated data sources, and more real-time and operational data flows that integrate source data at runtime. Both academia and industry thus must have a clear understanding of the foundations of data-intensive flows and the challenges of moving towards next-generation BI environments. In this paper we present a survey of today’s research on data-intensive flows and the related fundamental fields of database theory. The study is based on a proposed set of dimensions describing the important challenges of data-intensive flows in the next-generation BI setting. As a result of this survey, we envision an architecture of a system for managing the lifecycle of data-intensive flows. The results further provide a comprehensive understanding of data-intensive flows, recognizing challenges that are still to be addressed and how current solutions can be applied to address them.
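    The combination the abstract calls for, a batched ETL load and an operational flow sharing one transformation, can be illustrated with a small sketch; the names (transform, IncrementalFlow) and the currency-conversion transform are illustrative assumptions, not from the survey.

```python
from typing import Iterable

def transform(row: dict) -> dict:
    """Shared transformation applied by both the batch and the incremental flow."""
    return {"id": row["id"], "revenue_eur": round(row["revenue_usd"] * 0.92, 2)}

def batch_etl(source: Iterable[dict]) -> list[dict]:
    """Traditional batched ETL: extract, transform, and load into the warehouse table."""
    return [transform(r) for r in source]

class IncrementalFlow:
    """Operational flow: integrates newly arriving source rows at runtime."""
    def __init__(self, warehouse: list[dict]):
        self.warehouse = warehouse

    def on_event(self, row: dict) -> None:
        self.warehouse.append(transform(row))

# Initial batch load, then runtime integration of a late-arriving row.
warehouse = batch_etl([{"id": 1, "revenue_usd": 100.0}, {"id": 2, "revenue_usd": 50.0}])
flow = IncrementalFlow(warehouse)
flow.on_event({"id": 3, "revenue_usd": 10.0})
```

    Factoring the transform out of the batch job is what lets the same logic serve both the periodic DW load and the real-time path.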

    BUSINESS INTELLIGENT AGENTS FOR ENTERPRISE APPLICATION

    Fierce competition in an increasingly crowded market and frequent changes in consumer requirements are the main forces that will cause companies to change their current organization and management. One solution is to move to open and virtual architectures, which requires addressing business methods and technologies using distributed multi-agent systems. Intelligent agents are one of the most important areas of artificial intelligence, dealing with the development of hardware and software systems able to reason, learn, recognize natural language, speak, make decisions, and recognize objects in their working environment. Thus, in this paper we present some aspects of smart business, intelligent agents, intelligent systems, and intelligent system models, and we especially emphasize their role in managing business processes, which have become highly complex systems in permanent change to meet the requirements of timely decision making. The purpose of this paper is to argue that there is no business without integrating business process management, Web services, and intelligent agents.
    Keywords: business intelligence, intelligent agents, intelligent systems, management, enterprise, web services

    Online platform for building, testing and deploying predictive models

    Machine learning (ML) and artificial intelligence (AI) models have traditionally been built and deployed manually on a single machine, using tools such as R or Weka. Times are changing, and in the era of real-time services and big data these methods are becoming obsolete, as they severely limit the applicability and deployability of ML. Many companies, such as Microsoft, Amazon, and Google, have tried to mitigate this problem by developing MLaaS (Machine Learning as a Service) solutions: online platforms capable of scaling and automating the development of predictive models. Despite the existence of ML platforms in the cloud that enable the user to develop and deploy ML processes, they are not suitable for rapidly prototyping and deploying predictive models, as some complex steps must be completed before the user can start: configuring environments, configuring accounts, and overcoming a steep learning curve. In this research project we present MLINO, a concept for an online platform that allows the user to rapidly prototype and deploy basic ML processes in an intuitive and easy way. Even though the implementation of the prototype was not optimal, due to software and infrastructure limitations, a series of experiments demonstrated that its final performance was satisfactory. When benchmarking the devised solution against Microsoft Azure ML, the results showed that the MLINO tool is easier to use and takes less time when building and deploying a basic predictive model.
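    The prototype-then-deploy workflow such platforms aim to shorten can be sketched end to end in a few lines; this is a generic illustration (a toy nearest-centroid model and a hypothetical deploy wrapper), not MLINO's actual implementation.

```python
from collections import defaultdict
from statistics import fmean

def train_centroids(rows: list[tuple[list[float], str]]) -> dict[str, list[float]]:
    """Fit a tiny nearest-centroid classifier: one mean vector per label."""
    grouped: dict[str, list[list[float]]] = defaultdict(list)
    for features, label in rows:
        grouped[label].append(features)
    return {label: [fmean(col) for col in zip(*vecs)]
            for label, vecs in grouped.items()}

def deploy(model: dict[str, list[float]]):
    """'Deployment': wrap the fitted model in a callable prediction endpoint."""
    def predict(features: list[float]) -> str:
        def dist(centroid):
            return sum((a - b) ** 2 for a, b in zip(features, centroid))
        return min(model, key=lambda label: dist(model[label]))
    return predict

# Prototype: fit on labeled examples, then expose a prediction endpoint.
model = train_centroids([([0.0, 0.1], "low"), ([0.2, 0.0], "low"),
                         ([5.0, 5.1], "high"), ([4.8, 5.3], "high")])
predict = deploy(model)
```

    An MLaaS platform automates exactly these two steps, training and wrapping the model behind a callable service, so the user skips environment setup entirely.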

    Fast Data in the Era of Big Data: Twitter's Real-Time Related Query Suggestion Architecture

    We present the architecture behind Twitter's real-time related query suggestion and spelling correction service. Although these tasks have received much attention in the web search literature, the Twitter context introduces a real-time "twist": after significant breaking news events, we aim to provide relevant results within minutes. This paper provides a case study illustrating the challenges of real-time data processing in the era of "big data". We tell the story of how our system was built twice: our first implementation was built on a typical Hadoop-based analytics stack, but was later replaced because it did not meet the latency requirements necessary to generate meaningful real-time results. The second implementation, which is the system deployed in production, is a custom in-memory processing engine specifically designed for the task. This experience taught us that the current typical usage of Hadoop as a "big data" platform, while great for experimentation, is not well suited to low-latency processing, and points the way to future work on data analytics platforms that can handle "big" as well as "fast" data.
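    The core idea of an in-memory engine for this task, updating query co-occurrence counts as sessions arrive so suggestions shift within minutes of a news event, can be sketched as follows; this is a simplified illustration of session co-occurrence counting, not Twitter's production engine.

```python
from collections import Counter, defaultdict
from itertools import permutations

class RelatedQueryIndex:
    """In-memory co-occurrence index: queries issued in the same session are
    treated as related. Counts update immediately on ingest, so top suggestions
    can reflect breaking-news traffic without waiting for a batch job."""
    def __init__(self):
        self.cooccur: dict[str, Counter] = defaultdict(Counter)

    def observe_session(self, queries: list[str]) -> None:
        """Increment the co-occurrence count for every ordered pair of
        distinct queries seen in one user session."""
        for a, b in permutations(set(queries), 2):
            self.cooccur[a][b] += 1

    def suggest(self, query: str, k: int = 3) -> list[str]:
        """Return the k queries most often co-occurring with `query`."""
        return [q for q, _ in self.cooccur[query].most_common(k)]

index = RelatedQueryIndex()
index.observe_session(["earthquake", "earthquake magnitude", "tsunami warning"])
index.observe_session(["earthquake", "tsunami warning"])
```

    A Hadoop-based pipeline would compute the same counts, but only on each batch run; keeping the counters in memory is what removes the latency floor.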

    Supporting adaptiveness of cyber-physical processes through action-based formalisms

    Cyber-physical processes (CPPs) are a new generation of business processes enacted in many application environments (e.g., emergency management, smart manufacturing), in which the presence of Internet-of-Things devices and embedded ICT systems (e.g., smartphones, sensors, actuators) strongly influences the coordination of the real-world entities (e.g., humans, robots) inhabiting such environments. A process management system (PMS) employed for executing CPPs is required to automatically adapt its running processes to anomalous situations and exogenous events while minimising human intervention. In this paper we tackle this issue by introducing an approach and an adaptive cognitive PMS, called SmartPM, which combines process execution monitoring, detection of unanticipated exceptions, and automated resolution strategies, leveraging three well-established action-based formalisms for reasoning about actions in artificial intelligence (AI): the situation calculus, IndiGolog, and automated planning. Notably, the use of SmartPM does not require any expertise in the internal workings of the AI tools involved in the system.
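    The adaptation loop behind such a PMS, compare the state the process model expects with the sensed physical state, treat any mismatch as an unanticipated exception, and synthesize a recovery plan, can be sketched as below; this is a toy control loop with made-up state variables, not SmartPM's situation-calculus machinery.

```python
def detect_discrepancy(expected: dict, sensed: dict) -> dict:
    """Compare the state the process model expects with the sensed physical
    state; any mismatch counts as an unanticipated exception."""
    return {k: sensed.get(k) for k in expected if sensed.get(k) != expected[k]}

def recovery_plan(discrepancy: dict) -> list[str]:
    """Toy 'planner': one repair action per deviating state variable.
    A real system would invoke an automated planner here instead."""
    return [f"restore {var} to expected value" for var in sorted(discrepancy)]

# A robot was expected in area3, but sensors place it in area1.
expected = {"robot_at": "area3", "battery_ok": True}
sensed = {"robot_at": "area1", "battery_ok": True}
plan = recovery_plan(detect_discrepancy(expected, sensed))
```

    The point of delegating the planning step to an AI formalism is that process designers never write recovery logic by hand, which is why no expertise in the underlying tools is needed.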

    Automation and orchestration of hardware and firmware data mining using a smart data analytics platform

    Effective data mining is going to be important for differentiating and succeeding in the digital economy, especially with increased commoditization and reduced barriers to entry for infrastructure devices such as servers, storage, and networking systems. There is a lot of telemetry data from manufacturing facilities and customers that can be used to drive an improved supportability experience and unmatched product quality and reliability for such devices. Currently, data mining of hardware, firmware, and platform logs is a challenging task, as the domain knowledge is complex, with the relevant expertise of a large multinational organization distributed across the world. With data mining continuing to be a very time-consuming task that requires math and statistics skills, diverse programming and machine learning skills, and cross-domain knowledge, it is important to look at a next-generation analytics solution tailored to infrastructure vendors to improve supportability, quality, reliability, performance, and security. In this publication we propose a smart, automated, and generic data analytics platform that enables a 24/7 data mining solution using a built-in platform domain modeler, an expert system for analyzing hardware and firmware logs, and a policy manager that allows user-defined hypotheses to be verified round the clock based on policies and configurable triggers. This smart data analytics platform will help democratize the data mining of hardware and firmware logs, improve the troubleshooting of complex issues, improve the supportability experience, reliability, and quality, and reduce warranty costs.
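    The policy-manager idea, user-defined hypotheses evaluated round the clock against incoming log records, can be sketched as a small rule engine; the class, rule names, and log fields here are hypothetical illustrations, not the proposed platform's actual interface.

```python
from typing import Callable

class PolicyManager:
    """Evaluates user-defined hypotheses (policies) against every incoming
    hardware/firmware log record; a matching record fires the policy's trigger."""
    def __init__(self):
        self.policies: list[tuple[str, Callable[[dict], bool]]] = []
        self.alerts: list[str] = []

    def register(self, name: str, predicate: Callable[[dict], bool]) -> None:
        """Register a hypothesis as a predicate over a single log record."""
        self.policies.append((name, predicate))

    def ingest(self, record: dict) -> None:
        """Check one log record against every registered policy."""
        for name, predicate in self.policies:
            if predicate(record):
                self.alerts.append(f"{name}: {record}")

pm = PolicyManager()
pm.register("overheat", lambda r: r.get("temp_c", 0) > 90)
pm.register("fw_crash", lambda r: r.get("event") == "firmware_panic")
pm.ingest({"device": "srv-01", "temp_c": 95})
pm.ingest({"device": "srv-02", "temp_c": 60})
```

    Running such predicates continuously over the telemetry stream is what lets a hypothesis be "verified round the clock" without an analyst re-running queries by hand.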