
    Collaborative model development increases trust in and use of scientific information in environmental decision-making

    While science matters for environmental management, creating science that is credible, salient to decision-makers, and deemed legitimate by stakeholders is challenging. Collaborative modeling is an increasingly used approach to enable effective science-based decision-making. This work evaluates the modeling process conducted for two hydropower dam licensing negotiations to explore how differences in the collaborative development of hydrological models affected their use in subsequent decision-making. In one case, the model was developed iteratively through deliberation with stakeholders. Consequently, stakeholders understood the model and its limitations and trusted the model and the modelers; the model itself was also better designed to evaluate resource managers’ questions. The collaboratively developed model became the focal point for subsequent negotiations and enabled creative group problem-solving. Conversely, in the case with less engagement during model development, the model was not used subsequently by decision-makers. These differences are argued to result from trust built during the modeling process, the applicability of the model for testing real management scenarios, and the broader social context in which the models were used.

    BPM News - Folge 3

    The BPM column of the EMISA Forum reports on current topics, projects, and events from the BPM field. The focus of this installment is the standardization of process description languages and notations in general, and BPEL4WS (Business Process Execution Language for Web Services) in particular, on which Jan Mendling of the Wirtschaftsuniversität Wien contributes a current keyword article. Readers also receive a summary of two workshops held in the first half of 2006 on the topics "Flexibility of Process-Oriented Information Systems" and "Collaborative Processes", as well as a BPM event calendar for the second half of 2006.

    A Methodology for Engineering Collaborative and ad-hoc Mobile Applications using SyD Middleware

    Today’s web applications are increasingly collaborative and utilize standard and ubiquitous Internet protocols. We have earlier developed the System on Mobile Devices (SyD) middleware to rapidly develop and deploy collaborative applications over heterogeneous, and possibly mobile, devices hosting web objects. In this paper, we present the software engineering methodology for developing SyD-enabled web applications and illustrate it through a case study on two representative applications: (i) a calendar-of-meetings application, which is a collaborative application, and (ii) a travel application, which is an ad-hoc collaborative application. SyD-enabled web objects allow us to create a collaborative application rapidly with limited coding effort. In this case study, the modular software architecture allowed us to hide the inherent heterogeneity among devices, data stores, and networks by presenting a uniform and persistent object view of mobile objects interacting through XML/SOAP requests and responses. The performance results we obtained show that the application scales well as the group size increases and adapts well to the constraints of mobile devices.
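    The abstract's web objects communicate through XML/SOAP requests and responses. As a rough illustration of that style of exchange (not SyD's actual API; the CalendarService and GetFreeSlots names below are invented), a request to a remote calendar object could be wrapped in a SOAP-style envelope as follows:

```python
# Minimal sketch of a SOAP-style request envelope such as SyD-enabled web
# objects might exchange; service and method names are illustrative only.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(service: str, method: str, params: dict) -> bytes:
    """Wrap a method call on a remote web object in a SOAP-style envelope."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, method, {"service": service})
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(envelope, encoding="utf-8", xml_declaration=True)

# Hypothetical use: ask a remote calendar object for free meeting slots.
print(build_request("CalendarService", "GetFreeSlots",
                    {"group": "team-42", "date": "2006-05-15"}).decode())
```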

    Knowledge-Intensive Processes: Characteristics, Requirements and Analysis of Contemporary Approaches

    Engineering of knowledge-intensive processes (KiPs) is far from being mastered, since they are genuinely knowledge- and data-centric and require substantial flexibility at both design time and run time. In this work, starting from an analysis of the scientific literature on KiPs and from three real-world domains and application scenarios, we provide a precise characterization of KiPs. Furthermore, we devise some general requirements related to KiP management and execution. These requirements contribute to the definition of an evaluation framework to assess current system support for KiPs. To this end, we present a critical analysis of a number of existing process-oriented approaches, discussing their efficacy against the requirements.

    Sharing 3D city models: an overview

    This study describes the computing methods now available for sharing three-dimensional (3D) data between various stakeholders for the purposes of city modeling, and considers the need for a seamless approach to sharing, transmitting, and maintaining 3D city models. The study offers an overview of the technologies and issues related to remote access, collaboration, and version control. It builds upon previous research on 3D city models in which issues were raised about utilizing, updating, and maintaining 3D city models and providing access to various stakeholders. The paper also describes a case study that is currently analyzing the remote access requirements for a sustainable computer model of NewcastleGateshead in England. Available options are examined and areas of future research are discussed.

    Attributes of Big Data Analytics for Data-Driven Decision Making in Cyber-Physical Power Systems

    Big data analytics is a relatively new term in power system terminology. The concept concerns how a massive volume of data is acquired, processed, and analyzed to extract insight from the available data. In particular, big data analytics draws on artificial intelligence, machine learning, data mining, and time-series forecasting methods. Decision-makers in power systems have long been hampered by the weaknesses of classical methods in dealing with large-scale, real, practical cases: thousands or millions of variables, long running times, a high computational burden, divergence of results, unjustifiable errors, and poor model accuracy. Big data analytics is an ongoing topic that pinpoints how to extract insights from these large data sets. This article enumerates the applications of big data analytics in future power systems across several layers, from grid scale to local scale. Big data analytics has many applications in smart grid implementation, electricity markets, execution of collaborative operation schemes, enhancement of microgrid operation autonomy, management of electric vehicle operations in smart grids, active distribution network control, district hub system management, multi-agent energy systems, electricity theft detection, stability and security assessment by PMUs, and better exploitation of renewable energy sources. The employment of big data analytics entails some prerequisites, such as the proliferation of IoT-enabled devices, easily accessible cloud space, and blockchain. This paper conducts an extensive review of the applications of big data analytics along with the prevailing challenges and solutions.
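    As a concrete flavor of one listed application, electricity theft detection, the toy sketch below (not taken from the paper; readings and threshold are invented) flags meters whose reported consumption is anomalously low compared with the rest of the feeder:

```python
# Toy illustration of meter-level theft screening: flag meters whose reported
# consumption falls far below the feeder average. Data and threshold invented.
import statistics

def flag_suspect_meters(readings: dict[str, float], z_threshold: float = 1.5) -> list[str]:
    """Return meter IDs whose reported consumption is far below the feeder mean."""
    values = list(readings.values())
    mean, stdev = statistics.mean(values), statistics.stdev(values)
    if stdev == 0:
        return []
    return [meter for meter, kwh in readings.items()
            if (mean - kwh) / stdev > z_threshold]

# Hypothetical monthly readings (kWh) on one feeder; 'm3' reports far too little.
feeder = {"m1": 310.0, "m2": 295.5, "m3": 18.2, "m4": 301.8, "m5": 288.9}
print(flag_suspect_meters(feeder))  # -> ['m3']
```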

    Leveraging Open-standard Interorganizational Information Systems for Process Adaptability and Alignment: An Empirical Analysis

    Purpose: The purpose of this paper is to understand the value creation mechanisms of open-standard inter-organizational information systems (OSIOS), a key technology for achieving Industry 4.0. Specifically, this study investigates how the internal assimilation and external diffusion of OSIOS help manufacturers facilitate process adaptability and alignment in the supply chain network. Design/methodology/approach: A survey instrument was designed and administered to collect data for this research. Using three-stage least squares estimation, the authors empirically tested a number of hypothesized relationships based on a sample of 308 manufacturing firms in China. Findings: The results show that OSIOS can act as value creation mechanisms enabling process adaptability and alignment. In addition, the impact of OSIOS internal assimilation is inverted U-shaped: the positive effect on process adaptability becomes negative after an extremum point is reached. Originality/value: This study contributes to the existing literature by providing insights into how OSIOS can improve supply chain integration and thus promote the achievement of Industry 4.0. By revealing an inverted U-shaped relationship between OSIOS assimilation and process adaptability, this study fills a gap in previous research by advancing the understanding of the value creation mechanisms of information systems deployment.
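    The inverted U-shaped effect described in the findings is commonly tested by adding a squared assimilation term to the regression and locating the turning point. The sketch below illustrates that logic with ordinary least squares on synthetic data; the paper itself uses three-stage least squares, which is not reproduced here.

```python
# Illustrative only: detecting an inverted-U effect by fitting a quadratic
# term and computing the turning point (-b1 / (2*b2)). Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
assimilation = rng.uniform(0, 10, size=308)           # OSIOS assimilation level
adaptability = (2.0 * assimilation - 0.15 * assimilation**2
                + rng.normal(0, 1, size=308))         # inverted-U response

X = np.column_stack([np.ones_like(assimilation), assimilation, assimilation**2])
b0, b1, b2 = np.linalg.lstsq(X, adaptability, rcond=None)[0]

print(f"quadratic coefficient: {b2:.3f} (negative => inverted U)")
print(f"turning point at assimilation ~ {-b1 / (2 * b2):.2f}")
```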

    Querying Large Physics Data Sets Over an Information Grid

    Optimising use of the Web (WWW) for LHC data analysis is a complex problem and illustrates the challenges arising from the integration of, and computation across, massive amounts of information distributed worldwide. Finding the right piece of information can, at times, be extremely time-consuming, if not impossible. So-called Grids have been proposed to facilitate LHC computing, and many groups have embarked on studies of data replication, data migration, and networking philosophies. Other aspects, such as the role of 'middleware' for Grids, are emerging as requiring research. This paper argues the need for appropriate middleware that enables users to resolve physics queries across massive data sets. It identifies the role of meta-data for query resolution and the importance of Information Grids, rather than just Computational or Data Grids, for high-energy physics analysis. The paper identifies software being implemented at CERN to enable querying of very large collaborating HEP data sets, initially employed for the construction of CMS detectors. (Comment: 4 pages, 3 figures)
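    A toy sketch of the meta-data-driven query resolution idea: a catalogue maps data-set descriptors to the grid sites holding them, so a physics query can be routed without scanning the event data itself. All catalogue fields and site names below are invented for illustration.

```python
# Toy metadata catalogue: route a physics query to the grid sites that hold
# matching data sets, without touching the event data itself. All fields and
# site names are invented for illustration.
catalogue = [
    {"dataset": "muon-stream-A", "detector": "CMS", "energy_gev": 7000,
     "sites": ["cern-tier0", "fnal-tier1"]},
    {"dataset": "jet-stream-B", "detector": "CMS", "energy_gev": 7000,
     "sites": ["cern-tier0"]},
    {"dataset": "testbeam-ecal", "detector": "CMS", "energy_gev": 120,
     "sites": ["in2p3-tier1"]},
]

def resolve(query: dict) -> dict:
    """Return, per matching data set, the sites that can serve the query."""
    matches = [entry for entry in catalogue
               if all(entry.get(k) == v for k, v in query.items())]
    return {entry["dataset"]: entry["sites"] for entry in matches}

print(resolve({"detector": "CMS", "energy_gev": 7000}))
```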

    Big Data Meets Telcos: A Proactive Caching Perspective

    Mobile cellular networks are becoming increasingly complex to manage, while classical deployment/optimization techniques and current solutions (i.e., cell densification, acquiring more spectrum, etc.) are cost-ineffective and thus seen as stopgaps. This calls for the development of novel approaches that leverage recent advances in storage/memory, context awareness, and edge/cloud computing, and falls into the framework of big data. However, big data is itself a complex phenomenon to handle and comes with its notorious four Vs: velocity, veracity, volume, and variety. In this work, we address these issues in the optimization of 5G wireless networks via the notion of proactive caching at the base stations. In particular, we investigate the gains of proactive caching in terms of backhaul offloading and request satisfaction, while tackling the large amount of data available for content popularity estimation. To estimate content popularity, we first collect users' mobile traffic data from several base stations of a Turkish telecom operator over a period of hours. An analysis is then carried out locally on a big data platform, and the gains of proactive caching at the base stations are investigated via numerical simulations. It turns out that several gains are possible depending on the level of available information and the storage size. For instance, with 10% of content ratings and 15.4 GByte of storage size (87% of the total catalog size), proactive caching achieves 100% request satisfaction and offloads 98% of the backhaul when considering 16 base stations. (Comment: 8 pages, 5 figures)
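    A minimal sketch of the caching policy and metric described above: fill a base station's cache with the most popular contents that fit, then measure the share of requests served locally. Popularities and sizes are synthetic here, not the operator traces used in the paper, and request satisfaction is simplified to the local hit ratio.

```python
# Minimal sketch: proactively cache the most popular contents at a base
# station and report the share of requests served locally (a proxy for
# backhaul offload). Popularities and content sizes are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_contents = 1000
popularity = 1.0 / np.arange(1, n_contents + 1)      # Zipf-like request shares
popularity /= popularity.sum()
sizes_mb = rng.uniform(1, 50, size=n_contents)       # synthetic content sizes

def local_hit_ratio(storage_mb: float) -> float:
    """Cache the most popular contents that fit; return the request hit share."""
    used = 0.0
    hit = 0.0
    for idx in np.argsort(-popularity):               # most popular first
        if used + sizes_mb[idx] <= storage_mb:
            used += sizes_mb[idx]
            hit += popularity[idx]
    return hit

for storage in (500, 2000, 8000):
    print(f"storage {storage:>4} MB -> local hit ratio {local_hit_ratio(storage):.1%}")
```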