
    Multiagent Brokerage with CBR

    This paper classifies multiagent-based e-commerce into multiagent-based auction, multiagent-based mediation, and multiagent-based brokerage, and gives a brief survey of related work in each category. It then proposes CMB, a framework for case-based reasoning (CBR) in multiagent brokerage that integrates CBR, intelligent agents, and brokerage, together with a knowledge-based model for CBR. The key insight is that an efficient way to apply CBR in e-commerce is through intelligent agents or multiagent systems, with the work of a human broker carried out cooperatively by a few intelligent agents. This approach should facilitate research and development of CBR in multiagent e-commerce.
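
    As a concrete illustration of the retrieve step a broker agent could perform, the sketch below computes a weighted similarity between a new purchase request and stored cases. The case attributes, weights, and supplier names are illustrative assumptions, not details of the CMB framework.

```python
# Minimal sketch of a CBR "retrieve" step for a broker agent.
# Attribute names, weights, and suppliers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Case:
    product: str
    price: float
    delivery_days: int
    outcome: str  # e.g. the supplier recommended in the past case

def similarity(query: Case, case: Case, weights=(0.5, 0.3, 0.2)) -> float:
    """Weighted similarity in [0, 1] between a new request and a stored case."""
    s_product = 1.0 if query.product == case.product else 0.0
    s_price = 1.0 - min(abs(query.price - case.price) / max(query.price, 1e-9), 1.0)
    s_delivery = 1.0 - min(abs(query.delivery_days - case.delivery_days) / 30.0, 1.0)
    w_prod, w_price, w_del = weights
    return w_prod * s_product + w_price * s_price + w_del * s_delivery

def retrieve(query: Case, case_base: list[Case]) -> Case:
    """Return the most similar past case; its outcome seeds the broker's proposal."""
    return max(case_base, key=lambda c: similarity(query, c))

if __name__ == "__main__":
    base = [
        Case("laptop", 900.0, 5, "supplier_A"),
        Case("laptop", 1200.0, 2, "supplier_B"),
        Case("phone", 600.0, 3, "supplier_C"),
    ]
    q = Case("laptop", 950.0, 4, "")
    print(retrieve(q, base).outcome)  # outcome of the nearest past case
```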

    Intellectual Asset Management for Collaborative Business Support

    Case Based Reasoning in E-Commerce.

    Architecture for privacy-preserving brokerage of analytics using Multi Party Computation, Self Sovereign Identity and Blockchain

    In our increasingly digitized world, the value of data is clear and proven, and many solutions and businesses have been developed to harness it. In particular, personal data (such as health-related data) is highly valuable, but it is also sensitive and could harm its owners if misused. In this context, data marketplaces could enhance the circulation of data and enable new businesses and solutions. However, in the case of personal data, marketplaces would necessarily have to comply with existing regulations, and they would also need to make users' privacy protection a priority. Privacy protection has been only partially accomplished by existing data markets, as they can themselves gather information about the individuals connected with the datasets they handle. This thesis presents an architecture proposal for KRAKEN, a new data market that provides privacy guarantees at every step of the data exchange and analytics pipeline. This is accomplished through the use of multi-party computation (MPC), blockchain, and self-sovereign identity technologies. The thesis also presents a privacy analysis of the entire system. The analysis indicated that KRAKEN is safe from possible data disclosures to buyers. On the other hand, some potential threats regarding disclosure of data to the data market itself were identified, although these pose a low-priority risk given their low likelihood of occurrence. Moreover, the author offers remarks on the decentralisation of the architecture and possible improvements to increase its security. These improvements are accompanied by the solutions identified in the accompanying paper, which proposes adopting a trust measure for the MPC nodes. The work on the paper and the thesis contributed to the personal growth of the author, specifically improving his knowledge of cryptography by learning new schemes such as group signatures, zero-knowledge proofs of knowledge, and multi-party computation. He also improved his skills in writing academic papers and in working in a team of researchers leading a research area.
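
    The toy sketch below illustrates the additive secret sharing that underpins MPC-based analytics of the kind KRAKEN relies on: each node holds only a random-looking share, yet the aggregate can be reconstructed. It is a didactic example with an assumed prime modulus and record values, not the KRAKEN protocol itself.

```python
# Toy illustration of additive secret sharing: no single node ever sees a raw
# value, yet the aggregate can be reconstructed. Didactic sketch only.
import secrets

PRIME = 2**61 - 1  # arithmetic is done modulo a public prime (assumed here)

def share(value: int, n_nodes: int) -> list[int]:
    """Split `value` into n additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_nodes - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % PRIME

if __name__ == "__main__":
    records = [42, 17, 99]          # e.g. sensitive health measurements (made up)
    n_nodes = 3
    # each node accumulates one share per record
    node_totals = [0] * n_nodes
    for value in records:
        for node, s in enumerate(share(value, n_nodes)):
            node_totals[node] = (node_totals[node] + s) % PRIME
    # only the *sum* is revealed when the nodes combine their totals
    assert reconstruct(node_totals) == sum(records)
    print(reconstruct(node_totals))  # 158
```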

    The Small Worlds of Wikipedia: Implications for Growth, Quality and Sustainability of Collaborative Knowledge Networks

    This work is a longitudinal network analysis of the interaction networks of Wikipedia, a free, user-led, collaboratively generated online encyclopedia. Making a case for representing Wikipedia as a knowledge network, and using the lens of contemporary graph theory, we attempt to unravel its knowledge creation process and growth dynamics over time. Typical small-world characteristics of short path length and high clustering have important theoretical implications for knowledge networks. We show Wikipedia's small-world nature to be increasing over time, while also uncovering power laws and assortative mixing. Investigating the process by which an apparently uncoordinated, diversely motivated swarm of assorted contributors creates and maintains remarkably high-quality content, we find an association between quality and structural holes. We find that a few key high-degree, cluster-spanning nodes ('hubs') hold the growing network together, and we discuss the implications for the network's growth and emergent quality.
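
    The small-world measures discussed above (path length, clustering, assortative mixing) can be computed along the following lines; the sketch uses networkx on a synthetic Watts-Strogatz graph as a stand-in for the Wikipedia interaction network, which is not reproduced here.

```python
# Sketch of the network measures discussed above, computed with networkx on a
# synthetic graph standing in for the Wikipedia interaction network.
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=1000, k=10, p=0.1, seed=42)  # toy small-world graph

avg_path = nx.average_shortest_path_length(G)
clustering = nx.average_clustering(G)
assortativity = nx.degree_assortativity_coefficient(G)

print(f"average shortest path length: {avg_path:.2f}")
print(f"average clustering coefficient: {clustering:.3f}")
print(f"degree assortativity: {assortativity:.3f}")

# a graph is 'small-world' when clustering is high and path length short
# relative to a random graph of the same size and density
R = nx.gnm_random_graph(n=1000, m=G.number_of_edges(), seed=42)
print(f"random-graph clustering: {nx.average_clustering(R):.3f}")
```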

    A Hybrid Simulation Methodology To Evaluate Network Centric Decision Making Under Extreme Events

    Network-centric operations and network-centric warfare have generated a new area of research focused on determining how hierarchical organizations composed of human beings and machines make decisions in collaborative environments. One of the most stressful scenarios for these organizations is the so-called extreme event. This dissertation provides a hybrid simulation methodology, based on classical simulation paradigms combined with social network analysis, for evaluating and improving organizational structures and procedures, mainly incident command systems and plans for facing such extreme events. Accordingly, we provide a methodology for generating hypotheses and then testing organizational procedures either in real training systems or in simulation models with validated data. Because the organization changes its dyadic relationships dynamically over time, we propose to capture the longitudinal digraph over time and analyze it by means of its adjacency matrix. Thus, using an object-oriented approach, three domains are proposed for better understanding the performance and the surrounding environment of an emergency management organization. System dynamics is used to model the critical infrastructure linked to the warning alerts of a given organization at the federal, state, and local levels. Discrete simulation based on the defined concept of community of state enables us to control the complete model. Discrete-event simulation allows us to create entities that represent the data and resource flows within the organization. We propose that cognitive models may be well suited to our methodology; for instance, we show how team performance decays over time, according to the Yerkes-Dodson curve, affecting the measures of performance of the whole organizational system. Accordingly, we suggest that the hybrid model could be applied to other types of organizations, such as military peacekeeping operations and joint task forces. Along with providing insight about organizations, the methodology supports the analysis of the after-action review (AAR), based on data collected from command and control systems or the so-called training scenarios. Furthermore, a rich set of mathematical measures arises from the hybrid models, such as triad census, dyad census, eigenvalues, utilization, and feedback loops, which provides a strong foundation for studying an emergency management organization. Future research will be necessary to analyze real data and validate the proposed methodology.
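
    Two of the ingredients mentioned above, adjacency-matrix snapshots of the longitudinal digraph and a Yerkes-Dodson-style performance curve, can be sketched as follows; the matrix values and curve parameters are illustrative assumptions rather than data from the dissertation.

```python
# Sketch: adjacency-matrix snapshots of a communication digraph over time,
# plus an inverted-U (Yerkes-Dodson-style) performance curve.
# All values and parameters are illustrative assumptions.
import numpy as np

# A_t[i, j] = 1 if unit i sent a message to unit j during time window t
snapshots = [
    np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]]),  # t = 0
    np.array([[0, 1, 1], [1, 0, 1], [0, 0, 0]]),  # t = 1
]

for t, A in enumerate(snapshots):
    eigenvalues = np.linalg.eigvals(A)
    mutual_dyads = int(np.sum(np.logical_and(A, A.T)) // 2)  # reciprocated ties
    print(f"t={t}: largest |eigenvalue|={max(abs(eigenvalues)):.2f}, "
          f"mutual dyads={mutual_dyads}")

def yerkes_dodson(arousal: np.ndarray, peak: float = 0.5, width: float = 0.2) -> np.ndarray:
    """Inverted-U performance curve: best at moderate arousal, degraded at extremes."""
    return np.exp(-((arousal - peak) ** 2) / (2 * width ** 2))

stress = np.linspace(0.0, 1.0, 5)  # 0 = calm, 1 = extreme event
print(np.round(yerkes_dodson(stress), 2))
```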

    The business model: Theoretical roots, recent developments, and future research

    The paper provides a broad and multifaceted review of the received literature on business models, in which we attempt to explore the origin of the construct and to examine the business model concept through multiple disciplinary and subject-matter lenses. The review reveals that scholars do not agree on what a business model is, and that the literature is developing largely in silos, according to the phenomena of interest to the respective researchers. However, we also found some emerging common ground among students of business models. Specifically, i) the business model is emerging as a new unit of analysis; ii) business models emphasize a system-level, holistic approach towards explaining how firms do business; iii) organizational activities play an important role in the various conceptualizations of business models that have been proposed; and iv) business models seek not only to explain the ways in which value is captured but also how it is created. These emerging themes could serve as important catalysts towards a more unified study of business models.

    Keywords: business model; strategy; technology management; innovation; literature review

    Interoperability and the Need for Intelligent Software: A Historical Perspective

    With the objective of defining the interoperability theme of this year's conference, this paper traces the evolution of intelligent software from data-centric applications that essentially encapsulate their data environment to ontology-based applications with automated reasoning capabilities. The author draws a distinction between human intelligence and component capabilities within a more general definition of intelligence: a kind of intelligence that can be embedded in computer software. The primary vehicle in the quest for intelligent software has been the gradual recognition of the central role played by data and information, rather than the logic and functionality of the application. The three milestones in this evolution have been: the separation of data management from the internal domain of the application; the development of standard data exchange protocols such as XML that allow machine-interpretable structure and meaning to be added to data exchange packages; and the ability to build information models that are rich in relationships and are thereby capable of supporting the automated reasoning capabilities of software agents. The author suggests that the vision of a Semantic Web environment, in which ontology-based Web services with intelligent capabilities are able to discover each other and, individually or in self-configured groups, perform useful tasks, is not only feasible but imminently realizable. The capabilities of an experimental proof-of-concept system featuring semantic Web services, demonstrated during the 2002 meeting of this annual conference series, are described in summary form.

    Analysis Of Aircraft Arrival Delay And Airport On-time Performance

    While existing grid environments cater to the specific needs of a particular user community, we need to go beyond them and consider general-purpose, large-scale distributed systems consisting of large collections of heterogeneous computers and communication systems shared by a large user population with very diverse requirements. Coordination, matchmaking, and resource allocation are among the essential functions of large-scale distributed systems. Although deterministic approaches to coordination, matchmaking, and resource allocation have been well studied, they are not suitable for large-scale distributed systems because of the scale, autonomy, and dynamics of these systems; nondeterministic solutions are needed instead. In this dissertation we describe our work on a coordination service, a matchmaking service, and a macro-economic resource allocation model for large-scale distributed systems. The coordination service coordinates the execution of complex tasks in a dynamic environment, the matchmaking service supports finding the appropriate resources for users, and the macro-economic resource allocation model allows a broker to mediate between resource providers, who want to maximize their revenues, and resource consumers, who want to get the best resources at the lowest possible price, subject to global objectives such as maximizing the resource utilization of the system.
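
    A minimal sketch of the broker-mediated allocation idea follows, assuming a simple greedy, double-auction-style clearing in which providers post asking prices and consumers post bids; the names, prices, and surplus-splitting rule are illustrative assumptions, not the dissertation's model.

```python
# Toy sketch of broker-mediated allocation: providers post asking prices,
# consumers post bids, and the broker greedily matches the highest bid with
# the cheapest compatible ask. Names and prices are illustrative.
from dataclasses import dataclass

@dataclass
class Ask:
    provider: str
    price: float   # minimum acceptable price per resource unit

@dataclass
class Bid:
    consumer: str
    price: float   # maximum price the consumer will pay

def broker_match(asks: list[Ask], bids: list[Bid]) -> list[tuple[str, str, float]]:
    """Greedy clearing: match highest bids to cheapest asks while bid >= ask."""
    asks = sorted(asks, key=lambda a: a.price)
    bids = sorted(bids, key=lambda b: b.price, reverse=True)
    matches = []
    for ask, bid in zip(asks, bids):
        if bid.price >= ask.price:
            clearing = (ask.price + bid.price) / 2  # split the surplus evenly
            matches.append((bid.consumer, ask.provider, clearing))
    return matches

if __name__ == "__main__":
    asks = [Ask("node_A", 2.0), Ask("node_B", 3.5)]
    bids = [Bid("job_1", 4.0), Bid("job_2", 2.5)]
    print(broker_match(asks, bids))
    # [('job_1', 'node_A', 3.0)]  -- job_2's bid (2.5) is below node_B's ask (3.5)
```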