
    The structure of problem-solving knowledge and the structure of organisations

    This work presents a model of organisational problem solving able to account for the relationships between problem complexity, task decentralization and problem-solving efficiency. Whenever problem solving requires the coordination of a multiplicity of interdependent elements, the varying degrees of decentralization of cognitive and operational tasks shape the solutions which can be generated, tested and selected. Suboptimality and path-dependence are shown to be ubiquitous features of organisational problem solving. At the same time, the model allows a precise exploration of the possible trade-offs between decomposition patterns and search efficiency involved in different organisational architectures.
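The decomposition/search trade-off described above can be illustrated with a toy rugged-landscape simulation. The sketch below is not the paper's model; it is a minimal NK-style illustration (all parameters and the block partition are invented for the example) of how assigning interdependent decision elements to separate sub-units changes what local search can reach.

```python
import random

random.seed(0)

N = 8          # number of interdependent decision elements
K = 3          # each element's payoff depends on K other elements

# Toy NK-style landscape: random payoff contribution per element,
# conditioned on the states of the elements it depends on.
deps = {i: random.sample([j for j in range(N) if j != i], K) for i in range(N)}
table = {}

def contribution(i, state):
    key = (i, state[i]) + tuple(state[j] for j in deps[i])
    if key not in table:
        table[key] = random.random()   # memoized, so the landscape is fixed
    return table[key]

def fitness(state):
    return sum(contribution(i, state) for i in range(N)) / N

def local_search(state, blocks, steps=200):
    """Hill-climb where each move flips one bit inside one block,
    i.e. a given decomposition of tasks among sub-units."""
    state = list(state)
    for _ in range(steps):
        i = random.choice(random.choice(blocks))
        candidate = list(state)
        candidate[i] ^= 1
        if fitness(candidate) >= fitness(state):
            state = candidate
    return state

start = [random.randint(0, 1) for _ in range(N)]
centralized = local_search(start, [list(range(N))])          # one unit owns all tasks
decentralized = local_search(start, [[0, 1, 2, 3], [4, 5, 6, 7]])  # two sub-units
print(fitness(centralized), fitness(decentralized))
```

Because interdependencies (the `deps` links) cut across the two blocks, the decentralized search can get stuck on configurations the centralized search would escape, which is the path-dependence the abstract refers to.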

    Behavioural Economics: Classical and Modern

    In this paper, the origins and development of behavioural economics, beginning with the pioneering works of Herbert Simon (1953) and Ward Edwards (1954), are traced, described and (critically) discussed in some detail. Two kinds of behavioural economics – classical and modern – are attributed, respectively, to the two pioneers. The mathematical foundations of classical behavioural economics are identified, largely, with the theory of computation and computational complexity; the corresponding mathematical basis for modern behavioural economics is, on the other hand, claimed to be a notion of subjective probability (at least at its origins in the works of Ward Edwards). The economic theories of behaviour, challenging various aspects of 'orthodox' theory, were decisively influenced by these two mathematical underpinnings.
    Keywords: Classical Behavioural Economics; Modern Behavioural Economics; Subjective Probability; Model of Computation; Computational Complexity; Subjective Expected Utility
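The subjective-probability basis attributed here to modern behavioural economics can be made concrete with a toy calculation. The sketch below is illustrative only (the names and numbers are not from the paper): it computes subjective expected utility in the spirit of Edwards, weighting outcome utilities by the agent's subjective beliefs rather than objective odds.

```python
def seu(subjective_probs, utilities):
    """Subjective expected utility: sum of utility times subjective probability."""
    assert abs(sum(subjective_probs) - 1.0) < 1e-9   # beliefs must be a distribution
    return sum(p * u for p, u in zip(subjective_probs, utilities))

# Two gambles over the same three outcomes; the agent's beliefs,
# not the true odds, determine the ranking.
utilities = [0.0, 50.0, 100.0]
gamble_a = seu([0.2, 0.5, 0.3], utilities)
gamble_b = seu([0.5, 0.3, 0.2], utilities)
print(gamble_a, gamble_b)
```

Two agents facing identical gambles can rank them differently simply because their subjective probability assignments differ, which is the departure from 'orthodox' objective expected utility.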

    Modularity and Architecture of PLC-based Software for Automated Production Systems: An analysis in industrial companies

    Adaptive and flexible production systems require modular and reusable software, especially considering their long-term life cycles of up to 50 years. SWMAT4aPS, an approach to measure software maturity for automated production systems (aPS), is introduced. The approach identifies weaknesses and strengths of various companies' solutions for modularity of software in the design of aPS. First, a self-assessed questionnaire is used to evaluate a large number of companies concerning their software maturity. Second, we analyze PLC code, architectural levels, workflows and the ability to configure code automatically from engineering information in four selected companies. In this paper, the questionnaire results from 16 world-leading German companies in machine and plant manufacturing, together with four case studies validating the results of the detailed analyses, are presented to prove the applicability of the approach and to give a survey of the state of the art in industry.

    Perspectives on the Neuroscience of Cognition and Consciousness

    The origin and current use of the concepts of computation, representation and information in Neuroscience are examined, and conceptual flaws are identified which vitiate their usefulness for addressing problems of the neural basis of Cognition and Consciousness. In contrast, a convergence of views is presented to support the characterization of the Nervous System as a complex dynamical system operating in the metastable regime, capable of evolving to configurations and transitions in phase space with potential relevance for Cognition and Consciousness.

    Moving from a "human-as-problem" to a "human-as-solution" cybersecurity mindset

    Cybersecurity has gained prominence, with a number of widely publicised security incidents, hacking attacks and data breaches reaching the news over the last few years. The escalation in the number of cyber incidents shows no sign of abating, and it seems appropriate to take a look at the way cybersecurity is conceptualised and to consider whether there is a need for a mindset change. To consider this question, we applied a "problematization" approach to assess current conceptualisations of the cybersecurity problem by government, industry and hackers. Our analysis revealed that individual human actors, in a variety of roles, are generally considered to be "a problem". We also discovered that deployed solutions primarily focus on preventing adverse events by building resistance: i.e. implementing new security layers and policies that control humans and constrain their problematic behaviours. In essence, this treats all humans in the system as if they might well be malicious actors, and the solutions are designed to prevent their ill-advised behaviours. Given the continuing incidence of data breaches and successful hacks, it seems wise to rethink the status quo approach, which we refer to as "Cybersecurity, Currently". In particular, we suggest that there is a need to reconsider the core assumptions and characterisations of the well-intentioned human's role in the cybersecurity socio-technical system. Treating everyone as a problem does not seem to work, given the current cybersecurity landscape. Benefiting from research in other fields, we propose a new mindset, i.e. "Cybersecurity, Differently". This approach rests on recognition of the fact that the problem is actually the high complexity, interconnectedness and emergent qualities of socio-technical systems.
The "differently" mindset acknowledges the well-intentioned human's ability to be an important contributor to organisational cybersecurity, as well as their potential to be "part of the solution" rather than "the problem". In essence, this new approach initially treats all humans in the system as if they are well-intentioned. The focus is on enhancing factors that contribute to positive outcomes and resilience. We conclude by proposing a set of key principles and, with the help of a prototypical fictional organisation, consider how this mindset could enhance and improve cybersecurity across the socio-technical system.

    Input Secrecy & Output Privacy: Efficient Secure Computation of Differential Privacy Mechanisms

    Data is the driving force of modern businesses. For example, customer-generated data is collected by companies to improve their products, discover emerging trends, and provide insights to marketers. However, data might contain personal information that makes it possible to identify a person and violate their privacy. Examples of privacy violations are abundant – such as revealing typical whereabouts and habits, financial status, or health information, either directly or indirectly by linking the data to other available data sources. To protect personal data and regulate its collection and processing, the General Data Protection Regulation (GDPR) was adopted by all members of the European Union. Anonymization addresses such regulations and alleviates privacy concerns by altering personal data to hinder identification. Differential privacy (DP), a rigorous privacy notion for anonymization mechanisms, is widely deployed in industry, e.g., by Google, Apple, and Microsoft. Additionally, cryptographic tools, namely, secure multi-party computation (MPC), protect the data during processing. MPC allows distributed parties to jointly compute a function over their data such that only the function output is revealed but none of the input data. MPC and DP provide orthogonal protection guarantees. MPC provides input secrecy, i.e., MPC protects the inputs of a computation via encrypted processing. DP provides output privacy, i.e., DP anonymizes the output of a computation via randomization. In typical deployments of DP the data is randomized locally, i.e., by each client, and aggregated centrally by a server. MPC allows the randomization to be applied centrally as well, i.e., only once, which is optimal for accuracy. Overall, MPC and DP augment each other nicely. However, universal MPC is inefficient – requiring large computation and communication overhead – which makes MPC of DP mechanisms challenging for general real-world deployments.
In this thesis, we present efficient MPC protocols for distributed parties to collaboratively compute DP statistics with high accuracy. We support general rank-based statistics, e.g., min, max, median, as well as decomposable aggregate functions, where local evaluations can be efficiently combined into global ones, e.g., for convex optimizations. Furthermore, we detect heavy hitters, i.e., most frequently appearing values, over known as well as unknown data domains. We prove the semi-honest security and differential privacy of our protocols. Also, we theoretically analyse and empirically evaluate their accuracy as well as efficiency. Our protocols provide higher accuracy than comparable solutions based on DP alone. Our protocols are efficient, with running times of seconds to minutes evaluated in real-world WANs between Frankfurt and Ohio (100 ms delay, 100 Mbit/s bandwidth), and have modest hardware requirements compared to related work (mainly, 4 CPU cores at 3.3 GHz and 2 GB RAM per party). Additionally, our protocols can be outsourced, i.e., clients can send encrypted inputs to a few servers which run the MPC protocol on their behalf.
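The accuracy gap between local and central randomization mentioned above can be illustrated with a toy Laplace-mechanism simulation. The sketch below is not one of the thesis's protocols: a trusted aggregator stands in for the MPC computation, and the privacy budget and data are invented for the example. It contrasts per-client noise (local DP) with a single noise draw on the aggregate (central DP, which MPC enables without trusting the server).

```python
import math
import random

random.seed(1)

def laplace_noise(scale):
    """Sample Laplace(0, scale) via inverse-CDF transform of a uniform draw."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

eps = 1.0                                             # privacy budget epsilon
data = [random.randint(0, 1) for _ in range(10_000)]  # one private bit per client
true_sum = sum(data)

# Local DP: every client randomizes before sending, so the sum
# accumulates 10,000 independent noise terms.
local = sum(x + laplace_noise(1.0 / eps) for x in data)

# Central DP: noise is added once to the aggregate -- a single draw.
central = true_sum + laplace_noise(1.0 / eps)

print(abs(local - true_sum), abs(central - true_sum))
```

With n clients, the local estimate carries n independent noise terms (error growing with the square root of n), while the central estimate carries exactly one, which is why applying the randomization only once is optimal for accuracy.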

    Kernel-based Inference of Functions over Graphs

    The study of networks has witnessed explosive growth over the past decades, with several ground-breaking methods introduced. A particularly interesting problem – and one prevalent in several fields of study – is that of inferring a function defined over the nodes of a network. This work presents a versatile kernel-based framework for tackling this inference problem that naturally subsumes and generalizes the reconstruction approaches put forth recently by the signal processing on graphs community. Both the static and the dynamic settings are considered, along with effective modeling approaches for addressing real-world problems. The analytical discussion herein is complemented by a set of numerical examples, which showcase the effectiveness of the presented techniques, as well as their merits relative to state-of-the-art methods.
    Comment: To be published as a chapter in 'Adaptive Learning Methods for Nonlinear System Modeling', Elsevier Publishing, Eds. D. Comminiello and J.C. Principe (2018). This chapter surveys recent work on kernel-based inference of functions over graphs, including arXiv:1612.03615, arXiv:1605.07174 and arXiv:1711.0930
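A concrete instance of the static setting can be sketched as follows. The example below is illustrative only (the graph, the regularized-Laplacian kernel choice, and the constants `c` and `mu` are assumptions, not taken from the chapter, which covers a broader family of graph kernels plus the dynamic case): it reconstructs a function on unobserved nodes via kernel ridge regression.

```python
import numpy as np

# Toy graph: two 4-node communities (nodes 0-3 and 4-7) joined by edge 3-4.
n = 8
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4), (4, 5), (4, 6), (5, 7), (6, 7)]
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian

# Kernel built from the Laplacian: K = (L + c I)^{-1} favours functions
# that vary little across edges (a "regularized Laplacian" kernel).
c = 0.1
K = np.linalg.inv(L + c * np.eye(n))

# Signal observed only on a subset S of nodes: +1 on one community, -1 on the other.
S = [0, 1, 6, 7]
y = np.array([1.0, 1.0, -1.0, -1.0])

# Kernel ridge regression: alpha = (K_SS + mu I)^{-1} y,  f_hat = K[:, S] @ alpha.
mu = 1e-3
alpha = np.linalg.solve(K[np.ix_(S, S)] + mu * np.eye(len(S)), y)
f_hat = K[:, S] @ alpha

print(np.round(f_hat, 2))   # unobserved nodes 2-5 inherit their community's sign
```

The smoothness encoded in the kernel propagates the observed values along edges, so the unobserved nodes are assigned values consistent with their community, which is the essence of reconstructing graph functions from partial samples.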