413,603 research outputs found
Symbiotic Learning Systems: Reorganizing and Integrating Learning Efforts and Responsibilities Between Higher Educational Institutions (HEIs) and Work Places
This article presents the idea of “symbiotic learning systems” as a possible strategy for dealing with the institutional knowledge and learning challenges posed by an emerging transition from “socially monopolized” to “socially distributed” knowledge generation and distribution. As knowledge production and learning are increasingly relocated from segregated, specialized institutions for research and education and socially distributed to and within “ordinary” working life, corresponding changes are required in the basic institutionalized relationships between research, higher education, and practical knowledge application. The concept of “symbiotic learning” addresses these problems by deconstructing age-old divisions between vocational and liberal education. In order to build foundations for a changed and improved relationship between advanced organizations in working life and institutions of higher education and research (HEIs), the general preconditions for learning in the workplaces themselves need to be addressed. In modeling these general preconditions for learning, and even in transcending the division of labor between manual and intellectual work, inspiration is found in the philosophy of Plato and Aristotle, and in their search for intellectual “commons” (tà koiná) as constituting public spheres and community among individuals.
Knowledge management for self-organised resource allocation
Many open systems, such as networks, distributed computing, and socio-technical systems, face a common problem: how to define knowledge management processes that structure and guide decision-making, coordination, and learning. While participation is an essential and desirable feature of such systems, the amount of information produced by their individual agents can often be overwhelming and intractable. The challenge, then, is how to organise and process such information so that it is transformed into productive knowledge used for the resolution of collective action problems.
To address this problem, we consider a study of classical Athenian democracy which investigates how the governance model of the city-state flourished. That work suggests that exceptional knowledge management, i.e. making information available for socially productive purposes, played a crucial role in sustaining its democracy for nearly 200 years, by creating processes for the aggregation, alignment, and codification of knowledge. We therefore examine the proposition that some properties of this historical experience can be generalised and applied to computational systems, and we establish a set of design principles intended to make knowledge management processes open, inclusive, transparent, and effective in self-governed socio-technical systems. We operationalise three of these principles in the context of a collective action situation, namely self-organised common-pool resource allocation, exploring four governance problems: (a) how fairness can be perceived; (b) how resources can be distributed; (c) how policies should be enforced; and (d) how tyranny can be opposed.
By applying this operationalisation of the design principles for knowledge management processes as a complement to institutional approaches to governance, we demonstrate empirically how it can guide solutions that satisfice shared values, distribute power fairly, apply "common sense" in dealing with rule violations, and protect agents against abuse of power. We conclude by arguing that this approach to the design of open systems can provide the foundations for sustainable and democratic self-governance in socio-technical systems.
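The abstract's four governance problems invite a concrete toy model. The sketch below is purely illustrative and is not the authors' actual formalisation: a hypothetical proportional allocation rule splits a scarce common pool among agents' demands, and a Gini coefficient stands in as one possible way to operationalise "perceived fairness". All function names are invented for this example.

```python
# Illustrative sketch only (not the paper's model): agents share a finite
# common pool, an allocation rule distributes it, and the Gini coefficient
# serves as one candidate measure of how unequal the outcome is.

def allocate_proportional(pool, demands):
    """Split a finite pool in proportion to each agent's demand."""
    total = sum(demands)
    if total <= pool:            # pool suffices: satisfy every demand fully
        return list(demands)
    return [pool * d / total for d in demands]

def gini(shares):
    """Gini coefficient: 0 = perfectly equal, approaching 1 = maximally unequal."""
    n = len(shares)
    mean = sum(shares) / n
    if mean == 0:
        return 0.0
    diff_sum = sum(abs(x - y) for x in shares for y in shares)
    return diff_sum / (2 * n * n * mean)

demands = [10, 40, 50, 100]          # total demand 200 exceeds the pool of 60
shares = allocate_proportional(60, demands)
print(shares)                        # [3.0, 12.0, 15.0, 30.0]
print(round(gini(shares), 3))        # 0.35
```

Other allocation rules (e.g. a Rawlsian rule that prioritises the worst-off agent) would trade off differently against the same fairness measure, which is exactly the kind of policy choice the governance problems (a) and (b) concern.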
Toward Domain-Specific Solvers for Distributed Consistency
To guard against machine failures, modern internet services store multiple replicas of the same application data within and across data centers, which introduces the problem of keeping geo-distributed replicas consistent with one another in the face of network partitions and unpredictable message latency. To avoid costly and conservative synchronization protocols, many real-world systems provide only weak consistency guarantees (e.g., eventual, causal, or PRAM consistency), which permit certain kinds of disagreement among replicas.
There has been much recent interest in language support for specifying and verifying such consistency properties. Although these properties are usually beyond the scope of what traditional type checkers or compiler analyses can guarantee, solver-aided languages are up to the task. Inspired by systems like Liquid Haskell [Vazou et al., 2014] and Rosette [Torlak and Bodik, 2014], we believe that close integration between a language and a solver is the right path to consistent-by-construction distributed applications. Unfortunately, verifying distributed consistency properties requires reasoning about transitive relations (e.g., causality or happens-before), partial orders (e.g., the lattice of replica states under a convergent merge operation), and properties relevant to message processing or API invocation (e.g., commutativity and idempotence) that cannot be easily or efficiently carried out by general-purpose SMT solvers that lack native support for this kind of reasoning.
We argue that domain-specific SMT-based tools that exploit the mathematical foundations of distributed consistency would enable both more efficient verification and improved ease of use for domain experts. The principle of exploiting domain knowledge for efficiency and expressivity has borne fruit elsewhere, such as in high-performance domain-specific languages that trade off generality to gain both performance and productivity, and it applies here as well. Languages augmented with domain-specific, consistency-aware solvers would support the rapid implementation of formally verified programming abstractions that guarantee distributed consistency. In the long run, we aim to democratize the development of such domain-specific solvers by creating a framework for domain-specific solver development that brings new theory solver implementation within the reach of programmers who are not necessarily experts in SMT solver internals.
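The algebraic properties the abstract names (commutativity, idempotence, and partial orders under a convergent merge) can be illustrated concretely. The sketch below is not the proposed solver; it merely property-tests, over a tiny finite domain, that a grow-only-set merge (set union) satisfies the join-semilattice laws under which replicas of a state-based convergent data type are guaranteed to agree once they have exchanged states. A real solver-aided tool would prove these laws symbolically for all states rather than enumerate a sample.

```python
from itertools import product

# Illustrative sketch: exhaustively check, on a small finite domain, the
# algebraic laws a convergent replica merge must satisfy. For a grow-only
# set, merge is plain set union.

def merge(a, b):
    """Merge two replica states; for a grow-only set this is union."""
    return a | b

# All replica states over the element universe {1, 2}.
states = [frozenset(), frozenset({1}), frozenset({2}), frozenset({1, 2})]

commutative = all(merge(a, b) == merge(b, a)
                  for a, b in product(states, repeat=2))
idempotent  = all(merge(a, a) == a for a in states)
associative = all(merge(merge(a, b), c) == merge(a, merge(b, c))
                  for a, b, c in product(states, repeat=3))

print(commutative, idempotent, associative)  # True True True
```

Together these three laws make `merge` a join-semilattice join, which is precisely the structure the abstract points to when it mentions "the lattice of replica states under a convergent merge operation".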
Curriculum Guidelines for Undergraduate Programs in Data Science
The Park City Math Institute (PCMI) 2016 Summer Undergraduate Faculty Program met for the purpose of composing guidelines for undergraduate programs in Data Science. The group consisted of 25 undergraduate faculty from a variety of institutions in the U.S., primarily from the disciplines of mathematics, statistics, and computer science. These guidelines are meant to provide some structure for institutions planning for or revising a major in Data Science.
Dimensions of Neural-symbolic Integration - A Structured Survey
Research on integrated neural-symbolic systems has made significant progress in the recent past. In particular, the understanding of how to deal with symbolic knowledge within connectionist systems (also called artificial neural networks) has reached a critical mass which enables the community to strive for applicable implementations and use cases. Recent work has covered a great variety of logics used in artificial intelligence and provides a multitude of techniques for dealing with them within the context of artificial neural networks. We present a comprehensive survey of the field of neural-symbolic integration, including a new classification of systems according to their architectures and abilities.
Computer Science and Game Theory: A Brief Survey
There has been a remarkable increase in work at the interface of computer science and game theory in the past decade. In this article I survey some of the main themes of work in the area, with a focus on the work in computer science. Given the length constraints, I make no attempt at being comprehensive, especially since other surveys are also available, and a comprehensive survey book will appear shortly. (To appear in the Palgrave Dictionary of Economics.)