Assessing Code Authorship: The Case of the Linux Kernel
Code authorship is key information in large-scale open source systems.
Among other uses, it allows maintainers to assess the division of work and
identify key collaborators. Interestingly, open-source communities lack
guidelines on how to manage authorship. This gap could be mitigated by setting
out to build an empirical body of knowledge on how authorship-related measures
evolve in successful open-source communities. Toward that direction, we perform
a case study on the Linux kernel. Our results show that: (a) only a small
portion of developers (26%) makes significant contributions to the code base;
(b) the distribution of the number of files per author is highly skewed: a
small group of top authors (3%) is responsible for hundreds of files, while
most authors (75%) are responsible for at most 11 files; (c) most authors (62%)
have a specialist profile; (d) authors with a high number of co-authorship
connections tend to collaborate with others with fewer connections.
Comment: Accepted at the 13th International Conference on Open Source Systems
(OSS). 12 pages
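The files-per-author measure underlying findings (b) above can be sketched as follows. This is an illustration only, not the paper's actual methodology: `files_per_author` and `top_author_share` are hypothetical helpers operating on (author, filename) pairs, such as might be parsed from `git log --name-only` output.

```python
from collections import defaultdict

def files_per_author(changes):
    """Map each author to the number of distinct files they touched.
    `changes` is an iterable of (author, filename) pairs, e.g. parsed
    from version-control history. Hypothetical helper for illustration."""
    touched = defaultdict(set)
    for author, filename in changes:
        touched[author].add(filename)
    return {author: len(files) for author, files in touched.items()}

def top_author_share(counts, top_fraction=0.03):
    """Fraction of all file-authorship attributable to the top
    `top_fraction` of authors, ranked by files touched. A skewed
    distribution yields a large share for a small top fraction."""
    ranked = sorted(counts.values(), reverse=True)
    k = max(1, round(len(ranked) * top_fraction))
    return sum(ranked[:k]) / sum(ranked)
```

On a highly skewed distribution such as the one the paper reports, `top_author_share` would return a disproportionately large value for a small `top_fraction`.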
Enabling decentral collaborative innovation processes - a web based real time collaboration platform
The main goal of this paper is to define a collaborative innovation process
as well as a supporting tool. It is motivated by the increasing competition in
global markets and the resulting spread of decentralized projects with a high
demand for innovative collaboration in global contexts. It is based on a
project carried out by the author group. A detailed literature review and the
action design research methodology of the project led to an enhanced process
model for decentralized collaborative innovation processes and a basic
realization of a browser-based real-time tool to enable these processes. An
initial evaluation in a practical distributed setting has shown that the
created tool is a useful way to support collaborative innovation processes.
Comment: Multikonferenz Wirtschaftsinformatik
Distributed Wikis: A Survey
"Distributed wiki" is a generic term covering various systems, including "peer-to-peer wiki," "mobile wiki," "offline wiki," "federated wiki" and others. Distributed wikis distribute their pages among the sites of autonomous participants to address various motivations, including high availability of data, new collaboration models and different viewpoints on subjects. Although existing systems share some common basic concepts, it is often difficult to understand the specificity of each one, the underlying complexities or the best context in which to use it. In this paper, we define, classify and characterize distributed wikis. We identify three classes of distributed wiki systems, each using a different collaboration model and distribution scheme for its pages: highly available wikis, decentralized social wikis and federated wikis. We classify existing distributed wikis according to these classes. We detail their underlying complexities and social and technical motivations. We also highlight some directions for research and opportunities for new systems with original social and technical motivations.
MONDO: Scalable modelling and model management on the Cloud
Achieving scalability in modelling and MDE involves being able to construct large models and domain-specific languages in a systematic manner, enabling teams of modellers to construct and refine large models collaboratively, advancing the state of the art in model querying and transformation tools so that they can cope with large models (on the scale of millions of model elements), and providing an infrastructure for the efficient storage, indexing and retrieval of large models. This paper outlines how MONDO, a collaborative EC-funded project, has contributed to tackling some of these scalability-related challenges.
WFIRST Coronagraph Technology Requirements: Status Update and Systems Engineering Approach
The coronagraphic instrument (CGI) on the Wide-Field Infrared Survey
Telescope (WFIRST) will demonstrate technologies and methods for high-contrast
direct imaging and spectroscopy of exoplanet systems in reflected light,
including polarimetry of circumstellar disks. The WFIRST management and CGI
engineering and science investigation teams have developed requirements for the
instrument, motivated by the objectives and technology development needs of
potential future flagship exoplanet characterization missions such as the NASA
Habitable Exoplanet Imaging Mission (HabEx) and the Large UV/Optical/IR
Surveyor (LUVOIR). The requirements have been refined to support
recommendations from the WFIRST Independent External Technical/Management/Cost
Review (WIETR) that the WFIRST CGI be classified as a technology demonstration
instrument instead of a science instrument. This paper provides a description
of how the CGI requirements flow from the top of the overall WFIRST mission
structure through the Level 2 requirements, where the focus here is on
capturing the detailed context and rationales for the CGI Level 2 requirements.
The WFIRST requirements flow starts with the top Program Level Requirements
Appendix (PLRA), which contains both high-level mission objectives as well as
the CGI-specific baseline technical and data requirements (BTR and BDR,
respectively)... We also present the process and collaborative tools used in
the L2 requirements development and management, including the collection and
organization of science inputs, an open-source approach to managing the
requirements database, and automating documentation. The tools created for the
CGI L2 requirements have the potential to improve the design and planning of
other projects, streamlining requirements management and maintenance. [Abstract
Abbreviated]
Comment: 16 pages, 4 figures
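The abstract above mentions an open-source approach to managing the requirements database together with automated documentation. A minimal sketch of what such a pipeline might look like follows; the record fields, requirement IDs and rendering format are all illustrative assumptions, not the actual WFIRST CGI tooling.

```python
# Hypothetical sketch: requirements kept as structured records under
# version control, with a document section generated automatically.
# IDs, field names and output layout are assumptions for illustration.

REQUIREMENTS = [
    {"id": "CGI-L2-001", "title": "High-contrast direct imaging",
     "rationale": "Demonstrates coronagraphic imaging in reflected light."},
    {"id": "CGI-L2-002", "title": "Polarimetry of circumstellar disks",
     "rationale": "Supports reflected-light disk observations."},
]

def render_requirements(reqs):
    """Render requirement records as a plain-text document section,
    one entry per requirement with its rationale indented beneath it."""
    lines = []
    for req in reqs:
        lines.append(f"{req['id']}: {req['title']}")
        lines.append(f"  Rationale: {req['rationale']}")
    return "\n".join(lines)
```

Keeping requirements as data rather than prose makes the rationale for each entry diffable and reviewable, and lets the documentation be regenerated whenever the database changes.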