Assessing Code Authorship: The Case of the Linux Kernel
Code authorship is key information in large-scale open source systems.
Among other uses, it allows maintainers to assess the division of work and
identify key collaborators. Interestingly, open-source communities lack
guidelines on how to manage authorship. This could be mitigated by setting
out to build an empirical body of knowledge on how authorship-related
measures evolve in successful open-source communities. Towards that
direction, we perform a case study on the Linux kernel. Our results show
that: (a) only a small portion of developers (26%) makes significant
contributions to the code base; (b) the distribution of the number of files
per author is highly skewed: a small group of top authors (3%) is
responsible for hundreds of files, while most authors (75%) are responsible
for at most 11 files; (c) most authors (62%) have a specialist profile; (d)
authors with a high number of co-authorship connections tend to collaborate
with others with fewer connections.
Comment: Accepted at the 13th International Conference on Open Source Systems (OSS). 12 pages.
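The files-per-author measure described above can be approximated from a repository's commit history. A minimal sketch of the idea, using toy commit records rather than real `git log` output, and treating "touched a file" as a simple proxy for authorship (an assumption for illustration, not the paper's exact measure):

```python
from collections import defaultdict

def files_per_author(commits):
    """Map each author to the number of distinct files they have touched.

    `commits` is an iterable of (author, files) pairs; touching a file
    is used here as a simple proxy for authorship of that file.
    """
    touched = defaultdict(set)
    for author, files in commits:
        touched[author].update(files)
    return {a: len(fs) for a, fs in touched.items()}

# Toy history: a few commits as (author, [files]) pairs.
history = [
    ("alice", ["mm/slab.c", "mm/slub.c", "mm/page_alloc.c"]),
    ("alice", ["mm/slab.c", "fs/inode.c"]),
    ("bob",   ["fs/inode.c"]),
    ("carol", ["net/core/dev.c"]),
]

counts = files_per_author(history)
# In real kernel data this distribution is highly skewed:
# a small group of authors accounts for hundreds of files.
top = max(counts, key=counts.get)
```

On a real repository, the (author, files) pairs could be harvested from `git log --name-only` before feeding them to the same function.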
The private finance initiative (PFI) and finance capital: A note on gaps in the "accountability" debate
During recent years, a wide spectrum of research has questioned whether public services/infrastructure procurement through private finance, as exemplified by the UK Private Finance Initiative (PFI), meets minimum standards of democratic accountability. While broadly agreeing with some of these arguments, this paper suggests that this debate is flawed on two grounds. Firstly, PFI is not about effective procurement, or even about a pragmatic choice of procurement mechanisms which can potentially compromise public involvement and input; rather, it is about a process where the state creates new profit opportunities at a time when the international financial system is increasingly lacking in safe investment opportunities. Secondly, because of its primary function as an investment opportunity, PFI, by its very nature, prioritises the risk-return criteria of private finance over the needs of the public sector client and its stakeholders. Using two case studies of recent PFI projects, the paper illustrates some of the mechanisms through which finance capital exercises control over the PFI procurement process. The paper concludes that recent proposals aimed at "reforming" or "democratising" PFI fail to recognise the objective constraints which this type of state-finance capital nexus imposes on political processes.
Data analysis challenges in transient gravitational-wave astronomy
Gravitational waves are radiative solutions of space-time dynamics predicted
by Einstein's theory of General Relativity. A world-wide array of large-scale
and highly sensitive interferometric detectors constantly scrutinizes the
geometry of the local space-time with the hope of detecting deviations that
would signal an impinging gravitational wave from a remote astrophysical
source. Finding the rare and weak signature of gravitational waves buried in
non-stationary and non-Gaussian instrument noise is a particularly challenging
problem. We will give an overview of the data-analysis techniques and
associated observational results obtained so far by Virgo (in Europe) and LIGO
(in the US), along with the prospects offered by the upcoming advanced
versions of those detectors.
Comment: 7 pages, 5 figures, Proceedings of the ARENA'12 Conference, few minor changes.
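A standard technique for digging a known weak waveform out of noise in searches like these is matched filtering: sliding a template over the data and looking for a correlation peak. A minimal self-contained sketch, using a toy chirp and synthetic Gaussian noise as stand-ins for a real waveform model and detector data:

```python
import math
import random

def matched_filter(data, template):
    """Slide a normalized template over the data and return the
    cross-correlation at each offset (higher = better match)."""
    norm = math.sqrt(sum(t * t for t in template))
    out = []
    for i in range(len(data) - len(template) + 1):
        out.append(sum(d * t for d, t in zip(data[i:], template)) / norm)
    return out

# Toy example: a short chirp-like signal injected into Gaussian noise.
random.seed(0)
template = [math.sin(0.3 * k * k) for k in range(32)]
data = [random.gauss(0.0, 0.5) for _ in range(256)]
inject_at = 100
for k, t in enumerate(template):
    data[inject_at + k] += t

snr = matched_filter(data, template)
# The correlation peaks at (or very near) the injection point.
peak = max(range(len(snr)), key=snr.__getitem__)
```

Real searches are far more involved (template banks, whitening against the detector's noise spectrum, vetoes for non-Gaussian glitches), but the core correlate-and-threshold idea is the same.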
Digital curation and the cloud
Digital curation involves a wide range of activities, many of which could benefit from cloud
deployment to a greater or lesser extent. These range from infrequent, resource-intensive tasks
which benefit from the ability to rapidly provision resources to day-to-day collaborative activities
which can be facilitated by networked cloud services. Associated benefits are offset by risks
such as loss of data or service level, legal and governance incompatibilities and transfer
bottlenecks. There is considerable variability across both risks and benefits according to the
service and deployment models being adopted and the context in which activities are
performed. Some risks, such as legal liabilities, are mitigated by the use of alternative, e.g.,
private cloud models, but this is typically at the expense of benefits such as resource elasticity
and economies of scale. The Infrastructure as a Service (IaaS) model may provide a basis on which more
specialised software services may be provided.
There is considerable work to be done in helping institutions understand the cloud and its
associated costs, risks and benefits, and how these compare to their current working methods,
in order that the most beneficial uses of cloud technologies may be identified. Specific
proposals, echoing recent work coordinated by EPSRC and JISC, are the development of
advisory, costing and brokering services to facilitate appropriate cloud deployments, the
exploration of opportunities for certifying or accrediting cloud preservation providers, and
the targeted publicity of outputs from pilot studies to the full range of stakeholders within the
curation lifecycle, including data creators and owners, repositories, institutional IT support
professionals and senior managers.