On the use of testability measures for dependability assessment
Program “testability” is, informally, the probability that a program will fail under test if it contains at least one fault. When a dependability assessment has to be derived from the observation of a series of failure-free test executions (a common need for software subject to “ultra-high reliability” requirements), measures of testability can, in theory, be used to draw inferences about program correctness. We rigorously investigate the concept of testability and its use in dependability assessment, criticizing, and improving on, previously published results. We give a general descriptive model of program execution and testing, on which the different measures of interest can be defined. We propose a more precise definition of program testability than that given by other authors, and discuss how to increase testing effectiveness without impairing program reliability in operation. We then study the mathematics of using testability to estimate, from test results, the probability of program correctness and the probability of failures. To derive the probability of program correctness, we use a Bayesian inference procedure and argue that this is more useful than deriving a classical “confidence level”. We also show that high testability is not an unconditionally desirable property for a program. In particular, for programs complex enough that they are unlikely to be completely fault-free, increasing testability may produce a program which will be less trustworthy, even after successful testing.
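To make the kind of inference involved concrete, here is a minimal sketch in Python, assuming the simplest possible model: a single testability parameter theta (the probability that one test fails, given that at least one fault is present) and a prior probability that the program is faulty. It illustrates the Bayesian update from failure-free tests only; it is not the authors' full model.

# Minimal sketch, not the paper's model: posterior probability that a program is
# correct after n independent failure-free tests, given a prior probability of
# faultiness and a single testability parameter theta = P(test fails | faulty).
def posterior_correctness(p_faulty: float, theta: float, n_tests: int) -> float:
    p_correct = 1.0 - p_faulty
    survives_if_faulty = (1.0 - theta) ** n_tests   # faulty program passes all n tests
    evidence = p_correct + p_faulty * survives_if_faulty
    return p_correct / evidence

# Example: 50% prior chance of a fault, testability 0.001, 10,000 failure-free tests.
print(posterior_correctness(0.5, 1e-3, 10_000))   # about 0.99995

Under these assumptions a larger theta makes a run of failure-free tests more informative; the abstract's caveat is that raising testability can have other effects on trustworthiness that this toy calculation does not capture.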
Report from GI-Dagstuhl Seminar 16394: Software Performance Engineering in the DevOps World
This report documents the program and the outcomes of GI-Dagstuhl Seminar 16394 "Software Performance Engineering in the DevOps World". The seminar addressed the problem of performance-aware DevOps. Both DevOps and performance engineering have been growing trends over the past one to two years, in no small part due to the rise in importance of identifying performance anomalies in the operations (Ops) of cloud and big data systems and feeding these back to development (Dev). However, so far the research community has treated software engineering, performance engineering, and cloud computing mostly as individual research areas. We aimed to identify opportunities for cross-community collaboration and to set the path for long-lasting collaborations towards performance-aware DevOps.
The main goal of the seminar was to bring together young researchers (PhD students in a later stage of their PhD, as well as PostDocs or Junior Professors) in the areas of (i) software engineering, (ii) performance engineering, and (iii) cloud computing and big data to present their current research projects, to exchange experience and expertise, to discuss research challenges, and to develop ideas for future collaborations.
From physical marketing to web marketing
Reviews the criticism of the 4P marketing mix framework as the basis of traditional and virtual marketing planning. Argues that the customary marketing management approach, based on the popular marketing mix 4Ps paradigm, is inadequate in the case of virtual marketing. Identifies two main limitations of the marketing mix when applied in online environments, namely the role of the Ps in a virtual commercial setting and the lack of any strategic elements in the model. Identifies the critical factors of Web marketing and argues that the basis for successful e-commerce is the full integration of virtual activities into the company's physical strategy, marketing plan and organisational processes. The 4S elements of the Web marketing mix framework offer the basis for developing and commercialising business-to-consumer online projects. The model was originally developed for educational purposes and has been tested and refined by means of three case studies.
A Technology Proposal for a Management Information System for the Director’s Office, NAL.
This technology proposal attempts to give a viable solution for a Management Information System (MIS) for the Director's Office. In today's IT scenario, an Organization's success greatly depends on its ability to get accurate and timely data on its operations of varied nature and to manage this data effectively to guide its activities and meet its goals. To cater to the information needs of an Organization or an Office like the Director's Office, information systems are developed and deployed to gather and process data in ways that produce a variety of information to the end-user. MIS can therefore be defined as an integrated user-machine system for providing information to support operations, management and decision-making functions in an Organization. The system, in a nutshell, utilizes computer hardware and software, manual procedures, models for analysis, planning, control and decision-making, and a database. Using state-of-the-art front-end and back-end web-based tools, this technology proposal attempts to provide a single-point Information Management, Information Storage, Information Querying and Information Retrieval interface to the Director and his office for handling all information traffic flow in and out of the Director's Office.
SigTree: A Microbial Community Analysis Tool to Identify and Visualize Significantly Responsive Branches in a Phylogenetic Tree.
Microbial community analysis experiments to assess the effect of a treatment intervention (or environmental change) on the relative abundance levels of multiple related microbial species (or operational taxonomic units) simultaneously using high-throughput genomics are becoming increasingly common. Within the framework of the evolutionary phylogeny of all species considered in the experiment, this translates to a statistical need to identify the phylogenetic branches that exhibit a significant consensus response (in terms of operational taxonomic unit abundance) to the intervention. We present the R software package SigTree, a collection of flexible tools that make use of meta-analysis methods and regular expressions to identify and visualize significantly responsive branches in a phylogenetic tree, while appropriately adjusting for multiple comparisons.
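SigTree itself is distributed as an R package; purely as an illustration of the statistical idea (not of SigTree's interface), the Python fragment below combines the p-values of the tips under each hypothetical branch with Fisher's method and then applies a Benjamini-Hochberg adjustment across branches.

# Illustration only: hypothetical branch/tip p-values, not SigTree's API.
from scipy.stats import combine_pvalues
from statsmodels.stats.multitest import multipletests

branch_tip_pvalues = {
    "branch_A": [0.010, 0.030, 0.200],
    "branch_B": [0.400, 0.550],
    "branch_C": [0.002, 0.004, 0.010, 0.060],
}

branches = list(branch_tip_pvalues)
# Fisher's method gives one combined p-value per branch.
combined = [combine_pvalues(branch_tip_pvalues[b], method="fisher")[1] for b in branches]
# Benjamini-Hochberg controls the false discovery rate across branches.
significant, adjusted, _, _ = multipletests(combined, alpha=0.05, method="fdr_bh")
for name, p_adj, flag in zip(branches, adjusted, significant):
    print(f"{name}: adjusted p = {p_adj:.4f}, significant = {flag}")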
The Maunakea Spectroscopic Explorer Book 2018
(Abridged) This is the Maunakea Spectroscopic Explorer 2018 book. It is intended as a concise reference guide to all aspects of the scientific and technical design of MSE, for the international astronomy and engineering communities, and related agencies. The current version is a status report of MSE's science goals and their practical implementation, following the System Conceptual Design Review held in January 2018. MSE is a planned 10-m class, wide-field, optical and near-infrared facility, designed to enable transformative science while filling a critical missing gap in the emerging international network of large-scale astronomical facilities. MSE is completely dedicated to multi-object spectroscopy of samples of between thousands and millions of astrophysical objects. It will lead the world in this arena due to its unique design capabilities: it will boast a large (11.25 m) aperture and wide (1.52 sq. degree) field of view; it will have the capability to observe at a wide range of spectral resolutions, from R2500 to R40,000, with massive multiplexing (4332 spectra per exposure, with all spectral resolutions available at all times) and an on-target observing efficiency of more than 80%. MSE will unveil the composition and dynamics of the faint Universe and is designed to excel at precision studies of faint astrophysical phenomena. It will also provide critical follow-up for multi-wavelength imaging surveys, such as those of the Large Synoptic Survey Telescope, Gaia, Euclid, the Wide Field Infrared Survey Telescope, the Square Kilometre Array, and the Next Generation Very Large Array. (5 chapters, 160 pages, 107 figures)
Data analytics and algorithms in policing in England and Wales: Towards a new policy framework
RUSI was commissioned by the Centre for Data Ethics and Innovation (CDEI) to conduct an independent study into the use of data analytics by police forces in England and Wales, with a focus on algorithmic bias. The primary purpose of the project is to inform CDEI’s review of bias in algorithmic decision-making, which is focusing on four sectors, including policing, and working towards a draft framework for the ethical development and deployment of data analytics tools for policing.
This paper focuses on advanced algorithms used by the police to derive insights, inform operational decision-making or make predictions. Biometric technology, including live facial recognition, DNA analysis and fingerprint matching, is outside the direct scope of this study, as are covert surveillance capabilities and digital forensics technology, such as mobile phone data extraction and computer forensics. However, because many of the policy issues discussed in this paper stem from general underlying data protection and human rights frameworks, these issues will also be relevant to other police technologies, and their use must be considered in parallel to the tools examined in this paper.
The project involved engaging closely with senior police officers, government officials, academics, legal experts, regulatory and oversight bodies and civil society organisations. Sixty-nine participants took part in the research in the form of semi-structured interviews, focus groups and roundtable discussions. The project has revealed widespread concern across the UK law enforcement community regarding the lack of official national guidance for the use of algorithms in policing, with respondents suggesting that this gap should be addressed as a matter of urgency.
Any future policy framework should be principles-based and complement existing police guidance in a ‘tech-agnostic’ way. Rather than establishing prescriptive rules and standards for different data technologies, the framework should establish standardised processes to ensure that data analytics projects follow recommended routes for the empirical evaluation of algorithms within their operational context and are evaluated against legal requirements and ethical standards. The new guidance should focus on ensuring multi-disciplinary legal, ethical and operational input from the outset of a police technology project; a standard process for model development, testing and evaluation; a clear focus on the human–machine interaction and the ultimate interventions a data-driven process may inform; and ongoing tracking and mitigation of discrimination risk.
Usability testing for improving interactive geovisualization techniques
Usability describes a product’s fitness for use according to a set of predefined criteria. Whatever the aim of the product, it should facilitate users’ tasks or enhance their performance by providing appropriate analysis tools. In both cases, the main interest is to satisfy users by providing relevant functionality which they find fit for purpose. “Testing usability means making sure that people can find and work with [a product’s] functions to meet their needs” (Dumas and Redish, 1999: 4). It is therefore concerned with establishing whether people can use a product to complete their tasks with ease and, at the same time, whether it helps them complete their jobs more effectively.
This document describes the findings of a usability study carried out on DecisionSite Map Interaction Services (Map IS). DecisionSite, a product of Spotfire, Inc., is an interactive system for the visual and dynamic exploration of data, designed to support decision-making. The system was coupled to ArcExplorer (forming DecisionSite Map IS) to provide limited GIS functionality (simple user interface, basic tools, and data management) and to support users of spatial data. Hence, this study set out to test the suitability of the coupling between the two software components (DecisionSite and ArcExplorer) for the purpose of exploring spatial data. The first section briefly discusses DecisionSite’s visualization functionality. The second section describes the test goals, the test design, and the participants and data used. The following section concentrates on the analysis of results, while the final section discusses future areas of research and possible development.
The resilience of post-market infrastructures and payment systems: initiatives and perspectives.
The use of non-cash payment schemes is particularly widespread in France, where the number of non-cash transactions is in fact well above the European average. Though they have different features corresponding to users’ varying needs (payments may be face-to-face, remote or recurring, for instance), non-cash payment schemes generally consist of an instrument that generates a payment order combined with the technical and organisational arrangements that enable this order to be processed. Putting these arrangements in place requires close co-operation between all participants in the payment “network”: naturally, the credit institutions that hold accounts for debtors and beneficiaries, but also their technical service providers. The Everyday Security Act of 15 November 2001 entrusts the Banque de France with a specific task with regard to overseeing the security of non-cash means of payment. This task falls naturally within the purview of central banks, which guarantee both the value of the currency and the stability of payment systems. To carry out its task, the Banque de France analyses the potential threats associated with payment schemes and defines, in consultation with the parties involved, the minimum security objectives designed to prevent the occurrence of payment-specific risk events. To assess the security of a payment scheme, the Banque de France ensures that the parties involved comply with these objectives.
Developing a conformance methodology for clinically-defined medical record headings: a preliminary report.
Background: The Professional Records Standards Body for health and social care (PRSB) was formed in 2013 to develop and assure professional standards for the content and structure of patient records across all care disciplines in the UK. Although the PRSB work is aimed at Electronic Health Record (EHR) adoption and interoperability to support continuity of care, the current technical guidance is limited and ambiguous.
Objectives: This project was initiated as a proof-of-concept to demonstrate whether, and if so how, conformance methods can be developed based on the professional standards.
Methods: An expert group was convened, comprising clinical and technical representatives. A constrained data set was defined for an outpatient letter, using the subset of outpatient headings that are also present in the epSOS patient summary. A mind map was produced for the main sections and sub-sections. An openEHR archetype model was produced as the basis for creating HL7 and IHE implementation artefacts.
Results: Several issues about data definition and representation were identified when attempting to map the outpatient headings to the epSOS patient summary, partly due to the difference between process and static viewpoints. Mind maps have been a simple and helpful way to visualize the logical information model and expose and resolve disagreements about which headings are purely for human navigation and which, if any, have intrinsic meaning.
Conclusions: Conformance testing is feasible but non-trivial. In contrast to traditional standards-development timescales, PRSB needs an agile standards development process with EHR vendor and integrator collaboration to ensure implementability and widespread adoption. This will require significant clinical and technical resources.
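Purely as an illustration of what a heading-level conformance check reduces to in its simplest form (the heading names below are placeholders, not the PRSB or epSOS definitions), a candidate letter can be screened against a constrained heading set before any deeper structural validation:

# Illustrative sketch with made-up heading sets; real checks would be driven by
# the openEHR archetypes and HL7/IHE implementation artefacts mentioned above.
ALLOWED_HEADINGS = {"Diagnoses", "Medications and medical devices",
                    "Allergies and adverse reactions", "Plan and requested actions"}
MANDATORY_HEADINGS = {"Diagnoses", "Medications and medical devices"}

def check_headings(document_headings):
    present = set(document_headings)
    unknown = present - ALLOWED_HEADINGS        # headings outside the constrained data set
    missing = MANDATORY_HEADINGS - present      # mandatory headings not supplied
    return unknown, missing

unknown, missing = check_headings(["Diagnoses", "Social context"])
print("Unknown headings:", unknown)
print("Missing mandatory headings:", missing)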