Assessing the maturity of software testing services using CMMI-SVC: An industrial case study
Context: While many companies conduct their software testing activities
in-house, many other companies outsource their software testing needs to other
firms who act as software testing service providers. As a result, Testing as a
Service (TaaS) has emerged as a strong service industry in the last several
decades. In the context of software testing services, there could be various
challenges (e.g., during the planning and service delivery phases) and, as a
result, the quality of testing services is not always as expected. Objective:
It is important for both providers and customers of testing services to
assess the quality and maturity of test services and subsequently improve them.
Method: Motivated by a real industrial need of several testing service
providers to assess the maturity of their software testing services, we chose
the existing CMMI for Services maturity model (CMMI-SVC) and conducted a case
study applying it with two Turkish testing service providers. Results: The
case-study results show that maturity appraisal of testing services using
CMMI-SVC was helpful for both companies and their test management teams,
enabling them to objectively assess the maturity of their testing services and
pinpointing potential improvement areas. Conclusion: We empirically observed
that, after some minor customization, CMMI-SVC is indeed a suitable model for
maturity appraisal of testing services.
Using Social Network Service to determine the Initial User Requirements for Small Software Businesses
Background/Objectives: The software engineering community has studied
large software organizations extensively and has provided suitable and
interesting solutions. However, small software companies, which make up a large
part of the software industry, have been overlooked. Methods/Statistical
analysis: Current requirements engineering practices are not well suited to
small software companies. We propose a social network-based requirements
engineering approach that complements traditional requirements engineering
approaches and makes them suitable for small software companies. Findings: We
applied our SNS-based requirements determination approach to assess its
validity. We found that 33.06% of invited end-users participated in our
approach, through which we identified 156 distinct user requirements. Users did
not need requirements engineering knowledge to participate in our proposed
SNS-based approach, which allowed the maximum number of users to be involved in
the requirements elicitation process. By investigating the ideas and opinions
communicated by users, we were able to identify a large number of user
requirements. We observed that most user requirements were determined within a
short period of time (7 days). Our experience with the SNS-based approach also
suggests that end-users hardly know about non-functional requirements or
express them explicitly. Improvements/Applications: We believe that researchers
will consider SNSs other than Facebook, which would allow applying our
SNS-based approach to requirements identification in other settings. We have
only evaluated our approach with Facebook and do not know how it would work
with other SNSs.
Particle Swarm Optimization: A survey of historical and recent developments with hybridization perspectives
Particle Swarm Optimization (PSO) is a metaheuristic global optimization
paradigm that has gained prominence in the last two decades due to its ease of
application in unsupervised, complex multidimensional problems which cannot be
solved using traditional deterministic algorithms. The canonical particle swarm
optimizer is based on the flocking behavior and social cooperation of bird
flocks and fish schools, and draws heavily from the evolutionary behavior of these
organisms. This paper serves to provide a thorough survey of the PSO algorithm
with special emphasis on the development, deployment and improvements of its
most basic as well as some of the state-of-the-art implementations. Concepts
and directions on choosing the inertia weight, constriction factor, cognition
and social weights and perspectives on convergence, parallelization, elitism,
niching and discrete optimization as well as neighborhood topologies are
outlined. Hybridization attempts with other evolutionary and swarm paradigms in
selected applications are covered and an up-to-date review is put forward for
the interested reader.
Comment: 34 pages, 7 tables
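The velocity update the survey surveys (inertia weight, cognitive and social weights) can be sketched for the canonical PSO. This is a minimal illustrative implementation, not code from the survey; the objective function, bounds, and parameter values are assumptions:

```python
import numpy as np

def pso(objective, dim=2, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Canonical PSO: inertia weight w, cognitive (c1) and social (c2) terms."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros((n_particles, dim))              # particle velocities
    pbest = x.copy()                              # personal best positions
    pbest_f = np.apply_along_axis(objective, 1, x)
    g = pbest[np.argmin(pbest_f)].copy()          # global best position
    g_f = pbest_f.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(objective, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        if f.min() < g_f:
            g, g_f = x[np.argmin(f)].copy(), f.min()
    return g, g_f

# Minimize the 2D sphere function as a toy example.
best_x, best_f = pso(lambda p: np.sum(p ** 2))
```

Constriction-factor and dynamic-inertia variants discussed in the survey replace the fixed `w` above with a schedule or a multiplier on the whole velocity term.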
Improvement of the sensory and autonomous capability of robots through olfaction: the IRO Project
Proyecto de Excelencia Junta de Andalucía TEP2012-530
Olfaction is a valuable source of information about the environment that has not been sufficiently exploited in mobile robotics yet. Certainly, odor information can contribute to other sensing modalities, e.g. vision, to successfully accomplish high-level robot activities, such as task planning or execution in human environments. This paper describes the developments carried out in the scope of the IRO project, which aims at making progress in this direction by investigating mechanisms that exploit odor information (usually coming in the form of the type of volatile and its concentration) in problems like object recognition and scene-activity understanding. A distinctive aspect of this research is the special attention paid to the role of semantics within the robot perception and decision-making processes. The results of the IRO project have improved the robot's capabilities in terms of efficiency, autonomy and usefulness.
Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
GTTC Future of Ground Testing Meta-Analysis of 20 Documents
National research, development, test, and evaluation ground testing capabilities in the United States are at risk. There is a lack of vision and consensus on what is and will be needed, contributing to a significant threat that ground test capabilities may not be able to meet the national security and industrial needs of the future. To support future decisions, the AIAA Ground Testing Technical Committee's (GTTC) Future of Ground Test (FoGT) Working Group selected and reviewed 20 seminal documents related to the application and direction of ground testing. Each document was reviewed, with its main points collected and organized into sections in the form of a gap analysis: current state, future state, major challenges/gaps, and recommendations. This paper includes key findings and selected commentary by an editing team.
Improved deep learning based macromolecules structure classification from electron cryo tomograms
Cellular processes are governed by macromolecular complexes inside the cell.
Study of the native structures of macromolecular complexes has been extremely
difficult due to lack of data. With recent breakthroughs in cellular electron
cryo tomography (CECT) 3D imaging technology, it is now possible for
researchers to gain access to fully study and understand the macromolecular
structures of single cells. However, systematic recovery of macromolecular
structures from CECT is very difficult due to the high degree of structural
complexity and practical imaging limitations. In previous work, we proposed a
deep learning based image classification approach for large-scale systematic
macromolecular structure separation from CECT data. However, that work was only
a very initial step towards exploring the full potential of deep learning based
macromolecule separation. In this paper, we focus on improving classification
performance by proposing three newly designed individual CNN models: an
extended version of Deep Small Receptive Field (DSRF3D), denoted DSRF3D-v2; a
3D residual block based neural network, named RB3D; and a convolutional 3D
(C3D) based model, CB3D. We compare them with our previously developed model
(DSRF3D) on 12 datasets with different SNRs and tilt angle ranges. The
experiments show that our new models achieve significantly higher
classification accuracies. The accuracies are not only higher than 0.9 on
normal datasets, but also demonstrate the potential to operate on datasets with
high levels of noise and missing-wedge effects.
Comment: Preliminary working report
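The 3D residual idea behind a model like RB3D can be illustrated with a toy numpy sketch: a 3D convolution with "same" padding whose output is added back to the input before a ReLU. This is illustrative only, not the paper's DSRF3D-v2/RB3D/CB3D architectures; the shapes and single-channel kernel are assumptions:

```python
import numpy as np

def conv3d(volume, kernel):
    """Valid-mode 3D convolution (cross-correlation, as in CNN layers)."""
    d, h, w = kernel.shape
    D, H, W = volume.shape
    out = np.zeros((D - d + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(volume[i:i+d, j:j+h, k:k+w] * kernel)
    return out

def residual_block(volume, kernel):
    """Toy 3D residual block: ReLU(F(x) + x), with 'same' zero padding."""
    pad = kernel.shape[0] // 2
    padded = np.pad(volume, pad)             # zero-pad so F(x) matches x's shape
    return np.maximum(conv3d(padded, kernel) + volume, 0.0)
```

The skip connection (`+ volume`) is what lets deeper 3D networks train stably on the low-SNR subtomograms described above; a real implementation would use a deep learning framework's batched, multi-channel 3D convolutions.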
A Decentralized Time- and Energy-Optimal Control Framework for Connected Automated Vehicles: From Simulation to Field Test
The implementation of connected and automated vehicle (CAV) technologies
enables a novel computational framework for real-time control aimed at
optimizing energy consumption with associated benefits. In this paper, we
implement an optimal control framework, developed previously, in an Audi A3
e-tron plug-in hybrid electric vehicle, and demonstrate that we can improve the
vehicle's efficiency and travel time in a corridor including an on-ramp
merging, a speed reduction zone, and a roundabout. Our exposition includes the
development, integration, implementation and validation of the proposed
framework in (1) simulation, (2) hardware-in-the-loop (HIL) testing, (3)
connectivity-enabled virtual-reality-based bench testing, and (4) a field test
in Mcity. We show that by adopting such an inexpensive yet effective process,
we can efficiently integrate and test the controller framework, ensure proper
connectivity and data transmission between different modules of the system, and
reduce uncertainty. We evaluate the performance and effectiveness of the
control framework and observe significant improvements in energy and travel
time compared to the baseline scenario.
Global-Scale Resource Survey and Performance Monitoring of Public OGC Web Map Services
One of the most widely-implemented service standards provided by the Open
Geospatial Consortium (OGC) to the user community is the Web Map Service (WMS).
WMS is widely employed globally, but there is limited knowledge of the global
distribution, adoption status or the service quality of these online WMS
resources. To fill this void, we investigated global WMS resources and
performed distributed performance monitoring of these services. This paper
explicates a distributed monitoring framework that was used to monitor 46,296
WMSs continuously for over one year and a crawling method to discover these
WMSs. We analyzed server locations, provider types, themes, the spatiotemporal
coverage of map layers and the service versions for 41,703 valid WMSs.
Furthermore, we appraised the stability and performance of basic operations for
1210 selected WMSs (i.e., GetCapabilities and GetMap). We discuss the major
reasons for request errors and performance issues, as well as the relationship
between service response times and the spatiotemporal distribution of client
monitoring sites. This paper will help service providers, end users and
developers of standards to grasp the status of global WMS resources, as well as
to understand the adoption status of OGC standards. The conclusions drawn in
this paper can benefit geospatial resource discovery, service performance
evaluation and guide service performance improvements.
Comment: 24 pages, 15 figures
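The two operations the study monitors can be illustrated by how a client forms request URLs under the OGC WMS 1.3.0 interface. This sketch builds the URLs only (no network call); the endpoint and layer name are placeholders:

```python
from urllib.parse import urlencode

def wms_request(base_url, operation, version="1.3.0", **params):
    """Build an OGC WMS request URL for the given operation."""
    query = {"SERVICE": "WMS", "VERSION": version, "REQUEST": operation}
    query.update(params)
    return base_url + "?" + urlencode(query)

# The two basic operations appraised in the paper:
caps = wms_request("https://example.org/wms", "GetCapabilities")
getmap = wms_request("https://example.org/wms", "GetMap",
                     LAYERS="layer0", STYLES="", CRS="EPSG:4326",
                     BBOX="-90,-180,90,180",  # WMS 1.3.0 axis order for EPSG:4326
                     WIDTH=256, HEIGHT=256, FORMAT="image/png")
```

A monitoring framework like the one described would issue such requests from distributed client sites on a schedule and record response time, HTTP status, and service exceptions for each endpoint.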
Internet of Things: Survey on Security and Privacy
The Internet of Things (IoT) is intended for ubiquitous connectivity among
different entities or "things". While its purpose is to provide effective and
efficient solutions, security of the devices and network is a challenging
issue. The number of devices connected along with the ad-hoc nature of the
system further exacerbates the situation. Therefore, security and privacy have
emerged as significant challenges for the IoT. In this paper, we aim to provide
a thorough survey of the privacy and security challenges of the IoT. This
document addresses these challenges from the perspective of the technologies
and architectures used. This work also focuses on IoT-intrinsic vulnerabilities,
as well as the security challenges of various layers, based on the security
principles of data confidentiality, integrity and availability. This survey
analyzes articles published on the IoT and relates them to the security
conjuncture of the field and its projection into the future.
Comment: 16 pages, 3 figures
Learning from Context: Exploiting and Interpreting File Path Information for Better Malware Detection
Machine learning (ML) used for static portable executable (PE) malware
detection typically employs per-file numerical feature vector representations
as input with one or more target labels during training. However, there is much
orthogonal information that can be gleaned from the context in which
the file was seen. In this paper, we propose utilizing a static source of
contextual information -- the path of the PE file -- as an auxiliary input to
the classifier. While file paths are not malicious or benign in and of
themselves, they do provide valuable context for a malicious/benign
determination. Unlike dynamic contextual information, file paths are available
with little overhead and can seamlessly be integrated into a multi-view static
ML detector, yielding higher detection rates at very high throughput with
minimal infrastructural changes. Here we propose a multi-view neural network,
which takes feature vectors from PE file content as well as corresponding file
paths as inputs and outputs a detection score. To ensure realistic evaluation,
we use a dataset of approximately 10 million samples -- files and file paths
from user endpoints of an actual security vendor network. We then conduct an
interpretability analysis via LIME modeling to ensure that our classifier has
learned a sensible representation and see which parts of the file path most
contributed to change in the classifier's score. We find that our model learns
useful aspects of the file path for classification, while also learning
artifacts from customers testing the vendor's product, e.g., by downloading a
directory of malware samples, each named as its hash. We prune these artifacts
from our test dataset and demonstrate reductions in false negative rate of
32.3% and 33.1% at fixed false positive rates (FPR), over a similar-topology,
single-input, PE-file-content-only model.
Comment: Submitted to ACM CCS 201
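The multi-view idea (one view for PE content features, one for the file path, fused into a single detection score) can be sketched with a toy linear scorer. The hashed-token path featurizer and the weights here are illustrative assumptions, not the paper's neural network:

```python
import numpy as np

def featurize_path(path, n_buckets=16):
    """Toy file-path featurizer: hashed bag of path tokens (illustrative only)."""
    vec = np.zeros(n_buckets)
    for token in path.lower().replace("\\", "/").split("/"):
        if token:
            vec[hash(token) % n_buckets] += 1.0
    return vec

def multiview_score(pe_features, path_features, w_pe, w_path, bias=0.0):
    """Fuse two feature views with per-view weights into one detection score."""
    z = pe_features @ w_pe + path_features @ w_path + bias
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid score in [0, 1]
```

In the paper's setting each view would pass through its own learned subnetwork before fusion, but the structural point is the same: the path view adds context without requiring any dynamic analysis, so it composes with a static PE detector at negligible runtime cost.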