Blekinge Institute of Technology
Electronic Research Archive - Blekinge Tekniska Högskola
1589 research outputs found
A No-Reference Bitstream-based Perceptual Model for Video Quality Estimation of Videos Affected by Coding Artifacts and Packet Losses
In this work, we propose a No-Reference (NR) bitstream-based model for
predicting the quality of H.264/AVC video sequences affected by both
compression artifacts and transmission impairments. The concept of the article
is based on a feature extraction procedure, where a large number of features
are calculated from the impaired bitstream. Many of the features are newly
proposed in this work, while the specific set of features as a whole is
applied for the first time for making NR video quality predictions. All
feature observations are taken as input to the Least Absolute Shrinkage and
Selection Operator (LASSO) regression method. LASSO indicates the most
important features, and using only them, it is able to estimate the Mean
Opinion Score (MOS) with high accuracy. Notably, only 13 features suffice to
produce a Pearson correlation coefficient of 0.92 with the MOS. Interestingly,
the performance statistics we computed in order to
assess our method for predicting the Structural Similarity Index and the Video
Quality Metric are equally good. Thus, the obtained experimental results
verified the suitability of the features selected by LASSO as well as the
ability of LASSO to make accurate predictions through sparse modeling.
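As a concrete illustration of the sparse-modeling approach described above, the
following minimal Python sketch selects features and predicts quality scores
with cross-validated LASSO from scikit-learn. The feature matrix and scores are
synthetic placeholders, not the paper's actual bitstream features or MOS data.

```python
# Minimal sketch of LASSO-based feature selection and quality prediction.
# Data are synthetic placeholders, not the paper's bitstream features.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))   # 200 impaired sequences, 50 candidate features
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=200)  # synthetic MOS

X_std = StandardScaler().fit_transform(X)   # LASSO is sensitive to feature scale
model = LassoCV(cv=5).fit(X_std, y)         # cross-validated penalty selection

selected = np.flatnonzero(model.coef_)      # features with non-zero weights
print(f"{selected.size} features selected:", selected)
print("predicted MOS for first sequence:", model.predict(X_std[:1])[0])
```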
The Contextual Nature of Innovation - An Empirical Investigation of Three Software Intensive Products
Context:
New products create significant opportunities for differentiation and
competitive advantage. To increase the chances of new product success, a
universal set of critical activities and determinants has been recommended.
Some researchers believe, however, that these factors are not universal, but
are contextual.
Objective:
This paper reports the innovation processes followed to develop three
software-intensive products, in order to understand how and why innovation
practice depends on innovation context.
Method:
This paper examines innovation processes and practices through an in-depth
multi-case study of three software product innovations from Ericsson, IBM, and
Rorotika. It describes the actual innovation processes followed in the three
cases, discusses the observed innovation practice, and relates it to the state
of the art.
Results:
The cases point to a set of contextual factors that influence the choice of
innovation activities and determinants for developing successful product
innovations. The cases provide evidence that innovation practice cannot be
standardized, but is contextual in nature.
Conclusion:
The rich description of the interaction between context and innovation practice
enables future investigations into contextual elements that influence
innovation practice, and calls for the creation of frameworks enabling activity
and determinant selection for a given context – since one size does not fit
all.
In press: A Cross-Layer Optimized Scheme and Its Application in Mobile Multimedia Networks With QoS Provision
To cope with the rapid growth of multimedia applications that require dynamic
levels of quality of service (QoS), cross-layer (CL) design, where multiple
protocol layers are jointly combined, has been considered to provide diverse
QoS provisions for mobile multimedia networks. However, there is a lack of a
general mathematical framework to model such CL schemes in wireless networks
with different types of multimedia classes. In this paper, to overcome this
shortcoming, we propose a novel CL design for integrated
real-time/non-real-time traffic with strict preemptive priority via a
finite-state Markov chain. The main strategy of the CL scheme is to design a
Markov model by explicitly including adaptive modulation and coding at the
physical layer, queuing at the data link layer, and the bursty nature of
multimedia traffic classes at the application layer. Utilizing this Markov
model, several important performance metrics in terms of packet loss rate,
delay, and throughput are examined. In addition, our proposed framework is
exploited in various multimedia applications, for example, end-to-end
real-time video streaming and CL optimization, which require priority-based
QoS adaptation for different applications. More importantly, the CL framework
reveals important guidelines on how to optimize the network performance.
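To illustrate the kind of computation such a framework enables, the following
minimal Python sketch derives the stationary distribution of a small
finite-state Markov chain and reads off a loss-related metric. The transition
matrix is an arbitrary illustrative example, not the paper's joint model of
modulation, queueing, and traffic states.

```python
# Minimal sketch: performance metrics from a finite-state Markov chain.
# The transition matrix is illustrative, not the paper's CL model.
import numpy as np

P = np.array([[0.7, 0.3, 0.0],   # states could encode (channel mode, queue length)
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# Stationary distribution: solve pi P = pi subject to sum(pi) = 1.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Example metric: probability mass in a "queue full" state, a proxy for
# packet loss rate in queueing-based cross-layer models.
print("stationary distribution:", pi)
print("loss-state probability :", pi[2])
```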
Feasibility of using existing open access networks to support the harmonization of open access
In this report we identify and assess different options for bringing together
and mobilizing relevant stakeholders in the open access arena. This builds on
previous work done in the RECODE project about the open access stakeholder
ecosystem, the barriers and the enablers for open access to research data as
well as on the RECODE overarching policy recommendations set out in the RECODE
work package 5. We have identified a sample of existing networks and
organizations to determine whether they are able to meet stakeholder needs and
mobilization objectives for open access, data preservation, dissemination and
use, and what these organizations can or must do in order to better mobilize
stakeholders to meet the policy recommendations.
We have identified not only examples of good practice but also the barriers not
addressed by any currently available structures or organizations. While there
are many forces striving to make data sharing common practice, there is still a
lack of an overarching, international initiative to implement necessary
requirements for making data sharing a truly international community asset.
Within the European community the prerequisites and the ambitions are very
different among the member states. Even amongst those countries where open
access has long been on the agenda, there remain issues and barriers to be
addressed. For example, training is needed in most of the stakeholder groups;
investment in infrastructure building and funding is required in most
institutions dealing with open access to research data; much work remains to be
done to convince researchers that sharing data is a good idea. These are just a
few of the obstacles that still need attention despite good, on-going efforts
by individuals, organizations, and states.
VLQoE: Video QoE instrumentation on the smartphone
The usage of network-demanding applications, such as video streaming on mobile
terminals, is growing rapidly. However, network and/or service providers might
not guarantee the perceived quality for video streaming that demands high
packet transmission rate. In order to satisfy the user expectations and to
minimize user churn, it is important for network operators to infer the
end-user perceived quality in video streaming. Today, the most reliable method
to obtain end-user perceived quality is through subjective tests, and the
preferred location is the user interface as it is the closest point of
application to the end-user. The end-user perceived quality of video streaming
is highly influenced by occasional freezes, i.e., unusually long time gaps
between two consecutive pictures displayed to the user (high inter-picture
time). In this paper, we present a QoE instrumentation for
video streaming, VLQoE. We added functionality to the VLC player to record a
set of metrics from the user interface, application-level, network-level, and
from the available sensors of the device. To the best of our knowledge, VLQoE
is the first tool of its kind that can be used in user experiments for video
streaming. Using the tool, we present a two-state model based on the
inter-picture time for HTTP- and RTSP-based video streaming via 3.5G. We then
studied the influence of inter-picture time on the user-perceived quality
through a user study, investigating the minimum user-perceived inter-picture
time and the user response time.
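As a simple illustration of a two-state view of playback, the following minimal
Python sketch labels each inter-picture gap as play or freeze using a fixed
threshold. The 200 ms threshold and the timestamps are illustrative
assumptions, not VLQoE's actual parameters or measurements.

```python
# Minimal sketch: two-state (play/freeze) labeling of inter-picture gaps.
# The 200 ms threshold is an illustrative assumption, not VLQoE's value.
FREEZE_THRESHOLD_S = 0.2

def classify_states(display_times):
    """Label the gap between consecutive displayed pictures."""
    gaps = [t1 - t0 for t0, t1 in zip(display_times, display_times[1:])]
    return [("freeze" if g > FREEZE_THRESHOLD_S else "play", g) for g in gaps]

# Example: 25 fps playback with one stall between the third and fourth frame.
timestamps = [0.00, 0.04, 0.08, 0.95, 0.99, 1.03]
for state, gap in classify_states(timestamps):
    print(f"{state:6s} gap={gap:.2f}s")
```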
Operationalization of lean thinking through value stream mapping with simulation and FLOW
Background: The continued success of Lean thinking beyond manufacturing has led
to an increasing interest to utilize it in software engineering (SE). Value
Stream Mapping (VSM) had a pivotal role in the operationalization of Lean
thinking. However, this has not been recognized in SE adaptations of Lean.
Furthermore, there are two main shortcomings in existing adaptations of VSM for
an SE context. First, the assessments for the potential of the proposed
improvements are based on idealistic assertions. Second, the current VSM
notation and methodology are unable to capture the myriad of significant
information flows, which in software development go beyond just the schedule
information about the flow of a software artifact through a process.
Objective: This thesis seeks to assess Software Process Simulation Modeling
(SPSM) as a solution to the first shortcoming of VSM. In this regard,
guidelines to perform simulation-based studies in industry are consolidated,
and the usefulness of VSM supported with SPSM is evaluated. To overcome the
second shortcoming of VSM, a suitable approach for capturing rich information
flows in software development is identified and its usefulness to support VSM
is evaluated. Overall, an attempt is made to supplement existing guidelines for
conducting VSM to overcome its known shortcomings and support adoption of Lean
thinking in SE. The usefulness and scalability of these proposals are evaluated
in an industrial setting.
Method: Three literature reviews, one systematic literature review, four
industrial case studies, and a case study in an academic context were conducted
as part of this research.
Results: Little evidence to substantiate the claims of the usefulness of SPSM
was found. Hence, prior to combining it with VSM, we consolidated the
guidelines to conduct an SPSM-based study and evaluated the use of SPSM in
academic and industrial contexts. In education, it was found to be a useful
complement to other teaching methods, and in the industry, it triggered useful
discussions and was used to challenge practitioners’ perceptions about the
impact of existing challenges and proposed improvements. The combination of VSM
with FLOW (a method and notation to capture information flows, since existing
VSM adaptations for SE are insufficient for this purpose) was successful in
identifying challenges and improvements related to information needs in the
process. Both proposals to support VSM with simulation and FLOW led to
identification of waste and improvements (which would not have been possible
with conventional VSM), generated more insightful discussions and resulted in
more realistic improvements.
Conclusion: This thesis characterizes the context and shows how SPSM was
beneficial both in the industrial and academic context. FLOW was found to be a
scalable, lightweight supplement to strengthen the information flow analysis in
VSM. Through successful industrial application and uptake, this thesis provides
evidence of the usefulness of the proposed improvements to the VSM activities.
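To illustrate how simulation can complement VSM, the following minimal Python
sketch estimates lead time and process-cycle efficiency for work items flowing
through a two-stage process with waiting between stages. All distributions and
durations are illustrative assumptions, not data from the studied cases.

```python
# Minimal sketch: Monte Carlo process simulation for a VSM-style analysis.
# Durations and distributions are illustrative assumptions only.
import random

random.seed(1)

def simulate_item():
    dev = random.uniform(2, 5)       # value-adding: development (days)
    wait = random.expovariate(0.5)   # waste: waiting for test capacity (days)
    test = random.uniform(1, 2)      # value-adding: testing (days)
    return dev + wait + test, dev + test

lead, value = zip(*(simulate_item() for _ in range(10_000)))
avg_lead = sum(lead) / len(lead)
avg_value = sum(value) / len(value)
print(f"avg lead time: {avg_lead:.1f} days, "
      f"process-cycle efficiency: {avg_value / avg_lead:.0%}")
```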
Realistic Package Opening Simulations: An Experimental Mechanics and Physics Based Approach
A finite element modeling strategy targeting package opening simulations is the
final goal of this work. The developed simulation model will be used to
proactively predict the opening compatibility early in the development process
of a new opening device and/or a new packaging material. To be able to create
such a model, the focus is to develop a combined and integrated
physical/virtual test procedure for mechanical characterization and calibration
of thin packaging materials. Furthermore, the governing mechanical properties
of the materials involved in the opening performance need to be identified and
quantified with experiments. Different experimental techniques complemented
with video recording equipment were refined and utilized during the course of
work. An automatic or semi-automatic material model parameter identification
process involving video capturing of the deformation process and inverse
modeling is proposed for the different packaging material layers. Both an
accurate continuum model and a damage material model, used in the simulation
model, were extracted from the experimental test results.
The results presented show that it is possible to select constitutive material
models in conjunction with continuum material damage models, adequately
predicting the mechanical behavior of intended failure in thin laminated
packaging materials. A thorough material mechanics understanding of the
microstructural evolution of the individual material layers, and of the
micromechanisms involved in the deformation process, is essential for
appropriate selection of numerical material models. Finally, with a slight
modification of already available techniques and functionalities in the
commercial finite element software Abaqus™, it was possible to build a
suitable simulation model.
To build a realistic simulation model, an accurate description of the
geometrical features is important. Therefore, advancements within the
experimental visualization techniques utilizing a combination of video
recording, photoelasticity and Scanning Electron Microscopy (SEM) of the micro
structure have enabled extraction of geometries and additional information from
ordinary standard experimental tests. Finally, a comparison of the experimental
opening and the virtual opening showed a good correlation for the developed
finite element modeling technique.
The advantage of the developed modeling approach is that it is possible to
modify the material composition of the laminate. Individual material layers can
be altered and the mechanical properties, thickness or geometrical shape can be
changed. Furthermore, the model is flexible, and a new opening device, i.e., a
new geometry and load case, can easily be accommodated in the simulation model.
Therefore, this type of simulation model is a useful tool and can be used for
decision support early in the concept selection of development projects.
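To illustrate the inverse-modeling step for parameter identification, the
following minimal Python sketch fits the parameters of a simple material
response model so that it reproduces a measured force-displacement curve, using
SciPy's least_squares. The saturating response model and the synthetic
"measured" data are illustrative assumptions, not the thesis's material models
or experiments.

```python
# Minimal sketch: inverse modeling via least-squares parameter fitting.
# The response model and "measured" data are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

def model_force(params, u):
    k, h = params                  # stiffness-like and saturation parameters
    return k * u / (1.0 + h * u)   # simple saturating force-displacement law

u_exp = np.linspace(0.0, 2.0, 20)         # measured displacements
f_exp = model_force([3.0, 0.8], u_exp)    # pretend experiment with noise
f_exp += 0.02 * np.random.default_rng(0).normal(size=u_exp.size)

def residuals(params):
    return model_force(params, u_exp) - f_exp

fit = least_squares(residuals, x0=[1.0, 0.1])
print("identified parameters:", fit.x)
```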
On the Performance of Underlay Cognitive Radio Networks with Interference Constraints and Relaying
Efficiently allocating the scarce and expensive radio resources is a key
challenge for advanced radio communication systems. To this end, cognitive
radio (CR) has emerged as a promising solution which can offer considerable
improvements in spectrum utilization. Furthermore, cooperative communication is
a concept proposed to obtain spatial diversity gains through relays without
requiring multiple antennas. To benefit from both CR and cooperative
communications, a combination of CR networks (CRNs) with cooperative relaying
referred to as cognitive cooperative relay networks (CCRNs) has recently been
proposed. CCRNs can better utilize the radio spectrum by allowing the secondary
users (SUs) to opportunistically access spectrum, share spectrum with primary
users (PUs), and provide performance gains offered by cooperative relaying. In
this thesis, a performance analysis of underlay CRNs and CCRNs in different
fading channels is provided based on analytical expressions, numerical results,
and simulations. To allocate power in the CCRNs, power allocation policies are
proposed which consider the peak transmit power limit of the SUs and the outage
probability constraint of the primary network. Thus, the impact of multiuser
diversity, peak transmit power, fading parameters, and modulation schemes on
the performance of the CRNs and CCRNs can be analyzed.
The thesis is divided into an introduction and five research parts based on
peer-reviewed conference papers and journal articles. The introduction provides
fundamental background on spectrum sharing systems, fading channels, and
performance metrics. In the first part, a basic underlay CRN is analyzed, where
the outage probability and the ergodic capacity of the network over general
fading channels are derived. In the second part, the outage probability and the
ergodic capacity of an underlay CRN are assessed capturing the effect of
multiuser diversity on the network subject to Nakagami-m fading. Considering
the presence of a PU transmitter (PU-Tx), a power allocation policy is derived
and utilized for CRN performance analysis under Rayleigh fading. In the third
part, the impact of multiple PU-Txs and multiple PU receivers (PU-Rxs) on the
outage probability of an underlay CCRN is studied. The outage constraint at the
PU-Rx and the peak transmit power constraint of the SUs are taken into account
to derive the power allocation policies for the SUs. In the fourth part,
analytical expressions for the outage probability and symbol error probability
for CCRNs are derived where signal combining schemes at the SU receiver (SU-Rx)
are compared. Finally, the fifth part applies a sleep/wake-up strategy and the
min(N, T) policy to an underlay CRN. The SUs of the network operate as wireless
sensor nodes under Nakagami-m fading. A power consumption function of the CRN
is derived. Further, the impact of the M/G/1 queue and fading channel
parameters on the power consumption is assessed.
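To illustrate one of the metrics analyzed, the following minimal Python sketch
estimates the outage probability of a Rayleigh-faded link by Monte Carlo and
checks it against the closed-form result for exponentially distributed SNR.
The threshold and average SNR are illustrative assumptions, not the thesis's
system parameters.

```python
# Minimal sketch: outage probability of a Rayleigh-faded link.
# Threshold and average SNR are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
avg_snr = 10.0    # mean SNR, linear scale
gamma_th = 2.0    # outage threshold

snr = avg_snr * rng.exponential(size=1_000_000)   # Rayleigh fading -> exp. SNR
p_out_mc = np.mean(snr < gamma_th)
p_out_exact = 1.0 - np.exp(-gamma_th / avg_snr)   # closed form for exp. SNR

print(f"Monte Carlo: {p_out_mc:.4f}   exact: {p_out_exact:.4f}")
```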
Heterogeneous Systems Testing Techniques: An Exploratory Survey
Heterogeneous systems comprising sets of inherent subsystems are challenging to
integrate. In particular, testing for interoperability and conformance is a
challenge. Furthermore, the complexities of such systems amplify traditional
testing challenges. We explore (1) which techniques frequently discussed in the
literature in the context of heterogeneous system testing are used by
practitioners to test their heterogeneous systems; and (2) the practitioners'
perception of the usefulness of these techniques with respect to a defined set
of outcome variables. To that end, we conducted an exploratory survey. A total
of 27 complete
survey answers have been received. Search-based testing has been used by 14 out
of 27 respondents, indicating the practical relevance of the approach for
testing heterogeneous systems, which itself is relatively new and has only
recently been studied extensively. The most frequently used technique is
exploratory manual testing, followed by combinatorial testing. With respect to
the perceived performance of the testing techniques, the practitioners were
undecided regarding many of the studied variables. Manual exploratory testing
received very positive ratings across outcome variables.