Outlier-Resilient Web Service QoS Prediction
The proliferation of Web services makes it difficult for users to select the
most appropriate one among numerous functionally identical or similar service
candidates. Quality-of-Service (QoS) describes the non-functional
characteristics of Web services, and it has become the key differentiator for
service selection. However, users cannot invoke all Web services to obtain the
corresponding QoS values due to high time cost and huge resource overhead.
Thus, it is essential to predict unknown QoS values. Although various QoS
prediction methods have been proposed, few of them have taken outliers into
consideration, which may dramatically degrade the prediction performance. To
overcome this limitation, we propose an outlier-resilient QoS prediction method
in this paper. Our method utilizes Cauchy loss to measure the discrepancy
between the observed QoS values and the predicted ones. Owing to the robustness
of Cauchy loss, our method is resilient to outliers. We further extend our
method to provide time-aware QoS prediction results by taking the temporal
information into consideration. Finally, we conduct extensive experiments on
both static and dynamic datasets. The results demonstrate that our method is
able to achieve better performance than state-of-the-art baseline methods.
Comment: 12 pages, to appear at the Web Conference (WWW) 202
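The robustness argument above can be made concrete with a minimal sketch of the Cauchy loss and the reweighting it implies (an illustration of the loss function only, not the paper's implementation; the scale parameter `gamma` is an assumption):

```python
import numpy as np

def cauchy_loss(r, gamma=1.0):
    # Cauchy loss grows only logarithmically in |r|, so a single large
    # outlier residual cannot dominate the objective the way it does
    # under squared loss.
    return np.log1p((r / gamma) ** 2)

def cauchy_weight(r, gamma=1.0):
    # IRLS-style weight implied by the loss: observations with large
    # residuals (likely outliers) are down-weighted toward zero.
    return 1.0 / (1.0 + (r / gamma) ** 2)

residuals = np.array([0.1, 0.2, 50.0])   # last residual is an outlier
print(cauchy_loss(residuals))
print(cauchy_weight(residuals))          # outlier weight is near zero
```

Under squared loss the outlier residual of 50 would contribute 1250 to the objective; under Cauchy loss it contributes only log(1 + 2500) ≈ 7.8.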
TPMCF: Temporal QoS Prediction using Multi-Source Collaborative Features
Recently, with the rapid deployment of service APIs, personalized service
recommendations have played a paramount role in the growth of the e-commerce
industry. Quality-of-Service (QoS) parameters determining the service
performance, often used for recommendation, fluctuate over time. Thus, the QoS
prediction is essential to identify a suitable service among functionally
equivalent services over time. Contemporary temporal QoS prediction methods
hardly achieve the desired accuracy due to various limitations, such as the
inability to handle data sparsity and outliers or to capture higher-order
temporal relationships among user-service interactions. Even though some recent
recurrent neural-network-based architectures can model temporal relationships
among QoS data, prediction accuracy degrades due to the absence of other
features (e.g., collaborative features) to comprehend the relationship among
the user-service interactions. This paper addresses the above challenges and
proposes a scalable strategy for Temporal QoS Prediction using Multi-source
Collaborative-Features (TPMCF), achieving high prediction accuracy and faster
responsiveness. TPMCF combines the collaborative-features of users/services by
exploiting user-service relationship with the spatio-temporal auto-extracted
features by employing graph convolution and transformer encoder with multi-head
self-attention. We validated our proposed method on WS-DREAM-2 datasets.
Extensive experiments showed TPMCF outperformed major state-of-the-art
approaches regarding prediction accuracy while ensuring high scalability and
reasonably fast responsiveness.
Comment: 10 pages, 7 figures
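For readers unfamiliar with the transformer-encoder component named above, a single head of scaled dot-product self-attention over a window of per-timestep QoS feature vectors can be sketched as follows (a generic illustration, not the TPMCF code; the shapes and random weight matrices are assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # One head of scaled dot-product self-attention: each timestep
    # attends to every other timestep in the window, which is how
    # higher-order temporal relationships can be captured.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
T, d = 6, 8                       # 6 timesteps, 8-dim features (assumed)
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                  # (6, 8): one attended vector per timestep
```

A multi-head variant would run several such heads in parallel on lower-dimensional projections and concatenate their outputs.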
A Location-sensitive and Network-aware Broker for Recommending Web Services
Collaborative Filtering (CF) is one of the renowned recommendation techniques that can be used for predicting unavailable Quality-of-Service (QoS) values of Web services. Although several CF-based approaches have been proposed in recent years, the accuracy of the QoS values these approaches provide raises some concerns and could hence undermine the real "quality" of Web services. To address these concerns, context information such as communication-network configuration and user location can be integrated into the recommendation process. Building upon such context information, this paper proposes a CF-based Web service recommendation approach that incorporates the effect of user locations, communication-network configurations, and Web service run-time environments on the recommendations. To evaluate the accuracy of the recommended Web services based on the defined QoS values, a set of comprehensive experiments is conducted on a real dataset of Web services. The results confirm the importance of integrating context into recommendations.
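A minimal sketch of the kind of context-weighted, user-based CF prediction described above (the matrix, the Pearson similarity, and the per-user location weight are illustrative assumptions, not the paper's exact algorithm):

```python
import numpy as np

def predict_qos(R, target_user, target_svc, loc_weight):
    # User-based CF prediction of a missing QoS value. R holds observed
    # response times (np.nan = unknown); loc_weight[u] boosts users that
    # share the target user's location/network context.
    mask = ~np.isnan(R[:, target_svc])
    sims = []
    for u in np.where(mask)[0]:
        both = ~np.isnan(R[target_user]) & ~np.isnan(R[u])
        if u == target_user or both.sum() < 2:
            continue
        # Pearson similarity on co-invoked services, scaled by context weight
        sim = np.corrcoef(R[target_user, both], R[u, both])[0, 1]
        sims.append((u, loc_weight[u] * sim))
    num = sum(w * R[u, target_svc] for u, w in sims)
    den = sum(abs(w) for _, w in sims)
    return num / den if den else np.nan

R = np.array([
    [0.20, 0.40, np.nan],
    [0.30, 0.50, 0.9],
    [0.25, 0.45, 0.8],
])
loc_weight = np.array([1.0, 1.0, 2.0])   # user 2 shares user 0's network
print(round(predict_qos(R, 0, 2, loc_weight), 3))   # 0.833
```

Because user 2 is context-similar to the target user, its observation (0.8) pulls the prediction below the unweighted average.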
Prediction, Recommendation and Group Analytics Models in the domain of Mashup Services and Cyber-Argumentation Platform
Mashup application development is becoming a widespread software development practice due to its appeal of a shorter application development period. Application developers usually combine web APIs from different sources to create a new streamlined service and provide various features to end-users. This practice saves time and helps ensure reliability, accuracy, and security in the developed applications. Mashup developers integrate these available APIs into their applications, but they still have to sift through thousands of available web APIs and choose only a few appropriate ones. Recommending relevant web APIs can help application developers in this situation. However, the very low rate of API invocations from mashup applications yields a sparse mashup-web API dataset from which recommendation models must learn about mashups and their web API invocation patterns. One aim of this research is to analyze these mashup-specific issues, look for supplemental information in the mashup domain, and develop web API recommendation models for mashup applications. The developed recommendation model generates useful and accurate web API recommendations, reducing the impact of low API invocations in mashup application development.
Cyber-argumentation platforms face a similarly challenging issue. In large-scale cyber-argumentation platforms, participants express their opinions, engage with one another, and respond to feedback and criticism while discussing important issues online. Argumentation analysis tools capture the collective intelligence of the participants and reveal hidden insights from the underlying discussions. However, such analysis requires that the issues have been thoroughly discussed and that participants' opinions are clearly expressed and understood. Participants typically focus on only a few ideas and leave others unacknowledged and under-discussed. This generates a limited dataset to work with, resulting in an incomplete analysis of the issues in the discussion. One solution to this problem is to develop an opinion prediction model for cyber-argumentation, which would predict participants' opinions on ideas they have not explicitly engaged with.
In cyber-argumentation, individuals interact with each other without any group coordination. However, implicit group interaction can affect a participating user's opinion, attitude, and the discussion outcome. One objective of this research is to analyze different group analytics in the cyber-argumentation environment. To that end, an experiment is designed to inspect whether the key concepts of the Social Identity Model of Deindividuation Effects (SIDE) hold on our argumentation platform; this experiment can help us understand whether anonymity and a sense of group membership affect users' behavior. Another part of the work develops group interaction models to illuminate different aspects of group interactions on the cyber-argumentation platform.
This research can help develop web API recommendation models tailored to the mashup domain and opinion prediction models for the cyber-argumentation area. Primarily, these models utilize domain-specific knowledge and integrate it with traditional prediction and recommendation approaches. Our work on group analytics can be seen as an initial step toward understanding these group interactions.
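To make the sparsity problem concrete, here is a minimal matrix-factorization sketch (an illustrative baseline, not the dissertation's model; the toy matrix and hyperparameters are assumptions) that learns latent factors from a sparse binary mashup-API invocation matrix and scores unobserved pairs for recommendation:

```python
import numpy as np

def factorize(M, k=2, steps=500, lr=0.05, reg=0.01, seed=0):
    # SGD matrix factorization of a sparse invocation matrix
    # (np.nan = unobserved). Scores for unobserved cells rank
    # candidate web APIs for each mashup.
    rng = np.random.default_rng(seed)
    n, m = M.shape
    P = rng.normal(scale=0.1, size=(n, k))   # mashup latent factors
    Q = rng.normal(scale=0.1, size=(m, k))   # web API latent factors
    observed = np.argwhere(~np.isnan(M))
    for _ in range(steps):
        for i, j in observed:
            e = M[i, j] - P[i] @ Q[j]
            P[i] += lr * (e * Q[j] - reg * P[i])
            Q[j] += lr * (e * P[i] - reg * Q[j])
    return P, Q

M = np.array([
    [1.0, 1.0, np.nan, 0.0],
    [1.0, 1.0, 1.0, np.nan],
    [np.nan, 1.0, 1.0, 0.0],
])
P, Q = factorize(M)
scores = P @ Q.T          # scores[i, j]: predicted affinity of mashup i for API j
print(scores.shape)       # (3, 4)
```

With very few observed invocations this plain model overfits, which is why the work above supplements it with domain-specific information.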
Gaussian-based Probabilistic Deep Supervision Network for Noise-Resistant QoS Prediction
Quality of Service (QoS) prediction is an essential task in recommendation
systems, where accurately predicting unknown QoS values can improve user
satisfaction. However, existing QoS prediction techniques may perform poorly in
the presence of noisy data, such as fake location information or virtual
gateways. In this paper, we propose the Probabilistic Deep Supervision Network
(PDS-Net), a novel framework for QoS prediction that addresses this issue.
PDS-Net utilizes a Gaussian-based probabilistic space to supervise intermediate
layers and learns probability spaces for both known features and true labels.
Moreover, PDS-Net employs a condition-based multitasking loss function to
identify objects with noisy data and applies supervision directly to deep
features sampled from the probability space by optimizing the Kullback-Leibler
divergence between the probability space of these objects and the real-label
probability space. Thus, PDS-Net effectively reduces errors resulting from the
propagation of corrupted data, leading to more accurate QoS predictions.
Experimental evaluations on two real-world QoS datasets demonstrate that the
proposed PDS-Net outperforms state-of-the-art baselines, validating the
effectiveness of our approach.
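The Kullback-Leibler term described above has a closed form when both distributions are diagonal Gaussians. A minimal sketch of that formula (the generic Gaussian KL expression, not the PDS-Net code):

```python
import numpy as np

def kl_diag_gaussians(mu1, logvar1, mu2, logvar2):
    # KL(N1 || N2) for diagonal Gaussians, summed over dimensions.
    # Minimizing this pulls the learned feature distribution N1
    # toward the label distribution N2.
    v1, v2 = np.exp(logvar1), np.exp(logvar2)
    return 0.5 * np.sum(logvar2 - logvar1 + (v1 + (mu1 - mu2) ** 2) / v2 - 1.0)

zeros = np.zeros(3)
print(kl_diag_gaussians(zeros, zeros, zeros, zeros))        # identical -> 0.0
print(kl_diag_gaussians(zeros + 1.0, zeros, zeros, zeros))  # mean shift -> 1.5
```

Parameterizing the variance as a log-variance keeps it positive without constraints, a common choice when such a space is learned by a network.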
Machine Learning in Wireless Sensor Networks: Algorithms, Strategies, and Applications
Wireless sensor networks monitor dynamic environments that change rapidly
over time. This dynamic behavior is either caused by external factors or
initiated by the system designers themselves. To adapt to such conditions,
sensor networks often adopt machine learning techniques to eliminate the need
for unnecessary redesign. Machine learning also inspires many practical
solutions that maximize resource utilization and prolong the lifespan of the
network. In this paper, we present an extensive literature review over the
period 2002-2013 of machine learning methods that were used to address common
issues in wireless sensor networks (WSNs). The advantages and disadvantages of
each proposed algorithm are evaluated against the corresponding problem. We
also provide a comparative guide to aid WSN designers in developing suitable
machine learning solutions for their specific application challenges.Comment: Accepted for publication in IEEE Communications Surveys and Tutorial
An Approach of QoS Evaluation for Web Services Design With Optimized Avoidance of SLA Violations
Quality of service (QoS) is an official agreement that governs the contractual commitments between service providers and consumers with respect to various nonfunctional requirements, such as performance, dependability, and security. As more Web services become available for the construction of software systems based upon service-oriented architecture (SOA), QoS has become a decisive factor for service consumers choosing among service providers who offer similar services. QoS is usually documented in a service-level agreement (SLA) to ensure the functionality and quality of services and to define monetary penalties in case of any violation of the written agreement. Consequently, service providers have a strong interest in keeping their commitments to avoid and reduce the situations that may cause SLA violations. However, there is a noticeable shortage of tools that service providers can use either to quantitatively evaluate the QoS of their services for the prediction of SLA violations or to actively adjust their design for the avoidance of SLA violations through optimized service reconfigurations. Developed in this dissertation research is an innovative framework that tackles the problem of SLA violations in three separate yet connected phases. For a given SOA system under examination, the framework employs sensitivity analysis in the first phase to identify factors that are influential to system performance, and the impact of the influential factors on QoS is then quantitatively measured with a metamodel-based analysis in the second phase. The results of these analyses are used in the third phase to search both globally and locally for optimal solutions via a controlled number of experiments. In addition to technical details, this dissertation includes experiment results demonstrating that this new approach can help service providers not only predict SLA violations but also avoid unnecessary increases in operational cost during service optimization.
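The first phase above, sensitivity analysis, can be illustrated with a simple one-at-a-time perturbation sketch (the configuration factors and the response-time model here are hypothetical, chosen only to show the mechanics):

```python
import numpy as np

def oat_sensitivity(model, base, deltas):
    # One-at-a-time sensitivity analysis: perturb each configuration
    # factor in isolation and record the change in the QoS metric,
    # to rank which factors most influence performance.
    base_qos = model(base)
    effects = {}
    for name, delta in deltas.items():
        perturbed = dict(base)
        perturbed[name] += delta
        effects[name] = model(perturbed) - base_qos
    return effects

# Hypothetical response-time model: more threads help, larger payloads hurt.
def response_time(cfg):
    return 100.0 / cfg["threads"] + 0.5 * cfg["payload_kb"]

base = {"threads": 4, "payload_kb": 64}
effects = oat_sensitivity(response_time, base, {"threads": 1, "payload_kb": 16})
print(effects)   # {'threads': -5.0, 'payload_kb': 8.0}
```

Factors with the largest absolute effects would then be carried into the metamodel-based analysis of the second phase; variance-based methods (e.g. Sobol indices) would additionally capture interactions that one-at-a-time perturbation misses.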