Empirical study of sensor observation services server instances
The number of Sensor Observation Service (SOS) instances available online has
been increasing in the last few years. The SOS specification standardises
interfaces and data formats for exchanging sensor-related information between
information providers and consumers. SOS, in conjunction with other
specifications in the Sensor Web Enablement initiative, attempts to realise
the Sensor Web vision: a worldwide system in which sensor networks of any kind
are interconnected. In this paper we present an empirical study of actual
instances of servers implementing SOS. The study focuses mostly on which parts
of the specification are most frequently included in real implementations, and
on how well exchanged messages follow the structure defined by the XML Schema
files. Our findings can be of practical use when implementing servers and
clients based on the SOS specification, as they can be optimized for common
scenarios.
Comment: 25 pages, 11 tables, 6 figures
A Federated Filtering Framework for Internet of Medical Things
In the dominant paradigm, the wearable IoT devices used in the healthcare
sector, also known as the Internet of Medical Things (IoMT), are resource
constrained in power and computational capabilities. IoMT devices continuously
push their readings to remote cloud servers for real-time data analytics,
which causes faster drainage of the device battery. Other demerits of
continuously centralising data include exposed privacy and high latency. This
paper presents a novel Federated Filtering Framework for IoMT devices, based
on the prediction of data at a central fog server using shared models provided
by the local IoMT devices. The fog server performs model averaging to predict
the aggregated data matrix and also computes filter parameters for the local
IoMT devices. Two significant theoretical contributions of this paper are the
global tolerable perturbation error and the local filtering parameter, where
the former controls the decision-making accuracy due to eigenvalue
perturbation and the latter balances the trade-off between the communication
overhead and the perturbation error of the aggregated data matrix (predicted
matrix) at the fog server. Experimental evaluation based on real healthcare
data demonstrates that the proposed scheme saves up to 95% of the
communication cost while maintaining reasonable data privacy and low latency.
Comment: 6 pages, 6 figures, accepted for oral presentation at IEEE ICC 2019.
Internet of Things, Federated Learning and Perturbation theory
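The model-averaging and filtering steps in the abstract above can be sketched roughly as follows. This is a generic illustration under assumed names (`federated_average`, `should_transmit`), not the paper's exact aggregation or threshold derivation:

```python
import numpy as np


def federated_average(local_models, sizes=None):
    """Average parameter vectors shared by local devices at the fog server.

    local_models: list of 1-D parameter arrays, one per IoMT device.
    sizes: optional per-device sample counts for weighted averaging.
    """
    models = np.stack([np.asarray(m, dtype=float) for m in local_models])
    if sizes is None:
        return models.mean(axis=0)
    w = np.asarray(sizes, dtype=float)
    return (models * w[:, None]).sum(axis=0) / w.sum()


def should_transmit(reading, predicted, delta):
    """A device sends a reading to the fog only when it deviates from the
    fog's prediction by more than the local filtering threshold `delta`;
    otherwise the prediction stands in for it, saving communication."""
    return abs(reading - predicted) > delta
```

The communication saving comes from `should_transmit`: readings close to the server-side prediction are never sent at all.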
QUAL : A Provenance-Aware Quality Model
The research described here is supported by the award made by the RCUK Digital Economy programme to the dot.rural Digital Economy Hub; award reference: EP/G066051/1. Peer reviewed. Postprint.
Dealing with large schema sets in mobile SOS-based applications
Although the adoption of OGC Web Services in server, desktop and web
applications has been successful, their penetration in mobile devices has been
slow. One of the main reasons is the performance problem associated with XML
processing, which consumes a lot of memory and processing time, both scarce
resources on a mobile device. In this paper we propose an algorithm to
generate efficient XML data-binding code for mobile SOS-based applications.
The algorithm takes advantage of the fact that individual implementations use
only some portions of the standards' schemas, which allows the simplification
of large XML schema sets in an application-specific manner by using a subset
of XML instance files conforming to these schemas.
Comment: 9 pages, 2 tables, 7 figures
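The first step of an instance-driven simplification like the one described above is determining which schema components are actually exercised. A minimal sketch, assuming the simplification is seeded by the set of element names occurring in sample instance documents (the full algorithm in the paper does more than this):

```python
import xml.etree.ElementTree as ET


def used_element_names(xml_documents):
    """Collect the set of element names that actually occur in a sample
    of XML instance documents.

    Binding code then only needs to cover these elements rather than
    the full schema set; namespaced tags keep their {uri}local form.
    """
    names = set()
    for doc in xml_documents:
        root = ET.fromstring(doc)
        for elem in root.iter():  # iterates root and all descendants
            names.add(elem.tag)
    return names
```

Elements that never appear across the instance subset are candidates for pruning from the generated data-binding classes.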
Quality assessment technique for ubiquitous software and middleware
Ubiquitous computing systems are the new paradigm of computing and information systems. The technology-oriented issues of ubiquitous computing systems have led researchers to pay more attention to feasibility studies of the technologies than to building quality-assurance indices or guidelines. In this context, measuring quality is the key to developing high-quality ubiquitous computing products. For this reason, various quality models have been defined, adopted and enhanced over the years; for example, the recognised standard quality model ISO/IEC 9126 is the result of a consensus on a software quality model at three levels: characteristics, sub-characteristics, and metrics. However, this scheme is unlikely to be directly applicable to ubiquitous computing environments, which differ considerably from conventional software, so considerable effort is being devoted to reformulating existing methods and, especially, to elaborating new assessment techniques for ubiquitous computing environments. This paper selects appropriate quality characteristics for the ubiquitous computing environment, which can be used as the quality target for both ubiquitous computing product evaluation processes and development processes. Further, each of the quality characteristics has been expanded with evaluation questions and metrics, and in some cases with measures. In addition, this quality model has been applied in an industrial ubiquitous computing setting. These applications have revealed that, while the approach is sound, some parts need further development in the future.
Secure Pick Up: Implicit Authentication When You Start Using the Smartphone
We propose Secure Pick Up (SPU), a convenient, lightweight, in-device,
non-intrusive and automatic-learning system for smartphone user authentication.
Operating in the background, our system implicitly observes users' phone
pick-up movements, the way they bend their arms when they pick up a smartphone
to interact with the device, to authenticate the users.
Our SPU outperforms the state-of-the-art implicit authentication mechanisms
in three main aspects: 1) SPU automatically learns the user's behavioral
pattern without requiring a large amount of training data (especially those of
other users) as previous methods did, making it more deployable. Towards this
end, we propose a weighted multi-dimensional Dynamic Time Warping (DTW)
algorithm to effectively quantify similarities between users' pick-up
movements; 2) SPU does not rely on a remote server for providing further
computational power, making SPU efficient and usable even without network
access; and 3) our system can adaptively update a user's authentication model
to accommodate the user's behavioral drift over time with negligible overhead.
Through extensive experiments on real-world datasets, we demonstrate that SPU
can achieve authentication accuracy of up to 96.3% with a very low latency of
2.4 milliseconds. It reduces the number of times a user has to perform
explicit authentication by 32.9%, while effectively defending against various
attacks.
Comment: Published at the ACM Symposium on Access Control Models and
Technologies (SACMAT) 201
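The core similarity measure named in the abstract above, a weighted multi-dimensional Dynamic Time Warping, can be sketched in a generic form. The per-dimension weights and frame cost below are illustrative assumptions, not the authors' exact weighting scheme:

```python
import numpy as np


def dtw_distance(a, b, weights=None):
    """Weighted multi-dimensional DTW between two motion sequences.

    a, b: arrays of shape (length, dims), e.g. accelerometer frames.
    weights: optional per-dimension weights (defaults to uniform).
    Returns the accumulated cost of the optimal warping path.
    """
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    w = np.ones(a.shape[1]) if weights is None else np.asarray(weights, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Weighted Euclidean distance between frames i-1 and j-1.
            cost = np.sqrt(np.sum(w * (a[i - 1] - b[j - 1]) ** 2))
            # Extend the cheapest of the three admissible warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Two identical pick-up traces yield distance 0; authentication would threshold this distance against enrolled templates.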