Lipschitz Optimisation for Lipschitz Interpolation
Techniques known as Nonlinear Set Membership prediction, Kinky Inference or
Lipschitz Interpolation are fast and numerically robust approaches to
nonparametric machine learning that have been proposed to be utilised in the
context of system identification and learning-based control. They utilise
presupposed Lipschitz properties in order to compute inferences over unobserved
function values. Unfortunately, most of these approaches rely on exact
knowledge about the input space metric as well as about the Lipschitz constant.
Furthermore, existing techniques to estimate the Lipschitz constants from the
data are not robust to noise or seem to be ad-hoc and typically are decoupled
from the ultimate learning and prediction task. To overcome these limitations,
we propose an approach for optimising parameters of the presupposed metrics by
minimising validation set prediction errors. To avoid poor performance due to
local minima, we propose to utilise Lipschitz properties of the optimisation
objective to ensure global optimisation success. The resulting approach is a
new flexible method for nonparametric black-box learning. We provide
experimental evidence of the competitiveness of our approach on artificial as
well as on real data.
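Lipschitz Interpolation (Kinky Inference) predicts an unobserved function value as the midpoint between the tightest upper and lower bounds implied by the Lipschitz property of the data. A minimal sketch under simplifying assumptions (a known scalar Lipschitz constant and the Euclidean metric, whereas the paper's point is to optimise the metric parameters on a validation set):

```python
import numpy as np

def lipschitz_interpolation(x, X, y, L):
    """Kinky-inference prediction at query point x.

    Upper bound: min_i (y_i + L * ||x - x_i||)
    Lower bound: max_i (y_i - L * ||x - x_i||)
    The prediction is the midpoint of the two bounds.
    """
    d = np.linalg.norm(X - x, axis=1)   # distances to training inputs
    upper = np.min(y + L * d)           # tightest Lipschitz upper bound
    lower = np.max(y - L * d)           # tightest Lipschitz lower bound
    return 0.5 * (upper + lower)

# Toy example: interpolate f(x) = |x| from three samples with L = 1
X = np.array([[-1.0], [0.0], [1.0]])
y = np.abs(X).ravel()
print(lipschitz_interpolation(np.array([0.5]), X, y, L=1.0))  # -> 0.5
```

The validation-set optimisation described in the abstract would wrap this predictor in an outer loop that searches over metric parameters (here, effectively L), itself exploiting Lipschitz properties of the validation error to guarantee global optimisation.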
Anticipatory Mobile Computing: A Survey of the State of the Art and Research Challenges
Today's mobile phones are far from the mere communication devices they were ten
years ago. Equipped with sophisticated sensors and advanced computing hardware,
phones can be used to infer users' location, activity, social setting and more.
As devices become increasingly intelligent, their capabilities evolve beyond
inferring context to predicting it, and then reasoning and acting upon the
predicted context. This article provides an overview of the current state of
the art in mobile sensing and context prediction paving the way for
full-fledged anticipatory mobile computing. We present a survey of phenomena
that mobile phones can infer and predict, and offer a description of machine
learning techniques used for such predictions. We then discuss proactive
decision making and decision delivery via the user-device feedback loop.
Finally, we discuss the challenges and opportunities of anticipatory mobile
computing.

Comment: 29 pages, 5 figures
Trust and obfuscation principles for quality of information in emerging pervasive environments
Non peer reviewed. Postprint.
Location data and privacy: a framework for analysis
Innovative services have exploited data about users' physical location, sometimes but not always explicitly with their consent. As new applications that reveal users' location data appear on the Web, it is essential to focus on the privacy implications, in particular with respect to inferences about context. This paper focuses on the understanding of location and contextual privacy by developing a framework for analysis, which is applied to existing systems that exploit location data. The analysis highlights the primal role of location in linking and inferring contextual data, but also how these inferences can extend to non-contextual data.
The Significance Of It All: Corporate Disclosure Obligations In Matrixx Initiatives, Inc. v. Siracusano
A Wiener model consists of a linear dynamic system followed by a static nonlinearity. The input and output are measured, but not the intermediate signal. We discuss the Maximum Likelihood estimate for Gaussian measurement and process noise, and the special cases when one of the noise sources is zero.
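The structure described above can be sketched by simulating one: a linear filter produces an unmeasured intermediate signal, and a static nonlinearity maps it to the measured output. The first-order filter and the tanh nonlinearity here are illustrative assumptions, not the paper's specific identification setup:

```python
import numpy as np

def simulate_wiener(u, a, b, nonlinearity):
    """Simulate a Wiener model: first-order linear filter, then a static map.

    Intermediate signal (unmeasured): z[t] = a * z[t-1] + b * u[t]
    Measured output:                  y[t] = g(z[t])
    """
    z = np.zeros(len(u))
    for t in range(len(u)):
        z[t] = a * (z[t - 1] if t > 0 else 0.0) + b * u[t]
    return nonlinearity(z)

# Toy example: step input through a saturating nonlinearity g(z) = tanh(z)
u = np.ones(5)
y = simulate_wiener(u, a=0.5, b=1.0, nonlinearity=np.tanh)
```

Maximum Likelihood estimation would fit the filter coefficients and a parametrised nonlinearity to measured (u, y) pairs, marginalising over the hidden z when process noise is present.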
Helmholtz's Physiological Psychology
Hermann von Helmholtz (1821-1894) established results both controversial and enduring: analysis of mixed colors and of combination tones, arguments against nativism, and the analysis of sensation and perception using the techniques of natural science. The paper focuses on Helmholtz's account of sensation, perception, and representation via "physiological psychology". Helmholtz emphasized that external stimuli of sensations are causes, and sensations are their effects, and he had a practical and naturalist orientation toward the analysis of phenomenal experience. However, he argued as well that sensation must be interpreted to yield representation, and that representation is geared toward objective representation (the central thesis of contemporary intentionalism). The interpretation of sensation is based on "facts" revealed in experiment, but extends to the analysis of the quantitative, causal relationships between stimuli and responses. A key question for Helmholtz's theory is the extent to which mental operations are to be ascribed a role in interpreting sensation.
Predictive intelligence to the edge through approximate collaborative context reasoning
We focus on Internet of Things (IoT) environments where a network of sensing and computing devices is responsible for locally processing contextual data, reasoning over it, and collaboratively inferring the occurrence of a specific phenomenon (event). Pushing processing and knowledge inference to the edge of the IoT network allows the complexity of the event reasoning process to be distributed into many manageable pieces and to be physically located at the source of the contextual information. This enables a huge amount of rich data streams to be processed in real time that would be prohibitively complex and costly to deliver on a traditional centralized Cloud system. We propose a lightweight, energy-efficient, distributed, adaptive, multiple-context-perspective event reasoning model under uncertainty on each IoT device (sensor/actuator). Each device senses and processes context data and infers events based on different local context perspectives: (i) expert knowledge on event representation, (ii) outlier inference, and (iii) deviation from locally predicted context. This novel approximate reasoning paradigm is achieved through a contextualized, collaborative belief-driven clustering process, where clusters of devices are formed according to their belief in the presence of events. Our distributed and federated intelligence model efficiently identifies any localized abnormality in the contextual data by aggregating local degrees of belief, and updates and adjusts its knowledge in light of contextual data outliers and novelty detection. We provide a comprehensive experimental and comparative assessment of our model over real contextual data against other localized and centralized event detection models, and show the benefits stemming from its adoption, achieving up to three orders of magnitude less energy consumption and high quality of inference.
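The three local context perspectives and the belief aggregation can be illustrated with a deliberately simplified sketch. The weighted-sum combination, the mean aggregation across a cluster, and all thresholds below are assumptions standing in for the paper's belief-driven clustering process, not its actual model:

```python
def local_belief(expert_score, outlier_score, prediction_deviation,
                 weights=(0.4, 0.3, 0.3)):
    """Combine a device's three local perspectives into one degree of belief.

    All inputs are assumed normalised to [0, 1]; the weights are illustrative.
    """
    w1, w2, w3 = weights
    return w1 * expert_score + w2 * outlier_score + w3 * prediction_deviation

def cluster_event(beliefs, threshold=0.6):
    """Declare an event when the mean belief across clustered devices exceeds
    a threshold (a simple stand-in for the paper's belief aggregation)."""
    return sum(beliefs) / len(beliefs) >= threshold

# Two devices, both observing strong local evidence of an event
beliefs = [local_belief(0.9, 0.8, 0.7), local_belief(0.8, 0.9, 0.6)]
print(cluster_event(beliefs))  # -> True
```

In the actual model, cluster membership itself would depend on the beliefs, and devices would continuously update their local knowledge from outliers and novelty rather than use fixed weights.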
Supporting Defect Causal Analysis in Practice with Cross-Company Data on Causes of Requirements Engineering Problems
[Context] Defect Causal Analysis (DCA) represents an efficient practice to
improve software processes. While knowledge on cause-effect relations is
helpful to support DCA, collecting cause-effect data may require significant
effort and time. [Goal] We propose and evaluate a new DCA approach that uses
cross-company data to support the practical application of DCA. [Method] We
collected cross-company data on causes of requirements engineering problems
from 74 Brazilian organizations and built a Bayesian network. Our DCA approach
uses the diagnostic inference of the Bayesian network to support DCA sessions.
We evaluated our approach by applying a model for technology transfer to
industry and conducted three consecutive evaluations: (i) in academia, (ii)
with industry representatives of the Fraunhofer Project Center at UFBA, and
(iii) in an industrial case study at the Brazilian National Development Bank
(BNDES). [Results] We received positive feedback in all three evaluations and
the cross-company data was considered helpful for determining main causes.
[Conclusions] Our results strengthen our confidence that supporting DCA with
cross-company data is promising and should be further investigated.

Comment: 10 pages, 8 figures, accepted for the 39th International Conference
on Software Engineering (ICSE'17)
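Diagnostic inference in a Bayesian network answers the question the DCA sessions pose: given that a problem was observed, which candidate causes are most probable? A minimal two-level sketch using Bayes' rule, where the causes, priors, and likelihoods are invented for illustration and are not taken from the 74-organization data:

```python
def diagnose(priors, likelihoods):
    """P(cause | problem observed) for a two-level network: causes -> problem.

    priors:      P(cause) for each candidate cause
    likelihoods: P(problem | cause)
    Treats causes as mutually exclusive for simplicity (an assumption).
    """
    joint = {c: priors[c] * likelihoods[c] for c in priors}
    evidence = sum(joint.values())   # P(problem), by total probability
    return {c: joint[c] / evidence for c in joint}

# Hypothetical causes of a requirements engineering problem (numbers made up)
priors = {"unclear requirements": 0.5,
          "lack of domain knowledge": 0.3,
          "communication gaps": 0.2}
likelihoods = {"unclear requirements": 0.8,
               "lack of domain knowledge": 0.6,
               "communication gaps": 0.4}
posterior = diagnose(priors, likelihoods)
```

A DCA session would then rank causes by posterior probability and discuss the top candidates, with the cross-company network supplying priors and likelihoods the individual organization lacks.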