Unsupervised Learning from Narrated Instruction Videos
We address the problem of automatically learning the main steps to complete a
certain task, such as changing a car tire, from a set of narrated instruction
videos. The contributions of this paper are three-fold. First, we develop a new
unsupervised learning approach that takes advantage of the complementary nature
of the input video and the associated narration. The method solves two
clustering problems, one in text and one in video, applied one after the other
and linked by joint constraints to obtain a single coherent sequence of steps
in both modalities. Second, we collect and annotate a new challenging dataset
of real-world instruction videos from the Internet. The dataset contains about
800,000 frames for five different tasks that include complex interactions
between people and objects, and are captured in a variety of indoor and outdoor
settings. Third, we experimentally demonstrate that the proposed method can
automatically discover, in an unsupervised manner, the main steps to achieve
the task and locate the steps in the input videos.
Comment: Appears in: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2016). 21 pages
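The linking of the two clustering problems can be illustrated with a toy sketch. Everything below is hypothetical: the step labels are invented, and a plain longest-common-subsequence alignment stands in for the paper's joint constraints, merely to show how two per-modality label sequences can be forced into one coherent ordered step list:

```python
# Toy sketch (not the authors' method): after clustering narration segments
# and video segments separately, link the two label sequences with a
# monotonic alignment so both modalities agree on one ordered step list.

def align_steps(text_labels, video_labels):
    """Monotonic matching between two cluster-label sequences via the
    classic longest-common-subsequence DP; matched labels are taken as
    the shared, ordered sequence of task steps."""
    n, m = len(text_labels), len(video_labels)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if text_labels[i - 1] == video_labels[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # Backtrack to recover the shared step sequence.
    steps, i, j = [], n, m
    while i > 0 and j > 0:
        if text_labels[i - 1] == video_labels[j - 1]:
            steps.append(text_labels[i - 1])
            i, j = i - 1, j - 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return steps[::-1]

# Invented labels for a "changing a tire" task, one sequence per modality.
text = ["jack_up", "unscrew", "remove_wheel", "mount_spare"]
video = ["unscrew", "jack_up", "unscrew", "remove_wheel", "mount_spare"]
print(align_steps(text, video))  # -> ['jack_up', 'unscrew', 'remove_wheel', 'mount_spare']
```

The paper's actual formulation is richer (joint constraints during clustering, not post-hoc alignment), but the sketch shows why temporal monotonicity across modalities yields a single step sequence.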
AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments
This report considers the application of Artificial Intelligence (AI) techniques to
the problem of misuse detection and misuse localisation within telecommunications
environments. A broad survey of techniques is provided, that covers inter alia
rule based systems, model-based systems, case based reasoning, pattern matching,
clustering and feature extraction, artificial neural networks, genetic algorithms,
artificial immune systems, agent based systems, data mining and a variety of hybrid
approaches. The report then considers the central issue of event correlation, that
is at the heart of many misuse detection and localisation systems. The notion of
being able to infer misuse by the correlation of individual temporally distributed
events within a multiple data stream environment is explored, together with a range of
techniques covering model-based approaches, `programmed' AI and machine learning
paradigms. It is found that, in general, correlation is best achieved via rule based approaches,
but that these suffer from a number of drawbacks, such as the difficulty of
developing and maintaining an appropriate knowledge base, and the lack of ability
to generalise from known misuses to new unseen misuses. Two distinct approaches
are evident. One attempts to encode knowledge of known misuses, typically within
rules, and use this to screen events. This approach cannot generally detect misuses
for which it has not been programmed, i.e. it is prone to issuing false negatives.
The other attempts to `learn' the features of event patterns that constitute normal
behaviour, and, by observing patterns that do not match expected behaviour, detect
when a misuse has occurred. This approach is prone to issuing false positives,
i.e. inferring misuse from innocent patterns of behaviour that the system was not
trained to recognise. Contemporary approaches are seen to favour hybridisation,
often combining detection or localisation mechanisms for both abnormal and normal
behaviour, the former to capture known cases of misuse, the latter to capture
unknown cases. In some systems, these mechanisms even work together to update
each other to increase detection rates and lower false positive rates. It is concluded
that hybridisation offers the most promising future direction, but that a rule or state
based component is likely to remain, being the most natural approach to the correlation
of complex events. The challenge, then, is to mitigate the weaknesses of
canonical programmed systems such that learning, generalisation and adaptation
are more readily facilitated
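The two complementary approaches described above (programmed rules for known misuse, a learned profile of normal behaviour for unknown misuse) can be sketched in a few lines. The feature names, thresholds and "signatures" below are invented purely for illustration and do not come from the report:

```python
# Hypothetical hybrid misuse detector: a programmed rule base screens for
# known misuse signatures (prone to false negatives), while a profile of
# normal behaviour learnt from history flags deviations as possible
# unknown misuse (prone to false positives).

KNOWN_MISUSE_RULES = [
    lambda e: e["calls_per_min"] > 100,                             # assumed flooding signature
    lambda e: e["dest"] == "premium_rate" and e["duration_s"] < 2,  # assumed dialer signature
]

def fit_normal_profile(history):
    """'Learn' normal behaviour; here simply per-feature bounds."""
    rates = [e["calls_per_min"] for e in history]
    return (min(rates), max(rates))

def classify(event, profile):
    if any(rule(event) for rule in KNOWN_MISUSE_RULES):
        return "known misuse"                       # rule base fired
    lo, hi = profile
    if not (lo <= event["calls_per_min"] <= hi):
        return "anomaly (possible unknown misuse)"  # outside normal profile
    return "normal"

profile = fit_normal_profile([{"calls_per_min": r} for r in (3, 5, 8)])
print(classify({"calls_per_min": 150, "dest": "local", "duration_s": 60}, profile))  # -> known misuse
print(classify({"calls_per_min": 20, "dest": "local", "duration_s": 60}, profile))   # -> anomaly (possible unknown misuse)
```

In the hybrid systems the report favours, the two stages could additionally feed back into each other, for example by promoting repeatedly confirmed anomalies into new rules.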
Business Process Maturity at Agricultural Commodities Company
Agricultural commodities companies nowadays strive to keep transforming their
business processes in accordance with fast-changing demands in order to survive
the intense global competition.
In an attempt to provide stakeholders with insight into the business process, this paper
investigates how to model business processes with Business Process Modelling Notation
and assesses process maturity using Gartner's model.
This research is based on an in-depth observation at the purchasing and production
division of an agricultural commodities company.
It is found that the business process model captures the knowledge of the business in
its existing context. The other findings show that the maturity level of the people
dimension is at phase 1, while the maturity levels of IT, strategic alignment, culture
and leadership, governance and methods have already reached phase 2.
These findings will help the company plan larger-scale improvements in the future
Data Mining Framework for Monitoring Attacks In Power Systems
Vast deployment of Wide Area Measurement Systems (WAMS) has facilitated increased understanding and intelligent management of today's complex power systems. Phasor Measurement Units (PMUs), being an integral part of WAMS, transmit high-quality system information to the control centers every second. With the North American SynchroPhasor Initiative (NASPI), the number of PMUs deployed across the system has been growing rapidly. With this increase in the number of PMU units, the amount of data accumulated is also growing tremendously, which necessitates the use of sophisticated data processing, data reduction, data analysis and data mining techniques. WAMS is also closely associated with information and communication technologies that are capable of implementing intelligent protection and control actions in order to improve the reliability and efficiency of existing power systems. Along with the myriad advantages that these measurement systems and information and communication technologies bring, they also lead to a close synergy between heterogeneous physical and cyber components, which has unlocked access points for easy cyber intrusion. This easy access has resulted in various cyber attacks on control equipment, consequently increasing the vulnerability of power systems.
This research proposes a data mining based methodology that is capable of identifying attacks in the system using real-time data. The proposed methodology employs an online clustering technique to monitor only a limited number of measuring units (PMUs) deployed across the system. Two different classification algorithms are implemented to detect the occurrence of attacks along with their location. This research also proposes a methodology to differentiate physical attacks from malicious data attacks and to declare attack severity and criticality.
The proposed methodology is implemented on the IEEE 24-Bus Reliability Test System using data generated for attacks at different locations, under different system topologies and operating conditions. Different cross-validation studies are performed to determine all the user-defined variables involved in the data mining studies. The performance of the proposed methodology is fully analyzed and the results are demonstrated. Finally, the strengths and limitations of the proposed approach are discussed
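The online-clustering stage can be illustrated with a toy one-dimensional monitor. The class below is a sketch under assumptions, not the thesis code: readings close to an existing cluster of normal behaviour update its running mean, while readings far from every cluster are flagged as candidate attacks for the downstream classification stage. Seed values and the threshold are invented:

```python
# Illustrative streaming monitor for a single PMU measurement channel
# (e.g. frequency in Hz). Clusters of normal behaviour are seeded from a
# trusted pre-attack window; deviant readings are flagged, not learnt.

class OnlineMonitor:
    def __init__(self, seed_readings, threshold):
        self.threshold = threshold
        # Each cluster is [running_mean, count], seeded from trusted data.
        self.clusters = [[r, 1] for r in seed_readings]

    def observe(self, x):
        nearest = min(self.clusters, key=lambda c: abs(x - c[0]))
        if abs(x - nearest[0]) <= self.threshold:
            # Incrementally update the running mean of the nearest cluster.
            nearest[0] = (nearest[0] * nearest[1] + x) / (nearest[1] + 1)
            nearest[1] += 1
            return "normal"
        return "flagged"  # candidate attack: hand off to the classifier stage

monitor = OnlineMonitor(seed_readings=[60.0, 59.98], threshold=0.05)
print(monitor.observe(60.01))  # close to a seed cluster -> normal
print(monitor.observe(59.20))  # far from every cluster  -> flagged
```

A flagged reading would then go to the classifiers that locate the attack and distinguish physical from malicious-data attacks, as described in the abstract.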
Application of Advanced Early Warning Systems with Adaptive Protection
This project developed and field-tested two methods of Adaptive Protection systems utilizing synchrophasor data. One method detects conditions of system stress that can lead to unintended relay operation, and initiates a supervisory signal to modify relay response in real time to avoid false trips. The second method detects the possibility of false trips of impedance relays as stable system swings “encroach” on the relays’ impedance zones, and produces an early warning so that relay engineers can re-evaluate relay settings. In addition, real-time synchrophasor data produced by this project was used to develop advanced visualization techniques for display of synchrophasor data to utility operators and engineers
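The "encroachment" check in the second method can be illustrated with a toy calculation. A mho relay's operating characteristic is a circle in the complex impedance plane; the centre, radius and warning margin below are invented for illustration and are not the project's settings:

```python
# Toy encroachment check: a mho relay trips when the apparent impedance
# enters a circular zone in the complex impedance plane. An early-warning
# function flags when a stable swing brings the apparent impedance within
# a margin of that zone, before any trip occurs.

def zone_margin(z_apparent, z_center, radius):
    """Distance from the measured impedance to the edge of the mho circle
    (negative means the point is already inside the trip zone)."""
    return abs(z_apparent - z_center) - radius

def early_warning(z_apparent, z_center, radius, margin=0.2):
    return zone_margin(z_apparent, z_center, radius) < margin

# Invented settings: circle through the origin, centre at half line impedance.
center = complex(2, 4)          # ohms
radius = abs(center)
print(early_warning(complex(10, 12), center, radius))  # far from zone -> False
print(early_warning(complex(5, 7), center, radius))    # inside zone   -> True
```

A real scheme would compute the apparent impedance from synchrophasor voltage and current phasors and track its trajectory over time, but the geometric test is the core of the warning.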
Parallelisation of EST clustering
Master of Science - Science
The field of bioinformatics has been developing steadily, with computational problems related
to biology taking on an increased importance as further advances are sought. The large data sets involved in problems within computational biology have dictated a search for good, fast approximations to computationally complex problems. This research aims to improve a method used to discover and understand genes, which are small subsequences of DNA. A difficulty arises because genes contain parts we know to be functional and other parts we assume are non-functional, as their functions have not been
determined. Isolating the functional parts requires the use of natural biological processes
which perform this separation. However, these processes cannot read long sequences, forcing
biologists to break a long sequence into a large number of small sequences, then reading these. This creates the computational difficulty of categorizing the short fragments according to gene membership.
Expressed Sequence Tag Clustering is a technique used to facilitate the identification of expressed genes by grouping together similar fragments with the assumption that they belong to the same gene.
The aim of this research was to investigate the usefulness of distributed memory parallelisation
for the Expressed Sequence Tag Clustering problem. This was investigated empirically,
with a distributed system tested for speed against a sequential one. It was found that distributed memory parallelisation can be very effective in this domain.
The results showed a super-linear speedup for up to 100 processors; higher processor counts were not tested and would likely produce further speedups. The system was able to cluster 500,000 ESTs in 641 minutes using 101 processors
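The super-linear claim can be stated with the standard speedup and efficiency definitions. The timings below are illustrative placeholders, not figures from the thesis (which reports only the 101-processor run here, not the serial baseline):

```python
# Standard parallel performance metrics: speedup S = T_serial / T_parallel,
# efficiency E = S / p. Efficiency above 1.0 is super-linear speedup, often
# explained by the larger aggregate cache/memory of a distributed machine.

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    return speedup(t_serial, t_parallel) / n_procs

# Invented example: 1000 minutes serially vs. 8 minutes on 100 processors.
print(speedup(1000, 8))          # -> 125.0
print(efficiency(1000, 8, 100))  # -> 1.25  (> 1.0, i.e. super-linear)
```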