The Evolution of Neural Network-Based Chart Patterns: A Preliminary Study
A neural network-based chart pattern represents adaptive parametric features, including non-linear transformations, and a template that can be applied in the feature space. The search for neural network-based chart patterns has remained unexplored despite its potential expressiveness. In this paper, we formulate a general chart pattern search problem to enable cross-representational quantitative comparison of various search schemes. We suggest a HyperNEAT framework applying state-of-the-art deep neural network techniques to find attractive neural network-based chart patterns; these techniques enable fast evaluation and search of robust patterns, as well as bringing a performance gain. The proposed framework successfully found attractive patterns on the Korean stock market. We compared the newly found patterns with those found by different search schemes, showing that the proposed approach has potential.
Comment: 8 pages, in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2017), Berlin, Germany
Lucene4IR: Developing information retrieval evaluation resources using Lucene
The workshop and hackathon on developing Information Retrieval Evaluation Resources using Lucene (L4IR) was held on the 8th and 9th of September, 2016 at the University of Strathclyde in Glasgow, UK, and was funded by the ESF Elias Network. The event featured three main elements: (i) a series of keynote and invited talks on industry, teaching and evaluation; (ii) planning, coding and hacking, where a number of groups created modules and infrastructure to use Lucene to undertake TREC-based evaluations; and (iii) a number of breakout groups discussing challenges, opportunities and problems in bridging the divide between academia and industry, and how we can use Lucene for teaching and learning Information Retrieval (IR). The event brought together a blend of academics, experts and students wanting to learn, share and create evaluation resources for the community. The hacking was intense and the discussions lively, creating the basis of many useful tools but also raising numerous issues. It was clear that by adopting and contributing to the most widely used and supported open-source IR toolkit, there were many benefits for academics, students, researchers, developers and practitioners: a basis for stronger evaluation practices, increased reproducibility, more efficient knowledge transfer, greater collaboration between academia and industry, and shared teaching and training resources.
Log Anomaly Detection on EuXFEL Nodes
This article introduces a method to detect anomalies in the log data
generated by control system nodes at the European XFEL accelerator. The primary
aim of the proposed method is to provide operators with a comprehensive
understanding of the availability, status, and problems specific to each node.
This information is vital for ensuring smooth operation. The sequential
nature of logs and the absence of a rich text corpus specific to our
nodes pose significant limitations for traditional and learning-based
approaches to anomaly detection. To overcome these limitations, we propose a
method that uses word embeddings and models each node as a sequence of
these vectors that commonly co-occur, using a Hidden Markov Model (HMM). We
score individual log entries by computing a probability ratio between the
probability of the full log sequence including the new entry and the
probability of just the previous log entries, without the new entry. This ratio
indicates how probable the sequence becomes when the new entry is added. The
proposed approach can detect anomalies by scoring and ranking log entries from
EuXFEL nodes: entries that receive high scores are potential anomalies
that do not fit the routine of the node. This method provides a warning system
to alert operators about irregular log events that may indicate issues.
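The probability ratio described above reduces to a conditional probability: P(x_1..x_n) / P(x_1..x_{n-1}) = P(x_n | x_1..x_{n-1}), computable with the HMM forward algorithm. A minimal sketch with a toy two-state HMM (the transition/emission values and log-template IDs are illustrative, not taken from the paper):

```python
import numpy as np

# Toy HMM: 2 hidden states, 3 observable log-template IDs (illustrative values).
start = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
emit = np.array([[0.5, 0.4, 0.1],   # state 0 rarely emits template 2
                 [0.1, 0.2, 0.7]])

def log_likelihood(obs):
    """Scaled forward algorithm: log P(obs) under the HMM."""
    alpha = start * emit[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        log_p += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_p

def anomaly_score(prefix, new_entry):
    """log P(prefix) - log P(prefix + [new_entry]) = -log P(new | prefix).

    A rare continuation has low conditional probability, so it receives a
    high score, matching the ranking described in the abstract.
    """
    return log_likelihood(prefix) - log_likelihood(prefix + [new_entry])

routine = [0, 0, 1]
print(anomaly_score(routine, 2) > anomaly_score(routine, 0))  # rare entry scores higher
```

In practice the observations would be word-embedding-derived cluster IDs per log template, and the HMM parameters would be fit per node on historical logs.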
Harvesting online ontologies for ontology evolution
Ontologies need to evolve to keep their domain representation adequate. However, the process of identifying new domain changes and applying them to the ontology is tedious and time-consuming. Our hypothesis is that online ontologies can provide background knowledge to decrease user effort during ontology evolution, by integrating new domain concepts through automated relation discovery and relevance assessment techniques, while producing ontologies of similar quality to those built solely from the ontology engineers' knowledge. We propose, implement and evaluate solutions that exploit the conceptual connections and structure of online ontologies to, first, automatically suggest new additions to the ontology in the form of concepts derived from domain data, together with their corresponding connections to existing elements in the ontology; and second, to automatically evaluate the proposed changes in terms of relevance with respect to the ontology under evolution, relying on a novel pattern-based technique for relevance assessment. We also present in this thesis various experiments testing the feasibility of each proposed approach separately, in addition to an overall evaluation that validates our hypothesis that user time during evolution is indeed decreased through the use of online ontologies, with results comparable to fully manual ontology evolution.
A structured approach to malware detection and analysis in digital forensics investigation
A thesis submitted to the University of Bedfordshire in partial fulfilment of the requirements for the degree of PhD.
Within the World Wide Web (WWW), malware is considered one of the most serious threats to system security, with complex system issues caused by malware and spam. Networks and systems can be accessed and compromised by various types of malware, such as viruses, worms, Trojans, botnets and rootkits, which compromise systems through coordinated attacks. Malware often uses anti-forensic techniques to avoid detection and investigation. Moreover, the results of investigating such attacks are often ineffective and can create barriers to obtaining clear evidence, due to the lack of sufficient tools and the immaturity of forensics methodology. This research addressed various complexities faced by investigators in the detection and analysis of malware. In this thesis, the author identified the need for a new approach towards malware detection that focuses on a robust framework, and proposed a solution based on an extensive literature review and market research analysis. The literature review focussed on the different trials and techniques in malware detection to identify the parameters for developing a solution design, while market research was carried out to understand the precise nature of the current problem. The author termed the new approach and framework the triple-tier centralised online real-time environment (tri-CORE) malware analysis (TCMA). The tiers come from three distinctive phases of detection and analysis, dividing the entire research pattern into three different domains: the malware acquisition function; detection and analysis; and the database operational function. This framework design will contribute to the field of computer forensics by making the investigative process more effective and efficient.
By integrating a hybrid method for malware detection, the limitations associated with both static and dynamic methods are eliminated. This aids forensic experts in carrying out rapid investigative processes to detect the behaviour of the malware and its related elements. The proposed framework will help to ensure system confidentiality, integrity, availability and accountability. The current research also focussed on a prototype (artefact) that was developed in support of a different approach to digital forensics and malware detection methods. As such, a new toolkit was designed and implemented, based on a simple architectural structure and built from open-source software, which can help investigators develop the skills to respond critically to current cyber incidents and analyses.
Ubicorder: A mobile device for situated interactions with sensor networks
The Ubicorder is a mobile, location- and orientation-aware
device for browsing and interacting with real-time sensor
network data. In addition to browsing data, the Ubicorder also
provides a graphical user interface (GUI) that users can use to
define inference rules. These inference rules detect sensor data
patterns, and translate them to higher-order events. Rules can
also be recursively combined to form an expressive and robust
vocabulary for detecting real-world phenomena, thus enabling
users to script higher level and relevant responses to distributed
sensor stimuli. The Ubicorder's mobile, handheld form factor
lets users easily bring the device to the phenomenon of
interest, so they can simultaneously observe or cause real-world stimuli
and manipulate the event-detection rules in situ using its
graphical interface. In a first-use user study, participants without
any prior sensor network experience rated the Ubicorder highly
for its usefulness and usability when interacting with a sensor
network.
Things That Think Consortium
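The recursive rule combination described above can be sketched as a small combinator API. The function names (`threshold`, `all_of`, `any_of`) and sensor fields are hypothetical illustrations, not the Ubicorder's actual GUI-defined rule language:

```python
from typing import Callable, Dict

# A rule maps one sensor reading (field -> value) to a boolean event.
Reading = Dict[str, float]
Rule = Callable[[Reading], bool]

def threshold(sensor: str, above: float) -> Rule:
    """Base rule: fires when a sensor value exceeds a threshold."""
    return lambda r: r.get(sensor, 0.0) > above

def all_of(*rules: Rule) -> Rule:
    """Higher-order rule: fires only when every sub-rule fires."""
    return lambda r: all(rule(r) for rule in rules)

def any_of(*rules: Rule) -> Rule:
    """Higher-order rule: fires when at least one sub-rule fires."""
    return lambda r: any(rule(r) for rule in rules)

# Recursively combined higher-order event:
# "occupied" = motion AND (light OR sound).
occupied = all_of(threshold("motion", 0.5),
                  any_of(threshold("light", 100), threshold("sound", 0.3)))

print(occupied({"motion": 0.9, "light": 250, "sound": 0.0}))  # True
print(occupied({"motion": 0.1, "light": 250, "sound": 0.0}))  # False
```

Because combined rules are themselves rules, they can be nested to arbitrary depth, which is what gives the vocabulary its expressiveness.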