
    Power system security boundary visualization using intelligent techniques

    In the open access environment, one of the challenges for utilities is that typical operating conditions tend to be much closer to security boundaries. Consequently, security levels for the transmission network must be accurately assessed and easily identified on-line by system operators.
    Security assessment through boundary visualization provides the operator with knowledge of system security levels in terms of easily monitorable pre-contingency operating parameters. The traditional boundary visualization approach results in a two-dimensional graph called a nomogram. However, intensive labor requirements, inaccurate boundary representation, and little flexibility in integrating with the energy management system greatly restrict the use of nomograms in a competitive utility environment. Motivated by the new operating environment and based on the traditional nomogram development procedure, an automatic security boundary visualization methodology has been developed using neural networks with feature selection. This methodology provides a new security assessment tool for power system operations.
    The main steps of this methodology are data generation, feature selection, neural network training, and boundary visualization. In data generation, a systematic approach has been developed to generate high-quality data, and several data analysis techniques have been used to analyze the data before neural network training. In feature selection, genetic algorithm based methods have been used to select the most predictive pre-contingency operating parameters. Following neural network training, a confidence interval calculation method has been derived to measure the reliability of the neural network output, along with a sensitivity analysis of the neural network output with respect to the input parameters. In boundary visualization, a composite security boundary visualization algorithm has been proposed to present accurate boundaries in two-dimensional diagrams to operators for any type of security problem.
    This methodology has been applied to thermal overload and voltage instability problems for a sample system.
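    As a hedged illustration of the genetic-algorithm-based feature selection step wrapped around neural network training, the sketch below selects the subset of pre-contingency parameters that maximises cross-validated classifier accuracy. The synthetic data, GA parameters and MLP architecture are illustrative assumptions, not the thesis's actual settings.

    # A minimal sketch (not the thesis implementation) of GA-based feature
    # selection around a neural-network security classifier; all data and
    # hyperparameters below are illustrative assumptions.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Stand-in for pre-contingency operating points labelled secure/insecure.
    X, y = make_classification(n_samples=400, n_features=20, n_informative=6,
                               random_state=0)

    def fitness(mask):
        """Cross-validated accuracy of an MLP trained on the selected features."""
        if mask.sum() == 0:
            return 0.0
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=300, random_state=0)
        return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

    # Simple GA: binary masks, truncation selection, uniform crossover, bit-flip mutation.
    pop = rng.integers(0, 2, size=(12, X.shape[1]))
    for generation in range(5):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-6:]]                     # keep the fittest half
        children = []
        for _ in range(len(pop) - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(X.shape[1]) < 0.5, a, b)   # uniform crossover
            flip = rng.random(X.shape[1]) < 0.05                   # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print("selected feature indices:", np.flatnonzero(best))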

    An academic review: applications of data mining techniques in finance industry

    With the development of Internet technologies, data volumes are doubling every two years, faster than predicted by Moore’s Law. Big Data Analytics has therefore become particularly important for enterprise business, and modern computational technologies provide effective tools for understanding the hugely accumulated data and leveraging it to gain insights into the finance industry. Because there are no physical products to manufacture, data has become the most valuable asset of financial organisations, and obtaining actionable insights from it is where data mining techniques come to the rescue, by allowing access to the right information at the right time. These techniques are used by the finance industry in areas such as fraud detection, intelligent forecasting, credit rating, loan management, customer profiling, money laundering detection, marketing and prediction of price movements, to name a few. This work surveys the research on data mining techniques applied to the finance industry from 2010 to 2015. The review finds that stock prediction and credit rating have received the most attention from researchers, compared to loan prediction, money laundering and time series prediction. Due to the dynamics, uncertainty and variety of the data, nonlinear mapping techniques have been studied more deeply than linear techniques. Hybrid methods have also proved more accurate in prediction, closely followed by neural network techniques. This survey provides an overview of applications of data mining techniques in the finance industry and a summary of methodologies for researchers in this area; in particular, it offers a good starting point for beginners who want to work in the field of computational finance.
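    As a hedged illustration of the linear-versus-nonlinear comparison highlighted by the review, the sketch below scores a logistic regression model against a small neural network on synthetic credit-rating-style data. The data and model choices are assumptions for illustration only, not methods drawn from the surveyed papers.

    # A minimal sketch comparing a linear and a nonlinear classifier on synthetic
    # "credit rating" features; purely illustrative, not from the surveyed work.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Stand-in for applicant features (income, utilisation, history, ...) and labels.
    X, y = make_classification(n_samples=1000, n_features=12, n_informative=5,
                               class_sep=0.8, random_state=1)

    linear = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    nonlinear = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(32, 16),
                                            max_iter=1000, random_state=1))

    for name, model in [("logistic regression", linear), ("MLP", nonlinear)]:
        acc = cross_val_score(model, X, y, cv=5).mean()
        print(f"{name}: mean CV accuracy = {acc:.3f}")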

    Transdisciplinarity in energy retrofit: A conceptual framework

    This study explores the role of Energy Retrofit (ER) in the Low Carbon Transition (LCT). The literature recognises the need to move towards a transdisciplinary approach in ER, one which encompasses multidisciplinarity and interdisciplinarity. However, fragmentation between disciplines remains a significant problem, mainly due to the challenges of exchanging knowledge across the allied disciplines that play a role in ER. The authors posit that ER projects should be conceptualised and implemented from a Systems perspective so that an integrated approach akin to transdisciplinarity can become commonplace. Against this background, the aim of this paper is to establish to what extent ER has been conceptualised as a System in the literature, so that complexities can be managed effectively through a transdisciplinary approach. This work is based on a literature review of 136 peer-reviewed journal papers. The content analysis demonstrates that current research on transdisciplinarity in ER can be conceptualised in five categories and 15 lines of research. These are presented as a Conceptual Framework, which is this paper’s main contribution to existing knowledge. The framework reveals the direction of innovation in ER for LCT and is illustrated as a cognitive map. This map exposes the fragmentation implicit in the literature and proposes the critical connections that need to be established for a transdisciplinary approach. It also shows that the discourse on LCT has changed by moving beyond the building scale, recognising the need to embrace disruptive and local technologies, and integrating the social and technical aspects of ER. Innovative technical solutions and robust information modelling approaches emerge as key vehicles for making decisions that pay regard to economic, social and technical factors and that empower prosumers to play an active role in LCT.

    A survey of the application of soft computing to investment and financial trading


    Automated anomaly recognition in real time data streams for oil and gas industry.

    There is a growing demand for computer-assisted real-time anomaly detection - from the identification of suspicious activities in cyber security to the monitoring of engineering data for various applications across the oil and gas, automotive and other engineering industries. To reduce the reliance on field experts' knowledge for identifying these anomalies, this thesis proposes a deep-learning anomaly-detection framework that can help to create an effective real-time condition-monitoring system. The aim of this research is to develop a real-time and re-trainable generic anomaly-detection framework, capable of predicting and identifying anomalies with a high level of accuracy - even when a specific anomalous event has no precedent. Machine-based condition monitoring is preferable in many practical situations where fast data analysis is required, and where there are harsh climates or otherwise life-threatening environments; automated condition-monitoring systems are ideal, for example, in deep sea exploration studies, offshore installations and space exploration. This thesis first reviews studies on anomaly detection using machine learning. It then adopts the best practices from those studies to propose a multi-tiered framework for anomaly detection with heterogeneous input sources, which can deal with unseen anomalies in a real-time, dynamic problem environment. The thesis then applies the developed generic multi-tiered framework to two domains: engineering data analysis and malicious cyber-attack detection. Finally, the framework is further refined based on the outcomes of those case studies and is used to develop a secure cross-platform API capable of re-training and data classification on a real-time data feed.
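    As a hedged illustration of only the real-time scoring idea (not the deep-learning multi-tiered framework itself), the sketch below flags points in a streaming sensor feed that fall far outside a rolling window of recent readings. The window size, threshold and simulated data are illustrative assumptions.

    # A minimal sketch of online anomaly scoring over a data stream using a
    # rolling mean/std threshold; illustrative only, not the thesis framework.
    from collections import deque
    import math
    import random

    def stream_anomalies(values, window=50, z_threshold=4.0):
        """Yield (index, value, z_score) for points far outside the recent window."""
        history = deque(maxlen=window)
        for i, x in enumerate(values):
            if len(history) == window:
                mean = sum(history) / window
                std = math.sqrt(sum((v - mean) ** 2 for v in history) / window) or 1e-9
                z = abs(x - mean) / std
                if z > z_threshold:
                    yield i, x, z
            history.append(x)

    # Simulated pressure readings with an injected spike at index 500.
    random.seed(0)
    readings = [20.0 + random.gauss(0, 0.5) for _ in range(1000)]
    readings[500] += 10.0

    for idx, value, z in stream_anomalies(readings):
        print(f"anomaly at t={idx}: value={value:.2f} (z={z:.1f})")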

    Environmental Sustainability in Maritime Infrastructures

    This Special Issue is entitled “Environmental Sustainability in Maritime Infrastructures”. Oceans and coastal areas are essential to our lives from several points of view: social, economic, and health. Given the importance of these areas for human life, both now and in the future, it is necessary to plan future infrastructures and to maintain and adapt the existing ones to changing conditions, all while taking into account the sustainability of our planet. A very significant percentage of the world's population lives permanently in coastal zones or spends their vacation periods there, which makes these zones very sensitive areas, with very high economic value and a concentration of adverse effects on public health and ecosystems. It is therefore considered very relevant and of great interest to launch this Special Issue to cover aspects related to the vulnerability of coastal systems and their inhabitants (water pollution, coastal flooding, climate change, overpopulation, urban planning, waste water, plastics at sea, effects on ecosystems, etc.), as well as the use of ocean resources (fisheries, energy, tourism areas, etc.).

    Algorithmic Techniques in Gene Expression Processing. From Imputation to Visualization

    The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is rapidly growing, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments.
    First, we studied ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make the original incomplete data complete, making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods: on most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA).
    Finally, we studied the visualization of biological network data. Biological interaction networks are an example of the outcome of multiple biological experiments, such as those using gene microarray techniques. Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed; the algorithm uses multilevel optimization within a regular force-directed graph layout algorithm.
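    As a hedged illustration of k-NN imputation followed by downstream clustering (the comparison described above; BPCA is not shown), the sketch below knocks out entries from a synthetic expression matrix, imputes them with scikit-learn's KNNImputer, and checks how well clustering on the imputed matrix recovers the sample groups. The matrix, missingness rate and cluster count are illustrative assumptions.

    # A minimal sketch of k-NN missing-value imputation and its effect on
    # downstream clustering; synthetic data and illustrative parameters only.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.impute import KNNImputer
    from sklearn.metrics import adjusted_rand_score

    rng = np.random.default_rng(42)

    # Stand-in for a samples x genes expression matrix with three sample groups.
    groups = np.repeat([0, 1, 2], 30)
    expr = rng.normal(loc=groups[:, None] * 2.0, scale=1.0, size=(90, 50))

    # Knock out ~10% of entries to simulate missing microarray spots.
    expr_missing = expr.copy()
    expr_missing[rng.random(expr.shape) < 0.10] = np.nan

    # k-NN imputation: each missing entry is filled from the most similar samples.
    imputed = KNNImputer(n_neighbors=5).fit_transform(expr_missing)

    # Downstream clustering on the imputed matrix, compared with the true grouping.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(imputed)
    print("adjusted Rand index vs. true groups:", adjusted_rand_score(groups, labels))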

    Combined optimization algorithms applied to pattern classification

    Accurate classification by minimizing the error on test samples is the main goal in pattern classification. Combinatorial optimization is a well-known method for solving minimization problems; however, only a few examples of classifiers are described in the literature where combinatorial optimization is used in pattern classification. Recently, there has been a growing interest in combining classifiers and improving the consensus of results for greater accuracy. In the light of the "No Free Lunch Theorems", we analyse the combination of simulated annealing, a powerful combinatorial optimization method that produces high-quality results, with the classical perceptron algorithm. This combination is called the LSA machine. Our analysis aims at finding paradigms for problem-dependent parameter settings that ensure high classification results. Our computational experiments on a large number of benchmark problems lead to results that either outperform or are at least competitive with results published in the literature.
    Apart from parameter settings, our analysis focuses on a difficult problem in computation theory, namely the network complexity problem. The depth vs. size problem of neural networks is one of the hardest problems in theoretical computing, with very little progress over the past decades. In order to investigate this problem, we introduce a new recursive learning method for training hidden layers in constant-depth circuits. Our findings contribute to a) the field of Machine Learning, as the proposed method is applicable to training feedforward neural networks, and to b) the field of circuit complexity, by proposing an upper bound on the number of hidden units sufficient to achieve a high classification rate. One of the major findings of our research is that the size of the network can be bounded by the input size of the problem, with an approximate upper bound of 8 + √(2^n/n) threshold gates being sufficient for a small error rate, where n := log|SL| and SL is the training set.
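    As a hedged illustration of combining simulated annealing with a perceptron-style linear threshold unit (in the spirit of the LSA machine described above, but not its actual implementation), the sketch below uses annealing to search for perceptron weights that minimise the training error. The data, proposal step and cooling schedule are illustrative assumptions.

    # A minimal sketch of simulated annealing over perceptron weights;
    # illustrative only, not the thesis's LSA machine.
    import numpy as np

    rng = np.random.default_rng(7)

    # Roughly linearly separable two-class data, with a bias column appended.
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
    Xb = np.hstack([X, np.ones((200, 1))])

    def error(w):
        """Fraction of training samples misclassified by the linear threshold unit."""
        return np.mean(np.sign(Xb @ w) != y)

    w = rng.normal(size=3)
    best_w, best_err = w.copy(), error(w)
    T = 1.0
    for step in range(5000):
        candidate = w + rng.normal(scale=0.1, size=3)       # random local move
        delta = error(candidate) - error(w)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta <= 0 or rng.random() < np.exp(-delta / T):
            w = candidate
            if error(w) < best_err:
                best_w, best_err = w.copy(), error(w)
        T *= 0.999                                          # geometric cooling

    print("best weights:", best_w)
    print(f"training error of best weights found: {best_err:.3f}")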