4,620 research outputs found

    An adaptive methodology to discretize and select features

    Get PDF
    Large amounts of significant data describing the behavior and/or actions of systems can be collected in many domains. These data define aspects, called features, that can be grouped into several classes. A qualitative or quantitative value for each feature is stored from measurements or observations. In this paper, the problem of finding independent features that yield the best accuracy on classification problems is considered. Obtaining these features is the main objective of this work, in which an automatic feature selection method is proposed. The method extends the functionality of the Ameva coefficient so that it can be used in other machine learning tasks for which it was not originally defined.
    Funding: Ministerio de Ciencia e Innovación ARTEMISA TIN2009-14378-C02-01; Junta de Andalucía Simon TIC-805
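
    The Ameva coefficient that this method builds on is defined, in the discretization literature, as Ameva(k) = chi2(k) / (k(l-1)), where chi2 is computed from the class-versus-interval contingency table, k is the number of intervals and l the number of classes. A minimal Python sketch of that coefficient follows; the paper's extension to feature selection is not reproduced, and the function name and inputs are illustrative assumptions.

        # Minimal sketch of the Ameva coefficient from the discretization
        # literature: Ameva(k) = chi2(k) / (k * (l - 1)), with chi2 taken from
        # the class-vs-interval contingency table. The paper's extension to
        # feature selection is not reproduced here.
        import numpy as np

        def ameva(labels, intervals):
            """labels: class index per sample; intervals: interval index per sample."""
            labels = np.asarray(labels)
            intervals = np.asarray(intervals)
            classes = np.unique(labels)            # l distinct classes
            cuts = np.unique(intervals)            # k discretization intervals
            n = np.zeros((classes.size, cuts.size))
            for i, c in enumerate(classes):
                for j, b in enumerate(cuts):
                    n[i, j] = np.sum((labels == c) & (intervals == b))
            N = n.sum()
            row = n.sum(axis=1, keepdims=True)     # per-class totals
            col = n.sum(axis=0, keepdims=True)     # per-interval totals
            chi2 = N * ((n ** 2 / (row * col)).sum() - 1.0)
            return chi2 / (cuts.size * (classes.size - 1))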

    Energy performance forecasting of residential buildings using fuzzy approaches

    Get PDF
    The energy consumed for domestic purposes in Europe is, to a considerable extent, due to heating and cooling. This energy is produced mostly by burning fossil fuels, which has a highly negative environmental impact. The characteristics of a building are an important factor in determining its heating and cooling loads. Therefore, studying the characteristics of buildings that are relevant to the heating and cooling needed to maintain comfortable indoor air conditions can be very useful for designing and constructing energy-efficient buildings. In previous studies, different machine-learning approaches have been used to predict heating and cooling loads from the following set of variables: relative compactness, surface area, wall area, roof area, overall height, orientation, glazing area and glazing area distribution. However, none of these methods is based on fuzzy logic. In this research, we study two fuzzy logic approaches, i.e., fuzzy inductive reasoning (FIR) and the adaptive neuro-fuzzy inference system (ANFIS), to deal with the same problem. The fuzzy approaches obtain very good results, outperforming all the methods described in previous studies except one. In this work, we also study the feature selection process of the FIR methodology as a pre-processing tool to select the most relevant variables before applying any predictive modelling methodology. It is shown that FIR feature selection provides interesting insights into the main building variables causally related to heating and cooling loads. This allows better decision making and design strategies, since accurate cooling and heating load estimations and correct identification of the parameters that affect building energy demands are of high importance for optimizing building designs and equipment specifications.
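
    FIR's fuzzy feature selection is not available in standard toolkits, so as a rough stand-in the sketch below ranks the eight building variables by mutual information with the heating load. The file name and column names are assumptions following the UCI Energy Efficiency dataset, not artifacts of this study.

        # Illustrative stand-in for the feature-relevance step: a mutual-information
        # ranking over the eight building variables, in place of FIR's fuzzy
        # feature selection. File and column names are assumptions.
        import pandas as pd
        from sklearn.feature_selection import mutual_info_regression

        FEATURES = ["relative_compactness", "surface_area", "wall_area", "roof_area",
                    "overall_height", "orientation", "glazing_area",
                    "glazing_area_distribution"]

        df = pd.read_csv("energy_efficiency.csv")     # hypothetical local copy
        X, y = df[FEATURES], df["heating_load"]
        scores = mutual_info_regression(X, y, random_state=0)
        for name, score in sorted(zip(FEATURES, scores), key=lambda t: -t[1]):
            print(f"{name:28s} {score:.3f}")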

    On the role of pre and post-processing in environmental data mining

    Get PDF
    The quality of discovered knowledge is highly dependent on data quality. Unfortunately, real data tend to contain noise, uncertainty, errors, redundancies or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of obtaining low-quality data. Knowledge Discovery from Databases (KDD) offers a global framework to prepare data in the right form to perform correct analyses. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but also on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. In this paper, some details about how this can be achieved are provided. The role of pre- and post-processing in the whole process of Knowledge Discovery in environmental systems is discussed.
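
    As a concrete illustration of the kind of pre-processing discussed here, the sketch below removes duplicate rows, imputes missing values, and drops near-constant columns with pandas; the thresholds and the choice of median imputation are illustrative assumptions, not the paper's procedure.

        # Minimal pre-processing sketch: deduplicate, impute, and drop
        # uninformative (near-constant) columns. Thresholds are assumptions.
        import pandas as pd

        def clean(df: pd.DataFrame, var_threshold: float = 1e-9) -> pd.DataFrame:
            df = df.drop_duplicates()                     # redundant observations
            df = df.fillna(df.median(numeric_only=True))  # simple imputation of gaps
            numeric = df.select_dtypes("number")
            constant = [c for c in numeric.columns if numeric[c].var() < var_threshold]
            return df.drop(columns=constant)              # drop constant features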

    Realtime market microstructure analysis: online Transaction Cost Analysis

    Full text link
    Motivated by the practical challenge of monitoring the performance of a large number of algorithmic trading orders, this paper provides a methodology that leads to the automatic discovery of the causes behind poor trading performance. It also gives theoretical foundations for a generic framework for real-time trading analysis. The academic literature provides different ways to formalize these algorithms and to show how optimal they can be from a mean-variance, stochastic control, impulse control or statistical learning viewpoint. This paper is agnostic about the way the algorithm has been built and provides a theoretical formalism to identify, in real time, the market conditions that influenced its efficiency or inefficiency. For a given set of characteristics describing the market context, selected by a practitioner, we first show how a set of additional derived explanatory factors, called anomaly detectors, can be created for each market order. We then present an online methodology to quantify how this extended set of factors, at any given time, predicts which of the orders are underperforming, while calculating the predictive power of this explanatory factor set. Armed with this information, which we call influence analysis, we intend to empower the order-monitoring user to take appropriate action on any affected orders by re-calibrating the trading algorithms working the order with new parameters, pausing their execution or taking over more direct trading control. We also intend for this method to be used in the post-trade analysis of algorithms to automatically adjust their trading actions.
    Comment: 33 pages, 12 figures
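
    A hedged sketch of the general idea only: standardized "anomaly detector" scores are derived from market-context features and fed, together with the raw features, to an online classifier that flags underperforming orders. The paper's actual detectors and influence-analysis machinery are not reproduced; the feature layout and model choice are assumptions.

        # Sketch: z-score anomaly detectors plus an incrementally updated
        # logistic model predicting order underperformance. Assumed layout:
        # each row of `batch` is one order's market-context features.
        import numpy as np
        from sklearn.linear_model import SGDClassifier

        model = SGDClassifier(loss="log_loss")            # online logistic regression

        def zscore_detectors(batch, mean, std):
            """Standardize raw market-context features into anomaly scores."""
            return (batch - mean) / np.where(std > 0, std, 1.0)

        def update(batch, underperforming, mean, std):
            """One real-time step: extend the feature set and refit incrementally."""
            X = np.hstack([batch, zscore_detectors(batch, mean, std)])
            model.partial_fit(X, underperforming, classes=[0, 1])
            return model.predict_proba(X)[:, 1]           # probability of underperformance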

    Kinetic Solvers with Adaptive Mesh in Phase Space

    Full text link
    An Adaptive Mesh in Phase Space (AMPS) methodology has been developed for solving multi-dimensional kinetic equations by the discrete velocity method. A Cartesian mesh for both configuration (r) and velocity (v) spaces is produced using a tree-of-trees data structure. The mesh in r-space is automatically generated around embedded boundaries and dynamically adapted to local solution properties. The mesh in v-space is created on the fly for each cell in r-space. Mappings between neighboring v-space trees are implemented for the advection operator in configuration space. We have developed new algorithms for solving the full Boltzmann and linear Boltzmann equations with AMPS. Several recent innovations were used to calculate the discrete Boltzmann collision integral with a dynamically adaptive mesh in velocity space: importance sampling, a multi-point projection method, and a variance-reduction method. We have developed an efficient algorithm for calculating the linear Boltzmann collision integral for elastic and inelastic collisions in a Lorentz gas. The new AMPS technique has been demonstrated for simulations of hypersonic rarefied gas flows, ion and electron kinetics in weakly ionized plasma, radiation and light-particle transport through thin films, and electron streaming in semiconductors. We have shown that AMPS minimizes the number of cells in phase space, reducing the computational cost and memory usage of solving challenging kinetic problems.
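
    A minimal sketch of the "tree of trees" layout described above, assuming a 1-D split for brevity: each configuration-space (r) cell owns a velocity-space (v) tree that is created on the fly and refined on demand. Refinement criteria and the kinetic solvers themselves are omitted, and all class names are illustrative.

        # Data-structure sketch only: each r-space cell carries its own v-space
        # tree, created on the fly; solvers and refinement criteria are omitted.
        class VCell:
            def __init__(self, bounds):
                self.bounds = bounds      # interval in velocity space
                self.children = []        # refined v-space sub-cells

            def refine(self):
                lo, hi = self.bounds
                mid = 0.5 * (lo + hi)
                self.children = [VCell((lo, mid)), VCell((mid, hi))]

        class RCell:
            def __init__(self, bounds):
                self.bounds = bounds      # interval in configuration space
                self.children = []        # refined r-space sub-cells
                self.vtree = None         # per-cell velocity-space tree

            def ensure_vtree(self, vbounds):
                if self.vtree is None:    # v-space mesh created on the fly
                    self.vtree = VCell(vbounds)
                return self.vtree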

    Solid/FEM integration at SNLA

    Get PDF
    The effort at Sandia National Laboratories on the methodologies and techniques used to generate strictly hexahedral finite element meshes from a solid model is described. The functionality of the modeler is used to decompose the solid into a set of non-intersecting, meshable finite element primitives. The description of the decomposition is exported, via a boundary representation format, to the meshing program, which uses the information for complete finite element model specification. Particular features of the program are discussed in some detail, along with future plans for development, which include automating the decomposition using artificial intelligence techniques.
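
    A hedged sketch of the hand-off described above: the solid is decomposed into non-intersecting meshable primitives and each primitive is exported as a boundary-representation record for the meshing program. All field and function names here are assumptions for illustration; the actual Sandia export format is not given in the abstract.

        # Illustrative record for the decomposition-to-mesher hand-off; field
        # names are assumptions, not the actual Sandia export format.
        from dataclasses import dataclass, field

        @dataclass
        class Face:
            surface_id: int               # underlying geometric surface
            loop: list                    # ordered edge ids bounding the face

        @dataclass
        class MeshablePrimitive:
            kind: str                     # e.g. a hexahedrally meshable block
            faces: list = field(default_factory=list)
            intervals: dict = field(default_factory=dict)   # edge id -> element count

        def export_decomposition(primitives, path):
            """Write one boundary-representation record per primitive."""
            with open(path, "w") as out:
                for p in primitives:
                    out.write(f"{p.kind} {len(p.faces)} faces {p.intervals}\n")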