
    Cross‐lingual link discovery in the Web of Data


    What is the current state of the Multilingual Web of Data?

    The Semantic Web is growing at a fast pace, recently boosted by the creation of the Linked Data initiative and principles. Methods, standards, techniques, and the state of technology are becoming more mature and are therefore easing the task of publishing and consuming semantic information on the Web.

    Debates—Stochastic subsurface hydrology from theory to practice: why stochastic modeling has not yet permeated into practitioners?

    This is the peer reviewed version of the following article: [Sanchez-Vila, X., and D. Fernàndez-Garcia (2016), Debates—Stochastic subsurface hydrology from theory to practice: Why stochastic modeling has not yet permeated into practitioners?, Water Resour. Res., 52, 9246–9258, doi:10.1002/2016WR019302], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/2016WR019302/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.

    We address modern topics of stochastic hydrogeology from the standpoint of their potential relevance to real modeling efforts at the field scale. While stochastic hydrogeology and numerical modeling have become routine in hydrogeological studies, nondeterministic models have not yet permeated into practice. We point out a number of limitations of stochastic modeling when applied to real problems and comment on the reasons why stochastic models fail to become an attractive alternative for practitioners. We specifically separate issues corresponding to flow, conservative transport, and reactive transport. The topics addressed are the emphasis on process modeling, the need for upscaling parameters and governing equations, the relevance of properly accounting for detailed geological architecture in hydrogeological modeling, and the specific challenges of reactive transport. We conclude that the main responsibility for nondeterministic models not yet having permeated into industry lies squarely with researchers in stochastic hydrogeology.

    Stochastic estimation of hydraulic transmissivity fields using flow connectivity indicator data

    This is the peer reviewed version of the following article: [Freixas, G., D. Fernàndez-Garcia, and X. Sanchez-Vila (2017), Stochastic estimation of hydraulic transmissivity fields using flow connectivity indicator data, Water Resour. Res., 53, 602–618, doi:10.1002/2015WR018507], which has been published in final form at http://onlinelibrary.wiley.com/doi/10.1002/2015WR018507/abstract. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.

    Most methods for hydraulic test interpretation rely on a number of simplifying assumptions regarding the homogeneity and isotropy of the underlying porous media. This way, the actual heterogeneity of any natural parameter, such as transmissivity (T), is transferred to the corresponding estimates in a way that depends heavily on the interpretation method used. An example is a long-term pumping test interpreted by means of the Cooper-Jacob method, which implicitly assumes a homogeneous, isotropic, confined aquifer. The estimates obtained from this method are not local values, but they still have a clear physical meaning: the estimated transmissivity represents a regional-scale effective value, while the log-ratio of the normalized estimated storage coefficient is an indicator of flow connectivity, representative of the scale given by the distance between the pumping and observation wells. In this work we propose a methodology that uses these connectivity indicators, together with sampled local measurements of transmissivity at selected points, to map the expected value of local transmissivity using a technique based on cokriging. Since the interpolation involves two variables measured at different support scales, a critical point is the estimation of the covariance and cross-covariance matrices. The method is applied to a synthetic field displaying statistical anisotropy, showing that the inclusion of connectivity indicators in the estimation provides maps that effectively display preferential flow pathways, with direct consequences for solute transport.
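
    As a rough illustration of the Cooper-Jacob straight-line interpretation that underlies the connectivity indicators described above, the following Python sketch fits the late-time drawdown to a semilog line and back-calculates apparent transmissivity and storativity. The synthetic data, the pumping rate and well distance, and the normalization chosen for the connectivity-type indicator are illustrative assumptions, not values or conventions taken from the paper.

        import numpy as np

        def cooper_jacob_estimates(t, s, Q, r):
            """Fit the Cooper-Jacob straight line s = a + b*log10(t) to late-time
            drawdown data and return apparent transmissivity and storativity.

            t : times [s], s : drawdowns [m], Q : pumping rate [m^3/s],
            r : distance between pumping and observation wells [m].
            """
            b, a = np.polyfit(np.log10(t), s, 1)   # slope per log10 cycle, intercept
            T_est = 2.3 * Q / (4.0 * np.pi * b)    # apparent (effective) transmissivity
            t0 = 10.0 ** (-a / b)                  # time at which the fitted line gives s = 0
            S_est = 2.25 * T_est * t0 / r**2       # apparent storage coefficient
            return T_est, S_est

        # Synthetic homogeneous-aquifer example (values are illustrative only).
        Q, r, T_true, S_true = 1e-2, 50.0, 1e-3, 1e-4
        t = np.logspace(3, 6, 40)                  # late-time window [s]
        s = 2.3 * Q / (4 * np.pi * T_true) * np.log10(2.25 * T_true * t / (r**2 * S_true))

        T_est, S_est = cooper_jacob_estimates(t, s, Q, r)

        # A connectivity-type indicator: log-ratio of the estimated storage coefficient
        # to a reference value. The exact normalization used in the paper may differ;
        # this is only an illustration (here it is zero, as the aquifer is homogeneous).
        CI = np.log10(S_est / S_true)
        print(T_est, S_est, CI)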

    Process parameter optimization for Laser Metal Deposition using Hastelloy X blown powder

    Hastelloy X (HX) is a nickel-base superalloy whose strong resistance to oxidation and stress corrosion cracking, combined with high-temperature strength, makes it well suited for gas turbine engine components. The poor machinability of this material makes laser metal deposition (LMD), which uses a laser beam to melt blown powder particles, suitable for the production and/or repair of HX components. Moreover, LMD has the features and versatility needed to bring this technology into industry. The objective is to analyse a design of experiments covering a range of process parameters (laser power, scanning speed and powder feed rate) to find a suitable parameter set for using LMD with Hastelloy X powder. Properties of interest are build rate, microstructure, and minimal defects such as pores and cracks. Based on the design of experiments, 11 samples, each with 4 walls of different numbers of layers (15, 5, 3 and 1 layer), were printed. The samples were then analysed using a light optical microscope and a hardness test along the build section. The results showed that each parameter has a different effect on geometric accuracy. Only scanning speed had a directly proportional effect on microhardness, although the powder feed rate affected the distance at which the highest hardness was found. Regarding defects such as cracks and pores, no parameter played a dominant role: the pores were evenly distributed on the lateral sides of the build, and cracks caused by lack of fusion were found in only one sample. Concerning the microstructure, all samples showed the same evolution: the lower layers consist of fine columnar dendrites with an equiaxed morphology growing perpendicular to the substrate in the vertical direction, whereas the top layers consist of coarser columnar dendrites with larger grain width. In summary, the analysis made in this study indicates that the parameter range used to print HX with LMD is suitable for the fabrication of complex engine parts.
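
    To make the design-of-experiments idea concrete, the short Python sketch below enumerates a full-factorial grid over the three process parameters named in the abstract. The parameter levels and the full-factorial layout are placeholders for illustration only; the study itself printed 11 samples, so its actual design was necessarily different.

        from itertools import product

        # Hypothetical parameter levels (W, mm/min, g/min); the values used in the
        # study are not reported here and these numbers are purely illustrative.
        laser_power = [400, 600, 800]        # W
        scanning_speed = [600, 800, 1000]    # mm/min
        powder_feed_rate = [4, 6, 8]         # g/min

        # Full-factorial enumeration of candidate parameter sets.
        design = [
            {"power_W": p, "speed_mm_min": v, "feed_g_min": f}
            for p, v, f in product(laser_power, scanning_speed, powder_feed_rate)
        ]

        for run_id, params in enumerate(design, start=1):
            print(run_id, params)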

    Particle Density Estimation with Grid-Projected Adaptive Kernels

    The reconstruction of smooth density fields from scattered data points is a procedure that has multiple applications in a variety of disciplines, including Lagrangian (particle-based) models of solute transport in fluids. In random walk particle tracking (RWPT) simulations, particle density is directly linked to solute concentration, which is normally the main variable of interest, not just for visualization and post-processing of the results, but also for the computation of non-linear processes, such as chemical reactions. Previous works have shown the superiority of kernel density estimation (KDE) over other methods such as binning, in terms of its ability to accurately estimate the "true" particle density relying on a limited amount of information. Here, we develop a grid-projected KDE methodology to determine particle densities by applying kernel smoothing on a pilot binning; this may be seen as a "hybrid" approach between binning and KDE. The kernel bandwidth is optimized locally. Through simple implementation examples, we elucidate several appealing aspects of the proposed approach, including its computational efficiency and the possibility to account for typical boundary conditions, which would otherwise be cumbersome in conventional KDE.
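
    A minimal Python sketch of the "hybrid" idea follows: particle masses are first pilot-binned onto a regular grid, and the gridded field is then smoothed with a Gaussian kernel. The sketch uses a single global bandwidth rather than the locally optimized, adaptive bandwidth described in the abstract, and the grid setup, function names and boundary treatment are assumptions made for illustration.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def grid_projected_density(particles, mass, extent, n_cells, bandwidth):
            """Pilot-bin particle masses onto a regular 2-D grid, then apply Gaussian
            kernel smoothing on the grid. Returns cell-centred densities.

            particles : (N, 2) array of x, y positions
            mass      : mass (or concentration weight) carried by each particle
            extent    : ((xmin, xmax), (ymin, ymax)) domain bounds
            n_cells   : (nx, ny) number of cells
            bandwidth : kernel standard deviation, in physical length units
            """
            (xmin, xmax), (ymin, ymax) = extent
            nx, ny = n_cells
            dx, dy = (xmax - xmin) / nx, (ymax - ymin) / ny

            # Pilot binning: histogram of particle mass per cell.
            hist, _, _ = np.histogram2d(
                particles[:, 0], particles[:, 1],
                bins=n_cells, range=[[xmin, xmax], [ymin, ymax]],
                weights=np.full(len(particles), mass),
            )
            density = hist / (dx * dy)            # mass per unit area

            # Kernel smoothing on the grid; sigma is expressed in cell units.
            # NOTE: a single global bandwidth is used here, whereas the paper
            # optimizes the bandwidth locally; 'reflect' mimics a no-flux boundary.
            sigma_cells = (bandwidth / dx, bandwidth / dy)
            return gaussian_filter(density, sigma=sigma_cells, mode="reflect")

        # Example: 10,000 particles from a Gaussian plume on a 100 x 100 grid.
        rng = np.random.default_rng(0)
        pts = rng.normal(loc=[5.0, 5.0], scale=0.5, size=(10_000, 2))
        rho = grid_projected_density(pts, mass=1.0, extent=((0, 10), (0, 10)),
                                     n_cells=(100, 100), bandwidth=0.2)
        print(rho.shape, rho.sum() * (0.1 * 0.1))  # total mass approximately conserved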

    Ergodicity of pumping tests

    Standard interpretations of pumping tests in heterogeneous formations rely on effective representations of porous media, which replace spatially varying hydraulic properties with their constant counterparts averaged over the support volume of a test. Rigorous approaches for deriving representative (effective, apparent, upscaled, etc.) parameters employ either ensemble or spatial averaging. We derive a set of conditions under which these two paradigms yield identical results. We refer to them as conditions for the ergodicity of pumping tests. This allows one to use stochastic approaches to estimate the statistics of the spatial variability of hydraulic parameters on scales smaller than the support volume of a pumping test.
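
    As a schematic illustration only (the precise conditions are derived in the paper), ergodicity here can be read as requiring that the spatial average of a hydraulic property K over the support volume V of the test coincide with its ensemble mean:

        \[
          \overline{K}_V \;\equiv\; \frac{1}{|V|}\int_V K(\mathbf{x})\,\mathrm{d}\mathbf{x}
          \;\approx\; \langle K \rangle ,
        \]

    so that the single-realization (spatial) average can stand in for the stochastic (ensemble) expectation.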

    Linked Data at the Spanish National Library and the application of IFLA RDFS models

    The Spanish National Library (Biblioteca Nacional de España, BNE) and the Ontology Engineering Group of Universidad Politécnica de Madrid are working on the joint project "Preliminary Study of Linked Data", whose aim is to enrich the Web of Data with the BNE authority and bibliographic records. To this end, they are transforming the BNE information to RDF following the Linked Data principles proposed by Tim Berners-Lee.
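
    To show the general shape of such a record-to-RDF transformation, here is a small Python sketch using rdflib. The project itself maps records to IFLA RDFS vocabularies (FRBR, FRAD, ISBD); the base URI, the sample record, and the mix of Dublin Core and ISBD properties below are illustrative assumptions rather than the project's actual mapping.

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import RDF, DCTERMS

        # Illustrative namespaces; the real project relies on the IFLA RDFS models.
        BNE = Namespace("http://datos.bne.es/resource/")                # hypothetical base URI
        ISBD = Namespace("http://iflastandards.info/ns/isbd/elements/")

        # A hypothetical bibliographic record, assumed already parsed from MARC.
        record = {"id": "XX0000000", "title": "El Quijote", "author_id": "XX0000001"}

        g = Graph()
        g.bind("dcterms", DCTERMS)
        g.bind("isbd", ISBD)

        work = BNE[record["id"]]
        author = BNE[record["author_id"]]

        g.add((work, RDF.type, DCTERMS.BibliographicResource))
        g.add((work, ISBD["P1004"], Literal(record["title"], lang="es")))  # "has title proper"
        g.add((work, DCTERMS.creator, author))

        print(g.serialize(format="turtle"))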