
    Data mining: a tool for detecting cyclical disturbances in supply networks.

    Disturbances in supply chains may be either exogenous or endogenous. The ability to automatically detect, diagnose, and distinguish between the causes of disturbances is of prime importance to decision makers in order to avoid uncertainty. The spectral principal component analysis (SPCA) technique has been utilized to distinguish between real and rogue disturbances in a steel supply network. The data set used was collected from four different business units in the network and consists of 43 variables, each described by 72 data points. The present paper utilizes the same data set to test an alternative approach to SPCA for detecting the disturbances. The new approach employs statistical data pre-processing, clustering, and classification learning techniques to analyse the supply network data. In particular, the incremental k-means clustering and the RULES-6 classification rule-learning algorithms, developed by the present authors’ team, have been applied to identify important patterns in the data set. Results show that the proposed approach can automatically detect and characterize network-wide cyclical disturbances and generate hypotheses about their root cause.
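
    A minimal illustrative sketch of the kind of clustering stage described above is given below. It is not the authors' incremental k-means or RULES-6 implementation; it uses scikit-learn's standard KMeans on synthetic data with the dimensions reported in the abstract (43 variables, 72 data points each), purely to show how standardized time profiles can be grouped so that co-moving variables point to a possible shared disturbance.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_vars, n_points = 43, 72                   # dimensions reported in the abstract
    data = rng.normal(size=(n_vars, n_points))  # placeholder for the real measurements

    # Pre-process: zero-mean, unit-variance per variable so clustering compares
    # the shape of each time profile rather than its absolute magnitude.
    scaled = StandardScaler().fit_transform(data.T).T

    # Group variables whose time profiles move together; variables sharing a
    # cluster are candidates for a common, possibly network-wide, disturbance.
    kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scaled)

    for label in np.unique(kmeans.labels_):
        members = np.where(kmeans.labels_ == label)[0]
        print(f"cluster {label}: variables {members.tolist()}")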

    A High-Definition Spatially Explicit Modeling Approach for National Greenhouse Gas Emissions from Industrial Processes: Reducing the Errors and Uncertainties in Global Emission Modeling

    Spatially explicit (gridded) emission inventories (EIs) should allow us to analyse sectoral emission patterns, estimate the potential impacts of emission policies, and support decisions on reducing emissions. However, such EIs are often based on simple downscaling of national-level emission estimates, and the changes in subnational emission distributions do not necessarily reflect the actual changes driven by local emission drivers. This article presents a high-definition, 100 m resolution bottom-up inventory of greenhouse gas (GHG) emissions from industrial processes (fuel combustion activities in energy and manufacturing industry, fugitive emissions, mineral products, chemical industry, metal production, food and drink), exemplified on data for Poland. We propose an improved emission disaggregation algorithm that fully utilizes a collection of activity data available at the national/provincial level down to the level of individual point and diffused (area) emission sources. To ensure the accuracy of the resulting 100 m emission fields, the geospatial data used for mapping emission sources (point-source geolocation and land cover classification) were subject to thorough human visual inspection. The resulting 100 m emission fields even hold cadastres of emissions separately for each industrial emission category, while we start with IPCC-compliant national sectoral GHG estimates made using Polish official statistics. We aggregated the resulting emissions to the level of administrative units such as municipalities, districts, and provinces. We also compiled cadastres in regular grids and compared them with EDGAR results. Quantitative analysis of the discrepancies between both results revealed quite frequent misallocations of point sources in the EDGAR compilation that considerably deteriorate high-resolution inventories. We also propose a Monte Carlo based uncertainty assessment that yields a detailed estimation of the GHG emission uncertainty in the main categories of the analysed processes. We found that the above-mentioned geographical coordinates and patterns used for emission disaggregation have the greatest impact on the overall uncertainty of GHG inventories from industrial processes.
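
    The following sketch illustrates, under stated assumptions, the two generic steps the abstract combines: proportional disaggregation of a national sector total onto grid cells using an activity proxy, and a simple Monte Carlo run to propagate uncertainty in the total and the proxy into the gridded estimates. The proxy values, uncertainty levels (5 % on the total, 20 % per cell), and cell count are assumptions for illustration, not the paper's actual algorithm or data.

    import numpy as np

    rng = np.random.default_rng(42)

    national_total = 1_000.0    # national emissions for one category (kt CO2-eq), assumed
    proxy = rng.random(10_000)  # activity proxy per grid cell, assumed

    def disaggregate(total, weights):
        """Allocate a total to cells in proportion to non-negative weights."""
        w = np.clip(weights, 0.0, None)
        return total * w / w.sum()

    cell_emissions = disaggregate(national_total, proxy)

    # Monte Carlo uncertainty: perturb the national total and the per-cell proxy,
    # then inspect the spread of the resulting cell-level estimates.
    n_draws = 1_000
    draws = np.empty((n_draws, proxy.size))
    for i in range(n_draws):
        total_i = national_total * rng.normal(1.0, 0.05)
        proxy_i = proxy * rng.normal(1.0, 0.20, size=proxy.size)
        draws[i] = disaggregate(total_i, proxy_i)

    lower, upper = np.percentile(draws, [2.5, 97.5], axis=0)
    print(f"cell 0: {cell_emissions[0]:.4f} kt (95% CI {lower[0]:.4f}-{upper[0]:.4f})")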

    The ethics of uncertainty for data subjects

    Modern health data practices come with many practical uncertainties. In this paper, I argue that data subjects’ trust in the institutions and organizations that control their data, and their ability to know their own moral obligations in relation to their data, are undermined by significant uncertainties regarding the what, how, and who of mass data collection and analysis. I conclude by considering how proposals for managing situations of high uncertainty might be applied to this problem. These emphasize increasing organizational flexibility, knowledge, and capacity, and reducing hazard.

    Managing Water under Uncertainty and Risk: The United Nations World Water Development Report 4

    This report introduces new aspects of water issues: 1) it reintroduces the 12 challenge area reports that provided the foundation for the first two World Water Development Reports (WWDR); 2) it adds 4 new reports on water quality, groundwater, gender, and desertification, land degradation and drought; 3) in recognition that the global challenges of water can vary considerably across countries and regions, it includes a series of 5 regional reports; 4) it offers a deeper analysis of the main external forces on freshwater resources and the possibilities for their future evolution; and 5) it addresses managing water under uncertainty and risk.

    An Integrated Approach for Characterizing Aerosol Climate Impacts and Environmental Interactions

    Aerosols exert myriad influences on the Earth's environment and climate, and on human health. The complexity of aerosol-related processes requires that information gathered to improve our understanding of climate change must originate from multiple sources, and that effective strategies for data integration need to be established. While a vast array of observed and modeled data are becoming available, the aerosol research community currently lacks the necessary tools and infrastructure to reap maximum scientific benefit from these data. Spatial and temporal sampling differences among a diverse set of sensors, nonuniform data qualities, aerosol mesoscale variabilities, and difficulties in separating cloud effects are some of the challenges that need to be addressed. Maximizing the long-term benefit from these data also requires maintaining consistently well-understood accuracies as measurement approaches evolve and improve. A comprehensive understanding of how aerosol physical, chemical, and radiative processes impact the Earth system can be achieved only through a multidisciplinary, inter-agency, and international initiative capable of dealing with these issues. A systematic approach, capitalizing on modern measurement and modeling techniques, geospatial statistics methodologies, and high-performance information technologies, can provide the necessary machinery to support this objective. We outline a framework for integrating and interpreting observations and models, and establishing an accurate, consistent, and cohesive long-term record, following a strategy whereby information and tools of progressively greater sophistication are incorporated as problems of increasing complexity are tackled. This concept is named the Progressive Aerosol Retrieval and Assimilation Global Observing Network (PARAGON). To encompass the breadth of the effort required, we present a set of recommendations dealing with data interoperability; measurement and model integration; multisensor synergy; data summarization and mining; model evaluation; calibration and validation; augmentation of surface and in situ measurements; advances in passive and active remote sensing; and design of satellite missions. Without an initiative of this nature, the scientific and policy communities will continue to struggle with understanding the quantitative impact of complex aerosol processes on regional and global climate change and air quality.

    Liability for Spatial Data Quality

    Liability in data, products, and services related to geographic information systems, spatial data infrastructure, location-based services, and web mapping services is complicated by the complexities and uncertainties in liability for information system products and services generally, as well as by legal theory uncertainties surrounding liability for maps. Each application of geospatial technologies to a specific use may require integration of different types of data from multiple sources, assessment of attributes, adherence to accuracy and fitness-for-use requirements, and selection from among different analytical processing methods. All of these actions may be fraught with possible misjudgments and errors. A variety of software programs may be run against a single geographic database, while a wide range of users may have very different use objectives. The complexity of the legal questions surrounding liability for geospatial data, combined with the diversity of problems to which geospatial data and technologies may be applied and the continually changing technological environment, has created unsettling and often unclear concerns over liability for geospatial technology development and use. This article selects a single data quality issue to illustrate that liability exposure. In regard to that issue, liability exposure may have a substantial stifling effect on the widespread use of web-based geospatial technologies for such purposes as geographic data mining and interoperable web mapping services. The article concludes with a recommendation for a potential web-wide community solution for substantially reducing the liability exposure of geospatial technology and geographic data producers and users.