
    Information Technologies for Spatial Inventory of Greenhouse Gases in the Energy Sector of the Silesian Voivodeship

    GIS technology for the spatial inventory of greenhouse gases (carbon dioxide, methane, etc.) in the energy sector of the Silesia Region in Poland is presented. Georeferenced databases, GIS software, and international inventory methodologies have been used. Mathematical models have been created for the inventory of carbon dioxide, methane, and other greenhouse gases emitted during fuel combustion for electricity production, in the residential sector, in industry and construction, and in transport. These models make it possible to obtain the spatial distribution of total greenhouse gas emissions of the Silesia Region, taking into account the contribution of each district to the overall emission processes.
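The bottom-up inventory approach described here multiplies activity data (fuel burned, per district and sector) by fuel-specific emission factors. A minimal sketch of that calculation, with illustrative district names, fuel figures, and factors (only the order of magnitude of the coal and gas factors follows IPCC defaults, not actual Silesian data):

```python
# Bottom-up spatial GHG inventory sketch: per-district CO2 emissions as
# activity (TJ of fuel burned) times a fuel-specific emission factor.
# All names and numbers below are illustrative assumptions.

# Emission factors in t CO2 per TJ (order of magnitude of IPCC defaults:
# ~94.6 for hard coal, ~56.1 for natural gas).
EMISSION_FACTORS = {"hard_coal": 94.6, "natural_gas": 56.1}

# Hypothetical activity data keyed by (district, sector, fuel).
activity = {
    ("district_A", "electricity", "hard_coal"): 1200.0,
    ("district_A", "residential", "natural_gas"): 300.0,
    ("district_B", "industry", "hard_coal"): 800.0,
}

def co2_by_district(activity, factors):
    """Aggregate CO2 emissions (tonnes) per district."""
    totals = {}
    for (district, _sector, fuel), tj in activity.items():
        totals[district] = totals.get(district, 0.0) + tj * factors[fuel]
    return totals

totals = co2_by_district(activity, EMISSION_FACTORS)
```

Attaching geographic coordinates to each district key would then yield the spatial emission distribution the abstract describes.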

    Exploratory Analysis of Highly Heterogeneous Document Collections

    We present an effective multifaceted system for exploratory analysis of highly heterogeneous document collections. Our system is based on intelligently tagging individual documents in a purely automated fashion and exploiting these tags in a powerful faceted browsing framework. Tagging strategies employed include both unsupervised and supervised approaches based on machine learning and natural language processing. As one of our key tagging strategies, we introduce the KERA algorithm (Keyword Extraction for Reports and Articles). KERA extracts topic-representative terms from individual documents in a purely unsupervised fashion and is revealed to be significantly more effective than state-of-the-art methods. Finally, we evaluate our system in its ability to help users locate documents pertaining to military critical technologies buried deep in a large heterogeneous sea of information. Comment: 9 pages; KDD 2013: 19th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
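To illustrate what unsupervised tagging of individual documents means in practice, here is a deliberately simple sketch. This is not the KERA algorithm (whose details are given in the paper); it only shows the general idea of ranking topic-representative terms without supervision, here by plain term frequency over a small stopword list:

```python
import re
from collections import Counter

# Generic unsupervised keyword-extraction sketch (NOT KERA itself):
# rank a document's non-stopword terms by frequency.

STOPWORDS = {"the", "a", "of", "and", "in", "to", "for", "is", "on"}

def top_terms(text, k=3):
    """Return the k most frequent non-stopword terms in a document."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [term for term, _ in counts.most_common(k)]

doc = ("Faceted browsing of heterogeneous document collections relies on "
       "tagging documents; tagging heterogeneous collections is hard.")
keywords = top_terms(doc)
```

In a faceted browser, terms extracted this way per document become the tags users filter on.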

    Hydrogen-atom Attack on Phenol and Toluene is ortho-directed

    The reaction of H + phenol and H/D + toluene has been studied in a supersonic expansion after electric discharge. The (1 + 1′) resonance-enhanced multiphoton ionization (REMPI) spectra of the reaction products, at m/z = parent + 1, or parent + 2 amu, were measured by scanning the first (resonance) laser. The resulting spectra are highly structured. Ionization energies were measured by scanning the second (ionization) laser, while the first laser was tuned to a specific transition. Theoretical calculations, benchmarked to the well-studied H + benzene → cyclohexadienyl radical reaction, were performed. The spectrum arising from the reaction of H + phenol is attributed solely to the ortho-hydroxy-cyclohexadienyl radical, which was found in two conformers (syn and anti). Similarly, the reaction of H/D + toluene formed solely the ortho isomer. The preference for the ortho isomer at 100–200 K in the molecular beam is attributed to kinetic, not thermodynamic effects, caused by an entrance channel barrier that is ∼5 kJ mol−1 lower for ortho than for other isomers. Based on these results, we predict that the reaction of H + phenol and H + toluene should still favour the ortho isomer under elevated temperature conditions in the early stages of combustion (200–400 °C)
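The kinetic argument can be checked with a back-of-envelope Arrhenius estimate: a ~5 kJ mol−1 lower entrance barrier strongly favours the ortho channel at molecular-beam temperatures and still favours it, more weakly, at early-combustion temperatures. The sketch below assumes equal pre-exponential factors for the competing channels, which is a simplifying assumption, not a result from the paper:

```python
import math

R = 8.314      # gas constant, J mol^-1 K^-1
dEa = 5000.0   # barrier difference favouring ortho, J mol^-1 (~5 kJ/mol)

def ortho_rate_ratio(T):
    """k(ortho)/k(other isomer), assuming equal Arrhenius prefactors."""
    return math.exp(dEa / (R * T))

beam = ortho_rate_ratio(150.0)        # molecular-beam regime (100-200 K)
combustion = ortho_rate_ratio(573.0)  # early combustion (~200-400 degC)
```

At 150 K the ratio is roughly 50:1; near 573 K it drops to about 3:1, consistent with the prediction that the ortho preference survives, attenuated, at elevated temperature.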

    Taking advantage of the UNFCCC Kyoto Policy Process: What can we learn about learning?

    Learning is difficult to anticipate when it happens instantaneously, e.g. in the context of innovations [2]. However, even if learning is anticipated to happen continuously, it is difficult to grasp, e.g. when it occurs outside well-defined lab conditions, because adequate monitoring has not been put in place. Our study is retrospective. It focuses on the emissions of greenhouse gases (GHGs) that have been reported by countries (Parties) under the Kyoto Protocol (KP) to the United Nations Framework Convention on Climate Change (UNFCCC). Discussions range widely on (i) whether the KP is considered a failure [6] or a success [5]; and (ii) whether international climate policy should transition from a centralized model of governance to a 'hybrid' decentralized approach that combines country-level mitigation pledges with common principles for accounting and monitoring [1]. Emissions of GHGs - in the following we refer to CO2 emissions from burning fossil fuels at country level, particularly in the case of Austria - provide a perfect means to study learning in a globally relevant context. We are not aware of a similar data treasure of global relevance. Our mode of grasping learning is novel, i.e. it may have been referred to in general but, to the best of our knowledge, had not been quantified so far. (That is, we consider the KP a potential success story and advocate for the hybrid decentralized approach.) Learning requires 'measuring' differences or deviations. Here we follow Marland et al. [3], who discuss this issue in the context of emissions accounting: 'Many of the countries and organizations that make estimates of CO2 emissions provide annual updates in which they add another year of data to the time series and revise the estimates for earlier years. Revisions may reflect revised or more complete energy data and ... more complete and detailed understanding of the emissions processes and emissions coefficients. 
In short, we expect revisions to reflect learning and a convergence toward more complete and accurate estimates.' The UNFCCC requires exactly this to be done. Each year UNFCCC signatory countries are obliged to provide an annual inventory of emissions (and removals) of specified GHGs from five sectors (energy; industrial processes and product use; agriculture; land use, land use change and forestry; and waste) and to revisit the emissions (and removals) for all previous years, back to the country-specified base years (or periods). These data are made available by means of a database [4]. The time series of revised emission estimates reflect learning, but they are 'contaminated' by (i) structural change (e.g., when a coal-power plant is substituted by a gas-power plant); (ii) changes in consumption; and, rare but possible, (iii) methodological changes in surveying emission-related activities. De-trending the time series of revised emission estimates allows this contamination to be isolated by country, for which we provide three approaches: (I) a parametric approach employing a polynomial trend; (II) a non-parametric approach employing smoothing splines; and (III) an approach in which the most recent estimate is used as the trend. That is, after de-trending we are left, for each year, with a set of revisions that reflect 'pure' (uncontaminated) learning, which is expected to be independent of the year under consideration (i.e., identical from year to year). However, we are confronted with two non-negligible problems (P): (P.1) the problem of small numbers - the remaining differences in emissions are small (before and after de-trending); and (P.2) the problem of non-monotonic learning - our knowledge of emission-generating activities and emission factors may not become more accurate from revision to revision
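De-trending approach (III) is the simplest to state: treat the most recent revision as the trend and examine how earlier reporting rounds deviated from it. A minimal sketch with illustrative numbers (not actual UNFCCC inventory data):

```python
# Approach (III): use the most recent revision as the trend and de-trend
# earlier reporting rounds against it. All figures are illustrative.

# revisions[v] = emission estimates (Mt CO2) for years 2000-2003 as
# reported in round v; each round revisits all earlier years.
revisions = [
    [65.0, 66.2, 67.1, 68.0],  # round 1
    [64.6, 65.9, 67.3, 67.8],  # round 2
    [64.5, 65.8, 67.2, 67.7],  # round 3 (most recent -> the trend)
]

def detrended_deviations(revisions):
    """Deviation of each earlier round from the most recent estimates."""
    trend = revisions[-1]
    return [
        [est - t for est, t in zip(round_, trend)]
        for round_ in revisions[:-1]
    ]

devs = detrended_deviations(revisions)
```

Shrinking deviations from round to round are then read as convergence, i.e. learning; problem (P.1) shows up directly in how small these residuals are.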

    Assessing Information Systems and Computer Information Systems Programs from a Balanced Scorecard Perspective

    Assessment of educational programs is one of the important means used in academia for accountability, accreditation, and improvement of program quality. Assessment practices, guidelines, and requirements are very broad and vary widely among academic programs and from one institution to another. In this paper, through the theoretical lens of a strategic planning and management methodology, the Balanced Scorecard, we integrate various perspectives into a performance assessment framework for the educational assessment of computing and information systems. In particular, based on actual accreditation experience, we propose two assessment models: a conceptual model and a process model. This modeling approach addresses the critical conceptual elements required for educational assessment and provides practical guidelines to follow for a complete, smooth, and successful assessment process. In addition, we present a set of robust tools and techniques, incorporated into the process steps, teamwork, and task-driven management process. We were successful in our accreditation efforts and improved the quality of our computing and information systems programs by using these assessment methods. We share our views and thoughts in the form of lessons learned and suggested best practices so as to streamline program assessment and simplify its procedures and steps

    Imaging Gold Nanoparticles in Living Cells Environments using Heterodyne Digital Holographic Microscopy

    This paper describes an imaging microscopic technique based on heterodyne digital holography in which subwavelength-sized gold colloids can be imaged in a cellular environment. Surface cellular receptors of 3T3 mouse fibroblasts are labeled with 40 nm gold nanoparticles, and the biological specimen is imaged in a total internal reflection configuration with holographic microscopy. Owing to the higher scattering efficiency of the gold nanoparticles compared with that of cellular structures, accurate localization of a gold marker is obtained within a 3D mapping of the entire sample's scattered field, with a precision of 5 nm in the lateral (x, y) directions and 100 nm in the axial (z) direction, demonstrating the ability of holographic microscopy to locate nanoparticles in living cell environments

    Supplementary Material to Quantifying Memory and Persistence in the Atmosphere–Land/Ocean Carbon System

    The Supplementary Material download gives access to 1. the Supplementary Information (SI) file and 2. the Supplementary Data (SD) file supporting the manuscript Quantifying memory and persistence in the atmosphere–land/ocean carbon system, submitted for publication as a research article in Earth System Dynamics (https://doi.org/10.5194/esd-13-439-2022). The SI file is a PDF document. It combines ten sections which provide full explanations of the mathematics used in the manuscript, or other useful information, allowing the manuscript to be kept short and readable. The ten sections are referred to as Supplement Information 1 to Supplement Information 10 in the manuscript. The SD file is an Excel document. It consists of 16 worksheets, which are referred to as Supplement Data 1 to Supplement Data 16 in the manuscript. The introductory worksheet (Supplementary Data Guide) of the SD file provides a list of contents to guide a user through the 16 worksheets

    Spatial inventory of GHG emissions from fossil fuels extraction and processing: An uncertainty analysis

    This article discusses bottom-up inventory analysis of greenhouse gas (GHG) emissions from fossil fuel extraction and processing in Poland. Approaches to modelling geo-referenced cadastres of emissions from fossil fuel extraction and processing are described, as well as methods of uncertainty reduction that use knowledge of the spatial distribution of greenhouse gas emissions. The results of the spatial GHG emission inventory contain information on the geographical coordinates of emission sources, which is useful for identifying the largest sources. We present the results of the spatial GHG inventory for fossil fuel extraction and processing in Poland, based on the IPCC guidelines and taking into account the locations of emission sources, official statistics, and digital maps of the investigated territories. The Monte Carlo method was applied for a detailed estimation of GHG emissions and the uncertainty of the results in the main categories of the analyzed sector
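Monte Carlo uncertainty estimation for one source category can be sketched as follows: sample both the activity level and the emission factor from their uncertainty distributions, multiply, and read the confidence interval off the empirical percentiles. The distributions and numbers below are illustrative assumptions, not the article's Polish inventory data:

```python
import random

# Monte Carlo sketch of emission uncertainty for one source category:
# emission = activity x emission factor, both uncertain (Gaussian here).
# All values are illustrative, not the article's data.

random.seed(42)

ACTIVITY_MEAN, ACTIVITY_SD = 1000.0, 50.0   # e.g. kt of coal mined
EF_MEAN, EF_SD = 94.6, 7.0                  # t CO2 per unit activity

def monte_carlo_emission(n=100_000):
    samples = [
        random.gauss(ACTIVITY_MEAN, ACTIVITY_SD) * random.gauss(EF_MEAN, EF_SD)
        for _ in range(n)
    ]
    samples.sort()
    mean = sum(samples) / n
    # 95 % confidence interval from the 2.5th and 97.5th percentiles
    lo, hi = samples[int(0.025 * n)], samples[int(0.975 * n)]
    return mean, lo, hi

mean, lo, hi = monte_carlo_emission()
```

Running this per category and per geographic cell, then aggregating, gives both the spatial cadastre and its uncertainty bounds.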

    Quantifying memory and persistence in the atmosphere–land and ocean carbon system

    Here we intend to further the understanding of the planetary burden (and its dynamics) caused by the effect of the continued increase in carbon dioxide (CO2) emissions from fossil fuel burning and land use as well as by global warming from a new rheological (stress–strain) perspective. That is, we perceive the emission of anthropogenic CO2 into the atmosphere as a stressor and survey the condition of Earth in stress–strain units (stress in units of Pa, strain in units of 1) – allowing access to and insight into previously unknown characteristics reflecting Earth's rheological status. We use the idea of a Maxwell body consisting of elastic and damping (viscous) elements to reflect the overall behavior of the atmosphere–land and ocean system in response to the continued increase in CO2 emissions between 1850 and 2015. Thus, from the standpoint of a global observer, we see that the CO2 concentration in the atmosphere is increasing (rather quickly). Concomitantly, the atmosphere is warming and expanding, while some of the carbon is being locked away (rather slowly) in land and oceans, likewise under the influence of global warming. It is not known how reversible and how out of sync the latter process (uptake of carbon by sinks) is in relation to the former (expansion of the atmosphere). All we know is that the slower process remembers the influence of the faster one, which runs ahead. Important questions arise as to whether this global-scale memory – Earth's memory – can be identified and quantified, how it behaves dynamically, and, last but not least, how it interlinks with persistence by which we understand Earth's path dependency. We go beyond textbook knowledge by introducing three parameters that characterize the system: delay time, memory, and persistence. The three parameters depend, ceteris paribus, solely on the system's characteristic viscoelastic behavior and allow deeper and novel insights into that system. 
The parameters come with their own limits which govern the behavior of the atmosphere–land and ocean carbon system, independently from any external target values (such as temperature targets justified by means of global change research). We find that since 1850, the atmosphere–land and ocean system has been trapped progressively in terms of persistence (i.e., it will become progressively more difficult to relax the system), while its ability to build up memory has been reduced. The ability of a system to build up memory effectively can be understood as its ability to respond still within its natural regime or, if the build-up of memory is limited, as a measure for system failures globally in the future. Approximately 60 % of Earth's memory had already been exploited by humankind prior to 1959. Based on these stress–strain insights we expect that the atmosphere–land and ocean carbon system will be forced outside its natural regime well before 2050 if the current trend in emissions is not reversed immediately and sustainably
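The Maxwell body invoked above is a spring and a dashpot in series: under constant stress the strain has an instantaneous elastic part plus a slowly creeping viscous part, and under fixed strain the stress relaxes with a characteristic time τ = η/E, the rheological analogue of the delay and memory discussed here. A minimal numerical sketch with illustrative parameter values (not the paper's fitted values):

```python
import math

# Maxwell body (spring + dashpot in series), the rheological analogue used
# in the abstract. Under constant stress sigma:
#   eps(t) = sigma/E + sigma*t/eta          (creep)
# and at fixed strain eps0:
#   sigma(t) = E*eps0*exp(-t/tau), tau = eta/E  (relaxation / "memory")
# E and eta below are illustrative, not the paper's values.

E = 2.0        # elastic modulus (Pa)
eta = 10.0     # viscosity (Pa s)
tau = eta / E  # relaxation (delay) time

def creep_strain(sigma, t):
    """Strain of a Maxwell body at time t under constant stress sigma."""
    return sigma / E + sigma * t / eta

def stress_relaxation(eps0, t):
    """Stress decay at fixed strain eps0: the fading memory of loading."""
    return E * eps0 * math.exp(-t / tau)
```

The fast elastic response running ahead of the slow viscous one is exactly the out-of-sync behavior the abstract describes for the expanding atmosphere versus the slow carbon uptake by land and ocean sinks.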