
    Water quality assessment, trophic classification and water resources management

    Quantification of water quality (WQ) is an integral part of scientifically based water resources management. The main objective of this study was a comparative analysis of two approaches to quantitative WQ assessment: the trophic level index (TLI) and the Delphi method (DM). We analyzed the following features of these conceptually different approaches: A. similarity of the resulting lake WQ estimates; B. sensitivity in indicating disturbances of aquatic ecosystem structure and functioning; C. capacity to reflect the impact of major management measures on the quality of water resources. We compared the DM and TLI using results from a series of lakes covering varying productivity levels, mixing regimes and climatic zones. We assumed that conserving the aquatic ecosystem in some predefined, “reference”, state is a major objective of sustainable water resources management in the study lakes. The comparison between the two approaches was quantified as a relationship between the DM ranks and the respective TLI values. We show that, being a classification system, the TLI does not account for the specific characteristics of aquatic ecosystems or the array of potential uses of the water resource. It implicitly assumes that oligotrophication is identical to WQ improvement and that reducing economic activity within the lake catchment area is the most effective way to improve WQ. WQ assessed with the TLI is therefore most suitable for water resources management when eutrophication is the major threat. The DM can account for several uses of the water resource, and may therefore serve as a more robust and comprehensive tool for WQ quantification, and thus for sustainable water resources management.
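    A trophic-level index of the kind compared above is typically an average of per-indicator sub-indices of the form a + b·log10(value). The sketch below illustrates that aggregation only; the coefficients and indicator set are hypothetical placeholders, not the calibrated constants of any published TLI.

```python
# Illustrative TLI-style aggregation: each indicator maps to a
# sub-index a + b*log10(value); the sub-indices are averaged.
# The (a, b) pairs below are hypothetical, for illustration only.
import math

COEFFS = {
    "chlorophyll_a": (2.2, 2.5),     # mg/m^3
    "total_phosphorus": (0.2, 2.9),  # mg/m^3
    "total_nitrogen": (-3.6, 3.0),   # mg/m^3
}

def sub_index(indicator: str, value: float) -> float:
    a, b = COEFFS[indicator]
    return a + b * math.log10(value)

def tli(measurements: dict) -> float:
    """Average the per-indicator sub-indices into one trophic score."""
    return sum(sub_index(k, v) for k, v in measurements.items()) / len(measurements)

score = tli({"chlorophyll_a": 5.0,
             "total_phosphorus": 20.0,
             "total_nitrogen": 300.0})
```

    Because the result is a single scalar class boundary, it cannot encode lake-specific uses of the water resource, which is exactly the limitation the abstract attributes to the TLI.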

    The dynamics of employment growth: new evidence from 18 countries

    Motivated by the on-going interest of policy makers in the sources of job creation, this paper presents results from a new OECD project on the dynamics of employment (DynEmp) based on an innovative methodology using firm-level data (i.e. national business registers or similar sources). It demonstrates that among small and medium-sized enterprises (SMEs), young firms play a central role in creating jobs, whereas old SMEs tend to destroy jobs. This pattern holds robustly across 17 OECD countries and Brazil, extending recent evidence found in the United States. The paper also shows that young firms are always net job creators throughout the business cycle, even during the financial crisis. During the crisis, entry and post-entry growth by young firms were affected most heavily, although downsizing by old firms was responsible for most job losses. The results also highlight large cross-country differences in the growth potential of young firms, pointing to the role played by national policies in enabling successful firms to create jobs.

    The best versus the rest: divergence across firms during the global productivity slowdown

    We document that labor productivity of the globally most productive firms – the “frontier” – has diverged from all other firms – the “rest” – throughout the 2000s. This divergence remains after controlling for capital intensity and markups, and is strongest in ICT services, indicative of “winner-takes-all” dynamics. We also find weakening catch-up and market selection below the frontier, which can explain why this divergence at the firm level is linked to weaker aggregate productivity. The divergence is found to be stronger in industries where product market regulations are less competition friendly, highlighting the need for regulatory policy to improve the contestability of markets.

    Towards Safe Machine Learning for CPS: Infer Uncertainty from Training Data

    Machine learning (ML) techniques are increasingly applied to decision-making and control problems in Cyber-Physical Systems (CPS), many of which are safety-critical, e.g., chemical plants, robotics, and autonomous vehicles. Despite the significant benefits brought by ML techniques, they also raise additional safety issues because 1) the most expressive and powerful ML models are not transparent and behave as black boxes, and 2) the training data, which plays a crucial role in ML safety, is usually incomplete. An important technique for achieving safety with ML models is "Safe Fail": the model selects a reject option and applies a backup solution, for example a traditional controller or a human operator, when it has low confidence in a prediction. Data-driven models produced by ML algorithms learn from training data, and hence they are only as good as the examples they have learned from. As pointed out in [17], ML models work well in the "training space" (i.e., the part of the feature space with sufficient training data), but they cannot extrapolate beyond it. As observed in many previous studies, a region of the feature space that lacks training data generally has a much higher error rate than one that contains sufficient training samples [31]. Therefore, it is essential to identify the training space and avoid extrapolating beyond it. In this paper, we propose an efficient Feature Space Partitioning Tree (FSPT) to address this problem. Our experiments also show that a strong relationship exists between model performance and the FSPT score.
    Comment: Publication rights licensed to AC
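    The "reject outside the training space" idea above can be sketched with a simple grid-based density check. This toy stands in for the paper's FSPT (which partitions adaptively with a tree); the grid, thresholds, and data here are illustrative assumptions.

```python
# Toy "safe fail" gate (not the paper's FSPT): partition the feature
# space into a grid, count training samples per cell, and reject test
# points that land in sparsely populated cells.
import random

def cell_of(point, bins=4, lo=0.0, hi=1.0):
    """Map a point to its grid-cell index tuple."""
    width = (hi - lo) / bins
    return tuple(min(int((x - lo) / width), bins - 1) for x in point)

def build_density(train_points, bins=4):
    """Count how many training samples fall in each cell."""
    counts = {}
    for p in train_points:
        c = cell_of(p, bins)
        counts[c] = counts.get(c, 0) + 1
    return counts

def accept(point, counts, min_samples=3, bins=4):
    """Accept the model's prediction only inside the 'training space'."""
    return counts.get(cell_of(point, bins), 0) >= min_samples

random.seed(0)
# Training data covers only the lower-left quadrant of [0, 1]^2.
train = [(random.random() * 0.5, random.random() * 0.5) for _ in range(200)]
counts = build_density(train)
```

    A point in the covered quadrant is accepted, while one in an empty region triggers the reject option and falls back to the backup controller.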

    Vox-E: Text-guided Voxel Editing of 3D Objects

    Large scale text-guided diffusion models have garnered significant attention due to their ability to synthesize diverse images that convey complex visual concepts. This generative power has more recently been leveraged to perform text-to-3D synthesis. In this work, we present a technique that harnesses the power of latent diffusion models for editing existing 3D objects. Our method takes oriented 2D images of a 3D object as input and learns a grid-based volumetric representation of it. To guide the volumetric representation to conform to a target text prompt, we follow unconditional text-to-3D methods and optimize a Score Distillation Sampling (SDS) loss. However, we observe that combining this diffusion-guided loss with an image-based regularization loss that encourages the representation not to deviate too strongly from the input object is challenging, as it requires achieving two conflicting goals while viewing only structure-and-appearance coupled 2D projections. Thus, we introduce a novel volumetric regularization loss that operates directly in 3D space, utilizing the explicit nature of our 3D representation to enforce correlation between the global structure of the original and edited object. Furthermore, we present a technique that optimizes cross-attention volumetric grids to refine the spatial extent of the edits. Extensive experiments and comparisons demonstrate the effectiveness of our approach in creating a myriad of edits which cannot be achieved by prior work.
    Comment: Project webpage: https://tau-vailab.github.io/Vox-E
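    The SDS loss mentioned above updates the 3D representation along the frozen diffusion model's noise residual without backpropagating through the denoiser. The toy below illustrates only that update rule, with a stand-in linear "denoiser" and made-up target values; it is not the Vox-E pipeline.

```python
# Toy SDS-style update: noise the current parameters, query a frozen
# "denoiser" for its noise estimate eps_hat, and step the parameters
# along (eps_hat - eps) -- no gradient flows through the denoiser.
import numpy as np

rng = np.random.default_rng(0)
target = np.array([1.0, -2.0, 0.5])   # what the frozen model "prefers"

def denoiser(x_t, sigma):
    """Stand-in frozen score model: its noise estimate points x_t
    away from `target`, so the SDS step pulls params toward target."""
    return (x_t - target) / max(sigma, 1e-8)

params = np.zeros(3)                   # the representation being optimized
lr, sigma = 0.05, 0.5
for _ in range(500):
    eps = rng.standard_normal(3)
    x_t = params + sigma * eps         # forward-diffuse the current state
    eps_hat = denoiser(x_t, sigma)     # frozen model's noise prediction
    params -= lr * (eps_hat - eps)     # SDS-style residual update
```

    With this linear denoiser the added noise cancels exactly in the residual, so the parameters contract geometrically toward the target; the paper's contribution is balancing this pull against a volumetric regularizer that keeps the edit close to the input object.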

    Humidifier with a web interface control

    Final-year project (PFC) of the Erasmus EPS program, carried out at Instituto Superior de Engenharia do Porto. Work developed within the framework of the 'European Project Semester' program. In this text we show how to make a humidifier with a web interface control. The goal of our project is to keep the humidity level of an 80 m³ server room under full control. We go through, step by step, choosing the type of humidifier, its control, and the web interface we plan to build.
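    The core of such a controller is usually a hysteresis (bang-bang) loop around a setpoint, which the web interface then exposes. The sketch below shows that logic; the setpoint and dead band are illustrative values, not taken from the project.

```python
# Minimal hysteresis control loop for a humidifier: switch on below
# the dead band, off above it, and hold state inside it to avoid
# rapid on/off cycling. Thresholds are illustrative.
SETPOINT = 50.0    # target relative humidity, %
HYSTERESIS = 2.0   # half-width of the dead band, %

def next_state(humidity: float, running: bool) -> bool:
    """Return whether the humidifier should be on for this reading."""
    if humidity < SETPOINT - HYSTERESIS:
        return True        # too dry: switch on
    if humidity > SETPOINT + HYSTERESIS:
        return False       # humid enough: switch off
    return running         # inside dead band: keep current state
```

    A web interface would then only need two endpoints: one reporting the latest reading and state, and one updating SETPOINT.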

    Quantifying structural reforms in OECD countries: a new framework


    Late-Time Spectral Observations of the Strongly Interacting Type Ia Supernova PTF11kx

    PTF11kx was a Type Ia supernova (SN Ia) that showed time-variable absorption features, including saturated Ca II H&K lines that weakened and eventually went into emission. The strength of the emission component of Hα increased, implying that the SN was undergoing significant interaction with its circumstellar medium (CSM). These features were blueshifted slightly and showed a P-Cygni profile, likely indicating that the CSM was directly related to, and probably previously ejected by, the progenitor system itself. These and other observations led Dilday et al. (2012) to conclude that PTF11kx came from a symbiotic nova progenitor like RS Oph. In this work we extend the spectral coverage of PTF11kx to 124-680 rest-frame days past maximum brightness. These spectra of PTF11kx are dominated by Hα emission (with widths of ~2000 km/s), strong Ca II emission features (~10,000 km/s wide), and a blue "quasi-continuum" due to many overlapping narrow lines of Fe II. Emission from oxygen, He I, and Balmer lines higher than Hα is weak or completely absent at all epochs, leading to large observed Hα/Hβ intensity ratios. The broader (~2000 km/s) Hα emission appears to increase in strength with time for ~1 yr, but it subsequently decreases significantly along with the Ca II emission. Our latest spectrum also indicates the possibility of newly formed dust in the system as evidenced by a slight decrease in the red wing of Hα. During the same epochs, multiple narrow emission features from the CSM temporally vary in strength. The weakening of the Hα and Ca II emission at late times is possible evidence that the SN ejecta have overtaken the majority of the CSM and agrees with models of other strongly interacting SNe Ia. The varying narrow emission features, on the other hand, may indicate that the CSM is clumpy or consists of multiple thin shells.
    Comment: 12 pages, 7 figures, 1 table, re-submitted to Ap