41 research outputs found

    Ecological network design based on optimizing ecosystem services: case study in the Huang-Huai-Hai region, China

    Get PDF
    In modern agricultural landscapes, constructing ‘ecological networks’ is regarded as an efficient way to conserve biodiversity and maintain ecosystem services. Here we aimed to develop an approach to design ecological corridors by employing the ecological source - resistance surface - ecological corridor framework in combination with semi-natural habitat planning and ecosystem service trade-off assessment. ‘Ecological source patches’ were identified based on a ‘Remote Sensing Ecological Index’ (RSEI) to objectively classify ecological and environmental conditions. The resulting spatial resistance surface was further modified based on the ‘Cultivated Land Use Intensity’ index to improve the accuracy and rationality of ecological corridor extraction in agricultural landscapes. While planning the ecological network, key nodes and the resulting semi-natural habitat (SNH) distribution were identified using Linkage Mapper tools and circuit theory. We constructed ecological network scenarios with different amounts of semi-natural habitats and calculated the resulting regional ecosystem service values (ESV) using an equivalence factor method to explore optimal spatial layouts. The results showed that, while regional ecosystem service values generally increased in line with the semi-natural habitat area contained within the ecological network, ecological networks with forests covering 10% of the total area were predicted to be the optimal scenario balancing ecosystem services with agricultural yield in the study region. Networks with mixed forest and grassland cover totaling 20% of the area represented an alternative choice that strongly enhanced regional ecosystem services while potentially still allowing for high agricultural productivity.
    In constructing corridors, priority should be given to identifying, restoring and protecting key ecological nodes through targeted management and habitat restoration, while protecting existing wetlands and other water bodies that support regional water cycle and supply services. Regional policy measures furthermore need to promote targeted ecological network planning to help improve the overall sustainability of agricultural production.
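The equivalence factor method mentioned above values each land-use type as area times an equivalence factor times a standard unit value. A minimal sketch follows; the factors, the unit value, and the scenario areas are illustrative placeholders, not the values used in the study.

```python
# Hedged sketch of the equivalence factor method for regional ecosystem
# service value (ESV). The equivalence factors and the standard unit value
# below are hypothetical, not those used in the study.

# Per-hectare equivalence factors (dimensionless, hypothetical)
EQUIVALENCE_FACTORS = {"forest": 3.0, "grassland": 1.5, "wetland": 7.0, "cropland": 1.0}

# Value of one equivalence unit (currency per hectare per year, assumed)
STANDARD_UNIT_VALUE = 1000.0

def regional_esv(areas_ha):
    """ESV = sum over land-use types of area x equivalence factor x unit value."""
    return sum(area * EQUIVALENCE_FACTORS[lu] * STANDARD_UNIT_VALUE
               for lu, area in areas_ha.items())

# Two scenarios on a 10,000 ha landscape: 10% forest vs. 20% mixed forest/grassland
scenario_forest10 = {"forest": 1000.0, "cropland": 9000.0}
scenario_mixed20 = {"forest": 1000.0, "grassland": 1000.0, "cropland": 8000.0}
print(regional_esv(scenario_forest10), regional_esv(scenario_mixed20))
```

Comparing scenarios then reduces to evaluating `regional_esv` for each candidate spatial layout alongside its expected agricultural yield.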

    Collaborative Noisy Label Cleaner: Learning Scene-aware Trailers for Multi-modal Highlight Detection in Movies

    Full text link
    Movie highlights stand out of the screenplay for efficient browsing and play a crucial role on social media platforms. Building on existing efforts, this work makes two observations: (1) labeling highlights is uncertain across different annotators, which leads to inaccurate and time-consuming annotations; (2) beyond previous supervised or unsupervised settings, some existing video corpora can be useful, e.g., trailers, but they are often noisy and incomplete with respect to covering the full highlights. In this work, we study a more practical and promising setting, i.e., reformulating highlight detection as "learning with noisy labels". This setting does not require time-consuming manual annotations and can fully utilize existing abundant video corpora. First, based on movie trailers, we leverage scene segmentation to obtain complete shots, which are regarded as noisy labels. Then, we propose a Collaborative noisy Label Cleaner (CLC) framework to learn from noisy highlight moments. CLC consists of two modules: augmented cross-propagation (ACP) and multi-modality cleaning (MMC). The former aims to exploit the closely related audio-visual signals and fuse them to learn unified multi-modal representations. The latter aims to achieve cleaner highlight labels by observing the changes in losses among different modalities. To verify the effectiveness of CLC, we further collect a large-scale highlight dataset named MovieLights. Comprehensive experiments on the MovieLights and YouTube Highlights datasets demonstrate the effectiveness of our approach. Code has been made available at https://github.com/TencentYoutuResearch/HighlightDetection-CLC. Comment: Accepted to CVPR202
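The multi-modality cleaning idea described above can be sketched in the spirit of small-loss selection: a noisy-labeled shot is trusted only when its losses are small in both modalities. This is an illustrative sketch of the general technique, not the authors' CLC code; the function name and threshold rule are hypothetical.

```python
# Illustrative sketch of cleaning noisy highlight labels by comparing
# per-modality losses: a shot survives only if its visual AND audio losses
# both fall in the lowest `keep_ratio` fraction of their modality.
def clean_labels(visual_losses, audio_losses, keep_ratio=0.5):
    """Return indices of shots kept as (cleaner) highlight labels."""
    def threshold(losses):
        # loss value at the keep_ratio quantile (small-loss criterion)
        return sorted(losses)[max(0, int(len(losses) * keep_ratio) - 1)]
    tv, ta = threshold(visual_losses), threshold(audio_losses)
    return [i for i in range(len(visual_losses))
            if visual_losses[i] <= tv and audio_losses[i] <= ta]

# Shots 0 and 2 have small, agreeing losses in both modalities and survive
print(clean_labels([0.1, 0.9, 0.2, 0.8], [0.2, 0.7, 0.1, 0.9]))  # → [0, 2]
```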

    D3G: Exploring Gaussian Prior for Temporal Sentence Grounding with Glance Annotation

    Full text link
    Temporal sentence grounding (TSG) aims to locate a specific moment in an untrimmed video given a natural language query. Weakly supervised methods still show a large performance gap compared to fully supervised ones, while the latter require laborious timestamp annotations. In this study, we aim to reduce the annotation cost while keeping performance on the TSG task competitive with fully supervised methods. To achieve this goal, we investigate a recently proposed glance-supervised temporal sentence grounding task, which requires only a single-frame annotation (referred to as glance annotation) for each query. Under this setup, we propose a Dynamic Gaussian prior based Grounding framework with Glance annotation (D3G), which consists of a Semantic Alignment Group Contrastive Learning module (SA-GCL) and a Dynamic Gaussian prior Adjustment module (DGA). Specifically, SA-GCL samples reliable positive moments from a 2D temporal map by jointly leveraging the Gaussian prior and semantic consistency, which contributes to aligning positive sentence-moment pairs in the joint embedding space. Moreover, to alleviate the annotation bias resulting from glance annotation and to model complex queries consisting of multiple events, we propose the DGA module, which dynamically adjusts the distribution to approximate the ground truth of target moments. Extensive experiments on three challenging benchmarks verify the effectiveness of the proposed D3G. It outperforms state-of-the-art weakly supervised methods by a large margin and narrows the performance gap to fully supervised methods. Code is available at https://github.com/solicucu/D3G. Comment: ICCV202
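The Gaussian prior centered on the glance frame can be sketched as a weighting over candidate clips, so that moments near the annotated frame are sampled as positives with higher probability. A minimal 1D sketch follows; the function signature and parameters are hypothetical and not D3G's API.

```python
# Sketch of a Gaussian prior over candidate clips, centered on the single
# annotated (glance) frame. Illustrative only; parameters are assumptions.
import math

def gaussian_prior_weights(num_clips, glance_idx, sigma):
    """Weight each candidate clip by a Gaussian centered at the glance index,
    then normalize the weights into a sampling distribution."""
    weights = [math.exp(-((i - glance_idx) ** 2) / (2 * sigma ** 2))
               for i in range(num_clips)]
    total = sum(weights)
    return [w / total for w in weights]

w = gaussian_prior_weights(num_clips=8, glance_idx=3, sigma=1.5)
# Clips nearest the glance frame receive the highest sampling probability
print(max(range(8), key=lambda i: w[i]))  # → 3
```

D3G's DGA module then adjusts this distribution dynamically during training rather than keeping it fixed.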

    Unified and Dynamic Graph for Temporal Character Grouping in Long Videos

    Full text link
    Video temporal character grouping locates the appearing moments of major characters within a video according to their identities. To this end, recent works have evolved from unsupervised clustering to graph-based supervised clustering. However, graph methods are built upon the premise of fixed affinity graphs, which introduces many inexact connections. Besides, they extract multi-modal features with several different models, which is unfriendly to deployment. In this paper, we present a unified and dynamic graph (UniDG) framework for temporal character grouping. This is accomplished, firstly, by a unified representation network that learns representations of multiple modalities within the same space while still preserving each modality's uniqueness. Secondly, we present dynamic graph clustering, in which neighbors of different quantities are dynamically constructed for each node via a cyclic matching strategy, leading to a more reliable affinity graph. Thirdly, a progressive association method is introduced to exploit spatial and temporal contexts among different modalities, allowing multi-modal clustering results to be well fused. As current datasets only provide pre-extracted features, we evaluate our UniDG method on a collected dataset named MTCG, which contains each character's appearing clips of face and body and speaking voice tracks. We also evaluate our key components on existing clustering and retrieval datasets to verify their generalization ability. Experimental results demonstrate that our method achieves promising results and outperforms several state-of-the-art approaches.
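One way to read the cyclic matching strategy is as reciprocal nearest-neighbor filtering: node j is kept as a neighbor of node i only if i also appears among j's most similar nodes, so the neighbor count naturally varies per node. The sketch below illustrates that general idea under this interpretation; it is not the UniDG implementation.

```python
# Sketch of building a dynamic affinity graph via mutual (cyclic) matching:
# an edge i-j survives only if each node is in the other's top-k list, so
# different nodes end up with different numbers of neighbors.
def mutual_neighbors(similarity, k):
    n = len(similarity)
    # top-k most similar nodes for each node (self excluded)
    topk = [set(sorted(range(n), key=lambda j: -similarity[i][j])[:k]) - {i}
            for i in range(n)]
    # keep only reciprocal matches, sorted for stable output
    return [sorted(j for j in topk[i] if i in topk[j]) for i in range(n)]

# Two tight pairs (0,1) and (2,3): mutual matching recovers exactly them
sim = [[1.0, 0.9, 0.1, 0.2],
       [0.9, 1.0, 0.3, 0.1],
       [0.1, 0.3, 1.0, 0.8],
       [0.2, 0.1, 0.8, 1.0]]
print(mutual_neighbors(sim, k=2))  # → [[1], [0], [3], [2]]
```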


    Measurement of inclusive and differential cross sections for single top quark production in association with a W boson in proton-proton collisions at √s = 13 TeV

    No full text
    Measurements of the inclusive and normalised differential cross sections are presented for the production of single top quarks in association with a W boson in proton-proton collisions at a centre-of-mass energy of 13 TeV. The data used were recorded with the CMS detector at the LHC during 2016–2018, and correspond to an integrated luminosity of 138 fb⁻¹. Events containing one electron and one muon in the final state are analysed. For the inclusive measurement, a multivariate discriminant, exploiting the kinematic properties of the events, is used to separate the signal from the dominant tt̄ background. A cross section of 79.2 ± 0.9 (stat) +7.7/−8.0 (syst) ± 1.2 (lumi) pb is obtained, consistent with the predictions of the standard model. For the differential measurements, a fiducial region is defined according to the detector acceptance and the requirement of exactly one jet coming from the fragmentation of a bottom quark. The resulting distributions are unfolded to particle level and agree with the predictions at next-to-leading order in perturbative quantum chromodynamics.

    Measurement of the tt̄ charge asymmetry in events with highly Lorentz-boosted top quarks in pp collisions at √s = 13 TeV

    No full text
    The measurement of the charge asymmetry in top quark pair events with highly Lorentz-boosted top quarks decaying to a single lepton and jets is presented. The analysis is performed using proton-proton collisions at √s = 13 TeV with the CMS detector at the LHC, corresponding to an integrated luminosity of 138 fb⁻¹. The selection is optimized for top quarks produced with large Lorentz boosts, resulting in nonisolated leptons and overlapping jets. The top quark charge asymmetry is measured for events with a tt̄ invariant mass larger than 750 GeV and corrected for detector and acceptance effects using a binned maximum likelihood fit. The measured top quark charge asymmetry of (0.42 +0.64/−0.69)% is in good agreement with the standard model prediction at next-to-next-to-leading order in quantum chromodynamic perturbation theory with next-to-leading-order electroweak corrections. The result is also presented for two invariant mass ranges, 750–900 GeV and >900 GeV.
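For context, the tt̄ charge asymmetry is conventionally defined from the difference of absolute rapidities of the top quark and antiquark (the standard definition used at the LHC, shown here for reference rather than taken from this abstract):

```latex
A_C \;=\; \frac{N(\Delta|y| > 0) \,-\, N(\Delta|y| < 0)}
               {N(\Delta|y| > 0) \,+\, N(\Delta|y| < 0)},
\qquad
\Delta|y| \;=\; |y_{\mathrm{t}}| - |y_{\bar{\mathrm{t}}}|
```

where N counts events in each category after the unfolding to parton or particle level.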

    Search for pair production of vector-like quarks in leptonic final states in proton-proton collisions at √s = 13 TeV

    No full text
    A search is presented for vector-like T and B quark-antiquark pairs produced in proton-proton collisions at a center-of-mass energy of 13 TeV. Data were collected by the CMS experiment at the CERN LHC in 2016–2018, with an integrated luminosity of 138 fb⁻¹. Events are separated into single-lepton, same-sign charge dilepton, and multilepton channels. In the analysis of the single-lepton channel, a multilayer neural network and jet identification techniques are employed to select signal events, while the same-sign dilepton and multilepton channels rely on the high-energy signature of the signal to distinguish it from standard model backgrounds. The data are consistent with standard model background predictions, and the production of vector-like quark pairs is excluded at 95% confidence level for T quark masses up to 1.54 TeV and B quark masses up to 1.56 TeV, depending on the branching fractions assumed, with maximal sensitivity to decay modes that include multiple top quarks. The limits obtained in this search are the strongest to date for TT̄ production, excluding masses below 1.48 TeV for all decays to third-generation quarks, and the strongest to date for BB̄ production with B quark decays to tW.

    Search for new physics using effective field theory in 13 TeV pp collision events that contain a top quark pair and a boosted Z or Higgs boson

    No full text
    A data sample containing top quark pairs (tt̄) produced in association with a Lorentz-boosted Z or Higgs boson is used to search for signs of new physics using effective field theory. The data correspond to an integrated luminosity of 138 fb⁻¹ of proton-proton collisions produced at a center-of-mass energy of 13 TeV at the LHC and collected by the CMS experiment. Selected events contain a single lepton and hadronic jets, including two identified with the decay of bottom quarks, plus an additional large-radius jet with high transverse momentum identified as a Z or Higgs boson decaying to a bottom quark pair. Machine learning techniques are employed to discriminate between tt̄Z or tt̄H events and events from background processes, which are dominated by tt̄ + jets production. No indications of new physics are observed. The signal strengths of boosted tt̄Z and tt̄H production are measured, and upper limits are placed on the tt̄Z and tt̄H differential cross sections as functions of the Z or Higgs boson transverse momentum. The effects of new physics are probed using a framework in which the standard model is considered to be the low-energy effective field theory of a higher-energy theory. Eight possible dimension-six operators are added to the standard model Lagrangian and their corresponding coefficients are constrained via fits to the data.
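The effective-field-theory framework referenced above extends the standard model Lagrangian with higher-dimensional operators suppressed by the new-physics scale; in the standard SMEFT form (shown for context, with the operator set truncated at dimension six as in the abstract):

```latex
\mathcal{L}_{\mathrm{EFT}}
  \;=\; \mathcal{L}_{\mathrm{SM}}
  \;+\; \sum_{i} \frac{c_i}{\Lambda^{2}}\, \mathcal{O}_i^{(6)}
```

where the \(\mathcal{O}_i^{(6)}\) are the dimension-six operators, the Wilson coefficients \(c_i\) are the parameters constrained by the fits, and \(\Lambda\) is the assumed scale of new physics.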