161 research outputs found

    Seasonal behaviour of tidal damping and residual water level slope in the Yangtze River estuary: identifying the critical position and river discharge for maximum tidal damping

    As a tide propagates into an estuary, river discharge affects tidal damping primarily via the friction term: it attenuates the tidal motion by increasing the quadratic velocity in the numerator, while reducing the effective friction by increasing the water depth in the denominator. For the first time, we demonstrate a third effect of river discharge that may lead to a weakening of the channel convergence (i.e. the landward reduction of channel width and/or depth). In this study, monthly averaged tidal water levels (2003–2014) at six gauging stations along the Yangtze River estuary are used to understand the seasonal behaviour of tidal damping and residual water level slope. Observations show that there is a critical value of river discharge beyond which tidal damping is reduced with increasing river discharge. This phenomenon is clearly observed in the upstream part of the Yangtze River estuary (between the Maanshan and Wuhu reaches), which suggests an important cumulative effect of residual water level on tide–river dynamics. To understand the underlying mechanism, an analytical model is used to quantify the seasonal behaviour of tide–river dynamics and the corresponding residual water level slope under various external forcing conditions. It is shown that a critical position along the estuary…
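    Reader's note: the friction mechanism summarized above corresponds to the standard quadratic friction term of one-dimensional open-channel flow, sketched below in generic symbols (our notation, not necessarily the paper's).

```latex
% Quadratic bottom-friction term in the 1-D momentum balance (Chezy form):
%   U = cross-sectionally averaged velocity, h = water depth,
%   C = Chezy friction coefficient, g = gravitational acceleration.
F \;=\; \frac{g\,U\,\lvert U\rvert}{C^{2}\,h}
% A higher river discharge increases U|U| (the numerator), strengthening
% tidal damping, while the associated rise in residual water level
% increases h (the denominator), reducing the effective friction.
```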

    Cathode Pressure Control of Air Supply System for PEMFC

    This paper proposes a backstepping controller for the polymer electrolyte membrane fuel cell (PEMFC) air supply system. The control objective is to adjust the cathode pressure to its reference value quickly, in order to avoid an excessive pressure difference between the anode and cathode in practice. To account for model uncertainty and disturbances, we design an extended state observer (ESO) to estimate the disturbances. A backstepping method is then used to derive the control law. Finally, experimental results demonstrate the effectiveness and robustness of the control strategy.
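    Reader's note: the sketch below illustrates the general ESO-plus-backstepping structure described in the abstract on a deliberately simplified first-order pressure model; the plant dynamics, gains, and disturbance are all illustrative assumptions, not the paper's design.

```python
import numpy as np

# Assumed toy plant: dp/dt = b*u + d(t), with p the cathode pressure,
# u the air-supply actuation, and d(t) a lumped unknown disturbance.
b = 5.0                        # assumed input gain
dt = 1e-3                      # integration step [s]
k = 20.0                       # tracking gain (scalar backstepping step)
beta1, beta2 = 200.0, 1.0e4    # extended state observer (ESO) gains

p, p_ref = 100.0, 120.0        # initial and reference cathode pressure
z1, z2 = p, 0.0                # ESO states: estimates of p and of d(t)

for step in range(5000):
    t = step * dt
    d = 10.0 * np.sin(2 * np.pi * t)   # unknown disturbance (simulation only)

    # Control law: cancel the estimated disturbance z2 and drive the
    # tracking error e = p - p_ref to zero. For this scalar model the
    # backstepping design collapses to a single Lyapunov step.
    e = p - p_ref
    u = (-k * e - z2) / b

    # Plant update (explicit Euler).
    p += dt * (b * u + d)

    # ESO update: z1 tracks the measured pressure, z2 the disturbance.
    err = p - z1
    z1 += dt * (b * u + z2 + beta1 * err)
    z2 += dt * (beta2 * err)

print(f"final pressure {p:.2f} (reference {p_ref}), "
      f"disturbance estimate {z2:.2f}")
```

    With the ESO estimating the lumped disturbance, the controller only needs a single proportional term on the tracking error, which is what makes this structure robust to the model uncertainty mentioned in the abstract.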

    Learning to Break the Loop: Analyzing and Mitigating Repetitions for Neural Text Generation

    While large-scale neural language models, such as GPT2 and BART, have achieved impressive results on various text generation tasks, they tend to get stuck in undesirable sentence-level loops under maximization-based decoding algorithms (e.g., greedy search). This phenomenon is counter-intuitive, since consecutive sentence-level repetitions are rare in human corpora (e.g., 0.02% in Wikitext-103). To investigate the underlying reasons for generating consecutive sentence-level repetitions, we study the relationship between the probabilities of the repeated tokens and the number of their previous repetitions in the context. Through quantitative experiments, we find that 1) language models have a preference to repeat the previous sentence; 2) sentence-level repetitions have a self-reinforcement effect: the more times a sentence is repeated in the context, the higher the probability of continuing to generate that sentence; and 3) sentences with higher initial probabilities usually have a stronger self-reinforcement effect. Motivated by these findings, we propose a simple and effective training method, DITTO (PseuDo-RepetITion PenalizaTiOn), in which the model learns to penalize the probabilities of sentence-level repetitions using pseudo-repetitive data. Although the method is motivated by mitigating repetitions, experiments show that DITTO not only mitigates the repetition issue without sacrificing perplexity, but also achieves better generation quality. Extensive experiments on open-ended text generation (Wikitext-103) and text summarization (CNN/DailyMail) demonstrate the generality and effectiveness of our method.

    Comment: Accepted by NeurIPS 2022. Code is released at https://github.com/Jxu-Thu/DITT
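    Reader's note: the sketch below gives one plausible reading of the pseudo-repetition-penalization idea, not the released DITTO code; `model` (assumed to be a causal LM returning `.logits` in Hugging Face style), `sent_ids`, `n_reps`, and `lam` are all our assumptions.

```python
import torch

def ditto_style_loss(model, sent_ids, n_reps=8, lam=0.5):
    """Penalize sentence-level self-reinforcement on a pseudo-repetitive
    sample (a sketch of the idea; the official implementation may differ)."""
    L = sent_ids.numel()
    input_ids = sent_ids.repeat(n_reps).unsqueeze(0)   # (1, n_reps*L)
    probs = model(input_ids).logits.softmax(-1)        # (1, T, vocab)

    # Probability assigned to each ground-truth token under teacher forcing.
    tp = probs[0, :-1].gather(-1, input_ids[0, 1:].unsqueeze(-1)).squeeze(-1)

    # rep[l - 1, i]: probability of token i of the sentence in its l-th
    # repetition, for l = 1 .. n_reps - 1.
    rep = torch.stack(
        [tp[l * L - 1 : (l + 1) * L - 1] for l in range(1, n_reps)])

    # Ask each repetition's probability to decay toward lam times the
    # previous repetition's, instead of self-reinforcing toward 1.
    gap = (rep[1:] - lam * rep[:-1].detach()).abs().clamp(max=1.0 - 1e-6)
    return -torch.log(1.0 - gap).mean()
```

    In training, a loss like this would presumably be mixed with the ordinary language-modeling objective on real data, so the model learns to suppress self-reinforcing repetitions without degrading perplexity.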