
    Curriculum Reform of C Language Programming and Cultivation of Computational Thinking

    In the traditional teaching mode, students passively receive knowledge, which hinders the development of their thinking and limits the training of comprehensive analysis and innovation capabilities. Computational thinking is one of the basic objectives of computer education. This paper describes methods of using computational thinking to analyze and solve problems, combines them with the characteristics of C language programming, and illustrates them with an example drawn from both the theory and practice of teaching. Finally, reform proposals are put forward.

    Stock market trading volumes and economic uncertainty dependence: before and during Sino-U.S. trade friction

    This article studies the interaction between economic uncertainty and changes in stock market trading volumes before and during the Sino-U.S. trade friction, using multifractal detrended fluctuation analysis (MF-DFA) and multifractal detrended cross-correlation analysis (MF-DCCA). Our research aims to reveal whether the economic uncertainty heightened by the Sino-U.S. trade friction makes stock market trading volumes more susceptible to its fluctuations, and how policymakers can strengthen risk management and maintain financial stability. The results show that the dynamic volatility linkages between economic uncertainty and changes in stock market trading volumes are multifractal, and that the cross-correlation of these volatility linkages is anti-persistent. Through rolling-window analysis, we also find that economic uncertainty and trading volumes are dynamically cross-correlated in an anti-persistent manner: as economic uncertainty increases, trading volume decreases. Moreover, the Sino-U.S. trade friction significantly affects this cross-correlated behaviour, suggesting that U.S. stock market risks were relatively large and that trading volume changes were more susceptible to economic uncertainty during the trade friction. Our study complements the existing literature on the dependence between stock market trading volumes and economic uncertainty by applying methods from multifractal theory. The overall findings imply that the increased economic uncertainty caused by the Sino-U.S. trade friction exacerbates financial risks, which is useful information for policymakers and investors.
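    The MF-DFA procedure named above is straightforward to sketch. The following is a minimal, generic implementation on synthetic data (the scale grid, detrending order, and white-noise input are illustrative assumptions, not the authors' settings); the slope of log F_q(s) against log s estimates the generalized Hurst exponent h(q), whose dependence on q signals multifractality.

```python
import numpy as np

def mfdfa(x, scales, q=2.0, order=1):
    """Return the MF-DFA fluctuation function F_q(s) for each scale s."""
    y = np.cumsum(x - np.mean(x))                      # step 1: build the profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for v in range(n_seg):                         # step 2: split into segments
            seg = y[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            rms.append(np.mean((seg - trend) ** 2))    # step 3: detrended variance
        rms = np.asarray(rms)
        F.append(np.mean(rms ** (q / 2.0)) ** (1.0 / q))  # step 4: q-th order average
    return np.asarray(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)                        # white noise: h(2) should be near 0.5
scales = np.array([16, 32, 64, 128, 256])
F2 = mfdfa(x, scales, q=2.0)
h2 = np.polyfit(np.log(scales), np.log(F2), 1)[0]      # generalized Hurst exponent h(2)
```

    For two series, MF-DCCA replaces the segment variance with the detrended covariance of the two profiles; the resulting cross-correlation is called anti-persistent when the estimated exponent falls below 0.5.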

    Study designs of randomized controlled trials not based on Chinese medicine theory are improper

    Current biomedical research methods for evaluating the efficacy of Chinese medicine interventions are often conceptually incompatible with the theory and clinical practice of Chinese medicine. In this commentary, we (1) highlight the theory and principles underlying Chinese medicine clinical practice; (2) use ginseng as an example to describe clinical indications in Chinese medicine; (3) propose a framework guided by Chinese medicine theory for the evaluation of study designs in Chinese medicine research; and (4) evaluate 19 randomized, double-blind, placebo-controlled trials of ginseng. Our analysis indicates that all 19 trials, whether their results were positive or negative, confirm the specific effects of ginseng indicated by Chinese medicine theory. Study designs guided by Chinese medicine theory are necessary to validate and improve future randomized controlled clinical trials in Chinese medicine.

    The Application of Mobile Learning in College Experimental Teaching

    We first analyze the current state of higher education and the learning characteristics of experimental courses, and then introduce mobile devices into the teaching of experimental courses in colleges and universities. The introduction of mobile learning can meet the needs of higher education and achieve the requirements of teaching reform. This paper focuses on how to construct a learning platform that integrates mobile learning with experimental courses, as well as a session framework for designing mobile learning activities. Teaching practice shows that this method improves the efficiency of classroom teaching, expands the depth and breadth of students' study, and promotes the development of students' comprehensive abilities.

    Bridging Convex and Nonconvex Optimization in Robust PCA: Noise, Outliers, and Missing Data

    This paper delivers improved theoretical guarantees for the convex programming approach to low-rank matrix estimation in the presence of (1) random noise, (2) gross sparse outliers, and (3) missing data. This problem, often dubbed robust principal component analysis (robust PCA), finds applications in various domains. Despite the wide applicability of convex relaxation, the available statistical support (particularly the stability analysis vis-à-vis random noise) remains highly suboptimal, which we strengthen in this paper. When the unknown matrix is well-conditioned, incoherent, and of constant rank, we demonstrate that a principled convex program achieves near-optimal statistical accuracy, in terms of both the Euclidean loss and the ℓ∞ loss. All of this holds even when nearly a constant fraction of the observations are corrupted by outliers of arbitrary magnitude. The key analysis idea lies in bridging the convex program in use and an auxiliary nonconvex optimization algorithm -- hence the title of this paper.
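    For concreteness, the convex program in question -- principal component pursuit, minimize ‖L‖_* + λ‖S‖₁ subject to L + S = M -- can be solved by a short augmented-Lagrangian loop. The sketch below is a generic textbook solver run on synthetic data, not this paper's analysis; λ = 1/√max(m, n) and the μ schedule are common defaults taken as assumptions.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: prox of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(X, tau):
    """Entrywise soft thresholding: prox of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def robust_pca(M, n_iter=100, rho=1.5):
    """Inexact augmented-Lagrangian solver for principal component pursuit."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))
    mu = 1.25 / np.linalg.norm(M, 2)       # common initialisation; grows by rho each step
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)      # low-rank update
        S = shrink(M - L + Y / mu, lam / mu)   # sparse-outlier update
        Y = Y + mu * (M - L - S)               # dual ascent on the constraint L + S = M
        mu = min(mu * rho, 1e7)
    return L, S

rng = np.random.default_rng(1)
m = n = 60
L0 = rng.standard_normal((m, 2)) @ rng.standard_normal((2, n))   # rank-2 ground truth
S0 = np.zeros((m, n))
outliers = rng.random((m, n)) < 0.05                             # 5% gross outliers
S0[outliers] = 5.0 * rng.choice([-1.0, 1.0], size=outliers.sum())
L_hat, S_hat = robust_pca(L0 + S0)
rel_err = np.linalg.norm(L_hat - L0) / np.linalg.norm(L0)
```

    In this noiseless setting the low-rank component is recovered essentially exactly; the paper's contribution concerns what such a program guarantees once random noise and missing entries are added on top of the outliers.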

    Model-Based Reinforcement Learning for Offline Zero-Sum Markov Games

    This paper makes progress towards learning Nash equilibria in two-player zero-sum Markov games from offline data. Specifically, consider a γ-discounted infinite-horizon Markov game with S states, where the max-player has A actions and the min-player has B actions. We propose a pessimistic model-based algorithm with Bernstein-style lower confidence bounds -- called VI-LCB-Game -- that provably finds an ε-approximate Nash equilibrium with a sample complexity no larger than C*_clipped · S(A+B) / ((1−γ)³ ε²) (up to some log factor). Here, C*_clipped is a unilateral clipped concentrability coefficient that reflects the coverage and distribution shift of the available data (vis-à-vis the target data), and the target accuracy ε can be any value within (0, 1/(1−γ)]. Our sample complexity bound strengthens prior art by a factor of min{A, B}, achieving minimax optimality for the entire ε-range. An appealing feature of our result lies in its algorithmic simplicity, which reveals that variance reduction and sample splitting are unnecessary for achieving sample optimality. Comment: accepted to Operations Research.
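    VI-LCB-Game itself requires the offline dataset and the Bernstein-style confidence bounds, but its backbone is classical value iteration for zero-sum Markov games, where each backup solves a matrix game by linear programming. The sketch below shows only that backbone, on a hypothetical two-state game (an absorbing all-reward state fed by a matching-pennies state); the pessimistic lower-confidence-bound construction is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

def matrix_game_value(R):
    """Value and max-player strategy of the zero-sum matrix game R, via LP."""
    m, n = R.shape
    c = np.zeros(m + 1); c[-1] = -1.0                  # maximize v  <=>  minimize -v
    A_ub = np.hstack([-R.T, np.ones((n, 1))])          # v <= sum_i x_i R[i, j] for all j
    A_eq = np.zeros((1, m + 1)); A_eq[0, :m] = 1.0     # x is a probability vector
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=np.array([1.0]), bounds=bounds)
    return res.x[-1], res.x[:m]

def vi_zero_sum(R, P, gamma, n_iter=200):
    """Value iteration: V(s) <- val( R[s] + gamma * E_{s'}[V(s')] )."""
    V = np.zeros(R.shape[0])
    for _ in range(n_iter):
        V = np.array([matrix_game_value(R[s] + gamma * P[s] @ V)[0]
                      for s in range(R.shape[0])])
    return V

gamma = 0.9
R = np.zeros((2, 2, 2))                 # rewards R[s, a, b] for the max-player
R[0] = [[1.0, -1.0], [-1.0, 1.0]]       # state 0: matching pennies (stage value 0)
R[1] = 1.0                              # state 1: reward 1 regardless of actions
P = np.zeros((2, 2, 2, 2))              # transitions P[s, a, b, s']
P[:, :, :, 1] = 1.0                     # every action pair moves to absorbing state 1
V = vi_zero_sum(R, P, gamma)            # closed form: V[1] = 1/(1-gamma), V[0] = gamma*V[1]
```

    VI-LCB-Game augments such backups with data-driven lower confidence bounds, which is what guards against poor coverage of the offline dataset.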

    Inference and Uncertainty Quantification for Noisy Matrix Completion

    Noisy matrix completion aims at estimating a low-rank matrix given only partial and corrupted entries. Despite substantial progress in designing efficient estimation algorithms, it remains largely unclear how to assess the uncertainty of the obtained estimates and how to perform statistical inference on the unknown matrix (e.g., constructing a valid and short confidence interval for an unseen entry). This paper takes a step towards inference and uncertainty quantification for noisy matrix completion. We develop a simple procedure to compensate for the bias of the widely used convex and nonconvex estimators. The resulting de-biased estimators admit nearly precise non-asymptotic distributional characterizations, which in turn enable optimal construction of confidence intervals/regions for, say, the missing entries and the low-rank factors. Our inferential procedures do not rely on sample splitting, thus avoiding unnecessary loss of data efficiency. As a byproduct, we obtain a sharp characterization of the estimation accuracy of our de-biased estimators, which, to the best of our knowledge, are the first tractable algorithms that provably achieve full statistical efficiency (including the pre-constant). The analysis herein is built upon the intimate link between convex and nonconvex optimization -- an appealing feature recently discovered by Chen et al. (2019). Comment: published at Proceedings of the National Academy of Sciences, Nov 2019, 116 (46), 22931-2293.
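    The flavour of such a de-biasing step can be demonstrated numerically: a nuclear-norm-penalized estimate has shrunken singular values, and adding back the inverse-propensity-weighted residual on the observed entries before a rank-r projection removes much of that bias. The sketch below is an illustrative rendition on synthetic data, with the sampling rate p, the true rank r, the penalty λ, and soft-impute as the convex solver all assumed for the demonstration -- it is not this paper's exact procedure or tuning.

```python
import numpy as np

rng = np.random.default_rng(2)
n, r, p, sigma = 80, 2, 0.5, 0.5
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # low-rank ground truth
mask = rng.random((n, n)) < p                                  # Bernoulli(p) sampling
Y = np.where(mask, M + sigma * rng.standard_normal((n, n)), 0.0)

def svt(X, tau):
    """Singular value thresholding."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# soft-impute: a standard solver for nuclear-norm-penalized least squares
lam = 10.0                      # deliberately large so the shrinkage bias is visible
Z = np.zeros((n, n))
for _ in range(300):
    Z = svt(np.where(mask, Y, Z), lam)

# de-bias: add back the (1/p)-scaled residual on observed entries, project to rank r
D = Z + np.where(mask, Y - Z, 0.0) / p
U_, s_, Vt_ = np.linalg.svd(D, full_matrices=False)
Z_deb = U_[:, :r] @ np.diag(s_[:r]) @ Vt_[:r, :]

err_cvx = np.linalg.norm(Z - M) / np.linalg.norm(M)
err_deb = np.linalg.norm(Z_deb - M) / np.linalg.norm(M)
```

    In this toy run the de-biased estimate is markedly closer to the truth than the penalized one; the paper's contribution is to characterize the distribution of such de-biased estimates precisely enough to build confidence intervals.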

    Iterative Resource Allocation Algorithm for EONs Based on a Linearized GN Model

    Elastic optical networks (EONs) rely on efficient resource planning to meet future communication needs and to avoid resource overprovisioning. Estimation of physical-layer impairments (PLIs) plays an important role in the network planning stage of EONs. Traditionally, the transmission reach (TR) and Gaussian noise (GN) models have been broadly employed to estimate PLIs. However, the TR model cannot estimate PLIs accurately, whereas the GN model is incompatible with state-of-the-art linear optimization solvers. In this paper, we propose a physical-layer estimation model based on the GN model, referred to as the conservative linearized Gaussian noise (CLGN) model. To address the routing, spectrum, and regeneration assignment problem while accounting for PLIs, we introduce a link-based mixed-integer linear programming formulation employing the CLGN model, whose heavy computational burden is relieved by a heuristic approach referred to as the sequential iterative optimization algorithm. We show through simulation that network resources such as spectrum and regeneration nodes can be saved by utilizing the CLGN model rather than the TR model. Our proposed heuristic algorithm speeds up the optimization process and provides better resource usage than state-of-the-art algorithms on benchmark networks.