
    New JLS-Factor Model versus the Standard JLS Model: A Case Study on Chinese Stock Bubbles

    In this paper, we extend the Johansen-Ledoit-Sornette (JLS) model, a flexible tool for detecting bubbles and predicting regime changes in financial markets, by introducing fundamental economic factors in China (including the interest rate and the deposit reserve rate) and the historical volatilities of the targeted and US equity indices into the original model. We then derive a general method to incorporate these selected factors alongside the log-periodic power law signature of herding, and compare the prediction accuracy of the critical time between the original JLS model and the new model (termed the JLS-factor model) by fitting both to two well-known Chinese stock indices over three bubble periods. The results show that the JLS-factor model with Chinese characteristics successfully depicts the evolution of bubbles and “antibubbles” and constructs efficient end-of-bubble signals for all bubbles in Chinese stock markets. In addition, the results of standard statistical tests demonstrate the excellent explanatory power of these additional factors and confirm that the new JLS model provides useful improvements over the standard JLS model.
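    For context, the log-periodic power law (LPPL) signature of herding mentioned above is conventionally written as below. The second equation is only a hypothetical sketch of how extra regressors X_i(t) (e.g. the interest rate, deposit reserve rate, or historical volatilities) might enter a factor-augmented specification; the paper's exact JLS-factor formulation is not reproduced here.

```latex
% Standard JLS / LPPL form of the expected log-price before the critical time t_c
\ln p(t) = A + B\,(t_c - t)^{m} + C\,(t_c - t)^{m}\cos\!\bigl(\omega \ln(t_c - t) - \phi\bigr),
\qquad t < t_c .

% Hypothetical factor-augmented variant (illustrative only), with additional factors X_i(t)
% and loadings \gamma_i:
\ln p(t) = A + B\,(t_c - t)^{m} + C\,(t_c - t)^{m}\cos\!\bigl(\omega \ln(t_c - t) - \phi\bigr)
         + \sum_{i} \gamma_i\, X_i(t) .
```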

    Unsupervised Summarization by Jointly Extracting Sentences and Keywords

    We present RepRank, an unsupervised graph-based ranking model for extractive multi-document summarization, in which the similarities between words, between sentences, and between words and sentences are estimated by the distances between their vector representations in a unified vector space. To obtain desirable representations, we propose a self-attention-based learning method that represents a sentence as the weighted sum of its word embeddings, with the weights concentrated on those words that better reflect the content of the document. We show that salient sentences and keywords can be extracted in a joint, mutually reinforcing process using our learned representations, and prove that this process always converges to a unique solution, leading to improved performance. A variant of the absorbing random walk and a corresponding sampling-based algorithm are also described to avoid redundancy and increase diversity in the summaries. Experimental results on multiple benchmark datasets show that RepRank achieves the best or comparable performance in terms of ROUGE.
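    As a rough illustration of the two ingredients described above, the sketch below builds sentence vectors as attention-weighted sums of word embeddings and then scores sentences and words by mutual reinforcement over cosine similarities. The function names and the simple softmax weighting are hypothetical stand-ins; this is not the authors' RepRank implementation or its learned self-attention.

```python
import numpy as np

def sentence_vectors(word_emb_per_sent):
    """Represent each sentence as a weighted sum of its word embeddings.

    word_emb_per_sent: list of (n_words_i, d) arrays. The softmax weights below are a
    crude stand-in for the learned self-attention described in the abstract.
    """
    sent_vecs = []
    for W in word_emb_per_sent:
        centroid = W.mean(axis=0)              # rough proxy for document content
        scores = W @ centroid                  # one relevance score per word
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        sent_vecs.append(weights @ W)          # attention-weighted sum
    return np.vstack(sent_vecs)

def joint_rank(sent_vecs, word_vecs, iters=50, damping=0.85):
    """Mutually reinforcing salience scores for sentences and words."""
    def cos(A, B):
        A = A / np.linalg.norm(A, axis=1, keepdims=True)
        B = B / np.linalg.norm(B, axis=1, keepdims=True)
        return np.clip(A @ B.T, 0.0, None)

    s2s, s2w = cos(sent_vecs, sent_vecs), cos(sent_vecs, word_vecs)
    s = np.ones(len(sent_vecs)) / len(sent_vecs)
    w = np.ones(len(word_vecs)) / len(word_vecs)
    for _ in range(iters):
        s_new = damping * (s2s @ s + s2w @ w) + (1 - damping)
        w_new = damping * (s2w.T @ s) + (1 - damping)
        s, w = s_new / s_new.sum(), w_new / w_new.sum()
    return s, w  # higher scores = more salient sentences / keywords
```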

    Coadjoint orbits and induced representations

    Thesis (Ph.D.) -- Massachusetts Institute of Technology, Dept. of Mathematics, 1993. Includes bibliographical references (leaves 82-84). By Zongyi Li.

    The Nonlocal Neural Operator: Universal Approximation

    Neural operator architectures approximate operators between infinite-dimensional Banach spaces of functions. They are gaining increased attention in computational science and engineering, due to their potential both to accelerate traditional numerical methods and to enable data-driven discovery. A popular variant of neural operators is the Fourier neural operator (FNO). Previous analysis proving universal operator approximation theorems for FNOs resorts to use of an unbounded number of Fourier modes and limits the basic form of the method to problems with periodic geometry. Prior work relies on intuition from traditional numerical methods, and interprets the FNO as a nonstandard and highly nonlinear spectral method. The present work challenges this point of view in two ways: (i) it introduces a new broad class of operator approximators, termed nonlocal neural operators (NNOs), which allow for operator approximation between functions defined on arbitrary geometries and include the FNO as a special case; and (ii) analysis of the NNOs shows that, provided the architecture includes computation of a spatial average (corresponding to retaining only a single Fourier mode in the special case of the FNO), it benefits from universal approximation. It is demonstrated that this theoretical result unifies the analysis of a wide range of neural operator architectures. Furthermore, it sheds new light on the role of nonlocality and its interaction with nonlinearity, paving the way for a more systematic exploration of nonlocality, both through the development of new operator-learning architectures and the analysis of existing and new ones.
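    A minimal sketch of the architectural point emphasized above: a single "nonlocal" layer that combines a pointwise linear map with a term driven by the spatial average of the input (the single retained Fourier mode in the FNO special case). The function name, shapes, and fixed random weights are assumptions for illustration, not the paper's NNO definition.

```python
import numpy as np

def nonlocal_layer(v, W_local, W_avg, b, activation=np.tanh):
    """One illustrative nonlocal operator layer.

    v:        (n_points, d_in) function values sampled on an arbitrary point set
    W_local:  (d_in, d_out)   pointwise linear weights
    W_avg:    (d_in, d_out)   weights applied to the spatial average of v
                              (the nonlocal term highlighted in the abstract)
    b:        (d_out,)        bias
    """
    mean_v = v.mean(axis=0, keepdims=True)   # spatial average over the domain
    return activation(v @ W_local + mean_v @ W_avg + b)

# Tiny usage example with random data (weights would normally be learned)
rng = np.random.default_rng(0)
v = rng.normal(size=(128, 3))                # 128 sample points, 3 input channels
out = nonlocal_layer(v, rng.normal(size=(3, 8)), rng.normal(size=(3, 8)), np.zeros(8))
print(out.shape)                             # (128, 8)
```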

    Conditional Linear Regression

    Work in machine learning and statistics commonly focuses on building models that capture the vast majority of the data, possibly ignoring a segment of the population as outliers. However, a good, simple model may not exist for the full distribution, so we seek a small subset on which such a model does exist. We give a computationally efficient algorithm, with theoretical analysis, for the conditional linear regression task: the joint task of identifying a significant portion of the data distribution, described by a k-DNF, along with a linear predictor on that portion with small loss. In contrast to work in robust statistics on small subsets, our loss bounds do not feature a dependence on the density of the portion we fit, and compared to previous work on conditional linear regression, our algorithm's running time scales polynomially with the sparsity of the linear predictor. We also demonstrate empirically that our algorithm can leverage this advantage to obtain a k-DNF with a better linear predictor in practice.
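    To make the task concrete, here is a brute-force toy version: enumerate small conjunctions over Boolean attributes (single terms of a k-DNF), fit a least-squares predictor on the rows each conjunction selects, and keep the condition with the lowest loss among those covering enough data. This only illustrates the objective; the paper's contribution is an efficient algorithm with guarantees, not this exhaustive search.

```python
import numpy as np
from itertools import combinations

def conditional_linear_regression_toy(Z, X, y, k=2, min_cover=0.2):
    """Toy illustration of conditional linear regression.

    Z: (n, m) Boolean attributes describing the sub-population (a conjunction here),
    X: (n, d) regression features, y: (n,) targets.
    Returns (attribute indices, weights, mse) for the best conjunction of at most k
    attributes covering at least a min_cover fraction of the data.
    """
    n, m = Z.shape
    best = None
    for r in range(1, k + 1):
        for lits in combinations(range(m), r):
            mask = Z[:, list(lits)].all(axis=1)
            if mask.mean() < min_cover:
                continue
            A = np.c_[X[mask], np.ones(mask.sum())]   # add intercept column
            w, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
            mse = np.mean((A @ w - y[mask]) ** 2)
            if best is None or mse < best[2]:
                best = (lits, w, mse)
    return best
```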

    ACL Recovery Aid

    This project changed its scope over the two quarters. Originally, it was a student-proposed project to create a low-cost brace to help patients after anterior cruciate ligament (ACL) reconstruction surgery. The brace was to be worn during the day and help stretch the ACL to speed up recovery. After meeting with Dr. McSorley, a physical therapist local to San Luis Obispo, we decided to change the scope of the project. Dr. McSorley noted that the recovery process for ACL patients stops at night when they go to sleep, so a brace that could be worn at night would improve recovery. The brace had to be adjustable for different extents of stretching and comfortable enough to be worn overnight. Most of the design was completed before the change in scope was made; however, most of the elements could be kept, since mainly the application changed. The brace fits around the knee and attaches at the upper and lower leg. A gear mechanism on the side allows the user to adjust the stretch of the knee. One important requirement Dr. McSorley raised was that the brace apply the correct forces on the leg to prevent reinjury of the ACL: a force on the top front of the knee and on the bottom back of the leg. The device attaches around the knee, and the user can adjust the brace so it stretches the knee to a comfortable degree. Although most testing was not possible because of the COVID-19 situation, the brace should work as intended. The brace is made of aluminum and attaches to the leg with Velcro straps to provide support and durability, and the padding provides comfort to the user.

    Fourier Neural Operator with Learned Deformations for PDEs on General Geometries

    Deep learning surrogate models have shown promise in solving partial differential equations (PDEs). Among them, the Fourier neural operator (FNO) achieves good accuracy and is significantly faster than numerical solvers on a variety of PDEs, such as fluid flows. However, the FNO uses the fast Fourier transform (FFT), which is limited to rectangular domains with uniform grids. In this work, we propose a new framework, viz., geo-FNO, to solve PDEs on arbitrary geometries. Geo-FNO learns to deform the input (physical) domain, which may be irregular, into a latent space with a uniform grid. The FNO model with the FFT is applied in the latent space. The resulting geo-FNO model has both the computational efficiency of the FFT and the flexibility to handle arbitrary geometries. Geo-FNO is also flexible in terms of its input formats: point clouds, meshes, and design parameters are all valid inputs. We consider a variety of PDEs such as the elasticity, plasticity, Euler, and Navier-Stokes equations, and both forward modeling and inverse design problems. Geo-FNO is 10^5 times faster than standard numerical solvers and twice as accurate as direct interpolation applied to existing ML-based PDE solvers such as the standard FNO.
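    The deform-then-FFT idea can be sketched in a few lines: map scattered physical samples onto a uniform latent grid, apply an FFT-based spectral filter there, and map the result back to the physical points. The interpolation-based "deformation" and the fixed mode truncation below are stand-ins for the learned components of geo-FNO, so this is only an illustration of the data flow, not the authors' model.

```python
import numpy as np
from scipy.interpolate import griddata

def geo_spectral_step(points, values, grid_n=64, keep_modes=8):
    """Illustrative deform -> FFT filter -> deform-back step for scattered 2D data.

    points: (n, 2) coordinates on an irregular domain inside [0, 1]^2
    values: (n,)   function samples at those points
    """
    # 1) "Deform" the irregular samples onto a uniform latent grid (geo-FNO learns this map)
    gx, gy = np.meshgrid(np.linspace(0, 1, grid_n), np.linspace(0, 1, grid_n))
    latent = griddata(points, values, (gx, gy), method="linear", fill_value=0.0)

    # 2) Spectral filtering with the FFT on the uniform grid (fixed truncation here,
    #    instead of learned Fourier weights)
    f = np.fft.fft2(latent)
    mask = np.zeros_like(f, dtype=bool)
    mask[:keep_modes, :keep_modes] = True
    filtered = np.real(np.fft.ifft2(np.where(mask, f, 0)))

    # 3) Map the filtered field back to the original physical sample points
    return griddata((gx.ravel(), gy.ravel()), filtered.ravel(), points, method="linear")
```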