
    Does a change in debt structure matter in earnings management? The application of nonlinear panel threshold test

    In this study, we apply Hansen's (1999) nonlinear panel threshold test, the most powerful test of its kind, to investigate the relationship between debt ratio and earnings management for 474 selected Taiwan-listed companies during the September 2002 - June 2005 period. Rather than the fixed positive relation determined from OLS, our empirical results strongly suggest that when a firm's debt ratio exceeds 46.79% and 62.17%, its debt structure changes, which in turn leads to changes in earnings management. With an increase in debt ratio, managers tend to manage earnings to a greater extent and at a higher speed. In other words, the threshold effect of debt on the relationship between debt ratio and earnings management generates an increasingly positive impact. These empirical results provide concerned investors and authorities with an enhanced understanding of earnings management, as manipulated by managers confronted with different debt structures.
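    To make the estimation idea concrete, the following is a minimal Python sketch of a single-threshold panel regression in the spirit of Hansen (1999): a grid search over candidate debt-ratio thresholds, with regime-specific slopes estimated on within-demeaned (fixed-effects) data. The variable names (firm, debt_ratio, earnings_mgmt), the toy data, and the single-threshold setup are illustrative assumptions, not the study's dataset or its double-threshold specification.

    # A minimal sketch of a single-threshold panel regression in the spirit of
    # Hansen (1999).  Column names and the toy panel are illustrative
    # assumptions, not the study's data.
    import numpy as np
    import pandas as pd

    def within_demean(df, cols, group):
        """Subtract group (firm) means: the fixed-effects within transformation."""
        return df[cols] - df.groupby(group)[cols].transform("mean")

    def threshold_ssr(df, gamma):
        """Within-SSR and regime slopes for a candidate threshold gamma."""
        low = (df["debt_ratio"] <= gamma).astype(float)
        d = pd.DataFrame({
            "x_low": df["debt_ratio"] * low,        # slope below the threshold
            "x_high": df["debt_ratio"] * (1 - low), # slope above the threshold
            "y": df["earnings_mgmt"],
            "firm": df["firm"],
        })
        Z = within_demean(d, ["x_low", "x_high", "y"], "firm")
        beta, *_ = np.linalg.lstsq(Z[["x_low", "x_high"]], Z["y"], rcond=None)
        resid = Z["y"] - Z[["x_low", "x_high"]] @ beta
        return float(resid @ resid), beta

    def estimate_threshold(df, grid):
        """Grid-search the threshold value that minimises the within-SSR."""
        results = [(g, *threshold_ssr(df, g)) for g in grid]
        gamma_hat, _, beta_hat = min(results, key=lambda r: r[1])
        return gamma_hat, beta_hat

    # Toy panel: 50 firms x 12 quarters with an artificial break at 0.5.
    rng = np.random.default_rng(0)
    n_firms, n_t = 50, 12
    debt = rng.uniform(0.1, 0.9, n_firms * n_t)
    em = 0.2 * debt * (debt <= 0.5) + 0.8 * debt * (debt > 0.5) \
         + rng.normal(0, 0.05, n_firms * n_t)
    panel = pd.DataFrame({"firm": np.repeat(np.arange(n_firms), n_t),
                          "debt_ratio": debt, "earnings_mgmt": em})

    gamma_hat, beta_hat = estimate_threshold(panel, np.linspace(0.2, 0.8, 61))
    print(f"estimated threshold: {gamma_hat:.3f}, regime slopes: {beta_hat}")

    For a two-threshold specification such as the one the abstract reports, the same search would typically be repeated sequentially, holding the first estimated threshold fixed while searching for the second.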

    Open-world Learning and Application to Product Classification

    Classic supervised learning makes the closed-world assumption: classes seen in testing must have been seen in training. In the dynamic real world, however, new or unseen class examples may appear constantly. A model working in such an environment must be able to reject unseen classes (those not seen or used in training) and, once enough data is collected for them, incrementally learn to accept and classify them. This learning paradigm is called open-world learning (OWL). Existing OWL methods all need some form of re-training to accept or include the new classes in the overall model. In this paper, we propose a meta-learning approach to the problem. Its key novelty is that it only needs to train a meta-classifier, which can then continually accept new classes once they have enough labeled data for the meta-classifier to use, and also detect/reject future unseen classes. No re-training of the meta-classifier, and no new overall classifier covering all old and new classes, is needed. In testing, the method only uses examples of the seen classes (including the newly added classes) on the fly for classification and rejection. Experimental results demonstrate the effectiveness of the new approach.
    Comment: Accepted by The Web Conference (WWW 2019). Previous title: Learning to Accept New Classes without Training
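    As a concrete illustration of the accept/reject mechanism the abstract describes, here is a minimal Python sketch in which a fixed similarity scorer stands in for the meta-classifier: seen classes are represented only by stored example vectors, a new class is added by storing its labeled examples with no retraining, and a query whose best class score falls below a threshold is rejected as unseen. The cosine-similarity scorer, the 0.7 threshold, and the toy vectors are illustrative assumptions, not the paper's trained meta-network.

    # A minimal sketch of example-based open-world classification with rejection.
    # The similarity function and threshold are illustrative assumptions.
    import numpy as np

    class OpenWorldClassifier:
        def __init__(self, reject_threshold=0.7):
            self.reject_threshold = reject_threshold
            self.memory = {}  # class label -> array of stored example vectors

        def add_class(self, label, examples):
            """Accept a new class by storing its labeled examples (no retraining)."""
            self.memory[label] = np.asarray(examples, dtype=float)

        def _score(self, x, examples):
            """Mean cosine similarity between x and a class's stored examples."""
            x = x / np.linalg.norm(x)
            e = examples / np.linalg.norm(examples, axis=1, keepdims=True)
            return float((e @ x).mean())

        def predict(self, x):
            """Return the best-matching seen class, or None to reject as unseen."""
            if not self.memory:
                return None
            x = np.asarray(x, dtype=float)
            scores = {label: self._score(x, ex) for label, ex in self.memory.items()}
            best = max(scores, key=scores.get)
            return best if scores[best] >= self.reject_threshold else None

    # Usage: two seen classes; an unseen query is rejected; the new class is
    # then added on the fly and the same query is accepted.
    clf = OpenWorldClassifier()
    clf.add_class("shoes",  [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]])
    clf.add_class("phones", [[0.0, 1.0, 0.1], [0.1, 0.9, 0.0]])
    query = [0.0, 0.1, 1.0]
    print(clf.predict(query))   # None -> rejected as an unseen class
    clf.add_class("laptops", [[0.0, 0.0, 1.0], [0.1, 0.1, 0.9]])
    print(clf.predict(query))   # "laptops" -> accepted after its examples are added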