45,945 research outputs found

    A COMPARATIVE STUDY ON TEACHING WRITING THROUGH INDUCTIVE AND DEDUCTIVE METHOD (An Experimental Study at the Tenth Grade of SMA Negeri 4 Surakarta in the 2009/2010 Academic Year)

    The objectives of this research are to find out whether there is any significant difference in writing achievement between students taught using the inductive method and those taught using the deductive method, and to determine which group achieves more highly, the group taught using the inductive method or the one taught using the deductive method. In line with these objectives, the writer used an experimental method. The research was conducted from April to May 2010. The population of this research was the tenth grade students of SMA Negeri 4 Surakarta in the 2009/2010 academic year. From the whole population, two classes of 36 students each were taken as the sample: class X-I as the experimental group and class X-J as the control group. The sampling technique used was cluster random sampling. The data were collected with a writing test and analyzed with the t-test formula. The results show that there is a significant difference in writing achievement between the students taught using the inductive method and those taught using the deductive method. The mean of the experimental group taught using the inductive method is 82, while the mean of the control group taught using the deductive method is 78.09, so the mean score of the experimental group is higher than that of the control group. It can therefore be concluded that the students taught using the inductive method achieve more highly than those taught using the deductive method.
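
    The analysis described in this abstract is a standard independent-samples comparison of two class means. As an illustration only (the thesis's raw scores are not reproduced here), the sketch below runs a two-sample t-test on simulated writing scores for two classes of 36 students; the score arrays are invented placeholders, not the study's data.

```python
# Hypothetical illustration of the two-sample t-test used to compare the
# writing scores of the inductive (experimental) and deductive (control)
# groups. The score arrays below are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
inductive_scores = rng.normal(loc=82.0, scale=6.0, size=36)   # experimental class
deductive_scores = rng.normal(loc=78.1, scale=6.0, size=36)   # control class

# Independent two-sample t-test (equal variances assumed, as in the
# classical t-test formula).
t_stat, p_value = stats.ttest_ind(inductive_scores, deductive_scores, equal_var=True)

print(f"mean (inductive) = {inductive_scores.mean():.2f}")
print(f"mean (deductive) = {deductive_scores.mean():.2f}")
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```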

    Relative Entropy and Inductive Inference

    We discuss how the method of maximum entropy, MaxEnt, can be extended beyond its original scope, as a rule to assign a probability distribution, to a full-fledged method for inductive inference. The main concept is the (relative) entropy S[p|q], which is designed as a tool to update from a prior probability distribution q to a posterior probability distribution p when new information in the form of a constraint becomes available. The extended method goes beyond the mere selection of a single posterior p and also addresses the question of how much less probable other distributions might be. Our approach clarifies how the entropy S[p|q] is used while avoiding the question of its meaning. Ultimately, entropy is a tool for induction which needs no interpretation. Finally, being a tool for generalization from special examples, we ask whether the functional form of the entropy depends on the choice of the examples, and we find that it does. The conclusion is that there is no single general theory of inductive inference and that alternative expressions for the entropy are possible.
    Comment: Presented at MaxEnt23, the 23rd International Workshop on Bayesian Inference and Maximum Entropy Methods (August 3-8, 2003, Jackson Hole, WY, USA).
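
    As a concrete illustration of the update rule the abstract describes, the sketch below maximizes the relative entropy S[p|q] over a finite set of outcomes subject to a single expectation constraint. The prior q, the constraint function f, and the target value F are placeholders chosen for the example; the exponential-tilting form of the maximizer is the standard textbook solution, not code taken from the paper.

```python
# Minimal sketch: maximum (relative) entropy update of a prior q to a
# posterior p under one expectation constraint sum_i p_i * f_i = F.
# The maximizer has the exponential form p_i ∝ q_i * exp(lam * f_i);
# we solve for the multiplier lam that matches the constraint.
import numpy as np
from scipy.optimize import brentq

q = np.array([0.4, 0.3, 0.2, 0.1])   # prior distribution (placeholder)
f = np.array([1.0, 2.0, 3.0, 4.0])   # constraint function values (placeholder)
F = 2.5                              # required expectation of f (placeholder)

def tilted(lam):
    w = q * np.exp(lam * f)
    return w / w.sum()

def constraint_gap(lam):
    return tilted(lam) @ f - F

lam = brentq(constraint_gap, -50.0, 50.0)   # find the multiplier with <f>_p = F
p = tilted(lam)

print("posterior p:", np.round(p, 4))
print("<f>_p =", round(float(p @ f), 4))
```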

    A Correction Rule for Inductive Methods

    I will discuss the problem of choosing the correct inductive method from Carnap's (1952) continuum. My proposal is to use a correction rule to adjust the method according to the obtained evidence. I will discuss a minimum requirement such a rule has to satisfy, especially from a constructive point of view. The question of refuting inductive scepticism by means of a correction rule is assessed.
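
    For context, each inductive method in Carnap's (1952) continuum is indexed by a parameter λ and predicts the next observation by c(i | e) = (nᵢ + λ/k)/(n + λ), where nᵢ is the number of observed instances of attribute i, n the total number of observations, and k the number of attributes. The sketch below implements only this predictor for a few values of λ; the paper's correction rule for adjusting λ is not reproduced in the abstract and is not attempted here.

```python
# Carnap's lambda-continuum of inductive methods: the predictive
# probability of attribute i after evidence e is
#     c(i | e) = (n_i + lam / k) / (n + lam),
# where n_i counts observed instances of i, n is the total sample size,
# and k is the number of attributes.  Small lam stays close to the
# observed frequencies; large lam stays close to the uniform prior 1/k.
from collections import Counter

def carnap_predict(evidence, attributes, lam):
    counts = Counter(evidence)
    n, k = len(evidence), len(attributes)
    return {a: (counts[a] + lam / k) / (n + lam) for a in attributes}

evidence = ["red", "red", "blue", "red"]      # placeholder observations
attributes = ["red", "blue", "green"]

for lam in (0.5, 2.0, 10.0):
    print(lam, carnap_predict(evidence, attributes, lam))
```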

    Impredicative Encodings of (Higher) Inductive Types

    Postulating an impredicative universe in dependent type theory allows System F style encodings of finitary inductive types, but these fail to satisfy the relevant η-equalities and consequently do not admit dependent eliminators. To recover η and dependent elimination, we present a method to construct refinements of these impredicative encodings, using ideas from homotopy type theory. We then extend our method to construct impredicative encodings of some higher inductive types, such as the 1-truncation and the unit circle S¹.
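
    The "System F style encodings" mentioned in the abstract represent an inductive type by its own iterator; for the natural numbers the classic impredicative encoding is Nat = ∀X. (X → X) → X → X. The untyped Python sketch below mimics that encoding with plain functions purely to convey the idea; it cannot express the impredicative typing, the η-equality, or the dependent eliminator that the paper is actually concerned with.

```python
# Untyped analogue of the System F (Church) encoding of the naturals:
# a numeral takes a "successor" s and a "zero" z and iterates s n times
# over z, mirroring Nat = forall X. (X -> X) -> X -> X.
zero = lambda s: lambda z: z

def succ(n):
    return lambda s: lambda z: s(n(s)(z))

def to_int(n):
    """Non-dependent eliminator: iterate the concrete successor over 0."""
    return n(lambda x: x + 1)(0)

def add(m, n):
    """Addition by iterating succ m times starting from n."""
    return m(succ)(n)

three = succ(succ(succ(zero)))
print(to_int(three))                    # 3
print(to_int(add(three, succ(zero))))   # 4
```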

    A Generalized Method for Integrating Rule-based Knowledge into Inductive Methods Through Virtual Sample Creation

    Hybrid learning methods use theoretical knowledge of a domain together with a set of classified examples to develop a classifier. Methods that use domain knowledge have been shown to perform better than purely inductive learners. However, there is no general way to incorporate domain knowledge into every inductive learning algorithm, because existing hybrid methods are highly specialized for a particular algorithm. We present an algorithm that takes domain knowledge in the form of propositional rules, generates artificial examples from those rules, and also removes instances likely to be flawed. The enriched dataset can then be used by any learning algorithm. Experimental results for different scenarios show that this method is more effective than simple inductive learning.
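
    The abstract describes turning propositional rules into artificial training examples that any inductive learner can consume. The sketch below is a simplified, hypothetical version of that idea: each rule is an (attribute-value condition, label) pair, virtual samples are generated by completing the condition with random values for the unspecified attributes, and real instances whose labels contradict a matching rule are dropped. The rule format, sampling scheme, and filtering criterion are assumptions made for illustration, not the paper's algorithm.

```python
# Hypothetical sketch of rule-based virtual sample creation: each rule is
# (conditions, label); virtual examples satisfy the rule's conditions and
# fill the remaining attributes at random, while real examples whose labels
# contradict a matching rule are filtered out as likely flawed.
import random

ATTRIBUTES = {"color": ["red", "green", "blue"], "size": ["small", "large"]}
RULES = [({"color": "red"}, "positive"), ({"color": "blue"}, "negative")]

def virtual_samples(rules, attributes, per_rule=5, seed=0):
    rng = random.Random(seed)
    samples = []
    for conditions, label in rules:
        for _ in range(per_rule):
            x = {a: rng.choice(vals) for a, vals in attributes.items()}
            x.update(conditions)            # force the rule's conditions
            samples.append((x, label))
    return samples

def filter_conflicts(dataset, rules):
    """Drop real instances whose label contradicts a matching rule."""
    def matches(x, conditions):
        return all(x.get(a) == v for a, v in conditions.items())
    return [(x, y) for x, y in dataset
            if all(y == label for conditions, label in rules if matches(x, conditions))]

real = [({"color": "red", "size": "large"}, "negative")]   # contradicts the first rule
enriched = filter_conflicts(real, RULES) + virtual_samples(RULES, ATTRIBUTES)
print(len(enriched), "examples after enrichment")
```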