
    A Semantic Information Formula Compatible with Shannon and Popper's Theories

    The semantic information conveyed by everyday language has been studied for many years; yet, we still lack a practical formula to measure the information of a simple sentence or prediction, such as "There will be heavy rain tomorrow". For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh's fuzzy set theory and P. Z. Wang's random set falling shadow theory, and carries forward the ideas of C. E. Shannon and K. Popper. The fuzzy set's probability defined by Zadeh is treated as the logical probability sought by Popper, and the membership grade is treated as the truth-value of a proposition and also as the posterior logical probability. The classical relative information formula, Information = log(posterior probability / prior probability), is revised into the SIF by replacing the posterior probability with the membership grade and the prior probability with the fuzzy set's probability. The SIF can be read as "Information = testing severity - relative squared deviation" and hence can serve as Popper's information criterion for testing scientific theories or propositions. The information measure defined by the SIF also denotes the saved codeword length, as with the classical information measure. This paper introduces the set-Bayes formula, which establishes the relationship between statistical probability and logical probability, derives the Fuzzy Information Criterion (FIC) for optimizing the semantic channel, and discusses applications of the SIF and FIC in areas such as linguistic communication, prediction, estimation, testing, GPS, translation, and fuzzy reasoning. In particular, a detailed reasoning example shows that we can improve the semantic channel with proper fuzziness to increase the average semantic information toward its upper limit: the Shannon mutual information.
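The core relation in the abstract, the classical Information = log(posterior / prior) with the posterior replaced by the membership grade and the prior by the fuzzy set's probability, can be sketched in a few lines of Python. The function name and the illustrative numbers below are assumptions for illustration only, not values from the paper.

```python
import math

def semantic_information(membership_grade, fuzzy_set_probability):
    """SIF as described in the abstract: the classical relative information
    log(posterior / prior), with the posterior probability replaced by the
    membership grade (the truth-value of the proposition) and the prior by
    the fuzzy set's (logical) probability."""
    return math.log2(membership_grade / fuzzy_set_probability)

# "There will be heavy rain tomorrow" -- illustrative numbers only:
truth_value = 0.9          # membership grade of the observed rainfall in "heavy rain"
logical_probability = 0.2  # prior (fuzzy set's) probability of "heavy rain"
info_bits = semantic_information(truth_value, logical_probability)  # about 2.17 bits
```

Note that when the membership grade falls below the fuzzy set's probability, the formula yields negative information, since the log of a ratio below one is negative.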

    Bayes Factors for Forensic Decision Analyses with R

    Bayes Factors for Forensic Decision Analyses with R provides a self-contained introduction to computational Bayesian statistics using R. With its primary focus on Bayes factors supported by data sets, this book takes an operational perspective that emphasizes practical relevance and applicability, keeping theoretical and philosophical justifications brief. It offers a balanced approach to three naturally interrelated topics:
    – Probabilistic Inference: relies on the core concepts of Bayesian inferential statistics to help practicing forensic scientists evaluate the weight of evidence logically and in a balanced way.
    – Decision Making: shows how Bayes factors are interpreted in practical applications to address questions of decision analysis involving the use of forensic science in the law.
    – Operational Relevance: combines inference and decision, backed by practical examples and complete sample code in R, including sensitivity analyses and discussion of how to interpret results in context.
    Over the past decades, probabilistic methods have established a firm position as a reference approach for the management of uncertainty in virtually all areas of science, including forensic science, with Bayes' theorem providing the fundamental logical tenet for assessing how new information, that is, scientific evidence, ought to be weighed. Central to this approach is the Bayes factor, which clarifies the evidential meaning of new information by providing a measure of the change in the odds in favor of a proposition of interest when going from the prior to the posterior distribution. Bayes factors should guide the scientist's thinking about the value of scientific evidence and form the basis of logical and balanced reporting practices, thus representing essential foundations for rational decision making under uncertainty.
    This book is relevant to students, practitioners, and applied statisticians interested in inference and decision analyses in the critical field of forensic science. It can be used to support practical courses on Bayesian statistics and decision theory at both undergraduate and graduate levels, and will be of equal interest to forensic scientists and to practitioners of Bayesian statistics using R for their evaluations.
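The book's examples are in R; as a language-neutral sketch of the central idea, the Bayes factor is a likelihood ratio under two competing propositions that converts prior odds into posterior odds. The normal likelihood model and all numbers below are illustrative assumptions, not material from the book.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution, used here as a simple likelihood model."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_factor(e, mu_hp, mu_hd, sigma):
    """BF = P(E | Hp) / P(E | Hd): the factor by which evidence E changes the
    prior odds in favour of proposition Hp into posterior odds."""
    return normal_pdf(e, mu_hp, sigma) / normal_pdf(e, mu_hd, sigma)

# Illustrative: a measurement e = 1.0 that is more probable under Hp
# (mean 1.0) than under Hd (mean 3.0), with unit standard deviation.
bf = bayes_factor(1.0, mu_hp=1.0, mu_hd=3.0, sigma=1.0)  # about 7.39
prior_odds = 0.5                   # prior odds for Hp versus Hd
posterior_odds = bf * prior_odds   # odds after weighing the evidence
```

A BF above one shifts the odds toward Hp, below one toward Hd; the posterior odds are simply the prior odds multiplied by the Bayes factor, which is the "change in the odds" described in the abstract.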

    Weighted Class Complexity: A Measure of Complexity for Object Oriented System

    Software complexity metrics are used to predict critical information about the reliability and maintainability of software systems. Object-oriented software development requires a different approach to software complexity metrics. In this paper, we propose a metric that computes the structural and cognitive complexity of a class by associating a weight with the class, called Weighted Class Complexity (WCC). In contrast to other metrics used for object-oriented systems, the proposed metric calculates the complexity of a class due to its methods and attributes in terms of cognitive weight. The proposed metric is demonstrated with object-oriented examples. Theoretical and practical evaluations based on information theory show that the proposed metric is on a ratio scale and satisfies most of the parameters required by measurement theory.
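A WCC-style measure can be sketched schematically. The cognitive weights below (sequence = 1, branch = 2, iteration = 3) follow the commonly used cognitive-weight scheme for basic control structures; the combination rule of attributes and method weights is an assumption for illustration, not the paper's exact definition.

```python
def method_cognitive_weight(n_sequences, n_branches, n_iterations):
    """Cognitive weight of one method from counts of its basic control
    structures, using the common weights: sequence 1, branch 2, iteration 3."""
    return 1 * n_sequences + 2 * n_branches + 3 * n_iterations

def weighted_class_complexity(n_attributes, method_structures):
    """Schematic WCC: attribute count plus the cognitive weights of all
    methods. method_structures is a list of (sequences, branches,
    iterations) tuples, one per method."""
    return n_attributes + sum(method_cognitive_weight(*m) for m in method_structures)

# A class with 3 attributes and two methods: one straight-line method,
# and one containing a branch inside a loop.
wcc = weighted_class_complexity(3, [(1, 0, 0), (1, 1, 1)])  # 3 + 1 + 6 = 10
```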

    Congregational bonding social capital and psychological type: an empirical enquiry among Australian churchgoers

    This study explores the variation in levels of bonding social capital experienced by individual churchgoers, drawing on data generated by the Australian National Church Life Survey and employing a five-item measure of church-related bonding social capital. Data provided by 2065 Australian churchgoers are used to test the thesis that individual differences in bonding social capital are related to psychological type (employing the Jungian distinctions). The data demonstrate that higher levels of bonding social capital were found among extraverts (compared with introverts), among intuitive types (compared with sensing types), and among feeling types (compared with thinking types), but no significant differences were found between judging types and perceiving types.

    How Far Can We Go Through Social System?

    The paper elaborates an endeavor to apply algorithmic information-theoretic computational complexity to the meta-level study of the social sciences. It is motivated by the effort to trace the impact of the well-known incompleteness theorem on the scientific methodology used to approach social phenomena. The paper uses the binary string as a model of social phenomena to gain understanding of some problems faced in the philosophy of social science and of some traps in sociological theories. The paper concludes by showing the great opportunities in recent social research and some boundaries that limit them.