
    Bayesian Synthesis: Combining subjective analyses, with an application to ozone data

    Bayesian model averaging enables one to combine the disparate predictions of a number of models in a coherent fashion, leading to superior predictive performance. The improvement in performance arises from averaging models that make different predictions. In this work, we tap into perhaps the biggest driver of different predictions---different analysts---in order to gain the full benefits of model averaging. In a standard implementation of our method, several data analysts work independently on portions of a data set, eliciting separate models which are eventually updated and combined through a specific weighting method. We call this modeling procedure Bayesian Synthesis. The methodology helps to alleviate concerns about the sizable gap between the foundational underpinnings of the Bayesian paradigm and the practice of Bayesian statistics. In experimental work we show that human modeling has predictive performance superior to that of many automatic modeling techniques, including AIC, BIC, Smoothing Splines, CART, Bagged CART, Bayes CART, BMA and LARS, and only slightly inferior to that of BART. We also show that Bayesian Synthesis further improves predictive performance. Additionally, we examine the predictive performance of a simple average across analysts, which we dub Convex Synthesis, and find that it also produces an improvement. Comment: Published at http://dx.doi.org/10.1214/10-AOAS444 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
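    The Convex Synthesis idea described above, a simple unweighted average across analysts, can be sketched in a few lines; the per-analyst prediction vectors here are made-up toy numbers, not the paper's ozone data, and the function name is hypothetical.

```python
# Minimal sketch of Convex Synthesis: combine independent analysts'
# prediction vectors by an unweighted (convex, equal-weight) average.
import numpy as np

def convex_synthesis(predictions):
    """Average a list of per-analyst prediction vectors element-wise."""
    stacked = np.stack(predictions)   # shape: (n_analysts, n_points)
    return stacked.mean(axis=0)

# Toy example: three analysts predict ozone levels at four sites.
analyst_preds = [
    np.array([31.0, 42.0, 55.0, 60.0]),
    np.array([29.0, 40.0, 57.0, 63.0]),
    np.array([33.0, 44.0, 53.0, 57.0]),
]
combined = convex_synthesis(analyst_preds)
print(combined)  # [31. 42. 55. 60.]
```

    Bayesian Synthesis proper replaces the equal weights with a specific posterior-based weighting method, which the abstract does not spell out.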

    Conceptualisation of intellectual capital in analysts’ narratives: a performative view

    Purpose: This study tests the performativity of Intellectual Capital (IC) from the perspective of sell-side analysts, a type of actor who consumes and creates IC information and in whose practice IC information plays a significant role. Design/methodology/approach: The empirical component of the study comprises a narrative analysis of the text of a large corpus of sell-side analysts’ initiation coverage reports. We adopt Mouritsen’s (2006) performative and ostensive conceptualisations of IC as our theoretical framework. Findings: We find that the identities and properties of IC elements are variable, dynamic and transformative. The relevance of IC elements in the eyes of analysts is conditional on the context, temporally contingent and bestowed indirectly. IC elements are related to firm value both directly, in a linear manner, and indirectly, via various non-linear interrelationships established with other IC elements, tangible capital and financial capital. Research limitations/implications: This study challenges the conventional IC research paradigm and contributes towards a performativity-inspired conceptualisation of IC and a resultant situated model of IC in place of a predictive model. Originality/value: This is the first study to apply a performative lens to study IC identities, roles and relationships from the perspective of a field of practice that is external to the organisation where IC is hosted. Examining IC from analysts’ perspective is important because not only can it provide an alternative perspective on IC, but it also enables an understanding of analysts’ field of practice.

    ShinyRBase: Near real-time energy saving models using reactive programming

    To document energy savings from retrofitting a building, a reliable baseline model is needed. The development and implementation of the baseline model is an important step in the measurement and verification (M&V) process. Usually, an energy analyst enters the stage, collects data, does the estimation and delivers the baseline model. The analyst's modeling work is done in either proprietary or open-source statistical software, often using a coding script. If stakeholders want an updated report on energy savings, the analyst must redo the whole process, for example on a monthly basis. This workflow is based on an imperative programming paradigm: the analyst holds on to the code that performs the analysis and reruns it when agreed upon. The consequence of this workflow is that stakeholders are dependent on the energy analyst and that updated energy savings results must be planned and scheduled. However, emerging M&V 2.0 technologies enable automation of energy saving reports. This paper demonstrates how energy savings from retrofits in the Norwegian food retail sector are continuously monitored and documented in a web application. The application is built using open-source tools, and the baseline model is delivered through a reactive programming framework. As the energy savings baseline model, the Tao Vanilla benchmarking model (TVB) was put into production in the web application. The TVB is a linear regression model with well-specified features that is easy to interpret and has a history of excellent prediction performance. The proposed web application framework allows for a fast development cycle without any need to know web programming languages like HTML, CSS or JavaScript. The reactive framework delivers several advantages. First, stakeholders always have a current, real-time report on the savings. Second, complex methodologies are applied dynamically by the end user. Third, increased stakeholder involvement and interaction with the analyst over the methods used in the energy savings analysis lead to collaborative benefits such as faster dissemination of knowledge. These synergy effects lead to a better technical understanding from the end-user perspective and an enhanced practical understanding for the analyst. Finally, the paper presents an integrated look at the energy (kWh) savings versus the cost of the retrofits.
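    The core M&V baseline calculation can be sketched as follows. This is a generic illustration, not the TVB specification (which the abstract does not detail): the single feature (outdoor temperature) and all numbers are illustrative assumptions.

```python
# Hedged sketch of the M&V baseline idea: fit a linear regression on
# pre-retrofit data, then report savings as baseline prediction minus
# metered post-retrofit use.
import numpy as np

def fit_baseline(temps, kwh):
    """Least-squares fit of kwh ~ a + b * temp on the pre-retrofit period."""
    X = np.column_stack([np.ones_like(temps), temps])
    coef, *_ = np.linalg.lstsq(X, kwh, rcond=None)
    return coef  # [intercept, slope]

def predict_baseline(coef, temps):
    return coef[0] + coef[1] * np.asarray(temps)

# Pre-retrofit period: colder days use more kWh (toy numbers).
pre_temps = np.array([0.0, 5.0, 10.0, 15.0])
pre_kwh   = np.array([100.0, 90.0, 80.0, 70.0])
coef = fit_baseline(pre_temps, pre_kwh)

# Post-retrofit period: metered use at comparable temperatures.
post_temps = np.array([0.0, 10.0])
metered    = np.array([85.0, 65.0])
savings = predict_baseline(coef, post_temps) - metered
print(savings)  # [15. 15.]
```

    In the reactive setup the paper describes, this recomputation would be triggered automatically whenever new metered data arrives, rather than rerun manually by the analyst.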

    Approximately Optimal Mechanism Design: Motivation, Examples, and Lessons Learned

    Optimal mechanism design enjoys a beautiful and well-developed theory, and also a number of killer applications. Rules of thumb produced by the field influence everything from how governments sell wireless spectrum licenses to how the major search engines auction off online advertising. There are, however, some basic problems for which the traditional optimal mechanism design approach is ill-suited --- either because it makes overly strong assumptions, or because it advocates overly complex designs. The thesis of this paper is that approximately optimal mechanisms allow us to reason about fundamental questions that seem out of reach of the traditional theory. This survey has three main parts. The first part describes the approximately optimal mechanism design paradigm --- how it works, and what we aim to learn by applying it. The second and third parts of the survey cover two case studies, where we instantiate the general design paradigm to investigate two basic questions. In the first example, we consider revenue maximization in a single-item auction with heterogeneous bidders. Our goal is to understand if complexity --- in the sense of detailed distributional knowledge --- is an essential feature of good auctions for this problem, or alternatively if there are simpler auctions that are near-optimal. The second example considers welfare maximization with multiple items. Our goal here is similar in spirit: when is complexity --- in the form of high-dimensional bid spaces --- an essential feature of every auction that guarantees reasonable welfare? Are there interesting cases where low-dimensional bid spaces suffice? Comment: Based on a talk given by the author at the 15th ACM Conference on Economics and Computation (EC), June 201
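    One standard member of the "simpler auctions" family contrasted with distribution-tailored optima is the second-price auction with a single anonymous reserve price. The sketch below is a generic illustration of that mechanism under assumed toy bids, not the paper's own analysis.

```python
# Second-price auction with an anonymous reserve: the highest bid at or
# above the reserve wins, and the winner pays the larger of the reserve
# and the second-highest eligible bid.
def second_price_with_reserve(bids, reserve):
    """Return (winner_index, payment), or (None, 0.0) if no bid meets the reserve."""
    eligible = [(b, i) for i, b in enumerate(bids) if b >= reserve]
    if not eligible:
        return None, 0.0
    eligible.sort(reverse=True)
    winner = eligible[0][1]
    runner_up = eligible[1][0] if len(eligible) > 1 else reserve
    return winner, max(runner_up, reserve)

print(second_price_with_reserve([3.0, 7.0, 5.0], reserve=4.0))  # (1, 5.0)
```

    The point of the survey's first case study is to ask how much revenue such a simple, distribution-light mechanism forfeits relative to the optimal auction that exploits detailed distributional knowledge.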

    Software visualisation for object-oriented program comprehension

    Software visualisation is the process of modelling software systems for comprehension. The comprehension of software systems both during and after development is a crucial component of the software process. The complex interactions inherent in the object-oriented paradigm make visualisation a particularly appropriate comprehension technique, and the large volume of information typically generated during visualisation necessitates tool support

    Affine arithmetic-based methodology for energy hub operation-scheduling in the presence of data uncertainty

    In this study, the role of self-validated computing for solving the energy hub-scheduling problem in the presence of multiple and heterogeneous sources of data uncertainties is explored, and a new solution paradigm based on affine arithmetic is conceptualised. The benefits deriving from the application of this methodology are analysed in detail, and several numerical results are presented and discussed.
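    The affine-arithmetic idea behind such self-validated computing can be sketched briefly. Each uncertain quantity is an affine form x0 + Σᵢ xᵢ·εᵢ with noise symbols εᵢ ∈ [-1, 1], so linear operations track correlations between shared uncertainty sources exactly. The class below is a minimal illustration (addition and subtraction only); the "price" noise symbol and load figures are invented for the example, not drawn from the study.

```python
# Minimal affine-arithmetic sketch: an affine form keeps a center value
# plus a coefficient per named noise symbol, each symbol ranging in [-1, 1].
class AffineForm:
    def __init__(self, center, partials=None):
        self.center = center
        self.partials = dict(partials or {})  # noise symbol -> coefficient

    def __add__(self, other):
        out = dict(self.partials)
        for sym, c in other.partials.items():
            out[sym] = out.get(sym, 0.0) + c
        return AffineForm(self.center + other.center, out)

    def __sub__(self, other):
        out = dict(self.partials)
        for sym, c in other.partials.items():
            out[sym] = out.get(sym, 0.0) - c
        return AffineForm(self.center - other.center, out)

    def bounds(self):
        """Guaranteed enclosure: center +/- sum of absolute coefficients."""
        radius = sum(abs(c) for c in self.partials.values())
        return self.center - radius, self.center + radius

# Two loads sharing one uncertainty source: the shared noise symbol
# cancels in the difference, which plain interval arithmetic cannot see.
load_a = AffineForm(10.0, {"price": 2.0})
load_b = AffineForm(4.0,  {"price": 2.0})
print((load_a - load_b).bounds())  # (6.0, 6.0)
print((load_a + load_b).bounds())  # (10.0, 18.0)
```

    This correlation tracking is what makes affine arithmetic attractive for scheduling under multiple heterogeneous uncertainty sources: enclosures stay tight where naive intervals would blow up.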

    RELEVANCE OF POLICY ANALYSIS: NEEDS FOR DESIGN, IMPLEMENTATION AND PACKAGING

    This article challenges the traditional model of the economist as a humble technocrat who simply provides analysis given the preferences of policy decision-makers. Since decision-makers rarely reveal their preferences, it is important that the would-be policy researcher/analyst know the political economy and be willing to identify potential performance goals for society. Researchers who are willing to incur the transaction cost associated with becoming involved in useful policy research must learn to work within the imperfect policy process. Policy research that considers the importance of implementation and that acknowledges institutions and history will have the highest chance of being useful to policy-makers. Keywords: Implementation, Institutions, Policy research, Political economy, Agricultural and Food Policy.

    Where did that come from? Countertransference and the Oedipal triangle in family therapy

    Family systems therapists are uncomfortable using psychoanalytic terms, and this reluctance restricts discussion of therapeutic process. How does one describe, for example, the therapist’s subjective experiences of the patient or family? Psychoanalysts call this countertransference, yet there is no equivalent word commonly used in systemic practice. Therapists who avoid the word may also avoid the experience, and thereby risk losing sight of fundamental clinical events.