
    The information continuum model of evolution

    Most biologists agree that evolution is contingent on inherited information shaped by natural selection. This apparent consensus could be taken to indicate agreement on the forces shaping evolution, but lively discussions reveal divergences in how evolution is perceived. The predominant Modern Synthesis (MS) paradigm holds that evolution occurs through random changes acting on genomic inheritance. However, studies from recent decades have revealed that evolutionary inheritance also includes DNA methylation, RNA, symbionts, and culture, among other factors. This has fuelled demand for a broader evolutionary perspective, for example from the proponents of the Extended Evolutionary Synthesis (EES). Despite fundamental disagreements, the different views agree that natural selection operates through dissimilar perpetuation of heritable information. Yet neither the MS nor the EES dwells extensively on the nature of hereditary information. We do, and conclude that information in and of itself is immaterial. We then argue that the quality upon which natural selection acts is therefore also immaterial. Based on these notions, we arrive at the information-centric Information Continuum Model (ICM) of evolution. The ICM asserts that hereditary information is embedded in diverse physical forms (DNA, RNA, symbionts, etc.) representing a continuum of evolutionary qualities, and that information may migrate between these physical forms. The ICM leaves theoretical exploration of evolution unrestricted by the limitations imposed by the individual physical forms in which the hereditary information is embedded (e.g. genomes). The ICM provides a simple heuristic model that adds explanatory dimensions to be considered in the evolution of biological systems.

    IT-enabled Process Innovation: A Literature Review

    The importance of Information Technology (IT) is growing, and in a hypercompetitive market IT must be used as a strategic asset for companies to succeed. In order to gain strategic benefits from IT, companies need to be innovative when deploying it. This can be achieved by reengineering business processes to take advantage of the possibilities IT provides. In 1993, Thomas H. Davenport presented a framework describing the role of IT in process innovation. Based on this framework, the purpose of this paper is to conduct a literature review to answer the following research question: What kinds of opportunities does IT provide for process innovation? Davenport's framework is used as an analytical lens to review articles from the top 20 IS and management journals. The paper provides an overview and an in-depth analysis of the literature on IT-enabled process innovation and suggests avenues for future research as well as recommendations for practitioners. Our analyses reveal five distinct themes related to opportunities for IT-enabled process innovation, all of which offer guidance to practitioners and highlight gaps in our current knowledge about how to leverage IT for innovation purposes.

    In search of alpha, upside beta, and other asymmetries: a test of an investment strategy in light of behavioural economics phenomena

    This master's thesis tests whether it is possible to achieve risk-adjusted excess returns in the Norwegian stock market through an investment strategy that holds companies that are cheap relative to earnings. The thesis is motivated by earlier studies indicating investor overreaction and violations of the efficient market hypothesis. A concentrated portfolio containing the 10 stocks with the lowest price-earnings ratio (P/E) is therefore held for one year before being rebalanced according to the same criteria. The main period studied consists of 240 monthly observations from 1999:3 to 2019:2. Two sub-periods are also formed, one before the financial crisis (n = 105) and one from the financial crisis to the end of the sample (n = 135). The main method used to measure risk-adjusted returns is a Pástor-Stambaugh 5-factor model. The model is estimated by ordinary least squares (OLS) with a linear specification and robust standard errors. The Sharpe Ratio of the portfolio and the benchmark index are also compared, in addition to computing the Information Ratio. The main result of the thesis is that the strategy does not achieve risk-adjusted excess returns in the main period. This is demonstrated by a statistically insignificant intercept in the 5-factor model and a marginally lower Sharpe Ratio for the portfolio compared with the benchmark index. The thesis applies a 5 % significance level to all t-tests. In sub-period 1, from 1999:3 to 2007:7, the portfolio achieves a monthly risk-adjusted excess return of 0.9 %, or 11.4 % annually. The Sharpe Ratio in this period is also considerably higher for the portfolio than for the benchmark index (0.259 versus 0.166). However, the excess return disappears in sub-period 2, from 2007:8 to 2019:2. The statistical software R was used to perform the quantitative analyses and produce the figures. Tables were made in Word. Keywords: excess returns, efficiency, factor models, overreaction, P/E
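    As a rough illustration of the evaluation step described in the abstract, the sketch below estimates the 5-factor alpha with OLS and robust standard errors and compares per-month Sharpe and Information Ratios. It is a minimal sketch in Python rather than the thesis's actual R code, and the factor and return column names (MKT, SMB, HML, MOM, LIQ, port_ret, bench_ret, rf) are hypothetical placeholders, not taken from the thesis.

```python
# Minimal sketch (not the thesis's actual R code) of the evaluation step:
# a 5-factor OLS regression with robust standard errors, plus Sharpe and
# Information Ratios. All column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def evaluate_strategy(df: pd.DataFrame) -> dict:
    # Monthly excess returns of the low-P/E portfolio and the benchmark index
    port_ex = df["port_ret"] - df["rf"]
    bench_ex = df["bench_ret"] - df["rf"]

    # 5-factor regression: port_ex = alpha + b'factors + e,
    # estimated by OLS with heteroskedasticity-robust standard errors
    X = sm.add_constant(df[["MKT", "SMB", "HML", "MOM", "LIQ"]])
    fit = sm.OLS(port_ex, X).fit(cov_type="HC1")

    # Per-month Sharpe Ratio (no annualisation applied here)
    sharpe = lambda r: r.mean() / r.std(ddof=1)

    # Information Ratio: mean active return divided by its tracking error
    active = port_ex - bench_ex
    info_ratio = active.mean() / active.std(ddof=1)

    return {
        "alpha_monthly": fit.params["const"],   # risk-adjusted excess return
        "alpha_p_value": fit.pvalues["const"],  # compare against the 5 % level
        "sharpe_portfolio": sharpe(port_ex),
        "sharpe_benchmark": sharpe(bench_ex),
        "information_ratio": info_ratio,
    }
```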

    Fast and effortless computation of profile likelihoods using CONNECT

    The frequentist method of profile likelihoods has recently received renewed attention in the field of cosmology. This is because the results of inferences based on the latter may differ from those of Bayesian inferences, either because of prior choices or because of non-Gaussianity in the likelihood function. Consequently, both methods are required for a fully nuanced analysis. However, in the last decades, cosmological parameter estimation has largely been dominated by Bayesian statistics due to the numerical complexity of constructing profile likelihoods, arising mainly from the need for a large number of gradient-free optimisations of the likelihood function. In this paper, we show how to accommodate the computational requirements of profile likelihoods using the publicly available neural network framework CONNECT together with a novel modification of the gradient-based basin-hopping optimisation algorithm. Apart from the reduced evaluation time of the likelihood due to the neural network, we also achieve an additional speed-up of 1-2 orders of magnitude compared to profile likelihoods computed with the gradient-free method of simulated annealing, with excellent agreement between the two. This allows for the production of typical triangle plots normally associated with Bayesian marginalisation within cosmology (and previously unachievable using likelihood maximisation because of the prohibitive computational cost). We have tested the setup on three cosmological models: the ΛCDM model, an extension with varying neutrino mass, and finally a decaying cold dark matter model. Given the default precision settings in CONNECT, we achieve a high precision in χ², with a difference to the results obtained by CLASS of Δχ² ≈ 0.2 (and, importantly, without any bias in inferred parameter values), easily good enough for profile likelihood analyses.
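    The profile-likelihood construction itself is conceptually simple: fix the parameter of interest on a grid and minimise χ² over all remaining parameters at each grid point. The sketch below illustrates that idea generically in Python with SciPy's basin-hopping optimiser; it does not use CONNECT's actual interface, and the chi2 function is a toy stand-in for a fast (e.g. neural-network emulated) likelihood evaluation.

```python
# Generic profile-likelihood sketch (not CONNECT's actual API): for each fixed
# value of the parameter of interest, minimise chi^2 over the remaining
# parameters with gradient-based basin-hopping.
import numpy as np
from scipy.optimize import basinhopping

def chi2(theta, nuisance):
    # Toy chi^2 surface; a real analysis would evaluate the (emulated) likelihood here.
    return (theta - 0.3) ** 2 / 0.01 + np.sum((nuisance - 1.0) ** 2)

def profile_likelihood(theta_grid, n_nuisance=3, n_iter=20):
    """Minimum chi^2 over the nuisance parameters at each fixed theta."""
    profile = []
    x0 = np.ones(n_nuisance)              # starting point for the nuisance parameters
    for theta in theta_grid:
        result = basinhopping(
            lambda x: chi2(theta, x),     # theta is held fixed at this grid point
            x0,
            niter=n_iter,
            minimizer_kwargs={"method": "L-BFGS-B"},  # gradient-based local minimiser
        )
        x0 = result.x                     # warm-start the next grid point
        profile.append(result.fun)
    return np.array(profile)

theta_grid = np.linspace(0.0, 0.6, 25)
profile = profile_likelihood(theta_grid)
delta_chi2 = profile - profile.min()      # Delta chi^2 relative to the best fit
```

    Confidence intervals then follow from thresholding the Δχ² curve (e.g. Δχ² ≤ 1 for a 68 % interval in the Gaussian approximation); the speed-ups reported in the paper come from replacing the expensive likelihood call and the gradient-free optimiser, not from changing this basic recipe.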