
    The Ontology of Intentional Agency in Light of Neurobiological Determinism: Philosophy Meets Folk Psychology

    The moot point of the Western philosophical rhetoric about free will consists in examining whether the claim of authorship to intentional, deliberative actions fits into, or is undermined by, a one-way causal framework of determinism. Philosophers who think that reconciliation between the two is possible are known as metaphysical compatibilists. However, there are philosophers populating the other end of the spectrum, known as the metaphysical libertarians, who maintain that the claim to intentional agency cannot be sustained unless it is assumed that indeterministic causal processes pervade the action-implementation apparatus employed by the agent. The metaphysical libertarians differ among themselves on the question of whether the indeterministic causal relation exists between the series of intentional states and processes, both conscious and unconscious, and the action, making a claim for what has come to be known as the event-causal view, or between the agent and the action, arguing that a sort of agent causation is at work. In this paper, I propose that certain features of both event-causal and agent-causal libertarian views need to be combined in order to provide a more defensible compatibilist account accommodating deliberative actions within deterministic causation. The 'agent-executed-event-causal libertarianism', the account of agency I develop here, integrates certain plausible features of the two competing accounts of libertarianism, turning them into a consistent whole. I hope to show in the process that the integration of these two variants of libertarianism does not challenge what some accounts of metaphysical compatibilism propose: that there exists a broader deterministic relation between, on the one hand, the web of mental and extra-mental components constituting the agent's dispositional system (her beliefs, desires, the short-term and long-term goals based on them, acquired social, cultural and religious beliefs, the general and the immediate situational environment in which she is placed, and so on) and, on the other, the decisions she makes over her lifetime on the basis of these factors. While the 'Introduction' briefly surveys the philosophically assumed anomaly between deterministic causation and the intentional act of deciding, the second section is devoted to the task of bridging the gap between compatibilism and libertarianism. The next section of the paper turns to an analysis of folk-psychological concepts and intuitions about the effects of neurochemical processes and prior mental events on the freedom of making choices. How philosophical insights can be beneficially informed by folk-psychological intuitions is also discussed, thus setting up the background for this analysis. It is suggested in the end that support for the proposed theory of intentional agency can be found in folk-psychological intuitions, when they are taken in the right perspective.

    Simulating complex social behaviour with the genetic action tree kernel

    The concept of genetic action trees combines action trees with genetic algorithms. In this paper, we create a multi-agent simulation on the basis of this concept and provide the interested reader with a software package for applying genetic action trees in a multi-agent simulation to simulate complex social behaviour. An example model is introduced to conduct a feasibility study of the described method. We find that our library can be used to simulate the behaviour of agents in a complex setting, and we observe convergence to a global optimum in spite of the absence of stable states.
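    The sketch below is purely illustrative: it evolves flat action sequences for simulated agents with a toy genetic algorithm, as a simplified stand-in for the paper's genetic action trees. The action set, payoff function and parameters are assumptions, not part of the described software package.

```python
# Toy genetic algorithm over agent action sequences (hypothetical setup).
import random

ACTIONS = ["cooperate", "defect", "trade", "idle"]   # hypothetical action set
SEQ_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 8, 30, 50, 0.1

def fitness(seq):
    # Hypothetical payoff: reward cooperation and trading, penalise idling.
    payoff = {"cooperate": 2.0, "trade": 1.5, "defect": 0.5, "idle": 0.0}
    return sum(payoff[a] for a in seq)

def mutate(seq):
    return [random.choice(ACTIONS) if random.random() < MUT_RATE else a for a in seq]

def crossover(a, b):
    cut = random.randrange(1, SEQ_LEN)
    return a[:cut] + b[cut:]

def evolve():
    pop = [[random.choice(ACTIONS) for _ in range(SEQ_LEN)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP_SIZE // 2]                 # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```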

    Use of linear mixed models for genetic evaluation of gestation length and birth weight allowing for heavy-tailed residual effects

    Background: The distribution of residual effects in linear mixed models in animal breeding applications is typically assumed normal, which makes inferences vulnerable to outlier observations. In order to mute the impact of outliers, one option is to fit models with residuals having a heavy-tailed distribution. Here, a Student's-t model was considered for the distribution of the residuals, with the degrees of freedom treated as unknown. Bayesian inference was used to investigate a bivariate Student's-t (BSt) model using Markov chain Monte Carlo methods in a simulation study, and analysing field data for gestation length and birth weight made it possible to study the practical implications of fitting heavy-tailed distributions for residuals in linear mixed models. Methods: In the simulation study, bivariate residuals were generated using a Student's-t distribution with 4 or 12 degrees of freedom, or a normal distribution. Sire models with bivariate Student's-t or normal residuals were fitted to each simulated dataset using a hierarchical Bayesian approach. For the field data, consisting of gestation length and birth weight records on 7,883 Italian Piemontese cattle, a sire-maternal grandsire model including fixed effects of sex-age of dam and uncorrelated random herd-year-season effects was fitted using a hierarchical Bayesian approach. Residuals were defined to follow bivariate normal or Student's-t distributions with unknown degrees of freedom. Results: Posterior mean estimates of the degrees of freedom parameters seemed to be accurate and unbiased in the simulation study. Estimates of sire and herd variances were similar, if not identical, across fitted models. In the field data, there was strong support, based on predictive log-likelihood values, for the Student's-t error model. Most of the posterior density for the degrees of freedom was below 4. Posterior means of direct and maternal heritabilities for birth weight were smaller in the Student's-t model than in the normal model. Re-rankings of sires were observed between heavy-tailed and normal models. Conclusions: Reliable estimates of degrees of freedom were obtained in all simulated heavy-tailed and normal datasets. The predictive log-likelihood was able to distinguish the correct model among the models fitted to heavy-tailed datasets. There was no disadvantage to fitting a heavy-tailed model when the true model was normal. Predictive log-likelihood values indicated that heavy-tailed models with low degrees of freedom fitted the gestation length and birth weight data better than a model with normally distributed residuals. Heavy-tailed and normal models resulted in different estimates of direct and maternal heritabilities, and different sire rankings. Heavy-tailed models may be more appropriate for reliable estimation of genetic parameters from field data.
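    As a minimal sketch of the heavy-tailed residual idea (not the paper's hierarchical MCMC implementation), bivariate Student's-t residuals can be simulated via the usual normal/chi-square scale mixture; the degrees of freedom and residual covariance below are hypothetical placeholders.

```python
# Simulate bivariate Student's-t residuals via a normal/chi-square scale mixture.
import numpy as np

rng = np.random.default_rng(1)
nu = 4                                   # degrees of freedom (heavy-tailed case)
Sigma = np.array([[1.0, 0.3],            # hypothetical residual covariance for the
                  [0.3, 2.0]])           # two traits (gestation length, birth weight)

def bivariate_t(n, nu, Sigma):
    z = rng.multivariate_normal(np.zeros(2), Sigma, size=n)   # normal component
    w = rng.chisquare(nu, size=n) / nu                        # mixing weights
    return z / np.sqrt(w)[:, None]                            # t-distributed residuals

e_t = bivariate_t(10_000, nu, Sigma)                          # heavy-tailed residuals
e_norm = rng.multivariate_normal(np.zeros(2), Sigma, size=10_000)
print((np.abs(e_t) > 4).mean(), (np.abs(e_norm) > 4).mean())  # more outliers under t
```

    The same scale-mixture representation is what makes the Student's-t model convenient in Gibbs sampling: conditional on the mixing weights, the residuals are Gaussian.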

    Individual Actions as Community Informative Resources. A Collective Informative Systems Approach

    This paper conceives of communities (in this case, partnerships) as being able to become collective informative repositories of individual and collective actions that may better inform their members. The paper presents one approach for studying whether a community has become such an informative repository. The approach used here consists of introducing a formal language (Viable Systems Modelling, VSM) into one of the community nodes (a participant) and tracing whether its use appears in another node (another participant), indicating the presence of a process of diffusion. This research design has been tested in a crime-reduction partnership in the UK. One of its members was asked to engage in the design and testing of this approach as a co-researcher. As a result, a questionnaire to map communication and control devices inside an organization was jointly developed. In keeping with VSM principles, the questionnaire encouraged participants to reflect on attenuation and amplification processes within their communication channels. To test the quality of the outcomes of this approach, members of another crime-reduction partnership were also invited to answer the survey; this was to confirm that VSM notions were not evident to those outside the development and testing of the questionnaire. The questionnaire also demonstrated its capability to make communication and organizational processes within collectives visible, and its potential to stimulate self-organization among those individuals who became familiar with VSM. Furthermore, this approach gave the authors the capability to study information flows inside the two collectives, and contributed to an understanding of these flows as a model for building and maintaining a Community Informative System.

    Intergenic and Genic Sequence Lengths Have Opposite Relationships with Respect to Gene Expression

    Eukaryotic genomes are mostly composed of noncoding DNA whose role is still poorly understood. Studies in several organisms have shown correlations between the length of the intergenic and genic sequences of a gene and the expression of its corresponding mRNA transcript. Some studies have found a positive relationship between intergenic sequence length and expression diversity between tissues, and concluded that genes under greater regulatory control require more regulatory information in their intergenic sequences. Other reports found a negative relationship between expression level and gene length, and the interpretation was that there is selection pressure for highly expressed genes to remain small. However, a correlation between gene sequence length and expression diversity, opposite to that observed for intergenic sequences, has also been reported, and to date there is no testable explanation for this observation. To shed light on these varied and sometimes conflicting results, we performed a thorough study of the relationships between sequence length and gene expression using cell-type (tissue) specific microarray data in Arabidopsis thaliana. We measured median gene expression across tissues (expression level), expression variability between tissues (expression pattern uniformity), and expression variability between replicates (expression noise). We found that intergenic (upstream and downstream) and genic (coding and noncoding) sequences have generally opposite relationships with respect to expression, whether it is tissue variability, median level, or expression noise. To explain these results we propose a model in which the lengths of the intergenic and genic sequences have opposite effects on the ability of the transcribed region of the gene to be epigenetically regulated for differential expression. These findings could shed light on the role and influence of noncoding sequences on gene expression.
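    A hedged illustration of the kind of analysis described above: Spearman correlations between sequence length and median expression or between-tissue variability, computed on a synthetic genes-by-tissues matrix. All data and variable names here are assumptions, not the authors' Arabidopsis dataset.

```python
# Toy length-vs-expression correlations on synthetic data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_genes, n_tissues = 1000, 20
expr = rng.lognormal(mean=2.0, sigma=1.0, size=(n_genes, n_tissues))  # toy expression
genic_len = rng.integers(500, 10_000, size=n_genes)        # toy genic lengths (bp)
intergenic_len = rng.integers(200, 20_000, size=n_genes)   # toy upstream lengths (bp)

median_expr = np.median(expr, axis=1)              # expression level
cv_expr = expr.std(axis=1) / expr.mean(axis=1)     # variability between tissues

for name, length in [("genic", genic_len), ("intergenic", intergenic_len)]:
    rho_level, _ = spearmanr(length, median_expr)
    rho_var, _ = spearmanr(length, cv_expr)
    print(f"{name}: rho(level) = {rho_level:.2f}, rho(variability) = {rho_var:.2f}")
```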

    RDR2 Partially Antagonizes the Production of RDR6-Dependent siRNA in Sense Transgene-Mediated PTGS

    Background: RNA-DEPENDENT RNA POLYMERASE6 (RDR6) and SUPPRESSOR OF GENE SILENCING 3 (SGS3) are required for DNA methylation and post-transcriptional gene silencing (PTGS) mediated by 21-nt siRNAs produced by sense transgenes (S-PTGS). In contrast, RDR2, but not RDR6, is required for DNA methylation and TGS mediated by 24-nt siRNAs, and for cell-to-cell spreading of IR-PTGS mediated by 21-nt siRNAs produced by inverted repeat transgenes under the control of a phloem-specific promoter. Principal Findings: In this study, we examined the role of RDR2 and RDR6 in S-PTGS. Unlike RDR6, RDR2 is not required for DNA methylation of transgenes subjected to S-PTGS. RDR6 is essential for the production of siRNAs by transgenes subjected to S-PTGS, but RDR2 also contributes to the production of transgene siRNAs when RDR6 is present, because rdr2 mutations reduce transgene siRNA accumulation. However, the siRNAs produced via RDR2 are likely counteractive in wild-type plants, because impairment of RDR2 increases S-PTGS efficiency at a transgenic locus that triggers limited silencing, and accelerates S-PTGS at a transgenic locus that triggers efficient silencing. Conclusions/Significance: These results suggest that RDR2 and RDR6 compete for RNA substrates produced by transgenes subjected to S-PTGS. RDR2 partially antagonizes RDR6 because RDR2 action likely results in the production of counteractive siRNAs.

    Proactive and integrated primary care for frail older people: design and methodological challenges of the Utrecht primary care PROactive frailty intervention trial (U-PROFIT)

    Background: Currently, primary care for frail older people is reactive, time consuming and does not meet patients' needs. A transition is needed towards proactive and integrated care, so that daily functioning and a good quality of life can be preserved. To work towards these goals, two interventions were developed to enhance the care of frail older patients in general practice: a screening and monitoring intervention using routine healthcare data (U-PRIM) and a nurse-led multidisciplinary intervention program (U-CARE). The U-PROFIT trial was designed to evaluate the effectiveness of these interventions. The aim of this paper is to describe the U-PROFIT trial design and to discuss methodological issues and challenges. Methods/Design: The effectiveness of U-PRIM and U-CARE is being tested in a three-armed, cluster randomized trial in 58 general practices in the Netherlands, with approximately 5000 elderly individuals expected to participate. The primary outcome is the effect on activities of daily living as measured with the Katz ADL index. Secondary outcomes are quality of life, mortality, nursing home admission, emergency department and out-of-hours general practice (GP) surgery visits, and caregiver burden. Discussion: In a large, pragmatic trial conducted in daily clinical practice with frail older patients, several challenges and methodological issues will occur. Recruitment and retention of patients and feasibility of the interventions are important issues. To enable broad generalizability of the results, careful choices of the design and outcome measures are required. Taking this into account, the U-PROFIT trial aims to provide robust evidence for a structured and integrated approach to provide care for frail older people in primary care. Trial registration: NTR2288 (http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=2288).
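    For context on why cluster size matters in such a design, the standard design-effect calculation is sketched below; the intracluster correlation is a hypothetical value, not one reported for U-PROFIT.

```python
# Back-of-the-envelope design effect for a cluster randomized trial.
n_practices = 58           # clusters (general practices), from the abstract
total_patients = 5000      # approximate expected participants, from the abstract
icc = 0.02                 # hypothetical intracluster correlation coefficient

mean_cluster_size = total_patients / n_practices
design_effect = 1 + (mean_cluster_size - 1) * icc    # variance inflation factor
effective_n = total_patients / design_effect         # effective sample size

print(f"mean cluster size = {mean_cluster_size:.1f}")
print(f"design effect     = {design_effect:.2f}")
print(f"effective n       = {effective_n:.0f}")
```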

    Diagnosis of lethal or prenatal-onset autosomal recessive disorders by parental exome sequencing.

    OBJECTIVE: Rare genetic disorders resulting in prenatal or neonatal death are genetically heterogeneous, but testing is often limited by the availability of fetal DNA, leaving couples without a potential prenatal test for future pregnancies. We describe our novel strategy of exome sequencing parental DNA samples to diagnose recessive monogenic disorders in an audit of the first 50 couples referred. METHOD: Exome sequencing was carried out in a consecutive series of 50 couples who had 1 or more pregnancies affected with a lethal or prenatal-onset disorder. In all cases, there was insufficient DNA for exome sequencing of the affected fetus. Heterozygous rare variants (MAF < 0.001) in the same gene in both parents were selected for analysis. Likely disease-causing variants were tested in fetal DNA to confirm co-segregation. RESULTS: Parental exome analysis identified heterozygous pathogenic (or likely pathogenic) variants in 24 different genes in 26/50 couples (52%). Where 2 or more fetuses were affected, a genetic diagnosis was obtained in 18/29 cases (62%). In most cases, the clinical features were typical of the disorder, but in others they resulted from a hypomorphic variant or represented the most severe form of a variable phenotypic spectrum. CONCLUSION: We conclude that exome sequencing of parental samples is a powerful strategy with high clinical utility for the genetic diagnosis of lethal or prenatal-onset recessive disorders. © 2017 The Authors. Prenatal Diagnosis published by John Wiley & Sons Ltd.
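    A simplified sketch of the variant-filtering idea described in the Method (not the authors' pipeline): retain genes in which both parents carry a heterozygous variant with MAF < 0.001. The record structure and example gene names are hypothetical.

```python
# Select candidate genes hit by a rare heterozygous variant in each parent.
MAF_CUTOFF = 0.001

def rare_het_genes(variants):
    """Genes with at least one heterozygous variant below the MAF cutoff."""
    return {v["gene"] for v in variants
            if v["genotype"] == "het" and v["maf"] < MAF_CUTOFF}

def candidate_genes(mother_variants, father_variants):
    # Autosomal recessive candidates: the same gene is hit in both parents.
    return rare_het_genes(mother_variants) & rare_het_genes(father_variants)

mother = [{"gene": "GENE_A", "genotype": "het", "maf": 0.0002},
          {"gene": "GENE_B", "genotype": "het", "maf": 0.01}]
father = [{"gene": "GENE_A", "genotype": "het", "maf": 0.0}]
print(candidate_genes(mother, father))   # {'GENE_A'}
```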

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation.